Special report – Questions and Answers

Table of Contents

RCMP Investigation

How do you respond to those who argue that giving up some privacy is worth it to catch child abusers/other criminals/identify victims?

Was the RCMP successful in identifying any victims/perpetrators of child sexual exploitation or other crimes as a result of using this tool?

How many searches did the RCMP conduct with Clearview?

Do you believe the RCMP that most of those searches were related to child sex crime investigations?

Under what circumstances should police be allowed to use FRT?

Should a warrant be required to use FRT in an investigation?

Should its use be limited to only the most serious or violent crimes?

Should police be allowed to use facial recognition technology in terrorism cases?

If Clearview’s database had been compiled legally, would the RCMP have done anything wrong? What if the police had gathered images and created the database themselves? Would that be lawful?

What other potentially novel collection methods concern you? Any being used by the RCMP?

Media reports in April suggested the RCMP used FRT software from another company, IntelCentre. Did you examine this? Does this concern you? Have you reached out to the RCMP about this? Will you be investigating?

Is there any evidence the RCMP has used FRT to monitor the activities of protestors?

Has the RCMP completed a PIA for the use of FRT?

FRT guidance for law enforcement

Some are calling for a ban on the use of FRT - would you support a ban? Why not?

What law reforms are needed?

Were police consulted on this guidance already?

What steps will the consultation involve?

Will you be issuing guidance on private sector use of FRT?

Will you be issuing guidance for other public safety organizations like CBSA?

Questions:

RCMP Investigation

How do you respond to those who argue that giving up some privacy is worth it to catch child abusers/other criminals/identify victims?

Mass surveillance isn’t giving up “some” privacy. It is a question of proportionality.

These issues speak to the kind of society we want to live in.

Putting surveillance cameras on every street corner and collecting a DNA sample from everyone at birth would help us to solve more crimes. However, we have concluded as a society that those actions would have a disproportionate impact on privacy – even though it means some criminals will not be caught.

On the other hand, most of us consider it acceptable for police to stop cars to look for impaired drivers at roadside spot checks. We consider the intrusion on privacy proportional to the benefit gained.

We have already established mechanisms to set limits on police use of fingerprints and DNA.

For example, the Identification of Criminals Act allows police to fingerprint or photograph individuals for the purpose of identification. But the act only permits these actions under specific conditions, such as when an individual is in lawful custody and charged with, or convicted of, an indictable offence.

The DNA Identification Act permits police to collect DNA samples for use in a national DNA databank, with the objective of identifying victims and perpetrators of serious and violent crimes. The DNA databank may only be used under strict conditions, and for specified crimes.

Our draft guidance provides recommendations on how the use of FRT may be implemented in a way that is compliant with privacy laws and minimizes privacy risks.

These issues are extremely complex, which is why we will consult with police, civil society and other stakeholders on the content of the guidance before it is finalized.

Was the RCMP successful in identifying any victims/perpetrators of child sexual exploitation or other crimes as a result of using this tool?

The RCMP told us, and publicly stated, that it had successfully used Clearview to identify two child victims.

We are not saying that facial recognition technology can never legally be used for law enforcement purposes.

What was at issue in this investigation was:

  • First, the collection of personal information from a third party agent which had itself collected that information in contravention of the law, and
  • Second, the RCMP’s lack of systems to assess, track, and control its use of new sources and new technologies – to ensure compliance with the law before using them.

How many searches did the RCMP conduct with Clearview?

Based on information we were able to obtain, the RCMP conducted approximately 500 searches using Clearview through purchased and free trial accounts.

Do you believe the RCMP that most of those searches were related to child sex crime investigations?

The RCMP did not demonstrate that most of the searches it conducted were related to child sexual exploitation investigations.

In fact, it provided a reasonable accounting for only approximately 15 per cent of searches, according to Clearview records.

Approximately 6 per cent of the total searches appeared to be related to child exploitation.

We were told 19 searches were conducted by the National Child Exploitation Crime Centre for the purposes of victim identification, 14 were attempts to locate an identified suspect evading law enforcement, and 45 were for testing using RCMP members’ own images, images of other consenting individuals, or inanimate objects. Together, these 78 searches make up the approximately 15 per cent that was reasonably accounted for.

We simply cannot say what most of the searches were for.

Under what circumstances should police be allowed to use FRT?

Currently, if police are relying on their common law powers to use facial recognition technology, consideration must be given to whether it is reasonably necessary and whether there is proportionality between the privacy impact and the public good.

But our current privacy regime is inadequate. Police have a great deal of discretion, and this investigation has shown how that discretion can lead to inappropriate collection and use of personal information.

The use of facial recognition technology is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.

At a minimum, we can address this by ensuring the principles of necessity and proportionality are clearly enshrined in our statutory law. With this approach, appropriate uses of FRT by police would vary based on the purpose of its use and the privacy risks involved.

Any use of facial recognition technology by police should be necessary to meet a specific need, proportionate to the privacy risks involved, lawfully authorized, and consistent with other widely accepted principles of privacy protection.

That being said, other countries have suggested more specific rules, such as prohibiting the use of facial recognition except for serious crimes, or banning its use in public spaces outright.

This raises questions as to whether the risks of facial recognition technology are such that, due to the inalterable nature of the information involved, specific rules or new laws may be warranted. This is already the case for other forms of biometrics collected by law enforcement such as fingerprints and DNA profiles.

This is one of the questions we are going to consult on.

Should a warrant be required to use FRT in an investigation?

Warrants would certainly improve privacy protection as they require the approval of an independent judicial authority.

Should they be required? This is a question we will be consulting on. It is conceivable that police may use facial recognition without a warrant, for instance in exigent circumstances.

Should its use be limited to only the most serious or violent crimes?

This is the direction Europe is going. Another possible approach is through a regulatory framework based on legal and privacy principles including necessity, proportionality, and respect for fundamental rights.

The process of establishing appropriate limits on FRT use is ongoing. We look forward to engaging with police, lawmakers and other stakeholders on this question during our consultation on our proposed guidance.

In our draft guidance, we highlight the proportionality standard and the need to assess whether the intrusion on privacy caused by the program is proportional to the benefit gained.

This would involve first identifying how the specific use of facial recognition would impact the privacy of individuals. Then, police agencies should consider whether these intrusions on privacy are justified by the benefits of the specific deployment of facial recognition.

Inherent in this step is the recognition that not all objectives carry the same weight. For example, preventing an imminent terrorist plot would justify a greater privacy intrusion than catching someone who committed a minor act of vandalism.

In making this assessment, law enforcement agencies must be open to the possibility that, in a free and democratic society, a proposed FR system which has a substantial impact on privacy (such as via mass surveillance) may never be proportional to the benefits gained.

Should police be allowed to use facial recognition technology in terrorism cases?

Facial recognition technology is privacy-invasive and should only be used by police where there is proportionality between the privacy impact and the public good.

While terrorism is obviously an extremely serious crime, we must be careful to avoid using the term as a justification for disproportionate intrusions on privacy.

Trying to set out broad categories of crime for which using facial recognition technology would be acceptable is fraught with issues. This would not recognize:

  1. the wide range of specific cases that could be involved (investigating a cold case, versus an imminent credible threat for instance), and
  2. the wide range of different ways FRT could be used – some more intrusive than others (using FRT to check that an incoming visitor’s face matches their passport, versus running FRT on all faces at a lawful protest).

It is in recognition of such diversity that the courts have set out tests for police to determine if a particular search is appropriate.

The courts have set out a framework for looking at the question of what police can do within their ancillary powers at common law.

The framework speaks to reasonable necessity and proportionality – it looks at a specific case and considers the importance of the police duty and the public purpose on one hand and the intrusion on rights, such as privacy, on the other.

If Clearview’s database had been compiled legally, would the RCMP have done anything wrong? What if the police had gathered images and created the database themselves? Would that be lawful?

Our investigation did not speculate as to whether the use of a service like Clearview, had its database been compiled legally, would have complied with the Privacy Act.

In our view, for an institution’s collection to be permissible under the Privacy Act, it must be directly related to an operating program or activity that falls within the institution’s legal authority and respects the general rule of law. For the RCMP, this includes compliance with common law.

Under common law, where a search intrudes on an individual’s reasonable expectation of privacy, the importance of the performance of the duty to the public good, and the extent of the interference with individual liberty, must be considered.

The use of facial recognition technology, with its power to disrupt anonymity in public spaces, can be a meaningful interference with individual liberty.

Further, the use of mass surveillance for the purposes of providing a service to police is a major and substantial intrusion into the lives of Canadians by the state. The necessity of using such practices, and the importance of the particular public good, must therefore be examined carefully on a case-by-case basis, for instance by way of an application for a warrant.

Separate from the issue of whether the Privacy Act was or wasn’t ultimately violated, the RCMP still should have conducted an assessment for novel collection of personal information.

What other potentially novel collection methods concern you? Any being used by the RCMP?

One of the significant concerns raised in our investigation is that the RCMP indicated its members were empowered to use their discretion to try new personal information collection techniques with appropriate approvals at the local level. So, frankly, the full range of collection methods in use is unknown to us.

We respect that the RCMP must ensure its investigative techniques evolve as the landscape they police evolves, but this should not happen without ensuring respect for privacy rights before implementing new approaches.

The RCMP has agreed, in response to our recommendations, to implement measures to actively track its own use of novel collection methods.

I can tell you we have an active investigation into the RCMP’s Wide Awake Project, which involves social media monitoring and other techniques.

We cannot speak to the specifics of ongoing investigations.

Media reports in April suggested the RCMP used FRT software from another company, IntelCentre. Did you examine this? Does this concern you? Have you reached out to the RCMP about this? Will you be investigating?

We are aware of the media reports about the RCMP’s historic use of IntelCentre, and have engaged with the RCMP on this subject. I cannot speak to the specifics regarding IntelCentre at this time, but we believe our investigation of the RCMP’s use of Clearview’s facial recognition technology is relevant, applicable and instructive with regard to the RCMP’s compliance obligations for any other potential uses of FRT.

If pressed on whether this is part of the Wide Awake Project investigation …

I cannot get into details of that investigation as it is ongoing.

Is there any evidence the RCMP has used FRT to monitor the activities of protestors?

Our investigation into the RCMP’s use of Clearview did not uncover evidence that the RCMP used Clearview AI to monitor protestors.

The RCMP did not provide a reasonable accounting for the vast majority of searches.

We are aware of reports of such uses occurring outside of Canada, and we are concerned about the possibility of them occurring here. This speaks to the urgency of acting now to set appropriate limits on the use of FRT, rather than acting reactively once the damage has already been done.

If pressed on whether monitoring protestors has come up in the Wide Awake Project investigation …

That investigation is ongoing. We look forward to sharing details about it at the appropriate time.

Has the RCMP completed a PIA for the use of FRT?

The RCMP has not submitted a PIA for any program relying on FRT; it has agreed to do so. (Clearview ceased offering its services to the RCMP in July 2020.)

The RCMP has had very preliminary discussions with the OPC about certain plans to use FRT. We would expect to receive a PIA in advance of any such program as required by federal government policy.

Under the TBS Directive on PIAs, the RCMP is required to do PIAs for the use of facial recognition technology to collect personal information, as PIAs are required for “substantial modifications” to activities involving personal information. “Substantial modifications” expressly includes “any change or amendment to the privacy practices related to activities that use automated or technological means to identify, create, analyze, compare, extract, cull, match or define personal information”.

Making sure PIAs happen – in advance of deploying new technology – is critical. In response to our recommendations, the RCMP has committed to put in place new training, systems, and processes across the RCMP to identify, assess, track and control novel collections of personal information (including the use of facial recognition technology). This includes commitments to dedicate resources to the timely completion of PIAs.

I would also note that the Toronto Police Service has agreed to a moratorium on the use of FRT in body-worn cameras pending the release of the guidance.

FRT guidance for law enforcement

Some are calling for a ban on the use of FRT - would you support a ban? Why not?

Used responsibly and in the right circumstances, facial recognition can provide benefits to security and public safety. I do not believe it should be banned outright, though the idea of explicit no-go zones for certain specific practices is something to consider.

For example, the European Commission’s proposed AI regulation seeks to establish prohibitions on a number of uses of the technology. If passed, it would ban AI that manipulates an individual’s behaviour in a way that is likely to cause physical or psychological harm. It would also prohibit AI used for real-time remote biometric identification in public spaces by law enforcement – but with exceptions for cases involving terrorism, serious criminality, and targeted searches for crime victims and missing children.

That said, given the risks of FR use, we must ensure that our laws are effective in safeguarding the fundamental rights of individuals when this technology is used.

Ultimately, the decision rests with Parliament to consider a moratorium until such time as any specific legislative changes on this issue are recommended and adopted.

What law reforms are needed?

Stronger protections are needed in our privacy laws to ensure that facial recognition technology is regulated appropriately in both the private and public sectors.

Such protections should include a rights-based approach to privacy, meaningful accountability measures, and stronger enforcement powers, among others.

However, the risks of FRT are such that, due to the inalterable nature of the information involved, specific rules may be warranted. This is already the case for other forms of biometrics collected by law enforcement such as fingerprints and DNA profiles.

As we note in the special report, unlike other forms of biometrics collected by law enforcement, facial recognition is not subject to a clear and comprehensive set of rules. Its use is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.

That being said, it may also be possible to regulate FRT appropriately through reform of public and private sector privacy laws of general application.

We look forward to engaging with police, lawmakers and other stakeholders on important questions surrounding the use and regulation of this technology.

Were police consulted on this guidance already?

We plan to reach out to representatives from the RCMP and the Canadian Association of Chiefs of Police to invite their feedback on the draft guidance. We intend to meet with representatives of these groups, as well as other stakeholders, such as those from civil society, over the coming months to receive their input.

What steps will the consultation involve?

Organizations and individuals interested in providing feedback on the guidance are invited to do so using the submission process outlined in our call for comments.

The consultation period will be open for approximately four months, from June 10 to October 15.

In addition to written submissions, we intend to meet with representatives from the RCMP and the Canadian Association of Chiefs of Police as well as with stakeholders from civil society over the coming months to receive their input.

Will you be issuing guidance on private sector use of FRT?

We currently have no plans to issue guidance specific to private sector uses of FRT. However, our office is in the process of finalizing guidance on public and private sector uses of biometrics, which would include FRT.

That being said, we recently published reports of findings in two cases involving the private sector’s use of FRT: Clearview AI and Cadillac Fairview. Both those reports contain recommendations that would be of interest to other companies considering using FRT.

Will you be issuing guidance for other public safety organizations like CBSA?

We currently have no plans to issue guidance specific to FRT use by other public safety organizations. However, our office is in the process of finalizing guidance on public and private sector uses of biometrics, which would include FRT.

Although our guidance is for police agencies, and not for organizations outside of the police that are involved in law enforcement activities (such as border control), these organizations must still ensure their compliance with all applicable laws, including privacy and human rights laws. Sections of our guidance may be helpful for that purpose.
