Recommended legal framework for police agencies’ use of facial recognition

Joint Statement by Federal, Provincial and Territorial Privacy Commissioners

May 2, 2022

Facial recognition (FR) has emerged as a tool of significant interest to police agencies across Canada. Yet FR use by police is not subject to a clear and comprehensive set of rules. Instead, it is regulated through a patchwork of statutes that, for the most part, do not specifically address the different uses of the technology or the risks it poses. Adding to this legal uncertainty is a lack of jurisprudence on FR. Taken as a whole, the current legal framework risks encouraging fragmented approaches to FR use that would take years to resolve before the courts.

Canadian privacy laws governing the collection, use and disclosure of personal information are drafted at a level of principle that, in the context of FR, is too high to be sufficiently protective given the invasive nature of the technology. Principle-based laws are beneficial in some respects: they can adapt to evolving technologies without constant amendment or the risk of obsolescence. However, in the case of potentially invasive technologies such as FR, relying solely on high-level principles leaves considerable discretion to police agencies and offers little certainty that individuals’ rights will be respected.

The absence of specific legislation for police use of FR also stands in contrast to how other forms of biometrics are governed. For instance, the DNA Identification Act and the Identification of Criminals Act regulate how police agencies can collect, use, disclose and destroy DNA samples, fingerprints, and mugshots.

The use of FR by police agencies in Canada ultimately raises important questions about the kind of society we want to live in. The capabilities of this technology are significant, and when used responsibly and in the right circumstances, its application could provide benefits for public safety. For instance, this technology can be used in complex investigations to help solve serious crimes, to locate missing persons, and to support national security objectives.

At the same time, the use of FR involves the collection and processing of highly sensitive personal information. Biometric facial data is unique to each individual, unlikely to vary significantly over time, and difficult to change in its underlying features. This information speaks to the very core of individual identity. FR can collect it at scale and at minimal cost, enabling police agencies to identify and potentially surveil individuals covertly and in large numbers.

The prospect of police agencies integrating FR into law enforcement initiatives raises the possibility of serious privacy harms unless appropriate protections are in place. Canadians must be free to participate voluntarily and actively in modern society without the risk of being routinely identified, tracked and monitored. While certain intrusions on this right can be justified in specific circumstances, individuals do not forgo their right to privacy, including their anonymity, merely by participating in the world in ways that may reveal their face to others, or that may enable their image to be captured on camera.

Privacy is vital to the realization of fundamental rights protected by the Canadian Charter of Rights and Freedoms, including dignity, autonomy, personal growth and the free and open participation of individuals in democratic life. Increased surveillance can deter individuals from exercising these and other rights and freedoms. Inappropriate uses of FR may have severe and lasting effects not only for individuals whose personal information may be collected, processed or disclosed, but for society more generally, as authorities expand their capacity to monitor the physical and virtual spaces in which we interact. Such expansion can be difficult to contain once set in motion.

In our view, the existing legal framework for police use of FR in Canada is insufficient to address the risks to privacy and other fundamental rights associated with the technology. Uses of FR are rapidly emerging along a spectrum of risk. These uses are, at present, governed by a patchwork of laws put in place largely before FR emerged in its current state, and before its potential consequences were well understood. As a result, gaps exist in the current law which, if left unaddressed, leave space for serious harms to individuals’ privacy and other fundamental rights.

That is why we are calling for a new legal framework that sets out appropriate limits on police use of FR. This legal framework should, we believe, establish clearly and explicitly the circumstances in which police use of FR is acceptable – and when it is not. It should include privacy protections that are specific to FR use, and it should ensure appropriate oversight when the technology is deployed.

This can be achieved either by incorporating changes into existing privacy laws, or by enacting a separate law governing FR, given that FR use affects rights beyond privacy (such as equality and non-discrimination). In either case, a new legal framework for police use of FR should include several key elements, as outlined below. These elements are informed by stakeholder feedback from our consultation, by our recent investigations into the use of FR by police, and by our own respective research and analysis.

The key elements of a legal framework for regulating police use of FR include:

I. Clear authorization for police use of FR, subject to defined no-go zones

As a starting point, the law should clearly and explicitly define the purposes for which police are authorized to use FR. Without such clarity, there is a risk that varying interpretations of police authority to use FR may emerge, which can lead to inconsistency in expectations and application of the law.

The law should be clear that police may only use FR for purposes that are expressly authorized by statute, and not for any other purpose. It is important to include this clarification explicitly in the law, to prevent legislative gaps concerning the legality of other potential uses. Without this, ambiguity may persist in the interpretation of when FR use is lawful.

In our view, any authorization to use FR in the context of crime prevention or investigation should be reserved for compelling purposes, such as by limiting use to serious crimes only.Footnote 1 Some jurisdictions have taken a similar approach, for example by limiting certain investigative uses of FR to violent crimes or to crimes punishable by minimum prison sentences, depending on the circumstances.Footnote 2 Any authorization to use FR for purposes of crime prevention should be similarly circumscribed.

It may also be appropriate to authorize police use of FR for humanitarian purposes, such as assisting in the search for missing persons, or for administrative purposes, including the redaction of faces in video surveillance footage to facilitate responses to access to information requests. Limitations on such uses should be commensurate with the risks involved.

Given the privacy risks that can arise from FR use, the law should impose clear and explicit restrictions against certain uses (“no-go” zones), regardless of the purpose, to prevent deployments of FR that could have indefensible effects on privacy and other human rights. In our view, these limits should include an explicit prohibition on the use of FR to monitor individuals involved in peaceful protests, as well as on any use of FR that can result in mass surveillance, the definition of which should encompass the discriminate or indiscriminate monitoring of a population or a significant component of a population.Footnote 3 Even for the prevention and investigation of serious crimes, FR use by police should only be authorized if it is targeted, intelligence-led, and subject to appropriate time limitations.

Appropriate limits on authorized uses should also include restrictions on the creation and use of databases of facial images in FR initiatives, as such databases are a key source of the risks associated with FR. In our view, these restrictions should include a prohibition on the indiscriminate collection of facial images for use in an FR comparison database, including images collected online.Footnote 4 They should also include limits on police authorization to access or use databases of images created or maintained by other public bodies or by private entities. If police in Canada are authorized to use any such databases, the authorization should be clearly defined and narrow in scope, and it should require that the personal information within the database was collected lawfully. We note that the European Parliament has called for a ban on the use of private facial recognition databases following the illegal and indiscriminate collection of images by the company Clearview AI.Footnote 5

The law should also include restrictions on the circumstances in which an individual’s image may be entered into a comparison database – for example, only upon conviction of certain offences, or when specific threat criteria are met. We note that the DNA Identification Act similarly places statutory restrictions on the circumstances in which police are authorized to enter DNA samples into the national DNA data bank.

II. Strict necessity and proportionality requirements

The privacy risks associated with FR depend in part on circumstances that are specific to any given deployment of the technology. While some circumstances involve risks too great to ever justify its use, as in the case of mass surveillance, other uses may be justifiable depending on the circumstances. Since it is not realistic for the law to anticipate and address all circumstances, it is important that, in addition to limitations on authorized purposes, the law also require police use of FR to be both necessary and proportionate for any given deployment of the technology.

We note that law enforcement agencies in the European Union, where subject to the Law Enforcement Directive, can process biometric data for identification purposes only when doing so is “strictly necessary”.Footnote 6 Additional conditions include that the biometric processing be authorized by Union or Member State law (or by certain other specified grounds for lawful processing) and be subject to appropriate safeguards for the rights and freedoms of the data subject. Likewise, the UK’s Surveillance Camera Code of Practice requires any police use of facial recognition or other biometric characteristic recognition systems to be clearly justified, necessary to meet a pressing need, and proportionate in meeting the stated purpose.Footnote 7

It is our view that necessity and proportionality requirements should be included explicitly in the law as overarching preconditions for any use of FR. These requirements should apply to the collection, use, or disclosure of comparison database images and probe images, as well as any other personal information involved in the initiative. Furthermore, necessity should be defined such that any authorized use must be strictly necessary to meet a specific need.Footnote 8 Given the intrusiveness of FR, it is insufficient to rely on general public safety objectives to justify use of the technology. Likewise, proportionality should be defined such that potential benefits are weighed against both the generalized impacts of FR use (e.g., increases in generalized surveillance) and the specific impacts of the proposed initiative on individuals and groups.

III. Independent oversight for FR initiatives

Strong, independent oversight is essential to ensuring the responsible implementation of FR initiatives. Oversight is especially important in the context of police use of FR, where potential impacts can be complex, difficult to foresee, and highly consequential for the individuals, groups or communities involved. Done properly, oversight can also help to sustain public confidence in the responsible use of FR technology, and strengthen accountability for FR initiatives.

In our view, this would be best approached by empowering an independent, external public body with the necessary powers to proactively and meaningfully oversee police use of FR. The oversight body’s powers and responsibilities should include the following key elements:

First, there should be mandatory, proactive engagement between the oversight body and police throughout all phases of FR initiatives. Consulting the appropriate oversight authority early and often can help to ensure that risks are accounted for and potential harms are minimized. Doing so can also help to ensure that initiatives comply with legal requirements throughout deployment, including the proposed requirements for necessity and proportionality. A key mechanism for facilitating such engagement would be a requirement for police to conduct a privacy impact assessment (or similar risk assessment) before any planned use of FR, and to submit it to the oversight body for review and input. Assessments such as privacy impact assessments (PIAs) are becoming a legal standard in a number of jurisdictions, both in Canada and in other parts of the world such as Europe.

Second, police should be required to obtain pre-authorization from an oversight body at the program level, or to provide it with advance notice of a proposed use, before launching an FR initiative. In Quebec, the oversight model requires that the creation of a biometric database be disclosed to the provincial privacy regulator, the Commission d'accès à l'information du Québec (CAI), before the database is created. The CAI has a range of order-making powers in relation to such databases, and can suspend or prohibit the coming into service of a database that does not comply with the CAI’s orders or that otherwise constitutes an invasion of privacy.Footnote 9

Third, the oversight body should be granted the powers needed to carry out compliance and enforcement functions effectively and efficiently. In the context of policing initiatives, where FR use can have serious impacts on privacy and other human rights, it is important that these include the ability to conduct proactive audits and inspections to ensure ongoing compliance with the law. In our view, such measures are crucial for supporting accountability for FR initiatives and transparency in how the technology is used. Oversight powers should also include the ability to make orders with respect to FR initiatives where violations of the law occur. Ultimately, enforcement mechanisms should encourage broad and ongoing compliance by police and provide quick and effective remedies for individuals.

IV. Privacy rights and protections

All uses of FR involve potential risks to privacy and other fundamental rights. Given the seriousness of some risks, it is important that, when FR is used, appropriate privacy protections are in place to mitigate the risks involved.

We are of the view that existing privacy laws should continue to govern police use of FR where they provide sufficient protection against these risks. However, additional changes are needed to make certain provisions and principles clearer and more specific to the risks that can arise from FR use. These changes could either be incorporated into privacy laws themselves or be contained in a separate law governing FR, given that FR use affects rights beyond privacy (such as equality and non-discrimination).

One such change concerns the accuracy of FR systems. For example, the Privacy Act requires federal institutions, including police, to keep information about individuals accurate and up to date. Mitigating the risks associated with FR use, however, also requires accuracy specifically in the matching procedure used to compare images. It is important, then, that the law address this nuance directly by requiring rigorous testing of FR systems, given the potential consequences of misidentification, which can include the reproduction of systemic bias against racialized populations and other groups.Footnote 10 To further mitigate risks related to algorithmic errors, the law should ensure that a human reviews facial matches returned by facial recognition software.Footnote 11

Another proposed change concerns the retention of personal information. Retention considerations in FR initiatives can be complex because probe images, comparison database images, and faceprint data may all be needed for different lengths of time. In policing contexts, some information may also be needed for longer periods, for example as evidence in court proceedings. For these reasons, it is important for the law to include clear and specific direction, taking into account the operational realities of policing, on when, and for how long, images and other personal information may be retained. This should include prohibitions against retaining images of innocent individuals or those who have been cleared of suspicion, such as bystanders captured on video, individuals who have never been convicted of an offence, and individuals who receive a pardon.

A third proposed change relates to transparency measures for FR initiatives. In our view, strong transparency measures are needed at the program level to ensure individuals have an opportunity to be informed about what personal information may be collected about them, and how it may be used. Such measures are important for accountability purposes and for establishing public trust in police use of FR technology, and should include explicit requirements to publish information about the nature, scope, methods, purposes, and handling of personal information in FR initiatives, including summaries of PIAs where appropriate.Footnote 12

Finally, while most privacy laws in Canada are currently framed to recognize individual privacy rights, it will be important to consider collective interests in information, including from Indigenous perspectives. The public consultation process around reform of the federal Privacy Act, and in particular the report on Engagement with Indigenous Partners,Footnote 13 could be instructive in this regard. We recommend that any legislative framework for law enforcement use of facial recognition require meaningful and ongoing consultation with Indigenous Peoples, as well as with other groups who experience disproportionate policing and over-representation in the criminal justice system.

Signatories:

Daniel Therrien
Privacy Commissioner of Canada

Patricia Kosseim
Information and Privacy Commissioner of Ontario

Diane Poitras
President of the Commission d’accès à l’information du Québec

Tricia Ralph
Information and Privacy Commissioner of Nova Scotia

Marie-France Pelletier
Ombud of New Brunswick

Jill Perron
Manitoba Ombudsman

Michael McEvoy
Information and Privacy Commissioner for British Columbia

Denise Doiron
Information and Privacy Commissioner of Prince Edward Island

Ron Kruzeniski
Saskatchewan Information and Privacy Commissioner

Jill Clayton
Information and Privacy Commissioner of Alberta

Michael Harvey
Information and Privacy Commissioner of Newfoundland and Labrador

Andrew Fox
Information and Privacy Commissioner of the Northwest Territories

Diane McLeod-McKay
Yukon Information and Privacy Commissioner

Graham Steele
Information and Privacy Commissioner of Nunavut


What We Heard – Background on our consultation on draft guidance and a future legal framework for police agencies’ use of facial recognition

In June 2021, Canada’s federal, provincial and territorial privacy commissioners published draft guidance intended to clarify police agencies’ existing privacy obligations relating to the use of FR technology. At the same time, we launched a public consultation seeking feedback on both the guidance and a future legal and policy framework to govern police use of the technology.

Over the consultation period, we received 29 written submissions from stakeholders representing the civil society, academic, government, police, legal and industry sectors, as well as from individuals. Throughout the fall of 2021, our offices met with law enforcement agencies, civil society groups, organizations representing marginalized communities and equity-seeking groups, as well as federal, provincial and territorial human rights commissioners to seek their feedback.

Additionally, the Office of the Information and Privacy Commissioner of Ontario hosted a consultation roundtable with a broad range of Ontario-based stakeholders, the Office of the Information and Privacy Commissioner of British Columbia met with local police services, the Commission d’accès à l’information du Québec consulted with police services and other stakeholders, and other provinces received additional feedback.

Feedback on draft guidance

In response to our draft guidance on the current law’s application to police use of FR, stakeholders mostly found the guidance relevant, timely and beneficial, but noted that it is drafted at a level of generality that may be too high to be of real practical use.

In particular, we were asked to:

  • clarify recommendations concerning accuracy, retention, and third-party involvement in FR initiatives, as well as explanations of legal principles and concepts;
  • provide more direction as to what should be contained in police agencies’ public notices and how they should be published to meet transparency obligations;
  • include a requirement for police agencies to develop a formal FR policy given the high risks involved in using FR;
  • tailor the guidance to specific use cases of FR to be more instructive to police agencies; and,
  • add further procurement and testing requirements for FR initiatives and systems.

We also heard that guidance alone is not sufficient to address the challenges posed by police use of FR, with some calling for a moratorium until a new legal and policy framework can be put in place.

To address stakeholder feedback, we made a number of amendments to the guidance and intend to advise police agencies on specific use cases as they are developed across different jurisdictions. As privacy regulators, we continue to believe that general guidance, eventually augmented by more specific use cases, will be a helpful interim measure, pending clarifications to the law as per our recommended legal framework.

Feedback on the legal and policy framework

During our consultation, we also sought stakeholder feedback on a future legal and policy framework to govern police use of FR. The comments we received were wide-ranging, with some variation between stakeholder groups. However, a few themes and common points emerged throughout.

Overall, there was general agreement among stakeholders on a need for action to develop a legal framework for police use of FR, with a large majority of stakeholders opining that legislative gaps exist and that new legislation specifically governing FR use by police is required.

The gaps in our current privacy laws on which there was the most consensus among stakeholders included:

  • lack of specificity concerning police agencies’ transparency and accountability obligations when contracting with the private sector for FR services;
  • lack of legal clarity on what constitutes “publicly available information”, recognizing that individuals may still retain a privacy interest in personal information shared online;
  • exceptions or ambiguous requirements for law enforcement and national security agencies relating to accuracy obligations; and
  • greater latitude for disclosures to specified investigative bodies.

As for key measures needed in a future legal framework to appropriately regulate use of FR by police agencies, stakeholders from legal, civil society, academic and government sectors largely recommended the creation of an independent, multidisciplinary oversight body to address accountability and transparency concerns. They also called for legislative changes to:

  • set limitations on FR uses, establish no-go zones, and clarify instances requiring prior authorization;
  • restrict data-sharing with private sector partners and other law enforcement agencies; and,
  • mandate assessments with criteria requiring police to justify necessity and proportionality before FR is used.

Other measures recommended by stakeholders included the incorporation of a rights-based approach in the law to protect against FR-related infringements on privacy and human rights; legislatively mandated auditing (for accuracy and bias), public reporting and procurement requirements for FR; and an effective means of enforcement to ensure compliance with the law. A subset of stakeholders, notably from law enforcement sectors, wanted a future legal framework to consider FR uses for policing along a continuum of risk, noting that not all uses of FR pose the same level of risk.

Most respondents supported legal protections in certain situations for individuals whose faceprints are held in a comparison database, such as:

  • the automatic destruction of information collected on individuals found not guilty of an offence (or in other defined circumstances);
  • clear limitations on retention periods for faceprints, and heightened safeguards for storing and preventing inappropriate access;
  • a requirement to collect and store accurate faceprints, to avoid risks of error and misidentification, and to permit rectification of errors where appropriate;
  • notice to individuals whose faceprints were captured, such as for sentenced individuals; and,
  • restrictions against the inclusion of facial matches obtained through FR using illegally obtained data.

We heard from other stakeholders, largely from law enforcement sectors, who cautioned against enacting blanket protections for biometric information held in databases, noting that context matters. They also cautioned against requirements to destroy information that may constitute exculpatory evidence. Finally, many stakeholders pointed to the importance of ensuring a full public debate to inform future legislation on police use of FR before its uses become more widespread.
