Public Safety, Defence and Immigration Portfolio’s (PSDI) Learning Day 2022

Remarks at the Department of Justice Canada’s PSDI Annual Conference

February 2, 2022, by video conference

Speech delivered by Daniel Therrien
Privacy Commissioner of Canada

(Check against delivery)


Introduction

Thank you for inviting me here today.

I would like to address the privacy implications of new technologies from the perspective of my eight years as Privacy Commissioner.

Some of you know that before I became Privacy Commissioner, I was the Assistant Deputy Attorney General of the Public Safety, Defence and Immigration Portfolio at the Department of Justice.

I believe that my work on public safety issues prepared me well for my current role. There are certainly commonalities between the two roles, which, at their heart, both involve ensuring compliance with the law.

On the surface, public safety and privacy may seem to be incompatible, but the two can be achieved concurrently. It is not a zero-sum game.

I understand well that your role is to “see that the administration of public affairs is [done] in accordance with the law.”

As Privacy Commissioner of Canada, it is my role to ensure the activities of government and commercial organizations comply with our federal privacy legislation.

In essence, we share the same goal: to achieve both public safety and the protection of rights.  

To that end, technology is playing an increasingly important role.

I understand that law enforcement and national security agencies need modern tools to do their jobs effectively, but new approaches and strategies must be necessary and proportionate.

In other words, privacy risks must be mitigated, and new technologies must be deployed in a privacy-protective manner.

We have come a long way in this regard. I believe we are better off in 2022 than we were in the wake of 9/11.

Of course, not everything is perfect, so I will talk about some of the current issues and how our laws should adapt to the situation now and in the years to come.

The evolution of information sharing and national security oversight

The privacy landscape has changed dramatically over the past two decades.

Following 9/11, the need to protect against further terrorist attacks seemed to outweigh personal privacy interests.

That trend, however, began to change with time, in part due to the Snowden revelations.

During my mandate, we helped push for the scaling back of some of the most sweeping state powers contained in Bills C-13 (which expanded collection powers in the Criminal Code) and C-51 (which expanded information sharing via the Anti-terrorism Act, 2015).

Bill C-59, passed in 2019, rolled back some of the more concerning provisions of C-51.

In particular, amendments were introduced to ensure more appropriate thresholds were in place before personal information could be shared with national security agencies.

During these debates, the need for independent oversight also became apparent, leading to the creation of important new oversight bodies – the National Security and Intelligence Review Agency (NSIRA) and the Office of the Intelligence Commissioner.

In parallel with those developments, Bill C-22 established the National Security and Intelligence Committee of Parliamentarians (NSICOP).

My office’s expertise is complementary to that of these oversight bodies.

While my colleagues at NSIRA, for instance, are experts in national security law, we are experts in privacy law.

The combination of our respective expertise adds value and depth to the reviews we undertake together.

I would note that we regularly engage with NSIRA, for example, under an MOU whose terms and goals were finalized and made public this past July.

Indeed, in the coming days, the Public Safety Minister will table the first review of the Security of Canada Information Disclosure Act conducted jointly by my office and NSIRA.

We have worked, and will continue to work, with government agencies.

For instance, we recently reached out to the Department of National Defence (DND) and the Canadian Armed Forces.

We wanted to offer our support as they responded to recommendations issued by NSICOP in its special report on the collection, use, retention and dissemination of information on Canadians in the context of defence intelligence activities.

More specifically, we provided feedback on the updated Chief of Defence Intelligence Functional Directive: Guidance on the Handling and Protection of Canadian Citizen Information, which governs these activities and was reviewed and updated in response to recommendations in that report.

In particular, we recommended adjusting the text to place greater emphasis on the importance of the necessity threshold for collecting Canadian citizens’ personal information.

We also asked DND to clarify that the documentation requirements apply equally to information it shares with its Canadian partners orally and in writing. 

DND agreed to make several adjustments based on our recommendations.

We anticipate further engagement with DND and the Canadian Armed Forces in the area of defence intelligence activities in the near future.

Border searches of electronic devices

As I said, progress has been made to ensure that the use of new technologies to achieve public safety objectives is done in a manner that respects privacy rights.

In 2019, we issued our findings into a number of complaints against the Canada Border Services Agency (CBSA) with respect to the examination of personal digital devices – particularly cell phones, tablets, and laptop computers – at ports of entry by Border Services Officers.

Specifically, we argued the Customs Act should be updated to recognize that digital devices contain sensitive personal information and are not mere “goods” that should be subject to border searches without legal grounds.

The notion that devices are mere goods is clearly outdated and does not reflect the realities of modern technology.

We argued the Act should include a clear legal framework for the examination of digital devices, and that the threshold for such examinations should be elevated to “reasonable grounds to suspect” a legal contravention.

I expressed the view in 2017 that Canadian courts would find groundless searches of electronic devices to be unconstitutional, even at the border.

Last year, the Alberta Court of Appeal found that the Customs Act provisions permitting warrantless searches of devices by CBSA officers were unconstitutional and ordered the government to amend the legislation.

Last fall, the government was granted a six-month extension to do just that.

We hope to see an amended law tabled imminently that will impose at least some threshold for border searches of digital devices. This would be another sign of things evolving in the right direction.

Facial Recognition Technology, Clearview AI and the RCMP

Facial recognition is another relatively new technology facing political, public and legal scrutiny at home and abroad, particularly with respect to its use by law enforcement.

Unlike other forms of biometrics collected and used by law enforcement agencies, such as fingerprints, facial recognition technology in Canada is regulated through a patchwork of statutes and case law that, for the most part, does not specifically address the risks posed by the technology.

This creates uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.

As I’m sure you know, my office conducted a joint investigation with our Quebec, Alberta and B.C. counterparts into Clearview AI.

Our investigation found the company had collected highly sensitive biometric information without the express knowledge or consent of individuals.

Clearview AI argued that consent was not required because the information was publicly available and that the benefits of its services to law enforcement and national security outweighed any potential harm to privacy. 

We rejected these arguments.

In short, we concluded that what Clearview did was mass surveillance, that it was illegal, and that it is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup.

Related to this case, my office received a complaint under the Privacy Act over concerns about the Royal Canadian Mounted Police’s (RCMP's) use of Clearview’s services.

In that investigation, we found the RCMP violated the Privacy Act when it collected personal information from Clearview AI.

We expressed the view that a government institution cannot collect personal information from a third-party agent if that agent collected the information unlawfully.

We think this should be a chief consideration any time a public institution contracts with a private-sector company to help deliver a product or service to Canadians.

We understand that criminals are using new technologies to commit crimes, and so law enforcement and national security organizations need to modernize their approaches as well.

But, obviously, government institutions must use technology within the law, including privacy law.

In the end, we concluded the RCMP had serious and systemic gaps in its policies and systems to track, identify, assess and control novel collections of personal information.

At a minimum, we expect institutions with programs that collect personal information that could present a high risk to individuals’ privacy to have in place a number of measures.

This includes training programs and access to legal and other expertise to ensure decision-makers know their obligations.

They should also have systems and procedures to track potential and actual novel collections of personal information.

We would also look for processes to identify potential compliance issues and to complete timely assessments where warranted.

Finally, we look for effective controls to limit the collection of personal information by staff.

We found gaps in the RCMP practices in all of these areas.

For instance, with respect to knowledge of obligations, the RCMP defended the actions of decision-makers who used Clearview without conducting privacy impact assessments, saying they had relied on Clearview AI’s assertions that its images all came from publicly available information.

In our view, it should be clear to decision-makers that the collection of information via facial recognition technology (FRT) would warrant a more rigorous assessment than relying on the claims of a commercial vendor.

Our recommendations focused on addressing these gaps, and I’m pleased to report the RCMP is moving forward with measures to fill them.

For instance, the RCMP has begun to establish a National Technologies Onboarding Program that will serve to identify and assess the legal and privacy implications of new investigative technologies.

This is an encouraging and necessary step.

Guidance document on the use of FRT

At the same time as our investigative findings into the RCMP’s use of Clearview’s services were made public, we launched, in cooperation with our provincial and territorial counterparts, a consultation on draft guidance on the use of facial recognition technology by police services across Canada.

The guidance sets out the legal responsibilities of police agencies under the current legal framework with a view to ensuring any use of facial recognition complies with the law, minimizes privacy risks, and respects the fundamental human right to privacy.

The consultation was launched to solicit views on the draft guidance from stakeholders, including police, academics, industry, civil society and the public.

A preliminary analysis of the submissions received suggests stakeholders agree on a number of themes: the need for clearer legal rules; limiting facial recognition to appropriate uses; providing for accountability, transparency and external oversight; and ensuring the accuracy of data and software.

A number of stakeholders, including the Canadian Human Rights Commission, have called for a moratorium on the use of facial recognition technology by law enforcement until such time as a new legal and policy framework is put in place.

Internationally, we have seen this approach used. For example, in the US, some cities have banned FRT for law enforcement purposes.

The European Commission did not go this far. But in a draft regulation to address AI, the Commission proposes to restrict the use of real-time facial recognition in public spaces by law enforcement to cases involving terrorism, serious criminality and targeted searches for crime victims and missing children.

The regulation also provides for other important standards, including a requirement of strict necessity, consideration of the consequences for the rights and freedoms of all persons concerned, compliance with necessary and proportionate safeguards, as well as prior authorization by a judicial or other independent administrative authority, among other protections.

While stakeholders agree on a number of issues, they disagree on others, such as the imposition of a moratorium.

We may well recommend that Parliament study this issue further, hear the conflicting views and, as elected officials, prescribe the conditions for the use of FRT.

In the meantime, we will publish our guidance under the current law in the spring.

Privacy and the pandemic

The pandemic has also shown how digital technologies are creating additional risks for privacy.

From the beginning of the COVID-19 pandemic, my office has emphasized that, even during a grave national health emergency, it is always possible to protect personal information by applying a flexible and contextual approach to privacy laws. Since 2020, we have demonstrated that privacy is not a barrier to public health.

The federal government has engaged with our office on a range of files, including numerous initiatives related to COVID-19 infection tracking and tracing, border controls and initiatives to provide benefits during a period of economic crisis.

For example, my office provided advice and recommendations on the ArriveCAN mobile device application.

It allows international travellers to report on their compliance with mandatory isolation measures under the Quarantine Act.

Restricting the movement of individuals into Canada in response to COVID-19 represented a novel challenge for the CBSA and the Public Health Agency of Canada (PHAC).

Paper forms were introduced in February 2020 to enable contact tracing by public health authorities, but the volume of paper that required manual data entry and processing grew too large.

To operationalize its border restrictions, the government began to collect, under its authorities pursuant to the Quarantine Act, travellers’ contact information, details about their travel to Canada, a COVID-19 symptom self-assessment, and information about their quarantine plans.

As the pandemic evolved, collection expanded to include COVID-19 test results and proof of vaccination information.

Each time the government has added new collection activities, PHAC has engaged my office in consultation.

In keeping with the principle of minimizing data collection, we have stressed to PHAC the importance of only collecting the personal information that it needs in order to ensure compliance with its border restrictions, and using it only for that purpose.

Individuals’ reasonable expectations of privacy may be lower in a public health crisis, but they would not reasonably expect that sensitive information, such as their health status or the places and persons they have visited, would be made available for other government or commercial purposes.

We further recommended that PHAC and its partners put measures in place to prevent the over-collection of personal information from travellers.

PHAC responded positively and addressed this concern by limiting open-text boxes and eliminating unnecessary sections of the app.

As we move towards a post-pandemic state, we understand the CBSA is exploring future uses for the ArriveCAN app, for example, streamlining other mandatory declarations, such as customs and immigration information.

Simultaneously, the CBSA is exploring additional means to leverage mobile technologies, such as digital travel credentials for traveller identification purposes.

My office will continue to engage with the government on its digital travel tools.

Law reform and the way forward

If we are to adequately protect Canadians and ensure trust in the digital economy and federal institutions, we must have laws fit for the 21st century.

In mandate letters issued in December, the Prime Minister asked his Minister of Innovation, Science and Industry to deliver on legislation to “strengthen privacy protections for consumers” and his Minister of Justice to continue the “substantive review of the Privacy Act” in order to “keep pace with the effects of both technological change and evolving Canadian values.”

I am hopeful these much-needed reforms will soon come to pass.

We need federal privacy legislation that enables responsible innovation, but within a rights-based framework that recognizes the fundamental right to privacy.

We also need common principles in both federal laws.

For instance, necessity and proportionality have to be key factors in the decision to collect, use and retain information.

This means organizations should only pursue privacy-invasive activities and programs where it is demonstrated that they are necessary to achieve a pressing and substantial purpose and where the intrusion is proportional to the benefits to be gained.

This is especially important given government’s increasing reliance on privately developed technology to carry out its business.

Conclusion

There’s no question the digital world creates new opportunities and new challenges for law enforcement, national security and intelligence agencies.

If used inappropriately, technologies like facial recognition can have lasting and severe effects on privacy and other fundamental rights.

When we speak of harms, we’re not just talking about harms to individuals, but also the general societal harm that flows from authorities’ increased ability to monitor the physical and virtual spaces in which we interact.

The question of where acceptable use of these technologies begins and ends is in part a question of our values; of the expectations we set now for the future protection of privacy in the face of an ever-increasing ability to invade it.

When organizations set out to adopt the new technologies at their disposal, they need to remember to design for privacy.

That means integrating privacy protections into the proposed initiatives before using them; conducting privacy impact assessments to ensure programs comply with the law; and monitoring and assessing risks and the effectiveness of protections on an ongoing basis.

I also recognize existing laws are not up to par. The incorporation of a rights-based framework in our privacy laws would help support responsible innovation and foster trust in government, giving people confidence to fully participate in the digital age.

As a society, we must project our values into the laws that regulate the digital space. Our citizens expect nothing less from their public institutions.

Only through respect for the law and the values we cherish will we be able to safely enjoy the benefits of new technologies, while preserving the freedoms and rights that we proudly count on as Canadians.

Thank you. And now I believe there is time for questions.
