Strengthening National Security and Privacy in the Digital Era

Remarks at the CIGI National Security Strategy Event Series

October 14, 2021, by video conference

Speech delivered by Daniel Therrien
Privacy Commissioner of Canada

(Check against delivery)


Introduction

Thank you for the invitation to speak today. I appreciate the opportunity to participate in the ongoing Centre for International Governance Innovation (CIGI) project to rethink national security strategy for the twenty-first century.

CIGI founder Jim Balsillie has become an important thought leader in the debate around privacy and security, for the innovation sector and beyond. I agree with his observation that, “Data governance is the most important public policy issue of our time.”

One can hardly dispute that data is becoming increasingly important in the economy, and we are witnessing the rapid development of data-driven technologies.

Where the first industrial revolution was powered by steam, the fourth is driven by data.

More and more companies are seeking to monetize our personal data, while governments are looking at various initiatives to promote data-driven decision-making, greater data sharing, and improved access to data.

Technology and data can – and do, in many cases – serve the public good. The pandemic offers many examples of that: the use of data to support public health, or the platforms we have all been using to stay connected.

But the growing importance of data also comes with potentially serious new risks to our privacy. We must urgently consider how to derive benefit from data while respecting our democratic values and fundamental rights.

This afternoon, I would like to address two related themes with you: the first is the need to balance emergency measures with privacy protection. The second is the need to embed Canadian rights and values into updated privacy legislation in order to protect individuals, give government the tools it needs to carry out its work, and promote responsible business innovation.

No matter what role brought you here – whether your interest is in business, government or policy – the issue of protecting personal information is one that should matter to all of us.

In fact, if there is one idea you need to leave with today it is that privacy protection does not stifle economic innovation, nor does it stand in the way of Canada’s national security interests. Indeed, good privacy laws can support both.

Current privacy landscape

The privacy landscape has changed dramatically since I was appointed to this position a little over seven years ago. In 2014, senior executives of the tech industry openly talked about privacy being “dead.”

Following 9-11, it seemed like some were willing to cede their right to privacy to the state in the name of national security. Even when some information-gathering laws went too far, the need to protect against further terrorist attacks was seen to outweigh personal privacy interests.

That view started to change with the Snowden revelations.

In 2021, it is clear that privacy is very much a live issue. Indeed, our polling shows it preoccupies the vast majority of Canadians, who are very concerned about their inability to protect it, particularly from the growing power of companies like Facebook and Google, which seem to know more about us than we know about ourselves.

No one has monetized personal data better than the tech giants behind our web searches and social media accounts.

But personal information is also the lifeblood of national security and intelligence agencies. The collection, use, and disclosure of personal information have always been integral to their work.

Our interest is to ensure that, while such agencies have the tools they need for their critical work, they use them in a manner that complies with the law and respects privacy rights.

Balancing emergency measures and privacy protection

As we saw after 9-11, when faced with a security crisis, decision-makers tend to take strong actions without necessarily giving rights due consideration.

So how do you define an emergency to figure out what is the reasonable response to it? And how do you determine when the crisis is over and whether it is safe to lift some of those strong actions?

It bears noting that national security issues are not confined to terrorism. The 9-11 terror attacks took the lives of nearly 3,000 people. The pandemic has claimed more than 4.8 million lives worldwide, nearly 7,500 per day. This too is a threat to national security that has resulted in strong actions that include using new technologies in ways that encroach on civil liberties. 

When the COVID-19 pandemic emerged, some felt privacy should be set aside in order to protect lives, just as happened after 9-11. This was based on the false assumption that we either protect public health or protect privacy – we couldn’t do both.

I disagree. Privacy and innovation brought about by new technology are not conflicting values. This is not a zero-sum game. In fact, trust in innovation depends on the protection of rights. Without privacy, there is no trust and there is no sustainable innovation.

My office has recognized that the health crisis calls for a flexible and contextual application of privacy laws. But because privacy is a fundamental right, key principles must continue to operate, even if some of the more detailed requirements are not applied as strictly as in normal times.

Last spring, my office, along with our provincial and territorial counterparts, issued a joint statement calling on governments to adhere to several key privacy principles as they developed vaccine passports.

Above all, we stressed the need to establish the necessity, effectiveness and proportionality of vaccine passports. More specifically, we said vaccine passports must be necessary to achieve their intended public health purpose; they must be effective; and the privacy risks associated must be proportionate to the purposes they are intended to address.

Vaccine passports are only one of many personal information protection issues to arise during the pandemic. Contact tracing applications, temperature checks and vaccine mandates have all raised important questions.

Meanwhile, for much of the pandemic, daily life has unfolded online – socializing, going to school and seeing the doctor took place through online platforms, raising new risks for our privacy. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities or behavioural issues.

As the pandemic has accelerated digitization, it has also highlighted longstanding, significant gaps in Canada’s legislative framework.

We have seen that basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.

For instance, the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it require app developers to consider Privacy by Design or the principles of necessity and proportionality.

It is deeply troubling to watch the pandemic drive rapid societal and economic transformation in a context where our laws fail to provide effective protection for Canadians’ privacy rights.

Rights and values

Trust

For all the talk about the fourth industrial revolution, technological advances and vulnerabilities surrounding the collection and use of personal information, I must emphasize that technology itself is neither good nor bad. New technologies can, of course, bring great benefits but there are also risks.

One of the primary purposes of regulation is to create an environment where individuals feel confident that their rights will be respected.

Unfortunately, the failure to ensure our laws keep up with the times has led to a crisis of trust in Canada, particularly when it comes to online activities.

Polling tells us Canadians are very concerned about their privacy. A scan of the headlines on just about any day of the week explains why.

This month, a whistleblower accused Facebook of prioritizing profits over fixing algorithms that spread hate and misinformation. A few years ago, the Facebook/Cambridge Analytica scandal revealed the potential impact of online platforms on privacy and the integrity of elections.

Meanwhile, ever-bigger data breaches have become shockingly regular events. We see breaches affecting hundreds of thousands, even millions, and in some cases hundreds of millions of people. My office investigated the Desjardins breach that compromised the personal information of almost 10 million Canadians.

Other recent investigations by my office, meanwhile, have raised the spectre of mass surveillance in Canada.

I am referring to our related investigations into technology company Clearview AI and the RCMP.

Clearview scraped billions of images of people from across the Internet and then allowed law enforcement to match photos against the company’s massive databank. We found this collection of highly sensitive biometric information without the knowledge or consent of individuals to be illegal. Putting millions of people – the vast majority of whom will never be implicated in a crime – into a 24-7 police lineup is clearly unacceptable.

Our investigation into the RCMP found the police agency’s use of facial recognition technology to conduct hundreds of searches of a database compiled illegally by a commercial enterprise to be a violation of the Privacy Act.

Our investigation also revealed serious and systemic gaps in the RCMP’s policies and systems to track, identify, assess and control novel collections of personal information through new technologies.

As a result of our investigation, the RCMP is establishing a National Technologies Onboarding Program which will serve to identify and assess the legal and privacy implications of new investigative technologies. This is an encouraging step; one that I hope will lead to change.

Ultimately, however, real progress in building trust in the digital environment will only come when we adopt robust privacy laws.

Canadians want to enjoy the benefits of digital technologies. It is the role of government, and Parliament, to give them the assurance that legislation will protect their rights.

Strong privacy laws build trust in our systems and institutions; they also build resilience in the face of security threats.

Some may see privacy protections as simply a constraint on our activities, but from another perspective, they are an active defence. Not having effective privacy laws creates vulnerability – what the military would call “targets of opportunity.”

You may remember a breach of athletes’ information at the World Anti-Doping Agency, or WADA, a few years ago.

The incident involved the disclosure of information about certain athletes who had competed in the 2016 Rio Olympic Games. Subsequently, Canada and its allies attributed the hack to Russian military intelligence.

At the time of the breach, Montreal-based WADA had certain safeguards in place, but an investigation by my office found they fell well below the level expected of an organization responsible for protecting highly sensitive medical information. WADA’s status as a potential “high-value” target for sophisticated state-sponsored hackers also demanded an adequately robust safeguards framework.

Threats can come from anywhere: security and intelligence services of hostile states, ransomware hackers, organized crime syndicates, or other disruptors. They will target government installations, major oil companies – even hotels and dating websites. It doesn’t take sophisticated adversaries long to link data chains together, to gather enough sensitive information to cause harm.

Rolling back post 9-11 excesses

I alluded earlier to the fact that laws passed in the wake of 9-11 allowed a greater incursion into privacy. Bill C-51, the Anti-terrorism Act, 2015, was one example of an anti-terrorism measure that caused concerns because it went too far.

Bill C-59, passed in 2019, rolled back some of C-51’s excesses, after healthy democratic discussions in which my office played a part. It also created oversight bodies – the National Security and Intelligence Review Agency (NSIRA) and the Office of the Intelligence Commissioner. Bill C-22, meanwhile, introduced the National Security and Intelligence Committee of Parliamentarians.

My office’s expertise is complementary to that of those oversight bodies. While they are experts in national security law, we are experts in privacy law. The combination of our respective expertise adds value and depth to the reviews we undertake together. We regularly engage with NSIRA, for example, under an MOU whose terms and goals were finalized and made public a few weeks ago. We want to be transparent with the public and national security agencies on the terms of our collaboration.

We have recently seen a welcome rise in requests for consultations on national security and public safety issues. We continue to advise the government on these matters, for example in relation to the Passenger Protect Program.

Still, concerns will continue to arise.

Take encryption, for example. It is a policy area that cuts across traditional conceptions of privacy and national security. It directly affects commercial and governmental sectors and their day-to-day operations.

Some say better encryption tools provide individuals more protection, while others argue encrypted communications make us less safe because they embolden criminals, cost more and unnecessarily complicate or slow down analysis. Law enforcement and national security agencies have shown they would like to find ways around it.

We know basic encryption would have greatly benefited the privacy of those affected by the numerous data breaches my office has investigated over the years. And from the standpoint of classic “information security,” anyone who has worked a day in a “classified” environment will be very familiar with the added safeguards that encryption provides for protected information.

The widely publicized battle between Apple and the FBI over investigators’ access to the locked cellphone of a mass shooter in California brought difficult issues of privacy, governmental access to data and encryption to the forefront.

After considerable legal maneuvering, in the end, the FBI abandoned its attempts in court to compel Apple to alter the security features on its systems and products. Investigators with the FBI instead contracted with a separate software firm to circumvent the phone’s protections. 

Unfortunately, what that instance demonstrates fairly clearly is that there remains no known way to give law enforcement systemic access without simultaneously creating a significant risk for the population at large.

We have recommended Parliament proceed cautiously before attempting to legislate solutions. It would be preferable to explore technical solutions which might support discrete, lawfully authorized access to specific encrypted devices – as the FBI successfully did in the San Bernardino case – as opposed to imposing general legislative requirements.

All this to say, before legislators hasten to legislate limits on proven data protection safeguards such as encryption, I believe we need an open discussion of where the specific problems are and careful study of what measures for investigators already exist within the current law.

Projecting our values into laws

Canada has just elected a new Parliament. Among the central tasks for legislators returning to Ottawa will be to adopt modern, effective laws that enable responsible innovation and economic growth while protecting the rights and values of Canadians.

We need a rights-based framework that recognizes the fundamental right to privacy.

Last year we saw some movement on both the public and private sector fronts. The federal government circulated a consultation paper proposing updates for the Privacy Act, which applies to the federal public sector. My office is optimistic about the direction it takes.

We were not able to say the same about Bill C-11, the proposal to update the Personal Information Protection and Electronic Documents Act. The Bill died on the Order Paper when the election was called, and I hope the new Parliament will present an opportunity to bring forward significant improvements.

Parliament should follow the lead of Ontario and Quebec, which brought forward proposals towards responsible digital innovation within a legal framework that recognizes privacy as a fundamental human right. Quebec recently passed Bill 64 and Ontario is in the midst of serious consultations about its own law.

Canada’s privacy regime had already fallen behind the privacy laws of many of its global trading partners, and now it is falling behind domestically, as the provinces step in to fill gaps.

This is not an abstract problem. The Council of Canadian Innovators considers a patchwork quilt approach to privacy in Canada a “recipe for disaster” and the “worst possible outcome” for innovation. It is normal in a federation to have different laws from one province to another, but there should be an effort to harmonize.

Along with the need for a rights-based framework, there is a need for common principles in both federal laws. For instance, necessity and proportionality have to be key factors in the decision to collect, use and retain information. This is especially important given government’s increasing reliance on privately developed technology to carry out its business.

Necessity and proportionality principles mean organizations should only pursue privacy-invasive activities and programs where it is demonstrated that they are necessary to achieve a pressing and substantial purpose and where the intrusion is proportional to the benefits to be gained. Think of facial recognition technology – it is inherently invasive but the incursion on privacy may be warranted if, for instance, using the technology could help rescue a specific missing child.

That does not mean that it must never be used – it means that any time it is used, the need for it must be established. The specific benefit, one that is demonstrably accrued by the action itself, must be proportional to the harm that is done.

And once a decision to proceed is made, other privacy principles come into play, such as limiting collection, safeguarding the data, and rules governing retention and disposal.

Another key factor is interoperability. For both national security and business purposes, we need to be able to send and receive information across borders.

The European Union’s General Data Protection Regulation, or GDPR, sets high standards for protecting personal information. The EU will not freely share data with governments or companies whose own regulatory regimes don’t adequately protect that information. Canada currently meets the adequacy requirement, but our status is reviewed every four years and we are not guaranteed a pass.

As I’ve already said, domestic interoperability is also becoming a concern, as Canada’s largest provinces pass their own private-sector privacy laws – which in some cases are better than the federal one.

Conclusion

Canada aspires to regain its status as a global leader in privacy.

We believe that Canadians deserve federal privacy laws based on rights. The incorporation of a rights-based framework in our privacy laws would help support responsible innovation and foster trust in government, giving people confidence to fully participate in the digital age.

Adopting a rights-based approach would situate Canada as a leader on this file, showing the way by defining privacy laws that support the desired outcomes: protecting privacy, encouraging business development and innovation, and denying hostile states and other bad actors the targets of opportunity that can create national security threats.

As a society, we must project our values into the laws that regulate the digital space. Our citizens expect nothing less from their public institutions.

Strong privacy protection makes for a strong, resilient and secure Canada. We should all work toward that goal.