From state surveillance to surveillance capitalism: The evolution of privacy and the case for law reform

Remarks at 2021 Centre for Information and Privacy Law (CIPL) Learning Day hosted by Department of Justice Canada

June 16, 2021, by video conference
Ottawa, Ontario

Speech delivered by Daniel Therrien
Privacy Commissioner of Canada

(Check against delivery)


Introduction

Thank you for the invitation to speak about my experiences as Privacy Commissioner.

It is good to be back at the Department of Justice – even if only virtually. My career at the DOJ prepared me well for my current job. Indeed, there are many similarities between the roles of our two organizations.

I do not need to quote the Department of Justice Act to you—I have no doubt it is your nightly bedtime reading. You all know that your role is to “see that the administration of public affairs is in accordance with the law.” What a great concept, “to see to something.” It means to take special care of something or someone, in order to protect them; to pay considerable attention to them; to actively look after them.

At the Office of the Commissioner, our role is to see that the activities of government institutions and commercial organizations comply with our privacy legislation in the public and private sectors. You may be surprised to learn that this role does not come with order-making powers, but we still try to ensure that privacy rights are respected. By investigating. By conducting research. By providing advice and issuing guidelines. And, increasingly, by bringing cases before the courts so that the law will evolve. We are trying to be ever more proactive in the way we exercise our oversight functions.

Given this similarity in our roles, it is no surprise, then, that many of our colleagues have moved so seamlessly between our two organizations. It’s hard to tell through video conference, but I imagine there are some familiar faces among you.

In my remarks today, I would like to talk about the evolution of privacy since becoming Commissioner in 2014 – from state surveillance, to corporate surveillance, to the emergence of public-private partnerships, which are raising new privacy issues.

I will also discuss the need for new laws and governance models to leverage information and innovate responsibly while protecting citizens’ rights and values. These are key issues as both the private and public sectors navigate this digital revolution.


Evolution of privacy since start of mandate in 2014

The landscape has shifted significantly since I took office.

In 2014, the talk was all about the Snowden affair and the state surveillance tactics that had emerged following the attacks of 9/11. In Canada, after the October 2014 incidents in and around Parliament, new anti-terrorism legislation was debated, then passed in 2015, and ultimately amended in 2019.

Today, the privacy conversation is dominated by the growing power of tech giants like Facebook and Google, which seem to know more about us than we know about ourselves.

“Privacy is dead,” was the familiar refrain when I became Commissioner.

Privacy still needs our oversight, but today many, including some in the business community, are championing its essential nature. The best known among them is probably Tim Cook, CEO of Apple, who recognizes privacy as a fundamental right and who is concerned about the existence of a “data industrial complex” in which our own information “is being weaponized against us with military efficiency.”

Closer to home, Research In Motion co-founder Jim Balsillie, in a recent National Post op-ed criticizing Bill C-11 and demanding more effective measures, wrote that “Bill C-11 will do little to curb the abuses of the surveillance economy or protect the rights of Canadians.”

Consent and control were the golden rules of privacy protection seven years ago. Today, the governance structure is uncertain as we try to keep up with a rapidly evolving digital economy that has an insatiable appetite for data.

The end of 2020 marked a milestone moment for privacy in Canada. As you know, in quick succession, the government unveiled Bill C-11 and your department issued a comprehensive public consultation laying out a plan for the modernization of Canada’s 40-year-old public sector law.

These developments followed years of persistent calls for action from our office, as well as stakeholders from across industry and civil society.

The proposals hit the mark on some fronts, but not others. I will get into the details of that in a moment but I first want to talk a little bit more about privacy developments on the state surveillance front.  

Deeper dive

1. State surveillance

As you know, after 9/11, Canada and its allies enacted many laws and initiatives in the name of national security. Some of these laws went too far.

During my mandate, we helped push for the scaling back of some of the most sweeping state powers contained in Bills C-51 (the Anti-terrorism Act, 2015) and C-13 (on cybercrime).

During these debates, the need for independent oversight became apparent. From the Arar inquiry, to Snowden, to cases involving metadata collection by the CSE and CSIS, we were reminded that clear safeguards are needed to protect rights and prevent abuse.

Bill C-59 sought to address some of the problems with C-51 and saw the creation of a new expert national security oversight body, the National Security and Intelligence Review Agency (NSIRA). Bill C-22 introduced the National Security and Intelligence Committee of Parliamentarians.

While Bill C-59 was not perfect, after much discussion with my office and many others, amendments were introduced to ensure more appropriate thresholds were in place before personal information could be shared with national security agencies.

During the current COVID-19 pandemic, another period of fear and uncertainty, we are not seeing the same rush to introduce emergency measures with impacts on rights. Discussions around contact tracing apps and vaccine passports, for example, have very much included privacy considerations. 

This is not to say that all is well.

Encryption debates, such as the widely publicized dispute between Apple and the FBI over access to the locked cellphone of a mass shooter in California, continue. Including, from time to time, here in Canada.

We are also seeing other forms of state surveillance. Take, for instance, our Statistics Canada investigation, which found the agency had started collecting detailed credit information, and was proposing to collect line-by-line financial transaction data about millions of Canadians from private sector companies.

This case made clear the need to bring our public sector Privacy Act up to date and make it fit for purpose in the 21st century, notably by enacting a necessity and proportionality standard for the collection of personal information.

I will note that while the ETHI committee recommended the Privacy Act be amended to incorporate the principles of necessity and proportionality, the DOJ’s consultation document opted instead for a “reasonably required” standard. This “made-in-Canada” concept makes us a little apprehensive, but in the end, regardless of the language used, what matters is ensuring that, in practice, this new standard is defined as essentially equivalent to the internationally recognized principles of necessity and proportionality.

2. Surveillance capitalism

While concerns about state surveillance have somewhat receded over the past few years, the threat of surveillance capitalism has taken centre stage.

Personal data has emerged as the world’s dominant currency and no one has leveraged it better than the tech giants behind our web searches and social media accounts.

The risks of surveillance capitalism were on full display in the Facebook/Cambridge Analytica scandal, in which Facebook allowed app developers to access user data without users’ clear and informed consent, information that was later used by third parties in the U.S. and the UK for targeted political messaging. The UK imposed a £500,000 fine, while the U.S. issued an unprecedented US$5 billion penalty to sanction Facebook for its defective consent practices. In Canada, Facebook faced no financial consequences.

Facebook actually rejected our investigation’s non-binding findings and recommendations. We launched proceedings in Federal Court to seek an order to force the company to correct its privacy practices, which we characterized as “an empty shell”.

The newest frontier of surveillance capitalism is artificial intelligence.

Digital technologies like AI that rely on the collection and analysis of personal data are at the heart of the fourth industrial revolution and are key to our socio-economic development. However, they pose major risks to privacy and to other rights and values, such as equality and democracy.

To draw value from data, the law should accommodate new, unforeseen but responsible uses of information for the public good. But, given how frequently human rights have been shown to be violated, this additional flexibility should come within a rights-based framework. I will come back to this later.

3. Corporate know-how to assist state functions

Another trend seen during my mandate is the use of corporate expertise to assist state functions.

A prime example is our recent investigations of Clearview AI and the RCMP. We concluded Clearview AI violated Canada’s federal private sector privacy law by creating a databank of more than three billion images scraped from internet websites without users’ consent. Clearview users, such as the RCMP, could match photographs of people against the photographs in the databank.

The result was that billions of people essentially found themselves in a police line-up. We concluded this represented mass surveillance and was a clear violation of privacy.

As for the RCMP, we hold the view that a government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully. Although the RCMP, and perhaps its lawyers, disagree, we maintain the onus was on the RCMP to ensure the database it was using had been compiled legally. Otherwise, the RCMP could turn a blind eye to dubious commercial practices even when exercising the state’s coercive powers.

4. Information sharing and public-private partnerships

In its Data Strategy Roadmap for the Federal Public Service, the government called for increased data sharing and reliance on commercial services in order to “make better decisions, design better programs and deliver more effective services.”

The discussion paper on modernizing the Privacy Act proposed new authorities for data integration within government.

Information sharing between government departments can be difficult, as the Roadmap suggests, due to informal practices and legal limitations. Bear in mind, however, that what some see as a legal barrier, others may see as a privacy safeguard.

Many of the perceived legal barriers are found in sections 4 to 8 of the current Privacy Act. Should these rules be re-examined with an eye to improved government services in a digital age? Yes. Should some of these rules be amended? Probably.

What’s important is that while adjustments may be desirable, any changes to facilitate digital government services must respect privacy rights.

Integrating commercial tools into government service delivery raises many more issues. For instance, if a citizen relies on Alexa to request a public service, who owns the data? More importantly, who is responsible for protecting it?

The current pandemic has also underscored the risks of public-private partnerships that rely on technological innovation. Heated debates about contact tracing applications and vaccine passports and their impact on privacy have taken place around the world.        

Videoconferencing services and online platforms are allowing us to socialize, work, go to school and even see a doctor remotely but they also raise new privacy risks. Telemedicine creates risks to doctor-patient confidentiality. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.

As I have said before, and as the DOJ consultation paper on Privacy Act reform acknowledges, public-private partnerships highlight the need for interoperability between our two privacy laws to prevent gaps in accountability where the sectors interact.

At a minimum, this should include common or similar privacy principles enshrined in our public- and private-sector laws.

New laws and governance models

1. The limitations of the consent model

Which brings us to my point about the need for new laws and governance models.

The evolution of the past few years has brought us to the following juncture. Digital technologies have shown that they can serve the public interest, if properly designed. But they can also disrupt, and have disrupted, rights and values that have been centuries in the making.

How, then, can we draw value from data while protecting our rights and values?

Consent and control have been the hallmark of our existing privacy regime, albeit to a lesser extent on the public sector side.

While there is still a role for consent, it is neither realistic nor reasonable to ask individuals to consent to all possible uses of their data in today’s complex information environment. The power dynamic between consumers and companies, between citizens and government, is too uneven.

In fact, relying solely on consent can serve to legitimize uses that, objectively, are completely unreasonable and contrary to our rights and values. And refusal to provide consent can sometimes be a disservice to the public interest.

So, what are the alternatives?

Quebec’s Bill 64 sets out some exceptions to consent, for instance for the use of de-identified information in research, or when personal information is used for secondary purposes consistent with those for which it was collected. The DOJ consultation paper proposes similar flexibility to use or disclose de-identified personal information without consent, including when it is in the public interest.

Other data protection models take into account the limits of consent and try through other means to serve the public interest and protect privacy simultaneously.

For example, in Europe, data can be used when it is necessary for the performance of a task carried out in the public interest or for the purpose of the legitimate interests pursued by a business or public entity, while respecting basic rights.

What matters is that the law allow for uses of personal information in the public interest, in the pursuit of legitimate purposes or for the common good, within a rights-based regime.

Bill C-11 rightly introduces certain exceptions to consent, giving businesses greater flexibility in the processing of personal information. Unfortunately, some of the exceptions are too broad or ill-defined to foster responsible innovation.

2. Accountability

Another potential solution is to focus on corporate accountability. Commercial organizations have long argued they can adequately protect privacy by being accountable.

Canada accepted this notion early on, as one of the first countries to give legal effect to the principle of accountability, a principle that holds certain advantages. For one thing, companies and government institutions know their business best. Furthermore, regulators cannot possibly pre-approve every initiative that might impact privacy.

I agree accountability is important. However, it has shown its limits. As we have so clearly seen in the Facebook, Equifax and other cases, the principle as currently framed is not sufficient to protect Canadians from companies that claim to be accountable, but actually are not.

No matter how socially responsible a business might be, the bottom line is the bottom line. Just last spring we saw the CEO of the multinational company Danone ousted after seven years. The accolades he had received for his company’s responsibility in environmental matters didn’t stand a chance against the profit drop that resulted from his responsible behaviour.

We cannot rely only on corporate accountability to protect consumer privacy. We need a law that requires businesses to be transparent and to demonstrate accountability to the regulating authority.

Unfortunately, Bill C-11 weakens existing accountability provisions, essentially allowing businesses to self-regulate. Instead, we should define accountability in a normative way, as the adoption of policies and procedures designed to comply with the law.

My office should also have the authority to proactively inspect, audit or investigate business practices to verify compliance with the law.

While the public sector may not be motivated by profit, it too has self-interests that require independent oversight. I therefore welcome proposed reforms for proactive audits and other strengthened accountability mechanisms to help federal public bodies demonstrate how they are accountable.

3. Ethics

There are vocal advocates, particularly in the private sector, of the role of ethics in privacy protection. This school of thought sees ethics as a bridge between accountability and the law. Where policies and procedures fall short and no law exists, organizations should be governed by ethical principles.

In the public sector, the government is heavily invested in AI technologies as a way to better serve Canadians, but maintains it is pursuing such initiatives within a strong ethical framework.

As a construct, however, ethics can be quite subjective.

Organizations may make use of ethical principles, but we must be wary of ethics washing. If we are to take ethical approaches seriously, they must be more than window dressing.

4. Law and an independent regulator: from self-regulation to the democratic adoption of laws in the public interest

After years of self-regulation and ineffectual consent, I believe the way forward lies in principles-based, objective and knowable standards, adopted democratically and enforced by democratically appointed institutions.

The government’s C-11 narrative focused on the need for “certainty” and “flexibility” for businesses and “checks and balances” on the regulator. It maintained that privacy and commercial interests are competing interests that must be balanced.

Among those who advocate for law reform, few ask for more flexibility for companies and more checks and balances on the regulator. In fact, citizens and consumers want protection through better laws, enforced by an independent regulator who can act for them in an extremely complex digital economy.

What we need is sensible legislation that allows responsible innovation that serves the public interest and is likely to foster trust, but that prohibits using technology in ways that are incompatible with our rights and values.

Conclusion

Which brings us back to the beginning: the law and values.

The legislative models I’ve discussed are not necessarily mutually exclusive. We want the OPC to have a regulator role, but not to the exclusion of other models. Sometimes consent and accountability do work. The OPC as an independent regulator is part of the solution, but not necessarily the whole solution.

As for values, privacy is nothing less than a prerequisite for freedom. The freedom to live and develop independently as persons, away from the watchful eye of a surveillance state or commercial enterprises, while still participating voluntarily and actively in the regular, and increasingly digital, day-to-day activities of a modern society, such as socializing, getting informed or buying goods.

We are in the midst of a fourth industrial revolution. Disruptive technologies are being adopted at a staggering pace. New technologies can provide important benefits but they also present huge challenges to the legal and social norms that protect the fundamental values of Canadians.

Only through respect for the law and the values we cherish will Canadians be able to safely enjoy the benefits of these technologies. Unfortunately, bad actors have eroded our trust and without legislative change, trust will continue to erode.

Rights-based statutes would serve to support responsible innovation by renewing confidence in both government and commercial activities.

I will end on this note. These past few years have opened our eyes to the exciting advantages and alarming risks that new technologies present for our values and our rights. The issues facing us are complex, but the path forward is clear. As a society, we must project our values into the laws governing the digital sphere. Canadians expect nothing less from their public institutions. Only then will confidence in the digital economy and in digital government—eroded by so many scandals—be restored.