Strategic Privacy Priorities and the themes and observations that emerged: 2015-2022

Introduction

In 2015, following significant public consultations, the Office of the Privacy Commissioner of Canada (OPC) identified four strategic priorities:

  • the economics of personal information;
  • government surveillance;
  • reputation and privacy; and
  • the body as information.

The priorities selected reflect the values and concerns of Canadians, as well as the views of numerous stakeholders in civil society and consumer advocacy groups, industry, the legal community, academia and government.

The OPC’s aim was to hone its focus in order to make the best use of its limited resources, better inform Parliamentarians, organizations and the public about the issues at stake, influence behaviour, and use the office’s regulatory powers most effectively.

The OPC sought to achieve its goals through a variety of strategic approaches that included public education, better addressing the privacy needs of vulnerable groups, and protecting Canadians’ privacy in a borderless world.

The strategic priorities have guided and helped focus the OPC’s work during Commissioner Daniel Therrien’s term, as he sought to restore Canadians’ trust in government and the digital economy.

Priorities: setting goals, exploring themes and reporting outcomes

For each strategic priority, the OPC began with a stated goal. What emerged in subsequent policy work, stakeholder interactions and investigations was a series of important themes.

It is these themes that, combined, have led to the conclusion that effective privacy protection demands immediate rights-based law reform. Below is an exploration of the OPC’s original goals, the themes that emerged and the key initiatives that support its position.

The economics of personal information

Initial goal: To enhance the privacy protection and trust of individuals so that they may confidently participate in the digital economy.

Emerging theme: Technology and new business models that rely on complex data practices are challenging the current consent model and raising questions about the relationship between privacy and other fundamental rights.

Following its publication in 2016 of a discussion paper on how to improve the consent model, the OPC held an extensive consultation with stakeholders. It heard from, and agreed with, many who argued that the increasingly complex digital environment poses challenges for the protection of privacy and the role of consent.

There was recognition that, while consent can be meaningfully given in some situations with better information, there are other circumstances where consent may be impracticable. This may be so, for instance, in some uses of big data or artificial intelligence, where it is no longer entirely clear to consumers who is processing their information and for what purposes. In fact, at times consent can be used to legitimize uses that, objectively, are completely unreasonable.

The OPC sought to address these challenges through various means discussed in its 2017 consent report, notably by clarifying in its Guidelines for obtaining meaningful consent the key elements to be conveyed to consumers to ensure consent is meaningful. However, where consent is not practicable, the OPC put forward the idea that alternatives to consent may need to be considered to maintain effective privacy protections.

At the same time, the OPC published guidance on “no-go zones” for the collection, use and disclosure of personal information. The guidance outlines practices that would be considered “inappropriate” by a reasonable person, even with consent, and therefore contrary to subsection 5(3) of Canada’s federal private sector privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA). Among the practices identified as inappropriate are those that involve profiling or categorization leading to unfair, unethical or discriminatory treatment, contrary to human rights law.

Some time later, a joint investigation into the Facebook/Cambridge Analytica scandal starkly illustrated how Canada had reached a critical tipping point and that privacy rights and democratic values were at stake.

That case underscored how privacy rights and data protection are not just a set of technical or procedural rules, settings, controls and administrative safeguards. Instead, privacy is a fundamental right and a necessary precondition for the exercise of other fundamental rights, including freedom, equality and democracy.

Most recently, global developments in machine learning and the increasingly widespread use of artificial intelligence have led to the parallel realization that privacy and equality (non-discrimination) are also vitally linked.

Such examples informed the OPC’s views on law reform, as set out in its 2018-2019 Annual Report to Parliament, which described the urgent need for a rights-based law framework, and its subsequent submission to Parliament on Bill C-11, the Digital Charter Implementation Act, 2020.

In this submission, the OPC agreed with the intention behind Bill C-11 to give organizations greater flexibility to use personal information, even without consent, for legitimate commercial and socially beneficial purposes. But it maintained this should be done within a rights-based framework that recognizes privacy as a human right and as a precondition to the exercise of other fundamental rights. This, it argued, would promote responsible innovation.

While the government’s bill died on the order paper when the election was called in 2021, the government said it would table new legislation in 2022.

The OPC is not alone in this position, which is held by many international partners. In 2019, for instance, the OPC sponsored a resolution, adopted by the Global Privacy Assembly, a forum which brings together data regulators from around the world, to recognize privacy as a fundamental human right and vital to the protection of other democratic rights. The resolution called on governments to reaffirm a strong commitment to privacy as a human right and to review and update privacy and data protection laws.

Other OPC work:

  • Privacy breaches (or personal data leaks) remain a perennial problem that continually threatens consumer confidence in the digital economy. The OPC has investigated a number of significant breaches that underscore deficiencies in the security safeguards adopted by organizations:
    • An investigation into the Fédération des caisses Desjardins du Québec following a breach of security safeguards that ultimately affected close to 9.7 million individuals in Canada and abroad;
    • An investigation into a breach at Equifax, which occurred when hackers gained access to the credit reporting agency’s systems through a security vulnerability the company had known about for more than two months but had not fixed; and
    • An investigation into a breach of the World Anti-Doping Agency’s database that resulted in the disclosure of sensitive health and location information about more than 100 athletes who had competed in the 2016 Rio Olympic Games.

Government surveillance

Initial goal: To contribute to the adoption and implementation of laws and other measures that demonstrably protect both national security and privacy.

Emerging theme: Public safety and national security institutions require effective oversight, and stronger legal thresholds are needed to ensure their activities are not just lawful, but also necessary and proportional to the outcomes they seek to achieve.

After 9/11, Canada and its allies enacted many laws and initiatives that broadened the authorities for government data collection in the name of national security, some of which adversely affected privacy. The OPC was called upon to comment on a number of these issues. In general, its advice stressed the importance of strict standards to limit the sharing of personal information to what is necessary for public safety purposes, and the importance of oversight to ensure the activities of law enforcement and national security agencies are conducted lawfully.

By and large, the recommendations were reflected in the bills that were passed, notably Bill C-51, the Anti-Terrorism Act, 2015 and Bill C-13, on cyber-criminality.

On the former, the OPC recognized the important work of national security agencies but stressed that collection thresholds should be sufficiently high to protect the privacy of law-abiding citizens.

The investigation into the Canada Border Services Agency’s examination of digital devices was a good example of the OPC’s work to improve thresholds. The investigation raised a number of concerns with the CBSA’s practices and argued the Customs Act should be updated to recognize that digital devices contain sensitive personal information and are not mere “goods” that should be subject to border searches without legal grounds. This outdated notion does not reflect the realities of modern technology.

The OPC called for a clear legal framework for the examination of digital devices and the threshold for examinations of digital devices to be elevated to “reasonable grounds to suspect” a legal contravention. As early as 2017 in an appearance on border privacy, the OPC expressed the view that Canadian courts would find groundless searches of electronic devices to be unconstitutional, even at the border.

In 2020, the Alberta Court of Appeal ruled the Customs Act provisions that permitted warrantless searches of devices by CBSA officers were in fact unconstitutional and suspended its declaration of invalidity to give the government time to amend the legislation. On March 31, 2022, the government tabled Bill S-7 in response to the ruling.

The importance of robust collection thresholds was also highlighted in the OPC’s 2016 submissions on Privacy Act reform. Given the relative ease with which government institutions can collect personal information with today’s digital technologies, the Office recommended that the collection of personal information be subject to the international standard of necessity and proportionality.

The investigation of Statistics Canada’s collection of personal information from a credit bureau and financial institutions offered a practical example of the tension between broad government data collection and the public’s expectation of privacy. Although the investigation found the Statistics Act authorizes broad data collection, it also found significant privacy concerns regarding the initiative, which led the OPC to recommend that the agency adopt a standard of necessity and proportionality into all its collection activities. The OPC concluded:

We consider a complete record of financial transactions to be extremely sensitive personal information. Indeed, this line-by-line collection of information relating to individuals’ banking activities by a government institution could be considered akin to total state surveillance. It is doubtful that any public objective can be so compelling (pressing and substantial) as to justify this level of intrusiveness.

Beyond stronger thresholds and standards for the collection and sharing of personal information, the OPC also called for greater accountability and oversight over national security agencies. To that end, Bill C-22 introduced the National Security and Intelligence Committee of Parliamentarians and Bill C-59 created a new expert national security oversight body, the National Security and Intelligence Review Agency (NSIRA), which consolidated national security review under one roof.

To ensure that both privacy and national security expertise are brought to bear on oversight activities, the OPC has developed a strong partnership with NSIRA, resulting in a collaborative review under the Security of Canada Information Disclosure Act (SCIDA) and the issuance of the first joint report in February 2022.

Reputation and privacy

Initial goal: To help create an environment where individuals can use the Internet to explore their interests and develop as people without fear that their digital trace will lead to unfair treatment.

Emerging theme: Canadians deserve privacy laws that provide them with some measure of protection of their reputation, while respecting freedom of expression.

Central to the OPC’s work on reputation and privacy is the Draft Position on Online Reputation. Developed after a consultation and a call for essays from various stakeholders, it set out the OPC’s preliminary views on existing protections in Canada’s federal private-sector privacy law, which include the right to ask search engines to de-list web pages that contain inaccurate, incomplete or outdated information, and the right to have information removed at the source. The Draft Position also emphasized the importance of education to help develop responsible, informed online citizens.

In 2018, the OPC filed a Reference with the Federal Court seeking clarity on whether Google’s search engine service is subject to federal privacy law when it indexes web pages and presents results in response to a search for a person’s name. The Court was asked to consider the issue in the context of a complaint involving an individual who alleged Google was contravening PIPEDA by prominently displaying links to online news articles about him when his name was searched.

The Federal Court issued its decision on the merits of the reference questions in July 2021. The OPC welcomed the Court’s decision, which aligned with its position that Google’s search engine service is collecting, using, and disclosing personal information in the course of commercial activities, and is not exempt from PIPEDA under the journalistic exemption. Google is appealing the decision. The OPC’s Draft Position will remain in draft until the conclusion of this litigation and the underlying investigation.

Since it is ultimately up to elected officials to confirm the right balance between privacy and freedom of expression, the OPC has stated that its preference would be for Parliament to clarify the law with regard to a right to request de-listing by search engines, as Quebec did through Bill 64.

Other OPC work:

  • The OPC launched an investigation into Romanian website Globe24h, which was republishing court and tribunal decisions, including Canadian decisions available through legal websites like CanLII. Unlike CanLII, which uses the web’s robots exclusion standard to limit indexing of decisions by name and thereby minimize the privacy impact on individuals (a brief sketch of how that standard works follows this list), the Globe24h site indexed decisions and made them searchable by name. It also charged a fee to individuals who wanted their personal information removed. The investigation found the company did not obtain consent to collect, use and disclose the personal information found in the tribunal decisions and that its actions were not ones a reasonable person would consider appropriate in the circumstances.

    After the release of the OPC’s investigation report, a complainant pursued the matter further in Federal Court, seeking damages from the company as well as an enforceable order requiring the operator of the site to delete all Canadian court and tribunal decisions on its servers. Given the precedent-setting nature of the issues at play, the OPC intervened in the litigation and, in January 2017, the Federal Court confirmed the findings of the OPC investigation and ordered Globe24h to remove Canadian court and tribunal decisions containing personal information from its website, and to refrain from further copying and republishing Canadian decisions in a manner that contravened PIPEDA. It also ordered the organization to pay nominal damages for its non-compliant practices. Shortly after the Court’s decision was issued, the website ceased to operate.
  • An investigation into the website RateMDs.com was initiated following a complaint from a dentist that anonymous users of the site were posting reviews and ratings concerning her work.
  • Several complaints about the RCMP’s use of non-conviction information in vulnerable sector checks led to an investigation that found the RCMP’s policy of reporting non-conviction information, including mental health incidents, in vulnerable sector checks was neither proportional nor minimally intrusive.
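
The robots exclusion standard referred to in the Globe24h item above is a simple convention: a website publishes a plain-text robots.txt file listing the paths that crawlers should not index, and well-behaved crawlers consult it before indexing a page. The sketch below, written in Python using the standard library’s urllib.robotparser module, is only a hypothetical illustration of that mechanism; the site paths, user-agent name and robots.txt rules are invented and are not taken from CanLII or Globe24h.

    # Minimal sketch of the robots exclusion standard: a site publishes
    # robots.txt and a compliant crawler consults it before indexing a page.
    # All paths, names and rules below are hypothetical.
    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt that keeps a "decisions" directory out of
    # search indexes while leaving the rest of the site indexable.
    ROBOTS_LINES = [
        "User-agent: *",
        "Disallow: /decisions/",
        "Allow: /",
    ]

    parser = RobotFileParser()
    parser.parse(ROBOTS_LINES)

    for url in ("https://example.org/about",
                "https://example.org/decisions/2016-abc-123"):
        allowed = parser.can_fetch("ExampleCrawler", url)
        print(url, "->", "may index" if allowed else "do not index")

    # A compliant crawler skips /decisions/..., so those pages never become
    # searchable by a person's name in its index. A site that republishes
    # the same decisions without such restrictions, as Globe24h did, leaves
    # them open to name-based indexing.

Because the standard is voluntary, it limits indexing only by crawlers and republishers that choose to respect it, which is why republication by a non-compliant site can undo the privacy protection the originating site intended.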

The body as information

Initial goal: To promote respect for the privacy and integrity of the human body as the vessel of our most intimate personal information.

Emerging theme: Because they rely on permanent characteristics that are so intimately personal, the collection, use and disclosure of biometrics and genetic information can lead to very significant privacy risks and must accordingly be protected under the highest possible privacy standards.

Canada’s privacy laws were designed to be technology neutral, which is positive, given the pace of technological change compared to that of legislative modernization.

However, the use of facial recognition technology (FRT) by both the private and public sectors has raised questions about whether specific rules may be warranted.

This is already the case for other forms of biometrics collected by law enforcement, such as fingerprints and DNA profiles, and Quebec recently became the first jurisdiction in Canada to enact a law that specifically addresses biometrics, a category that encompasses FRT.

The OPC’s investigations into Clearview AI and the RCMP’s use of the company’s facial recognition technology thrust the issue into the spotlight, making it a key initiative under this priority.

The investigations found Clearview AI violated Canada’s federal private sector privacy law by creating a databank of more than three billion images scraped from internet websites without the express knowledge or consent of the individuals concerned. Clearview users, such as the RCMP, could match photographs of people against the photographs in the databank. The result was that billions of people essentially found themselves in a police line-up. The OPC concluded this represented mass surveillance and was a clear violation of privacy.
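
To make the matching step described above concrete, the sketch below shows, in deliberately simplified form, how a probe photograph can be compared against a databank of face embeddings using a similarity score. It is a generic, hypothetical illustration written in Python with NumPy, not a description of Clearview AI’s actual system or of any method used in the OPC’s investigation; the image names, embedding size and values are invented.

    # Generic, simplified sketch of matching a probe photo against a databank
    # of face "embeddings". This is NOT Clearview AI's actual system; all
    # names and values below are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    # Pretend databank: each scraped image has already been reduced to a
    # fixed-length numerical embedding by some face-recognition model.
    databank = {f"scraped_image_{i}": rng.normal(size=128) for i in range(5)}

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # A probe photo submitted by a user is embedded the same way, then
    # compared against every entry. Here the probe is faked as a noisy copy
    # of one stored embedding.
    probe = databank["scraped_image_3"] + rng.normal(scale=0.05, size=128)

    best_id, best_score = max(
        ((image_id, cosine_similarity(probe, emb))
         for image_id, emb in databank.items()),
        key=lambda pair: pair[1],
    )
    print(f"closest databank entry: {best_id} (similarity {best_score:.2f})")

    # Because everyone whose photo was scraped has an entry in the databank,
    # every such search effectively places all of them in the line-up.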

Separately, the OPC concluded the RCMP violated the Privacy Act when it collected personal information from Clearview AI, since a government institution cannot collect personal information from a third-party agent if that agent collected the information unlawfully.

The investigations demonstrate that significant gaps remain in appropriately protecting this highly sensitive biometric information and highlight some of the risks that can arise when the public and private sectors interact. This is another reason the OPC has called for greater interoperability between the two federal privacy laws, to prevent gaps in accountability at those points of interaction.

At present, the use of FRT is regulated through a patchwork of statutes and case law that, for the most part, do not specifically address the risks posed by the technology. This creates room for uncertainty concerning what uses of facial recognition may be acceptable, and under what circumstances.

When the OPC released its investigative findings on the RCMP’s use of Clearview’s services, it also launched an extensive consultation, in cooperation with its provincial and territorial counterparts, on draft guidance for the use of facial recognition technology by police services across Canada.

In May 2022, the OPC, working jointly with its provincial and territorial counterparts, finalized and released its guidance on the use of FRT by police services across Canada. The guidance sets out the legal responsibilities of police agencies under the current legal framework with a view to ensuring any use of facial recognition complies with the law, minimizes privacy risks, and respects the fundamental human right to privacy.

The OPC and other privacy guardians also called on legislators to develop a new legal framework that would define clearly and explicitly the circumstances in which police use of facial recognition may be acceptable. The framework, they agreed, should include a list of prohibited uses of FRT, strict necessity and proportionality requirements as well as explicit oversight and retention requirements.

Another important issue that emerged under this priority is the OPC’s work on genetic testing. In conjunction with its Alberta and B.C. counterparts, the OPC released guidance on direct-to-consumer genetic testing and privacy. It outlined key privacy risks associated with these tests and aimed to inform individuals of their rights.

The OPC also testified before the Standing Senate Committee on Human Rights in support of Bill S-201, an Act to Prohibit and Prevent Genetic Discrimination. It later released a policy statement on the collection, use and disclosure of genetic test results following the implementation of the federal Genetic Non-Discrimination Act. And when the constitutionality of the law was challenged before the Supreme Court of Canada, the OPC intervened to defend the position that individuals should not be compelled to disclose their genetic test results to an employer or insurance company or any other business. The OPC welcomed the Court’s decision to uphold the constitutionality of the Genetic Non-Discrimination Act.

Other OPC work:

  • The COVID-19 pandemic raised numerous issues for the protection of personal information. As a result, the OPC engaged on a range of files, including initiatives related to COVID-19 infection tracking and tracing and border controls. Some of those specific initiatives include:
    • Working with the Government of Canada and its provincial counterparts to review the privacy implications of the COVID Alert exposure notification application.
    • Releasing a joint statement, in conjunction with provincial and territorial counterparts, about the use of vaccine passports. It called for any use of such credentials to be time-limited, and to be developed and implemented in compliance with applicable privacy principles such as necessity and proportionality.
    • Releasing a framework early in the pandemic to assess privacy-impactful initiatives undertaken in response to COVID-19. The aim was to ensure greater flexibility to use personal information in an emergency while still respecting privacy as a fundamental right.
    • Publishing guidance to help organizations subject to federal privacy laws understand their privacy-related obligations during the pandemic. In May 2020, the OPC and its provincial and territorial counterparts also issued a joint statement outlining key privacy principles to consider as contact tracing and similar digital applications were being developed.
  • The OPC participated in a Global Privacy Enforcement Network Privacy Sweep of health and wellness devices, as well as their associated applications and websites. The sweep, which looked at 21 devices including smart scales, blood pressure monitors and fitness trackers, raised a number of questions about the amount of data collected and how privacy practices are explained, among other things.

Conclusion

From the outset in 2015, the strategic priority initiative set out broad, encompassing categories of privacy concern that have proven both suitable and adaptable as trends and events across the data protection landscape evolved.

From intelligence oversight and border security, to online electioneering and the role of social media in democracy; from smart cities and machine learning, to public health surveillance and medical profiling, many privacy issues have emerged since the OPC launched its priorities.

And from these priorities came new questions that led to broader themes, which now form the core of the OPC’s advisory, compliance and parliamentary work. These themes highlight a range of trends that have fuelled the appetite for personal information across organizations of all kinds.

The OPC’s work on both the economics of personal information and government surveillance underscores how advanced the collection, analysis, sharing and leveraging of personal information have become. These practices are now commonplace in both public-sector service delivery and private-sector commercial offerings. This is just as evident in Canada as it is elsewhere around the world.

Similarly, the OPC’s research and compliance work on the protection of reputation and the sensitivities of the body as information reveal an entire ecosystem of businesses, governments and intermediaries that remain – even today – unclear on the ethics, legalities and future implications of many of the technologies they are deploying.

The strategic priorities that helped guide the OPC’s work remain extremely relevant today. They have contributed significantly to the conclusion that effective privacy protection for Canadians requires the adoption of federal public and private sector privacy laws that are rights-based, interoperable and confer appropriate powers to the regulator to ensure compliance.
