Certificate of nomination of Daniel Therrien to the position of Privacy Commissioner Issue Sheets

Table of Contents

Backlogs

Breach Statistics and Trends

Enforcement Collaboration

Privacy Act Investigations

PIPEDA Investigations

Guidance

Parliamentary

International

Contributions Program

Government Advisory – Statistics and Trends

Government Advisory – Key Activities

Business Advisory – Statistics and Trends

Communications – Statistics and Trends

Technology Analysis – Statistics and Trends

HOT TOPICS: Privacy Act Reform

HOT TOPICS: PIPEDA Reform (Bill C-11)

HOT TOPICS: COVID Alert App

HOT TOPICS: In the courts – Facebook

HOT TOPICS: In the courts – Google Reference

HOT TOPICS: FPT Resolution on Vaccine Passports

HOT TOPICS: ArriveCAN

HOT TOPICS: Transborder Data Flow

HOT TOPICS: Artificial Intelligence (AI)

HOT TOPICS: Cadillac Fairview

HOT TOPICS: Clearview AI

HOT TOPICS: Facebook Breach

HOT TOPICS: Desjardins Investigation

HOT TOPICS: Identity Verification

HOT TOPICS: StatsCan Follow-Up

HOT TOPICS: Pornhub

HOT TOPICS: Complaint against Political Parties

HOT TOPICS: Data Scraping (Publicly Accessible Personal Information) 

HOT TOPICS: Facial Recognition

Facial Recognition – Guidance

Facial Recognition – Public Sector Uses

Facial Recognition – Private Sector Uses

Facial Recognition – and Artificial Intelligence

Facial Recognition – Surveillance

Facial Recognition – Racialized Communities

Facial Recognition – Children, Seniors, Vulnerable Populations

Facial Recognition – Status of FR Investigations (RCMP)

Facial Recognition – Moratoria

Facial Recognition – Current Laws on Use

Facial Recognition – Other Jurisdictions

Facial Recognition – EU Ban

Compliance Backlog

Key Messages

  • We received temporary funding in 2019 to help us reduce our investigative backlog of complaints older than a year. We surpassed the target we set in 2019 and reduced the overall backlog of complaints older than 12 months by 91%.
  • The results we were able to achieve when entrusted with extra funds, notwithstanding the disruption of the global pandemic, speak for themselves.
  • Notwithstanding this success, absent the discretion to investigate matters, backlog pressures will persist. This is not sufficiently addressed by C-11.

Background

  • Backlog Progress

    Fiscal Year   Backlogged cases   Backlogged cases   Total backlogged   Reduction from
                  under PA           under PIPEDA       cases              2018/19 (%)
    2018/19       260                64                 324                -
    2019/20       115                52                 167                48%
    2020/21       14                 15                 29                 91%
  • We reduced the backlog of complaints by implementing measures including:
    • Enhancing procedural efficiencies, notably at the front-end of the complaint handling process, such as the launch of an enhanced online complaint form that streamlined and automated processes for receiving and triaging complaints.
    • A temporary increase to our Office’s resources in the 2019 federal budget. We hired contracted staff and redistributed files, which increased capacity and allowed us to focus on aging files and those at risk of becoming backlogged.
    • Issuing deemed refusal findings where departments fail to respond to access requests, empowering complainants to pursue matters through Federal Court.

Prepared by: Compliance Sector


Breach Statistics and Trends

Key Messages

  • Combined, the federal public and private sectors submitted over 1,000 privacy breach reports last year (with PIPEDA breaches accounting for 74% of the total).
  • Timeliness is an ongoing challenge, with approximately 40% of private sector reports submitted three months or more after a breach occurred. This trend, among others we have noted, has informed our analysis of law reform.
  • We have advocated for many years that privacy breach reporting be made mandatory under the Privacy Act to help combat systemic under-reporting in the federal public sector, which continues to be an issue. We are pleased to see this as part of Justice’s proposals for Privacy Act modernization.
  • Breach reports are important tools. They are a valuable source of business intelligence that provide information on current privacy threats Canadians are facing. They also allow my Office to ensure that appropriate mitigating measures to protect Canadians are applied following a breach.
  • We have investigated, and continue to investigate, major privacy breaches, such as Desjardins, Capital One, and GCKey.

Background

  • PA and PIPEDA breaches reported 2015-16 to 2020-21

    FY        PIPEDA   PA
    2020-21   782      280
    2019-20   678      341
    2018-19   315      155
    2017-18   116      286
    2016-17   95       147
    2015-16   115      298

Prepared by: Compliance Sector


Enforcement Collaboration

Key Messages

  • In an increasingly digitized world, protecting privacy as personal information flows across borders is a common goal shared by many data protection authorities.
  • Our office is a leader in international and domestic enforcement collaboration, playing a key role in fora such as the Global Privacy Assembly, the Global Privacy Enforcement Network, and the Asia Pacific Privacy Authorities network.
  • On the domestic front, our Office has been involved in more collaborative investigations than at any point in OPC history, with the investigations of Desjardins, Clearview, and Tim Hortons just to name a few.

Background

  • GPA: We are co-chairs of two working groups: (i) the Digital Citizen and Consumer Working Group, which advocates for greater cross-regulatory cooperation concerning the intersections between the privacy, consumer protection, and competition regulatory spheres; and (ii) the International Enforcement Working Group, which focuses on enforcement collaboration.
  • GPEN: We sit on the Executive Committee and introduced its annual Global Privacy Sweep, now in its 8th year.
  • APPA: Active membership; we develop partnerships, discuss best practices, and share information on emerging technologies and changes to privacy regulation.
  • DECF: We chair the Domestic Enforcement Collaboration Forum, which facilitates collaboration between our office and authorities for AB, BC and QC.
  • Enforcement examples: We investigated the Desjardins privacy breach alongside our counterparts in Quebec. Our investigation into Cadillac Fairview was carried out jointly with Alberta and British Columbia. Our joint investigation into Clearview AI was our first involving all three provinces with private-sector privacy legislation: Alberta, British Columbia, and Quebec. Our investigation of Facebook and AggregateIQ was also conducted jointly with BC.

Prepared by: Compliance Sector


Privacy Act - Ongoing Publicly Announced Investigations

Key Messages

  • CRA and GCKey Breaches: We launched investigations after TBS publicly disclosed credential stuffing attacks earlier this year against both the GCKey service, used by approximately 30 federal institutions across the government, and CRA accounts.
  • WE Charity / ESDC: We received complaints against WE Charity and ESDC related to personal information collected for the Canada Student Service Grant and are investigating the complaints.
  • RCMP and Project Wide Awake: We are investigating a complaint received about the RCMP’s Wide Awake Project.

Background

  • As these are active investigations, no additional details are available at this time.

Prepared by: Compliance Sector


PIPEDA Ongoing Investigations

Key Messages

  • Tim Hortons: Alongside privacy commissioners in Quebec, BC and Alberta, we launched a joint investigation into Tim Hortons’ mobile ordering app after media reports noted the app may be collecting and using data about people’s location and daily movements.
  • Capital One: We opened an investigation into a data breach at Capital One after receiving complaints from Canadian customers. Capital One notified the OPC about the breach, which affected six million Canadians whose personal information — including, in some cases, Social Insurance Numbers — had been accessed without authorization.
  • Landlord Credit Bureau: We recently commenced an investigation into the privacy practices of this services company and credit reporting agency, following complaints from tenants that their landlords had shared their information with it, without their knowledge or consent, for the purposes of establishing tenant credit records.
  • Hai Di Lao Restaurants: Following recent media reports concerning the video surveillance system implemented in several Hai Di Lao Hot Pot restaurants located in BC and ON, our office is currently looking into the matter alongside the BC OIPC.
  • WE Charity / Employment and Social Development Canada: We received complaints against WE Charity and ESDC related to personal information collected for the Canada Student Service Grant and are investigating the complaints.

Background

  • Tim Hortons, Capital One and Landlord Credit Bureau are active investigations; no additional details are available at this time.
  • Hai Di Lao Restaurants: No additional details are available.

Prepared by: Compliance Sector


Guidance Development

Key Messages

  • The 15% permanent funding increase in 2019 has allowed the OPC to expand its capacity to protect the privacy of Canadians in the face of the exponential growth of the digital economy.
  • In 2019-2020, we completed: advice for individuals on privacy breach notifications; an extensive update to our guidance for federal institutions on PIAs; pandemic-specific guidance; and guidance for manufacturers of IoT devices.
  • In June 2021, we published draft guidance on police use of facial recognition technology (FRT), which is open for consultation until October 15, 2021.
  • We continue to focus on developing guidance on key privacy issues, including biometrics.

Background

  • Guidance released in 2019-2020 fiscal year:
    • Privacy and the COVID-19 outbreak: Guidance on applicable federal privacy laws (March 2020)
    • Expectations: OPC’s Guide to the Privacy Impact Assessment Process (March 2020)
    • Guidance for businesses doing e-marketing (January 2020)
    • Joint guidance with the Chief Electoral Officer for political parties on protecting the personal information of Canadians (April 2019)
    • Receiving a privacy breach notification (September 2019)
  • Status of key pieces of guidance:
    • We are currently working on guidance to industry on biometrics. Other guidance in progress but delayed due to C-11 includes guidance on financial technologies (FinTech), in-store tracking, connected cars, and for developers of educational apps for K-12 schools.
    • Draft FRT guidance is published; consultation with stakeholders is ongoing until October 15, 2021.

Prepared by: PRPA


Parliamentary Affairs

Key Messages

  • We are called on frequently to provide privacy expertise to Parliamentary Committees and individual MPs.
  • Because of COVID, the last Parliamentary year was an unusual one, resulting in far fewer appearances than the norm.
  • In the past two years, we:
    • Appeared five times before standing committees, one of them at the National Assembly of Quebec;
    • Tabled our Special Report to Parliament on the OPC’s investigation into the RCMP’s use of Clearview AI and draft joint guidance for law enforcement agencies considering the use of facial recognition technology;
    • Monitored and reviewed seventeen bills and studies;
    • Responded to twelve individual requests from MPs.

Background

  • Key appearances in 2020-2021:
    1. PROC Committee: Parliamentary Duties and the COVID-19 Pandemic (issue of Internet safety), April 29, 2020;
    2. INDU Committee: Canadian Response to the COVID-19 Pandemic (issue of contact tracing), May 29, 2020;
    3. Quebec National Assembly, Committee on Institutions: Bill 64, An Act to Modernize Legislative Provisions as Regards the Protection of Personal Information, September 24, 2020.
    4. ETHI Committee: 2021-22 Main Estimates and Facial Recognition, May 10, 2021
    5. LCJC Committee: Bill S-203, An Act to restrict young persons’ online access to sexually explicit material.
  • Privacy issues brought to our attention by the Parliamentarians over the last fiscal year:
    • Potential privacy law reforms (Bills C-11 and 64 in Quebec), internet safety, contact tracing, open banking and identity theft.

Prepared by: PRPA


International (General)

Key Messages

  • Working collaboratively with other regulators helps to better protect Canadians in a borderless world.
  • Stronger global privacy rights, and partnerships with international privacy enforcement authorities, help ensure Canadians’ personal information remains protected when it is sent outside of Canada’s borders for processing.
  • Our office has long been cooperating with international counterparts to leverage resources, develop common policy positions, share best practices, and more effectively enforce privacy laws, in Canada and abroad.
  • We achieve this by taking part in international working groups, adopting joint resolutions, issuing joint statements and through enforcement collaboration with our counterparts.

Background

GPA:

  • Chair of Policy Strategy Working Group Three (Privacy and Other Rights and Freedoms) – focusing on developing a narrative about privacy and human rights
  • Co-chair of the Digital Citizens and Consumers Working Group (DCCWG)
  • Co-chair of the International Enforcement Cooperation Working Group (IEWG)
  • Member of other GPA working groups such as: Ethics and AI, Digital Education, the Future of the Conference, Metrics, Covid-19 and Facial Recognition.
  • Co-sponsor of 2020 resolutions on greater accountability in the development and use of facial recognition technology, AI, and privacy challenges arising from COVID-19.

Other regulator networks: 1) Asia Pacific Privacy Authorities (APPA) Forum; 2) Association francophone des autorités de protection des données personnelles (AFAPDP); 3) Common Thread Network (CTN); 4) Berlin Working Group.

Participation in international government fora: 1) OECD Working Party on Data Governance and Privacy in the Digital Economy (DGP); 2) APEC Data Privacy Subgroup (DPS).

Prepared by: PRPA


Contributions Program

Key Messages

  • The Program funds independent research and knowledge translation initiatives to cultivate expertise and understanding on a broad range of privacy issues related to PIPEDA.
  • These projects generate new information and understanding to help organizations better safeguard personal information and assist Canadians in protecting their privacy. Last year most of the funded projects related to AI. For instance, one project (by the CSA Group) looked at the implications of AI on the privacy rights of children, and another (by UQAM) examined the responsible development of machine learning from a privacy perspective.
  • The Program was created in 2004 and since then it has allocated approximately $7 million to some 160 projects.
  • Last year 11 projects out of 43 proposals were selected for funding; this year’s recipients will be announced shortly.

Background

  • Program Focus: Funded projects help advance the Office’s privacy priorities, which focus mainly on responding to Canadians’ concerns about privacy. All projects must be PIPEDA focussed, as the Program derives its existence from that Act.
  • Funding: All projects are evaluated on the basis of merit by OPC subject matter experts and occasionally, when required to validate our assessments, by external peer reviewers. The annual budget for the Program is $500,000. Most years up to $50K is allocated per project and a maximum of $100K per recipient organization.
  • Program terms: The Terms and Conditions of the Program were renewed by the Minister of Justice in 2020-21 for five years. The full list of projects that have received funding can be seen on the OPC website, as well as summaries of all completed projects funded over the years.

Prepared by: PRPA


Government Advisory - Statistics and Trends

Key Messages

  • GA was established to increase early consultations with institutions and provide proactive advice to help mitigate privacy risks associated with government programs and activities through early and frequent engagement.
  • Institutions have been very receptive; we were consulted 109 times in 2020-21, compared to 66 times in 2019-20 and 48 in 2018-19. This indicates institutions find value when they engage with GA and underlines the need for privacy expertise, which may be lacking at the institutional level.
  • Over the past year, we prioritized efforts to support activities in response to the COVID-19 pandemic. This included our review of the COVID Alert App, and advisory consultations on telemedicine and temperature scanning, among many others. In particular, Health Canada, PHAC and ESDC have relied heavily on GA for advice on COVID files during the pandemic.
  • Institutions consult GA for advice on complex and technically challenging files within short timelines in all areas, including law enforcement and public safety.

Background

  • Advisory Consultation Statistics: In FY 2020-21, we opened 109 new consultation files, including 28 for programs and activities related to COVID-19.
  • PIAs: In 2020-21 we received both PIAs and Privacy Compliance Evaluations (PCEs), a less rigorous assessment allowed by TBS interim policy for urgent, COVID related initiatives. We received 81 assessments in total: 65 full PIAs, and 16 PCEs.
  • Notifications: GA received 491 notifications of disclosures of personal information in the public interest, or in circumstances that benefited the individual, compared to 611 in FY 2019-20. This is a slight decrease, but continues the trend we’ve seen of large numbers of public interest disclosures over the past few years.

Prepared by: Government Advisory


Government Advisory – Key Activities

Key Messages

  • The Government Advisory (GA) Directorate provides advice and recommendations to institutions through reviews of PIAs, ISAs, and increasingly, by even earlier consultations with federal bodies as initiatives are conceptualized and developed.
    • Files are complex, varied, and increasingly involve emerging technologies and links with the private sector
  • GA also regularly consults with TBS on development of government-wide policies, directives, and standards.

Background

  • COVID-19 Alert App: consulted with Health Canada during App development to ensure privacy risks were addressed. Work is ongoing as the app is updated.
  • RCMP Remotely Piloted Aircraft Systems (Drones): advised on RCMP training for use of drones to ensure necessity, proportionality and minimal intrusiveness and to ensure accountability for any PI collected. The RCMP updated their policies based on our advice. Our work is ongoing.
  • RCMP National DNA Databank: is being modified to allow family members to voluntarily contribute DNA to help find missing persons and identify human remains. On our advice, the RCMP revised voluntary donor consent forms for clarity, and included privacy protections in agreements for sharing DNA profiles and other sensitive PI with foreign jurisdictions.
  • Canadian Framework for Collaborative Police Response on Sexual Violence: On our recommendation, the Canadian Association of Chiefs of Police included provisions for reporting privacy breaches to the confidentiality agreements signed by members of the sexual violence case review committees.
  • VidCruiter: with the increased use of digital remote interviewing platforms for staffing, we made several privacy-specific recommendations to Justice, the Canadian Space Agency, HC, DFO, Infrastructure Canada and ESDC, to protect privacy.

Prepared by: Government Advisory


Business Advisory Directorate

Key Messages

  • BA engages proactively with companies to assist in their assessment of the privacy implications of their work – and to consider how new technologies and business models impact privacy before they are deployed in the marketplace.
  • Addressing privacy issues upfront helps mitigate privacy risks, avoid compliance issues, and provide regulatory predictability.
  • 70% of the organizations engaged over the last year were SMEs. Most businesses using our advisory services were developing data and technology-intensive initiatives.
  • Our engagement with SMEs is critical given many do not have the resources to consider privacy issues proactively.

Background

Outreach: BA initiated 13 new advisory engagements, and conducted 33 virtual events, including exhibits, presentations, stakeholder meetings and dedicated advisory sessions:

  • Privacy Clinic – an engagement platform created to provide privacy advice to SMEs. In 2020-21, BA hosted a Privacy Clinic in collaboration with an Innovation Hub in Waterloo, Ontario, and provided privacy advice to 7 scale-up organizations.
  • ORSMEN – in 2020-21 BA continued its long-standing partnership with the Ontario Region SME Network to reach out to SMEs and provide helpful privacy guidance.

Key Consultations:

  • MILA COVID app: some measures adopted after our consultation included: using personal information for the narrowly defined, limited purpose of alleviating the public health crisis; limiting the use of the application over time, that is until the pandemic recedes; only sharing aggregated, de-identified data with government, if adopted.
  • HealthTech org: BA made 24 recommendations to a Canadian company with a cutting-edge product which voluntarily sought advice for the next-generation design.
  • Temperature checks: In response to a retailer’s proposal to introduce temperature checks at their establishments in order to minimize spread of COVID-19, BA provided advice on proposed design and implementation.

Prepared by: Business Advisory


Communications Statistics and Trends

Key Messages

  • The Communications Directorate supports the Office’s efforts to promote public awareness and understanding of privacy issues by implementing multi-year strategies to raise awareness of privacy rights among individuals, and of obligations among public servants and businesses.
  • Some of our activities include public opinion polling, media relations, developing publications, videos and infographics, attending events, and publishing content on our website and through social media.
  • We also reach vulnerable groups such as seniors and youth. For example, we developed a graphic novel for youth, our most popular publication, which was translated by organizations in Mexico, Italy and Switzerland. We also run campaigns in libraries and have spoken with seniors groups about privacy protection.

Background

Key stats: In 2020-21, we gave 35 speeches/presentations, published 33 news releases and announcements, and distributed 1,452 publications. Our website received 2.5 million visits, including over 26,000 blog visits, and we answered 300 media requests.

Information Centre:

  • Number of requests: Last year we received 7,090 information requests, and we cannot keep up with demand for information and advice on privacy rights and obligations.
  • Types of requests:
    • The majority of requests come from individuals, on issues such as whether organizations are over-collecting information or using it without consent; many also express concern about breaches.
    • 8% of requests are from private-sector organizations on topics such as COVID screening measures, inter-border transfer of personal information, safeguarding personal information and breach notification requirements.

Biannual Survey of Businesses Results: Our biannual survey of businesses showed larger companies are more likely to have policies/procedures in place to assess privacy risks. Other results pointed to the need for more attention to privacy.

Prepared by: Communications


Technology Analysis Directorate (TAD)

Key Messages

  • TAD supports the OPC’s work to assess the privacy impacts of technology, helping us to ensure that Canadians can enjoy the benefits of digital technologies safely.
  • Recent work of this group has included assessments of emerging global contact tracing technologies, the COVID Alert App, and support toward the Cadillac Fairview and Desjardins investigations.
  • To improve our capacity, TAD is working on an expansion and modernization of our technology laboratory.

Background

  • Analytical requests: From April 2020 to March 2021, 159 requests for support were opened or filed with TAD.
    • Forty-five percent (45%) of requests came from Policy and Promotion
    • Forty-six percent (46%) came from Compliance
    • Nine percent (9%) allocated to other technical internal support activities
    • Sixty-seven percent (67%) of those requests were completed and closed; the remainder remain active files and are still being supported.
  • Expansion and modernization of the technology laboratory: Expansion and modernization of our technology laboratory will better support the activities of the two program areas – Compliance (including CASL activities) and Policy & Promotion.
    • New capacity will promote appropriate protections, support investigative activities, research development, and the promotion of general information and guidance.
  • Innovative Solutions Canada program: TAD recently entered a formal partnership with ISED, under the Innovative Solutions Canada program, and tested an AI-powered software platform that offers novel ways to empower care for people who have an intellectual or developmental disability (I/DD), including autism.
    • The Innovation is a convenient tool for each member of a care team that supports a person with I/DD – including parents, educators, therapists, direct support professionals, and care managers – to share information, engage with each other, develop personalized interventions, and improve long-term outcomes. The components of the Innovation are a mobile app, a wearable device and the supporting Python-based backend environment.

Prepared by: TAD


Privacy Act Reform

Key Messages

  • We are pleased to see that the law reform process appears to be truly underway with Justice’s recent public consultation on the modernization of the Privacy Act.
  • A number of Justice’s proposals on Privacy Act reform would bring positive changes to the law to deal with the privacy risks posed by emerging technologies, particularly the inclusion of rights-based language in a revised preamble.
  • The proposals also include enhanced obligations for privacy-by-design and PIAs, and meaningful oversight measures like proactive audit powers, a guidance role for the OPC, and simple and effective order-making powers (though limited in scope). These are all necessary mechanisms to promote and enforce compliance.
  • In our submission, we propose some modifications to enhance the collection threshold and the framework for publicly available personal information, and recommend that key elements for regulating AI be included in a modernized Privacy Act.

Background

  • Framework for “Publicly Available” Personal Information: We propose the definition could be enhanced by explicitly stating that publicly available personal information does not include information in respect of which an individual has a reasonable expectation of privacy.
  • Artificial Intelligence: We recommend the Act include a definition of automated decision-making, as well as a right to meaningful explanation and human intervention related to its use, a standard for the level of explanation required, and obligations for logging and tracing.
  • Collection Threshold: We believe the collection standard of “reasonably required” generally strikes the right balance, but have proposed key modifications to add clarity (such as that identified purposes be specific, explicit and lawful, and to include an explicit proportionality assessment, among others).

Prepared by: PRPA


PIPEDA Reform (Bill C-11)

Key Messages

  • Bill C-11 represents a serious effort to realize the reform we all recognize is badly needed. However, despite its ambitious goals, our view is that in its current state the Bill would represent a step back overall for privacy protection.
  • One of the key issues is that the new law does not provide for quick and effective remedies for individuals – due to severely restricted AMPs, the tribunal, and lack of discretion for the Commissioner.
  • Bill C-11 would impose several new responsibilities without added discretion, reducing our ability to make strategic use of resources and to prioritize our activities based on risks to Canadians.
  • Even if these new functions were appropriately resourced, the OPC should have the legal discretion to manage and prioritize activities based on risk, maximizing finite resources to produce the most effective outcomes for Canadians. This discretion exists in privacy laws in other jurisdictions.

Background

  • Adopting a strategic, risk-based approach is a key practice for effective regulation.
  • New non-discretionary responsibilities introduced in Bill C-11 include requiring OPC to approve codes, provide advice to organizations on privacy management programs at their request, and consult with affected stakeholders on all guidance.
  • Other jurisdictions provide greater discretion to data protection agencies to manage their work, enabling them to prioritize activities and engage more constructively with stakeholders. E.g. this discretion is provided for in Alberta and British Columbia, as well as internationally in the European Union and New Zealand.
  • The benefits of proactive engagement can be achieved without being mandatory and demand-led, which is resource-intensive to the detriment of other functions.

Prepared by: PRPA


COVID Alert App

Key Messages

  • Our engagement with Health Canada on the COVID Alert App helped ensure its design respects all the key privacy principles from our Framework.
  • Despite this, the government asserted at the time that privacy laws did not apply, given that it was unlikely personal information was being collected by the app.
  • We recommended that use of the app remain voluntary and that information collected for contact-tracing not be used for unrelated purposes.
    • Other jurisdictions have taken legislative and regulatory measures to ensure contact-tracing remains voluntary, and information used is limited to the purpose for which it was collected.

Background

  • Provincial use: The app can be used to report a diagnosis in 8 provinces: Ontario, Manitoba, Newfoundland and Labrador, New Brunswick, Nova Scotia, Prince Edward Island, Saskatchewan, and Quebec. Alberta and BC have decided against using the federal app. Voluntary use is a key principle of the Joint Statement by Federal, Provincial and Territorial Privacy Commissioners on principles for contact tracing and similar apps.
  • Ongoing engagement: GA communicates regularly with Health Canada on the app and Portal. Health Canada has committed to consulting our office on changes to the app; for instance, we were consulted on the new collection of performance metrics. The OPC will participate in a joint audit (evaluation) with Health Canada to assess the app’s effectiveness.
  • Current status of program evaluation: GA is working with HC on a joint evaluation of the App that is scheduled for completion by the end of 2021. The evaluation will specifically examine necessity and proportionality, effectiveness and continued adherence to FPT principles.

Prepared by: GA / PRPA


In the courts - Facebook

Key Messages

  • Our investigation found that Facebook failed to obtain meaningful consent and contravened the fair information principles relating to consent, safeguards, and accountability in PIPEDA.
  • Because Facebook refused to implement our recommendations, we are seeking a binding order from the Federal Court to require Facebook to take action to correct its privacy practices and comply with PIPEDA.
  • We have just received the Court’s decision on two preliminary challenges and are analyzing it. We cannot comment further at this time.

Background

  • Investigation: On April 25, 2019, the OPC released its Report of Findings on its investigation into FB’s compliance with PIPEDA, in relation to the “This is Your Digital Life” (“TYDL App”) and Cambridge Analytica, a UK political consulting firm. The OPC found that Facebook contravened the fair information principles relating to consent, safeguards, and accountability contained in Schedule 1 of PIPEDA. We also found that, with respect to Users’ downloads of the TYDL App after June 18, 2015, Facebook failed to obtain meaningful consent per s. 6.1 of PIPEDA.
  • Outcome of investigation: Facebook disputed the findings of the investigation and refused to implement the OPC’s recommendations to address the deficiencies identified. Therefore, OPC initiated an application pursuant to s. 15 of PIPEDA in Federal Court against Facebook.
  • Judicial review: Facebook brought a separate judicial review application challenging our decision to investigate, to continue to investigate, as well as the investigation process, and seeks to quash the report of findings. We attempted to strike Facebook’s application for judicial review but were unsuccessful; the application is therefore ongoing. By way of separate motion, FB asked the Court to strike significant portions of the OPC’s affidavit evidence in our s. 15 PIPEDA application but was largely unsuccessful. The Court heard these preliminary challenges in January 2021 and rendered its decision June 15, 2021.

Prepared by: Legal


In the courts - Google Reference

Key Messages

  • In 2018 OPC asked the Federal Court to consider whether Google’s search engine is subject to PIPEDA when it indexes web pages and presents search results in response to queries of a person’s name.
  • This matter arose in the context of a complaint in which an individual alleged Google contravened PIPEDA by prominently displaying links to articles about him when his name is searched; he alleges the articles are outdated, inaccurate and disclose sensitive information. Google asserts that PIPEDA does not apply in this context.
  • Following public consultations, the OPC took the view in its draft position paper on online reputation that PIPEDA provides for a right to de-indexing – which removes links from search results without deleting the content itself – on request in certain cases. This would generally refer to web pages that contain inaccurate, incomplete or outdated information.
  • The Federal Court heard the matter on January 26-27, 2021; the Court has not yet issued its judgment – we cannot comment further on this issue at this time.

Background

  • Complaint: In 2017 we received a complaint from an individual alleging that Google is contravening PIPEDA by prominently displaying links to online news articles about him when his name is searched. The complainant alleges the articles are outdated, inaccurate and disclose sensitive information (eg: sexual orientation and a serious medical condition). He argues Google has caused him direct harm by linking the articles to his name.
  • Draft Position Paper: In 2018, the OPC published a Draft Position Paper on Online Reputation as part of an ongoing consultation on how privacy law could address harms to individuals resulting from the increased exposure of personal information online. In it, we stated that we believe that PIPEDA applies to search engines. The Paper remains a draft and will not be finalized until the conclusion of the reference proceeding.
  • Litigation: In 2018, Google disagreed with our Office’s position that PIPEDA applies to its search engine. In October 2018, we asked the Court to determine the preliminary issue of whether PIPEDA applies to Google’s search engine.

Prepared by: Legal


FPT Resolution on Vaccine Passports

Key Messages

  • In May 2021, our Office issued a joint statement on vaccine passports along with our provincial and territorial counterparts.
  • The statement outlined fundamental privacy principles to consider in development of vaccine passports, or any initiative that aims to confirm a person’s vaccination or immunity status.
  • Given significant privacy risks with vaccine passports, the statement stressed that any such credential must be demonstrably necessary, effective and proportional before being used.

Background

  • Vaccination status, access to services and legal authority: requiring proof of vaccination as a condition of service raises serious concerns, both from a privacy and wider rights perspective. If restaurants, retailers, or music venues were to require proof of vaccination as a condition of service, we would want to see a specific public health order or other legal authority in place to ensure that rights are not infringed.
  • Importance of consultation: scientific knowledge about COVID-19 and the vaccines is evolving, as are discussions about vaccine passports in many jurisdictions. The purpose of our statement was to ensure consideration of privacy at the earliest opportunity in vaccine passport development.
  • Importance of jurisdiction: Organizations considering vaccine passports should consult with the privacy commissioners in their jurisdiction as part of the development process. Several provinces (including Ontario, Alberta, Newfoundland) have laws that specifically protect individuals’ health data. Additionally, British Columbia, Alberta and Quebec have private-sector privacy laws substantially similar to PIPEDA.

Prepared by: PRPA


Mandatory Isolation Order and ArriveCAN

Key Messages

  • The OPC reviewed several Privacy Compliance Evaluations (PCEs) from PHAC for measures under the Mandatory Isolation Order, including the ArriveCAN app and web portal.
  • The measures were assessed against the OPC Framework; they align with PHAC’s authorities under the Quarantine Act and are limited to the purposes of reducing the spread of COVID-19 and preventing importation of the virus into Canada.
  • PHAC has been receptive to our advice on transparency and oversight; we have also advised that PHAC ensure security assessments for IT systems used to collect, transmit, and retain personal information are undertaken in a timely fashion.
  • We expect to receive further privacy assessments from PHAC, including potentially for changes to the ArriveCAN app that would allow travellers to submit proof of vaccination on a voluntary basis.

Background

  • During the pandemic, TBS gave PHAC an exemption from completing full PIAs; although the Interim Directive has since expired, PHAC continues to use a streamlined approach. PHAC indicated that the need for rapid delivery of programs during the pandemic makes completing full PIAs prior to launch unfeasible. They continue to consult with us regularly to address privacy issues as they develop their programs.
  • The ArriveCAN app allows travellers to provide information digitally, enabling faster contact tracing compared to paper forms. According to PHAC, the app reduces the average time between the date of entry and the date of disclosure for contact tracing from 9 days to 2 days.
  • PHAC is consulting with us on possible updates to the ArriveCAN app to enable fully vaccinated travellers to upload proof of vaccination on a voluntary basis for exemption from certain quarantine measures. PHAC shares information with public health authorities in the province or territory where the person quarantines. Information may be disclosed to the RCMP and other law enforcement bodies in Canada for quarantine enforcement purposes.

Prepared by: Government Advisory


Transborder Data Flow

Key Messages

  • PIPEDA does not adequately address risks to privacy posed by global data flows and Bill C-11 in its current form would not resolve those weaknesses.
  • Most modern privacy laws explicitly and separately address trans-border data flows, including those in Australia, New Zealand and the GDPR.
  • Bill C-11 does not establish a comprehensive scheme to govern trans-border data flows. It should be amended to contain such provisions, so that rights and obligations are clear and accessible.
  • Similarly to laws in Australia, New Zealand, the GDPR and Quebec’s Bill 64, the Bill should also be expanded to capture “disclosures” outside the country. As well, it should extend transparency obligations to foreign organizations that move Canadians’ personal information outside the country and expose that information to privacy risks.

Background

  • Section 11(1) of Bill C-11 would require transferring organizations to ensure that the service provider provides substantially the same protection as that which the organization would be required to provide under the Act. Principle 4.1.3 of PIPEDA requires transferring organizations to ensure a “comparable level of protection”.
  • Bill C-11 relies on the use of contractual means (or otherwise) to ensure “substantially the same” protections, as opposed to PIPEDA’s “contractual or other means”. Other jurisdictions provide a number of options in this regard, including mechanisms such as adequacy rulings, Standard Contractual Clauses, codes of conduct or other binding schemes such as binding corporate rules.
  • The OPC recommends that a separate scheme for transborder data flows address the considerations identified in Teresa Scassa’s paper relating to: 1) to whom the obligations apply; 2) accountability; 3) conditions to be met; and 4) protections in the destination State.

Prepared by: PRPA


Artificial Intelligence (AI)

Key Messages

  • AI has immense promise, but must be implemented in ways that respect privacy, equality and other human rights.
  • In our view, an appropriate legal framework for AI would:
    • Allow personal information to be used for public and legitimate business interests, including for the training of AI; but only if privacy is entrenched in its proper human rights framework;
    • Create provisions specific to automated decision-making to ensure transparency and fairness (explanation and contest); and
    • Require businesses to demonstrate accountability to the regulator upon request, ultimately through proactive inspections and other enforcement measures to ensure compliance.
  • PIPEDA does not contain any of these measures, and is ill-suited to the AI environment. C-11 only contains an explanation requirement for automated decisions, but without any standard for what such an explanation should entail.

Background

  • OPC launched a public consultation in January 2020. We received 86 submissions, and held two in-person consultation sessions in Montreal and Toronto.
  • The wrap-up report, A Regulatory Framework for AI, contains our key recommendations for regulating AI, and is available on our website.
  • We published a separate report, commissioned from a recognized expert in AI, which informed our recommendations and reflects stakeholder input.
  • More broadly, OPC collaborates with international data protection authorities in working groups on AI through the Global Privacy Assembly, a forum of international Data Protection and Privacy Commissioners.

Prepared by: PRPA


Cadillac Fairview

Key Messages

  • Our investigation into Cadillac Fairview found that the company embedded inconspicuous cameras in digital information kiosks at 12 shopping malls to collect customers’ images, and used FR technology to estimate their age and gender.
  • Shoppers had no reason to expect that their sensitive biometric information would be collected and used in this way, and did not consent to this collection or use.
  • While the images were deleted, the biometric information of 5 million shoppers was sent to a third-party service provider and stored in a centralized database for no discernible purpose.
  • We remain concerned that Cadillac Fairview refused to commit to ensuring that express, meaningful consent is obtained from shoppers should it choose to redeploy the technology in future.

Background

  • Provincial cooperation: This was a joint investigation with AB, BC, and involved info-sharing with QC. Our findings were published in October 2020.
  • Collection purposes: Personal information was collected in order to track foot traffic patterns and predict demographic information about mall visitors (e.g. age and gender). Unknown to Cadillac Fairview, a biometric database consisting of 5M numerical representations of faces was also created and maintained by a third-party processor.
  • Outcomes: Cadillac Fairview has ceased use of this technology and has advised that they have no current plans to resume its use. We are concerned that Cadillac Fairview could simply recommence this practice, or one similar, requiring us to either go to court or start a new investigation.
  • Law reform: Cadillac Fairview’s refusal to commit to obtaining express and meaningful consent for future use of this technology demonstrates our need for stronger enforcement powers, including order making and AMPs to better protect Canadians’ privacy.

Prepared by: Compliance Sector


Clearview AI

Key Messages

  • We found Clearview failed to obtain consent for the collection, use and disclosure of millions of images of Canadians it scraped from websites, or of the biometric profiles it generated.
  • We also found that its practices amounted to continual mass surveillance, and were for an inappropriate purpose.
  • Clearview disputed our findings and refused to follow any of our recommendations. Stronger regulatory tools, including order-making powers and AMPs, are needed to help secure compliance from companies like Clearview. C-11 does not adequately address these shortcomings because of the limitations of its AMP regime.

Background

  • Provincial cooperation: This was a joint investigation with AB, BC and QC. Our RF was published on February 2, 2021.
  • Summary: Clearview provided identification services via its Facial Recognition product to 48 Canadian organizations, who collectively conducted thousands of searches. Clearview collected over 3 billion images world-wide.
  • Consent: It did not seek consent for the use of individuals’ personal information, claiming that the information was “publicly available”. Social media organizations stated that Clearview’s scraping violated their terms of service, and we found that the information was not publicly available, as defined in the Regulations.
  • Purposes: Clearview indiscriminately collected, used and disclosed personal information in order to allow third-party organizations who subscribed to its service to identify individuals by uploading photos in search of a match.
  • Outcomes: Clearview agreed to exit the Canadian market and cease offering its services to Canadians. At the conclusion of our investigation Clearview refused to follow any of the recommendations made by our Offices, which included that it (i) commit to not re-entering the Canadian market; (ii) cease collection, use and disclosure of images and biometric profiles; and, (iii) delete the images and biometric arrays in its possession.

Prepared by: Compliance Sector


Facebook Breach

Key Messages

  • The media reported in April 2021 that information from approximately 533 million Facebook users had been made publicly available.
  • The data set is alleged to include information about 3.5 million Canadians. The data set had been posted for sale as early as June 6, 2020.
  • Facebook has not submitted a breach report to our Office for this matter and we are currently in communication with the company.
  • We have received a complaint related to the matter and are now considering next steps. We are not in a position to provide any additional details at this time.

Background

  • The data was scraped from people's Facebook accounts through a vulnerability.
  • After Facebook detected this issue in August 2019, it made changes to correct the vulnerability in September 2019.
  • The data included a variety of Facebook profile information and contact details. According to Facebook, the data did not include financial information, health information, or passwords.

Prepared by: Compliance Sector


Desjardins Breach Investigation

Key Messages

  • In May 2019, Desjardins notified our Office of a breach that ultimately affected close to 9.7 million individuals in Canada and abroad. The OPC launched an investigation in collaboration with la Commission d’accès à l’information du Québec.
  • Our investigation concluded that Desjardins violated PIPEDA with regards to accountability, data retention periods, and security safeguard measures.
  • Desjardins will be providing the OPC with progress reports every six months on its implementation of a comprehensive action plan following this breach.

Background

  • The compromised personal information included first and last names, dates of birth, social insurance numbers, residential addresses, telephone numbers, email addresses and transaction histories.
  • The breach had been committed by one of Desjardins’ employees, who had been exfiltrating personal information over a period of at least 26 months.
  • Our focus was on Desjardins’ security safeguards, and its accountability in terms of policies and training to protect personal information. Given that some records were decades old, we also looked at its retention and destruction policies.
  • Some key takeaways from this investigation:
    • While organizations need to guard against external vectors of attack, they also need to look within.
    • For policies and procedures to be effective, employee training and awareness are key to giving them life.
    • Risks can be reduced by employing good data retention practices.

Prepared by: Compliance Sector


Identity Verification

Key Messages

  • One way identity theft and fraud can be prevented is by verifying a person’s online identity using a trusted and secure digital ID. A digital ID can also help Canadians securely access online services.
  • On Sept 15, 2020, the Digital ID & Authentication Council of Canada (DIACC) launched the Pan-Canadian Trust Framework. The Framework is designed to help businesses and governments develop tools and services that enable information to be verified regarding a specific transaction or set of transactions.
  • Over the past year, our Office has been following the development of the DIACC PCTF and the TBS Public Sector Profile (PSP) PCTF.
  • From open banking to e-health, digital ID services are a key enabler of the digital economy. To date, banks and telcos have been able to leverage existing digital ID services to support Canadians.

Background

  • A digital ID is a collection of features and characteristics associated with a uniquely identifiable individual or organization — stored and authenticated in the digital sphere — and used for transactions, interactions, and representations online.
  • The PCTF is at a proof of concept stage. More recently our office has been involved in a DIACC WG on Ethical and Acceptable Use of Biometrics within the Digital ID ecosystem.
  • “CAN/CIOSC 103-1:2020 Digital Trust and Identity – Part 1” is a standard accredited by the SSC. Recently published, it specifies the minimum requirements and set of controls for creating and maintaining trust in digital systems and services that, as part of an organization’s mandate, assert and/or consume identity and credentials in data pertaining to people and organizations.

Prepared by: TAD


Follow-up to Statistics Canada Investigation

Key Messages

  • As a follow-up to our 2019 investigation, we provided Statistics Canada with direction to help it redesign the Credit Information Project and the Financial Transactions Project.
  • For the Credit Information Project, our investigation found that while Statistics Canada had the legal authority to collect the personal information it had not demonstrated that the collection was necessary or proportional.
  • Our investigation raised concerns that the Financial Transactions Project, if implemented, would have exceeded Statistics Canada’s legal authority. The project was halted during our investigation.
  • We also noted issues with transparency and internal threat actors. We asked Statistics Canada to increase transparency regarding its collection of personal information, address risks related to internal vulnerabilities and recommended that the above projects be halted and redesigned.
  • We are currently advising Statistics Canada as it redesigns these projects.

Background

  • OPC dedicated a full-time resource to support StatCan in applying our recommendations to ensure proposed collections of personal information are necessary for a substantial public goal and proportional to the privacy impact.
  • In fall 2020 StatCan provided us with redesigned project plans. Although progress has been made, we found the project plans lacking in a number of areas: 1) public goals were not sufficiently described; 2) effectiveness was not demonstrated; and 3) privacy impacts were not given sufficient consideration.
  • We therefore made a number of recommendations to Statistics Canada to address these concerns, and asked that it resubmit its project plans for our review.

Prepared by: Compliance Sector


Pornhub and MindGeek

Key Messages

  • We are reviewing your report on the “Protection of Privacy and Reputation on Platforms such as Pornhub”, tabled on June 17. We recognize the many serious privacy and related concerns that have been raised, as the matter implicates highly sensitive personal information.
  • We have received a complaint related to consent for collection, use and disclosure of intimate images on the MindGeek websites and are investigating. As our investigation is ongoing, I cannot comment further on the details of the complaint.
  • It is of the utmost importance that websites collecting, using or disclosing intimate images comply with the law, to minimize privacy harms and respect Canadians’ fundamental right to privacy.

Background

  • ETHI began its study on February 1, 2021 and held nine meetings on the topic with over 34 witnesses. A final report was issued on June 17, which addresses issues including: age and identity verification; non-consensual distribution of intimate images; methods to remove underage and non-consensual content; potential impacts of the Minister of Heritage’s draft legislation addressing online harms; and whether enforcing existing regulations such as the Criminal Code and PIPEDA will ensure corporate compliance with legislative and privacy obligations.
    • We note the recommendation that the OPC be consulted with respect to age-verification requirements.
  • The Toronto Star published an article on the complaint in question in January 2021.
  • While our Office has not released the scope of our investigation, the Commissioner stated at his ETHI appearance on the Main Estimates that websites like Pornhub raise issues of consent and appropriate purposes. News outlets in several countries published articles on our Office’s investigation following his appearance.

Prepared by: PIPEDA Compliance / PRPA


Complaint against Federal Political Parties (FPPs)

Key Messages

  • The OPC has concluded that PIPEDA does not apply to the activities that were the subject of a complaint against the Liberal, Conservative and NDP political parties, as they are not commercial in character.
  • Although the sale of merchandise, memberships, and tickets involves an element of exchange, the OPC was not convinced that those transactions qualify as commercial in character given the context in which the federal political parties operate.
  • Even if those transactions were considered to be commercial in character, that would not allow the OPC to investigate the general practices of federal political parties in relation to political advertising for voters.
  • The OPC has repeatedly called for political parties to be subject to legislation that creates obligations based on internationally recognized privacy principles and provides for an independent third-party authority to verify compliance.

Background

  • We received the complaint against the three main federal political parties on August 22, 2019. The other recipients of the complaint were: Elections Canada, the Competition Bureau, the CRTC and the BC OIPC.
  • Despite the thorough submissions made by the Complainant in this matter, we have concluded that PIPEDA does not apply to the activities of the Federal Political Parties that are the subject of the complaint as they are not commercial in character.
  • To come to our conclusion in this matter, the OPC carefully reviewed the complainant’s extensive representations on commercial activity and those received from the NDP and the Liberal Party of Canada. The Conservative Party of Canada did not provide a response.

Prepared by: Compliance Sector


Web Scraping (Publicly Accessible Personal Information)

Key Messages

  • Web scraping allows quick and easy collection of unprecedented amounts of personal information from publicly accessible websites, including social media.
  • Much of this information is not truly “publicly available”, as defined under PIPEDA, and therefore not exempt from consent requirements.
  • Our investigation into Clearview AI found that it illegally used such tools to collect billions of images from various websites to create a reference database for its facial recognition software, as the requisite consent was not obtained.
  • Experiences such as Clearview show there is a need to avoid broad interpretations of how “publicly available” information can be used without consent, as it could lead to serious harms. Our privacy laws must ensure that an individual’s reasonable expectations are taken into consideration in determining whether information is “publicly available”.

Background

  • Web scraping is the extraction of data from websites – generally via software that automatically browses the web and collects data. Web crawlers navigate through websites, creating copies of the data they access, without requiring the permission of a website’s operator.
  • While website operators can employ various anti-scraping techniques to identify and block automated crawlers, most of these countermeasures can be defeated.
  • As an example, Clearview AI conducted its scraping activities in contravention of terms of service of a variety of websites. Despite receiving cease-and-desist requests from Google, Facebook and Twitter, it continues to scrape images from these websites, stating that it believes it has “the right to do so”.
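The crawling-and-copying mechanics described above can be sketched in a few lines of standard-library Python. This is an illustrative sketch only – the inline HTML, the class and function names are invented for illustration, and it is not code from any investigation:

```python
# Illustrative sketch of web scraping: parse a page's HTML and collect
# the links found in <img> tags, using only Python's standard library.
from html.parser import HTMLParser

class ImageLinkParser(HTMLParser):
    """Collects the src attribute of every <img> tag encountered."""
    def __init__(self):
        super().__init__()
        self.image_links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "img":
            for name, value in attrs:
                if name == "src":
                    self.image_links.append(value)

def scrape_image_links(html_text):
    parser = ImageLinkParser()
    parser.feed(html_text)
    return parser.image_links

# A real crawler would fetch html_text over the network and follow
# hyperlinks to other pages; inline HTML keeps the sketch self-contained.
sample = '<html><body><img src="a.jpg"><p>text</p><img src="b.png"></body></html>'
print(scrape_image_links(sample))  # -> ['a.jpg', 'b.png']
```

Nothing beyond the standard library is needed for this sketch; large-scale scraping layers this pattern with automated crawling of many sites, which is precisely the activity that anti-scraping countermeasures attempt to detect and block.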

Prepared by: Compliance Sector


Facial Recognition (FR)

Key Messages

  • Recent investigations conducted by my office have shown that FR is a powerful technology that poses serious risks to privacy.
  • These risks include intrusions on individuals’ basic right to navigate public and private spaces, including online spaces, without the fear of being identified, monitored, and tracked at every turn by corporate and government entities.
  • FR use also creates risks of bias and discrimination towards certain groups, and can limit individuals’ ability to exercise other democratic rights and freedoms.
  • The global privacy community has acknowledged these risks by passing a resolution on FR use at the last annual conference of the GPA. My office is participating in a working group, formed under this resolution, to develop a set of global policy principles and expectations for the use of FR.

Background

  • Complaints: OPC recently completed three investigations into the use of FR: two under PIPEDA (Clearview AI and Cadillac Fairview) and one under the Privacy Act (the RCMP).
  • Guidance: OPC is currently working with our provincial and territorial counterparts to prepare joint guidance on FR use by police agencies. We plan to open a public consultation on a complete draft of the guidance in the coming months.
  • International: On October 15, 2020 at the Annual GPA Conference, a Resolution on FR was unanimously passed, which commits the GPA to working to develop a set of agreed upon principles and expectations for the use of personal information in facial recognition technology, including signalling where it poses the greatest risk to data protection and privacy rights, and recommend how those risks can be mitigated.

Prepared by: PRPA


FR Guidance

Key Messages

  • Last year, multiple media reports confirmed that numerous Canadian police agencies were using Clearview AI’s services.
  • While FR presents serious threats to privacy, there can be legitimate uses for public safety when used responsibly and in the right circumstances.
  • We recently published draft guidance on police use of facial recognition technology, developed jointly with our provincial and territorial counterparts.
  • We are consulting on the draft guidance until October 15, 2021.

Background

  • Police use of FR is regulated through a patchwork of statutes and case law that, for the most part, do not contemplate the risks of FR specifically. This creates room for uncertainty concerning what uses of FR may be acceptable, and under what circumstances.
  • The guidance clarifies legal responsibilities, as they currently stand, with a view to ensuring any use of FR by police agencies complies with the law, minimizes privacy risks and respects the fundamental human right to privacy.
  • Providing guidance specifically on police use of FR is timely given real-world use of FR by the police, the potential for legitimate public safety benefits, and the very serious risks to fundamental rights and freedoms at play.
  • Our office is also developing guidance on the use of biometrics, which includes facial recognition, by public and private organizations.

Prepared by: PRPA


Public Sector Uses of FR

Key Messages

  • Institutions should engage with us early to assess the privacy impacts of new initiatives, especially high-risk ones like FR. We would also expect to receive a PIA in advance of any such program, as required by federal government policy.
  • We have engaged with departments on FR initiatives for immigration and border security, national security, and law enforcement.
  • Some of our recommendations included:
    • Advising institutions to consider whether use of FR is necessary, proportionate and effective in context. 
    • Regarding accuracy, we commented on potential bias and possible disproportionate impacts on some individuals in relation to CBSA Primary Inspection Kiosks.
    • We recommended the CBSA ensure meaningful consent was obtained during a pilot program to test the efficacy of FR software in an operational border context (“Faces on the Move”).

Background

  • Since 2010, the Government Advisory Directorate has received four (4) Privacy Impact Assessments and completed six (6) consultations on initiatives explicitly involving facial recognition technology.  Discussions with law enforcement and national security on FR initiatives were only preliminary.
  • We are only aware of FR initiatives that have been reported to us by the CBSA, the Royal Canadian Mounted Police, the Department of National Defence, and Immigration, Refugees and Citizenship Canada.

Prepared by: GA


Uses of FR in the Private Sector

Key Messages

  • Private companies are increasingly using FR for a variety of purposes.
  • Examples include:
    • Monitoring test-taking in universities;
    • Verifying identity during financial transactions;
    • Controlling access to private property.
  • These uses raise a number of privacy concerns, including:
    • Is meaningful consent being obtained?
    • Are appropriate safeguards in place to protect highly sensitive facial data?
    • Will individuals be required to surrender sensitive biometric data to private companies in order to access goods and services?

Background

  • Examples of private sector FR use:
    • Canadian universities have used an FR service provider to monitor students and verify their identities while they write exams; the provider was the subject of a breach in October 2020.
    • Canadian banks have deployed FR as a means of verifying identity during financial transactions, including online banking and credit card transactions.
    • A Canadian real estate firm has deployed FR to control access and monitor entry to residential buildings; tenants can opt out of the system, but images are captured automatically from anyone entering the building (including delivery drivers, for example).

Prepared by: PRPA/TAD


FR and Artificial Intelligence (AI)

Key Messages

  • FR systems rely on AI to detect, analyze and compare the biometric features of facial images.
  • Recent advances in AI modelling techniques have improved the accuracy of FR algorithms.
  • However, the benefits of AI also bring risks: if the training data used to generate an FR algorithm lacks diversity, the algorithm may encode that bias by default.
  • In November 2020 my office released a series of recommendations for how PIPEDA should be amended to ensure the appropriate regulation of AI. Very few of our recommended measures are in C-11. In our view, this puts Canadians’ rights at continued risk.

Background

  • The most popular AI modelling technique is known as “deep convolutional neural networks,” by which FR algorithms are able to create 3-D facial templates consisting of close to a hundred biometric features from 2-D images.
  • FR systems have existed since the late 1960s; early versions relied on humans to manually select and measure the landmarks of an individual’s face, while today the process is fully automated using AI.
  • C-11 contains certain measures which go towards our recommended enhancements for regulating AI (but which are not exact): the use of de-identified information for socially beneficial purposes in certain circumstances (s39) and for internal research and development (s21); a form of explanation (s63(3)) and order making (s92(2)).

Prepared by: TAD/PRPA


How FR Technologies Transform Surveillance

Key Messages

  • The use of facial recognition technology, with its power to disrupt anonymity in public spaces, poses a serious risk of interference with individual liberty.
  • Further, in the Canadian context, our recent report to Parliament demonstrates that the use of mass surveillance for the purpose of providing a service to police is a major and substantial intrusion into the lives of citizens by the state.
  • The necessity of such practices, and the importance of the particular public good, must be examined carefully, on a case-by-case basis, for instance by way of an application for a warrant.

Background

  • Both the recent Cadillac Fairview and Clearview AI investigations showed how facial recognition capabilities amplify existing surveillance practices.
  • In the Clearview investigation, we observed that the mass collection of images, along with the creation of biometric facial recognition arrays, represented mass identification and surveillance of individuals.
  • Our investigation into the RCMP’s use of Clearview did not uncover evidence that the RCMP used Clearview AI to monitor protestors. However, the RCMP did not provide a reasonable accounting for the vast majority of searches.
  • We are aware of reports of such uses occurring outside of Canada, and we are concerned about the possibility of them occurring here. This speaks to the urgency of acting now to set appropriate limits on the use of FRT, rather than acting reactively once the damage has already been done.

Prepared by: PRPA


Racialized Communities

Key Messages

  • Despite advances in the sophistication of facial recognition, many FR algorithms still exhibit significant differences in accuracy across demographic groups (e.g., race and gender).
  • Studies have found that facial recognition technologies are more likely to misidentify or produce false positives when assessing faces of people of colour, and particularly women of colour, which could result in discriminatory treatment.
  • A false positive result in the context of a law enforcement investigation creates significant risks of harm to racialized individuals, who may be disproportionately and mistakenly placed under suspicion of serious criminality.

Background

  • Studies have shown that some commercial FR software tends to misidentify individuals from different racial and gender groups at different rates, creating a risk of bias in the use of FR technology.
  • Given the nature of these inaccuracies, some scholars have expressed concern that, if not appropriately managed, FR use may ultimately serve to deepen existing tensions and inequalities relating to policing institutions.
  • It is imperative that organizations take steps to minimize inaccuracy and bias in any deployment of FR technology.
  • In response to the public outcry following the police killing of George Floyd, congressional Democrats introduced a law reform bill in June 2020 (the Justice in Policing Act of 2020), which would, among other things, impose restrictions on the use of FR with police officers’ body-worn cameras. The bill would not create an outright ban or moratorium on FR generally. The bill has passed the House and has been read a second time in the Senate.

Prepared by: PRPA


Children, Seniors, Vulnerable Populations

Key Messages

  • The accuracy of FR technology can vary significantly across demographic groups. This creates a risk of unfair, unethical, or discriminatory treatment resulting from FR use.
  • If used inappropriately, FR technology can have lasting and severe effects on privacy and other fundamental rights, and this may be particularly true for social groups that have historically experienced marginalization or vulnerability.
  • When implementing FR for a specific purpose, organizations should carefully consider the unique needs, sensitivities, and disproportionate impacts on vulnerable populations, and address these risks at the outset in the design of the program.

Background

  • Vulnerable populations may have less capacity or diminished digital literacy to fully understand privacy implications that can stem from the analysis of their facial images, or to provide meaningful consent for the collection and use of their images for FR.
  • Vulnerable populations may also be less likely or able to avoid public spaces in which FR is deployed, given disparities in resources and capacity. This may lead vulnerable populations to be over-represented in the deployment of FR systems.
  • That said, there may also be some beneficial uses for vulnerable populations:
    • Law enforcement could use the technology to find missing children or seniors.
    • The RCMP has also publicly stated that it successfully used Clearview to identify two child victims.

Prepared by: PRPA


Investigation into RCMP use of Clearview AI

Key Messages

  • Our investigation into the RCMP’s use of Clearview AI found that Canada’s national police force contravened the Privacy Act when it collected personal information from Clearview AI.
    • A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.
  • The investigation revealed that the RCMP has serious and systemic gaps in its policies and systems to track, identify, assess and control novel collections of personal information.
    • Such system checks are critical to ensuring that the RCMP complies with the law when it uses new technology such as FRT, and new sources, such as private databases.
  • While the RCMP did not agree with our conclusion, it nonetheless agreed to implement our recommendations to improve its policies, systems and training. This includes conducting fulsome privacy assessments of third party data collection practices to ensure compliance with Canadian privacy legislation.

Background

  • The report of findings of the investigation was included in a special report to Parliament by the OPC on June 10, 2021.
  • The OPC, jointly with provincial and territorial counterparts, has launched a public consultation on draft FR guidance for police. The consultation solicits input from stakeholders and the public on the contents of the guidance document, and on the Canadian legal and policy framework for police use of FR.

Prepared by: Compliance Sector


Moratoria

Key Messages

  • Used responsibly and in the right circumstances, facial recognition can provide benefits to security and public safety. I do not believe it should be banned outright, though the idea of no-go zones for certain specific practices is something to consider.
  • For example, the European Commission’s proposed AI regulation seeks to establish prohibitions on a number of uses of AI.
  • That said, given the risks of FR use, we must ensure that our laws are effective in safeguarding the fundamental rights of individuals when this technology is used.
  • Ultimately, it rests with Parliament to consider a moratorium until such time as specific legislative changes on this issue are recommended and adopted.

Background

  • The EU has proposed legislation on AI that prohibits biometric identification in publicly accessible places for the purpose of law enforcement, with exceptions.
  • In July 2020, the International Civil Liberties Monitoring Group, in conjunction with civil society organizations and privacy advocates, published a letter to Minister Blair calling for a ban on “facial recognition surveillance” by law enforcement.
  • In September 2020, Montreal City Council passed a motion placing restrictions on the use of facial recognition by city police. The Toronto Police Service has agreed to a moratorium on the use of FRT in body-worn cameras pending the release of the OPC guidance.
  • Several US cities and states have banned law enforcement use of FR. Examples include Boston, San Francisco, Oakland, Portland, Virginia, and Vermont.
  • Over the past year, several major technology firms, including Microsoft, IBM, and Amazon, have publicly committed to self-imposed bans on the sale of FR to police agencies, pending law reform.

Prepared by: PRPA


Current Laws on FR Use

Key Messages

  • FR in Canada is regulated by a patchwork of laws, except in Quebec, which has a specific biometrics regime. Since FR operates through the use of personal information, it is captured by privacy legislation, including PIPEDA for private companies and the Privacy Act for federal institutions.
  • Police and state use of FR is limited by the Charter, which guarantees the right to be free from unreasonable search and seizure. The Supreme Court of Canada has recognized a general right to anonymity, and a right to privacy within public spaces.
  • The lack of specific legislation for police FR use stands in contrast with the DNA Identification Act and the Identification of Criminals Act, which regulate how DNA samples, fingerprints, and mugshots can be collected, used, disclosed, and destroyed.

Background

  • In Quebec, the Act to Establish a Legal Framework for Information Technology governs the use of biometrics. Proposed Bill 64 would require organizations to notify the Commission at least 60 days before a biometric database is brought into service.
  • Under s. 8 of the Charter, a search is unlawful where an individual has a reasonable expectation of privacy, unless the search is authorized by law. Anonymity is part of the right to privacy (R v. Spencer, 2014 SCC 43), and individuals retain expectations of privacy in public spaces (R v. Jarvis, 2019 SCC 10).
  • The DNA Identification Act and Identification of Criminals Act set out thresholds for the collection of their specified biometrics, as well as conditions for collection (including compelled production and judicial warrants). The DNA Identification Act uses highly specific rules to limit how the National DNA Databank may be used. This reflects the sensitivity of the personal information involved.

Prepared by: Legal


 

Other Jurisdictions

Key Messages

  • For the most part, courts and regulators are trying to apply existing privacy and biometrics laws while policy makers consider what form additional regulation, if any, should take.
  • Existing laws for biometrics generally entail stronger privacy obligations such as registration and secure storage of biometric databases, as in Quebec, or the right to sue for collection of biometric information without consent, as in Illinois.
  • Many states and municipalities in the United States have developed FR legislation. For example, California and Oregon have banned the use of FR in police body-worn cameras, while San Francisco, Oakland, and Virginia have banned the use of FR by police and other public bodies.
  • The Australian parliament is working on a bill that would establish a national FR database for government use.

Background

  • Biometrics are a special category of data under the GDPR. Per Article 9, processing of special categories is prohibited unless a separate condition for processing is met. A DPIA is required for any type of processing likely to be high risk, meaning one is generally required for processing a special category of data.
  • Europe is going in the direction of limiting FR to only the most serious or violent crimes. For example, the European Commission’s proposed AI regulation seeks to establish prohibitions on a number of uses of the technology.
  • Montreal is the first Canadian city to begin regulating FR, creating guidelines for police use. The U.S. has no federal FR regulation, but many individual states and municipalities have developed legislation; for example, in 2019, San Francisco banned the use of FR software by police and other agencies.
  • The Australian Identity-matching Services Bill would allow sharing of identification information, including facial images, between the federal, states and territories for the purposes of identity-matching to prevent identity crime, support law enforcement, uphold national security, and improve service delivery.

Prepared by: PRPA


EU Ban

Key Messages

  • The European Commission introduced its Artificial Intelligence Act, which outlaws, up front in legislation, harmful applications of AI that are incompatible with human rights. Once again, we see Europe leading the way.
  • The proposal would prohibit AI systems used for mass surveillance, including remote biometric identification in public (subject to special authorization), with fines of up to 6% of global revenue.
  • While we are supportive of explicit prohibitions in the law, we see a continuing role for ex post decisions by the OPC on inappropriate practices. Prohibitions carry the gravitas of law and reflect the will of Parliament. OPC decisions consider the appropriateness of an activity in specific circumstances, which may not be as widely applicable.

Background

  • AI-related prohibitions already exist in Europe’s privacy legislation. The GDPR prohibits automated decisions that produce legal or similarly significant effects, unless an individual consents.
  • The proposed EU Law prohibits: AI that deploys “subliminal techniques” to manipulate behaviour (including exploiting vulnerabilities of groups), use by public authorities for social scoring, and use by police for remote biometric identification in publicly accessible spaces.
  • The proposed law also includes a list of AI applications deemed “high-risk,” which would not be prohibited but would require record keeping, a risk management system, human oversight, and the use of accurate and representative data for AI training.
  • Our AI recommendations included the conduct of PIAs to assess risk and balance against the individual’s rights, and safeguards such as de-identification where possible, traceability, and the ability to request human intervention.
  • The proposed law would also establish a European AI Board, which would recommend use cases to be deemed “high-risk.” The EU Data Protection Supervisor would be a member of the board.

Prepared by: PRPA

