Issue Sheets on the Study of Bill C-27

Table of Contents

General Comments on C-27

Amendments Proposed by Minister – CPPA

Amendments Proposed by Minister – AIDA

Adequacy of Current OPC Powers

Admissibility of OPC Recommendations

Authorized Individuals

Codes and Certification

Compliance Agreements

Domestic Collaboration

Discretion for Resource Allocation

Private Right of Action

Commissioner-initiated Complaints

Audits

Protocols to Protect Investigations

Resource implications of C-27

Section 109 – Powers, Duties and Functions - Factors to consider

The Tribunal

Timelines: Breach Reporting

Timelines: Return of Records

Offences

Artificial Intelligence and Data Act (AIDA)

Amendments Proposed by Minister – AIDA

AI and Data Commissioner

Automated Decision Making

Automated Decision Making – Right to Contest

C-27 Submission – AI-related Recommendations

ChatGPT Investigation

EU AI Act

Types of Generative AI

Government of Canada Voluntary Code of Conduct on Generative AI

OPC Regulatory Framework for AI

Amendments to the Australian Privacy Act

The California Delete Act

EU AI Act

TikTok

Chart: Comparison of Selected International AI Regulatory Models – In force and Proposed

Chart: Jurisdictional Scan of Definitions for De-identification, Anonymization and Pseudonymization

Accountability

Administrative Monetary Penalties

Appropriate Purpose

Breaches - Reparation for Damages

Business Activities

Cybersecurity

Data Mobility

Definition of Minors

Definition of PI (Inferences)

Definition of Sensitive Information

De-Identification / Anonymization

Disclosures to Law Enforcement

Disposal

Domestic Service Providers

Exceptions to Consent for Research

Facebook Litigation

Google Decision

Implications of C-27 on Substantially Similar legislation

Interoperability of Public/Private Legal Requirements

Legitimate Interest

Online Behavioural Advertising

Privacy by Design / Privacy Impact Assessments

Political Parties

Preamble

Publicly Available Information

Purpose Clause

Reputation (De-indexing)

Safeguards

Socially Beneficial Purposes (Exceptions to Consent)

Trans-border Data Flows

Appearance before the Standing Committee on Industry and Technology (INDU) of October 19, 2023 - The Study of Bill C-27


General Comments on C-27

Key Messages

  • Bill C-27 is an important step towards establishing stronger privacy protections while allowing responsible innovation.
  • The bill is an improvement over both PIPEDA and the former Bill C-11 in many ways. It contains many positive developments addressing several of the OPC’s key concerns, such as the addition of a preamble and new protections for minors.
  • Despite the many positive aspects of the Bill, the OPC believes that C-27 can be further improved in a few key areas, as detailed in the submission shared with you on May 10, 2023.
  • We have reviewed the Bill through the lens of three key elements:
    • Privacy as a fundamental right;
    • Privacy in support of the public interest and Canada’s innovation and competitiveness; and
    • Privacy as an accelerator of Canadians’ trust in their institutions and in their participation as digital citizens.

Background

Privacy as a Fundamental Right

  • Privacy is a fundamental right and instrumental to the exercise of other rights.

Privacy in support of the Public Interest and Canada’s innovation and competitiveness

  • Privacy is not a barrier to innovation. Canadians should not have to choose between protecting their privacy and participating in the digital economy.
  • Organizations should consider privacy impacts at the front-end.

Privacy as an accelerator of Canadians’ trust

  • Greater trust can be fostered by ensuring privacy is respected and individuals can exercise their rights.
  • Strong private-sector privacy laws include effective enforcement mechanisms that help instill confidence in the data-driven economy.

Prepared by: PRPA


Amendments Proposed by Minister – CPPA

Key Messages

  • We generally welcome the Minister’s proposed amendments, which align with our recommendations on several issues, including recognizing privacy as a fundamental right.
  • Regarding the Minister’s proposal to strengthen protections for children, we would recommend that the Bill refer to the “best interests of the child” and not “special interests of minors”, as the former is a well-established legal concept originating in the UN Convention on the Rights of the Child.
  • We would not oppose amending section 12 to include consideration of whether the information relates to a minor. However, we still believe that the list of factors should be non-exhaustive to provide greater flexibility.
  • We are pleased with the Minister’s proposals on compliance agreements, including that they can contain “financial considerations” and will be final and not appealable. We are also happy to hear the Government is carefully examining our other recommendations to allow for more flexibility in this area.

Background

  • On October 3, 2023, Minister Champagne provided the Standing Committee on Industry and Technology with a letter that included proposed amendments to Part 1 of Bill C-27, the Consumer Privacy Protection Act (CPPA).
  • In the letter, the Minister proposes amending the preamble of the Bill and the purpose clause of the CPPA (s. 5) to qualify privacy as a fundamental right.
  • The Minister also proposes amending the preamble to include a reference to “the special interests of children…” and to include this as a factor to consider under subsection 12(2) (appropriate purposes).
  • The Minister proposes permitting compliance agreements to contain “financial consideration”. Such agreements would be final and could not be appealed. The letter also states that the Government is “carefully examining” the OPC’s other proposals to grant us more flexibility with respect to enforcement.

Prepared by: PRPA


Amendments Proposed by Minister – AIDA

Key Messages

  • The amendments to AIDA proposed by Minister Champagne seem generally positive, as they would bring clarity about the Act’s intended scope of application and obligations. They also further alignment with the EU AI Act.
  • Given the nuance required when dealing with AI, it is challenging to provide more detailed comment without seeing proposed legislative text.
  • Of particular interest will be the relation between the CPPA and AIDA in practice, as many of the proposed “high impact systems” would appear to be governed by both the CPPA and AIDA, and certain obligations appear to overlap (e.g. AIDA’s accountability framework and the CPPA’s privacy management program).

Background

  • The Minister’s proposed amendments are grouped into five categories:
    • defining classes of systems that would be considered high impact;
    • specifying distinct obligations for generative general-purpose AI systems, like ChatGPT;
    • clearly differentiating roles and obligations for actors in the AI value chain;
    • strengthening and clarifying the role of the proposed AI and Data Commissioner; and
    • aligning with the EU AI Act as well as other advanced economies.
  • No legislative text is provided, though detail is included re: high-impact systems and a new definition of AI (that aligns with the OECD).
  • Accountability framework: A new proposal, by which all parties conducting regulated activities would need to prepare a framework (which can be requested by the AIDC) setting out: roles and responsibilities for key personnel; training requirements; and, policies and procedures related to: (i) risk management; (ii) notice of serious incidents; and (iii) respecting the data used by the system.

Prepared by: PRPA


Adequacy of Current OPC Powers

Key Messages

  • Under the current law (PIPEDA), the OPC is unable to issue binding orders or financial penalties.
  • If the OPC learns of a potential privacy concern relating to an organization subject to PIPEDA but has not received a complaint from an individual, the Commissioner may initiate a complaint himself (s.11(2) PIPEDA) or conduct an audit of the organization’s practices (s.18 PIPEDA), subject to legislative thresholds.
  • The OPC may also issue non-binding guidance about topics coming to our attention relating to certain information handling practices in order to help organizations better comply with PIPEDA.

Background

  • The threshold under PIPEDA for the Commissioner to initiate an investigation against an organization is outlined in s.11(2) (i.e., “reasonable grounds to investigate a matter under [Part I]”), whereas for audits (s.18), the threshold is higher and requires “reasonable grounds to believe that the organization has contravened a provision of Division 1 or 1.1 or is not following a recommendation set out in Schedule 1”.
  • There have been approximately 30 Commissioner-initiated investigations in the last decade, but very few audits, primarily because of the differences in legislative thresholds needed to trigger such activities (i.e., in most scenarios, the Commissioner-initiated complaint threshold of “reasonable grounds to investigate a matter” is easier to meet than the threshold required for audits of “reasonable grounds to believe that an organization has contravened a provision”).
  • Under PIPEDA, recommendations following an audit or investigation are non-binding. If an organization does not follow recommendations following an investigation, the complainant or the OPC may in certain circumstances commence a de novo application to have those recommendations enforced by the Federal Court (s.14 PIPEDA).
  • Powers proposed in the CPPA would give the OPC enforcement tools to ensure or incentivize compliance with the Act (i.e., issuing orders, recommending AMPs).

Prepared by: Legal


Admissibility of OPC Recommendations

Key Messages

  • It has been suggested that, based on the rules of legislative procedure, some amendments to Bill C-27 proposed by the OPC are inadmissible now that the Bill has passed second reading.
  • Even after second reading, it is not too late for INDU to propose clearly relevant and substantive amendments that go beyond the scope of the referred Bill, to make changes to the preamble, or to make consequential amendments to other Acts.
  • At committee, a bill’s scope of application can be expanded or narrowed by way of a motion authorizing what would otherwise be beyond the HoC’s powers, where necessary and where an amendment is clearly germane to the objective of the bill.

Background

  • Substantive amendments to preambles are admissible only if they are rendered necessary by amendments made to the bill; amendments are also allowed to clarify or to ensure the uniformity of the English and French versions of a preamble.
  • OPC recommends inserting a standalone preamble into the CPPA and/or adjusting C-27’s preexisting overarching preamble, as this may still be possible after second reading. Similarly, OPC maintains there are no apparent legal obstacles to amending the bill to require that judicial reviews of Tribunal decisions go directly to the Federal Court of Appeal, even if s.28(1) of the Federal Courts Act cannot be modified because the latter Act was not before committee.
  • There are occasions when an in-scope and relevant amendment of an Act not named in the bill can only be accomplished by modifying a section of the parent act untouched by the bill; see the Decision of Speaker Geoff Regan: Hansard – 341 (October 24, 2018, Debates pp. 2297-8) [Bill C-76 – … Canada Elections Act…]
  • Prior Examples of motions expanding the scope of a bill: (1) “power to expand the scope of the Bill in order to allow members of the RCMP to qualify for the priority hiring program”: HoC Journals, 41st Parl, 2nd Sess, No. 107, June 19, 2014, p. 1321 [Bill C-27 – An Act to amend the PSEA]; (2) “the committee be granted the power to expand [the Bill’s] scope”: HoC Journals, 44th Parl, 1st Sess, No. 193, 9 May 2023 [Bill C-21 – An Act to amend certain Acts (firearms)]

Prepared by: Legal


Authorized Individuals

Key Messages

  • Canadians should be able to authorize someone else to represent them for the purposes of exercising their rights under the CPPA in cases where they might not be able to do so by themselves due to a disability, language barrier, or other reason.
  • The former Bill C-11 included a clause (4(2)(c)), which does not appear in C-27, providing that the rights and recourses under the Act could be exercised on behalf of an individual by any person authorized in writing to do so by that individual.
  • We recommend reinserting this provision, and clarifying who can be authorized to act on behalf of individuals who wish to file a privacy complaint with my Office.

Background

  • Re-inserting the provision originally included in the former Bill C-11 would add clarity and ultimately ensure there are no new barriers introduced in Bill C-27 reducing an individual’s ability to exercise their privacy rights under the CPPA.
  • We understand that some may interpret such a provision in a manner that raises concerns for potential abuse or fraud.
  • However, as a result of the removal, it is now unclear if individuals will still have this flexibility, or if the CPPA creates the possibility that an individual may not be able to get help from a third party of their choosing.
  • It may now be unclear whether individuals still have the ability to file complaints with the OPC through their chosen representative.
  • Currently, PIPEDA only addresses this issue in a limited fashion. For instance, the law does not currently specify whether someone can submit a complaint on behalf of someone else. In practice, the OPC has accepted complaints from an individual’s personal representative, with their written consent.
  • Although PIPEDA does not currently specify whether, or by whom, a complainant can be represented, the OPC allows this in practice.

Prepared by: Legal Services


Codes and Certification

Key Messages

  • Ss. 76-77 of the CPPA give the OPC the power to approve codes of practice and certification programs, which are an excellent means of bringing the Act’s privacy principles to a more concrete level, adding certainty for both organizations and consumers.
  • However, the CPPA differs from other international models. For example, almost all of the parameters of the codes and certification schemes are not within the discretion of the OPC; rather, they will be set by the government and prescribed by regulations.
  • This is not the case in other jurisdictions such as Australia, New Zealand and the EU under the GDPR.
  • The provisions on codes of practice and certification programs should give the OPC the same discretion as those of our colleagues internationally.

Background

  • Provisions related to codes of practice in the GDPR, as well as in the privacy laws of Australia and New Zealand, differ from the model proposed in the CPPA, generally relating to the lack of flexibility provided to the regulator with respect to procedures for approving codes and certification programs.
  • Under the CPPA, the Commissioner would have powers to receive entities’ applications for Codes of Practice, approve the Code and revoke it. The submission, approval, and revocation processes are to be set out in regulations.
  • The Commissioner would also receive, review, and make decisions on applications for certification programs. As well, he would have certain powers such as revoking an approval of a certification program and recommending to program operators that an organization’s certification be withdrawn. Most parameters for certification programs are also to be set in regulations.

Prepared by: PRPA


Compliance Agreements

Key Messages

  • S.87(1) of the CPPA allows the OPC to resolve complaints via negotiated compliance agreements in certain circumstances.
  • While we support the inclusion of compliance agreements, having used them effectively under PIPEDA, the regime set out in C-27 could make agreements less effective than they are now.
  • As written, C-27 could be interpreted as limiting the use of compliance agreements to investigations, preventing the negotiation of such agreements in advance of investigation or during inquiries. This would limit opportunities for negotiated solutions where voluntary agreements would allow expeditious and timely resolution of matters to the benefit of all parties.
  • Under C-27, compliance agreements would not be immediately enforceable, instead requiring our Office to establish non-compliance through a lengthy inquiry and possible appeal simply to enforce compliance with a negotiated settlement.
  • For clarity, C-27 should adopt text confirming that compliance agreements can include payment of AMPs and other terms.

Background

  • It would provide enhanced flexibility and benefit to all parties involved if complaints could be resolved by compliance agreements at any time in advance of the conclusion of an inquiry and recommendations to the Tribunal.
  • C-27 should enable compliance agreements to be registered with the Federal Court to give them the same force and effect as a court order, as with Competition Bureau compliance orders. The proposal in C-27 delays enforcement of organizations’ compliance with terms to which they have voluntarily agreed, and which the OPC has accepted as an alternative to pursuing further enforcement action.
  • C-27 does not expressly indicate that compliance agreements can include AMPs or other negotiated measures, which should be addressed.

Prepared by: Compliance


Domestic Collaboration

Key Messages

  • Given that data flows have implications for privacy and other rights, enforcement collaboration within privacy and across regulatory spheres is critical to protecting individual rights.
  • C-27 only allows the OPC to collaborate with Provincial/Territorial Information and Privacy Commissioners, the CRTC and the Competition Bureau. We believe it is important to have greater flexibility to collaborate with domestic authorities that have a mandate in digital regulation, including the proposed AI and Data Commissioner, consistent with our ability to collaborate with a wide variety of international authorities.
  • It should also be made clear that collaboration with the CRTC and the Competition Bureau can be for addressing broad compliance-related matters in areas such as data and the digital market, advertising, and privacy initiatives with competitive impacts.

Background

  • Applying language found in ss. 120(1)(b) of the CPPA to domestic cooperation would achieve parity with respect to international and domestic collaboration.
  • OPC could benefit from collaboration with additional domestic regulators such as credit reporting regulators, the Superintendent of Financial Institutions, Human Rights commissions, and the proposed AI and Data Commissioner.
  • (Redacted). Many of our Office’s most significant cases in the past several years, such as Tim Hortons, Clearview AI, and Facebook Cambridge Analytica, have been conducted jointly with our provincial partners.
  • Some benefits of these investigations included limitation of duplicative processes, streamlined feedback on compliance across multiple jurisdictions, increased certainty for respondents and increased resources for involved regulators. Collaboration with the CRTC and/or Competition Bureau could yield similar benefits for Canadians and organizations.

Prepared by: Compliance


Discretion for Resource Allocation

Key Messages

  • The OPC prefers to assist entities in complying with the law, rather than moving too quickly to use enforcement powers. Effective regulators prioritize activities based on risk, maximizing resources to produce the most effective outcomes.
  • Compared with Bill C-11, Bill C-27 provides greater discretion to the OPC to manage complaints and investigations, and gives the Office more flexibility to undertake guidance in a manner the Commissioner deems appropriate.
  • This will allow us more flexibility to use resources strategically and prioritize our activities based on risks to Canadians.
  • Aside from ensuring the OPC is resourced to carry out new functions, we have no further comments on these changes.

Background

  • Guidance Functions: Section 110 of the former Bill C-11 required the OPC to consult with stakeholders when developing guidance and tools (s.110(1)(b)); to conduct research on the operation or implementation of the Act at the request of the Minister (s.110(1)(c)); and to recommend corrective measures on PMPs at the request of an organization (s.110(1)(e)), all without the discretion to manage these new responsibilities.
  • Bill C-27 now includes amendments to increase OPC’s discretion regarding how these requirements are carried out, provides flexibility regarding who is to be consulted in guidance development and includes a “greater certainty” clause which clarifies the OPC has flexibility to manage and prioritize workload. As such, in our view, our previous concerns have been addressed.
  • Complaints and Investigations: Ss. 83 (1)(I) sets out that the Commissioner may choose not to proceed with an investigation where he is of the opinion that investigation or any further investigation is unnecessary having regard to all the circumstances of the complaint. This further allows us the discretion to manage our work.

Prepared by: PRPA (Consultation with Compliance)


Private Right of Action

Key Messages

  • S.107 allows individuals to pursue a private right of action (PRA) against organizations subject to certain conditions, including a finding made by the OPC. We believe this right should not be contingent on a finding by my office of a contravention of the Act.
  • The OPC offers our expertise and a relatively quick and inexpensive remedy to individuals alleging violations of the Act. That said, it is not our only role; to be an effective regulator, we must be able to be strategic and flexible to address systemic issues. We cannot be the only gateway to all consumer remedies.
  • Accordingly, we recommend allowing an individual to start a PRA that is still dependent on a complaint to our Office, but not on the finding of a violation of the Act.
  • The OPC should be the first-level decision maker for issues under our jurisdiction, but there may be circumstances where an individual raises a cause of action beyond a pure breach of the Act; in such cases, the Courts may be an appropriate option for individuals pursuing a remedy that is not contingent upon a finding by the OPC.

Background

  • The former C-11 included the same provision, and our recommendations on the provision remain the same. Some suggest there be no pre-conditions to exercising the PRA, and that the CPPA clarify that the PRA is not a “complete code”.
  • This proposal may help avoid risks associated with an entirely free-standing PRA (e.g. concurrent proceedings based on an alleged breach of the CPPA, possible inconsistent decisions by the Courts and OPC, etc.).
  • In Hopkins v. Kay, the Ontario Court of Appeal held that allowing individuals to pursue common law claims would not conflict with or undermine the scheme established by PHIPA.

Prepared by: Legal Services


Commissioner-initiated Complaints

Key Messages

  • s. 82(2) states the OPC may initiate a proactive investigation only where it has “reasonable grounds to investigate”.
  • Since technologies and business models are increasingly complex and opaque, evidence of contraventions and associated privacy risks may not be apparent until serious harms to Canadians have already occurred.
  • Proactive investigations, or inspections, without the precondition of “reasonable grounds”, would allow the OPC to verify compliance and prevent privacy harms before they occur.
  • When compared to many international and domestic privacy authorities, the OPC is an outlier. Provincial and international regulators have broader powers to proactively investigate to ensure compliance without the strictures imposed by C-27.

Background

  • Many privacy practices in the modern digital world are opaque to individuals, who may not have sufficient information or awareness to file a complaint. Similarly, our Office may not be aware of organizations’ practices or lack access to the evidence necessary to establish reasonable grounds to investigate.
  • To meet the “reasonable grounds” threshold, we have often relied on media to identify the evidence necessary to commence an investigation, only after serious privacy violations and damage to Canadians have occurred. This was the case in our investigations into Tim Hortons, Clearview AI and Cadillac Fairview.
  • Examples of legislation with far more permissive standards include:
    • Alberta Personal Information Protection Act (s. 36(1)(a))
    • Quebec’s private-sector privacy law (S. 81)
    • Australia’s Privacy Act, ss.40(2)(a)
    • Ireland’s Data Protection Act ss. 110(1) and ss. 123(1)
    • GDPR Article 57(1)(a) and
    • New Zealand’s Privacy Act 79(b)

Prepared by: Compliance


Audits

Key Messages

  • We are pleased to see that under C-27, the OPC may now initiate an audit where there are reasonable grounds to believe the organization has contravened, is contravening or is likely to contravene the Act.
  • However, the Bill continues to require the Commissioner to meet a threshold before conducting an audit, which is contrary to the approach in numerous other jurisdictions including Alberta, Quebec and the EU.
  • The ability to examine complex technologies and business models is needed to provide Canadians assurance that their privacy rights are being protected.

Background

  • Currently, in PIPEDA (and the former C-11) the Commissioner can only undertake an audit where there are reasonable grounds to believe an organization has contravened the Act.
  • Although C-27 has broadened the OPC’s ability to initiate an audit to include situations where there are reasonable grounds to believe that an organization “is contravening or is likely to contravene” Part 1 of the Act, the threshold of “reasonable grounds to believe” remains, which is unnecessarily restrictive.
  • We note that C-27 requires that, following an audit, the Commissioner must provide the organization with a report that contains findings and recommendations (s. 98(1)). As well, the OPC may make interim orders in carrying out an audit per s. 99(1)(a). We are supportive of these provisions.
  • The former Bill C-11 prohibited audits based on information obtained during a privacy management program review. C-27 (s.111) now permits the use of such information to launch an audit if an organization has “willfully” disregarded the corrective measures proposed by the OPC, following its review of the PMP. We believe this to be a welcome development.
  • Other laws that do not have thresholds for audits include Alberta (S. 36 of PIPA), Quebec (S.81 – 81.3 of P-39.1) and the GDPR (Articles 57, 58, and Recital 129).

Prepared by: PRPA


Protocols to Protect Investigations

Key Messages

  • CPPA’s new adjudicative powers (i.e., order-making under subsection 93(2)) in the inquiries context will likely require some restructuring at the OPC to avoid perceptions of unfairness.
  • It is anticipated that inquiries and order-making will necessitate more formal procedures by the OPC and that these will be made public. OPC will be transparent and publish our rules and procedures as they are developed.

Background

  • Currently, one of the OPC’s main compliance activities is the resolution of complaints through various investigative approaches. The procedures for investigations are flexible and may be adapted depending on the nature of the allegations and the evidence the OPC requires to make findings in any given case. The OPC’s existing procedures for investigations are currently described in general guidance on our website.
  • It is anticipated that inquiries and order-making will necessitate more formal procedures by the OPC and these will be made public.
  • Some of the OPC’s provincial and territorial counterparts that have order-making power have developed either formal codes of procedures or guidelines for their adjudication processes, which are more akin to formal tribunal proceedings than to the OPC’s existing administrative investigations. Some have also set up distinct adjudication units, staffed by professional adjudicators, which act independently of the Commissioner and of their office’s investigation units.
  • Section 92 of the CPPA provides that the Commissioner must make rules respecting the conduct of an inquiry and make those rules publicly available.
  • Pursuant to section 112 of the CPPA, the Commissioner must make readily available information on the manner in which the Commissioner exercises the Commissioner’s powers or performs the Commissioner’s duties or functions under this Act.

Prepared by: Legal Services


Resource implications of C-27

Key Messages

  • C-27 expands existing responsibilities for the OPC and introduces new ones while also granting some discretion in how they are exercised.
  • Funding was assigned to the OPC in the 2020 Fall Economic Statement for the implementation of the former Bill C-11, and more recently, as part of Budget 2023 for the ramping up phase of Bill C-27.
  • These funds will help us to prepare for the coming-into-force of the legislation and deliver on our new responsibilities.
  • However, these funds are not sufficient to permanently fund new activities or to address the chronic underfunding of existing activities, which prevents our Office from appropriately dealing with the full volume and complexity of emerging issues that significantly impact the privacy rights of Canadians.

Background

  • The 2020 Fall Economic Update allocated the following total funding for the OPC to “support the implementation and enforcement of private sector legislation”:
    2020-21 2021-22 2022-23 2023-24 2024-25 2025-26
    (Redacted) (Redacted) (Redacted) (Redacted) (Redacted) (Redacted)
  • Budget 2023 set aside additional temporary funding meant to support our Office in implementing new mandate obligations in the first few years of the new law.
    2023-24 2024-25 2025-26 2026-27 2027-28
    $2M $4M $4M $3M $2M
  • Our current estimate for the total implementation cost of Bill C-27, including addressing existing underfunding, is approximately $25 Million (excluding the Employee Benefit Plan) in additional funding – more than twice what has been allotted in permanent funding in the Fiscal Framework.

Prepared by: Corporate


Section 109 – Powers, Duties and Functions - Factors to consider

Key Messages

  • S. 109 of the CPPA outlines factors the Commissioner must take into account when performing his functions and exercising his powers (the purpose of the Act, the organization’s size and revenue, the volume and sensitivity of the personal information it handles, and matters of general public interest).
  • We are concerned that the mandatory and broad nature of s. 109 could create challenges in its implementation, and recommend it not be compulsory.
  • Instead, the Commissioner should be encouraged to consider the same factors.
  • In practice, OPC already carefully considers the realities and circumstances of organizations, including small and medium enterprises, a subset of organizations for which s. 109 seems tailor-made.

Background

  • A variation of this provision was in the former Bill C-11, and we remain concerned that the provision may lead to unnecessary litigation - our Office could be prevented from acting, or our procedures delayed, on frivolous grounds alleging that we did not give due consideration to the context.
  • We note several other C-27 provisions that accommodate SMEs (beyond s. 109):
    • The Preamble refers to the necessity of an “agile regulatory framework” to facilitate compliance with rules by (and promote innovation within) organizations “of all sizes”;
    • s. 9(2) – Privacy management programs requirements scale with the volume and sensitivity of information an organization controls;
    • s. 95(5)(b) - Tribunal must consider impacts on business and its ability to pay when determining appropriate penalties; and,
    • s. 110(1)(e) - OPC must provide guidance to organizations on their privacy management programs on request.

Prepared by: Legal Services


The Tribunal

Key Messages

  • Part 2 of Bill C-27 establishes the Personal Information and Data Protection Tribunal, which is empowered to hear appeals of the OPC’s findings, orders, decisions, interim orders and recommendations for administrative monetary penalties.
  • The Tribunal can substitute its own finding, order, or decision for the one under appeal if it finds an error was committed by the OPC. Per s.21 of PIDPTA, a decision by the Tribunal is final and can only be judicially reviewed by the Federal Court.
  • The current review process is lengthy and costly. The OPC recommends that reviews of Tribunal decisions be conducted directly by the Federal Court of Appeal.
  • This would remove one layer of review, support more comparable levels of review with certain provincial counterparts and bring finality to matters more quickly.

Background

  • The Tribunal will be composed of three to six members appointed by the Governor in Council on the recommendation of the Minister. At least three of the members will be required to have experience in the field of information and privacy law.
  • Having the Tribunal’s decisions directly reviewed by the FCA would remove a layer of review, expedite the review process, and bring finality to matters more quickly.
  • None of the OPC’s substantially similar provincial counterparts have a review body equivalent of a Tribunal; instead, such matters would go directly to a provincial court or superior court.
  • Having reviews go to the FCA would also support more comparable levels of review with certain provincial counterparts.

Prepared by: Legal


Timelines: Breach Reporting

Key Messages

  • Breaches of personal information must be addressed swiftly to reduce the risk of harm to individuals.
  • S. 58 of the CPPA imposes breach reporting timelines (“as soon as feasible”) that are vague and lead to delays, which reduces the OPC’s ability to enforce compliance and protect Canadians.
  • Further, the CPPA only requires organizations with personal information under their control (known as data controllers) to report breaches to my office. If the breach occurs at a service provider (which provides services on the data controller’s behalf), the service provider is only required to report it to the affected data controller.
  • All organizations - whether data controllers or service providers - should be required to report breaches to my Office within 7 days.
  • A 7-day reporting timeline for all organizations provides priority and clarity to breach reporting and can assist organizations in advancing remedial measures and mitigating ensuing damages.

Background

  • The CPPA echoes wording in PIPEDA that breaches must be reported “as soon as feasible after the organization determines that the breach has occurred”.
  • This means that the OPC receives 40% of breach reports more than 3 months after the breach occurred.
  • The imbalanced breach reporting requirements proposed by the CPPA result in the OPC relying on data controllers to report breaches even though the service provider that suffered the breach has a more complete picture of the incident.
  • We expect that implementing our recommendations for reporting and timelines will result in the timely and complete breach reports needed to protect the privacy rights of Canadians impacted by a breach.

Prepared by: CIRD


Timelines: Return of Records

Key Messages

  • Section 99(2) of C-27 requires the OPC to return records or things within 10 days of a request by the organization.
  • This timeline is problematic, as the OPC makes use of digital forensic techniques to extract evidence which can take well over 10 days to complete.
    • If such techniques and processes are interrupted, they must be restarted anew to maintain the integrity of the evidence involved.
    • A 10-day time limit could prevent the use of these techniques entirely, negatively affecting our investigative capabilities.
  • Consequently, we believe that 10 days is too restrictive a timeframe and risks impeding our investigative processes, and we would advocate for more flexibility.

Background

  • Subsection 99(2) states that “The Commissioner or the Commissioner’s delegate must return to a person or an organization any record or thing that they produced under this section within 10 days after the day on which they make a request to the Commissioner or the delegate, but nothing precludes the Commissioner or the delegate from again requiring that the record or thing be produced.”
  • We would propose that 99(2) should be modified to oblige the Commissioner to return such records or things after the investigation, inquiry, or audit is completed and after all related proceedings have been concluded.

Prepared by: Legal


Offences

Key Messages

  • S. 128 of C-27 makes it an offence to contravene certain provisions or orders under the Act or obstruct OPC officials in complaints, inquiries, or audits.
  • We support this, but have two recommendations to increase the likelihood that CPPA offence investigations are effective:
    • The confidentiality requirement in the CPPA should not preclude OPC from disclosing evidence of offences under both the CPPA and AIDA directly to the police.
    • The limitation period for summary conviction offences under the CPPA should be extended.

Background

  • Like PIPEDA (s. 20(5)), the CPPA allows the OPC to disclose information relating to the commission of an offence to the Attorney General of Canada (s.113(6)). However, subject to limited exceptions (e.g. information about data breaches – s. 113(7)), the confidentiality requirements in the CPPA prevent the OPC from sharing evidence of offences with the RCMP or other police forces (who actually investigate offences). OPC’s Bill C-11 submission sought a lead role for OPC in CPPA offence investigations; we are now comfortable with police being the lead.
  • As the regulator under the CPPA, the OPC is uniquely positioned to encounter evidence of criminal liability under both the CPPA and AIDA, (which overlaps with the CPPA). The OPC’s inability to proactively share such evidence can significantly reduce the effectiveness of police investigations.
  • An offence under section 128 of the CPPA can be prosecuted as an indictable or summary conviction offence. For summary conviction offences, the Criminal Code of Canada sets a 12-month limitation period (unless both parties agree to an extension). The limitation period starts when the offence was committed.
  • Prosecutions are likely to take place either well into or after OPC investigations, which can be lengthy, especially with the new authorities contained in the CPPA (e.g. inquiries, orders, etc.). Notably, Bill 64 amended Quebec’s private sector privacy law (s. 92.2 – not yet in force) to create a five-year limitation period.

Prepared by: Legal


Artificial Intelligence and Data Act (AIDA)

Key Messages

  • AIDA represents an effort by Canada to regulate AI technology, which is commendable given the technology’s risks to human rights.
  • One of AIDA’s main requirements relates to identifying and mitigating serious risks of harm and bias of “high-impact” AI systems.
  • Our understanding is that AIDA is not meant to address privacy risks specifically, and that the CPPA would apply to the processing of personal information within AI systems.
  • To ensure privacy risks are identified and mitigated by organizations, particularly for technologies such as AI, we recommend the CPPA include a PIA requirement for high-risk activities. This would help accelerate trust in such technologies.

Background

  • Relevant Section of AIDA: Section 8 requires a person who is responsible for a high-impact system to, in accordance with the regulations, “establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of the system.”
  • Rationale for PIA: It is a risk-management process that helps ensure privacy risks are assessed and addressed. Given AIDA’s limited definition of “harm”, it would not require privacy risks to be mitigated. Adding a PIA requirement for higher-risk activities in the CPPA would close this gap. There may be opportunity to have the different risk assessment feed into each other. PIAs are particularly important for when an exception to consent is used to process personal information using AI, as an individual would not be aware.
  • Examples of high-risk activities: AI models have the capability to analyze, infer and predict aspects of individuals’ behaviour, interests and even their emotions in striking ways. AI systems can use such insights to make automated decisions about individuals, including whether they get a job offer, qualify for a loan, pay a higher insurance premium, or are suspected of suspicious or unlawful behaviour.

Prepared by: PRPA


Amendments Proposed by Minister – AIDA

Key Messages

  • The amendments to AIDA proposed by Minister Champagne seem generally positive, as they would bring clarity about the Act’s intended scope of application and obligations. They also further alignment with the EU AI Act.
  • Given the nuance required when dealing with AI, it is challenging to provide more detailed comment without seeing proposed legislative text.
  • Of particular interest will be the relation between the CPPA and AIDA in practice, as many of the proposed “high impact systems” would appear to be governed by both the CPPA and AIDA, and certain obligations appear to overlap (e.g. AIDA’s accountability framework and the CPPA’s privacy management program).

Background

  • The Minister’s proposed amendments are grouped into five categories:
    • defining classes of systems that would be considered high impact;
    • specifying distinct obligations for generative general-purpose AI systems, like ChatGPT;
    • clearly differentiating roles and obligations for actors in the AI value chain;
    • strengthening and clarifying the role of the proposed AI and Data Commissioner; and
    • aligning with the EU AI Act as well as other advanced economies.
  • No legislative text is provided, though detail is included re: high-impact systems and a new definition of AI (that aligns with the OECD).
  • Accountability framework: A new proposal, by which all parties conducting regulated activities would need to prepare a framework (which can be requested by the AIDC) setting out: roles and responsibilities for key personnel; training requirements; and, policies and procedures related to: (i) risk management; (ii) notice of serious incidents; and (iii) respecting the data used by the system.

Prepared by: PRPA


AI and Data Commissioner

Key Messages

  • As currently drafted, AIDA’s proposal that the AI and Data Commissioner be a senior official within ISED could create a real or perceived conflict, given ISED’s role in promoting the AI industry.
  • As with any regulation, the effectiveness of AIDA will in large part be determined by the effectiveness of its oversight.
  • Minister Champagne’s statement of support for amendments to AIDA which would “build confidence in [the AI and Data Commissioner’s] ability to carry out their mandate independently” is a positive step.
  • We look forward to working with the Commissioner on matters where AIDA and the CPPA both apply.

Background

  • AIDA s.33(1): The Minister may designate a senior official of the department over which the Minister presides to be called the Artificial Intelligence and Data Commissioner, whose role is to assist the Minister in the administration and enforcement of this Part.
  • Minister Champagne’s letter to INDU: “The Government would support amendments to clarify more specifically the functions and roles of the AI and Data Commissioner. These would be intended to provide clarity regarding the role of the AIDC, build confidence in their ability to carry out their mandate independently, and enable them to play a strong coordinating role across the AI regulatory system to ensure coherence and avoid duplication.”

Prepared by: PRPA


Automated Decision Making

Key Messages

  • Like the previous Bill C-11, C-27 includes provisions specific to automated decision-making. We support this approach to address the unique risks stemming from automated decisions. However, as drafted, the provisions scope the obligations too narrowly to provide the needed transparency to build trust.
  • The requirement to provide an explanation of automated decisions is now limited to decisions that could have a “significant impact”, which would likely exclude things like personalized digital environments.
    • Such a narrow application would not be in the interest of achieving algorithmic transparency and its benefits.
  • The automated decision making obligations also do not explicitly apply to profiling, as in other modern privacy laws in Quebec, California, and Europe. This should be clarified, and the law should define “profiling” similar to these other privacy laws.

Background

  • Explanation provision: The addition of “significant impact” to the explanation requirement at s. 63(3) is a problematic change from the former Bill C-11, as it narrows the scope and would likely exclude decisions for a matter such as online advertising, personalized news feeds and digital environments.
  • Other jurisdictions are going further: The EU Digital Services Act, which will apply to all online platforms from February 17, 2024, requires platforms to include the parameters used for recommender systems in their policies (Article 27). There is no limitation to “significant impacts”. It will also provide for an opt-out of recommendations based on profiling from large online platforms (Article 38).
  • Definition of profiling in GDPR (Quebec/California nearly identical): profiling means “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”

Prepared by: PRPA


Automated Decision Making – Right to Contest

Key Messages

  • S. 63(3) of C-27 requires organizations that use automated decision systems to make predictions, recommendations or decisions about individuals to provide them with an explanation of that decision. This is progressive and welcome.
  • However, technology is not perfect, and individuals should not be bound by automated decisions without a way to seek human intervention, particularly when such decisions can be based on inaccurate data, reflect bias, or otherwise result in a decision that a human would deem inappropriate.
  • Individuals should be provided with a right to contest automated decisions that goes beyond the proposed explanation requirement, in the interest of fairness and ensuring human review.
  • The right to contest would be in addition to the ability to withdraw consent currently provided for in the CPPA. It is necessary to have both rights, as withdrawal of consent is an all-or-nothing decision, whereas contestation provides individuals with recourse even when they choose to continue to participate in the activity for which automated decision-making was employed.

Background

  • Such a right would be consistent with the approach taken in other jurisdictions, including Europe under the GDPR and Quebec under its Law 25 (Act respecting the protection of personal information in the private sector), where a human could be required to review a decision upon request in certain circumstances.
  • In the public sector, the federal Directive on Automated Decision-Making already requires that higher-risk decisions not be made without specific human intervention points during the decision-making process.

Prepared by: PRPA


C-27 Submission – AI-related Recommendations

Key Messages

  • The OPC’s submission on C-27 makes three main recommendations related to AI under the CPPA.
  • To ensure that privacy risks are identified and mitigated by organizations for technologies such as AI, we recommend the CPPA include a PIA requirement for high-risk activities.
  • To promote transparency and accountability, we recommend that organizations be required to explain, on request, all predictions, recommendations and decisions made using automated decision systems, not just those that could have a “significant impact” – and that this applies to profiling as well.
  • To permit greater efficiency and coordination, the OPC should be given greater ability to coordinate with domestic regulators – including the proposed AI and Data Commissioner.

Background

  • Rationale for PIA: It is a risk-management process that helps ensure privacy risks are assessed and addressed. Given AIDA’s limited definition of “harm”, it would not require privacy risks to be mitigated. Adding a PIA requirement for higher-risk activities in the CPPA would close this gap. This is particularly important for when an exception to consent is used to process personal information using AI, as an individual would not be aware.
  • Rationale for expanded explanations: In C-11, the “significant impact” limitation applied only to providing a general account of an ADM system. C-27 added this limitation to explanations of decisions – potentially excluding matters such as online advertising and personalized news feeds. This would not be in the interest of algorithmic transparency.
  • Rationale for including “profiling”: Unlike the GDPR and other modern privacy laws, obligations related to automated decision-making do not explicitly apply to profiling. This could mean that often-opaque activities, such as data brokering, fall outside of transparency obligations.

Prepared by: PRPA


ChatGPT Investigation

Key Messages

  • In May 2023, my Office commenced an investigation jointly with our counterparts in Alberta, British Columbia, and Québec into the practices of OpenAI in relation to its ChatGPT service.
  • The investigation is examining the issues of consent and transparency, access and accuracy, accountability, appropriate purposes and limiting collection.
  • As this is an ongoing investigation, I am limited as to what I can share at this time.
  • AI is a priority area for my Office, as an emerging technology that poses significant risks for privacy and other fundamental rights.
  • We are aiming to complete this investigation within a year.

Background

  • ChatGPT is a natural language processing tool, or chatbot, driven by AI technology. The language model can answer questions and assist users with tasks such as composing emails and essays.
  • In April 2023, the OPC commenced an investigation into ChatGPT, after receiving a complaint that alleged (Redacted). In May 2023, we closed this investigation to pursue a broader joint Commissioner-Initiated Complaint.
  • Data protection authorities around the world, including various European authorities, have commenced, or are considering investigations into ChatGPT.
  • The European Data Protection Board launched a dedicated task force on ChatGPT, to “exchange information on possible enforcement actions.”
  • As a member of the Global Privacy Assembly’s AI Working Group, we are exchanging information and learning from the experiences of our counterparts.
  • As a co-chair of the Global Privacy Assembly’s International Enforcement Cooperation Working Group, we organized a “closed enforcement session” for member authorities to discuss enforcement in relation to ChatGPT.

Prepared by: Compliance


EU AI Act

Key Messages

  • The proposed EU AI Act was first introduced in 2021 and continues to be actively debated. Currently, trilogue negotiations on proposed amendments are underway between the European Parliament, the Council of the European Union and the European Commission.
  • The EU AI Act would outlaw harmful AI applications such as those that subliminally manipulate or are used for social scoring. It also takes a risk-based approach by outlining certain uses of AI that warrant more stringent regulation, such as in the employment context, or for accessing essential public and private services.
  • Even where not used in a high-risk context, generative AI would be subject to transparency requirements under the AI Act.
  • The AI Act has stricter requirements than AIDA, though the extent of this difference will largely be dependent on AIDA regulations. A notable difference is that, in determining whether an AI system is high-risk, the AI Act considers whether it poses “a risk of adverse impact on fundamental rights”, a factor not reflected in AIDA’s definition of harms.

Background

  • In the EU AI Act, “high-risk” AI is subject to requirements including:
    • risk and quality assessments,
    • logging and record-keeping for traceability,
    • general human oversight,
    • accurate and representative data for AI training,
    • ex-ante conformity assessments, and
    • demonstrable accountability.

Prepared by: PRPA


Types of Generative AI

Key Messages

  • There are two main types of generative AI tools: large language model “chatbots” and text-to-image model “art systems.”
  • Examples of generative AI chatbots include ChatGPT (OpenAI), Bing Chat (Microsoft), Bard (Google), LLaMA (Meta) and GitHub Copilot (GitHub and OpenAI).
  • Examples of generative AI art systems include Stable Diffusion (Stability AI), Midjourney and DALL-E (OpenAI).

Background

Large language model chatbots:

  • ChatGPT – stands for Chat Generative Pre-trained Transformer; released by OpenAI on November 30, 2022; built on GPT-3.5 / GPT-4; fine-tuned for conversational user interactions.
  • Bing Chat – released by Microsoft in 2023; built into Microsoft Edge web browser; built on OpenAI’s GPT-4; mimics ChatGPT.
  • Bard – initially released by Google in March 2023 (excluding European Union and Canada); European Union added in July 2023; still unavailable in Canada.
  • LLaMA – initially released by Meta in February 2023; second generation model released in July 2023 as open source; available for free for research and commercial use.
  • GitHub Copilot – released by GitHub in October 2021; built on OpenAI’s GPT-3; generates solution code in various programming languages from text prompts.

Text-to-image art systems:

  • Stable Diffusion – released in 2022 by Stability AI; can generate new images from text prompts; small enough to run on consumer computers.
  • Midjourney – released in July 2022; only available through the Midjourney server on the Discord communication app.
  • DALL-E – initially released by OpenAI in January 2021; built as a multimodal implementation of GPT-3; version 3 is scheduled to be released into ChatGPT in October 2023.

Prepared by: TA


Government of Canada Voluntary Code of Conduct on Generative AI

Key Messages

  • We understand this is meant to be a voluntary code of conduct, with signatories committing to adopting the specific measures in advance of binding regulation pursuant to AIDA.
  • We are generally supportive of the concept of codes of practice, so long as they are buttressed by strong and effective legislation.
  • My Office welcomes the acknowledgement of privacy and data protection implications of generative AI, including the reference to the G7 Data Protection and Privacy Authorities’ Statement on Generative AI as well as to legal obligations under PIPEDA.
  • Generative AI is an area of particular interest for my office and we recognize the privacy risks of this emerging technology.

Background

  • ISED hosted a series of roundtables on the development of this code with key stakeholders in the AI industry and civil society organizations. We were not involved with this consultation process and we have not seen a report or other outputs from those roundtables.
  • The code was published in late September and as of October 12, 2023 had 14 signatories.
  • Minister Champagne’s remarks at INDU on September 26, 2023 emphasized the need for consistent regulations and standards across international jurisdictions. It is expected that other G7 countries will release similar “voluntary codes of conduct.”
  • In the fall, the OPC will be releasing Principles for Responsible, Trustworthy and Privacy-Protective Generative AI pending FPT approval. If approved and when released, the document will outline several key concerns around the development, deployment, and use of Generative AI systems. This document will establish our Office’s position on generative AI.

Prepared by: PRPA


OPC Regulatory Framework for AI

Key Messages

  • AI has immense promise, but must be implemented in ways that respect privacy, equality and other human rights.
  • Following a public consultation, in 2020 the OPC published a proposed Regulatory Framework for AI. We recommended that an appropriate legal framework for AI would:
    • Allow personal information to be used for public and legitimate business interests, including for the training of AI; but only if privacy is entrenched in its proper human rights framework;
    • Create provisions specific to automated decision-making to ensure transparency and fairness (explanation and contest); and,
    • Require businesses to demonstrate accountability to the regulator upon request, ultimately through proactive inspections and other enforcement measures to ensure compliance.
  • PIPEDA does not contain any of these measures and is ill-suited to the AI environment. C-27 contains a limited explanation requirement and a legitimate interest ground for use, but does so without recognizing privacy as a fundamental right.

Background

  • OPC launched a public consultation in January 2020. We received 86 submissions, and held two in-person consultation sessions in Montreal and Toronto.
  • The wrap-up report, A Regulatory Framework for AI, contains our key recommendations for regulating AI, and is available on our website.
  • We also published a separate report, commissioned from a recognized expert in AI, which informed our recommendations and accounts for stakeholder feedback.
  • More broadly, OPC collaborates with international data protection authorities in working groups on AI through the Global Privacy Assembly, a forum of international Data Protection and Privacy Commissioners.

Prepared by: PRPA


Amendments to the Australian Privacy Act

Key Messages

  • Last year’s key amendments to the Australian Privacy Act increased penalties and provided the Australian Commissioner with greater enforcement and information sharing powers.
  • The amendments grant the Australian Commissioner new infringement notice powers to penalize entities for failing to provide information. No such explicit power exists in Bill C-27.
  • Under Bill C-27, the maximum penalty for all contraventions in the Commissioner’s penalty recommendation is the higher of $10M and 3% of the organization’s gross global revenue in the year before the one in which the penalty is imposed (s. 95(4) CPPA).
  • Under the Australian amendments, the maximum penalty for a corporation’s serious or repeated interference(s) with privacy cannot exceed the greater of the following (see the illustrative comparison below):
    1. AU $50M (≈ Can $43.4M);
    2. three times the value of the benefit obtained; or
    3. if the benefit’s value cannot be determined, 30% of the adjusted turnover in the relevant period.
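For illustration only: a minimal sketch, in Python, of how the two maximum-penalty formulas above operate. The revenue, benefit and turnover figures are hypothetical assumptions and are not drawn from any actual case.

```python
from typing import Optional

def cppa_max_penalty(gross_global_revenue_cad: float) -> float:
    """Maximum penalty under s. 95(4) CPPA: the higher of $10M and 3% of
    the organization's gross global revenue in the preceding year (CAD)."""
    return max(10_000_000, 0.03 * gross_global_revenue_cad)

def australia_max_penalty(benefit_value_aud: Optional[float],
                          adjusted_turnover_aud: float) -> float:
    """Maximum corporate penalty for serious or repeated interference with
    privacy under the 2022 Australian amendments (AUD)."""
    if benefit_value_aud is not None:
        alternative = 3 * benefit_value_aud          # three times the benefit obtained
    else:
        alternative = 0.30 * adjusted_turnover_aud   # 30% of adjusted turnover if benefit unknown
    return max(50_000_000, alternative)

# Hypothetical organization (figures invented for illustration):
print(cppa_max_penalty(2_000_000_000))             # 60,000,000.0 -> 3% of revenue exceeds $10M
print(australia_max_penalty(None, 1_500_000_000))  # 450,000,000.0 -> 30% of adjusted turnover
```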

Background

  • In December 2022, amendments to the Australian Privacy Act came into force.
  • Australian Government agencies and organizations that have an annual turnover exceeding AU $3M have responsibilities under the Australian Privacy Act (subject to some exceptions).
  • Some of the amendments also (i) empower the Commissioner to require that a respondent engage an independent adviser to assist in ensuring that conduct is not repeated or continued; (ii) provide the Commissioner with a new information-gathering power for the purpose of assessing actual or suspected data breaches; (iii) expand the Commissioner’s capacity to share information with an enforcement body; and (iv) empower the Commissioner to release certain information if they are satisfied it is in the public interest.

Prepared by: Legal Services


The California Delete Act

Key Messages

  • The California Delete Act will enable Californian consumers to require all registered data brokers to delete their personal information through a single request.
  • Data brokers are defined as businesses that knowingly collect and sell to third parties the personal information of a consumer with whom the business does not have a direct relationship.
  • In comparison, Bill C-27 would require organizations, including data brokers, to “dispose” of personal information in certain circumstances upon request. Separate requests would need to be made to each organization in control of the information.

Background

  • Governor Gavin Newsom signed the bill into law on October 10, 2023. The Act’s requirements will come into effect in 2026.
  • The definition of “data broker” does not include financial institutions, insurance companies and healthcare institutions covered by federal privacy regimes.
  • The Act:
    • Requires data brokers to access a list of consumers who have requested the deletion of their information and to delete all their personal information at least once every 45 days.
    • Prohibits data brokers from selling or sharing new personal information for such consumers and requires data brokers to submit an independent audit attesting compliance every three years.
    • Empowers the California Privacy Protection Agency to develop a system for residents to make one data deletion request across registered data brokers operating in the state and to enforce parts of the Act.
  • There may be significant operational issues with the Act, for example, how to address data brokers that sell data based on cookie numbers, IP addresses, and identifiers other than names.
  • The Act may be subject to freedom of speech challenges from the digital advertising industry in a jurisdiction particularly protective of free speech, even in commercial contexts.

Prepared by: Legal


EU AI Act

Key Messages

  • The proposed EU AI Act was first introduced in 2021 and continues to be actively debated. Currently, trilogue negotiations on proposed amendments are underway between the European Parliament, European Council and European Commission.
  • The EU AI Act would outlaw harmful AI applications such as those that subliminally manipulate or are used for social scoring. It also takes a risk-based approach by outlining certain uses of AI that warrant more stringent regulation, such as in the employment context, or for accessing essential public and private services.
  • Even where not used in a high-risk context, generative AI would be subject to transparency requirements under the AI Act.
  • The AI Act has stricter requirements than AIDA, though the extent of the difference will depend largely on AIDA’s regulations. A notable difference is that the AI Act’s high-risk determination considers whether a system poses “a risk of adverse impact on fundamental rights”, a factor not captured in AIDA’s definition of harm.

Background

  • In the EU AI Act, “high-risk” AI is subject to requirements including:
    • risk and quality assessments,
    • logging and record-keeping for traceability,
    • general human oversight,
    • accurate and representative data for AI training,
    • ex-ante conformity assessments, and
    • demonstrable accountability.

Prepared by: PRPA


TikTok

Key Messages

  • In 2023, the UK and Irish DPAs issued findings and fines against TikTok, particularly focusing on children’s privacy. The DPA of France also issued a fine for breaking rules on cookie consent.
  • Other regulatory action included a 2019 settlement with the FTC for alleged violations of the US Children’s Online Privacy Protection Act (COPPA), and in 2020, South Korea’s telecommunications regulator issued a fine for violations related to children’s privacy and overseas data transfers.

Background

  • In April 2023, the UK ICO found that TikTok breached the UK GDPR by: (1) handling the data of children too young to consent; (2) failing to provide users with easy-to-understand information; and (3) not processing user data lawfully, fairly or transparently. A fine of £12.7 million was imposed.
  • In September 2023, Ireland’s Data Protection Commission (DPC) found that TikTok breached the EU GDPR by: (1) making child profiles public by default; (2) allowing the “Family Pairing” of child accounts with unverified adults who were not necessarily parents or guardians; (3) not being transparent with child users; and (4) employing “dark patterns” to nudge users towards more privacy-intrusive options. It gave TikTok 3 months to comply with the law and levied a €345 million fine.
  • In July 2021, the Dutch DPA announced that TikTok had breached the EU GDPR by failing to offer a Dutch privacy statement to Dutch users (including children), and imposed a €750,000 fine. The Dutch DPA handed over the investigation to its Irish counterpart after TikTok’s headquarters moved to Ireland.
  • In February 2019, the FTC announced that TikTok (then known as Musical.ly) agreed to pay $5.7 million to settle alleged violations of the COPPA.
  • In July 2020, South Korea’s Communications Commission announced that TikTok had violated local telecommunications laws by mishandling personal information of its users and issued a ₩186 million fine.
  • In January 2023, France’s CNIL announced that TikTok had violated the French Data Protection Act through cookie consent flows that made it easier for users to accept tracking than to refuse it, and issued a €5 million fine.

Prepared by: Legal


Chart: Comparison of Selected International AI Regulatory Models – In force and Proposed

Country | Proposed Regulation or Relevant Policy | Context
United States

Voluntary:

States:

  • California Privacy Rights Act (CPRA)
  • Colorado Privacy Act (CPA)
  • Connecticut Data Privacy Act (CTDPA)
  • Virginia Consumer Data Privacy Act (VCDPA)
  • Connecticut “An Act Concerning Artificial Intelligence, Automated Decision-Making and Personal Data Privacy”
  • California AB 331
  • New Jersey S1402
  • New York Digital Fairness Act

The US does not have comprehensive AI regulation at the federal level. Multiple states have enacted or plan to enact AI regulation.

In October 2022, the White House published a “Blueprint for an AI Bill of Rights,” which outlines key principles to help guide the design, development, and deployment of AI systems. Adherence to these principles is voluntary.

In January 2023, the National Institute of Standards and Technology published an “AI Risk Management Framework” that establishes standards for incorporating trustworthiness into the design, development, and use of AI systems. Adherence to this standard is voluntary.

There have been multiple Congressional efforts to enact federal legislation to regulate AI. The proposed bills have not gained significant momentum and Congressional legislation is not expected in the immediate future.

Several State governments have enacted broad privacy legislation that would regulate AI within their jurisdictions. Many of these state laws establish rights to opt-out of automated decisions and profiling and require organizations to complete a data protection assessment in certain circumstances.

Other state governments are currently considering bills to specifically regulate AI, aside from privacy and data protection:

  • Connecticut - establish an Office of Artificial Intelligence and develop an AI Bill of Rights.
  • California - require impact assessments for automated decision-making tools and give consumers the right to request manual review for consequential decisions.
  • New Jersey - make it unlawful for an automated decision system to discriminate against members of a protected class in loans, insurance, or healthcare settings.
  • New York - require automated decision system impact assessments, prevent discriminatory practices with targeted advertising, and regulate the use of biometric data, among other provisions.

The FTC has led current enforcement efforts under the FTC Act, which prohibits “unfair or deceptive acts or practices in or affecting commerce.” The agency recently opened an investigation into OpenAI and ChatGPT.

European Union

In force:

  • GDPR

In consideration:

  • EU AI Act

The GDPR regulates collection, use, and disclosure of personal information, including in the context of AI systems.

In June 2023, the EU AI Act was adopted by the European Parliament and is currently in trilogue negotiations. The EU AI Act takes a risk-based approach that identifies four categories of AI systems based on the risk of harm they pose. The adopted act establishes new restrictions on the technology’s highest risk uses, including real-time biometric processing and emotion recognition, and addresses novel general-purpose AI (GPAI) systems.

The act proposes 6 general principles that should apply to the development and use of all AI systems, in order to establish a “high-level framework that promotes a coherent human-centric European approach to ethical and trustworthy” AI systems. These 6 principles are: (1) human agency and oversight, (2) technical robustness and safety, (3) privacy and data governance, (4) transparency, (5) diversity, non-discrimination, and fairness, and (6) social and environmental well-being. The act encourages the voluntary creation of codes of conduct intended to foster compliance with these principles.

“High-risk” systems are subject to several requirements, including the establishment of a “risk management system”, stricter data governance and technical documentation, human oversight, and registration with a publicly accessible EU database managed by the European Commission.

The Council of Europe (CoE) is developing a treaty on artificial intelligence (AI), human rights, democracy, and the rule of law. The purpose of the treaty is to address potential harms and other risks from the design, development and use of AI. The treaty may be complemented by other binding or non-binding instruments. Once concluded, this treaty could be the first international convention on the regulation of AI and the first legally binding international instrument on AI and human rights. The Government of Canada is participating in international negotiations (Redacted).

Recently, Spain established the Agency for the Supervision of Artificial Intelligence (AESIA), which will work to develop “inclusive, sustainable and citizen-centered” AI. It is expected that this regulatory agency would support the implementation of the EU AI Act within Spain.

G7

Hiroshima AI Process

Proposed:

  • Codes of Conduct (domestically and internationally)
  • Comprehensive policy framework

At the G7 Leaders Summit in Hiroshima in May 2023, the Communiqué “establish[ed] the Hiroshima AI process, through a G7 working group, in an inclusive manner and in cooperation with the OECD and GPAI, for discussions on generative AI by the end of this year.”

After a virtual meeting between the G7 Digital and Tech Ministers on September 7th 2023, the group released a statement that committed to developing guiding principles and an international code of conduct, with an aim to develop a comprehensive policy framework by the end of the year.Footnote 1

The Government of Canada’s “Voluntary Code of Conduct on the Responsible Development and Management of Advanced Generative AI Systems” is an initiative under the Hiroshima AI Process and is meant as a model for other G7 countries to develop their own codes.Footnote 2

Minister Champagne’s remarks at INDU on September 26, 2023 emphasized the need for consistent regulations and standards across international jurisdictions. Thus, it is expected that other G7 countries will release similar “voluntary codes of conduct.”

United Kingdom

White paper:

Parliamentary report:

In a white paper, the UK government proposed a context-based, proportionate approach to regulation that relies on existing sectoral laws to impose guardrails on AI systems. The white paper outlines five principles to frame regulatory activity and guide the future development and use of AI models and tools. These principles would be interpreted and implemented by sector-specific regulators:

  • Safety, security and robustness;
  • Appropriate transparency and explainability;
  • Fairness;
  • Accountability and governance; and
  • Contestability and redress.

A UK parliamentary committee advocated for “a tightly-focussed AI Bill in the new session of Parliament” that would complement the proposed approach in the white paper.

The UK government will host its first AI safety summit on November 1 and 2.

Japan

White papers and policy guidance:

There is no comprehensive AI regulation in Japan. Japan promotes the notion of “agile governance,” whereby the government provides non-binding guidance and defers to the private sector’s voluntary efforts to self-regulate.

South Korea

Proposed:

The proposed AI Act consolidates seven previously fragmented pieces of legislation on AI. The proposal allows all developers to access AI technology without prior government approval, but also requires protective measures. Certain types of AI used in direct connection with human life and safety are designated as “high-risk AI” and must achieve a certain level of trustworthiness.

South Korea is also setting new standards on copyright in AI-generated content.

Australia

Discussion paper:

There are no laws or policies specific to AI governance in Australia, but the government has highlighted how existing regulatory frameworks apply to AI.

The government hosted a consultation to inform appropriate regulatory and policy responses. As part of this process, the government released a discussion paper on “Safe and responsible AI in Australia” that proposes that governance measures should consider:

  • ensuring there are appropriate safeguards, especially for high-risk applications of AI and ADM; and
  • providing greater certainty and making it easier for businesses to confidently invest in AI-enabled innovations and ADM activities and to engage in these activities responsibly.

New Zealand

Public sector risk management framework:
Algorithm Charter for Aotearoa New Zealand

Under the Algorithm Charter, government agencies must carefully manage how algorithms are used, striking the right balance between privacy and transparency and preventing unintended bias. A risk matrix assesses the likelihood and impact of potential AI risks. The Charter commits agencies to assessing factors such as transparency, privacy, ethics, and human rights, and to providing human oversight for the highest-risk uses of algorithms.

Brazil

Proposed:

A committee of the Brazilian Senate has proposed a draft AI law grounded in the principles of “good faith… self-determination and freedom of choice; transparency, explainability… non-discrimination, justice, equity, and inclusion”, among others. The definition of AI in the draft law aligns with the OECD definition.

The draft law obliges providers and users of AI systems to conduct and document a risk assessment prior to placing any AI system on the market, lists examples of “high-risk” and prohibited AI systems that carry additional responsibilities and must be registered in a public database, and establishes rights to explanation, challenge, and human intervention for automated decisions.

Association of Southeast Asian Nations (ASEAN)

Draft “Guide to AI ethics and governance”

ASEAN countries are reportedly developing a guide to AI regulation that asks companies to take countries’ cultural differences into consideration and does not prescribe unacceptable risk categories. It is voluntary and is meant to guide domestic regulations. The draft takes a “business-friendly approach” that could contradict the EU’s regulatory approach.

The guide is still being drafted and is currently being circulated to technology companies for feedback. It is expected to be finalized and adopted in January 2024 at the ASEAN Digital Ministers Meeting.


Chart: Jurisdictional Scan of Definitions for De-identification, Anonymization and Pseudonymization

Bill C-27
  • De-identification – Section 2(1): De-identify means to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.
  • Anonymization – Section 2(1): Anonymize means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.
  • Pseudonymization – Not defined.

Quebec Law 25
  • De-identification – Section 110: Personal information is de-identified if it no longer allows the person concerned to be directly identified.
  • Anonymization – Section 119: Information concerning a natural person is anonymized if it is, at all times, reasonably foreseeable in the circumstances that it irreversibly no longer allows the person to be identified directly or indirectly. Information anonymized under this Act must be anonymized according to generally accepted best practices and according to the criteria and terms determined by regulation.
  • Pseudonymization – Not defined.

Ontario Personal Health Information Protection Act (PHIPA)
  • De-identification – Section 2: “De-identify”, in relation to the personal health information of an individual, means to remove any information that identifies the individual or for which it is reasonably foreseeable in the circumstances that it could be utilized, either alone or with other information, to identify the individual, and “de-identification” has a corresponding meaning.
  • Anonymization – Not defined.
  • Pseudonymization – Not defined.

GDPR
  • De-identification – Not defined.
  • Anonymization – Recital 26: Information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.
  • Pseudonymization – Article 4(5): “Pseudonymisation” means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.

UK Data Protection Act (UK GDPR)
  • De-identification – Not defined.
  • Anonymization – Same as GDPR.
  • Pseudonymization – Same as GDPR.

US Health Insurance Portability and Accountability Act (HIPAA)
  • De-identification – 45 CFR s. 164.514(a): Standard: De-identification of protected health information. Health information that does not identify an individual and with respect to which there is no reasonable basis to believe that the information can be used to identify an individual is not individually identifiable health information.
  • Anonymization – Not defined.
  • Pseudonymization – Not defined.

California Consumer Privacy Act
  • De-identification – Section 1798.140(m): “De-identified” means information that cannot reasonably be used to infer information about, or otherwise be linked to, a particular consumer.
  • Anonymization – Not defined.
  • Pseudonymization – Section 1798.140(aa): “Pseudonymize” or “Pseudonymization” means the processing of personal information in a manner that renders the personal information no longer attributable to a specific consumer without the use of additional information, provided that the additional information is kept separately and is subject to technical and organizational measures to ensure that the personal information is not attributed to an identified or identifiable consumer.
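To illustrate the distinction the chart draws between pseudonymization and anonymization: a minimal, hypothetical sketch in Python, assuming a keyed hash as the pseudonymization technique and an invented record. This is not a method prescribed by any of the statutes above.

```python
import hashlib
import hmac

# The "additional information" (here, a secret key) must be kept separately
# and protected by technical and organizational measures. Hypothetical value.
SECRET_KEY = b"stored-separately-from-the-records"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Jane Example", "purchase": "widget"}
pseudonymized = {"pseudonym": pseudonymize(record["name"]), "purchase": record["purchase"]}
print(pseudonymized)
# Without the key, the record can no longer be attributed to Jane; with the key,
# pseudonyms for known identifiers can be re-computed and matched, which is why
# pseudonymized data remain personal information rather than anonymized information.
```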

Accountability

Key Messages

  • C-27 strengthens accountability by requiring organizations to implement and maintain a privacy management program (PMP), and to provide OPC with access to the PMP on request.
  • However, the CPPA would benefit from inclusion of an objective standard for accountability and a requirement for organizations to maintain adequate records to demonstrate compliance.
  • The CPPA should also ensure that accountability obligations scale depending on the nature and importance of the personal information, the size and revenue of the organization, and relevant risks and threats.
  • OPC welcomes language in Bill C-27 which provides some flexibility to use information obtained in a PMP review to initiate a complaint/audit.

Background

  • C-27 obligates organizations to implement and maintain a PMP that includes the policies, practices, and procedures put in place to fulfill obligations (s. 9(1)) and must provide OPC with access to the PMP on request (s. 10(1)).
  • OPC must provide guidance on organizations’ PMPs on request (s.110(1)(e)), but this can be in the form/manner OPC considers appropriate (s. 110(1)). The OPC does not have to act on unreasonable requests (s. 110(2)), and may prioritize requests based on perceived needs of organizations (s.110(2)). This is consistent with OPC’s C-11 recommendation for greater discretion.
  • S. 111 prohibits the Commissioner from using information from a PMP review to initiate a complaint or audit unless the Commissioner considers that the organization has willfully disregarded his recommendations.
  • Several related recommendations in our C-11 submission are not addressed in C-27: (1) an objective standard for accountability (former Recommendation 20); (2) a requirement to maintain adequate records to demonstrate compliance with obligations (former Recommendation 21); and (3) scaling of accountability/record-keeping obligations (former Recommendation 21).

Prepared by: PRPA


Administrative Monetary Penalties

Key Messages

  • Ss. 89-90 of the CPPA allow the Commissioner to recommend that an administrative monetary penalty (AMP) be imposed by the Personal Information and Data Protection Tribunal if an organization has contravened one or more provisions outlined under subsection 94(1) (e.g., obtaining consent and proper retention/disposal of personal information).
  • OPC supports the introduction of AMPs as an incentive for organizations to comply with federal privacy law.
  • We would, however, recommend expanding the list of contraventions qualifying for AMPs to include ss. 12(1) & 12(2) of the CPPA (i.e., dealing with “appropriate purposes”) given these are keystone provisions in protecting personal information.

Background

  • Subsection 94(1) of the CPPA significantly expands the list of violations qualifying for AMPs from those proposed in the former Bill C-11. We support this positive development but believe it falls short.
  • The list does not, for example, allow AMPs to be levied against organizations for violating the appropriate purposes provisions (i.e., ss.12(1) & 12(2)); those two subsections ought to be added to the list under subsection 94(1) to better protect individuals’ personal information.
  • The appropriate purposes provisions (ss.12(1) and 12(2)) require organizations to only collect, use and disclose personal information in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, regardless of whether there is consent.
  • We are of the view that an assessment of the sensitivity of the information and whether the loss of privacy would be proportionate to the benefit gained must be taken into account when making this determination.

Prepared by: Legal


Appropriate Purpose

Key Messages

  • As drafted, subsection 12(2) of the CPPA enumerates a closed list of factors that must be considered when determining the appropriateness of the purpose for collection, use or disclosure.
  • While a list of factors is useful, it should be non-exhaustive, allowing the relevant and applicable factors to vary by context.
  • Further, not all provinces with substantially similar laws have a mandatory list of factors for determining appropriateness, which could be challenging when my office is collaborating with other regulators.
  • Subsection 12(2) should therefore be amended to allow for flexibility, including consideration of other relevant factors.

Background

  • The “appropriate purpose” provision of PIPEDA, found at subsection 5(3), does not enumerate or require a list of factors to be satisfied. This means that the factors for determining what is an appropriate purpose have been able to develop over time, through experience, and as a result of relevant case law.
  • Although the factors listed in subsection 12(2) appear to have been derived from case law on subsection 5(3) of PIPEDA, it is important to note that the courts have never held that these factors had to be satisfied in all cases to establish appropriateness.
  • In fact, in Eastmond v Canadian Pacific Railway (2004 FC 852), the Court specifically stated that not all factors may be relevant in all instances and held that subsection 5(3) required a contextual assessment (paras 130-131).
  • More recently, in AT v Globe24h.com, the Federal Court did not consider all the factors now listed in subsection 12(2) (2017 FC 114, paras. 73-76).
  • BC and Alberta have not adopted a closed list of factors for assessing whether an organization’s purposes for collection are “reasonable” or “appropriate” in accordance with their private-sector privacy laws. This could pose challenges when conducting joint investigations.

Prepared by: Legal Services


Breaches - Reparation for Damages

Key Messages

  • S. 93(2) of C-27 gives the Commissioner the power to issue orders to organizations to promote compliance with the Act.
  • While we welcome this as a positive development, we would recommend the OPC be permitted to order an organization to take measures which allow individuals to be compensated for damages suffered, financial or otherwise, stemming from a breach or violation of security safeguards required by law.
  • Organizations derive significant commercial benefits from the use of personal information yet when their security safeguards are deficient and lead to a breach, the pecuniary consequences can fall on affected consumers.
  • In certain circumstances, compensation to individuals for damages stemming from a breach may be warranted.
  • For individuals to be made whole after a breach, the OPC requires the authority to order compensation for damages suffered.

Background

  • Similar to PIPEDA, CPPA safeguard provisions can be interpreted as requiring organizations to undertake post-breach mitigation measures (for example, credit monitoring) to prevent harm to individuals from materializing. We view such mitigation measures as “safeguards”.
  • During breach investigations, the OPC will frequently have discussions with organizations about measures to reduce harm following a breach. For example, following a major breach, Desjardins voluntarily offered both credit monitoring and compensation for costs associated with the identity recovery process. However, organizations will not always volunteer such measures.
  • In cases where the OPC has investigated a breach and found a violation of the safeguards obligation, in order to make individuals whole the OPC requires the authority to make an order requiring organizations to take measures to provide compensation for damages suffered.

Prepared by: CIRD


Business Activities

Key Messages

  • The CPPA allows the collection and use of personal information without consent for specified business activities, where it is within a reasonable person’s expectations, and it is not to influence an individual’s behaviour or decisions. (S. 18(1))
  • This consent exception is improved from the former C-11, in that two overly broad activities have been removed and the remaining activities must all be “necessary” for a specified purpose.
  • However, there is no requirement that activities prescribed in the future be “necessary” for a specified purpose, which could lead to overly broad activities being prescribed, such as to “improve services”. Adding a necessity requirement at s. 18(2)(d) would ensure the exception remains narrow.
  • As well, the potential to exempt activities from the full scope of the Act through Regulations is of serious concern and needs to be addressed.

Background

  • Our C-11 Submission expressed concern over the breadth of two listed business activities, namely activities to reduce commercial risk, and where consent would be impracticable due to a lack of direct relationship with the individual.
  • Under s. 122(1)(a), regulations can be made to specify activities that would be completely excluded from the application of the Act. This aligns with the French version of this provision in the former C-11. We recommend re-instating the language from the English version of the former Bill C-11 to ensure the regulation remains focused on the consent exception for business activities.
  • S. 15(6) prohibits implied consent for business activities (s.18(2)) or for activities in which the organization has a legitimate interest (s.18(3)). We believe this aims to ensure that organizations that choose to rely on consent for a collection or use related to those activities must obtain express consent.

Prepared by: PRPA


Cybersecurity

Key Messages

  • Cybersecurity is a multifaceted issue (spanning regulatory domains), with privacy being a key component of effective cybersecurity resilience.
  • Stronger breach and safeguarding measures in C-27 could help to improve cybersecurity protection in the private sector.
  • Extending OPC’s powers to collaborate with other domestic regulators could also help to improve safeguarding compliance and mitigate effects of cybersecurity incidents.
  • Bill C-26, which is currently before Parliament, would address cybersecurity from a different angle, by establishing a framework for regulating the security of certain cyber systems.

Background

  • During second reading debate on C-27, several members raised concerns about the need to protect/safeguard growing amounts of personal information collected from Canadians against cybersecurity threats, including malicious attacks (cybercrime) and foreign interference.
  • C-27 leaves safeguarding requirements largely unchanged from PIPEDA. While organizations would be required to protect personal information in proportion to its sensitivity, other important factors are not mentioned (e.g. risk to consumers).
  • We recommend that safeguard obligations consider not just the sensitivity of the information being processed, but also the risks to consumers in the event of a breach, having regard to the nature, scope, and context of the organization’s use of personal information and its business activities.
  • C-27 would allow OPC to share information only with certain regulators (CRTC and Competition Bureau). Authorizing sharing with other bodies (such as credit reporting regulators, the Superintendent of Financial Institutions, and the proposed AI and Data Commissioner) would enable collaboration on efforts to mitigate effects of cybersecurity incidents and promote safeguarding compliance. OPC’s powers to collaborate with NSIRA led to a successful joint review of government disclosures under the Security of Canada Information Disclosure Act (SCIDA) in 2021.

Prepared by: PRPA


Data Mobility

Key Messages

  • Overall, the OPC supports the introduction of data mobility provisions in the CPPA; however, we recommend certain amendments to better align the Bill with international models.
  • C-27 could be strengthened, and consumer control over their information improved, by expanding s.72 to apply to all personal information – including derived or inferred information.
  • We also recommend a clear consultative, advisory or approval role be established for the OPC, as has been done in Australia, with clear roles and responsibilities for mobility frameworks.
  • We would also support expanding the right so that individuals can, where technically feasible, receive this information in a structured, commonly used, machine-readable format, as is done in other jurisdictions.

Background

  • S. 72 obligates organizations to disclose, upon request, an individual’s personal information that it has collected from the individual to an organization designated by the individual, provided both organizations are subject to a data mobility framework.
  • The Australian Consumer Data Right (CDR) Act clarifies that derived information is to be included within each class of information potentially to be shared between organizations. The Act also sets out clear roles and responsibilities for the Australian Information Commissioner, including explicit requirements for it to be consulted on both the designation of, and the rules for, sectors. By contrast, s. 123 of the CPPA specifies that data mobility frameworks are to be established through regulations, without specifying any role for the Privacy Commissioner.
  • The right for individuals to receive their own data in a structured, commonly used, machine-readable format is common in other data protection laws, such as the GDPR, the Australian CDR Act, and laws in California and Quebec (in the latter case, as of 2024).

Prepared by: PRPA


Definition of Minors

Key Messages

  • The CPPA makes reference to personal information of “minors” [ss.2(2), 4(a)-(b), 55(2)(d) and (f)] but does not define the term.
  • S. 2(2) of the CPPA deems personal information of “minors” to be “considered sensitive information”, but without a definition in the CPPA organizations might be unable to determine who is (and is not) a “minor”. A definition would provide clarity.
  • Defining who is (or is not) a “minor” in the CPPA would likely not be problematic if the definition refers to the provincial/territorial legislation setting out the age of majority in each jurisdiction, since the age of majority is typically a matter of provincial/territorial, rather than federal, jurisdiction.

Background

  • Each province/territory has enacted legislation that defines who has (or has not) attained the age of majority within their respective jurisdictions.
  • The age of majority is either 18 or 19 years of age, depending on the province or territory:
    • Alberta, Manitoba, Ontario, PEI, Saskatchewan, and Quebec = 18 years
    • British Columbia, New Brunswick, Newfoundland and Labrador, Nova Scotia, and the 3 territories = 19 years
  • Federal legislation does not typically define who is or is not a minor, but there are some examples where it defines “age of majority” (e.g., s. 2 of the Divorce Act, RSC 1985, c 3): “age of majority, in respect of a child, means the age of majority as determined by the laws of the province where the child habitually resides, or, if the child habitually resides outside of Canada, eighteen years of age; (majeur)”
  • An example of a possible definition of “minor” to be included in the CPPA, respecting provincial/territorial jurisdiction: “Minor, in respect of a child, means an individual who has not yet attained the age of majority as determined by the laws of the province where the child habitually resides, or, if the child habitually resides outside of Canada, eighteen years of age.”

Prepared by: Legal Services


Definition of PI (Inferences)

Key Messages

  • AI models can infer and predict aspects of individuals’ behaviour and interests based on evidence and reasoning. These inferences can reveal deeply personal details, such as political affinity, interests, financial status, and race.
  • Despite OPC decisions and jurisprudence supporting the idea that inferences must be treated as all other personal information, there remains some debate as to how inferences are regarded.
    • Some view them as an output derived from personal information, like an opinion might be, and argue these are outside the purview of privacy legislation.
  • To ensure that inferences are considered to be personal information and appropriately protected, we recommend the definition of personal information be amended to expressly include inferred information.

Background

  • The OPC has found that credit scores amount to personal information, and that inferences amount to personal information under the Privacy Act. This is also consistent with the Supreme Court’s understanding of informational privacy, which includes inferences and assumptions drawn from information (R v Spencer, 2014 SCC 43; R v Kang-Brown, 2008 SCC 18; R v Gomboc, 2010 SCC 55).
  • The California Consumer Privacy Act explicitly includes inferences in its definition of personal information. The Australian OAIC also proposed similar amendments to the Australian Privacy Act. In August 2022, the European Court of Justice issued an opinion noting that data capable of indirectly revealing information such as sexual orientation are themselves considered sensitive.

Prepared by: PRPA


Definition of Sensitive Information

Key Messages

  • Although sensitive information is not explicitly defined in Bill C-27 or in PIPEDA, we believe C-27 would benefit from a definition that sets out a general principle, followed by a list of non-exhaustive examples.
  • A hybrid definition would ensure that certain types of information are always considered sensitive (such as biometric and location information) but also support a contextual interpretation and account for new types of information that may emerge over time.

Background

  • The sensitivity of personal information is relevant to several requirements in the CPPA, including:
    • s. 9(2), relating to the development of an organization’s privacy management program;
    • s. 12(2)(a), as one of the factors to consider when determining whether a reasonable person would consider an organization’s purposes appropriate under the circumstances;
    • s. 15(5), as a consideration that organizations must take into account with respect to the form of consent;
    • s. 53(3), as a factor in determining retention periods;
    • s. 57(1), relating to the level of protection provided by security safeguards;
    • s. 58(8) pertaining to the determination of whether a breach creates a real risk of significant harm.
  • s. 2(2) explicitly states that the personal information of minors is sensitive.
  • Quebec’s Act respecting the protection of personal information in the private sector has a principle-based definition for sensitive information.
  • The GDPR (Art. 9(1)) identifies special categories of data (like genetic and biometric data) that could pose risks to individuals’ rights and therefore require specific conditions to be met (like express consent) before processing is permitted.
  • In May 2022, the OPC published an interpretation bulletin on sensitive information which outlines how certain categories of information have generally been considered as sensitive.

Prepared by: PRPA


De-Identification / Anonymization

Key Messages

  • The OPC supports the implementation of a framework for de-identification/anonymization that would clarify organizations’ responsibilities, provide some flexibility in privacy requirements, and minimize privacy risks (including re-identification risk).
  • In our view, several changes are needed to strengthen and clarify the framework proposed in C-27:
    • S. 74 should explicitly require organizations to account for the risk of re-identification when applying de-identification measures.
    • S. 2(3) should clarify that de-identified information always remains personal information that is subject to the Act.
    • S. 2(1) should be amended to remove reference to “generally accepted best practices”.

Background

  • Strong protections are needed because (1) a broad range of information can be considered de-identified under the proposed framework (i.e., only direct identifiers need be removed); (2) flexibility in privacy requirements reduces individuals’ control over their personal information (consent, accuracy, and disposal on request are reduced); and (3) a persistent risk of re-identification always remains.
  • S. 74 requires organizations to use de-identification measures that are proportionate to the sensitivity of the information and the purpose of de-identifying it, but it does not explicitly address re-identification risk.
  • S. 2(3) states de-identified information remains personal information except in relation to certain provisions (e.g. consent exceptions, accuracy requirements, disposal on request). This creates ambiguity in the status of de-identified information in certain circumstances.
  • S. 2(1) allows organizations to anonymize information in accordance with “generally accepted best practices” but does not specify what these are or what counts as generally accepted.
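To illustrate the persistent re-identification risk noted above: a minimal, hypothetical sketch in Python showing how records stripped of direct identifiers can still be matched to named individuals through the remaining quasi-identifiers. The records, names and values are invented, and this is not an OPC methodology.

```python
# Hypothetical example: direct identifiers removed, but quasi-identifiers
# (postal code prefix, birth year) remain and can be linked to an outside source.

deidentified_records = [
    {"postal_prefix": "K1A", "birth_year": 1985, "diagnosis": "condition A"},
    {"postal_prefix": "V6B", "birth_year": 1990, "diagnosis": "condition B"},
]

# Separately obtained dataset with names (also invented for illustration).
outside_dataset = [
    {"name": "Jane Example", "postal_prefix": "K1A", "birth_year": 1985},
    {"name": "John Sample", "postal_prefix": "V6B", "birth_year": 1990},
]

def attempt_reidentification(records, reference):
    """Match de-identified records to named individuals on shared quasi-identifiers."""
    matches = []
    for record in records:
        for person in reference:
            if (record["postal_prefix"] == person["postal_prefix"]
                    and record["birth_year"] == person["birth_year"]):
                matches.append((person["name"], record["diagnosis"]))
    return matches

print(attempt_reidentification(deidentified_records, outside_dataset))
# [('Jane Example', 'condition A'), ('John Sample', 'condition B')]
```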

Prepared by: PRPA


Disclosures to Law Enforcement

Key Messages

  • In general, we believe that the language in sections 43 to 50, for disclosures to law enforcement and other government agencies, represents an improvement and clarification from PIPEDA.
  • However, two potential further enhancements would be to:
    • Clarify the concept of “lawful authority” and organizations’ obligations under section 44; and,
    • Introduce reporting provisions for government and record-keeping (and/or regular reporting) for organizations.
  • In past OPC investigations and reviews, the “lawful authority” concept in PIPEDA has proved ambiguous, and remains so in the CPPA, as companies continue to disclose personal information, without consent, to law enforcement agencies, absent a warrant.

Background

  • Particularly for the purposes of section 44 and following a 2014 Supreme Court decision on the issue (R. v Spencer), the OPC concluded that the concept of “lawful authority” should be defined clearly.
  • Ideally, the definition would clarify that the discretionary disclosures to law enforcement envisaged under section 44 should only be permissible when:
    • there are exigent circumstances,
    • it is pursuant to a reasonable law (other than section 44 of the CPPA), or,
    • it is in prescribed circumstances where personal information would not attract a reasonable expectation of privacy.
  • In 2015, when PIPEDA was last amended (through Bill S-4), the OPC recommended: “a legal framework, based on the Spencer decision, is needed to provide clarity and guidance to help organizations comply with PIPEDA and ensure that state authorities respect the Supreme Court of Canada’s decision. Such a framework would provide Canadians with greater transparency about private sector disclosures of personal information to state agencies.”

Prepared by: PRPA


Disposal

Key Messages

  • Reputation issues related to disposal, raised in the OPC’s submission on the former Bill C-11, have been partially addressed by C-27. For example, C-27 has addressed our recommendation that s. 55 be expanded to include all personal information held by an organization about an individual.
  • However, s. 55(2)(f) of C-27 would allow organizations to refuse a deletion request if the personal information is to be disposed of according to a retention policy (and is not about a minor), and the individual is informed of the remaining time period.
  • Our Office recommends that s. 55(2)(f) be removed from the bill, as the provision may make it difficult in practice for Canadians to delete their personal information, putting the information at risk of being subject to a data breach.

Background

  • s. 55(1) provides that an organization must dispose of an individual’s personal information on request as soon as feasible if one of the following applies: (i) the Act has been contravened; (ii) an individual has fully/partially withdrawn their consent; (iii) the information is no longer necessary to provide a product or service to the individual. These limitations were not present in C-11. They are similar to, but more limited than, conditions found in Article 17 of the GDPR.
  • In instances where an individual has withdrawn their consent or where the information is no longer necessary to provide a product or service, an organization may refuse a disposal request if certain exceptions apply. Other than s. 55(2)(f) concerned with retention policies, these exceptions include when:
    • the personal information of another individual would also be disposed of;
    • statutory or contractual requirements prevent disposal;
    • the information is necessary for legal defence or other legal remedies;
    • the information is necessary to the ongoing provision of a product or service to the individual in question and is not about a minor; or
    • the request is vexatious or made in bad faith.

Prepared by: PRPA


Domestic Service Providers

Key Messages

  • S. 11(1) of Bill C-27 requires that transferring organizations ensure, by contract or otherwise, an “equivalent” level of protection as that required under the Act. This scheme is overall reasonable.
  • However, certain of our recommendations made in response to the former Bill C-11 would still apply in the context of domestic service providers.
  • These recommendations would help to ensure accountability in situations such as when a service provider collects, uses or discloses personal information directly, or subcontracts.

Background

  • The English version of the former Bill C-11 (at s.11(1)) would have required organizations to, when transferring information to a service provider, ensure a level of protection that is “substantially the same” as that required under the Act. The French version required an “equivalent” level of protection. Bill C-27 has been modified to now require an “equivalent” level in both French and English.
  • One recommendation from OPC’s Bill C-11 submission relating to domestic service providers was adopted. Recommendation 5 of Appendix B stated that service providers should not be able to avail themselves of the “business activities” exception to consent at s. 18(2)(e) and this provision has been removed.
  • All other service provider recommendations remain relevant, specifically:
    • Former Recommendation 3: s. 7(2), s. 11(1), s. 11(2), s. 19, s. 62(2)(d) of the CPPA should be amended to ensure these rules adequately reflect the broad scope of data transfers between organizations and service providers.
    • Former Recommendation 4: The law should make it clear that in a sub-contracting situation, the organization/controller remains accountable.
    • Former Recommendation 7: s. 11(2) of the CPPA should be amended so accountability is not limited to information transferred to an organization. It should also apply to information that service providers collect, use or disclose on behalf of the organization.

Prepared by: PRPA


Exceptions to Consent for Research

Key Messages

  • Section 35 of the CPPA facilitates the disclosure of personal information without knowledge or consent where the disclosure is for research, study or statistical purposes. S. 7(3)(f) of PIPEDA allows for disclosure without knowledge or consent for statistical or scholarly study or research.
  • The removal of the qualifying word “scholarly” from this provision turns a narrow exception into a potentially expansive authority to disclose personal information without consent.
  • Without parameters for the type of “study” or “research” that this exception can be used for, these terms may be interpreted to include almost any kind of commercial study or research.
  • There are also no limits on the types of organizations that can receive information pursuant to this exception, meaning that it could be used to disclose information to government institutions and bypass section 39 of the CPPA, which limits disclosures to government to prescribed “socially beneficial purposes”.
  • Re-inserting “scholarly” into section 35 would ensure this exception to consent for research and study remains narrow.

Background

  • There are minimal conditions that must be met before this exception can be used: the purposes for which the information is being disclosed cannot be achieved without the information, consent must be impracticable to obtain, and the OPC must be notified in advance.
  • There are no requirements for the receiving organization to have safeguards in place to protect the information.
  • Provincial privacy laws contain exceptions permitting disclosures for research or study, but with more stringent safeguards than the CPPA. For example, Quebec’s new private sector privacy law permits disclosures for research purposes without consent, but a PIA addressing a number of requirements must first be completed. This provision has been in force since September 2023.

Prepared by: Legal Services


Facebook Litigation

Key Messages

  • The OPC initiated an Application under section 15 of PIPEDA to protect Canadians’ privacy.
  • The OPC is appealing the Federal Court’s decision because the matter raises important questions with respect to the interpretation and application of privacy law that will benefit from clarification by the Federal Court of Appeal.
  • The issues at the heart of this matter are critically important for the fundamental privacy rights of Canadians and their ability to participate with trust in our digital society.

Background

  • In March 2018, the OPC received a complaint about Facebook that arose amid media reports that Cambridge Analytica had accessed the personal information of Facebook users without their consent via a third-party application (TYDL App).
  • The OPC and the Office of the Information and Privacy Commissioner for British Columbia jointly investigated and found that Facebook had not obtained meaningful consent from its users before disclosing their personal information and that it had not implemented adequate safeguards.
  • The OPC filed an application with the Federal Court under s. 15 of PIPEDA seeking, inter alia, an order requiring Facebook to correct its practices to comply with PIPEDA, as Facebook did not agree to implement the OPC’s recommendations.
  • In April 2020, Facebook filed an application seeking judicial review of the OPC’s investigation process and the resulting Report of Findings.
  • On April 13, 2023, the Federal Court dismissed Facebook’s application for judicial review. The Court found that the OPC had not breached procedural fairness and that the investigation was not out of time. This decision is not being appealed.
  • On April 13, 2023, the Court also dismissed the Commissioner’s s. 15 application, finding that there was insufficient evidence to conclude that Facebook had failed to comply with PIPEDA. The OPC appealed this decision on May 12.
  • The Centre for Free Expression is seeking leave to intervene in the appeal.

Prepared by: Legal


Google Decision

Key Messages

  • I welcome the Federal Court of Appeal’s decision to uphold the Federal Court’s decision that Google’s search engine service is subject to PIPEDA and that it is not exempt from the law under the “journalistic purposes” exemption.
  • In 2018, my Office filed a reference with the Federal Court seeking clarity on whether Google’s search engine is subject to PIPEDA when it indexes webpages and presents search results in response to queries of a person’s name.
  • The reference proceedings stem from an investigation into a complaint alleging that Google’s search engine was contravening PIPEDA by including links to news articles in search results for the complainant’s name. The individual argued the articles were outdated, inaccurate and disclosed sensitive personal information.

Background

  • During the reference proceeding, Google made numerous attempts to raise constitutional arguments reflecting its position that applying PIPEDA to Google’s search engine in this context would likely be found to be unconstitutional on freedom of expression grounds. The OPC consistently argued that the constitutional arguments are premature.
  • The Federal Court dismissed Google’s argument that it “should decline to answer the reference questions or dismiss the Reference because the questions cannot/should not be answered without addressing constitutional issues…”.
  • In the 2-1 decision, all three Federal Court of Appeal justices agreed that the reference judge did not err in dismissing Google’s constitutional arguments and finding that the Reference questions could be answered without considering the Charter. However, the dissent held that the journalistic purposes exemption applied to Google to the extent that it is collecting and disclosing news articles.
  • Google has 60 days (until November 28, 2023) to seek leave to appeal the FCA’s decision to the Supreme Court.
  • OPC will proceed with our investigation into Google’s compliance with PIPEDA in relation to the de-indexing of search results containing personal information.

Prepared by: PRPA


Implications of C-27 on Substantially Similar legislation

Key Messages

  • The CPPA maintains PIPEDA’s provisions whereby organizations will be largely exempt from the federal statute in provinces or territories with substantially similar privacy laws.
  • C-27 is silent on what will happen with existing exemption orders for substantially similar provincial legislation.
  • It is up to the Government to ensure substantially similar legislation maintains this status if C-27 is passed.

Background

  • In accordance with section 26 of PIPEDA, there are currently several provincial privacy statutes that have been deemed substantially similar to Part 1 of PIPEDA.
  • By Governor in Council order, organizations (except federal works, undertakings and businesses) are therefore exempt from the application of Part 1 of PIPEDA with respect to the collection, use and disclosure of personal information that occurs within those provinces.
  • ISED has established criteria for when provincial privacy legislation will be deemed substantially similar, including that the legislation provides privacy protection that is consistent with and equivalent to that in PIPEDA.
  • C-27 would repeal Part 1 of PIPEDA and enact the CPPA.
  • The CPPA authorizes the Governor in Council to issue exemption orders for substantially similar legislation; however, the Act does not contain any provisions regarding the existing exemption orders made pursuant to PIPEDA.
  • Paragraph 44(g) of the federal Interpretation Act states that all regulations made under a repealed enactment remain in force and are deemed to have been made under the new enactment, in so far as they are not inconsistent with the new enactment, until they are repealed or replaced.
  • Even if the exemption orders made under PIPEDA remain in force, they would be meaningless, since they are exemptions from the application of Part 1 of PIPEDA, which would no longer exist.
  • The Government can issue new orders for substantially similar legislation under the CPPA or could potentially amend the existing orders.

Prepared by: Legal


Interoperability of Public/Private Legal Requirements

Key Messages

  • It is important for both the private and public sectors to have modernized laws that treat privacy as a fundamental right.
  • Protecting Canadians’ privacy does not have to come at the expense of commercial interests or innovation in the private-sector context, or at the expense of the public interest in the public-sector context.
  • With these two sectors interacting ever more frequently, common privacy principles in our federal privacy laws would enable organizations active in both sectors to operate in a more consistent regulatory environment and would serve to mitigate accountability gaps.
  • With this proposal for private-sector law reform advancing through Parliament, we hope to see public-sector law reform follow soon.

Background

  • Several pandemic-related initiatives and the RCMP’s use of Clearview AI’s facial recognition technology are recent examples of the growing reliance on public-private partnerships, and they demonstrate the need for common requirements across our two federal privacy laws.
  • In its most recent consultation on modernization of the Privacy Act, Justice Canada suggested that stronger alignment between the Privacy Act and PIPEDA could enhance domestic interoperability, prevent gaps in accountability where public and private sector entities interact, and further confirm the Privacy Act’s alignment with established global standards.
  • Justice Canada’s recent report summarizing its consultations also confirmed that stakeholders agree with the need to align our public and private sector frameworks for the protection of personal information.
  • The Mandate letters for both Minister Champagne and Minister Lametti contain commitments to advance privacy law reform, and they jointly introduced Bill C-27.

Prepared by: Legal Services


Legitimate Interest

Key Messages

  • We have previously recommended that a legitimate interest exception to consent be included in the CPPA, accompanied by a rights-based regime and pre-conditions such as a requirement to conduct a PIA and a balancing test. We also called for this exception to be monitored through proactive compliance checks by my Office.
  • Bill C-27’s exception to consent for legitimate interest aligns with most elements of our former recommendations.
  • We have recommended that the Bill’s preamble and purpose clauses be strengthened to recognize the importance of privacy as a fundamental right, which would ensure that the “Legitimate Interest” consent exception is situated in a rights-based regime.
  • We welcome the limits placed on this exception to consent, specifically the requirements that a reasonable person would expect the collection or use for the activity in question, and that the information not be collected or used for the purpose of influencing the individual’s behaviour or decisions.

Background

  • Bill C-27’s exception to consent for legitimate interest aligns with most elements of our former recommendations (Former recommendation 14). Bill C-27 contains:
    • A balancing test, albeit against any potential adverse effect on the individual instead of the fundamental rights of the individual.
    • An assessment to identify any potential adverse effects on the individual likely to result from the collection or use, and a requirement to take reasonable measures to reduce, mitigate or eliminate the likelihood of the effects.
    • A requirement that the assessment be provided to the OPC upon request.

Prepared by: PRPA


Online Behavioural Advertising

Key Messages

  • Under the CPPA, organizations cannot require an individual to consent to the collection, use or disclosure of their personal information beyond what is necessary to provide the product or service (s. 15(7)).
  • If the collection, use or disclosure of PI is necessary for the provision of services offered by online platforms, the context will inform whether express or implied consent is required; this will need to be evaluated on a case-by-case basis.
  • The business activities (s. 18(1)) and legitimate interest (s. 18(3)) exceptions to consent in the CPPA would likely not apply to online behavioural advertising, as both require that PI not be collected or used for the purpose of influencing the individual’s behaviour or decisions. However, the legislation does not define what constitutes influencing an individual’s behaviour or decisions.

Background

  • Section 15(5) of the CPPA requires organizations to obtain express consent unless it is appropriate to rely on an individual’s implied consent, taking into account the reasonable expectations of the individual and the sensitivity of the personal information that is to be collected, used or disclosed.
  • Section 15(6) clarifies that it is not appropriate to rely on implied consent for activities described in ss. 18(2) and (3). These subsections refer to exceptions to consent for certain business activities and legitimate interests.
  • It was recently reported that the European Data Protection Board has ruled that Facebook and Instagram must obtain explicit consent from users before serving them targeted ads based on their activity within those apps. The EDPB has reportedly called on Ireland’s Data Protection Commission, which regulates Meta (formerly Facebook), to issue public orders and levy fines. Currently, users of those platforms do not have the option to opt out of ads based on their activity within Meta’s apps, such as the videos they watch or the items they tap.

Prepared by: Business Advisory Directorate


Privacy by Design / Privacy Impact Assessments

Key Messages

  • Organizations should be required to implement privacy by design and to conduct PIAs for high-risk activities.
  • Such proactive requirements would help organizations demonstrate that they are accountable for personal information under their control, ensure they comply with the law, and limit the risk of privacy breaches.
  • Compliance with the law cannot rest only on investigations and penalties. Proactive strategies can be more impactful in achieving ongoing compliance and respect for the privacy rights of consumers.

Background

  • Like PIPEDA and the former Bill C-11, Bill C-27 does not require organizations to implement privacy by design or to conduct PIAs for high-risk activities.
    • Privacy by design refers to proactively integrating privacy-protective measures into the design of a product or service, from the initial phases of development up to deployment.
    • A PIA is a risk management process that should be undertaken before deployment of any high-risk activity involving sensitive information or high-impact AI systems, such as those making decisions about job offers, loan eligibility, insurance premiums, or suspected unlawful or suspicious behaviour.
  • Requiring PIAs for all activities could pose an excessive burden, especially on small- and medium-sized enterprises. However, a PIA requirement for high-risk activities would help ensure that more significant privacy risks are being assessed and addressed.
  • This would align with Article 25 of the GDPR, which requires organizations to meet a general privacy-by-design standard applicable to all processing activities, and Article 35, which requires organizations to conduct a PIA for high-risk initiatives.

Prepared by: PRPA


Political Parties

Key Messages

  • My Office has repeatedly called for political parties to be subject to privacy legislation, based on internationally recognized privacy principles, including an independent third party with authority to verify and enforce compliance.
  • I recently appeared before the Senate on amendments to the Canada Elections Act in Bill C-47 and stressed that the proposal fails to impose specific privacy-related requirements on political parties or to provide for independent oversight.
  • I reaffirmed our position that political parties should be subject to privacy obligations and that our Office should play a role to ensure the protection of privacy rights in this context.

Background

  • Bill C-47, Budget Implementation Act: Includes amendments to authorize the collection, use, disclosure, retention and disposal of personal information by political parties and their affiliates in accordance with their own privacy policies. There are no provisions requiring adherence to specific privacy principles, nor providing for independent review, access, or complaints. We appeared on these provisions before the Senate Legal and Constitutional Affairs Committee on May 3, 2023.
  • 2019 Complaint: In August 2019, we received a PIPEDA complaint against the three major federal political parties. We concluded that PIPEDA did not apply to the activities of the political parties at issue in the complaint (e.g., targeted advertising to voters), as these were not “commercial” in character.
  • Other jurisdictions: Privacy laws in other jurisdictions such as British Columbia, the UK and the EU apply to political parties, and others like Quebec are moving to do so (with the coming into force of the relevant provisions of Bill 64 in September 2023).
  • Bill C-76: In 2018, Bill C-76 amended the Canada Elections Act to require limited privacy obligations (i.e., developing, registering, and publishing privacy policies). That obligation, however, imposed no new substantive protections for voter information or operational limitations on party use of personal information.

Prepared by: PRPA


Preamble

Key Messages

  • The addition of the preamble in Bill C-27 is a positive development. As drafted, however, it does not go far enough in recognizing that privacy is a fundamental right.
  • The term “privacy interests” is ambiguous and should be amended to “fundamental privacy right of individuals”.
  • Emphasizing privacy as a fundamental right would help ensure that the appropriate balance is struck between the individual and legitimate commercial interests.
  • The placement of the preamble in Bill C-27, as opposed to within each of the Acts contained in the Bill, raises an additional concern that the important principles contained therein could be overlooked in the future.

Background

  • Preambles are an important interpretive aid for understanding Parliament’s intent when enacting legislation. The current preamble is only found in the introductory section of Bill C-27, however. This means that once enacted, none of the Acts contained in Bill C-27 will have their own preambles.
  • We recommend that a separate preamble be embedded in each of the Acts to ensure that the important principles contained in the overarching preamble are not overlooked when interpreting the legislation in the future.
  • More substantively, the preamble’s current language of “privacy interests” appears to downplay the importance of privacy in relation to commercial interests, which could lead a future court to weigh these interests in a manner that diminishes privacy.
  • The preamble should explicitly recognize privacy as a fundamental right of all individuals. This is necessary to foster greater consumer confidence in the digital economy and encourage responsible use of personal information by organizations in a way that supports commerce and economic growth.

Prepared by: Legal


Publicly Available Information

Key Messages

  • The OPC generally supports the approach taken for publicly available personal information in s. 51 of the CPPA (which mirrors existing provisions under s. 7 of PIPEDA).
  • However, appropriate limitations should be established in the Act to place parameters on any modifications or additions to the list of publicly available information to be specified by regulation.
  • We recommend that s.51 of the CPPA be amended to include a condition that an individual’s reasonable expectation of privacy be considered in the assessment of whether information is publicly available.
  • OPC investigations have shown there is a need to avoid an overly broad interpretation of how “publicly available information” can be used without consent, as such an interpretation could lead to serious harms.

Background

  • Section 51 allows organizations to collect, use or disclose an individual’s personal information without their knowledge or consent, if that information is publicly available and specified by regulations. This is essentially unchanged from PIPEDA (ss. 7(1)(d), 7(2)(c.1) and 7(3)(h.1)).
  • The OPC has seen multiple instances in which an organization has argued for the legitimacy of their activities based at least in part on their reliance on information that was accessible to the public (Clearview AI (2021) and Globe24h (2015)).
  • In the public sector context, the 2021 OPC submission to the Department of Justice consultations on Privacy Act reform supported a clear definition and regulation of “publicly available personal information”, rather than excluding it from the requirements of the Act.

Prepared by: PRPA


Purpose Clause

Key Messages

  • This Office has previously advocated for amending the PIPEDA purpose clause to properly reflect the fundamental right of privacy.
  • While section 5 of the CPPA does acknowledge the “right of privacy of individuals with respect to their personal information”, this language should be strengthened to ensure that the important interests at stake are properly balanced.
  • Consistent with our submissions with respect to the preamble, section 5 should be amended to refer to the “fundamental privacy right of individuals” as forming an integral part of what the CPPA is meant to protect and to foster.

Background

  • Ensuring that privacy is characterized as a fundamental right has been a consistent concern for the OPC, and was something emphasized in our May 2021 Bill C-11 submission.
  • Characterizing privacy as a fundamental right is consistent with international human rights instruments that recognize a right to privacy and with the Supreme Court of Canada’s description of privacy as a fundamental right (Douez v Facebook Inc, 2017 SCC 33 at para. 105).
  • Using consistent language in both Part I and the preamble (which does not currently speak to “privacy rights”, and for which we are recommending a similar amendment) ensures consistency in interpreting the key principles animating the legislation.
  • Such an amendment would not, in our view, affect the constitutional validity of the legislation. Recognizing privacy as a fundamental right does not diminish the general trade and commerce objective of this Act; it simply ensures that the interests to be balanced are properly calibrated.

Prepared by: Legal


Reputation (De-indexing)

Key Messages

  • There are no provisions specific to de-indexing in Bill C-27. My Office recommends a clear and explicit right to de-indexing and/or removal of personal information from search results and other online sources. Quebec’s Law 25 is a useful model.
  • This form of protection is especially relevant for minors, who deserve to develop as persons in an online environment without fear that every digital trace could lead to unfair treatment in the future.
  • As constitutional rights risk coming into conflict here, Parliament should enact mechanisms and criteria from the outset to balance protection of individuals’ reputation, particularly that of children, with the preservation of free expression.

Background

  • We use the term “de-indexing” to describe the process by which a webpage, image, or other online resource is removed from the results returned for a search query (e.g., a search of an individual’s name). The EU instead uses the term “de-listing”.
  • Section 28.1 of Quebec’s Law 25 sets out reasonable criteria for the de-indexing or removal of information: the dissemination causes the person reputational injury; the injury is greater than the interest of the public in knowing the information or the interest of free speech; and the cessation of dissemination, re-indexing or de-indexing requested does not exceed what is necessary to prevent the perpetuation of the injury.
  • In assessing injury, certain factors must be considered, including whether the person concerned is a public figure or a minor, the accuracy and sensitivity of the information, the context in which it was disseminated, and so on.
  • In a 2-1 decision, the Federal Court of Appeal upheld the Federal Court’s decision on the reference proceeding brought by the OPC in 2018, which found that Google’s search engine collects, uses and discloses personal information in the course of commercial activities and that it is not exempt from PIPEDA under the law’s journalistic purposes exemption.
  • The OPC will proceed with its investigation into Google’s compliance with PIPEDA in relation to the de-indexing of search results containing personal information.

Prepared by: PRPA


Safeguards

Key Messages

  • The CPPA incorporates without much change the provisions of PIPEDA with respect to security measures organizations must take to protect consumers’ personal information, but we believe there is an opportunity to improve these provisions.
  • In particular, we recommend that safeguard obligations take into account not only the sensitivity of the information being processed, but also the risks to consumers in the event of a breach, having regard to the nature, scope and context of the organization’s use of personal information and its business activities.
  • That said, we are supportive of the inclusion at s. 57(3) that safeguards must include reasonable measures to authenticate the identity of individuals.

Background

  • WADA Investigation: As noted in our submission on the former C-11, in 2016, the OPC became aware of a breach of personal information at the World Anti-Doping Agency (WADA) which involved the sensitive medical information of several athletes.
    • Our investigation found that while a robust safeguard framework must take into account the sensitivity of the data, many other factors are relevant, including the risk that an organization and the information it holds will be hacked because they are a valuable target.
  • Authentication: We support the amendment at s. 57(3) relating to identity authentication and note that the GDPR (at Recital 64) encourages use of “all reasonable measures to verify the identity of a data subject who requests access, in particular in the context of online service and online identifiers”.
    • Concerns had been raised during second reading of the former Bill C-11 that the Bill failed to protect people’s identities online and to prevent fraud resulting from identity theft, especially during financial transactions.

Prepared by: PRPA


Socially Beneficial Purposes (Exceptions to Consent)

Key Messages

  • The OPC is generally supportive of the CPPA’s introduction, at s.39, of an exception to consent for socially beneficial purposes.
  • There are many advantages to permitting the use of personal information for socially beneficial purposes, and modern privacy legislation should responsibly facilitate this.
  • While we have certain recommendations for how this measure could be enhanced, we note that particular features have been added to limit the potential privacy risks associated with the provision.
    • For example, only de-identified personal information may be disclosed, and disclosures can only be made to listed or prescribed entities that have a mandate to carry out a socially beneficial purpose.

Background

  • We recommend that: requests be made in writing so the disclosing organization can satisfy itself that the use is of societal benefit; agreements prohibit the re-identification of personal information and limit the use of information for secondary purposes that are not of societal benefit; and the definition of “socially beneficial purposes” be modified to include only “activities that are beneficial to society and not simply of individual or commercial interest or profit”.
  • While it is an offence under the CPPA to use de-identified information to identify an individual (s. 128), this would likely not extend to the entities to which de-identified information is disclosed. Re-identification by those entities should therefore be prohibited.
  • S. 39 shares similarities with the GDPR’s public interest lawful ground for processing (Article 6(1)(e)), which allows processing that is “necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”. Both s. 39 and Article 6(1)(e) are primarily concerned with processing carried out by public sector agencies or private organizations that have a public mandate or authority.

Prepared by: PRPA


Trans-border Data Flows

Key Messages

  • PIPEDA does not adequately address risks to privacy posed by global data flows. Bill C-27 in its current form would not resolve those weaknesses.
  • For example, C-27 does not establish a comprehensive scheme to govern trans-border data flows, nor do its provisions distinguish between domestic and trans-border flows of information.
  • C-27 should be amended to include provisions specific to international flows of data so that obligations in this respect are clear and specific. This would bring it into alignment with most modern privacy laws, which explicitly and separately address trans-border data flows (as in Australia, New Zealand and the GDPR).

Background

  • Principle 4.1.3. of PIPEDA requires organizations to ensure a “comparable level of protection” when transferring information to a third party for processing. Former Bill C-11 would have required organizations to ensure service providers provide “substantially the same” level of protection (s.11(1)) while Bill C-27 now requires an “equivalent” level.
  • Bill C-27 relies on contractual means (or otherwise) to ensure an “equivalent” level of protection, as opposed to PIPEDA’s “contractual or other means”. By contrast, other jurisdictions enumerate specific options in this regard, including mechanisms such as adequacy rulings, standard contractual clauses, codes of conduct or other schemes such as binding corporate rules.
  • The OPC recommends that a separate scheme for trans-border data flows be implemented that addresses the considerations identified in Teresa Scassa’s paper (appended to the OPC’s May 2021 C-11 submission) relating to: 1) to whom the obligations apply; 2) accountability; 3) conditions to be met; and 4) protections in the destination State.

Prepared by: PRPA

