Issue sheets on the review of Alberta’s Personal Information Protection Act
Appearance before the Alberta Standing Committee
Federal, provincial, territorial, and international comparators
In this section
Substantially similar regimes
Key messages
- Alberta’s PIPA was deemed substantially similar to Part 1 of PIPEDA in 2004.
- The Consumer Privacy Protection Act proposed in C-27 maintains PIPEDA’s approach, under which organizations are largely exempt from the federal statute in provinces or territories with substantially similar privacy laws.
- C-27 is silent on what will happen with existing exemption orders for substantially similar provincial legislation.
Background
- Pursuant to paragraph 26(2)(b) of PIPEDA, several provincial privacy statutes are currently deemed substantially similar to Part 1 of PIPEDA.
- By Governor in Council order, organizations (except federal works, undertakings, and businesses) are therefore exempt from the application of Part 1 of PIPEDA with respect to the collection, use, and disclosure of personal information that occurs within those provinces.
- ISED has established criteria for when provincial privacy legislation will be deemed substantially similar, including that the legislation provides privacy protection that is consistent with and equivalent to that in PIPEDA.
- C-27 would repeal Part 1 of PIPEDA and enact the CPPA.
- The CPPA authorizes the Governor in Council to issue exemption orders for substantially similar legislation; however, the Act does not contain any provisions regarding the existing exemption orders made pursuant to PIPEDA.
- Should the CPPA come into force, the current exemption orders for substantially similar legislation, including the one for Alberta’s PIPA, would need to be replaced with new orders issued by the Governor in Council, because the existing orders are framed as exemptions from the application of PIPEDA rather than of the CPPA.
Lead: Legal
Federal, provincial, and territorial (FPT) collaboration
Key messages
- Interoperability of privacy laws helps to reassure individuals that their data is subject to similar protections when it crosses borders and to reduce compliance costs for organizations.
- As Commissioners, we try to have a consistent approach across jurisdictions to address privacy compliance challenges.
- I meet with my provincial and territorial counterparts monthly, and in person annually, to discuss issues of shared interest, provide updates, and identify areas of mutual concern.
- The group also issues joint resolutions and statements to express consensus on public policy matters of common interest or concern to Commissioners.
- My Office works especially closely with our counterparts in Alberta, British Columbia, and Quebec, as they have provincial laws deemed substantially similar to the Personal Information Protection and Electronic Documents Act, on activities such as joint investigations and guidance for organizations.
Background
- In May 2022, the OPC, along with the Alberta, Quebec and BC offices, entered into a Memorandum of Understanding to allow us to share information, consult on enforcement matters, discuss areas of mutual policy interest and develop public education and compliance resources. To achieve these goals, our offices have formed two fora that meet regularly, namely the Private Sector Privacy Forum and the Domestic Enforcement Collaboration Forum.
- In October 2023, the FPT Commissioners adopted 2 privacy resolutions: Putting Best Interests of Young People at the Forefront of Privacy and Access to Personal Information and Protecting Employee Privacy in the Modern Workplace. In December 2023, the FPT Commissioners released principles for responsible, trustworthy and privacy-protective generative artificial intelligence technologies.
- The OPC is currently conducting several joint investigations with our provincial counterparts, including on TikTok, OpenAI, and Certn. Results will be released in the coming months.
Lead: IPT
Quebec Law 25
Key messages
- Law 25, formerly Bill 64, has set the bar high for privacy protection in Canada.
- Unlike Bill C-27, Law 25 includes provisions that protect the right to reputation and give individuals the right to contest automated decisions, and it applies to political parties.
- It also provides my Quebec counterpart with the ability to proactively verify an organization’s compliance and to issue monetary penalties for a wide range of violations, including those connected to the collection, use or disclosure of personal information.
- Many of my recommendations on Bill C-27 were informed by Law 25, and by laws of other domestic and international jurisdictions.
Background
- Law 25 was enacted on September 21, 2021, and entered into force in three phases. As of September 22, 2024, it is fully in force, with data portability rights implemented as the final phase.
- Although the CPPA provides for some new individual rights, obligations, and enforcement mechanisms, they are limited compared to what was introduced under Law 25. Law 25 includes provisions relating to privacy by default, PIAs, transborder data flows, proactive audits, and application to political parties.
- Law 25 permits disclosure of PI for research purposes without consent, but also provides for safeguards, such as completion of a PIA by the organization before disclosure. In the context of Bill C-27 the OPC recommended that the CPPA’s exception to consent for research instead be narrowed to “scholarly” study or research.
- Law 25 includes a requirement that anonymization be done in accordance with “generally accepted best practices”. In ongoing clause-by-clause consideration of C-27, INDU passed a motion to amend the CPPA’s definition of “anonymize” to remove this language, as the OPC had recommended. Regulations published on May 15th, 2024 for Law 25 detail a three-stage process requiring organizations to define purposes, implement anonymization techniques based on best practices and risk of re-identification, and periodically reassess anonymized data.
Lead: PRPA
GDPR and Adequacy
Key messages
- On January 15th, 2024, the European Commission concluded that Canada continues to offer an adequate level of protection under the General Data Protection Regulation (GDPR), meaning organizations will be able to continue transferring data from the EU to Canada without additional requirements.
- The Commission recommended enshrining in legislation some of the protections that have been developed at the sub-legislative level, in order to enhance legal certainty and consolidate new requirements, such as those regarding sensitive personal information.
- The Commission noted that the ongoing reform of PIPEDA (C-27) could offer such an opportunity.
Background
- Article 45 of the GDPR allows personal data to be transferred to a third country without additional requirements where the European Commission has decided that the third country ensures an “adequate level of protection”.
- PIPEDA was granted adequacy status in 2001 under Article 25(6) of Directive 95/46/EC. When the GDPR came into force in 2018, existing decisions remained in force until amended, replaced, or repealed by the Commission.
- On January 15th, 2024, the European Commission concluded a review of 11 adequacy decisions, including Canada’s. The review focused on developments in Canada since the adoption of the 2001 decision, including Canada’s data protection framework, rules on oversight, enforcement, redress, and government access to data. As decisions are reviewed every four years, we expect another review in 2028.
- Provinces do not need substantially similar laws in place for organizations to receive personal information from the EU under the adequacy decision, as these transfers are cross-border data transfers and therefore subject to PIPEDA.
Lead: PRPA
International developments: General
Key messages
- Stronger global privacy rights and international partnerships help ensure that Canadians’ personal information remains protected when it is processed outside of Canada’s borders.
- International interoperability of privacy laws also benefits organizations by reducing compliance costs.
- The OPC participates in international networks and works with counterparts to leverage resources, develop common policy positions, and share best enforcement practices.
Background
- G7 DPA Roundtable: The 2024 G7 DPA Roundtable will be in Rome in October. OPC will host in 2025. The 3 working groups are Data Free Flow with Trust, Enforcement Cooperation, and Emerging Technologies (which OPC chairs).
- Global Privacy Assembly (GPA): OPC chairs 3 working groups (Data Protection and Other Rights and Freedoms, International Enforcement, and Digital Citizens and Consumers), and is a member of 8 others, including Ethics in AI and Digital Education. The 2024 GPA will take place in October in Jersey (Channel Islands).
- Other fora: OPC is a member of the Global Privacy Enforcement Network (GPEN), the Asia Pacific Privacy Authorities (APPA), and l’Association francophone des autorités de protection des données (AFAPDP). The OPC also engages with the Asia-Pacific Economic Cooperation (APEC) Data Privacy Subgroup, the Global Cross Border Privacy Rules Forum, and the OECD Working Party on Data Governance and Privacy in the Digital Economy.
- Joint initiatives: OPC has signed 10 bilateral and 3 multilateral memoranda of understanding and participates in the APEC Cross-Border Privacy Enforcement Arrangement, the GPA Cross Border Enforcement Arrangement, and the Global Cooperation Arrangement for Privacy Enforcement (Global CAPE). OPC is currently jointly investigating 23andMe with the UK Information Commissioner under the MOU between the two offices.
- GDPR Adequacy: In January 2024, the European Commission determined that Canada’s private sector privacy legislation, PIPEDA, continued to offer adequate protection under the General Data Protection Regulation (GDPR).
Lead: IPT
International developments: EU AI Act
Key messages
- The EU AI Act takes a risk-based approach, prohibiting some artificial intelligence practices (such as social scoring) and labelling others as high-risk (such as AI systems used to evaluate eligibility for public benefits or services).
- The EU AI Act’s main requirements apply to these high-risk systems, and to “general purpose” AI models (such as those upon which generative AI is built). Certain transparency requirements will also apply to lower risk systems.
- Canada’s AIDA takes a similar approach, but consistency between the Acts will largely depend on what amendments are made to AIDA during or after INDU’s study of C-27, and on the regulations that will be required if the Bill passes.
Background
- The EU AI Act came into force on August 1st, 2024; its provisions will come into application in stages, from 6 months to 2 years after that date.
- The next step of the EU AI Act’s implementation is for each member state to identify an oversight authority, which must occur by November 2024.
- In the EU AI Act, “high-risk” AI systems are subject to requirements including:
- risk and quality assessments,
- logging and record-keeping for traceability,
- general human oversight,
- accurate and representative data for AI training,
- ex-ante conformity assessments, and
- demonstrable accountability.
- Similarities between the EU AI Act and AIDA include that most requirements apply to high-risk AI systems and general-purpose AI models; the types of systems identified as high-risk; and, (broadly) the nature of requirements.
- Differences include that AIDA does not include any prohibited AI practices, and (depending on final text) specifics regarding terminology and requirements.
Lead: PRPA
Bill C-27
In this section
Bill C-27: General
Key messages
- If passed, Bill C-27 would establish stronger privacy protections for individuals and create incentives for organizations to comply while allowing for greater flexibility to innovate.
- However, I believe that it must go further to ensure that Canadians’ privacy rights are better protected in the digital environment, to promote innovation, and to avoid leaving too much to regulation.
- My office has proposed 15 recommendations to strengthen the Bill, including recognizing privacy as a fundamental right and protecting children’s privacy and the best interests of the child.
- I was greatly encouraged to see the Committee reflect these recommendations, as well as others made by my Office, in their amendments to the Bill. We continue to closely monitor the Bill’s progress.
Background
- The Standing Committee on Industry and Technology (INDU) began its clause-by-clause consideration of C-27 on April 8, 2024. To date, INDU has held 10 meetings and adopted 10 amendments, including:
- embedding the preamble of the bill in the Consumer Privacy Protection Act and amending it to recognize the fundamental right to privacy and the importance of protecting minors and their best interests;
- amending the definition of “anonymize” to remove “generally accepted best practices” and add the standard of “no reasonably foreseeable risk in the circumstances” for re-identification;
- amending the French definition of “de-identify” to better align with the English;
- adding definitions of “lawful authority,” “minor,” and “profiling;”
- amending the definition of “personal information” to include inferred information.
- Bill C-27 would maintain PIPEDA’s approach to substantial similarity, though new provisions would allow the Governor in Council to make regulations establishing the criteria and process for making, or reconsidering, a determination of substantial similarity.
Lead: PRPA
Bill C-27: Data Protection Tribunal
Key messages
- Part 2 of Bill C-27 would establish the Personal Information and Data Protection Tribunal, which would be empowered to hear appeals of the OPC’s findings, orders, decisions, interim orders and recommendations for administrative monetary penalties.
- The Tribunal can substitute its own finding, order, or decision for the one under appeal if it finds that the OPC committed an error. Per s. 21 of the Personal Information and Data Protection Tribunal Act, a decision by the Tribunal is final and can only be judicially reviewed by the Federal Court.
- Unlike the OPC’s orders under the CPPA, the Alberta OIPC’s orders are not subject to appeals by an external Tribunal.
Background
- The Tribunal would be composed of three to six members appointed by the Governor in Council on the recommendation of the Minister, at least three of whom must have experience in the field of information and privacy law (s. 6 PIDPTA).
- None of the OPC’s substantially similar provincial counterparts have a review body equivalent to a Tribunal; instead, such matters would go directly to a provincial court or superior court.
- Unlike orders under the CPPA, our sub-sim counterparts’ orders are not subject to appeals, only to judicial review by a court.
- The Tribunal may create uncertainty regarding the OPC’s ability to conduct joint investigations with the Alberta OIPC (and other sub-sim counterparts), since those offices do not have this type of appeal mechanism to an external review body.
- Having the Tribunal’s decisions directly reviewed by the Federal Court of Appeal would remove a layer of review, expedite the review process, bring finality to matters more quickly and be more in line with the levels of review of our counterparts with substantially similar legislation.
- This new Tribunal can also impose administrative monetary penalties (AMPs) as recommended by the OPC, or on its own initiative. Alberta’s PIPA, like PIPEDA, currently does not have an AMPs regime.
Lead: Legal
Artificial Intelligence and Data Act (AIDA)
Key messages
- AIDA would establish requirements relating to identifying and mitigating the serious risks of harm and bias which can occur through the use of “high-impact” AI systems.
- Given the technology’s risks to human rights, this is a commendable effort by Canada.
- However, it is our understanding that AIDA is not meant to address the privacy risks of AI systems specifically, which would instead be covered by the CPPA. It is therefore important that privacy law have similar requirements for identifying and mitigating privacy risks, particularly for technologies such as AI.
- To this end, we have recommended that the CPPA include a PIA requirement for high-risk activities. This would help foster trust in such technologies.
Background
- For context, the Alberta Standing Committee was asked to “consider whether to grant the Minister power to regulate AI standards and guidelines, similar to AIDA, to ensure organizations would be subject to government oversight for the ethical use of AI technologies.”
- Relevant Section of AIDA: Section 8 requires a person who is responsible for a high-impact system to, in accordance with regulations, “establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of the system.”
- Rationale for PIA requirement in CPPA: Given AIDA’s limited definition of “harm”, privacy risks would not need to be addressed or mitigated. Adding a PIA requirement for higher-risk activities in the CPPA would close this gap. There may be opportunity to have the AIDA and CPPA risk assessments feed into each other. PIAs are particularly important for when an exception to consent is used to process personal information using AI, as an individual would not be aware.
Lead: PRPA
Issues
In this section
- Administrative monetary penalties
- Artificial intelligence regulation
- Altered content / deepfakes
- Automated decision-making: transparency
- Biometric information
- Breach notification
- Children’s privacy
- Consent
- Data portability
- Definition of “commercial activity”
- De-identification / anonymization
- De-indexing
- Order-making powers
- Political parties
- Privacy impact assessments (PIAs)
- Privacy management programs (PMPs)
- Right to erasure / deletion
- Right to contest
- Sensitive personal information
- Service providers
Administrative monetary penalties
Key messages
- Adding an administrative monetary penalties (AMPs) regime to Alberta’s PIPA could help to incentivize organizations to comply with the province’s private-sector privacy law.
- My office supported the introduction of AMPs under the Consumer Privacy Protection Act (CPPA) in C-27 as an incentive for organizations to comply with federal privacy law.
- S.94 of the CPPA allows the Commissioner to recommend that an AMP be imposed by the Personal Information and Data Protection Tribunal if an organization has contravened one or more provisions outlined under subsection 94(1) (e.g., obtaining consent and proper retention/disposal of personal information).
Background
- Like PIPEDA, Alberta’s PIPA currently does not have an AMPs regime. Quebec’s CAI is currently the only privacy regulator in Canada able to impose AMPs.
- Many of our international counterparts have been able to levy financial penalties for a number of years under their privacy legislation (e.g., UK, France, Germany).
- Subsection 94(1) of the CPPA lists the violations qualifying for AMPs recommendations by the OPC to the Tribunal and includes things such as collecting, using, or disclosing personal information without consent where the Act does not create a corresponding exception.
- The CPPA’s maximum AMP amount is the higher of $10M or 3% of the organization’s “gross global revenue in its financial year before the one in which the penalty is imposed”.
- Factors to be taken into account under the CPPA to recommend an AMP include the nature and scope of the contravention, evidence (or lack thereof) of the organization exercising due diligence, and efforts to mitigate consequences.
- The OPC maintains that an assessment of the sensitivity of the information and whether the loss of privacy would be proportionate to the benefit gained ought to be taken into account when making an AMP determination under the CPPA.
Lead: Legal
Artificial intelligence regulation
Key messages
- To maintain relevance in the modern world, privacy legislation should effectively regulate the privacy risks associated with technologies such as AI and automated decision-making.
- Measures such as those the Alberta Ministry of Technology and Innovation recommended for consideration would enhance the transparency, accountability, accuracy, and security of AI systems and would be positive steps.
- As set out in our recommendations for Bill C-27, other measures that might be considered include:
- Requiring privacy impact assessments for high-risk activities (which, in many cases, would include the use of AI systems);
- Allowing individuals to request explanations of all automated decisions and profiling;
- Enabling privacy commissioners to co-ordinate with other domestic regulators that oversee AI systems where not already possible.
Background
- For context, the provincial Ministry of Technology and Innovation recommended that the Standing Committee on Resource Stewardship “consider establishing individual rights and privacy protections within PIPA regarding automated decision-making and AI systems such as:
- requiring organizations to inform individuals of the use of automated decision making before using their personal information in this manner;
- enabling individuals to request a human review of decisions;
- implementing processes to safeguard the security of personal information used in AI systems and the logic involved in automated decisions; and
- requiring organizations to conduct audits or ensure the accuracy of data processing, with provisions for individuals to request corrections.”
Lead: PRPA
Altered content / deepfakes
Key messages
- In December 2023, Canada’s provincial and territorial information and privacy commissioners and I published joint principles for responsible, trustworthy, and privacy-protective generative AI technologies.
- In that document, we suggested that the creation of AI content for malicious purposes, including deepfakes and intimate images of an identifiable person generated without their consent, would likely constitute a “no-go zone.”
- Where an AI system is trained on or uses personal information, privacy legislation applies. However, explicit clarification that legislation applies to the creation or dissemination of altered content may provide regulatory certainty. This is similar to our recommendation about “profiling” under the CPPA.
- My colleagues in the Canadian Digital Regulators Forum and I are exploring synthetic media, including deepfakes, this year, examining how the issue may touch each of our respective mandates. We look forward to publishing our findings later this year.
Background
- In its submission to the Standing Committee on Resource Stewardship, the Alberta Ministry of Technology and Innovation notes that some Canadian jurisdictions have taken steps to address “altered content”. Saskatchewan and British Columbia have made legislative changes to better protect victims of non-consensual image-sharing (in BC, this applies regardless of whether the images are synthetic or real).
- The submission acknowledges that provisions in Alberta’s PIPA may already apply to the creation and dissemination of altered content, but calls upon the Committee to consider clarifying as much for greater certainty.
- Altered content/deepfakes will generally be considered personal information under PIPEDA and therefore be subject to regulation, so long as the content is about an identifiable individual and meets other requirements (e.g., relates to commercial activity).
Lead: PRPA
Automated decision-making: transparency
Key messages
- Bill C-27 would require organizations to provide a general account of their use of any automated system to make predictions, recommendations, or decisions about individuals that could have a significant impact on them.
- It would also allow individuals to request an explanation of those decisions. This is a positive and welcome development.
- There are, however, potential improvements that might also be considered, which include:
- Allowing individuals to request an explanation of any automated decision (not just those with a significant impact);
- Explicitly applying transparency requirements to the profiling of individuals; and,
- Making transparency more impactful by providing individuals the right to request human review or to otherwise challenge automated decisions.
Background
- The Alberta Standing Committee was asked to consider establishing within PIPA a requirement for organizations to inform individuals of the use of automated decision-making to enable individuals to request a human review.
- Scope of explanations: While a narrow scope for an organization’s “general account” of the use of automated decision-making is reasonable, limiting the ability of individuals to request an explanation to decisions that might have a “significant impact” on them is at odds with the principle of algorithmic transparency.
- Profiling: Many modern privacy laws, including those in Quebec, California, and Europe, include specific reference to profiling (i.e., the use of personal information by automated decision-making systems to assess or predict a person’s characteristics) as being included within automated decision-making. As we noted in our submission on C-27, this inclusion may be implied in C-27, but would be preferable if made explicit.
Lead: PRPA
Biometric information
Key messages
- OPC is of the view that biometric information is generally sensitive in that biometric characteristics tend to be unique, difficult to change, and intimately linked to an individual’s body.
- I was greatly encouraged to see the INDU Committee reflect a recommendation made by my Office in their amendments to C-27 relating to sensitive information. Sensitive information would now be defined in a manner that establishes a general principle for sensitivity, followed by an open-ended list of examples that includes biometric data that can uniquely identify an individual.
- More generally, OPC supports an approach to sensitive information at the federal level that is context-specific and includes stronger protections where risks are higher.
Background
- The “Emerging Issues” paper prepared by the Alberta Legislative Assembly Office asks whether provisions should be added to PIPA to further protect potentially sensitive information, including biometric information.
- In its submission the Alberta Ministry of Technology and Innovation suggested that the committee consider creating a specific category of sensitive personal information that would include biometric information. The Ministry also asked the committee to consider introducing strict requirements regarding the collection, use, access, disclosure and retention of the new category of sensitive personal information, such as requiring PIAs if sensitive data is collected.
- In its submission on the former C-11, the OPC recommended that a definition of sensitive information be included in the CPPA that would establish a general principle for sensitivity followed by an open-ended list of examples. The proposed list included “biometric information for the purposes of uniquely identifying an individual”. INDU passed a motion introducing a definition for “sensitive information”, with a non-exhaustive list of examples, which aligns with the OPC’s recommendation.
- OPC is finalizing guidance on biometrics for public and private sector organizations following a public consultation conducted in fall 2023 and winter 2024.
Lead: PRPA
Breach notification
Key messages
- Breaches of personal information must be addressed swiftly to reduce the risk of harm to affected individuals. However, PIPEDA’s breach-reporting timelines are vague (“as soon as feasible”), which can lead to delays and reduces the OPC’s ability to enforce compliance.
- I have recommended that organizations be required to report a privacy breach to the OPC within 7 days of detection.
- Further, under PIPEDA, only organizations that have personal information under their control are required to report breaches to the OPC; if a breach occurs at a third-party service provider, the service provider is not required to report it.
- Requiring service providers to report breaches to the OPC, along with a list of affected clients, would allow my Office to better assess what happened, whether safeguards were adequate, and to ensure that appropriate and timely remedial actions are taken.
Background
- Alberta’s PIPA requires organizations that suffer a privacy breach to notify the OIPC “without unreasonable delay” if the breach poses a real risk of significant harm to affected individuals. If the Commissioner determines that the breach poses such a risk, he or she may then require the organization to notify affected individuals within a specified period of time. The Alberta Legislative Assembly Office’s “Emerging Issues” paper asks whether PIPA’s breach-notification provisions are appropriate.
- For this fiscal year to date, the average length of time for breach reports to be submitted to the OPC under PIPEDA has been four months.
- Some service providers have voluntarily submitted breach reports or provided information relating to a breach to the OPC. In other cases, organizations have relayed partial information (without knowing the specifics).
- Since January 2024, breaches at three service providers have generated nearly 100 breach reports to the OPC.
Lead: Compliance
Children’s privacy
Key messages
- Ensuring that children’s privacy is protected, and that young people understand and can exercise their privacy rights, is one of my key strategic priorities.
- I am encouraged by recent federal proposals to protect children’s privacy. For example, the proposed Consumer Privacy Protection Act (CPPA) would introduce additional protections for minors, including defining their information as sensitive.
- In my recommendations for the CPPA, I called for the Bill to recognize children’s privacy and the best interests of the child—a recommendation that was adopted at committee.
- This concept requires that young people’s rights and well-being be primary considerations in any decisions or actions concerning them, while also encouraging organizations to build privacy for minors into products and services by design.
Background
- Other measures to protect minors in the CPPA include:
- That parents, guardians, or tutors may exercise rights provided for under the Act on behalf of a minor, unless the minor wishes to personally exercise those rights and is capable of doing so;
- That express consent will be required by default for the collection, use, and disclosure of children’s information; and
- That organizations will be prohibited from using manipulative techniques to collect children’s information.
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper inquires whether provisions should be added to PIPA to enhance the protection of children’s personal information, referencing the approach in C-27, while the Alberta Ministry of Technology and Innovation inquires whether there should be a specific category of sensitive information created under PIPA that would include children’s personal information, among other categories.
Lead: PRPA
Consent
Key messages
- Consent is an essential element of PIPEDA. The law generally requires organizations to obtain valid consent for the collection, use, and disclosure of personal information.
- Bill C-27 maintains this approach, and further specifies certain information that must be provided to individuals in order for consent to be valid. This information must be provided in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.
- This new requirement is an improvement and would bring the Bill more into alignment with my Office’s guidelines for obtaining meaningful consent.
Background
- Bill C-27 would insert a plain-language requirement that partially addresses our recommendation on the former Bill C-11, now requiring that consumers be provided with consent information in plain language and understand what they are being asked to consent to (s. 15(4)).
- Under s. 15(3) of the CPPA, organizations must inform individuals of (1) the purposes of the collection, use, or disclosure; (2) the manner in which the personal information will be collected, used, or disclosed; (3) any reasonably foreseeable consequences; (4) the specific type of personal information that is to be collected, used, or disclosed; and (5) the names or types of parties with whom the information may be shared.
- Similarly, the Alberta OIPC has recommended that Alberta’s Personal Information Protection Act (PIPA) be amended to require that, in order for consent to be valid, organizations must provide comprehensive, specific, clear, and plain notice of all purposes of the collection, use, or disclosure, such that an individual could reasonably be expected to understand them.
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper inquires about the appropriateness of PIPA’s current consent provisions and whether individuals should receive notice in plain language. The Ministry of Technology and Innovation’s submission similarly inquires about a plain language requirement.
Lead: PRPA
Data portability
Key messages
- Overall, the OPC supports the introduction of data mobility provisions in the CPPA; however, I recommended certain amendments to better align the Bill with international models.
- Specifically, I recommended that the mobility provision be expanded to apply to all personal information about an individual – including derived or inferred information – not only information that is collected from an individual.
- I also support expanding this right so that individuals can, when technically feasible, receive this information in a structured, commonly used, machine-readable format, as has been done in other jurisdictions. This is commonly referred to as data portability.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper inquires if PIPA should be amended to allow, upon an individual’s request, that an organization be required to transfer that individual’s digital personal information to another organization in a structured, commonly used, and machine-readable format when it is technically feasible. This is similarly raised in the submission of the Ministry of Technology and Innovation.
- The Alberta OIPC has called for PIPA to be amended to include a “right to portability and data mobility” by including the right of an individual to obtain their personal information from an organization, or have their personal information directly transferred to another organization, in a structured, commonly used, machine-readable format.
- S. 72 of the CPPA requires organizations to disclose, upon request, an individual’s personal information that it has collected from the individual to an organization designated by the individual, provided both organizations are subject to a data mobility framework.
- The right for individuals to receive their own data in a structured, commonly used, machine-readable format is not within the CPPA, but is found in other data protection laws such as the GDPR and the privacy laws of California and Quebec.
Lead: PRPA
Definition of “commercial activity”
Key messages
- Under PIPEDA, “commercial activity” means any particular transaction, act, or conduct or regular course of conduct that is of a commercial character.
- Organizations that are not-for-profit or that offer “free” services may nevertheless be subject to PIPEDA to the extent that their services qualify as commercial activities.
- Alberta’s Personal Information Protection Act defines “commercial activity” in a similar way but also explicitly lists certain private educational entities (i.e., private schools, early childhood services programs, post-secondary institutions) as defined in separate Alberta Acts.
Background
- Whether an organization can be said to collect, use, or disclose personal information in the course of a commercial activity under PIPEDA will vary depending on the facts of each case.
- In most cases, determining whether an organization is engaged in a commercial activity is straightforward; in others, the issue is more complex and requires closer examination.
- Organizations offering ostensibly “free” services may nevertheless be engaged in a commercial activity within the meaning of PIPEDA: the question must be considered within the context of the business model as a whole [Reference re Subsection 18.3(1) of the Federal Courts Act, 2021 FC 723].
- Organizations that are considered “not-for-profit” for tax purposes are also not automatically exempt from PIPEDA if they collect, use, or disclose personal information in the course of a commercial activity [Rodgers v. Calvert, 2004 CanLII 22082 (ON SC)].
- PIPEDA does not apply to the disclosure of relevant information required in the context of a legal proceeding (which is not considered “commercial activity”) [Hatfield v. Intact Insurance Company, 2014 NSSC 232].
Lead: Legal
De-identification / anonymization
Key messages
- Federally, the OPC supports a framework for regulating the use and subsequent disclosure of de-identified and anonymized information in a way that clarifies organizations’ responsibilities and provides some flexibility in privacy requirements.
- It is important that thresholds for considering personal information de-identified or anonymized are set at an appropriate level relative to the risk of re-identification and the information’s sensitivity.
- It is also important for any framework to prohibit unauthorized attempts to re-identify information, and to specify that de-identified information (unlike anonymized information) remains personal information.
- As a general principle, OPC is of the view that individuals should be notified of all purposes for which their personal information is collected, used, and disclosed.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper inquires whether PIPA should regulate the de-identification and/or anonymization of personal information within the control of an organization, as well as the subsequent use or disclosure of that information (and if so, how).
- In its submission to the Committee, the Alberta Ministry of Technology and Innovation recommended provisions be considered regarding notification to individuals when de-identified or anonymized information is used, as well as penalties for re-identification.
- The AB OIPC recommends that PIPA contain a framework for de-identified and anonymized information, including definitions, standards, and certain prohibited practices.
- With respect to C-27, OPC has recommended that:
- Organizations be required to apply de-identification measures that are proportionate to the risk of the information being re-identified;
- The Bill clarify that de-identified information remains personal information; and
- A high threshold be used for considering information anonymized (in particular, by removing reference in the threshold to “generally accepted best practices”).
Lead: PRPA
De-indexing
Key messages
- In the context of C-27, my Office has recommended a clear and explicit right to having personal information de-indexed or removed from search results and other online sources. Quebec’s Law 25 is a useful model.
- This form of protection is especially relevant for minors, who should be able to develop as persons in an online environment without fear that every trace of their digital lives could lead to unfair treatment in the future.
- As constitutional rights risk coming into conflict here, Alberta may wish to consider enacting mechanisms and criteria from the outset to balance protecting individuals’ reputation, particularly that of children, while also preserving free expression.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper includes a question about whether PIPA should provide for a right of de-indexation.
- The term “de-indexing” refers to removing a website, webpage, or part of a webpage from search-engine results.
- Section 28.1 of Quebec’s Law 25 sets out reasonable criteria for requiring the de-indexing or removal of information, namely: if it causes the person concerned serious injury in relation to reputation or privacy; if the injury is greater than the interest of the public in knowing the information or the interest of free speech; and if the requested cessation of dissemination, re-indexation, or de-indexation does not exceed what is necessary to prevent the perpetuation of the injury.
- In a 2023 decision, the Federal Court of Appeal upheld a 2021 Federal Court decision that found that Google’s search engine collects, uses, and discloses personal information in the course of commercial activities and that it is not exempt from PIPEDA under the law’s journalistic purpose exemption.
- We are finalizing our investigation into Google’s compliance with PIPEDA in relation to the de-indexing of search results containing personal information.
Lead: PRPA
Order-making powers
Key messages
- Unlike Alberta’s Information and Privacy Commissioner, who has order-making powers under Alberta’s Personal Information Protection Act (PIPA), I am unable to issue binding orders under PIPEDA.
- My position is that the OPC needs order-making powers in order to promote compliance and enforcement.
- Powers proposed in Bill C-27 (the Consumer Privacy Protection Act, or CPPA) would give the OPC the necessary enforcement tools to promote compliance with the Act (i.e., issuing orders, recommending administrative monetary penalties).
Background
- Under PIPEDA, recommendations following an audit or investigation are non-binding: if an organization does not follow the recommendations, the complainant or the OPC may, in certain circumstances, commence a de novo application to have those recommendations enforced by the Federal Court [s.14 PIPEDA].
- Under Alberta’s PIPA, the Commissioner can issue an order after an inquiry under s.50 to require, among other things, that an organization do (or cease doing) a specified thing in order to comply with the requirements of the Act [s.52]. Such orders are final and not subject to appeal but remain subject to judicial review by the Court of King’s Bench.
- Under the proposed CPPA, the federal Privacy Commissioner’s orders would be subject to an appeal before an external tribunal that could override the decision. That tribunal’s orders would in turn be subject to judicial review before the Federal Court.
- If these orders were reviewable by the Federal Court instead of an external tribunal, the process would be more aligned with that of our provincial counterpart.
Lead: Legal
Political parties
Key messages
- Political parties should be subject to privacy rules that are similar to the requirements that are set out for the public and private sectors in the Privacy Act and PIPEDA.
- Such privacy requirements should be grounded in legislation, conform with internationally recognized privacy principles, and include recourse to an independent third party.
- In May 2023, when appearing before the Senate on amendments to the Canada Elections Act in Bill C-47, I stressed that current rules do not establish minimum privacy requirements for political parties nor provide for independent oversight of privacy practices.
- Political parties and their affiliates can collect, use, retain, disclose, and dispose of personal information in accordance with the party’s own privacy policy – which they develop and revise at their own discretion.
Background
- PIPA vs. PIPEDA vs. CPPA: As noted in the Alberta Legislative Assembly Office’s “Emerging Issues” paper, political parties are not currently defined as “organizations” under Alberta’s commercial sector privacy legislation. Rather, Alberta’s PIPA has exemptions specifically for registered provincial political parties, constituency associations, and individuals running for office. At present, PIPEDA does not specifically cover political parties and the CPPA follows the same approach.
- Provincial comparisons: in contrast, British Columbia’s PIPA has been found to apply to both provincial and federal political parties operating in the province. Similarly, Quebec’s commercial sector privacy law has been amended to apply to personal information held by a political party.
- Changes proposed in Bill C-65: At the federal level, Bill C-65 adopts the approach of placing privacy regulation and oversight into the existing regime for political parties found in the Canada Elections Act. Elections Canada and the Commissioner of Canada Elections would be responsible for oversight of political parties’ privacy policies. We have yet to be called to appear on this Bill.
Lead: PRPA
Privacy impact assessments (PIAs)
Key messages
- In my submission to Parliament on C-27, I cautioned that requiring PIAs for all activities could pose an excessive burden, especially on small- and medium-sized enterprises, but requiring PIAs for higher risk activities would help to ensure that more significant privacy risks are assessed and addressed.
- “Higher-risk activities” could include initiatives where AI systems make impactful decisions about individuals, including whether they get a job offer, qualify for a loan, pay a higher insurance premium, or are flagged for suspicious or unlawful behaviour.
- Requiring PIAs for high-risk activities aligns with Article 35 of the EU’s General Data Protection Regulation (GDPR). Alignment with the GDPR supports Canada’s adequacy status, which enables data to be transferred from the EU to Canada without additional requirements.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper asks if organizations should be required to submit PIAs to the Alberta OIPC for specific initiatives involving personal information. In its submission, the Alberta Ministry of Technology and Innovation raises whether PIPA should include privacy-management program and PIA requirements for organizations, scaled to the size of the business and sensitivity of information in question.
- The Alberta OIPC recommended that PIAs be submitted for review and comment before organizations undertake activities involving sensitive personal information, profiling, or any significant change to an existing program involving the aforementioned; and that the Commissioner be authorized to require an organization to provide a PIA for review where there is reasonable belief that the processing activity creates risks to the privacy rights of Albertans.
- S. 109 of the CPPA outlines factors the Commissioner must take into account when performing his functions and exercising his powers, including the purpose of the Act, the size and revenue of organizations, the volume and sensitivity of personal information, and matters of general public interest.
Lead: PRPA
Privacy management programs (PMPs)
Key messages
- A privacy management program (PMP) is a fundamental accountability tool, helping organizations to demonstrate their compliance with privacy requirements.
- C-27 strengthens accountability by requiring organizations to implement and maintain PMPs and to provide the OPC with access to a PMP on request.
- Section 9(2) of the CPPA requires organizations to take into account the volume and sensitivity of personal information when developing PMPs.
- I am supportive of the inclusion of PMP requirements in the law.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper asks if PIPA should require organizations to have a PMP and to provide written information about PMPs to individuals and the Alberta OIPC. In its submission, the Alberta Ministry of Technology and Innovation also asked the Standing Committee to consider whether to implement mandatory PMP requirements.
- The Alberta OIPC recommends that PIPA be amended to require organizations to implement a PMP with components similar to those set out in Bill C-27 and Quebec’s Law 25 and allow the Commissioner to audit an organization’s PMP.
- C-27 requires organizations to implement and maintain a PMP that includes the policies, practices, and procedures put in place to fulfill obligations (s. 9(1)), take into account the volume and sensitivity of personal information (s. 9(2)), and provide OPC with access to the PMP on request (s. 10(1)). The OPC must provide guidance on organizations’ PMPs on request (s. 110(1)(e)), though the OPC need not act on unreasonable requests, and may prioritize requests based on organizations’ needs (s.110(2)).
- S. 111 prohibits the OPC from using information from a PMP review to initiate a complaint or audit unless the organization willfully disregards recommendations.
- In 2012, the OPC and Commissioners of BC and Alberta published Getting Accountability Right with a Privacy Management Program that provides guidance on the components of a strong PMP.
Lead: PRPA
Right to erasure / deletion
Key messages
- At the federal level, Bill C-27 provides that organizations must dispose of an individual’s personal information on request if certain conditions are met.
- However, it also creates exceptions whereby such requests can be refused, including where the information is already scheduled to be disposed of according to the organization’s retention policy (provided that it does not concern a minor and the individual is informed of the remaining time period).
- I have recommended removing this exception from Bill C-27 because it would limit the ability of individuals to have their information disposed of in a timely way and generally diminish their control over their personal information.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper notes that PIPA does not include provisions regarding erasure, disposition, or de-indexing and asks if PIPA should provide for an individual’s right to be forgotten or de-indexed. In its submission, the Alberta Ministry of Technology and Innovation also recommends consideration be given to whether to establish individual data rights within PIPA, including the right to erasure.
- The Alberta OIPC recommends the inclusion of both a right to de-indexing and deletion in PIPA in certain circumstances.
- Section 55(1) of Bill C-27 provides that an organization must dispose of an individual’s personal information on request as soon as feasible if (i) the information was collected, used, or disclosed in contravention of the Act; (ii) the individual has fully or partially withdrawn their consent; or (iii) the information is no longer necessary to provide a product or service to the individual.
- Bill C-27 also provides for additional exceptions where an organization may refuse a disposition request under 55(1): the personal information of another individual would also be disposed of; statutory or contractual requirements prevent disposal; the information is necessary for legal defence or other legal remedies; the information is necessary to the ongoing provision of a product or service to the individual in question and is not about a minor; or the request is vexatious or made in bad faith.
Lead: PRPA
Right to contest
Key messages
- Bill C-27 would require organizations that use automated decision systems to make predictions, recommendations, or decisions about individuals to provide them with an explanation of that decision. I am generally supportive of these new obligations, though I have made recommendations to help increase transparency.
- However, individuals would not be provided with a right to contest or request human review of such decisions.
- Individuals should not be bound by automated decisions without recourse to human review, particularly when such decisions may be based on inaccurate data, reflect bias, or otherwise result in a decision that a human would consider inappropriate.
Background
- In its submission to the Standing Committee, the Alberta Ministry of Technology and Innovation asked members to consider whether to establish individual data rights within PIPA regarding automated decision making and AI, including enabling individuals to request a human review of a decision.
- The AB OIPC recommends the inclusion of a right to contest automated decision making in PIPA (in addition to transparency and accountability requirements).
- In its C-27 submission, the OPC recommended removal of the qualifier of “significant impacts” for the explanation requirement under ss. 63(3) of the CPPA for automated decision-making systems. In its submission on the former Bill C-11, the OPC recommended the inclusion of a right to contest automated decisions.
- As a general principle, the right to contest should be provided in addition to the ability to withdraw consent. It is necessary to have both rights, as withdrawal of consent is an all-or-nothing decision, whereas contestation provides individuals with recourse even when they choose to continue to participate in the activity for which automated decision-making was employed.
- Quebec’s Law 25 (s.12.1) and the GDPR (Article 22.3) both provide for a right to contest and/or request human review of an automated decision.
Lead: PRPA
Sensitive personal information
Key messages
- My Office has called for a definition of sensitive information to be included in Bill C-27 that would set out a general principle, followed by a non-exhaustive list of examples.
- Such a definition would ensure that certain types of information are always considered sensitive while also allowing for contextual interpretation to account for new forms of information.
- My submission on Bill C-27 also recommended adding provisions to require organizations to conduct privacy impact assessments (PIAs) for high-risk activities, which would include activities that involve sensitive information.
Background
- The Alberta Legislative Assembly Office’s “Emerging Issues” paper asks if provisions should be added to PIPA to further protect potentially sensitive information, including biometric information and children’s personal information.
- The Alberta Ministry of Technology and Innovation asked the Standing Committee to consider creating a specific category of sensitive information under PIPA, including children’s information, intimate personal information, and biometric data. The Ministry also encouraged the Committee to consider strict requirements for the collection, use, access, disclosure, and retention of such information.
- The AB OIPC recommends that a definition of sensitive information be included in PIPA with specific requirements reflective of the sensitivity and potential for harm.
- During ongoing clause-by-clause consideration of C-27, INDU passed a motion introducing a definition for “sensitive information”, with a non-exhaustive list of examples, which aligns with the OPC’s recommendation.
- Bill C-27 identifies the personal information of minors as sensitive (section 2(2)).
- Quebec’s Act respecting the protection of personal information in the private sector has a principles-based definition for sensitive information.
- The GDPR (Art. 9(1)) identifies special categories of data (e.g., genetic and biometric data) that could pose risks to individuals’ rights and therefore require specific conditions (e.g., express consent) before processing is permitted.
Lead: PRPA
Service providers
Key messages
- Ensuring organizations have strong accountability measures in place is especially important when they are contracting with service providers to collect or use personal information on their behalf.
- The CPPA requires organizations transferring personal information to a service provider to ensure that they provide an “equivalent” level of protection as required under the Act (ss. 11(1)).
- Overall, this scheme is reasonable. However, I have recommended that the provisions that apply to service providers be amended to capture a broader range of actions with respect to the collection, use, and disclosure of data (beyond data transfers).
- I have also recommended that the law be amended to clarify that, in sub-contracting situations, the organization or controller ultimately remains accountable for the personal data.
Background
- The Alberta OIPC’s submission includes a host of recommendations specific to service providers, including that PIPA be amended to require organizations to ensure, by contract, that service providers offer “the same or better privacy protection” as the organization is required to provide (as in Bill C-27). The Alberta OIPC has also recommended that any downstream service providers be subject to PIPA in the same way so that they too can be held accountable for non-compliance.
- Part of Recommendation 26 from the OPC’s Bill C-11 submission, proposing that domestic service providers should not be able to rely on the “business activities” exception to consent at s. 18(2)(e), was adopted, and the provision does not appear in Bill C-27.
- The submission on Bill C-27 carried forward other related recommendations, including that sections 7(2), 11(1), 11(2), 19, and 62(2)(d) of the CPPA be amended to ensure that these rules adequately reflect the broad scope of data transfers between organizations and service providers; that section 11(2) be amended so that accountability is not limited to data transfers but extended to information that service providers collect, use, or disclose on behalf of the organization; and that the law clarify that controllers ultimately remain accountable in sub-contracting situations.
Lead: PRPA