INDU Appearance on Bill C-27: Additional questions and answers

October 19, 2023


What are the definitions used internationally for “de-identification” and “anonymization”? Is there consensus and if not, what are the G7 DPAs doing about it?

  • Definitions used internationally for anonymization and de-identification are listed in the chart.
  • There is no global consensus on the definition of these terms.
  • G7 DPAs, led by OPC, are developing a working paper on the use of definitions for de-identification, anonymization, and related terms across the G7.

Background

In the GDPR and Quebec’s Law 25, “anonymization” generally refers to a process of transforming personal information such that the information is no longer considered identifiable. In the US Health Insurance Portability and Accountability Act (HIPAA) and the California Consumer Privacy Act (CCPA), this process more closely corresponds with the definition of “de-identification”.

In Bill C-27 and Quebec’s Law 25, “de-identification” generally refers to a process of reducing the identifiability of personal information without rendering it anonymous. In the GDPR, this process aligns more closely with the definition of “pseudonymization”.
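
To make the distinction concrete, the following is a minimal, illustrative Python sketch, not drawn from any statute or OPC guidance, contrasting a pseudonymization-style step (a direct identifier is swapped for a keyed token that the key-holder can link back to the individual) with a simple generalization step of the kind de-identification and anonymization techniques rely on. The record fields, key handling and groupings are hypothetical.

```python
import hashlib
import hmac

# Hypothetical record; field names and values are illustrative only.
record = {"name": "Alex Tremblay", "postal_code": "K1A 0A9", "age": 34, "diagnosis": "asthma"}

SECRET_KEY = b"held-separately-from-the-dataset"

def pseudonymize(rec: dict) -> dict:
    """Replace the direct identifier with a keyed token. Whoever holds the key
    can re-link tokens to people, so the output is still identifiable
    (closer to GDPR 'pseudonymisation' / Bill C-27 'de-identification')."""
    token = hmac.new(SECRET_KEY, rec["name"].encode(), hashlib.sha256).hexdigest()[:12]
    out = dict(rec)
    out["name"] = f"person-{token}"
    return out

def generalize(rec: dict) -> dict:
    """Irreversibly coarsen or drop identifying fields (a toy stand-in for the
    transformations anonymization contemplates). Real anonymization also
    requires assessing re-identification risk across the whole dataset."""
    return {
        "region": rec["postal_code"][:3],            # forward sortation area only
        "age_band": f"{(rec['age'] // 10) * 10}s",   # e.g. "30s"
        "diagnosis": rec["diagnosis"],
    }

print(pseudonymize(record))
print(generalize(record))
```

The contrast matters legally: the pseudonymized output remains linkable to an individual by anyone holding the key, so it is still personal information, whereas the generalized output is meant to be irreversibly stripped of identifiers. Whether either output actually meets a given statutory threshold still depends on a contextual assessment of re-identification risk.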

Legal thresholds for these terms also vary across jurisdictions. For example, the threshold for anonymization in the GDPR is for data subjects to “not or no longer [be] identifiable,” while in Quebec’s Law 25 the threshold is for individuals to not be identifiable in “reasonably foreseeable circumstances”.

Related requirements vary by jurisdiction as well. Quebec’s Law 25, for example, requires anonymization to be carried out in accordance with generally accepted best practices and terms and criteria established by regulation, while the US CCPA requires (among other items) that organizations take “reasonable measures” to ensure information cannot be associated with a consumer or household.

G7 DPAs met at the working level on September 26, 2023, to discuss the ongoing development of a working paper on terms for de-identification and anonymization in the G7, including the lack of consistent definitions and other key issues. This work is led by the OPC as chair of the G7 Emerging Technologies Working Group.

We are recommending the removal of “in accordance with generally accepted best practices” from the definition of “anonymize”. Has the notion of “generally accepted best practices” been used or defined in other legislation?

  • Quebec’s Law 25 includes a requirement that anonymization be done “according to generally accepted best practices and according to the criteria and terms determined by regulation.”
  • International privacy legislation does not refer to generally accepted best practices per se, but the GDPR does reference “available technology” and “technological developments” in relation to the standard for pseudonymization.
  • Some federal statutes unrelated to privacy reference best practices or similar concepts (e.g. unspecified industry standards) in relation to certain legal requirements. However, the more common approach is to incorporate specific standards into the law, to allow governments to set out best practices in regulation, or both.

Background

Quebec’s Law 25 (s. 110) includes a direct reference to generally accepted best practices in the definition of anonymization: “Information anonymized under this Act must be anonymized according to generally accepted best practices and according to the criteria and terms determined by regulation.” Law 25 does not provide further clarity on how “generally accepted best practices” should be interpreted. Unlike the CPPA, however, Law 25 does explicitly provide for the possibility of regulatory requirements related to anonymization.

The GDPR (Recital 26) provides guidance that, in ascertaining whether information that has been pseudonymized remains identifiable, organizations should take account of “all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.”

While AIDA points explicitly to the CPPA definition of “personal information”, AIDA (unlike the CPPA) provides flexibility to define specific anonymization requirements through regulation (ss. 6 and 36(c)). These regulatory requirements under AIDA could be persuasive in how “generally accepted best practices” is interpreted in the CPPA.

Should the definition of “anonymization” in the CPPA be aligned with the definition used in Québec’s private sector law?

  • While it is beneficial to have consistent definitions of terms, the standard/definition of anonymization is only one part of the overall framework for anonymization set out in these statutes. In some ways, the anonymization framework under Quebec’s Law 25 is more restrictive than that set out in the CPPA; simply aligning the definition or standard for anonymization would not bring these frameworks into alignment overall.
  • Notably, in Law 25, anonymization is addressed only in the context of alternatives to the destruction of personal information (s. 119), and in that context is explicitly permitted only for “serious and legitimate purposes”, “where the purposes for which personal information was collected or used are achieved”. Law 25 also requires that information be anonymized according to generally accepted best practices and “the criteria and terms determined by regulation”. Comparable restrictions on the purposes and circumstances in which organizations can anonymize information are not found in the CPPA (with the exception of the reference to generally accepted best practices).
  • OPC would therefore not be supportive of simply adopting the “reasonably foreseeable in the circumstances” standard without further changes to ensure that the threshold for anonymizing personal information is high and leaves no space for insufficient practices.

Background

The CPPA uses an (almost) absolute standard that sets a higher threshold than Law 25, i.e. anonymization means “to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information” (s. 2(1)).

Some commentators, including the Canadian Anonymization Network (CANON) and Barry Sookman of McCarthy Tetrault, have called for the definition of “anonymization” in the CPPA to be aligned or “harmonized” with the definition used in Quebec’s Law 25 (the amendments to the private sector law), i.e. to use the “reasonably foreseeable in the circumstances” standard in the CPPA. Law 25 uses a relative standard that is similar to the standard used in Ontario’s PHIPA: information is anonymized if it is, at all times, “reasonably foreseeable in the circumstances that it irreversibly no longer allows the person to be identified” (s. 119).

We are recommending PIAs for “high-risk” activities. What would be our definition and/or criteria for “high risk”?

  • To maintain the language of the CPPA, our definition/criteria for a “high-risk activity” would be:

    “A ‘high-risk activity’ is one which will have, or is reasonably foreseeable to have, a significant impact on an individual, or where the collection, use or disclosure of personal information associated with the activity could lead to significant harm to an individual (which includes bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property) without appropriate safeguards.”

  • We may also want to emphasize that we generally recommend organizations undertake a PIA for any collection, use or disclosure of personal information, but that a PIA be required for high-risk activities.

Background

This would ground our definition in language that exists in the CPPA already for automated decision making (significant impact) and breach reporting (significant harm). It also keeps the definition from being too AI-centric, allows us a good opportunity to provide further detail via guidance or investigative findings, and maintains flexibility.

Is the ability to provide fines a requirement for EU adequacy?

  • No, fines are not an explicit requirement for EU/GDPR “adequacy” under Article 45 or Recital 104 of the GDPR; the same goes for administrative monetary penalties (AMPs).
  • When assessing adequacy, the European Commission considers a number of factors, including whether a third country offers guarantees ensuring an adequate level of protection essentially equivalent to that ensured within the EU and “effective and enforceable … rights and effective administrative and judicial redress” for the data subjects whose personal data are being transferred (Article 45, GDPR).
  • The availability of financial consequences like fines or AMPs may help satisfy the requirements of “effective and enforceable rights and effective administrative and judicial redress” and increase the chances of CPPA being deemed “adequate”.
  • Because fines and/or AMPs are typically part of GDPR states’ national rules (Recitals 148-149; Article 84), including such tools in the CPPA would be more consistent with international standards and provide comparable levels of compliance incentivization (AMPs) and punishment for non-compliance (fines).

Background

Even though the GDPR sets out that member states must have fines and/or administrative penalties for non-compliance, there does not appear to be an explicit requirement that third countries (e.g., Canada) provide for such financial consequences in their privacy legislation in order to be considered “adequate” as per Recital 104 and Article 45.

Obtaining “adequacy” status from the European Commission is not an exact science and many factors may come into play in its assessment. Therefore, the more tools the CPPA/OPC can have in the enforcement toolbox, the better our chances are at demonstrating that Canada has “effective and enforceable data subject rights and effective administrative and judicial redress for the data subjects whose personal data are being transferred” (art. 45 GDPR). Since fines and/or AMPs are the norm in GDPR member states (as well as in a number of other countries), Canada should also have these strong compliance-enhancing tools even if they are not expressly required for adequacy status.


Recital 104 GDPR:

In line with the fundamental values on which the Union is founded, in particular the protection of human rights, the Commission should, in its assessment of the third country, or of a territory or specified sector within a third country, take into account how a particular third country respects the rule of law, access to justice as well as international human rights norms and standards and its general and sectoral law, including legislation concerning public security, defence and national security as well as public order and criminal law. The adoption of an adequacy decision with regard to a territory or a specified sector in a third country should take into account clear and objective criteria, such as specific processing activities and the scope of applicable legal standards and legislation in force in the third country. The third country should offer guarantees ensuring an adequate level of protection essentially equivalent to that ensured within the Union, in particular where personal data are processed in one or several specific sectors. In particular, the third country should ensure effective independent data protection supervision and should provide for cooperation mechanisms with the Member States’ data protection authorities, and the data subjects should be provided with effective and enforceable rights and effective administrative and judicial redress.


Article 45 GDPR:

[…]

  2. When assessing the adequacy of the level of protection, the Commission shall, in particular, take account of the following elements:
    (a) the rule of law, respect for human rights and fundamental freedoms, relevant legislation, both general and sectoral, including concerning public security, defence, national security and criminal law and the access of public authorities to personal data, as well as the implementation of such legislation, data protection rules, professional rules and security measures, including rules for the onward transfer of personal data to another third country or international organisation which are complied with in that country or international organisation, case-law, as well as effective and enforceable data subject rights and effective administrative and judicial redress for the data subjects whose personal data are being transferred;
    (b) the existence and effective functioning of one or more independent supervisory authorities in the third country or to which an international organisation is subject, with responsibility for ensuring and enforcing compliance with the data protection rules, including adequate enforcement powers, for assisting and advising the data subjects in exercising their rights and for cooperation with the supervisory authorities of the Member States; and […]

Please provide an overview of the EU Digital Services Act, including whether it has been fully adopted and, if not, what stages remain?

  • The EU Digital Services Act (DSA) establishes rules for online intermediaries – those services that connect individuals to goods, services, and content. These include social media networks, search engines, online marketplaces, app stores, and so on. Specific to digital content, new obligations include:
    • Establishing a mechanism for users to flag illegal content (Article 16)
    • Providing reasons for account suspensions or removal of content (Article 17)
    • Prohibiting the use of “dark patterns” that would manipulate users into making choices they would not otherwise have made (Recital 67; Article 25)
    • Transparency for online advertising, including the main factors for why a targeted ad was served to an individual (Article 26)
    • Transparency for recommender systems, including main parameters used to make recommendations and any options available to individuals (Article 27)
    • Prohibiting targeted advertising based on profiling of minors (Article 28)
  • Very large online platforms (VLOPs) and search engines (VLOSEs) – in short, those with > 45M monthly users in the EU – have additional obligations, including to:
    • Undertake a risk assessment to identify any systemic risks stemming from the design or use of the platform, including risks to privacy and data protection, and establish reasonable, proportionate and effective mitigation measures (Articles 34 and 35)
    • Provide at least one recommender system option that is not based on profiling, allowing individuals to opt out of profiling-based recommendations (Article 38)
  • The European Commission has established the European Centre for Algorithmic Transparency to support EU regulators in assessing whether the functioning of algorithmic systems is in line with the risk management obligations that the DSA establishes for VLOPs and VLOSEs.
  • The DSA came into force for VLOPs and VLOSEs – the 19 designated services (17 platforms, 2 search engines) with > 45 million monthly active users in the EU – on August 25, 2023. It will come into force for all other covered parties on February 17, 2024.

How does the legitimate interest provision in the CPPA compare to that in the GDPR?

  • Under subsection 18(3) of the CPPA, organizations would be able to collect or use personal information without consent for activities in which the organization has a legitimate interest if the legitimate interest outweighs any potential adverse effect on the individual arising from that collection or use.
  • Organizations relying on this consent exception under the CPPA would have to identify any likely potential adverse effects on the individual; identify, and take reasonable measures to reduce, mitigate or eliminate the likelihood of those effects; and comply with any prescribed requirement. They would also have to record an assessment of how they meet these conditions and provide it to the OPC upon request.
  • The use of the GDPR’s lawful ground for processing personal data for legitimate interests requires organizations to conduct a balancing test, to ensure these interests are not overridden by the interests or fundamental rights and freedoms of the data subject. The processing also has to be “necessary” for the purpose of the legitimate interest.
  • This is a marked difference, as under the CPPA, an individual’s fundamental rights and freedoms are not a factor on their own that must be considered; they only come into play where the activity would have a potential adverse effect on these rights and freedoms.

Background

Legitimate interest is one of the six legal bases for processing under the EU General Data Protection Regulation (GDPR), allowing organizations to use personal information without prior consent where “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.” (emphasis added).

This is different from subsection 18(3) of the CPPA, which allows organizations to collect or use personal information without consent if the collection or use is made “for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use…” (emphasis added).

A “potential adverse effect” within the meaning of s. 18(3) of the bill is broad enough to include potential adverse effects on the fundamental rights and freedoms of an individual. In practice, however, such a weighing may not take place because it is not explicitly required by the bill. An individual’s fundamental rights and freedoms are not a factor on their own that must be considered as part of the Bill’s balancing test.

Professor Teresa Scassa has noted that the focus on “adverse effects” in this provision “runs the risk of equating privacy harm with quantifiable harm, thus trivializing the human and social value of privacy”. She observes that, in the context of data breach class action litigation, it has been very difficult for individuals impacted by a breach to show actual, quantifiable damages flowing from the breach for which they can be compensated. If that challenge carries over to the legitimate interests analysis that subsection 18(3) would require, there is a risk, as Professor Scassa notes, that what is an exceptional basis for processing personal data under the GDPR will become a more mainstream exception under the CPPA.

Section 38 of AIDA makes it an offence for a person to use personal information for the purpose of developing or using an AI system knowing that the information has been obtained as a result of the commission of an offence. What would be an example of such a situation?

  • To apply, this provision requires that personal information is possessed or used knowing or believing that it was obtained or derived from the commission of a federal or provincial offence in Canada or an act anywhere that would have constituted an offence if committed in Canada.
  • The offence under the CPPA most likely to trigger this provision would be a violation of s. 75 – the prohibition on re-identification of personal information except for certain purposes. That is, if personal information was knowingly re-identified in contravention of the CPPA (meeting the offence threshold under s. 128 of the CPPA) and that information was possessed or used for developing an AI system knowing or believing that the information was impermissibly re-identified, it could constitute an offence under AIDA.
  • There are a variety of other offences outside of privacy law that could potentially ground reliance on s. 38; for instance, where an AI developer knows or believes that information they possess or use was obtained through “unauthorized use of a computer” (i.e. where PI was obtained through hacking) per s.342.1(1) of the Criminal Code.
  • As s. 38 also applies to provincial offences, provincial privacy laws can expand the set of possible s. 38 triggers. For instance, s. 72(1)(a) of Ontario’s PHIPA sets out that any wilful collection, use or disclosure of personal health information in contravention of that Act can constitute an offence.
  • Notably, this provision would not apply where personal information was knowingly obtained in contravention of the CPPA if this contravention would not constitute an offence (for instance, if a third party did not have consent to disclose the information to the AI developer, or where data was scraped from the Internet).

What are your views on the recent AIDA open letter?

  • Certain questions raised in the letter – such as the extent to which significant details should be left up to regulation – are a choice for this House.
  • However, in principle we agree with many of the points raised by the letter’s authors about the regulation of AI. From a privacy perspective, this includes that regulation of AI should explicitly recognize privacy as a human right.
  • We stand ready to be a central player in any consultation on – or further developments related to – AI regulation as it relates to privacy.

Background

On September 25, 2023, an open letter with just under 50 signatories, including many well-known organizations and experts, was sent to the Minister of Innovation, Science and Industry, François-Philippe Champagne, expressing concern that the proposed AIDA fails to protect the rights and freedoms of people across Canada from the risks that come with burgeoning developments in AI.

In short, they called for AIDA to be removed from Bill C-27 and raised five primary concerns consistent with those previously raised by many stakeholders:

  1. Human rights: Despite the Preamble, neither the CPPA’s nor AIDA’s text recognizes privacy as a fundamental human right.
  2. Definitional gaps: Large gaps in definitions like “high-impact system” leave major aspects of AIDA illegible and void of substance. The current approach, leaving the majority of the detail to future regulation, is not facilitating agility but rather diminishing democratic accountability.
  3. Independence: It is inappropriate for the regulation of AI to fall completely under the auspices of ISED, whose mandate is to support the economic development of the AI industry.
  4. Consultation: The lack of structured, deliberative, and wide-ranging consultations before and since tabling AIDA is anti-democratic, and it has deprived people in Canada of the rights-protecting, stress-tested AI legislation they need. ISED’s more active consultation on a generative AI Code of Practice—a document that is akin to a statement of principles, yet fails to mention privacy or questionable data practices as a factor in the fairness and equity assessment—is effectively a distraction from getting AIDA right.
  5. The use of AI in public and private sectors: The government should have envisioned AI rules for both the private and public sector at once, instead of taking a patchwork approach.

How would Bill C-27 apply to “cookies”? How would this be different from PIPEDA?

  • There are no provisions that specifically apply to, or that would appear to be specifically designed for, cookies within Bill C-27.
  • The Bill would apply to the extent that a cookie either is a unique identifier for an individual or is used to collect personal information about an individual (an illustrative sketch follows this list).
  • There would be no notable practical differences between C-27 and PIPEDA with respect to strictly operational cookies (for example, first-party cookies that enable operation of a website), with the exception that under C-27 any collection of personal information using such cookies would likely be permissible without consent under s.18(2)(a) – as opposed to relying on implied consent under PIPEDA.
  • The most significant impact that C-27 could have might be on cookie-enabled online behavioural advertising (OBA), again related to the form of consent required.
    • Unlike PIPEDA, OBA might be permitted without consent under s.18(3) depending on whether:
      • An organization has a “legitimate interest” in OBA (noting that the OPC has previously found that OBA may be an appropriate purpose);
      • That legitimate interest outweighs any potential adverse effect on the individual; and
      • OBA is considered to influence a person’s behaviour and decisions (if so, an organization using OBA could not rely on s.18(3)).
    • If s.18(3) does not apply, consent would be required for OBA. Whether this consent is implied will depend on reasonable expectations and sensitivity of the information, per s.15(5) (noting that s.15(6) would not limit the use of implied consent if it were determined that s.18(3) is not applicable to OBA).
      • This is similar to PIPEDA.
  • Another potential – if unlikely – impact of C-27 would relate to automated decision systems. Under the CPPA, the use of cookies for OBA would meet the criteria to be such a system. However, it is unlikely that OBA would be considered to have a “significant impact” on individuals, meaning that the transparency (s.62(2)(c)) and explanation (s.63(3)) provisions would likely not apply.
  • Finally – the above points may become less relevant going forward as companies such as Google shift to new, non-cookie-based mechanisms for online advertising.
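
For illustration only, the following is a minimal Python sketch, using only the standard library, of the difference between a short-lived operational cookie and a long-lived unique identifier of the kind used for OBA; the cookie names and lifetimes are hypothetical and are not drawn from the Bill or any OPC finding.

```python
from http.cookies import SimpleCookie
import secrets

# Hypothetical cookie names and lifetimes, for illustration only.
cookie = SimpleCookie()

# A strictly operational, first-party session cookie: short-lived, needed for the site to work.
cookie["session"] = secrets.token_urlsafe(16)
cookie["session"]["max-age"] = 3600  # one hour

# A long-lived unique identifier that could be used to build an advertising profile (OBA).
cookie["ad_id"] = secrets.token_urlsafe(16)
cookie["ad_id"]["max-age"] = 60 * 60 * 24 * 365  # one year

# Print the Set-Cookie headers a server would send. The ad_id value is a unique
# identifier for the individual, so information collected against it is personal information.
print(cookie.output())
```

The technical point is simply that both cookies look alike on the wire; what changes the analysis under the CPPA is what the identifier is used for and how long it persists.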

Recommendation 4 in the OPC’s submission states that at a minimum, you would like to see a violation of the appropriate purpose provisions added to the list for available AMPs. What other violations would you want to include?

  • The list of contraventions potentially subject to AMPs is a significant improvement over the former Bill C-11.
  • However, as noted in our submission on the former Bill C-11, our research has demonstrated that the AMP regimes in other jurisdictions including the United Kingdom, Australia, Singapore, Quebec, Ontario (under the Personal Health Information Protection Act), and California provide data protection authorities with the ability to issue AMPs for almost any violation connected to the collection, use or disclosure of personal information.
  • While the primary omission, in our opinion, is violations of the appropriate purposes provision (s.12(1)), we suggest that other violations that could potentially be subject to AMPs include:
    • s.8(1) – Designation of an individual to be responsible for matters related to obligations under the Act
    • s.10(1) – Providing access to the Privacy Commissioner, on request, to the policies, practices and procedures included in a privacy management program
    • s.56 – Maintaining the accuracy of personal information
    • s.59 – Providing notice of a data breach to an organization that may be able to reduce risk of, or mitigate, harm from the breach
    • s. 71 – Amending personal information that is demonstrated to be inaccurate
  • Finally, two provisions that are currently listed as offences could also be included in the AMP provisions to allow for violations to be addressed without engaging the public prosecutor:
    • s.60(1) – Maintaining records of data breaches
    • s.75 – Prohibition on re-identification, except in defined scenarios
    • If AMPs and offences are an either/or, we prefer that these violations be kept as offences.

Group Privacy – Does Bill C-27 adequately address the rights of vulnerable groups?

  • There is a growing recognition that an action that breaches individual privacy can also produce collective harms. For instance, though mass surveillance impacts individuals, it may cause collective harm through larger-scale behavioural changes or through a widespread lessening of dignity and autonomy.
  • As well, contemporary data analytics can reduce individuals to shared characteristics within groups, potentially leading to both individual and collective harms being experienced when assumptions made about the group’s characteristics are applied to both the group and to those considered part of it. Here, privacy harms are closely tied to the right to equality and to be free from discrimination.
  • It is fundamental that the law define the right to privacy in its broadest sense, which means making explicit that a central purpose of the law is to protect privacy as a human right in and of itself, and as essential for the realization and protection of other human rights.

Background

Some stakeholders, including Blair Attard-Frost, the Women’s Legal Education & Action Fund (LEAF) and Jane Bailey have recommended that C-27 should better recognize harms to groups/collective rights. There has been a particular, but not exclusive, focus on AIDA.

Potential harms posed by AI systems that have been highlighted are not limited to privacy harms but also include a broad range of collective harms, including harms related to human rights violations, IP rights violations, impacts on workers, and environmental impacts.

Specific to privacy, Jane Bailey notes that “when adequate privacy protections are in place with respect to the collection, use and disclosure of personal data, equality rights that are threatened by excessive, unfair, or discriminatory uses of that data stand also to benefit from those protections. By contrast, where privacy protections fail to appropriately limit the collection, use, and disclosure of personal data, those data have the potential to be used in ways that undermine equality rights”.

All recommend that there be public consultation to examine these issues further.

Children’s Privacy - Concretely, how will businesses identify whether someone is a minor without requiring too much personal information? Can you provide concrete OPC-approved methods?

  • We acknowledge that age assurance raises challenges for organizations, and that organizations will need support in carrying this out. This will be a priority for my Office.
  • Current digital age assurance systems use diverse technologies, analytical methods, and safeguards. Techniques vary and include methods such as self-declaration, AI and biometric-based systems, or hard identifiers such as a driver’s licence, for example. No two systems are identical: the design, implementation and potential for vulnerabilities may differ from one system to another. Moreover, the risks are constantly evolving. In our opinion, the key is to ensure that there are several lines of defence.
  • Analysis should be undertaken to ensure that a specific technical solution is appropriate in the circumstances. Regardless of the mechanism chosen, the user will ultimately be required to provide some amount of personal information. However, in any digital age assurance system, the principle of data minimization should be applied to reduce data matching and surveillance of individuals. The method chosen should also be proportionate to the risks. For example, in some circumstances it may be appropriate to use self-declaration to ask the child to confirm they are in a certain age bracket to help customize a learning game. For more adult content, such as content relating to tobacco products, more rigorous age verification may be needed or required by law.
  • The use of biometrics or facial recognition to verify or estimate a user’s age raises unique privacy concerns. Biometric technologies are generally very intrusive. In addition, their effectiveness in accurately verifying age remains to be demonstrated.
  • Other methods of age assurance that do not require the digital storage of personal information could also be considered. For example, an individual could have their government-issued ID card visually verified at a point-of-service location to have their age confirmed, and then receive a verified key or code that can be used online in a way that cannot be traced back to them.
  • Important safeguards that should form part of any age-assurance system include:
    • Data minimization: collecting, using, and disclosing the minimum amount of information may reduce data linkage.
    • Access controls: restrict who has access to user data.
    • Encryption: strong encryption methods should be in place for data in use, in transit, and at rest. That said, encryption is not a foolproof method to eliminate the risk of user re-identification.
    • Tokenization: tokens can substitute sensitive information with a random string of characters that has no value on its own (a minimal illustrative sketch follows this list).
    • Limiting use: Age assurance information should not be repurposed for other uses.
    • Fitness for purpose: ensure that the system is appropriately trained or developed, is accurate, and avoids bias.
    • Contestability: explain how individuals can challenge an inaccurate assessment of their age.
  • The UK Information Commissioner’s Office (ICO) has conducted research on age assurance as part of their work on the UK Children’s Code. This has included research and guidance for organizations on assessing whether an online service is likely to be accessed by children. In general, the ICO advises that if a provider concludes that children are likely to access the service and the service is not appropriate for children, the provider should apply age assurance measures to restrict access or to ensure that the service complies with the Children’s Code in a “risk-based and proportionate manner”.
  • France’s data protection authority has also done work to assess the main types of age verification systems. They’ve provided some guidance on how organizations can fulfill legal obligations in that jurisdiction, but also found that current systems tend to be circumventable and intrusive. As a result, they have called for the implementation of more privacy-friendly models and have worked with a researcher to demonstrate the feasibility of a privacy-friendly version.
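
As a concrete illustration of the tokenization, data-minimization and limited-use safeguards listed above, the following is a minimal Python sketch of a hypothetical age-assurance token scheme. The function names, key handling and threshold are assumptions for illustration; this is not an OPC-approved method.

```python
import hashlib
import hmac
import secrets
from datetime import date

# Illustrative only: a toy age-assurance token issuer. The design, names and
# thresholds are hypothetical, not an OPC-approved or endorsed mechanism.

SIGNING_KEY = b"rotate-me-and-keep-me-in-a-vault"  # never stored with user data

def issue_age_token(date_of_birth: date, minimum_age: int = 18) -> str | None:
    """Check a date of birth once, then discard it (data minimization) and
    return a signed token asserting only 'meets the age threshold'."""
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    if age < minimum_age:
        return None  # nothing about the user is retained either way
    nonce = secrets.token_hex(8)  # random, carries no personal information
    claim = f"over{minimum_age}:{nonce}"
    signature = hmac.new(SIGNING_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}:{signature}"

def verify_age_token(token: str, minimum_age: int = 18) -> bool:
    """The relying service learns only that the threshold was met."""
    claim, _, signature = token.rpartition(":")
    expected = hmac.new(SIGNING_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return claim.startswith(f"over{minimum_age}:") and hmac.compare_digest(signature, expected)

token = issue_age_token(date(2001, 5, 14))
print(token is not None and verify_age_token(token))  # True
```

The design point is that the date of birth is checked once and never stored, and the relying service learns only that the threshold was met; a real deployment would also need token expiry, key management and a way for individuals to challenge an inaccurate assessment, as the safeguards above contemplate.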

Children’s Privacy - Why should the concept of the best interests of the child be added to the Bill?

  • Canada ratified the United Nations Convention on the Rights of the Child (UNCRC) in 1991. The UNCRC affirms children’s rights, including the right to privacy, and introduces the concept of the best interests of the child.
  • The “best interests of the child” is a principle that plays a significant role in Canadian law. Under the UNCRC, “[i]n all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration” (Article 3(1)).
  • This concept implies that young people’s well-being and rights be primary considerations in decisions or actions concerning them directly or indirectly. As a guiding principle, this concept can be applied in a variety of contexts to help assess and balance the interests of young people against others.
  • The UNCRC also expressly speaks to a child’s right to privacy. Article 16 of the UNCRC provides that “[n]o child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, nor to unlawful attacks on his or her honour and reputation.” Children have the right to the protection of the law against such interference or attacks.
  • The UNCRC also contains other rights that may be relevant in assessing how best to offer children and youth the privacy protection outlined in Article 16. Article 13 guarantees a child’s freedom of expression, including access to information and ideas through any media of the child’s choice. Article 17 recognizes the important role of mass media and asks States Parties to ensure children have access to information from a diversity of sources.
  • Article 12 requires that children who are capable of forming their own views be given the right to express those views freely in all matters affecting them, the views of the child being given due weight in accordance with the age and maturity of the child.
  • These Articles suggest that children are not only the subject of rights, but are intended to exercise those rights or at minimum provide input as to how those rights are exercised on their behalf in accordance with their capacity to do so. The UNCRC recognizes that the capacity of each child evolves with age, thereby increasing the extent to which he or she may be capable of exercising the rights guaranteed under the Convention, including without adult representation.
  • Over more than 30 years, the UNCRC has had a tremendous influence on young people’s rights around the world, including privacy. Many jurisdictions have recognized that young people may be impacted by technologies differently than adults, be at greater risk of being affected by privacy-related issues, and therefore require special protections.
  • OPC and its provincial and territorial privacy counterparts recently published a resolution on putting the best interests of young people at the forefront of privacy and access to personal information. In order to be helpful to businesses and government organizations, it includes a series of recommended practices for organizations and provides examples to help interpret the best interests of the child in a privacy context.

In the OPC’s recommendation 3, you state that a number of SUBSIM provinces have a non-exhaustive list of factors for appropriate purposes. Which provinces?

  • British Columbia: Personal Information Protection Act, SBC 2003, c 63 at ss. 11, 14 & 17.
  • Alberta: Personal Information Protection Act, SA 2003, c P-6.5 at ss. 11, 16 & 19.
  • Quebec: Act respecting the protection of personal information in the private sector, CQLR c P-39.1 at ss. 4, 5, 13 & 18.1.

Background

There are three provinces whose private sector legislation is substantially similar to Part 1 of PIPEDA:

  1. British Columbia;
  2. Alberta; and
  3. Quebec

Unlike s.12(2) of the CPPA, these three “sub-sim” provinces do not have exhaustive lists of mandatory factors needing to be considered vis-à-vis appropriate purposes.

The relevant provisions of the BC PIPA, the Alberta PIPA and the QC Private Sector Act are excerpted below, grouped by collection, use and disclosure:

BC PIPA – Limitations on collection of personal information

11 Subject to this Act, an organization may collect personal information only for purposes that a reasonable person would consider appropriate in the circumstances and that

(a) fulfill the purposes that the organization discloses under section 10 (1), or

(b) are otherwise permitted under this Act.

Alberta PIPA – Limitations on collection

11(1) An organization may collect personal information only for purposes that are reasonable.

(2) Where an organization collects personal information, it may do so only to the extent that is reasonable for meeting the purposes for which the information is collected.

QC Private Sector Act – DIVISION II: COLLECTION OF PERSONAL INFORMATION

4. Any person carrying on an enterprise who may, for a serious and legitimate reason, establish a file on another person must, when establishing the file, enter its object.

The entry is part of the file.

5. Any person collecting personal information to establish a file on another person or to record personal information in such a file may collect only the information necessary for the object of the file.

Such information must be collected by lawful means.

BC PIPA – Limitations on use of personal information

14 Subject to this Act, an organization may use personal information only for purposes that a reasonable person would consider appropriate in the circumstances and that

(a) fulfill the purposes that the organization discloses under section 10 (1),

(b) for information collected before this Act comes into force, fulfill the purposes for which it was collected, or

(c) are otherwise permitted under this Act.

Alberta PIPA – Limitations on use

16(1) An organization may use personal information only for purposes that are reasonable.

(2) Where an organization uses personal information, it may do so only to the extent that is reasonable for meeting the purposes for which the information is used.

QC Private Sector Act – § 1. — Retention, use and non-communication of information

13. No person may communicate to a third person the personal information contained in a file he holds on another person, or use it for purposes not relevant to the object of the file, unless the person concerned consents thereto or such communication or use is provided for by this Act.

BC PIPA – Limitations on disclosure of personal information

17 Subject to this Act, an organization may disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances and that

(a) fulfill the purposes that the organization discloses under section 10 (1),

(b) for information collected before this Act comes into force, fulfill the purposes for which it was collected, or

(c) are otherwise permitted under this Act.

Alberta PIPA – Limitations on disclosure

19(1) An organization may disclose personal information only for purposes that are reasonable.

(2) Where an organization discloses personal information, it may do so only to the extent that is reasonable for meeting the purposes for which the information is disclosed.

QC Private Sector Act – § 2. — Communication to third persons

18.1. In addition to the cases referred to in section 18, a person who carries on an enterprise may also communicate personal information included in a file the person holds on another person, without the consent of the persons concerned, in order to prevent an act of violence, including a suicide, where there is reasonable cause to believe that there is a serious risk of death or serious bodily injury threatening a person or an identifiable group of persons and where the nature of the threat generates a sense of urgency.

The information may in such case be communicated to any person exposed to the danger or that person’s representative, and to any person who can come to that person’s aid.

A person carrying on an enterprise who communicates information pursuant to this section may only communicate such information as is necessary to achieve the purposes for which the information is communicated.

Where information is so communicated by a person carrying on an enterprise, the person must make an entry of the communication. That entry is part of the file.

For the purposes of the first paragraph, “serious bodily injury” means any physical or psychological injury that is significantly detrimental to the physical integrity or the health or well-being of a person or an identifiable group of persons.

Canada Post – Has Canada Post reached out since the tabling of the OPC’s Annual Report last week? What are the next steps in this file?

  • My Office was in regular contact with Canada Post throughout the investigation. Canada Post contacted my office after we issued the final Report of Findings. We have not heard from Canada Post since the tabling of the OPC’s Annual Report on September 19, 2023.
  • We would expect Canada Post to take the necessary steps to bring its activities into compliance with the Privacy Act and continue to urge the organization to re-consider the remedial measures we have suggested, or to propose other measures to bring it into compliance with the Act.
    • Our expectations for an appropriate remedy were set out in our Report of Findings. We invited Canada Post to consider potential options to obtain authorization from Canadians. Further, we suggested to Canada Post that our recommendation could be satisfied through a mail-out process to households. We remain open to hearing about the transparency measures Canada Post proposes in order to seek and obtain authorization from Canadians.
  • We take note of the public statement made by Canada Post on September 22, 2023, and its commitment to review its data services program following the findings of our investigation. We look forward to hearing from Canada Post about the measures it proposes to take to meet its obligations under the Act and to protect the privacy of Canadians.

Background

We sent a final Report of Findings to Canada Post on May 12, 2023. Throughout the investigation, we worked collaboratively with Canada Post. We also met with Canada Post officials multiple times.

While we appreciate Canada Post’s willingness to be collaborative, we note that it ultimately did not agree to our recommendation that it cease its practice of using and disclosing personal information (leveraged from its operational data) for mail marketing activities without first seeking authorization from Canadians. Canada Post also refused our suggestion that this recommendation could be satisfied by a mail out to all households with clear information about the program, and easy instructions on how to opt-out.

Canada Post proposed some measures to inform Canadians about this practice (such as brochures in post offices and information on their website); however, we found that these measures put the responsibility on Canadians to proactively seek out the information, when the onus should be on Canada Post. As such, we continue to be concerned that Canada Post’s Smartmail Marketing Program is out of compliance with section 5 of the Privacy Act.

Home Depot – What is the status of this file? Does the OPC have any remaining concerns with respect to organizations not complying with our recommendations/expectations?

  • The OPC’s investigation into Home Depot’s use of Meta’s Offline Conversions tool has concluded. Home Depot advised that it has discontinued use of the tool.
  • In the aftermath of the Home Depot investigation, our office proactively reached out to eight retailers to inform them of the results of our investigation and highlight the need to obtain adequate consent. Most have responded confirming that they are not, or are no longer, using Offline Conversions.
  • We are pleased to see that these companies, and the industry in general, appear to be taking our findings seriously. We have had much positive engagement, including via a presentation at the request of the Retail Council of Canada and another at IAPP Canada in Toronto, where we were able to share our findings with hundreds of privacy professionals. However, we recognize that other companies may still be disclosing customer data via Offline Conversions and similar tools without adequate consent, and we will continue to monitor the situation.

Background

  • We issued our Report of Findings re. Home Depot on Jan 26, 2023.
  • In April 2023, following a media report, we sent compliance letters to eight retailers (i.e., redacted), in which we asked them to explain how they complied, or planned to comply, with our expectations.
  • Several confirmed that they were not, or had stopped, using Offline Conversions or similar tools. (Redacted) did not respond, despite follow-ups.

House of Commons – It was reported that the House of Commons (HOC) was the victim of a cyber attack earlier this week. Has the HOC notified the OPC? Does the HOC have any obligations under the Privacy Act? If not, do you think that they should?

  • The House of Commons is neither subject to the Privacy Act, nor to the Personal Information Protection and Electronic Documents Act. It had no statutory obligations to report the incident to my Office, and no breach report has been received.
  • What is important to me, as Privacy Commissioner of Canada, is that personal information is properly collected, managed, and protected, be that through the Privacy Act, or by other means that are deemed appropriate in the context of the House of Commons.
  • In that regard, I offer my and my Office’s services should our advice be required.

Background

The OPC has not advocated for the application of the Privacy Act to the House of Commons. However, in 2016, the then Privacy Commissioner, Daniel Therrien, presented to the Standing Committee on Access to Information, Privacy and Ethics his recommendations for amending the Privacy Act. This included extending the application of the Act to all federal government institutions, including Ministers’ offices and the Prime Minister’s Office. Commissioner Therrien noted that “Extending coverage of the Privacy Act in that way would be consistent with one of the fundamental purposes for which Agents of Parliament were created: as a window into the activities of the executive branch of government.” The Standing Committee, in its 2016 Report, recommended that the Government of Canada explore extending the scope of the Privacy Act to all federal government institutions, including ministers’ offices and the Prime Minister’s Office.

With regard to political parties, calls have frequently been made for them to be subject to privacy legislation, most recently in May 2023 during your appearance before the Standing Senate Committee on Legal and Constitutional Affairs to discuss amendments to the Canada Elections Act proposed in Bill C-47, the Budget Implementation Act. The proposed amendments would authorize political parties and their affiliates to collect, use, disclose, retain and dispose of personal information in accordance with the party’s own privacy policy. In your remarks you stated that political parties should be subject to privacy requirements that are grounded in legislation and based on internationally recognized privacy principles, including recourse to an independent third party with authority to verify and enforce compliance and provide remedies in case of a breach.

It is also worth noting that breach reporting for federal public sector institutions subject to the Privacy Act is not a legislated requirement, but is rather made mandatory through policy instruments issued by the Treasury Board of Canada Secretariat (TBS). Here too, frequent calls have been made to embed mandatory breach reporting in legislation where a privacy breach presents a risk of harm to affected individuals. In our 2021 submission provided in the context of the Public Consultation on the Modernization of the Privacy Act, we strongly supported Justice’s proposals to introduce a “safeguarding” principle, a breach record-keeping requirement, and mandatory reporting to our Office and notification to affected individuals of a breach where there is a risk of significant harm to an individual.

What are the privacy implications of the government’s statement that they may publish the names of former soldiers who fought alongside the Nazis in the Second World War and were subsequently allowed to immigrate to Canada?

  • Subparagraph 8(2)(m)(i) of the Privacy Act currently allows federal government institutions to disclose personal information in the public interest.
  • However, this is discretionary and 8(2)(m)(i) has a high threshold: the public interest to disclose must clearly outweigh any invasion of privacy.
  • Decisions related to the disclosure of personal information in the public interest under 8(2)(m)(i) must be made by heads of institutions on a case-by-case basis.
  • Our Office is available for consultation to support institutions in considering privacy issues related to public interest disclosures.
  • In its 2021 consultation on modernizing the Privacy Act, the Department of Justice proposed a new framework for the disclosure of personal information in the public interest. We would welcome the opportunity to work with the Department of Justice to ensure that any new framework strikes a balance to ensure that the impact on privacy is proportionate to the public interest.
  • “Personal information” is defined under an exhaustive enumerated list under section 3 of the Privacy Act. Paragraph (m) of the definition of “personal information” states that, in the context of an access to information request under the Access to Information Act, “information about an individual who has been dead for more than twenty years” would not be considered personal information. However, in all other contexts outside an access to information request, it would remain “personal information” protected under the Privacy Act.
  • Subsection 69(2) of the Privacy Act permits the disclosure of “personal information” that is already “publicly available” (i.e., s. 8 would not apply to that PI).

Background

Subparagraph 8(2)(m)(i) of the Privacy Act

Disclosure of personal information

8 (1) Personal information under the control of a government institution shall not, without the consent of the individual to whom it relates, be disclosed by the institution except in accordance with this section.

Where personal information may be disclosed

(2) Subject to any other Act of Parliament, personal information under the control of a government institution may be disclosed […]

(m) for any purpose where, in the opinion of the head of the institution,

(i) the public interest in disclosure clearly outweighs any invasion of privacy that could result from the disclosure, or […]

Section 3 “Personal Information” [paragraph (m)] of the Privacy Act

personal information means information about an identifiable individual that is recorded in any form including, without restricting the generality of the foregoing,

[…]

but, for the purposes of sections 7, 8 and 26 and section 19 of the Access to Information Act, does not include […]

(m) information about an individual who has been dead for more than twenty years; (renseignements personnels)


Subsection 69(2) of the Privacy Act

69(2) Sections 7 and 8 do not apply to personal information that is publicly available.
