Privacy and age assurance – Exploratory consultation

Many jurisdictions, including Canada, have introduced or adopted legislation intended to increase the safety of young peopleFootnote 1 in online spaces. This could mean, for example, that young people’s access to certain content must be restricted or that their personal information must not be collected, used or disclosed in ways or for purposes set out in the legislation. This leads to the question – how can an online service determine whether a user is a young person, and thus subject to these restrictions?

A common approach is the use of age assurance, a term that refers to a variety of processes by which the age (or age group) of a user is determined with varying levels of specificity and certainty. In some cases, the use of age assurance is mandated by law or regulation; in others, it is adopted by organizations as part of their overall compliance strategy.

Age assurance can be an effective technique to promote online safety for young people. As well as restricting access to harmful content, age assurance could be used to direct young people to a version of a service that uses data practices tailored to youth and children. However, this technique can also have impacts on privacy and other fundamental rights. Given this, the Office of the Privacy Commissioner of Canada (OPC) intends to undertake policy and guidance work on the development and use of age-assurance technologies.

We recognize the complexity of this issue and the need to hear the views of a wide range of relevant parties. To that end, we are undertaking this exploratory consultation to provide parties with an opportunity to comment on our initial direction and thinking on this topic.

What is an exploratory consultation?

In this document, we set out our current understanding of, and thinking on, the topic of privacy and age assurance. Our positions are not final and are subject to change based on the feedback we receive. The goal of this consultation is to prompt meaningful discussion of this topic, and to increase our understanding of the benefits, concerns, and existing research or writing associated with age assurance. This will support the next step(s) of our work, which will include the creation of a draft guidance document about the use and design of age-assurance systems that will also be subject to consultation.

We are open to receiving information that builds upon our understanding of the overall field of age assurance, that supports or challenges our preliminary thinking on the topic, or that helps us to refine an appropriate path forward.

Instructions for submitting comments are at the end of this document or can be found online at: Age assurance exploratory consultation – call for comments.

OPC preliminary positions

In general (and based on the information that follows), we take the position that it is possible to design and use age assurance in a privacy-protective manner. However, this does not mean that the use of age assurance will be necessary to the same extent in all circumstances.

In our preliminary opinion, the use of age-assurance systems:

  • Should be restricted to situations that pose a high risk to the best interests of young people; and,
  • Must consider impacts on the privacy rights of both young persons and adult users of the online service.

Moreover, legislation or regulations requiring the use of age-assurance systems to restrict young people’s access to content:

  • Should be proportionate to the risk and take into account potential alternative means of restricting access to content, such as education, device-level parental controls, or individual- or household-level Internet filtering technologies.

As well, the use of age assurance to limit the exposure of young people to data practices that might negatively influence their behaviour or cause them harm:

  • Should require that an organization demonstrate the necessity of applying those practices by default.
    • That is, organizations should be required to justify why a particular age assurance technique is a more appropriate option than, for example, assuming all users are young people and applying appropriate practices.

Finally, age-assurance systems:

  • Should be designed to minimize the identifiability of users and the ability to link users across services;
  • Should not permit information collected for age-assurance purposes to be used for other purposes;
  • Should be designed in accordance with relevant industry standards and guidance from regulators (including the OPC) and be subject to effective oversight; and,
  • Should not require individuals to undergo an age assurance process to access non-restricted content.

Background and context

To provide background and context for this discussion, below we describe key terms that we will use in the document, highlight a selection of existing work on age assurance and privacy, and set out some considerations about the use of age-assurance technologies.

Key terms

“Age assurance” is an umbrella term for multiple ways of determining the age of an individual. For this paper, we will define three subcategories:

  • Age declaration: An individual, or a person who knows them, declares that the individual is above a defined age. Generally, these declarations are not confirmed by the organization.
  • Age verification: An individual provides the organization with proof of their age. This can be direct (e.g. providing a copy of a government-issued identifier), or indirect (e.g. directing a third-party service to provide proof of age). The nature and amount of information received by the organization in this process will vary.
  • Age estimation: The individual’s age is estimated based on an analysis of biometrics or behaviours, generally performed by an artificial intelligence system.

Existing work

There is a growing body of work that examines the relationship between privacy and age assurance, or that proposes principles to ensure appropriate use of the technology. Positions developed by data protection authorities in Europe – including SpainFootnote 2, FranceFootnote 3, and the UKFootnote 4 – have been particularly influential in shaping our initial understanding. We are also aware of work from industry organizations such as the Digital Trust & Safety Partnership.Footnote 5

Standards could also play an important role in the development of age assurance. Current projects in this space include those from the British Standards Institution (BSI PAS 1296:2018Footnote 6), the IEEE (IEEE 2089.1Footnote 7), ISO (ISO/IEC WD 27566Footnote 8) and Canada’s Digital Governance Standards Institute (Technical Committee 18Footnote 9).

Considerations

To further contextualize our initial positions on age assurance, we understand the following:

  • Age estimation is not precise: Age estimation systems are best operated with a buffer. For instance, where a requirement exists to restrict access for any user under 18, an age estimation service could estimate whether that person is over 21 (a 3-year buffer), rejecting access or requiring additional verification steps for anyone deemed to be under 21.Footnote 10 As such, by design an age estimation system will reject (or refer to an alternative assurance method) a number of individuals who are above the threshold age for the restriction, but below the buffer age.
  • Age assurance can pose equity issues: In general, age verification methods will rely on the individual having access to an authenticated identity (a government-issued ID; an account with a trusted party, such as a bank; etc.). This may pose challenges for groups who do not have ready access to such identifiers, including younger teenagers (in situations where those over 13 are permitted access to content or a service), unhoused or unbanked individuals, or non-citizens. Age estimation techniques (such as facial analysis) have also faced issues with respect to inconsistent performance across skin tones and genders.
  • Age-assurance techniques can exacerbate the digital divide: Some age-assurance technologies may rely on individuals having access to specific technologies (such as a web camera or a non-shared cell phone) or may require them to undertake a complex sign-up and authentication process that is not accessible to all individuals.
  • Age assurance is not the only available option: Tools such as parental controls on devices or browsers, public education and awareness campaigns, and other approaches can work either in place of, or in concert with, age assurance methods to increase the protection of young people online.
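The buffered age-estimation approach described in the first consideration above can be illustrated with a short sketch. The 18-year threshold and 3-year buffer follow the example given in this section; the function name and decision labels are purely illustrative, not part of any real system:

```python
# Illustrative sketch of buffered age estimation (hypothetical names).
# Requirement: restrict access for any user under 18; the estimator
# applies a 3-year buffer, so only users estimated at 21+ pass directly.
RESTRICTION_AGE = 18
BUFFER_YEARS = 3

def gate_decision(estimated_age: float) -> str:
    """Return an access decision based on an estimated age.

    Users estimated at or above the buffered age (21) are granted
    access; everyone else, including some who are genuinely 18-20,
    is referred to an alternative assurance method (e.g. verification).
    """
    if estimated_age >= RESTRICTION_AGE + BUFFER_YEARS:
        return "grant"
    return "refer_to_alternative_method"
```

In this sketch, a user estimated at 19 or 20 is above the restriction age but inside the buffer, so by design the system refers them to an alternative assurance method rather than granting access outright.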

Privacy analysis

The privacy impacts of age assurance will differ depending on what type of age assurance is used (declaration, verification, or estimation). However, the most likely high-level impacts of age assurance (and age verification or estimation in particular) include:

  • Identifiability / linkability: Similar to a digital identity system, if not designed appropriately, age assurance can increase the identifiability of an individual or the linkability of their actions across time or across websites/services. It is important to address this given the sensitivity of the information that may be revealed based on an individual’s access to restricted material,Footnote 11 as well as the overall potential for increased online surveillance.
  • Residual information / metadata: It is possible that metadata generated or retained by an age verification system could also create privacy impacts. For example, users would likely not want to undergo an age assurance process each time they access a given service; as such, it is likely that some form of persistent token will be used to indicate that the individual has already undergone an age check. Improperly designed, this token could serve as a unique identifier and/or reveal age or age group in other online spaces that do not require age assurance. Similarly, the mere presence of that token could allow inferences to be made about the individual if the token is exclusively associated with access to certain kinds of content or services.
  • Normalization of digital identification: The 2022 Resolution on the Digital Identity Ecosystem by Canada’s Federal, Provincial and Territorial privacy commissioners and ombuds with responsibility for privacy oversightFootnote 12 emphasized that:

    “Digital identification should not be used for information or services that could be offered to individuals on an anonymous basis and systems should support anonymous and pseudonymous transactions wherever appropriate.”

    This general principle – that, wherever possible, individuals should be able to use online services without establishing identity characteristics – may be impacted if access to general-use services (such as search engines or social media) becomes subject to age assurance.
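To illustrate the token design concern raised above, the following sketch shows one hypothetical way a persistent age-check token could be minimized: it carries only a yes/no claim and a short expiry, with a fresh random nonce per issuance so that the token cannot act as a stable cross-service identifier. All field names and lifetimes here are assumptions, not a prescribed design:

```python
# Illustrative sketch of a minimized age-assurance token (hypothetical
# design): no identity attributes, short-lived, unlinkable across uses.
import os
import time

def mint_token(over_threshold: bool, ttl_seconds: int = 3600) -> dict:
    """Mint a short-lived token carrying only a yes/no age claim.

    The token holds no identity attributes, and the fresh random nonce
    means two tokens issued to the same person are not linkable.
    """
    return {
        "over_threshold": over_threshold,          # the only substantive claim
        "expires_at": time.time() + ttl_seconds,   # short-lived by design
        "nonce": os.urandom(16).hex(),             # fresh per issuance
    }

def token_is_valid(token: dict) -> bool:
    """Accept only unexpired tokens that assert the age claim."""
    return token["over_threshold"] and time.time() < token["expires_at"]
```

A real deployment would also need the token to be cryptographically signed by the issuer so services can verify its authenticity; that machinery is omitted here for brevity.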

Many of the above impacts can be mitigated using privacy-enhancing technologies or well-designed policy measures. However, some (real or perceived) privacy impacts may be more difficult to address. These include:

  • Phishing: A significant difference between online and offline age assurance is that offline, it would be rare for an individual to receive a fraudulent request to verify age. If one is asked for proof of age when purchasing alcohol, context alone should make clear whether the request is legitimate. Online, sophisticated phishing messages and fraudulent websites make this determination more difficult, which could lead individuals to provide detailed and/or sensitive personal information to bad actors.
  • Trust deficit: The OPC’s 2022-23 Survey of Canadians found that only 4 in 10 Canadians felt that businesses respected their privacy rights, and that trust was lowest in “Big Tech” (64% reporting ‘not much’ or no trust) and social media (88% reporting ‘not much’ or no trust). This means that age assurance may require individuals to provide additional, potentially sensitive, personal information to an organization that they do not know or trust.

    Even if privacy-enhancing technologies are implemented and age-assurance services are subject to strict oversight, it is likely that some individuals will forgo accessing content or services that require age assurance, or will seek that content in riskier ways (such as via the dark web).

Role of the OPC

Age assurance is a cross-jurisdictional issue with potential to impact multiple rights. 

The OPC’s role is to:

  • Highlight privacy risks: We are well-placed to highlight the potential or known privacy impacts of age-assurance systems, both with respect to the impact of any collection, use, and disclosure of personal information associated with the age-assurance system itself and to the broader potential impacts on anonymous use of the Internet.
  • Support strong design: We are well-positioned to issue comment or guidance on appropriate privacy-design practices for online services – including with respect to the appropriate design and use of age assurance. We are also able to engage with organizations directly to clarify our expectations with respect to privacy obligations.
  • Encourage the development of privacy-protective age assurance techniques or services: By drafting guidelines, developing or reviewing Codes of Practice, supporting standards development, or other measures, we play a key role in ensuring that privacy-protective age assurance techniques are available and expectations or requirements for such systems are known.
  • Examine, and where appropriate promote, privacy-protective alternatives: Age assurance should be seen as one option among many for either restricting access to content or determining appropriate data practices for a user, and other privacy-protective measures can complement or replace it. The OPC’s expertise can support the consideration and discussion of the privacy-related impacts of these various options.

How to comment

Following this consultation, we intend to draft a guidance document that elaborates on and defines expectations for the development and use of privacy-protective age assurance. In our Strategic PlanFootnote 13, we identify “championing children’s privacy rights” and “addressing and advocating for privacy in this time of technological change” as two of our strategic priorities for 2024-27. The issue of age assurance sits in the overlap between these areas. We believe that technological innovation does not have to come at a risk to privacy, but instead that innovation can be used to protect this fundamental right – and that our proposed guidance will help to achieve this.

We welcome your comments on the OPC’s preliminary positions. We are particularly interested in responses to the following questions:

  • What additional context should we be aware of in developing our future work? Are there other key resources (guidance documents; principles; standards; research; etc.) we should review?
  • Are there other significant privacy considerations we should be aware of?
  • Do you have any comments on our preliminary views?
  • What complementary steps might the OPC consider taking to promote privacy and online safety for young people?

Comments will be accepted by email (OPC-CPVPconsult1@priv.gc.ca) until September 10, 2024. Further information about this consultation can be found at: Age assurance exploratory consultation – call for comments.
