Issue Sheets on Bill S-210

Appearance before the Standing Committee on Public Safety and National Security


Comparison of S-210 and S-203

Speaking points

  • Bill S-210 is similar to its predecessor, Bill S-203, but there are important differences between the two.
  • From a privacy perspective, one of the most notable changes is that Bill S-210 requires the Governor in Council to consider several privacy-related criteria before it prescribes an age-verification method.
  • This is a valuable addition to the bill. However, despite this and other changes, I believe that aspects of S-210 could still be improved to further mitigate its potential privacy risks.
  • For example, I would respectfully suggest that the list of criteria set out in section 11(2) be expanded to include the principles of necessity and proportionality.

Background

  • The previous iteration of Bill S-210, Bill S-203, was introduced by Senator Julie Miville-Dechêne in 2020 and passed by the Senate in 2021. It died on the order paper later that year when the election was called.
  • During clause-by-clause consideration of S-210, the Standing Senate Committee on Legal and Constitutional Affairs added a new provision to require the Governor in Council (GIC), before prescribing an age verification method, to consider whether it “(a) is reliable; (b) maintains user privacy and protects user personal information; (c) collects and uses personal information solely for age-verification purposes, except to the extent required by law; (d) destroys any personal information collected for age-verification purposes once the verification is completed; and (e) generally complies with best practices in the fields of age verification and privacy protection.”
  • In contrast to S-203, S-210 also:
    • refers briefly to pornographic websites in its preamble (though the wording of the offence in section 5 is not limited to such sites)
    • limits liability for the offence to organizations only (s. 5)
    • creates a new defence for organizations that take steps to comply with a notice received from the enforcement agency (s. 6(3))
    • no longer refers to “obscene material” as an aggravating circumstance

Lead: Legal


Definition of "young person"

Speaking points

  • Bill S-210 makes it an offence to make sexually explicit material available online for commercial purposes to a “young person,” which is defined as a person under 18 years of age.
  • The definition of “young person” in Bill S-210 is consistent with the definition of “child” in Bill C-63 and the definition of “minor” that was recently added to Bill C-27.

Background

  • While Bill C-63 (Online Harms Act) and Bill C-27 (Consumer Privacy Protection Act) do not use the term “young person,” they adopt the same age threshold as Bill S-210. Bill C-63 uses the term “child,” which is defined as a person under 18 years of age. Bill C-27 uses the term “minor,” which was initially not defined. An amendment adopted by the House of Commons Standing Committee on Industry and Technology during clause-by-clause review defines a minor as an individual under 18 years of age.
  • The Canadian Bar Association recommended to the Standing Senate Committee on Legal and Constitutional Affairs that the definition of “young person” in Bill S-210 be changed to a person under 16 years of age in order to align with the age of consent in the Criminal Code. Bill S-210’s sponsor, Senator Julie Miville-Dechêne, defended the existing definition, noting that individuals must be at least 18 years old to access pornographic material offline.
  • A province or territory defines the age of majority within its respective jurisdiction, which is either 18 or 19 years of age. Federal legislation does not typically define who is or is not a minor, but there are some exceptions (e.g., there is a definition of age of majority in the Divorce Act).
  • Since Bill S-210 is made under the federal criminal law power, the age of majority established by a province or territory is not relevant.

Lead: Legal


Offence provision

Speaking points

  • I share the concerns of many stakeholders that, as drafted, the offence provision would capture a wide range of online services and content, which increases the likelihood and potential impacts of unintended consequences on privacy and other human rights.
  • S-210 would effectively mandate the use of age verification online, since only organizations that use the age-verification methods specified in the regulations would be able to rely on the defence that they believed the person accessing sexually explicit material was an adult.
  • I recommend that the Committee consider restricting the requirement for age-verification to websites that primarily provide sexually explicit material for commercial purposes.

Background

  • While the preamble of S-210 highlights pornographic websites in particular, the offence set out in section 5 is not limited to these sites. Rather, it encompasses any organization that makes sexually explicit material available for commercial purposes.
  • More specifically, section 5 would create a summary offence for organizations that make sexually explicit material available to young persons on the Internet for commercial purposes. The offence is punishable by fines of up to $250,000 for a first offence and up to $500,000 for subsequent offences.
  • Bill S-210 adopts the definition of “sexually explicit material” used in s. 171.1 of the Criminal Code, which sets out offences related to making sexually explicit material available to a child in visual, audio, or written formats.
  • We do not see a clear overlap between the offence provision in Bill S-210 and offences in PIPEDA or proposed in Bill C-27. However, an organization could find itself subject to Bill S-210 as well as PIPEDA or Bill C-27 (e.g., where sexually explicit material made available online to a young person also involves a disclosure of personal information in the course of a commercial activity).

Lead: Legal


Approaches to online age verification in other jurisdictions

Speaking points

  • The legal and regulatory landscape on the issue of limiting children’s access to pornography is rapidly evolving. Several jurisdictions—including Germany, France, and the UK—have enacted laws requiring age verification to access pornography.
    • We note that Australia has announced a pilot on the use of age assurance to protect children from harmful content like pornography. The pilot will consider privacy implications.
  • Bill S-210 would impose age-verification requirements on a broader range of content than many other jurisdictions. For example, unlike Bill S-210, Texas and Utah require age verification only for sites with a certain threshold of pornographic content. In Texas, the threshold is met where more than one third of the material on a website is sexual material that is harmful to minors; in Utah, the requirement is triggered where a “substantial portion” of the material on a website is sexual material that is harmful to minors.
  • Bill S-210 also applies to sexually explicit material in various formats, including text, while other jurisdictions such as the UK would not require age verification for text-only content.

Background

  • (redacted)

Lead: Legal


Aylo litigation

Speaking points

  • In April 2023, Aylo (formerly MindGeek) brought an application for judicial review in Federal Court, arguing that my Office’s intent to publish its Report of Findings regarding a complaint received against the company was unreasonable and unfair.
  • Aylo then sought an injunction that would have prevented my Office from issuing and publishing the final Report of Findings while the judicial review application was pending.
  • In August 2023, the Federal Court dismissed Aylo’s injunction request, in particular because it had not demonstrated that it would be irreparably harmed by the issuance and publication of the report. Aylo appealed that decision.
  • In February 2024, the Federal Court of Appeal unanimously dismissed Aylo’s appeal. That same day, my Office issued and published the final Report of Findings, which found that Aylo contravened PIPEDA by enabling intimate content to be shared on its websites without the direct knowledge or consent of everyone depicted.

Background

  • In April 2020, the OPC received a complaint against Aylo stemming from its alleged failure to obtain consent from everyone depicted in intimate content posted on its various websites. After investigating, the OPC informed Aylo in March 2023 of its intention to issue and publish the Report of Findings.
  • We recommended that Aylo immediately stop the collection, use and disclosure of user-generated intimate images and videos until it has implemented measures to ensure compliance with its obligations under PIPEDA.
  • We are currently in discussions with Aylo regarding the implementation of our recommendations.

Lead: Legal


OPC views on age verification

Speaking points

  • The OPC’s overall position is that age assurance (including age verification) is a tool that can support online safety for young people, but its design and use must consider the significant potential privacy impacts that it can have.
  • Age-assurance systems – particularly if they are mandatory to access certain content or services – must not result in increased online surveillance or disclosures of sensitive information.
  • My Office plans to develop guidance for organizations on age assurance and privacy, and to help inform this work we intend to launch an exploratory consultation next month.

Background

  • Age assurance broadly consists of age declaration (an individual asserts their age), age verification (an individual provides proof of their age), and age estimation (an AI system estimates age based on, for example, a facial image).
  • Bill S-210 currently refers only to “age verification,” with the details to be prescribed in regulations made by the Governor in Council. As a result, it is not possible to fully assess the potential privacy risks of the bill at this stage.
  • The OPC intends to release an exploratory consultation document in June 2024 seeking comment on our preliminary positions on age assurance, including that the use of age assurance should:
    • be restricted to situations that pose a high risk to the rights of children;
    • not permit information collected for age verification to be used for any other purpose;
    • be designed in accordance with relevant industry standards and guidance from regulators, including my office; and
    • be subject to effective oversight.

Lead: PRPA


Privacy risks related to age verification

Speaking points

  • The main privacy risks associated with age assurance include:
    • an individual’s activities within or across websites may be tracked or traced back to them (i.e., increasing their identifiability based on their actions); and
    • identity information may be accessed or compromised through phishing, fraud, hacks, or data breaches.
  • There is also a risk that individuals may avoid accessing content and services, or access them in riskier ways, if they do not trust that a given age-assurance system is privacy-protective.
  • Such risks – and particularly the risk of tracking – can be mitigated by considering privacy throughout the design and operation of an age-assurance system and by applying innovative privacy-enhancing technologies.
  • Age assurance is an area in which, properly harnessed, innovation can drive privacy protection.

Background

  • Measures to improve trust in age assurance include strong oversight and the development of standards and/or certification mechanisms.
  • When S-210 was before the Legal and Constitutional Affairs committee in the Other Place, questions were asked about whether individuals’ personal information (including government-issued IDs) might be stolen and used for cybercrime, and whether age verification would create opportunities for the government to collect or supervise the collection of sensitive personal information.
    • In short, both are risks, but such risks can be mitigated (for instance, by limiting what is collected and retained and by preventing tracking).

Lead: PRPA


Age verification vs identity verification

Speaking points

  • Age verification and identity verification differ significantly in their purpose: one is intended to establish your age; the other, who you are.
  • It is possible for an age verification system to determine (with a reasonably high degree of certainty) whether an individual is above a defined age without any further information about them. That is how privacy-protective age verification systems should operate.
  • It is important that the design and use of age verification systems not also increase the identifiability of individuals, which could allow for profiling or tracking of their Internet activities.

Background

  • Age verification is one form of the broader category of age assurance methods.
  • Proving age without disclosing any further information is a form of “zero-knowledge proof,” a protocol that allows a person to prove the truth of a statement (e.g., “I am over 18”) without revealing anything else.
  • A simple example would be an individual with a digital wallet on their device that is able to generate a token confirming that they have previously proved they are over 18, which could then serve as “proof of age” to an online service (a minimal sketch follows this list).
  • We are not aware of any jurisdiction that recommends identity verification (as opposed to age assurance/verification) for the purpose of accessing restricted content.
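
To make the digital-wallet example above concrete, the following minimal sketch shows a signed “over 18” token in Python. This is our own illustration, not a method contemplated by Bill S-210 or any deployed wallet: it achieves data minimization with an ordinary digital signature rather than a true zero-knowledge proof, it assumes the third-party cryptography package, and the token format and all names are hypothetical.

```python
# Hypothetical sketch: a wallet issues a short-lived signed token asserting
# only "over 18"; a website verifies the signature and expiry. The token
# carries no name, birthdate, or stable identifier. This uses an ordinary
# digital signature, not a true zero-knowledge proof.
import base64
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer (e.g., a wallet provider that checked an ID document once)
# holds the signing key; websites need only the public key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()


def issue_age_token() -> str:
    """Issue a short-lived token asserting only that the holder is over 18."""
    claim = {"over_18": True, "exp": int(time.time()) + 300}  # 5-minute expiry
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = issuer_key.sign(payload)
    return f"{base64.b64encode(payload).decode()}.{base64.b64encode(signature).decode()}"


def site_accepts(token: str) -> bool:
    """The website learns only whether the visitor is over 18."""
    payload_b64, _, sig_b64 = token.partition(".")
    payload = base64.b64decode(payload_b64)
    try:
        issuer_public.verify(base64.b64decode(sig_b64), payload)
    except InvalidSignature:
        return False
    claim = json.loads(payload)
    return claim.get("over_18") is True and claim.get("exp", 0) > time.time()


print(site_accepts(issue_age_token()))  # True
```

A production system would also have to keep such tokens from becoming linkable identifiers (e.g., by making them single-use) and would more likely rely on standardized credential formats.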

Lead: PRPA


Use of third-party intermediaries

Speaking points

  • Third-party age assurance occurs when a website or app relies on another entity to establish an individual’s age.
  • We have not examined any specific third-party age-assurance services, so we cannot speak to their particular strengths or weaknesses.
  • However, such services will likely play a significant role in age assurance.
  • The French Data Protection Authority recommends that age assurance “should rely on third-party solutions whose validity has been independently verified.”
  • Third-party age assurance may be privacy-protective, but without proper design and oversight it could also enable online tracking.
  • Any third-party intermediary that collects, uses, or discloses personal information in the course of a commercial activity would be subject to PIPEDA.

Background

  • In general, third-party age assurance refers to a process whereby a third party establishes the age (or age range) of an individual – e.g., by checking an identity document, estimating age, or drawing on information the individual has already provided – and then sends a signal back to the original website or app indicating whether the individual meets its age requirements.
  • The French DPA (the CNIL) makes the case that separating the functions of issuing proof of age and transmitting this certified proof of age creates a three-fold privacy protection (a toy model follows this list):
    • The entity issuing proof of age knows the identity of the Internet user, but not the site that user is visiting;
    • The entity that transmits proof of age may know the site being visited, but not the identity of the user;
    • The site being visited knows the age of the user, but not their identity.
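
The separation the CNIL describes can be illustrated with a short toy model of which party sees which information. This is a hypothetical sketch only – not an implementation of any CNIL-endorsed protocol – and all class and field names are invented.

```python
# Toy model of the "double blind" separation described above. It only
# illustrates which party sees what; it is not a real protocol.

class AgeProofIssuer:
    """Verifies identity (e.g., against an ID document) and issues a proof
    of age. Knows who the user is, but never learns which site they visit."""

    def issue(self, user_identity: str) -> dict:
        # Identity is checked here, then deliberately left out of the proof.
        return {"over_18": True}


class Transmitter:
    """Relays the certified proof to the site. May learn the destination
    site, but handles a proof that contains no identity information."""

    def relay(self, proof: dict, site: "Website") -> bool:
        return site.admit(proof)


class Website:
    """Learns only that the visitor meets the age requirement."""

    def admit(self, proof: dict) -> bool:
        return proof.get("over_18") is True


issuer, transmitter, site = AgeProofIssuer(), Transmitter(), Website()
proof = issuer.issue("alice@example.com")  # issuer sees identity, not the site
print(transmitter.relay(proof, site))      # site sees age status only -> True
```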

Lead: PRPA


Use of facial recognition (FR) for age verification

Speaking points

  • Facial-age estimation is an emerging technology that has not yet reached the maturity of other FR applications. It is generally used to estimate an age range (such as “13-18”) rather than an exact age.
  • Age estimation should not result in the creation of a unique and identifiable “faceprint”; however, the potential for creating such a biometric means it is generally considered an intrusive process.
  • In addition to privacy concerns, all FR technologies, including facial-age estimation, have the potential for bias. For example, the rate at which a face “ages” is influenced by a range of factors (e.g. genetics, environment, etc.), which could affect error rates or create disproportionate impacts on certain individuals.
  • It is important to weigh the intrusiveness and potential bias of facial-age estimation against its purported benefits, taking into account any risk mitigations in place.

Background

  • FR bias concerns have focused on disparate error rates based on demographics such as race and gender. Age estimation introduces additional sources of bias.
  • There is a lack of research into the effectiveness of algorithms in real-world settings: it is unknown how well they perform in uncontrolled environments with challenges such as sub-optimal lighting, spoofing attempts, and the use of makeup.
  • The lack of training data to improve accuracy or reduce potential bias is exacerbated by the sensitivity of collecting personal information from minors and children.
  • The US National Institute of Standards and Technology conducts “face analysis technology evaluations” to assess the accuracy of such algorithms. The results of its latest evaluation are expected in May or June 2024.
  • In Canada, FR is regulated by a patchwork of laws, except in Quebec, which has established a specific biometrics regime. Since FR uses personal information, its use may be subject to PIPEDA or the Privacy Act.

Lead: TAD


Use of VPNs to bypass requirements

Speaking points

  • Virtual private networks (VPNs) have become increasingly accessible and widespread in recent years.
  • They can keep individuals’ online activities private and secure by encrypting their data, protecting them from online trackers, and hiding their location.
  • They are also used by organizations around the world to allow employees to securely connect to a corporate network remotely.
  • However, this same technology could enable minors to readily bypass age-verification systems by establishing a connection in a country where no verification is required.
  • This underlines the need to consider the effectiveness of different age-assurance methods – in addition to their degree of intrusiveness – when weighing their potential benefits.

Background

  • VPNs operate by establishing a secure connection between a user’s device and a VPN provider’s server. A VPN encrypts internet traffic, masks the user’s real IP address, and routes traffic through a server in a location of the user’s choice before it connects to its destination (e.g., a website).
  • VPNs are widely available and easy to use, and online resources explaining how to evade restrictions are widespread (e.g., videos, forums, tutorials, etc.).
  • A 2020 study by a UK research firm (Young People, Pornography & Age-Verification) found that 23% of children surveyed were aware of potential methods to bypass age-restriction measures, including VPNs. Younger children (aged 11 to 13) were least likely to know about workarounds, but awareness increased with age (25% of 14-15 year-olds and 33% of 16-17 year-olds were aware).
  • Some major online content providers – such as Netflix – have been developing strategies to detect and prevent the use of VPNs to circumvent location-based content restrictions (with limited or uncertain effectiveness).

Lead: TAD


Encryption as an effective safeguard

Speaking points

  • Encryption is a fundamental safeguard that protects the confidentiality, integrity, and authenticity of personal data and digital identities in transit over the internet and at rest. It is a cornerstone of online privacy and security.
  • Effectiveness of encryption depends on several factors, including the chosen encryption algorithms, key length, key management schemes, and their implementation. The integrity of the underlying system where encryption is used is also paramount to ensuring confidentiality, integrity, and authenticity.
  • Encryption alone is not sufficient to protect an individual’s or a company’s data: it is not a substitute for a comprehensive data-protection strategy, with a variety of safeguards to prevent or mitigate the impacts of a security breach.
  • Encryption remains one of the best tools available to protect privacy online, but we must continue to look for new and innovative ways to do so.
  • All age-verification systems will entail different privacy and security risks based on how, where, and for how long user data is stored.

Background

  • Encryption can be used to protect data at rest (stored) and in transit (sent between user devices and servers), including images, photo IDs, biometric data, credit card numbers, social insurance numbers, and security and audit logs.
  • Other important safeguards include data minimization, advanced cryptographic techniques (such as zero-knowledge proofs that verify age without revealing additional information), and robust access controls. A minimal encryption sketch follows this list.
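
As a minimal illustration of authenticated encryption at rest, the sketch below uses the Fernet recipe from the widely used third-party Python cryptography package. This is a hypothetical example rather than a prescribed safeguard, and it deliberately leaves aside key management, which – as noted in the speaking points – is central to encryption’s effectiveness.

```python
# Minimal sketch of authenticated encryption for data at rest, using the
# Fernet recipe (AES-CBC plus an HMAC, so tampering is detected on
# decryption). Hypothetical example; key management is out of scope.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()  # in practice: generated and held in a key-management system
fernet = Fernet(key)

record = b"date_of_birth=2007-01-01"  # hypothetical sensitive field
ciphertext = fernet.encrypt(record)   # provides confidentiality and integrity

try:
    assert fernet.decrypt(ciphertext) == record
except InvalidToken:
    print("Ciphertext was tampered with or the wrong key was used.")
```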

Lead: TAD


International best practices

Speaking points

  • Best practices related to the design and use of age-assurance tools are still developing.
  • Data protection authorities in Spain, France, and the United Kingdom have all published important guidance on the topic. Common principles include:
    • that age assurance should not lead to online tracking or the disclosure or re-use of personal information; and
    • that the collection of personal information should be limited to what is necessary.
  • The Spanish and French data protection authorities have worked with researchers to develop proofs of concept for privacy-protective age-assurance systems.
  • We are also following ongoing standards development by groups such as the International Organization for Standardization (ISO), the British Standards Institution, and Canada’s Digital Governance Standards Institute.

Background

  • Referenced standards for age assurance:
    • ISO/IEC WD 27566 – Age assurance systems (in development);
    • British Standards Institution (BSI) PAS 1296:2018 – Online Age Checking – Code of Practice; and
    • Digital Governance Standards Institute – Technical Committee 18 – Age Verification.
  • Beyond the common principles noted above, the Spanish, French, and UK age-assurance guidelines do not share significant commonalities in their listed principles.

Lead: PRPA


International age-assurance working group

Speaking points

  • International collaboration on age assurance is essential given the challenge of developing principles, codes, and standards for websites and services that are accessible across multiple jurisdictions.
  • Data protection authorities around the world are still considering how best to approach online age assurance in light of its implications for privacy.
  • My Office is a member of an international age assurance working group chaired by the UK Information Commissioner’s Office. We have been participating in the group since July 2022.
  • Other members include data protection authorities from Australia, Belgium, Ireland, France, Germany, Spain, Portugal, Slovenia, Cyprus, Morocco, USA, Mexico, Singapore, and Japan.
  • The working group intends to publish a joint statement of principles for age assurance later this year.

Background

  • The International Age-Assurance Working Group meetings occur two to three times a year. The last working group meeting took place on May 2nd and the next meeting will take place later this summer.
  • The objectives of the working group are to
    • share information on approaches to age assurance among data protection authorities, including their accuracy and efficacy;
    • promote the harmonization of policy approaches to age assurance where possible; and
    • develop a joint statement on age assurance.
  • A firm publication date for the joint statement of principles for age assurance has not been set, but the ICO recently announced the intent to publish the principles at the Spring DPA conference (held May 14-16, 2024).

Lead: PRPA


Status of digital ID implementation in Canada

Speaking points

  • Much like age assurance, digital ID may also entail privacy risks, including the potential overcollection of personal information and the increased incidence of identity theft, fraud, and other harms.
  • Digital identity initiatives are being implemented across the country to help expand, simplify, and secure Canadians’ access to public and commercial services.
  • There is currently no single, government-issued digital ID available to all Canadians; British Columbia’s BCeID is the furthest developed government-issued digital ID amongst the provinces and territories.
  • In 2022, my provincial and territorial counterparts and I published a resolution on ensuring the right to privacy and transparency in Canada’s digital identity ecosystem to help guide the development of any digital ID initiatives in Canada.

Background

  • Many jurisdictions in Canada have issued a digital credential that allows access to online services, including Newfoundland and Labrador (MyGovNL), Nova Scotia (MyAccountNS), Ontario (ONe-key), Yukon (MyYukon), Quebec (clicSÉQUR), and the Northwest Territories (NTKey).
  • In addition to Alberta and BC, Quebec is actively pursuing a digital-ID system that would issue credentials (identity, photo, and address) to be stored in a digital wallet.
  • In other jurisdictions, work on digital ID systems has been paused or has not progressed in recent years.
  • However, many elements of a digital ID ecosystem are widely available. For instance, Canadians can authenticate themselves to the Canada Revenue Agency using digital credentials issued by BC or Alberta or through online banking credentials.
  • Canada’s federal, provincial, and territorial privacy commissioners jointly monitor developments in the digital ID space through the Digital Identity Working Group, which we chair.

Lead: PRPA


FPT Commissioner views on Digital ID

Speaking points

  • The development and implementation of a digital ID ecosystem is an opportunity to demonstrate how innovation and privacy protection can co-exist.
  • In September 2022, Canada’s federal, provincial and territorial privacy commissioners issued a joint resolution on ensuring the right to privacy and transparency in the digital ID ecosystem in Canada.
  • This resolution reiterates our commitment to collaborating with each other, governments, and other interested parties to ensure that digital ID is designed and implemented responsibly in Canada.
  • It also sets out a non-exhaustive list of desirable ecosystem properties, individual rights and remedies, and governance and oversight measures that should be considered.

Background

  • Selected measures from the joint resolution:
    • A privacy impact assessment should be conducted in the early design, development and update stages of a digital ID project;
    • Digital ID should not be used for services that could be offered on an anonymous basis;
    • Digital ID ecosystems must not allow tracking, tracing, or credential use for other purposes;
    • Individual participation should be voluntary; and
    • Governments should establish clear accountability mechanisms.
  • Following the release of this resolution, the FPT Commissioners convened a Digital Identity Working Group to further collaboration and information sharing.

Lead: PRPA


OPC actions to protect and promote children’s privacy

Speaking points

  • Ensuring that children’s privacy is protected and that young people understand and are able to exercise their privacy rights is one of my key strategic priorities.
  • My Office’s strategic plan details specific areas of focus and various initiatives we will be undertaking over the next three years to help advance this priority.
  • Last year, my counterparts in Alberta, Quebec, and British Columbia and I launched an investigation into TikTok to determine whether its practices comply with privacy legislation, with a particular focus on its privacy practices as they relate to younger users.
  • As well, last October, my provincial and territorial counterparts and I issued a joint resolution calling on governments and organizations to put the best interests of young people first and recommending the adoption of specific practices to this end.
  • My office is planning to consult on guidance on age assurance and on other children’s privacy issues this year.

Background

  • In line with the OPC’s strategic plan, we will further develop our expertise by conducting research and outreach with young people to identify privacy harms and better understand their perceptions of their privacy rights.
  • Children’s privacy was a primary theme for our office’s 2024-2025 call for proposals for the contributions program. This will fund important research to further our understanding of young people’s privacy over the coming year.
  • The OPC sits on the Global Privacy Assembly’s Digital Education Working Group and on an international age-assurance working group, where we collaborate and share best practices with our international counterparts.

Lead: PRPA


Stakeholder views on S-210

Speaking points

  • As many stakeholders have pointed out, age verification in the online environment can have significant implications for privacy and other fundamental human rights.
  • To some extent, Bill S-210 recognizes this by setting out different criteria for the Governor in Council to consider when making regulations, including whether the age-verification method maintains user privacy.
  • However, to further mitigate privacy risks, I agree that these criteria can and should be strengthened, including by adding the principles of necessity and proportionality.
  • I also share concerns raised by some stakeholders that the scope of the bill is too broad, which increases the likelihood and potential impact of any unintended consequences on privacy and other human rights.

Background

  • In written briefs or appearances before the committee, some stakeholders have expressed support for Bill S-210 (notably certain faith-based organizations, women’s rights groups, and parents’ associations). However, many others, including academics, open-internet advocates, and other civil society representatives, have been highly critical of the bill as drafted.
  • Criticism has generally focused on S-210’s broad scope; reliance on website blocking; issues related to age verification (including the accuracy, effectiveness, and privacy impacts of different methods); and its risk-mitigation safeguards.
  • During clause-by-clause review in the Other Place, S-210 was amended to include several criteria that the GIC must consider when prescribing age-verification methods through regulation with a view to containing their privacy risks. Some stakeholders have argued that these criteria do not go far enough.
  • Since S-210 would effectively require a wide range of services to implement age verification in order to rely on the defence that they believed their users were adults, some stakeholders have also argued that it could discourage Canadians from seeking, receiving, and imparting certain forms of legal, constitutionally protected speech out of concern for their privacy.

Lead: PRPA


Bill C-63 (Online Harms Act)

Speaking points

  • The government tabled Bill C-63, the Online Harms Act, with the stated goals of holding social media platforms accountable for addressing harmful content on their platforms and of creating a safer online space that protects all people in Canada, especially children.
  • I have made championing children’s privacy rights a strategic priority of my Office, as children need to be able to navigate online spaces securely.
  • This priority relates to areas of C-63 such as the duty to protect children and the design features for the protection of children – for example, age-appropriate design – that services may be required to integrate.
  • I look forward to having the opportunity to discuss the privacy implications of Bill C-63 should I be called to comment on the bill before Parliament.

Background

  • While both Bill C-63 and Bill S-210 focus on preventing harm, S-210 targets organizations that make sexually explicit material available to minors, whereas C-63 is broader, focusing on early mitigation of harms, and differs in the mechanisms it uses to hold organizations accountable.
  • C-63 would help prevent the spread of intimate images shared without consent and content that sexually victimizes a child or revictimizes a survivor.
  • Bill C-63 legislates a duty to protect children. As part of this duty, s. 65 states that “an operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age appropriate design, that are provided for by regulations.”
    • S. 140(o) provides that regulations respecting design features may include privacy settings for children.
  • S-210 makes it an offence for organizations to make sexually explicit material available to young persons on the Internet.

Lead: PRPA


Aylo investigation

Speaking points

  • In February 2024, my office concluded its investigation into a complaint against Aylo (formerly MindGeek), which operates Pornhub and other popular pornographic websites.
  • The complaint was from a woman whose ex-partner had uploaded an intimate video and images of her, along with other identifying information, to Pornhub and other Aylo-owned websites without her knowledge or consent.
  • Our key finding was that Aylo failed to ensure that it had the complainant’s consent before allowing the upload and disclosure of her images.
  • We made a number of recommendations, including that Aylo stop sharing user-created intimate content until it implements measures to obtain express, meaningful consent directly from each individual who appears in uploaded content.
  • We are currently in discussions with Aylo with a view to securing a commitment to implement measures that will comply with our recommendations and Canadian privacy law.

Background

  • In May 2023, when the report of findings from the Aylo investigation was nearing completion, Aylo launched a judicial review application with the Federal Court seeking to challenge our findings and recommendations and to prevent us from finalizing and releasing the report.
    • Aylo was ultimately unsuccessful and we were able to issue our final report.
  • In March 2024, after we released our report, Aylo published a blog post in which it indicated that it made changes to its “co-performer” consent requirements in January 2024.
  • We are currently in discussions with Aylo to determine if these changes comply with our recommendations and Canadian privacy law.

Lead: Compliance


TikTok investigation

Speaking points

  • In February 2023, my office, along with my counterparts in Québec, British Columbia, and Alberta, launched a joint investigation into TikTok.
  • We are examining whether TikTok’s practices comply with Canadian privacy legislation and, in particular, whether the company is obtaining valid and meaningful consent for the collection, use, and disclosure of personal information.
  • Given the importance of protecting children’s privacy, which remains one of my office’s strategic priorities, the joint investigation has focused in particular on TikTok’s privacy practices as they relate to younger users.
  • We are aiming to release the results of the investigation in the coming months.

Background

  • The TikTok investigation was initiated in the wake of now-settled class-action lawsuits in the United States and Canada, as well as numerous media reports related to the company’s collection, use, and disclosure of personal information.
  • Subsection 11(2) of PIPEDA states that “[i]f the Commissioner is satisfied that there are reasonable grounds to investigate a matter under [Part I of PIPEDA], the Commissioner may initiate a complaint in respect of the matter.”
  • Through collaboration with provincial counterparts, we are able to leverage our limited resources and distinct capabilities and share best practices and comparative strengths to more effectively and efficiently enforce privacy laws.
  • Children's information is particularly sensitive, requires special consideration, and demands even greater privacy safeguards. If it is not realistic to expect adults to understand and be accountable for complex privacy consent forms and rules, it is simply unacceptable to put this burden on children.

Lead: Compliance
