Issue Sheets on Social Media and Foreign Entities

Appearance before Standing Committee on Access to Information, Privacy, and Ethics (ETHI) of October 25, 2023


TikTok Investigation

Key Messages

  • My Office, along with the privacy protection authorities for Quebec, British Columbia, and Alberta, is continuing to make good progress on our joint investigation into TikTok, and we aim to conclude the matter by March 2024.
  • While we are continuing to examine whether valid and meaningful consent is being obtained for the collection, use, and disclosure of personal information by TikTok, we are now also considering whether TikTok’s practices associated with personal information of children and youth are appropriate.
  • This is of particular importance given that a notable proportion of TikTok users are younger users. Protecting children’s privacy is a priority of my Office.

Background

  • Through collaboration with our provincial counterparts, we are able to leverage our limited resources and distinct capabilities and share best practices and comparative strengths to more effectively and efficiently enforce privacy laws across jurisdictions.
  • (Redacted)
  • My Office has taken note of the recent findings of the Irish Data Protection Commission which fined TikTok €345 million, and the UK’s Information Commissioner’s Office which fined TikTok £12.7 million, both of which related to children’s privacy.
  • Children’s information is particularly sensitive, requires special consideration, and warrants even greater privacy safeguards. If it is not realistic to expect adults to understand and be accountable for complex privacy consent forms and rules, it is simply unacceptable to put this burden on children.

Prepared by: Compliance


TikTok – Findings from other DPA investigations

Key Messages

  • In 2023, the UK and Irish DPAs issued findings and fines against TikTok, focusing in particular on children’s privacy. The French DPA also issued a fine for violations of cookie-consent rules.
  • Other regulatory action included a 2019 settlement with the US FTC over alleged violations of the Children’s Online Privacy Protection Act (COPPA) and, in 2020, a fine from South Korea’s telecommunications regulator for violations related to children’s privacy and overseas data transfers.

Background

  • In April 2023, the UK ICO found that TikTok breached the UK GDPR by: (1) handling data of children too young to consent; (2) failing to provide users with easy-to-understand information; and (3) not processing user data lawfully, fairly, or transparently. A fine of £12.7 million was imposed.
  • In September 2023, Ireland’s Data Protection Commission (DPC) found that TikTok breached the EU GDPR by: (1) making child profiles public by default; (2) allowing the “Family Pairing” of accounts by (unverified) non-parents/guardians; (3) not being transparent with child users; and (4) employing “dark patterns” that nudge users towards more privacy-intrusive options. It gave TikTok three months to comply with the law and levied a €345 million fine.
  • In July 2021, the Dutch DPA announced that TikTok had breached the EU GDPR by failing to offer a Dutch-language privacy statement to Dutch users (including children), and imposed a €750,000 fine. The Dutch DPA handed the investigation over to its Irish counterpart after TikTok’s headquarters moved to Ireland.
  • In February 2019, the FTC announced that TikTok (then known as Musical.ly) agreed to pay $5.7 million to settle alleged violations of the COPPA.
  • In July 2020, South Korea’s Communications Commission announced that TikTok had violated local telecommunications laws by mishandling personal information of its users and issued a ₩186 million fine.
  • In January 2023, France’s CNIL announced that TikTok had violated the French Data Protection Act through its cookie consent flows that made it easier for users to accept than opt-out of tracking and issued a €5 million fine.

Prepared by: Legal


TikTok ban by the Government of Canada

Key Messages

  • Five days after the OPC announced its investigation into TikTok on February 23, 2023, the Government of Canada banned the use of TikTok on all GoC devices, citing privacy and security concerns.
  • We were not specifically consulted on the ban of TikTok on government devices; it was a decision by Canada’s Chief Information Officer, who determined that the application poses an unacceptable level of risk to privacy and security.
  • Under the policy, TikTok must be removed from government devices and the application blocked so that it cannot be installed going forward.

Background

  • Our Office falls under the government’s Policy on Service and Digital and is thus subject to the ban. Consequently, we have removed and blocked the TikTok application from our devices as required.
  • The app has been banned for use on government devices by other jurisdictions, including all Canadian provinces, Britain, France, Australia, the United States federal government, and many states.
  • The government’s Policy on Privacy Protection requires institutions to notify our office of any planned initiatives that may have an impact on the privacy of Canadians. We consult regularly with TBS on issues related to privacy policies but ultimately, as with all initiatives, policies and programs, the institution makes the final decision on what course of action to take.
  • Treasury Board Secretariat and the Office of the Chief Information Officer consult regularly with our office on overall implications of digital services; we have monthly working level meetings and I also meet with CIO Luelo regularly. This specific matter was not discussed as part of these meetings.

Prepared by: GA


TikTok Investigation – Class Actions

Key Messages

  • The OPC’s February 2023 announcement of the TikTok investigation noted that it “was initiated in the wake of now settled, class action lawsuits in the United States and Canada”.
  • This referred to two cases.
    • The US class action alleged that TikTok surreptitiously harvested and profited from collecting users’ personal information, including biometric data.
    • The Canadian class action alleged that TikTok breached users’ privacy, including with respect to consent for minor users.

Background

  • In the US, several federal lawsuits against TikTok were consolidated into a class action before the US District Court for the Northern District of Illinois in 2020. The lawsuit had a nation-wide class and an Illinois state subclass.
  • The nation-wide class alleged that without users’ consent, TikTok collected personal information (e.g. biometric data), disclosed video-related personal information to Facebook and Google and violated consumer protection laws.
    • The Illinois subclass’s allegations included that TikTok harvested users’ biometric information without consent and sold it to third parties.
  • A settlement was approved in March 2022, with TikTok agreeing to, among other things, pay $92 million USD to class members.
  • In Canada, two class action lawsuits against TikTok were certified together for settlement purposes by the Supreme Court of British Columbia. The class members were TikTok users in Canada, with a subclass of minor users.
  • The class members’ allegations included that TikTok collected, used, and sold to third parties, users’ personal information without their consent. The subclass also alleged that TikTok did not obtain parental consent for minors.
  • A settlement was approved in February 2022, with TikTok agreeing to pay $2 million CAD in donations to a BC legal foundation and children’s charities.
  • TikTok did not admit to any wrongdoing in either settlement agreement.

Prepared by: Legal


Clearview Investigation

Key Messages

  • US company Clearview AI creates and maintains a database of over 20 billion facial images scraped from the Internet. Account holders, primarily law enforcement, can search this database for matching faces using facial recognition technology.
  • In 2020, the OPC and counterparts in Quebec, BC and Alberta investigated Clearview AI and found that it failed to obtain consent for its scraping of data from the Internet, and that its purposes were inappropriate, in that they effectively placed individuals under a state of constant surveillance.
  • My counterparts issued orders, a power not available to my Office, compelling Clearview to cease collection and to delete Canadians’ information. These matters remain before the courts.
  • Subsequent to the investigation, in 2021, the OPC tabled a Special Report to Parliament on the RCMP’s use of Clearview AI. We found that the RCMP contravened the Privacy Act, as an institution cannot collect personal information from a third party who collected the information unlawfully, a principle which has since been incorporated in TBS guidance to federal institutions.

Background

  • The Clearview AI tool functions as follows. Clearview:
    • “scrapes” images of faces and associated data from publicly accessible online sources, and stores that information in its database;
    • creates biometric identifiers (numerical representations) for each image;
    • allows users to upload an image, which is then assessed against those biometric identifiers and matched to images in its database; and
    • provides a list of results, containing all matching images and metadata. If a user clicks on a result, they are directed to the original image’s source.
  • Post-investigation, the RCMP implemented OPC recommendations, creating the National Technology Onboarding Program (NTOP) to review new technologies through a privacy-sensitive lens before onboarding them.
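The four-step matching flow described above can be sketched in simplified form. This is an illustrative toy, not Clearview’s actual system: `embed` here merely stands in for a trained face-embedding model, and all names are hypothetical.

```python
import math
from typing import Optional

def embed(image_bytes: bytes, dim: int = 8) -> list:
    # Toy stand-in for a face-embedding model: derives a fixed-length
    # numerical representation (a "biometric identifier") from image data.
    # A real system would use a trained neural network here.
    vec = [0.0] * dim
    for i, b in enumerate(image_bytes):
        vec[i % dim] += b
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list, b: list) -> float:
    # Similarity between two unit-normalized identifiers.
    return sum(x * y for x, y in zip(a, b))

class FaceIndex:
    """Database of scraped images: one identifier per image, stored
    alongside the source URL (metadata) it was scraped from."""

    def __init__(self) -> None:
        self.entries = []  # list of (embedding, source_url) pairs

    def add(self, image_bytes: bytes, source_url: str) -> None:
        # Scrape step + identifier step: store the representation and source.
        self.entries.append((embed(image_bytes), source_url))

    def search(self, probe: bytes) -> Optional[str]:
        # Match step: assess an uploaded image against stored identifiers
        # and return the source URL of the closest match.
        if not self.entries:
            return None
        q = embed(probe)
        return max(self.entries, key=lambda e: cosine(q, e[0]))[1]
```

At Clearview’s claimed scale (over 20 billion images), the exact scan in `search` would be replaced by approximate nearest-neighbour indexing, but the collect–represent–match structure is the same.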

Prepared by: Compliance


Facebook/Cambridge Analytica

Key Messages

  • The issues at the heart of this matter are critically important for the fundamental privacy rights of Canadians and their ability to participate with trust in our digital society.
  • The OPC initiated its application under section 15 of PIPEDA to protect Canadians’ privacy.
  • The OPC is appealing the Federal Court’s decision because the matter raises important questions with respect to the interpretation and application of privacy law that will benefit from clarification by the Federal Court of Appeal.
  • Given this is before the Courts, we cannot comment further.

Background

  • In March 2018, the OPC received a complaint about Facebook amid media reports that Cambridge Analytica had accessed the personal information of Facebook users without their consent via a third-party application (the TYDL App).
  • The OPC and the Office of the Information and Privacy Commissioner for British Columbia jointly investigated and found that Facebook had not obtained meaningful consent from its users before disclosing their personal information and that it had not implemented adequate safeguards.
  • After Facebook declined to implement the OPC’s recommendations, the OPC filed an application with the Federal Court under s. 15 of PIPEDA seeking, inter alia, an order requiring Facebook to correct its practices to comply with PIPEDA.
  • In April 2020, Facebook filed an application seeking judicial review of the OPC’s investigation process and the resulting Report of Findings.
  • On April 13, 2023, the Federal Court dismissed Facebook’s application for judicial review. The Court found that the OPC had not breached procedural fairness and that the investigation was not out of time. This decision has not been appealed.
  • That same date, the Court also dismissed the Commissioner’s s. 15 application finding that there was insufficient evidence to conclude that Facebook had failed to comply with PIPEDA. The OPC appealed this decision on May 12.
  • The parties have filed their respective factums; a date will be set for a hearing.

Prepared by: Legal


Privacy concerns with social media

Key Messages

  • Social media prompts individuals to share personal information online. Users can be careful about the information they share and adjust their privacy settings, but social networking sites must also meet their legal obligations under privacy laws.
  • Information shared on social media may be accessible to many parties, who may collect, use or scrape it for a myriad of purposes including profiling, tracking, and targeted advertising.
  • Individuals should be able to use social media while maintaining a reasonable expectation of privacy.
  • While considering the issue, it is important to remember that young people are impacted by social media differently than adults, are at greater risk of being affected by privacy-related issues, and therefore require special protections.

Background

  • In October 2023, our Office and FPT counterparts published a resolution on putting the best interests of young people at the forefront, which in part sets guidelines for organizations to ensure young people are not tracked or profiled without justification, knowledge, or consent.
  • In August 2023, our Office published a joint statement on data scraping, which outlines the privacy risks, sets out how social media companies can safeguard personal information, and suggests ways users can protect themselves.
  • The Privacy Act only permits federal entities to collect information directly related to operating a program or activity, whether or not the information is publicly available. PIPEDA permits organizations to collect, use or disclose PI without knowledge or consent if it is publicly available and specified by the regulations. Personal information that is posted to social media sites is not specified by the current regulations.

Prepared by: PRPA


PIPEDA’s applicability to personal information stored abroad

Key Messages

  • PIPEDA can still apply to personal information stored on servers located outside of Canada in some circumstances.
  • PIPEDA applies to transborder data flows of personal information entering/leaving Canada or between provinces.
  • A real and substantial connection to Canada is required vis-à-vis an organization’s collection, use, or disclosure of personal information for PIPEDA to apply.

Background

  • When there is an ongoing investigation, PIPEDA’s confidentiality provisions apply and we may not be in a position to publicly reveal specific jurisdictional aspects of that matter for the duration of that investigation.
  • Before asserting jurisdiction over a matter, the OPC looks for indicators establishing a “real and substantial connection to Canada”, drawing on a non-exhaustive list of elements developed in the jurisprudence over time.
  • Relevant court decisions on “real and substantial connection to Canada” applicable to PIPEDA: (1) A.T. v. Globe24h.com, 2017 FC 114 at paras 50-64; (2) Lawson v. Accusearch Inc. (F.C.), 2007 FC 125 at paras 38-51; (3) Facebook, Inc. v. Canada (Privacy Commissioner), 2023 FC 534 at para 84.
  • A non-exhaustive list of factors set out in A.T. v. Globe24h to establish a “real and substantial connection to Canada”: (1) the location of the target audience of the website, (2) the source of the content on the website, (3) the location of the website operator, and (4) the location of the host server.
  • The physical presence of an organization in Canada is not required for a “real and substantial connection”, per the Federal Court in A.T. v. Globe24h.
  • PI collected, used (including stored), or disclosed within the borders of provinces whose privacy legislation has been deemed substantially similar to Part 1 of PIPEDA (i.e., BC, Alberta, and Quebec) is subject to those provincial privacy laws; PIPEDA applies in the rest of Canada and wherever there are transborder data flows between provinces or to/from another country.

Prepared by: Legal


Data Harvesting

Key Messages

  • Data harvesting is another term for “data scraping” – the automated extraction of data from the web.
  • Scraped personal information can be exploited for various purposes, such as monetization through re-use on websites, sale to malicious actors or intelligence-gathering.
  • Social media companies have obligations to protect personal information on their platforms.
  • Personal information that is publicly available is still subject to privacy laws in most jurisdictions.
  • Mass data scraping incidents that harvest personal information can constitute reportable privacy breaches.

Background

  • The privacy risks of data harvesting / data scraping include: targeted cyberattacks; identity fraud; monitoring, profiling and surveilling individuals; unauthorized political or intelligence-gathering purposes; and unwanted direct marketing or spam.
  • Social media companies should implement multi-layered technical and procedural controls to mitigate the risks.
  • These controls include: “rate limiting” the number of visits per hour or day by one account; taking steps to detect “bots,” such as by using CAPTCHAs, and blocking IP addresses; and taking appropriate legal action such as sending “cease and desist” letters.
  • Social media companies should continuously monitor for, and respond with agility to, new security risks and threats from malicious actors.
  • However, it is important to note that no safeguards are 100% effective and individuals should therefore be mindful that the personal information they share online may be at risk.
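The “rate limiting” control noted above can be illustrated with a minimal sketch: a fixed-window counter per account. Class and parameter names here are hypothetical; production systems typically use sliding windows or token buckets enforced at the network edge.

```python
import time
from collections import defaultdict
from typing import Optional

class RateLimiter:
    """Fixed-window rate limiter: each account may make at most `limit`
    requests per `window` seconds; requests over that budget are refused,
    which slows bulk scraping through any single account."""

    def __init__(self, limit: int = 100, window: float = 3600.0) -> None:
        self.limit = limit
        self.window = window
        self.counts = defaultdict(int)  # (account, window number) -> hits

    def allow(self, account: str, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        key = (account, int(now // self.window))  # current window for account
        if self.counts[key] >= self.limit:
            return False  # budget exhausted for this window: refuse
        self.counts[key] += 1
        return True
```

A refusal from `allow` could then trigger the complementary controls listed above, such as a CAPTCHA challenge or an IP block.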

Prepared by: TA


Data Harvesting in OPC Investigations

Key Messages

  • Data harvesting tools enable collection of vast amounts of personal information from publicly accessible online sources. Much of this information will not be “publicly available” under PIPEDA and is therefore not exempt from consent requirements.
  • At the same time, we continue to see growth in large-scale data harvesting, fuelling uses such as facial recognition, as in our Clearview AI investigation, and the training of generative AI tools such as OpenAI’s ChatGPT, which my Office is currently investigating jointly with Alberta, British Columbia and Quebec.
  • Such cases emphasize the need for companies to avoid indiscriminate collection and proactively determine what publicly accessible information can be collected lawfully.

Background

  • Clearview claims to maintain a facial recognition databank of over 20 billion scraped images, up from 3 billion at the time of our findings in February 2021. Our joint investigation found this scraping to be inappropriate and, in any event, not exempt from consent requirements.
  • In an earlier case, we found that Profile Technologies could not rely on the publicly available exemption in relation to the information of millions of Facebook users that it scraped to populate its own social media platform.
  • OpenAI publicly states that it relies on information including “publicly available information that is freely and openly available on the Internet” to train the models that power ChatGPT. Our investigation will seek to determine whether this practice is compliant with PIPEDA.
  • In our 2016 Compufinder investigation, we found that, while the company had harvested electronic addresses before CASL came into force, it continued to use those addresses afterwards without the requisite consent required under subsection 7.1(2) of PIPEDA, a provision introduced by CASL.

Prepared by: Compliance


Joint Statement on Data Scraping

Key Messages

  • The OPC and 11 other global privacy authorities issued a joint statement in August 2023 detailing our expectations vis-à-vis protection against unlawful data scraping. We also sent the statement to 6 major social media companies for comment.
  • We are pleased that 5 of those companies have responded to our Offices explaining measures that they have implemented to comply with our expectations.
  • Signatories discussed the responses earlier this week at a meeting during the GPA annual conference. We will be following up with the companies to better understand their practices and inform lessons learned from this initiative.

Background

  • The statement has its roots in two Closed Enforcement Sessions organized by the IEWG to discuss global privacy issues resulting from mass scraping. Certain members of the IEWG decided to pursue this joint initiative.
  • Ultimately, 12 authorities from 6 continents signed/issued the joint statement on August 24, 2023: OAIC Australia, OPC Canada, UK-ICO, PCPD Hong Kong, Norway (Datatilsynet), Swiss FDPIC, SIC Colombia, OPC New Zealand, JOIC Jersey, CNDP Morocco, AAPI Argentina and INAI Mexico.
  • On behalf of the signatories, OPC Canada also sent a letter to: Alphabet Inc. (YouTube), ByteDance Ltd (TikTok), Meta Platforms, Inc. (Instagram, Facebook and Threads), Microsoft Corporation (LinkedIn), Sina Corp. (Weibo), and X Corp. (X, previously Twitter) asking them to explain how their respective companies comply, or intend to comply with the expectations outlined in the statement. 5 companies out of 6 responded (Redacted)
  • In a meeting on the margins of the GPA Conference on October 17, 2023, the signatories discussed the SMCs’ responses and decided to organize follow-up meetings with each to get more details on outstanding questions. The signatories then plan to issue a final public statement on the main takeaways from the engagement.

Prepared by: Compliance
