Study of the Federal government’s use of technological tools capable of extracting personal data from mobile devices and computers: Issue Sheets

Spyware vs. Other Digital Tools (General)

Key Messages

  • Given the highly intrusive nature of spyware, the OPC expects federal departments to adhere to the TBS Policy on Privacy Protection and engage with our Office prior to deployment. This could be either through a formal consultation or in the context of a PIA.
  • Absent such engagement, the OPC can neither meaningfully review the assessments of privacy risks inherent in various tools or software, nor provide appropriate guidance.
  • In consultation, the Royal Canadian Mounted Police (RCMP) confirmed their use of Mobile Digital Forensics Technology (MDFT). The OPC advised that new technology use could trigger the requirement to conduct a PIA on the relevant program.
  • The Canada Border Services Agency (CBSA) has also acknowledged its use of this technology but declined to provide us with details on specific tools.

Background

  • Policy on Privacy Protection: Section 4.2.2 states that the OPC must be notified of any planned initiatives that may have an impact on the privacy of Canadians. This notification is to take place at a sufficiently early stage to permit the OPC to review and discuss the issues involved.
  • Engagement with OPC: In response to a media request, Government Advisory requested consultations with the RCMP and the CBSA in January and February 2021 about their use of MDFT. No institution has proactively contacted the OPC to discuss the use of this type of technology.

Prepared by: TA


Glossary and definitions

Key Messages

  • It is useful to distinguish between different investigative tools and their varying privacy implications.
  • Malware and spyware are deployed with criminal intent, whereas investigative tools can be misused but are generally not intended to cause harm.

Background

  • Malware: Malware is a category of software designed to conduct malicious activities on a device. For instance, cyber criminals use malware to infect systems and devices and steal sensitive information (e.g., identification credentials, financial details, trade secrets) stored on mobile devices or computers. Malware may even take control of mobile devices or computer systems to erase or alter data. Common forms of malware include viruses, ransomware, Trojans, and spyware used to intercept communications or to copy and redirect data.
  • Spyware: Spyware is a subset of malware specifically used to track a user’s digital actions and information without the user’s knowledge or consent. Delivery methods range from a “free” download to remote installation without consent through the exploitation of vulnerabilities on the target device. Spyware is difficult to remove and can infect mobile devices or computers.
  • Digital forensic tools: Digital forensics refers to the tools, software and techniques used to access, extract, analyze, and disclose information from digital devices. Besides preserving evidence, digital forensic tools permit structured investigation by collecting, identifying, and validating data for the purpose of reconstructing events.
  • ODITs: On-Device Investigative Tools (ODITs) are specialized software known to be used by law enforcement agencies (e.g., the RCMP) to covertly gather evidence on personal devices following judicial authorization. ODITs can be installed physically or remotely on a device and often have the same technical capabilities as spyware.

Prepared by: TA


Use of Digital Tools

Key Messages

  • The OPC’s Technology Analysis Directorate (TAD) uses digital forensic tools to support the OPC’s legislative mandate. Notably, these tools are used in support of Compliance activities, such as breach investigations, and to validate information and statements obtained during subject interviews.
  • TAD does not use digital forensic tools to render decisions about individuals.
  • Strong controls are in place to limit access to any information that is collected, processed and stored in relation to the use of digital forensic tools.
  • The OPC has not used digital forensic tools in support of administrative investigations.

Background

  • The OPC has leveraged digital forensic tools since at least 2018 to support Compliance-based investigations.
  • Common digital forensic tools that are used by the OPC include (redacted).
  • To ensure the integrity of the evidence collected during investigations and to limit access to sensitive information (such as private information), TAD:
    • Stores collected evidence in an RCMP-approved secure cabinet;
    • Maintains strict physical access controls to the room where collected evidence is stored and processed (the TAD Lab);
    • Maintains strict logical access controls to systems used to process and analyze collected evidence; and,
    • Minimizes the amount of data contained in its reports.
  • It is anticipated that, should Bill C-27 become law, the OPC will see an increase in the use of digital forensic tools to support its Compliance activities.

Prepared by: TAD


Privacy Act Obligations / Privacy Act Reform

Key Messages

  • To be compliant with their obligations under section 4 of the Privacy Act, federal government institutions must ensure they deploy digital forensic tools in direct support of an operating program or activity, and in a manner that respects the law.
  • The use of sophisticated collection technology by government institutions would be better governed if the Privacy Act were to explicitly require institutions to consider the principles of necessity and proportionality prior to their deployment.
  • Further, government institutions should be legislatively required to conduct Privacy Impact Assessments in appropriate circumstances, such as when new or substantially modified programs or activities will have an impact on overall privacy.

Background

  • Program legislation: Operating programs or activities are often grounded in program or enabling legislation. In media reports, several government institutions pointed to such legislation as authority for their use of digital forensic tools (e.g., the Department of Fisheries and Oceans referenced the Fisheries Act).
  • Necessity and Proportionality: Although not a legal requirement under the current law, the OPC has long advocated that the collection of personal information by government institutions be governed by a “necessity and proportionality” standard, a position taken up by ETHI in 2016 in its report Protecting the Privacy of Canadians: Review of the Privacy Act.
    • Justice Canada’s November 2020 discussion paper on modernizing the Privacy Act goes some way towards codifying this standard by introducing a “reasonably required” threshold, but as outlined in the OPC’s public submissions, more work needs to be done.
  • Privacy Impact Assessments (PIAs): While TBS policy requires government institutions to conduct PIAs in a range of circumstances, the OPC has consistently called for PIAs to be made a legislative requirement under the Privacy Act.
    • This was proposed by the Department of Justice in the above-noted discussion paper and was ETHI’s first recommendation in its November 2022 report on the RCMP’s use of ODITs.

Prepared by: Legal


Current PIA Requirements

Key Messages

  • The obligation for federal departments to develop PIAs is currently a policy requirement under s. 6.3.1 of the TBS Directive on Privacy Impact Assessment.
  • The aim of the Directive is to ensure that privacy implications are identified, assessed and resolved before a new or substantially modified program or activity involving personal information is implemented.
  • The OPC has recommended a legislative requirement for PIAs to be undertaken as part of Privacy Act reform.

Background

  • The TBS Directive requires, at a minimum, the completion of a core PIA that identifies the levels and categories of privacy risk and the personal information elements and data flows involved, ensures compliance with sections 4 to 8 of the Act, and documents the conclusions and recommendations drawn from the privacy analysis.
  • The Directive requires federal departments to submit the final, approved versions of PIAs to the OPC and TBS, along with any further documentation we might ask for. Federal departments are responsible for deciding whether and how to undertake PIAs.
  • We are aware of many instances where programs are launched before PIAs are completed (e.g. the RCMP CAIT program and recently the race-based data collection pilot).
  • The Directive also contains a provision requiring a PIA for a significant modification to existing programs. There are many longstanding departmental programs that predate the PIA requirement, which came into force in 2002.
  • In many cases, the introduction of a new technological solution to an existing program can represent a significant modification. This provides an opportunity for an institution either to conduct a whole-of-program PIA, assessing the privacy risks of programs that have not previously been assessed, or to update an existing PIA.

Prepared by: GA


Role of the OPC in Advising Government

Key Messages

  • The Government Advisory Directorate provides non-binding advice to federal public sector institutions on specific programs and activities involving personal information primarily by reviewing Privacy Impact Assessments (PIAs).
  • GA also offers voluntary consultation services to help departments identify and mitigate risks early in program development. We have seen benefits for institutions that participate.
  • We engage in regular contact with ATIP Offices from most institutions, but we can only provide advice and recommendations on the programs about which we are notified and based on the information that we are provided.

Background

  • The TBS Directive on Privacy Impact Assessment requires PIAs be undertaken for programs or activities that use personal information as part of a decision-making process that directly affects the individual. Departments must send completed PIAs to the OPC, but we do not approve or endorse them.
  • Our consultations and PIA reviews are not formal audits; our advice and recommendations are based on the information the institution chooses to share. It is the institution’s responsibility to provide an accurate representation of its initiatives.
  • TBS is responsible for monitoring compliance with the Directive. The OPC has the power to review compliance with sections 4 to 8 of the Act and could audit and report to Parliament on departmental PIA processes, but this would be through a separate process under the Compliance Sector.
  • GA triages and reviews PIAs on the basis of risk factors, such as the use of sensitive or large amounts of personal information, impacts on vulnerable populations, or the use of novel technologies.
  • The Directive requires that PIAs be provided to the OPC and TBS but it does not give timelines for this step. The OPC has no power to compel institutions to complete a PIA, nor do we have powers to sanction non-compliance.

Prepared by: GA


OPC Views – Outreach and Responses to Date

Key Messages

  • After media reports, Government Advisory (GA) sent a request for information to thirteen government institutions inquiring about their use of technological tools capable of covertly extracting personal data from mobile devices.
  • Most of the programs that institutions described in their responses clearly meet the threshold to require a Privacy Impact Assessment (PIA) – they collect personal information to make administrative decisions about individuals.
  • Many Access to Information and Privacy (ATIP) teams appear to have been unaware that their institutions used these tools and needed to engage internally to obtain this information.
  • Generally, there is a lack of awareness and motivation to conduct full PIAs across program areas, perhaps because they are currently not a legal requirement. Despite this, many of the responses also indicate that institutions have imposed internal limits and policies on their use of these tools.

Background

  • Response to Follow-Up: GA has received responses to our request for information from 11 of the 13 institutions listed in the ETHI motion.
  • GA can confirm that of the institutions that responded, only three have completed a formal PIA for OPC review as of January 2024.
  • Eight responding institutions noted that they have begun work on a new PIA or are contemplating updating an existing PIA to cover use of the new technology (under a pre-existing program). Institutions undertaking new PIAs or updating existing PIAs indicated that their programs are investigative in nature and use personal information.
  • One institution noted that while it had procured the technology in question, it had never used it.

Prepared by: GA


OPC Involvement in RCMP CAIT Program

Key Messages

  • The OPC learned about the Covert Access and Intercept Team (CAIT) Program and their use of On-Device Investigative Tools (ODITs) from a media request received on June 27, 2022.
  • Despite TBS requirements that PIAs be completed prior to program implementation, and despite the regular meetings my office has with the RCMP, the OPC neither received a PIA on the program nor was consulted on it by the RCMP.
  • At our request, the RCMP provided us with a one-day demonstration of the use of ODITs in the fall of 2022.
  • We eventually received a PIA on September 12, 2023, and provided advice and recommendations in December 2023. Among our recommendations, we noted that the PIA was conducted at a very high level, which made it difficult to assess privacy risks.

Background

  • CAIT was not the only RCMP program we have learned about through the media. For instance, in 2020 we learned about the RCMP’s use of Clearview AI’s facial recognition technology.
  • S. 6.3.1 of the TBS Directive on PIA requires institutions to develop PIAs for new or substantially modified programs where personal information is used as part of a decision-making process that directly affects the individual.
  • Based on our understanding of ODITs, we would expect an institution to consult with our office and submit a PIA in advance of using them to fulfil a program’s mandate, allowing us sufficient time to review and provide recommendations.
  • The OPC has not received a complaint or initiated an investigation into the CAIT program’s use of ODITs at this time. Should information come to light suggesting a potential contravention of the Privacy Act, the OPC could launch an investigation based on the identified risks to Canadians.

Prepared by: GA


Statutory Authorities for RCMP Use of Investigative Tools

Key Messages

  • The RCMP was one of the 13 government institutions reported to have been using digital forensic tools. The RCMP has a statutory and common law mandate to preserve the peace, prevent crime, apprehend criminals, and execute warrants.
  • Depending on the purpose, and privacy-invasiveness of the investigative tools that the RCMP deploys, a range of privacy-protective measures may be required, including: judicial authorization; after-the-fact notification of relevant parties; and Parliamentary reporting.
    • These measures are generally set out in criminal law statutes, most notably the Criminal Code. As a state actor, the RCMP is also subject to the Charter, including the s. 8 protection against unreasonable search and seizure.
  • The OPC’s 2021 Special Report on the RCMP’s use of Clearview AI’s facial recognition technology found that the RCMP had failed to verify the lawfulness of the collection practices of partners from whom it collects personal information. This remains an area of concern for our office.

Background

  • RCMP Clearview Investigation: The OPC found that the RCMP’s reliance on Clearview AI’s facial recognition technology to collect personal information contravened s. 4 of the Privacy Act since Clearview AI itself had collected the information in contravention of (private sector) Canadian privacy laws.
  • Commitments from RCMP post-Clearview: The RCMP agreed to conduct “fulsome privacy assessments of third-party data collection practices to ensure any personal information is collected and used in accordance with Canadian privacy legislation.” In furtherance of this commitment, the RCMP created the National Technology Onboarding Program (NTOP).

Prepared by: Legal


Relevant Investigative Findings

Key Messages

  • In 2017, we found that the RCMP had used cell site simulators to collect phone location data in six cases where there were neither warrants nor exigent circumstances. In those cases, we found the RCMP contravened the collection provisions in section 4 of the Privacy Act.
  • When we investigated the RCMP’s use of Clearview AI in 2020-21, we found that it had also contravened section 4 of the Act by collecting information from Clearview AI, which had itself collected that information in contravention of the privacy laws to which it was subject.
  • Aside from one investigation, in which we found that the unauthorized use of such a tool was an isolated incident, we have not received any complaints concerning on-device investigative tools (ODITs).

Background

  • We engaged with the RCMP in respect of its implementation of the 8 recommendations from our investigation of its use of Clearview AI, which included launching a National Technology Onboarding Program (NTOP). Ultimately, we were satisfied that the RCMP had implemented our recommendations.
  • We had not received a PIA regarding the RCMP’s use of Clearview AI. We note, however, that Clearview has ceased offering its services in Canada.
  • OPC is currently investigating a complaint about RCMP’s Project Wide Awake (open-source intelligence gathering) which does not involve on-device investigation tools. Given the investigation is ongoing, we can provide no further information.

Prepared by: Compliance


OPC Treatment of Incidental PI

Key Messages

  • Complainants and respondents to our investigations are asked not to provide the OPC with personal information beyond what is necessary for the purposes of the investigation.
  • Organizations that submit breach reports to my office are directed to not include personal information in their submissions, and are asked instead to provide a description of the compromised personal information.
  • Irrelevant or superfluous personal information incidentally collected during an investigation or the assessment of a breach report is generally either deleted, or the party is asked to resubmit their materials without it.

Background

  • The information on our website about how to submit a complaint links to the OPC’s Privacy Policy, which asks individuals not to provide us with information beyond what is necessary.
  • OPC’s requests for information – sent to parties in the course of an investigation – specify the document(s) needed and the elements that the responses should include.
  • The breach notification form explicitly states that organizations should not include any personal information in the form other than the business contact information of the person(s) at the organization whom the OPC can contact if it has any follow-up questions.
  • If the information shared by the parties contains irrelevant or superfluous personal information, the OPC assesses the appropriate action on a case-by-case basis, applying a privacy-sensitive lens. This can include deleting the unsolicited personal information or requesting that the material be resubmitted without it.

Prepared by: Compliance


Adequacy

Key Messages

  • We welcome the European Commission’s finding that Canada continues to offer an adequate level of protection under the General Data Protection Regulation (GDPR), as organizations will continue to be able to transfer data from the EU to Canada without additional requirements.
  • We note that the Commission recommends legislating some of the protections developed at the sub-legislative level, as this would enhance legal certainty and consolidate new requirements, such as those regarding sensitive personal information. The Commission notes that the ongoing reform of PIPEDA could offer such an opportunity.
  • We hope that the Government continues to take steps to reform Canada’s federal privacy laws to ensure we retain adequacy.

Background

  • Article 45 of the GDPR allows personal data to be transferred to a third country without additional requirements where the European Commission has decided that the third country ensures an “adequate level of protection”.
  • Canada was granted adequacy status in 2001 for organizations subject to PIPEDA on the basis of Article 25(6) of Directive 95/46/EC. When the GDPR came into force in 2018, existing adequacy decisions remained in force until amended, replaced, or repealed by the Commission.
  • On January 15, 2024, the European Commission concluded a review of 11 existing adequacy decisions, including Canada’s.
  • The review of the adequacy decision focused on developments since the adoption of the decision, including Canada’s data protection framework, issues concerning oversight, enforcement, redress, and government access to data.
  • The Commission states an intention to closely monitor future developments in Canada. As the GDPR requires adequacy decisions to be reviewed every four years, we expect another review to come in 2028.

Prepared by: PRPA


EU AI Act

Key Messages

  • The proposed EU AI Act was first introduced in 2021 and appears to be nearing final agreement, with a vote to approve the legislation expected in early February.
  • The AI Act takes a risk-based approach, prohibiting some artificial intelligence practices (such as social scoring) and labelling others as high-risk systems (such as those used to evaluate eligibility for public benefits or services).
  • The AI Act’s primary requirements apply to high-risk systems, with certain ‘limited-risk’ systems requiring only transparency.
  • Extensive technical documentation, reviewable by regulators, must also be maintained for most ‘general purpose’ AI models (such as those upon which generative AI is built).
  • The extent of overlap and consistency between the AI Act and Canada’s AIDA will largely depend on the regulations developed under the latter. However, key differences exist.
  • AIDA does not designate prohibited AI practices, and the risk classification under the EU AI Act considers whether a system poses “a risk of adverse impact on fundamental rights”, a factor that is not considered in AIDA’s definition of harm.

Background

  • In the EU AI Act, “high-risk” AI is subject to requirements including:
    • risk and quality assessments,
    • logging and record-keeping for traceability,
    • general human oversight,
    • accurate and representative data for AI training,
    • ex-ante conformity assessments, and
    • demonstrable accountability.

Prepared by: PRPA


Breach Reporting Trends

Key Messages

  • Our Office remains concerned about the under-reporting of privacy breaches in the public sector, as many institutions that handle sensitive personal information have never reported a breach to us.
  • In 2022-2023, only 1 cyber breach was reported by the public sector. This contrasts markedly with the private sector, where 278 cyber breaches were reported during the same period.
  • Over the course of the last year, we have noticed a new trend: cyber attacks targeting service providers, impacting numerous organizations at once. This was the case with the recent BGRS breach, as well as the MOVEit and GoAnywhere breaches.
  • We note with concern that, under PIPEDA, service providers do not currently have direct breach reporting obligations to the OPC. Rather, they are to inform their affected clients. Experience has shown that this approach leaves gaps.

Background

  • To date this fiscal year, our Office has received 467 breach reports from the public sector. Those breaches primarily related to the loss of personal information (69%), with unauthorized disclosure (16%) and unauthorized access (15%) being the next most common causes.
  • For the same reporting period, the OPC received 573 breach reports from the private sector. Slightly less than half of those (262) were said to be cyberattacks initiated through malware, compromised credentials, or phishing schemes.
  • We note that the gap between public and private sector reporting has narrowed, with 573 reports received from the private sector and 467 from the public sector (as of January 30). Nevertheless, we remain concerned that most breach reports from the public sector come from the same departments.
  • Under Bill C-27, we proposed that service providers report breaches that meet the “real risk of significant harm” (RROSH) threshold both to the clients that control the personal information and to the OPC.

Prepared by: CIRD


Global Affairs Canada Breach

Key Messages

  • On January 26, GAC verbally advised our Office of a breach of its Canadian VPN network that occurred from December 20, 2023, to January 24, 2024. The breach is now contained.
  • GAC is still investigating, but was able to advise that the breach was caused by a cyberattack and that data, including personal information, was exfiltrated.
  • We are in communication with the department to collect more information about the incident and GAC’s related response.

Background

  • As of January 26, GAC had (redacted).
  • In accordance with TBS policy, institutions are required to report material privacy breaches to TBS and the OPC no later than seven days after the institution determines the breach is material.
  • We are expecting GAC to submit a breach report in the coming days.

Prepared by: CIRD
