Additional questions and answers

Appearance before Standing Committee on Access to Information, Privacy, and Ethics (ETHI) of February 1st, 2024

Q1. What are these tools? How do they work? [TA]

  • Digital forensic tools can be viewed as a collection of specialized software and hardware used to capture and analyze digital evidence from sources such as hard drives and cellphones, in support of an investigation or as part of the technical analysis of a given system.
  • When used in combination with established/documented procedures around the acquisition, handling, processing, storage and disposition of digital evidence, they can help derive conclusions that can withstand a court challenge.
  • Digital forensic tools can:
    • Extract and examine all of the files on a given device (e.g., a laptop, a server’s hard drive, or a cellphone) in a forensically sound fashion.
    • Recover deleted data.
    • Analyze a given file’s properties and metadata, such as (but not limited to) when a file was created, modified or deleted, and whether it is an executable, a picture, a Word document or another type of file (a minimal illustrative sketch of this kind of analysis follows this list).
    • Create timelines of events, such as when user accounts were used, when websites were accessed, when attributes of the operating system were changed, and when programs that launch automatically at startup were created and last executed, among many other things.
    • Validate that previously acquired evidence has been properly sanitized (“wiped”) before it is disposed of.
  • Spyware and ODITs will often share the same technical capabilities, such as the ability to covertly record keystrokes, turn on a microphone or camera, capture screenshots, exfiltrate files, etc.
  • The critical differentiator is that spyware is deployed illegally, while ODITs are deployed under a judicial authorization, such as a search warrant, by entities such as law enforcement.
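To illustrate the capabilities listed above, the following is a minimal, hypothetical sketch (in Python) of the kind of integrity verification and file-metadata extraction that commercial digital forensic suites automate. It is illustrative only and does not describe any specific product or any department’s actual tooling; the file path and recorded hash are placeholders.

```python
# Minimal illustrative sketch only - not any department's actual tooling.
# Shows (1) verifying that an acquired image still matches the hash recorded
# at acquisition time, and (2) pulling basic file properties/metadata.
import hashlib
import os
from datetime import datetime, timezone


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large evidence images need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def file_metadata(path: str) -> dict:
    """Collect basic properties analogous to a forensic tool's file listing."""
    info = os.stat(path)

    def to_utc(ts: float) -> str:
        return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

    return {
        "path": path,
        "size_bytes": info.st_size,
        "modified": to_utc(info.st_mtime),
        "accessed": to_utc(info.st_atime),
        "metadata_changed": to_utc(info.st_ctime),
    }


if __name__ == "__main__":
    image_path = "evidence/disk_image.dd"  # hypothetical acquired image
    recorded_hash = "<hash recorded at acquisition time>"  # placeholder value
    print("Image unchanged since acquisition:", sha256_of(image_path) == recorded_hash)
    print(file_metadata(image_path))
```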

Q2. What are the concerns with these tools? What should Canadians know? [TA]

  • The initial CBC article erroneously referred to digital forensic tools as "spyware" being used by 13 Government of Canada departments. This may lead Canadians to believe that a large number of departments are engaged in the covert and unauthorized surveillance of Canadians and operating beyond their legislated mandate, which is highly inaccurate.
  • It is important to note that digital forensic tools ARE NOT spyware. They are used in a deliberate and overt fashion in support of various legislative mandates. In some cases, such as the use of digital forensic tools by law enforcement, additional authorizations must be sought, such as a search warrant to collect digital evidence (e.g., laptops, desktops, cellphones).
  • These tools are also used to help protect Canadians and their data, such as over the course of an investigation into a cyber breach.

Q3. How widely available are these tools in Canada? Are we concerned with their availability in Canada? [TA]

  • Digital forensic tools are widely available in Canada.
  • They are used by law enforcement agencies, Government of Canada departments (whether enforcing their legislative mandates or conducting administrative investigations), cyber incident responders, cyber security and privacy researchers (including academia), law firms, and data recovery specialists.
  • While a number of digital forensic tools can be purchased by anyone, with some even being open source, others (or certain features of them) are restricted to law enforcement and government agencies.
  • The OPC does not have a concern with the availability per se of digital forensic tools in Canada. However, the OPC could have concerns depending on how digital forensic tools are used by organizations and whether appropriate privacy safeguards have been put in place.

Q4. What are other countries and DPAs doing about these tools, if anything? [PRPA]

  • There has been significant attention paid to the use of spyware, and in particular Pegasus, in recent years. Of note is a 2022 report by the Council of Europe that examined Pegasus and its impact on human rights, including privacy. (Footnote 1)
  • In 2021, the UK ICO published an investigation report on mobile phone data extraction by police, in which it made a number of pertinent recommendations, including that the police treat personal information fairly, transparently and lawfully, and that only data that is necessary be collected. (Footnote 2)
  • More generally, in 2021 the Global Privacy Assembly passed a resolution (which we sponsored) outlining conditions for ensuring that any type of legitimate access by public authorities to personal information held by the private sector for purposes related to national security or public safety also contributes to the preservation of privacy and the rule of law. (Footnote 3)
  • This resolution also called on governments and international organizations to engage in discussions aiming at regulating the sale, export and use of technologies allowing for particularly intrusive and disproportionate access to individual personal data, in particular electronic communication content and data.

Q5. What did the OPC do following the CBC article? [GA]

  • Following the release of the CBC article, Government Advisory (GA) sent a request for information to thirteen government institutions in response to media coverage about their use of technological tools capable of covertly extracting personal data from mobile devices.
  • We asked each institution to respond to three questions:
    • What program is using the technology associated with the article?
    • Is personal information collected as part of the program?
    • What privacy assessments have been conducted for this program, and the associated technology?

Q6. Are we satisfied with the responses provided by the government departments? [GA]

  • GA has received responses to our request for information from all thirteen of the institutions listed in the ETHI motion.
  • We are satisfied that the institutions that have responded have provided reasonably complete answers to our questions.
  • We note that many institutions’ Access to Information and Privacy (ATIP) teams seem to have been unaware of the use of these tools and needed to conduct their own fact-finding exercises with program teams.
  • While we are disappointed to learn that, in some cases, PIAs were not conducted when they should have been, we are encouraged by the indication that most of these institutions now have this work underway.

Q7. What are our top concerns? [GA/TA]

  • Most of the programs that institutions described in their responses clearly meet the threshold to require a Privacy Impact Assessment (PIA) – they collect personal information to make administrative decisions about individuals. Despite this, most of the institutions did not complete or update a PIA before using these technology tools.
  • There seems to be a lack of awareness of, and motivation to conduct, full PIAs across program areas, perhaps because they are currently a policy requirement rather than a legal one.
  • This speaks to a broader concern that there is a lack of urgency placed on conducting privacy assessments prior to launching programs or engaging in privacy invasive activities. This may also speak to lack of resources available to conduct them.

Q8. Based on what we know, are the allegations in the CBC article (Footnote 4) substantiated (or not)? [GA]

  • All but one of the institutions who responded to our request for information confirmed that they were using the technology tools in question to collect personal information. GA can confirm that of the institutions that responded, only three have completed a formal PIA for OPC review as of January 2024.
  • While the description of the technology as “spyware” was not strictly accurate, based on the information that we have received from institutions, it appears to be accurate that many of these federal institutions did not conduct PIAs when they should have.
  • We do not have complete program details; we have only the responses provided in the weeks since the article was released. Our assessment is based on this information alone, not on a fulsome review or consultation.

Q9. Status update on ODITs? Is that resolved? Are we reassured that Pegasus is not available for public sector? Do we have any related ongoing investigations on this? [COMPLIANCE/TA/GA]

  • Currently we do not have any investigations related to ODITs nor Pegasus.
  • Organizations and institutions are required to ensure that, when using technology, they adhere to privacy requirements.
  • Shortly after my appearance before the ETHI Committee on its study of device investigation tools used by the RCMP in August 2022, my Office received a briefing from the RCMP on its use of On-Device Investigative Tools.
  • At that meeting and in the year that followed, we reminded the RCMP of the need to submit a Privacy Impact Assessment.
  • We received the PIA on September 12, 2023, and issued a letter of recommendation on December 4, 2023.
  • (redacted)
  • We have asked for a response to our recommendations by February 12, 2024. We will follow up if we do not receive one by that date.

Q10. What engagement has the OPC had with the RCMP’s National Technologies Onboarding Program (NTOP)? [GA/Compliance]

  • The OPC has been engaged with the RCMP on its NTOP since 2021, when GA and CMU attended a presentation on the program.
  • The OPC’s Compliance Monitoring Unit (“CMU”) engaged with the RCMP regularly for over a year to monitor and assess the implementation of recommendations following our investigation into the RCMP’s use of Clearview AI. During these meetings, the CMU received regular updates on NTOP and, in turn, provided feedback and advice.
  • GA has also provided advice on the NTOP and has been made aware of its use during consultations on technologies that have gone through the NTOP processes.
  • On March 31, 2023, we closed the CMU file after determining that the RCMP had satisfactorily implemented our recommendations.

Background:

  • NTOP emerged in part from OPC’s investigation into the RCMP’s use of Clearview AI’s facial recognition technology. The RCMP recognized that it lacked a centralized system for identifying, assessing, and tracking new and emerging technologies. (Link in reference)

CMU Engagement:

  • Based on the Clearview investigation, we made 8 recommendations, which the RCMP agreed to implement. For example, recommendation 3 required that within 12 months the RCMP institute systems and procedures to ensure that all novel collections of personal information across the RCMP are reliably tracked internally in such a way that they can meaningfully inform, and be meaningfully informed by, the RCMP’s decision-making on such collections.(Link in reference)
  • (redacted)
  • (redacted)

Project Wide Awake (PWA) Investigation

  • OPC is currently investigating a complaint about RCMP’s Project Wide Awake (open-source intelligence gathering). Given the investigation is ongoing, we can provide no further information.

Q11. If an organization consulted GA and was using digital forensic tools in the same way as the OPC, would GA conclude that a PIA was necessary? [GA/TA]

  • No, a PIA would not be necessary if an organization was using digital forensic tools in the same way as the OPC.
  • The OPC is not using digital forensic tools to make a decision about an individual.
  • PIAs are conducted on programs rather than tools and this tool does not represent a substantial modification to our current investigative processes.
  • That being said, we are having discussions about undertaking a PIA for the OPC Compliance Program for a variety of reasons including the potential changes in processes that could result from implementation of C-27, the fact that a PIA has never been conducted on the Compliance Program and due to the increasing use of technological tools to support our work.
  • PIAs are evergreen, and we encourage institutions to assess on a continual basis whether a new PIA, or an update to the PIA for a longstanding program, should or could be conducted. The introduction of new tools can be an opportunity to engage in these discussions.

Background

  • The OPC Compliance Program predates the PIA requirement (2002) and therefore did not require a PIA. GA often recommends that, when introducing a technological tool or a modification to a longstanding program, institutions consider whether there is an opportunity to conduct a whole-of-program PIA for programs that may never have been assessed for privacy risks because they predated the requirement.

Q12. For the departments listed in the CBC report, what are some of the ways that these tools are being used (i.e., types of programs/activities)? Which ones require PIAs? Did the departments do a PIA when they were supposed to, and did they advise the OPC of the program? [GA]

  • Digital forensic tools are currently used by the departments listed in the CBC report to enforce their legislative mandates, to conduct administrative investigations, to support cyber security activities, and for data recovery.
  • The personal information collected is mainly used in support of investigations, or as part of the technical analysis of a given system.

Which ones require PIAs?

  • For the Transportation Safety Board (TSB) – yes, they are using the digital forensic tools for their transportation investigation program, and this use would need to be covered in a PIA as it can lead to a decision being made about an individual.
  • For Environment and Climate Change Canada (ECCC) – yes, when authorized by the courts through a search warrant, they use the digital forensic tools during investigations under their Enforcement and Intelligence programs. Their use of these tools would require a PIA.
  • For the Canada Border Services Agency (CBSA) – yes, they are using the digital forensic tools for their Criminal Investigations Program, and this use would require a PIA as it can lead to a decision being made about an individual.
  • For Fisheries and Oceans Canada (DFO) – yes, they are using the digital forensic tools for their Conservation and Protection program and their Cybersecurity program. This use would need to be covered in a PIA as it can lead to a decision being made about an individual.
  • For the Canada Revenue Agency (CRA) – yes, they are using the digital forensic tools for their Criminal Investigations Program, and this use would need to be covered in a PIA as the personal information collected can lead to a decision being made about an individual.
  • For the Royal Canadian Mounted Police (RCMP) – yes, they are using the digital forensic tools mainly as part of their Digital Forensics Program to fulfill the RCMP’s mandates for criminal and national security investigations and protective policing. This would require a PIA as the personal information collected can lead to a decision being made about an individual.
  • For the Canadian Radio-television and Telecommunications Commission (CRTC) – yes, they are using the digital forensic tools to enforce Canada’s Anti-Spam Legislation (CASL). The tools are used in very strict and limited circumstances as part of investigations, and only with a warrant approved by a justice of the peace. The CRTC conducted a PIA in 2014.
  • For National Defence (DND) – yes, they are using the digital forensic tools to support lawful Defence activities and during lawful investigations. This would need to be covered in a PIA if the personal information collected can lead to a decision being made about an individual. We note that it is not clear that this is the case based on the response received.
  • For Shared Services Canada (SSC) – yes, they are using the digital forensic tools for administrative investigations of government employees, and this use would require a PIA if it leads to decisions being made about individuals. We note that it is not clear that this is the case based on the response received.
  • For Natural Resources Canada (NRCan) – no, they are not using the procured digital forensic tools, so no PIA would be required as long as this remains the case.
  • For Global Affairs Canada (GAC) – yes, they are using the digital forensic tools to conduct administrative investigations of government employees, and this use would require a PIA if it leads to decisions being made about individuals. We note that it is not clear that this is the case based on the response received.
  • For Correctional Service Canada (CSC) – possibly; they are using the digital forensic tools to examine contraband electronic devices seized from offenders for intelligence purposes. This use would require a PIA if it leads to decisions being made about individuals. We note that CSC previously conducted a “PIA Checklist” assessment and concluded that a PIA was not required. It is not clear whether this assessment would still apply.
  • For the Competition Bureau of Canada – yes, they are using these tools in support of their mandate to enforce the Competition Act, such as during investigative processes or the execution of search warrants. This would need to be covered in a PIA if the personal information collected can lead to a decision being made about an individual. We note that it is not clear that this is the case based on the response received.

Q13. How long does it take to do a PIA? What is involved in preparing a PIA? [GA]

  • The size and scope of a PIA can vary widely based on the complexity of a program. For example, a very simple program that presents few risks and handles very limited personal information could be quite short whereas a complex project that involves many partners, sensitive information and complex uses may be much longer.
  • That said, fulfilling the Core Requirements of a PIA as set out in Appendix C of the Directive on Privacy Impact Assessment should not be an onerous task.
  • The Core Requirements consist of identifying and assessing key foundational elements of a program such as a description, legal authority, the sensitivity of personal information involved, any technology involved, how personal information may be transmitted as well as items such as an analysis of the personal information elements being collected and a data flow.
  • PIAs should also include documentation of other key components of a program such as policies and procedures, privacy notice statements, technology risk assessments, legal opinions, information sharing agreements etc.
  • Most importantly, it is crucial to remember that a PIA is a risk assessment process and not just a report. Going through the PIA process allows institutions the opportunity to assess risks and mitigate them as they go. This is substantially less effort than discovering a risk after implementation.

Q14. What has the OPC done to identify and address the privacy impacts of the use of DFTs? [TAD]

  • The OPC has determined that a PIA is not required for its use of DFTs in the context of its investigations as none of the triggers to conduct a PIA are present:
    • It does not use DFTs to assess personal information and make any decisions that affect an individual;
    • There have been no changes to the OPC’s investigative program;
    • The reviews conducted using DFTs are done in-house at the OPC; and
    • The DFTs do not represent a modification to the OPC investigative program or activities affecting overall privacy.
  • The OPC has used digital forensic tools in support of activities, such as breach investigations or systemic investigations. These tools have been used specifically to acquire digital evidence, analyze files located on acquired evidence, and to create timelines of events.
  • Individuals are not affected by the OPC’s use of DFTs.
  • The introduction of DFTs has not created any privacy impactful changes to the investigation program. Rather, they have accelerated the speed at which an investigation can proceed.
  • The OPC maintains a strict control on who can use DFTs and in what context. It also controls who within the office can access and view related data and conclusions.

Q15. What are some examples of DFTs versus spyware? [TAD]

  • Digital forensics tools can be viewed as a collection of specialized software and hardware that are used to capture and analyze digital evidence in support of an investigation, or as part of the technical analysis of a given system.
  • Common features across digital forensic tools include the ability to acquire digital evidence, recover deleted files, analyze files of interest, and create a timeline of events.
  • Some digital forensic tools serve more general purposes, such as Magnet Forensic Axiom, which can be used to analyze both hard drives and mobile devices, while others, such as Cellebrite UFED, are designed for specific purposes, such as mobile devices exclusively.
  • These almost always require physical access to the device, with some tools (such as Magnet Forensic Axiom) having the ability to do remote acquisitions where physical access would be complicated, such as in the case of a host residing in the Cloud or a complex system spanning multiple servers in a data centre.
  • Spyware and ODITs are a completely different category of tools, in that they are remotely deployed and operated in a covert fashion. While spyware and ODITs often share the same technical capabilities, such as collecting files, activating a device’s microphone or camera, recording keystrokes, etc., they differ on the legal authority under which they are used. Spyware is used illegally by threat actors, while ODITs are used following a judicial authorization such as a search warrant.
  • Examples of spyware include Pegasus and DarkComet.
  • Questions regarding the specific use of ODITs should be directed at the RCMP should they be asked to appear in front of this Committee.

Q16. What do you think of the USA’s “Key AI Actions” following President Biden’s Executive Order on AI? [PRPA/IPT]

  • President Biden’s Executive Order on AI remains a very positive step, promising advances related to topics such as AI education, algorithmic discrimination, and privacy-enhancing technologies.
  • Though many of the “Key AI Actions” recently highlighted by the administration are primarily internal to US interests, they – and other future developments arising from the Executive Order – are likely to raise the bar on safety for AI developers, leading to increased protections for individuals both in the United States and globally.
  • That said, I believe that Canada still has an opportunity to be a global leader in both AI innovation and governance. To achieve this, it remains important for us to establish a robust legislative framework for artificial intelligence technologies, including modernized privacy law.

Q17. Do you have any concerns about how this technology was procured? Are there sufficient privacy controls in the general Government of Canada procurement process as it stands? [GA]

  • Some of the government institutions, when questioned by media, suggested that questions should be directed to Shared Services Canada as the purchaser of the tools in question. This raises a couple of issues. First, SSC’s procurement role does not remove the responsibility of individual government institutions to undertake a PIA prior to deploying tools where a PIA is warranted. Second, it raises the question of ‘due diligence’ in the selection and purchase of such tools, an issue that was important in Clearview and in Project Wide Awake.
  • OPC Government Advisory has previously highlighted to SSC the importance of clearly outlining privacy roles and responsibilities in Information Sharing Agreements with its client departments. Our recommendations pointed out a lack of clear accountability for security assessment, audit log monitoring, and general privacy obligations.
  • Despite attempts at follow-up, our office has not received a response to our recommendations.

Q18. Do employees have an expectation of privacy on their work phones and computers? Can an employer search their devices without a warrant and the person’s knowledge or consent? Is judicial authorization required for these types of activities? Is it true that the data on government-owned devices belongs to the employer? What are federal employer’s privacy obligations under the Privacy Act with respect to monitoring of employees? [PRPA/Legal]

  • Individuals have a right to privacy at work, even if they are on their employer’s premises and/or using their employer’s devices, such as computers and mobile phones. Last year, my Office updated our guidance on Privacy in the Workplace. (Footnote 5) This included a new section on employee monitoring. In our guidance, we are clear that any employee monitoring should be limited to purposes that are specific, targeted and appropriate in the circumstances.
  • Monitoring measures should take into consideration an assessment of the privacy risks and any mitigating measures, including limiting collection to that which is necessary for the stated purpose, and ensuring that the least privacy invasive measure in the circumstances is used. While there may be a number of technological measures that are available to monitor employees, not all measures will necessarily be the most effective in meeting the desired goals or be the least intrusive means available to achieve the stated purpose.
  • In assessing privacy rights in the context of employment, the question is less about data ownership and more about whether employees retain a privacy interest in data generated in the workplace. The Privacy Act makes it clear that employees have rights relating to the personal information held about them by federal government institutions. For example, an individual, including a government employee, has a right of access to their personal information, including to personal information held by their federal government employer (s. 12(1) of the Privacy Act).
  • Given the vast amounts of personal information computers and cell phones can contain, courts have consistently found that searches of them attract a reasonable expectation of privacy, and accordingly will typically require a warrant. However, employers have legitimate reasons to collect, use and disclose personal information about employees; they may consider electronic monitoring and other surveillance as necessary to prevent workplace harassment, or maintain critical infrastructure. In purely regulatory contexts such as these, it may be possible for employers to search employer devices used by employees without a warrant. However, even in regulatory contexts, in order to comply with the Charter, where the predominant purpose of the search is to uncover criminal liability, a warrant will be required.
  • Regardless of whether a warrant is required, employers do not have carte blanche: the Charter, the Privacy Act, and TBS policy all regulate employee monitoring by government institutions. Among other things, the Privacy Act places limits on the collection, use, and disclosure of personal information. TBS policy requires employers to limit collection to what is “demonstrably necessary” for the government institution’s programs or activities.
  • Lastly, transparency about the potential for employee monitoring is fundamental. Employers must make employees aware of the purpose, nature, extent and reasons for monitoring, as well as potential consequences for workers, unless there are exceptional circumstances at play.

Background (if necessary)

Relevant jurisprudence

  • Searches of a computer can be highly invasive and can attract a reasonable expectation of privacy (R. v. Morelli, [2010] 1 S.C.R. 253);
  • Similar considerations apply to cell phones (R. v. Marakah, 2017 SCC 59 at paragraph 35);
  • Where the predominant purpose of a search relates to potential criminal liability, the “full panoply” of s. 8 Charter protections apply (R. v. Jarvis, [2002] 3 S.C.R. 757; R. v. Ling, [2002] 3 S.C.R. 814)
  • The degree to which s. 8 Charter protections apply to workplace surveillance is also presently before the Supreme Court in York Region District School Board v. Elementary Teachers’ Federation of Ontario, 2023 CanLII 19753 (SCC)
  • Relevant provisions of the Privacy Act:
    • Section 4: direct relationship to an operating program or activity
    • Section 5: general requirements for direct collection (s. 5(1)) and purpose specification (s. 5(2)), subject to exceptions in s. 5(3).
    • Sections 7 and 8: general consent requirement for use (s. 7) and disclosure (s. 8) subject to listed exceptions.
  • TBS Directive on Privacy Practices: s. 4.2.9 requires that collection be limited to what is “directly related and demonstrably necessary for the government institution’s programs or activities.”

Q19. Is what is being reported in the media consistent with what the departments have told the OPC in terms of how they are using these tools? [GA]

  • Institutions’ responses are generally consistent with what is reported in CBC’s article of January 31, 2024 on the topic.
  • It is reported that "many departments" use these tools for investigations into alleged violations of the law under the authority of a search warrant, while others use the tools without a warrant for internal investigations into the use of government devices. This aligns with responses from institutions.
  • It is also reported that institutions acted in accordance with legal and policy requirements although they did not conduct PIAs. This is consistent with the responses they have provided to the OPC.

Background

  • Five institutions specifically mentioned the authority of a warrant for their investigations (CRA, ECCC, CRTC, Competition Bureau and RCMP).
  • Three institutions specifically mentioned the use of this technology for internal administrative investigations (DFO, SSC and GAC).

Q20. Is it possible to use digital forensic tools on a person’s device without their knowledge (i.e., surreptitiously or remotely)? [TAD]

  • The possibility is there, though the likelihood is low. There are three (3) possible scenarios under which this could happen:
    • A department undertaking an administrative investigation of an employee could choose not to notify the employee that they are under investigation and then, while the employee is not tending to their Crown-owned devices, use digital forensic tools to acquire images of those devices. Specific questions about a given department’s procedures when conducting administrative investigations and its use of digital forensic tools should be directed to that department when it appears before this Committee.
    • A law enforcement agency could conduct a surreptitious entry of a suspect’s dwelling when they are away, and then use digital forensic tools to acquire digital evidence located inside the dwelling. This would be undertaken under a judicial authorization, such as a search warrant.
    • A hostile foreign intelligence service could conduct a surreptitious entry of a Canadian target’s dwelling when they are away, and then use digital forensic tools to acquire intelligence from digital devices that are located inside the dwelling. This would be illegal.

Q21. What are some examples of when departments could use digital forensic tools appropriately (assuming they did a PIA)? [EVERYONE]

  • In the exercise of their legislative mandate.
  • To conduct an administrative investigation when there are reasonable grounds to believe that a policy has been violated by an employee.
  • After conducting the necessary assessments, including a PIA and security assessments.
  • We also note that any use of such a technology should be accompanied by transparency. If the use of a digital forensic tool is a potential action that a department could take in response to suspected non-compliance with a policy, employees should be made aware of this in the same way that they are made aware of the presence or potential use of other privacy-invasive tools, such as video surveillance.

Q22. If the Financial Administration Act authorizes the President of Shared Services Canada (SSC) to conduct these types of investigations, is it still necessary to do a PIA? [Legal/GA]

  • Yes, it is still necessary to do a PIA even with legal authorization to conduct a program.
  • Section 4 of the Privacy Act requires that the collection of personal information by a government institution be directly related to an operating program or activity. Shared Services Canada’s authority to undertake an investigation under the Financial Administration Act is relevant to its authority to collect personal information in the context of such an investigation under the Privacy Act.
  • However, this is distinct from the question of whether a PIA should be conducted relating to that particular activity. For example, a PIA is required by Treasury Board policy in a range of circumstances, including when personal information may be used as part of a decision-making process that directly affects an individual. It is therefore important for government institutions to remember that establishing authority for collection of personal information does not obviate the need to consider whether a PIA should be conducted in particular circumstances.
  • A PIA is a separate process that assesses the risks to privacy associated with the implementation of a given program or initiative. PIAs are risk assessment tools that ensure that authorized programs are carried out in the least privacy intrusive manner possible.
  • Using the example of an investigative program, an institution would need to have legal authority to conduct such a program, but a PIA would guide the institution through the implementation of this program including factors such as the minimal personal information elements required to achieve the objectives of the program, how this information will be stored and safeguarded, who will have access to this information etc.

Q23. What are the best practices that the OPC recommends to ensure the security of the personal information seized? [GA/TAD]

  • Access to evidence collected with digital forensics tools or the results of their analysis should be kept under a strict need-to-know principle.
  • Strong physical and logical access controls around the collection, storage and disposal of evidence gathered with digital forensics tools should be put in place to reduce the risk of unauthorized access.
  • Security mechanisms used to protect acquired evidence should be periodically audited and reviewed to detect unauthorized access and to ensure that they reflect industry best practices (one possible approach to tamper-evident access logging is sketched below).
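As one illustration of the auditability point above, the following is a minimal sketch (in Python) of a hash-chained access log, in which each entry’s hash covers the previous entry so that later tampering with the record of who accessed seized material becomes detectable during periodic review. The field names and storage format are assumptions for illustration only, not a prescribed design or an OPC-endorsed implementation.

```python
# Illustrative sketch only: a tamper-evident (hash-chained) access log for
# evidence holdings. Field names and the in-memory storage are assumptions.
import hashlib
import json
import time


def append_access_record(log: list, user: str, exhibit_id: str, action: str) -> dict:
    """Append a record whose hash covers the previous record, so any later
    alteration of earlier entries becomes detectable during periodic review."""
    previous_hash = log[-1]["record_hash"] if log else "0" * 64
    record = {
        "timestamp": time.time(),
        "user": user,
        "exhibit_id": exhibit_id,
        "action": action,  # e.g. "viewed", "copied", "disposed"
        "previous_hash": previous_hash,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["record_hash"] = hashlib.sha256(payload).hexdigest()
    log.append(record)
    return record


def chain_is_intact(log: list) -> bool:
    """Recompute every hash during an audit to detect edited or deleted entries."""
    previous_hash = "0" * 64
    for record in log:
        body = {k: v for k, v in record.items() if k != "record_hash"}
        if body["previous_hash"] != previous_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != record["record_hash"]:
            return False
        previous_hash = record["record_hash"]
    return True


access_log: list = []
append_access_record(access_log, "analyst_a", "EXHIBIT-001", "viewed")
append_access_record(access_log, "analyst_b", "EXHIBIT-001", "copied")
print("Audit check passed:", chain_is_intact(access_log))
```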

Q24. In the last 5 years, do we have an example of a department that conducted a PIA and consulted us before the digital forensic tool was deployed? [GA]

  • No. In all cases of which we are aware, institutions were using the technology prior to engaging our office or conducting a PIA.

Background

  • We were consulted by two institutions in early 2021 following an earlier article – the RCMP and CBSA.

PIA-001711 – RCMP use of GrayKey, a mobile device forensic tool

  • Consultation in January 2021 on GrayKey (which has since been acquired by Magnet Forensics), Cellebrite and other mobile device forensic tools (MDFTs) – a suite of technological tools with varied capabilities that are used to unlock mobile devices.
  • In our advice, we emphasized the need to conduct a formal privacy assessment and the importance of putting safeguards in place to prevent any unauthorized access to, or misuse of, these tools.
  • No PIA has been received.

PIA-001710 – CBSA use of GrayKey

  • Consultation in February 2021 on technology to unlock mobile devices. Border Services Officers (BSOs) can provide seized devices to a Digital Forensics Unit (DFU) employee for unlocking, or a DFU employee can bring the technology to the border in order to unlock a device.
  • The CBSA used device unlocking technology 11 times in 2020 and 16 times in 2019 in the border management context.
  • We did not provide any specific advice at the consult.
  • No PIA has been received on this.

Q25. Does Québec have employment obligations in its privacy law? [Legal]

  • Quebec’s Act respecting the protection of personal information in the private sector contains a number of provisions that also apply in the employment context, thereby creating certain privacy-related obligations for employers and rights for employees.
  • Of particular interest are provisions imposing transparency obligations on organizations, including in the context of employee monitoring. Where an organization intends to use technology that allows individuals to be identified, located or profiled, it must first provide notice of its intention to use such technology and explain how such technology is activated. The definition of profiling employed in the law specifically references "analysing a person’s work performance" as an example.
  • Further, where an organization intends to collect personal information using such technology, it must publish a policy clearly explaining the proposed use of the technology and proposed collection of personal information and make that information available on the organization’s website, if applicable, as well as sharing the policy by any appropriate means to reach the persons concerned.

Background

Act respecting the protection of personal information in the private sector, CQLR c P-39.1:

8.1. In addition to the information that must be provided in accordance with section 8, any person who collects personal information from the person concerned using technology that includes functions allowing the person concerned to be identified, located or profiled must first inform the person
(1) of the use of such technology; and
(2) of the means available to activate the functions that allow a person to be identified, located or profiled.
“Profiling” means the collection and use of personal information to assess certain characteristics of a natural person, in particular for the purpose of analyzing that person’s work performance, economic situation, health, personal preferences, interests or behaviour.

8.2. Any person who collects personal information through technological means must publish on the enterprise’s website, if applicable, a confidentiality policy drafted in clear and simple language and disseminate it by any appropriate means to reach the persons concerned. The person must do the same for the notice required for any amendment to such a policy.

Q26. Does the FPT Resolution on Protecting Employee Privacy in the Modern Workplace apply in this case? [PRPA]

  • Yes, it does.
  • While the Resolution acknowledges that some level of information collection about employees is reasonable, and may even be necessary to ensure accountability, it stresses the importance of protecting employee privacy rights, and the potential harms that can occur when these rights are not respected in the workplace.
  • In particular, it calls for employers to respect the principles of reasonableness, necessity, and proportionality when considering or reviewing any collection or use of employee information through electronic surveillance.
  • In addition, it calls for employers to provide employees with clear information about how to object to the collection, use, or disclosure of their personal information, how to challenge decisions, and how to exercise access rights.
  • In May 2023, my Office updated our workplace guidance and outlined a number of tips for employers. This document, together with the FPT Resolution, contains important information that can help employers protect privacy.

Q27. Did the GPA issue a resolution on employee privacy? [PRPA]

  • Yes; in October 2023, the GPA issued a resolution on Artificial Intelligence and Employment, which the OPC co-sponsored. Its purpose was to highlight considerations for the use of AI in employment activities such as recruitment, monitoring and managing, or termination of employees.
  • While the drafters of that resolution may not have had data extraction tools specifically in mind, many of the emphasized points remain valid – including that the processing of data must be lawful, fair, and transparent, and that safeguards need to be established to avoid disproportionate surveillance of employees.
  • Data extraction tools may have a place in the employment context; however, as was noted with respect to AI, when the use of such tools is opaque, misapplied, incorrectly designed, or inappropriately relied upon, it may lead to harms or the infringement of fundamental rights and freedoms, including privacy, human dignity, and equality.

Q28. What criteria do you consider when determining whether to launch a Commissioner initiated investigation (CII)? [Compliance]

  • Under PIPEDA and the Privacy Act, the Commissioner may initiate a complaint where satisfied that there are reasonable grounds to investigate a matter.
  • In determining whether to conduct a CII, we consider the level of risk to the privacy of Canadians and the most effective strategy to address that risk.
  • Risk is determined based on factors including:
    • the impact on Canadians’ privacy rights in general,
    • the impact on affected individuals,
    • the scope of the actions of the organization in question, and
    • whether the matter involves novel or systemic issues warranting examination.

Background

Subsection 11(2) of PIPEDA – Commissioner may initiate complaint:

(2) If the Commissioner is satisfied that there are reasonable grounds to investigate a matter under this Part, the Commissioner may initiate a complaint in respect of the matter.

Subsection 29(3) of the Privacy Act – Privacy Commissioner may initiate complaint:

(3) Where the Privacy Commissioner is satisfied that there are reasonable grounds to investigate a matter under this Act, the Commissioner may initiate a complaint in respect thereof.

Q29. Of the three institutions that have submitted a PIA – did they submit them to the OPC (or just TBS), and if so, are we satisfied with them? [GA]

  • All three institutions submitted their PIAs to the OPC.
  • (redacted)
  • (redacted)
  • (redacted)