Remarks by the Privacy Commissioner of Canada at the Access to Information and Privacy Communities Meeting for Data Privacy Week
January 28, 2025
Virtual
Address by Philippe Dufresne
Privacy Commissioner of Canada
(Check against delivery)
Good morning and Happy Data Privacy Week!
It is my pleasure to mark this occasion among federal government privacy professionals. You are the privacy champions of your organizations and key allies in our efforts to protect and promote the fundamental right to privacy.
You do essential work to ensure compliance with the Privacy Act and Treasury Board directives, and to improve the personal information management practices of your organizations. Supporting individuals’ fundamental right to privacy builds and maintains trust among Canadians that their sensitive personal information is being managed with the care that it deserves.
The theme that my Office has chosen for this year’s Data Privacy Week is “Put Privacy First.” It speaks to proactively addressing privacy issues as a means of future-proofing your organization’s programs, services, and systems against harm, and building a culture within your organization that values privacy.
With more data being collected, shared, used, and stored than ever before, these proactive measures have perhaps never been more important. This is especially true as federal institutions adopt or expand the use of third-party software and service providers, leverage Artificial Intelligence (AI), and explore other data-driven innovations to better serve clients and use resources more efficiently.
These are important goals, and in pursuing them, we must emphasize that innovation and privacy are not a zero-sum game. Just as data is used to fuel innovation, innovation must also be used to protect data. As federal institutions, we serve Canadians, and it is incumbent upon us to respect people’s fundamental rights, including their right to privacy, in order to maintain public trust.
In my remarks today, I will focus on how your organizations can put privacy first and how the OPC can help. I will also talk about compliance and breach trends and what the OPC is doing to maximize resources and improve service delivery.
Putting privacy first: Privacy impact assessments
When onboarding new technologies and designing and modernizing programs and services that involve personal information, it is important to put privacy first in order to proactively address privacy issues before they happen.
To that end, privacy impact assessments, or PIAs, are an important risk-mitigation tool. They can help institutions demonstrate that they are accountable for the personal information under their control, ensure that they are complying with the Privacy Act, and limit their risk of privacy breaches. They should be completed prior to launching a new initiative or when contemplating significant changes to an existing program that has privacy implications, ideally in the very early stages to ensure that privacy protections are baked in from the very start.
This is what we mean when we talk about privacy by design: creating a culture within your organization that values data protection by considering, and mitigating against, privacy risks at the outset of any initiative.
For example, limiting the collection of information to just what is needed for a new program, defining a retention period and disposing of the information accordingly, and planning how the data will be protected can reduce the risk of last-minute privacy issues impacting a program launch, or having to go back to redo work to retrofit privacy into the development.
Other risk-mitigation measures include enhanced protections for employee credentials and applying security patches as soon as they become available, since experience has shown that unpatched vulnerabilities are often the source of breaches that could likely have been prevented. They also include requiring multi-factor authentication, properly training employees, having a robust privacy management program, and investing in cybersecurity to prevent unauthorized access.
The Treasury Board Secretariat (TBS) recently revised its policy instruments for PIAs. TBS consulted the OPC throughout the development of these updates and accepted many of our recommendations.
For instance, the new requirements indicate that substantial modifications of programs through the use of third-party contractors or automated decision-making covered by the Directive on Automated Decision-Making should trigger the need for a full PIA.
The requirements also introduce a formalized approach for conducting multi-institutional PIAs and boost transparency with a new template for publishing PIA web summaries. They require that institutions document, review, and update their risk-mitigation measures annually.
The OPC has also made it easier for you to submit PIAs. Last year, we launched an updated version of the form that allows institutions to submit PIAs online securely through our website, reducing the risk that a PIA might be sent to the wrong inbox. It accepts documents up to Protected B and makes it easier for institutions to later add documents and link them to previous submissions.
If you have questions about the new policy updates or wish to consult at any time during the development of a PIA, our Privacy Act experts are available to assist.
Putting privacy first: generative AI
With generative AI transforming how many federal institutions operate, another way to put privacy first is to consider our Principles for responsible, trustworthy and privacy-protective generative AI technologies. Established in cooperation with our provincial and territorial counterparts, the document provides important advice for organizations developing, providing, or using the technology.
The OPC is working to lead by example in this area by building our own internal capacity and engaging with domestic and international fora to better understand and address the privacy implications of generative AI.
Moreover, our Technology Analysis Division is currently evaluating how OPC employees could safely leverage the efficiencies promised by AI with privacy by design.
Addressing and advocating for privacy in a time of technological change, with a focus on AI and generative AI, is among the strategic priorities that are guiding my mandate through 2027.
The integration of AI into the applications used by individuals, businesses, and increasingly, government departments, offers many efficiencies and conveniences but also presents new risks.
Key privacy concerns relate to the collection and use of personal information for training AI systems, transparency and the explainability of data sources and AI decision-making processes, consent mechanisms, the accuracy of decisions, including those generated through inferences, and the risk of bias.
For organizations using generative AI, we have emphasized the need to ensure that solutions supplied by third-party service providers comply with privacy standards and that AI-generated content is labeled as such. Institutions need to be transparent about their use of AI and accountable for any AI-generated decisions about individuals, whether the decision is to grant someone immigration status or employment insurance benefits.
Putting privacy first: employee privacy
On that note, I will say a few words about privacy in the context of the workplace and the use of digital staffing platforms.
Although some level of information collection will be reasonable, and even necessary, to manage the employer-employee relationship, overcollection of personal information can have a disproportionate impact on employee privacy.
To put privacy first in the employment context, there are key points to keep in mind:
- Only collect the personal information that is necessary for a stated purpose and collect it by fair and lawful means. Develop clear policies explaining what will be collected, why, and with whom it will be disclosed and communicate these policies to employees before putting them into practice.
- In the hybrid workplace, technologies such as live video interviews, asynchronous staffing platforms, and AI tools are increasingly common. These innovations should trigger updated privacy analyses, and contracts with staffing platforms should include clauses on data ownership, retention, and safeguarding.
- Any use of AI tools within staffing will likely be subject to the Treasury Board Secretariat’s Directive on Automated Decision-Making, and therefore require an Algorithmic Impact Assessment.
If you have questions about digital staffing platforms or other workplace privacy concerns, I invite you to consult the OPC and TBS for advice.
Putting privacy first: contracting
I hope that some of you had an opportunity to attend the Data Privacy Week kick-off webinar on privacy in contracting that we held through the Canada School of Public Service a week ago in conjunction with the TBS.
Designed for public servants involved in procurement as well as privacy officials, it was an opportunity to enhance awareness of procurement processes and improve privacy practices in contracting, with a special focus on how to handle the personal information of Canadians securely and efficiently.
For those who missed it, several key takeaways worth repeating can help you put privacy first when contracting.
Importantly, institutions should include specific privacy clauses in contracts with third party service providers that spell out certain obligations, including:
- The obligation to report any breaches to the institution responsible for the contract;
- The obligation to coordinate with the institution responsible for the contract to respond to access requests;
- Details about administrative and technical safeguards and retention periods employed by the third-party service provider; and
- Any limitations on the subsequent disclosure of personal information by the third-party service provider.
For contracts to be most effective, consult the privacy and legal experts within your institution, as well as those responsible for cybersecurity and technology, when drafting them.
Another key takeaway is the importance of conducting due diligence with third party service providers.
When a federal government institution enters into a contract with a private sector organization, it must take sufficient steps to satisfy itself that the private sector organization is complying with its own obligations under applicable private sector privacy legislation.
Due diligence on the part of federal institutions involves ensuring that any personal information collected by the vendor or exchanged between the vendor and the institution is protected appropriately, that privacy assessments have been completed, and that the vendor can explain how it will comply with privacy laws, rather than simply asserting that it will. The degree of due diligence required should be proportionate to the volume and sensitivity of the personal information involved in a contract.
Be especially mindful when engaging third parties that provide facial recognition services or other AI applications that may have used personal information in their training data, commercial data providers, open-source intelligence services, or services associated with broad surveillance.
Compliance trends
Over the last few years, we have seen investigations involving public-private partnerships and contracting relationships, often related to digital technologies. We have observed how these relationships can create additional complexities and risks for privacy and how they may result in parallel investigations under both our federal public and private sector privacy laws.
Indeed, we have ongoing investigations involving public-private partnerships and contracting relationships including our investigation into the ArriveCAN app, as well as simultaneous investigations under the Privacy Act and PIPEDA following a cyberattack that resulted in a data breach affecting federal government personnel who used government-contracted relocation services over the last two decades.
In terms of overall compliance trends, in recent months we have seen a considerable increase in the number of complaints received under both our acts. Since April, the number of complaints filed under the Privacy Act and PIPEDA has increased by 22% compared to the same period last year.
More than half the complaints received under the Privacy Act in the last fiscal year were time limit complaints, in other words, complaints about how long it takes for institutions to respond to requests for access to personal information.
Moreover, approximately a third of the complaints received so far this year relate to the application of exemptions to withhold requested personal information, or to allegations of missing records.
The remaining complaints relate to allegations of unauthorized collection, use, and disclosure of personal information.
The OPC has adopted an early resolution process that seeks to resolve most complaints through an approach inspired by mediation.
This process, which involves engagement and negotiation, remains an important tool for dealing with individual complaints more effectively and expeditiously and to the greater satisfaction of all parties.
In 2023-2024, 87% of all complaints under the Privacy Act were closed either through early resolution or summary investigation, which are simpler investigations involving less complex, non-systemic complaints. Summary investigations conclude with the issuance of a brief report or letter of findings, rather than a full investigative report.
That being said, there are still a number of cases that deal with complex emerging technologies and systemic issues that increasingly require substantial technical analysis.
Unfortunately, this has resulted in a rise in our complaints backlog, which we are working to address by introducing efficiencies to our processes.
Breaches
Before I expand on those efforts, let me say a few words about public sector breaches.
We are seeing an increase in both the number and impact of data breaches, including those resulting from cyberattacks. These breaches are a constant source of concern due to their growing complexity and severity.
In 2023-2024, federal institutions reported 561 breaches to the OPC, representing an 88% increase over the previous year. This increase can be explained in part by a greater number of reports from Employment and Social Development Canada (ESDC), since its mandate requires it to collect and use a significant amount of personal information. It may also reflect the Department’s breach detection efforts.
We also know that federal institutions are an increasingly attractive target for cyberattacks. While we believe that federal institutions have historically underreported cyberbreaches, this year we have already launched three major cyber breach investigations involving federal government institutions.
The most recent breach involves cyberattacks at the Canada Revenue Agency (CRA) that led to more than 31,000 breaches. These were in addition to cyberbreaches at ESDC and the CRA that were the subject of a special report issued by the OPC last February.
That investigation concluded that hackers used credential stuffing to fraudulently access government services and apply for or redirect payments to themselves. The attack compromised the sensitive financial, banking, and employment data of tens of thousands of Canadians, leading to numerous cases of fraud and identity theft – including a high volume of fraudulent applications for COVID-19 Emergency Response Benefits.
The second ongoing investigation is a cyberbreach at Global Affairs Canada in which unauthorized individuals accessed the department’s virtual private network. The third is the one I mentioned earlier affecting the personal information of federal government personnel who used government-contracted relocation services over two decades.
To help public-sector stakeholders report breaches, last spring we launched a new web form that allows federal institutions to send breach reports to the OPC and TBS simultaneously. We strongly encourage institutions to use this web form to submit their material privacy breach reports.
We are also working to finalize an online tool to help organizations assess whether a breach has met the real risk of significant harm reporting threshold. We plan to launch it in the coming weeks.
We know that breaches can occur even when organizations have put in place safeguards. This is why an effective response to a breach is also critical to mitigating the impacts on Canadians and preserving trust in their institutions. It is also worth noting that breaches do not automatically mean non-compliance with the law, and in most cases, a breach report does not result in an investigation.
Importantly, institutions should report breaches as soon as they become aware of one, even if they have not yet established with certainty that the breach has reached the reporting threshold. This is especially important for high-profile breaches that are likely to make the news or have already done so.
Reporting early gives you access to our advice and expertise in responding to an incident and gives us a clearer picture of the current and emerging risks you face. This will help us respond to inquiries from Parliamentarians, the public, and media and allow us to better help you prevent future breaches.
Restructuring at the OPC
Before I conclude, I will say a few words about what the OPC is doing internally to position itself to better help you and other stakeholders.
Earlier this month, I unveiled a Transformation Plan aimed at strengthening our ability to effectively and efficiently protect and promote the fundamental right to privacy in an increasingly complex and evolving digital world.
The changes will further enhance collaboration and cohesion across my Office and streamline processes, ultimately supporting more integrated, agile and strategic approaches that will maximize the OPC’s impact for Canadians.
It includes changes that will allow us to place an even greater focus on our strategic priorities; respond more rapidly and effectively to emerging issues; evolve our approach to compliance; and bring stronger alignment to our policy and legal work and our enforcement and advisory activities.
It also includes some changes that will impact how you communicate with the OPC. Importantly, the OPC has moved from four sectors to three, each overseen by a deputy commissioner.
Isabelle Gervais will oversee the rebranded Compliance Promotion and Enforcement Sector. She will continue to oversee enforcement activities, and her sector will now include the Business Advisory and Government Advisory Directorates.
These changes will allow us to better align our proactive and reactive work and create opportunities for timelier and more impactful compliance outcomes.
Moreover, they will allow us to more effectively address emerging data-protection issues and leverage proactive interventions more strategically in support of compliance. They will also provide more flexibility in the allocation of resources and help alleviate backlogs. The goal will be to make better use of the tools available to us across the full compliance and enforcement spectrum. These changes flow from my first strategic priority: to protect and promote privacy with maximum impact through greater efficiency, adaptability, and preparedness in a constantly evolving privacy landscape.
We will continue to rely on ethical walls as we always have, including operational procedures and good information management practices, to separate our enforcement and advisory work. Whether you are consulting us on an initiative, seeking support on a PIA, or participating and asking questions during an OPC-led workshop, coming to us will help you to address and mitigate privacy risks from the outset.
Should you have any questions about these changes or any other matter, please reach out to my Office.
Another key change involves bringing Legal Services and our Policy, Research and Parliamentary Affairs directorate together under the oversight of Deputy Commissioner and Senior General Counsel Marc Chénier.
This will better align our analysis and advice and help to support and ensure a more harmonized response to legal and policy issues that might arise.
Deputy Commissioner Richard Roulx will continue to oversee corporate management under a new title: the Enabling Services Sector, which will also become the central hub for our business intelligence functions under the Business Planning, Performance, Audit, and Evaluation Directorate.
I am very excited about the way forward and about continuing to work with all of you in the ATIP community. I am confident that these changes will allow us to better serve our stakeholders by optimizing our programs and services and better addressing compliance pressures.
Conclusion
I would like to conclude by inviting you to take some time this week to consider how your organization is putting privacy first when it onboards new technologies like generative AI or rolls out new programs.
Are you: Developing PIAs prior to launch? Doing your due diligence when engaging third-party contractors? Doing everything in your power to properly manage privacy in the employer-employee relationship?
As privacy professionals, you play an important role in building a culture of privacy within your organizations where privacy is valued, and data protection is taken seriously.
Collaboration is central to everything that we do and the ATIP community is a valued partner when it comes to protecting and promoting privacy across government.
To ensure public trust in our federal institutions, individuals must be assured that their data is being managed appropriately and you all play a big part in making sure that this happens.
Once again, happy Data Privacy Week. I would now be glad to take a few questions.