Remarks by the Privacy Commissioner to the Anti-Racism Ambassadors Network and the Interdepartmental Network for Diversity and Employment Equity

January 28, 2025
Gatineau, Quebec / Virtual

Address by Philippe Dufresne
Privacy Commissioner of Canada

(Check against delivery)


Thank you for inviting me to speak with you today. I am delighted to have an opportunity to address this audience, whose members work daily to protect the rights of equity-deserving Canadians, which is a priority for me and my Office.

I celebrate the dedication and commitment that you demonstrate by working to combat racism, and to advance equity, diversity, inclusion, and accessibility.

I appreciate that you are also interested in how your work intersects with privacy. Privacy protection is not just a set of technical rules and regulations. Rather, it is about preserving fundamental human rights and democratic values.

This week is Data Privacy Week. It is an opportunity to highlight the impact that technology is having on our privacy and to underscore the importance of valuing and protecting individuals’ fundamental right to privacy.

We are in a data-driven era in which personal information is being captured, used, and shared in unprecedented ways, transforming the landscape in which we live and work.

As individuals adopt and embrace new technologies, it is essential that we focus our efforts on empowering them to understand and make informed choices about how their personal information is used.

It is equally important that federal institutions and businesses are equipped with the knowledge and guidance they need to maintain strong data-management and privacy protections when they onboard new technologies.

This will resonate especially with communities that have historically been on the receiving end of bias, discrimination, and other harms.

Today I would like to talk about my approach to privacy, the impact that evolving technology has on historically underrepresented groups, the crucial role of trust, and how organizations can build that trust to promote privacy as well as equity, diversity, and inclusion.

Protection and promotion of rights

I will start off by telling you a little about myself.

In my professional life as a lawyer, my work has focused on strengthening Canada’s public institutions and protecting and promoting the fundamental rights of Canadians.

My previous roles, as Senior General Counsel at the Canadian Human Rights Commission, as Law Clerk and Parliamentary Counsel of the House of Commons, and as a legal officer with Global Affairs, each required me to promote and protect fundamental legal and constitutional norms while simultaneously achieving practical objectives and interests.

This meant rejecting the false choice of saying that you can have either human rights or national security, parliamentary privilege or health and safety, or the principle of sovereignty of nations or a system of international criminal law.

This approach also applies to my current role as Privacy Commissioner of Canada, where my mandate is to oversee compliance with our federal privacy laws and to promote and protect the privacy rights of individuals.

This is especially important in the current digital environment. The rapid advancements that we are seeing in technology are exciting. They offer tremendous potential for innovation and for improving the lives of Canadians.

Ensuring that we are able to use these innovations while protecting privacy is critical to our success as a free and democratic society, grounded in the protection and recognition of individual and collective rights.

During my confirmation hearings in 2022, I presented the three pillars of my vision for privacy to the House of Commons and Senate: That privacy is a fundamental right; that it supports the public interest and Canada’s innovation and competitiveness; and that it is an accelerator of Canadians’ trust in their institutions and in their participation as digital citizens.

These pillars reflect the reality that Canadians want to be able to fully participate as active and informed digital citizens without having to choose between this participation and their fundamental privacy rights.

Privacy as a fundamental right

In 2019, my Office and our international counterparts in the Global Privacy Assembly declared in a resolution that there is an “indispensable link between the protection of the right to privacy and a society’s commitment to promote and respect human rights and development.”

This description of privacy as a necessary condition for the existence of other fundamental rights is consistent with the Supreme Court of Canada’s long-standing interpretation of privacy law as having quasi-constitutional status.

Treating privacy as a fundamental right means treating it as we do other human rights: as a priority. In a Joint Statement on Privacy and Democratic Rights with the United Nations Special Rapporteur on the Right to Privacy, we concluded that effective data protection and meaningful privacy rights specifically support democratic ideals, processes, participation, and debate.

This means that privacy must be legally protected and promoted with a strong, fair and enforceable legal and rights-based regime. There must be meaningful remedies that prevent and address violations and that will act as an incentive for organizations to create a culture of privacy, where privacy is considered, valued, and prioritized.

It also means including and embedding privacy at the outset of innovation, not as an afterthought or regulatory irritant.

The second element of my vision is a recognition of the positive impacts of privacy and a rejection of the false choice between privacy and the public interest or innovation.

My third pillar, that privacy is an accelerator of Canadians’ trust in their institutions, needs to be a consideration when it comes to equity, diversity, and inclusion activities. Sharing personal information with government institutions is an exercise in trust, particularly for historically marginalized groups. When individuals trust that their privacy will be protected, they may participate more freely in the digital environment.

Strategic priorities

My commitment to working toward a future where innovation can flourish and fundamental privacy rights are upheld is contained in the strategic plan that I launched a year ago to guide the work of the Office of the Privacy Commissioner of Canada (OPC). Building on my vision, it established three strategic priorities to guide our work through 2027. They are:

  • Protecting and promoting privacy with maximum impact by using business intelligence to identify trends that need attention, producing focused guidance and outreach, leveraging strategic partnerships, and streamlining our resources and processes to optimize service delivery;
  • Addressing and advocating for privacy in this time of technological change, with a focus on artificial intelligence and generative AI, the proliferation of which brings both potential benefits and increased risks to privacy; and
  • Championing children’s privacy rights to ensure that their unique privacy needs are met, and that they can exercise their rights.

Across these priorities is a commitment to collaboration: working with domestic and international privacy regulators, government, civil society, academics, and industry, as well as with other regulators across jurisdictions.

The priority with perhaps the most direct impact and immediate relevance for the work that you do is addressing and advocating for privacy in this time of technological change.

Technology offers many benefits for government, including improving the delivery and accessibility of services, creating operational efficiencies, and advancing the analysis and use of data to support decision making and, ultimately, the public interest and Canada’s institutions.

As federal organizations continue to leverage technology and explore the use of AI, collaboration with privacy professionals on initiatives involving the collection, use, and retention of personal information will only become more important.

Regulators, including the OPC, are working hard to keep pace with emerging and rapidly evolving technologies and to identify best practices for regulating them.

Much of the OPC’s work in this area has stressed the need for developers and providers of generative AI to embed privacy in the design, conception, operation, and management of new products and services.

In December 2023, I released principles for responsible and trustworthy generative AI, which my Office developed jointly with my provincial and territorial counterparts. These principles were reiterated at the G7 Data Protection and Privacy Authorities Roundtable last October, which issued statements on the role of data protection authorities in fostering trustworthy AI and on child-appropriate AI. Over the last year, the OPC has collaborated on similar initiatives with the Global Privacy Assembly and, domestically, with our provincial and territorial counterparts.

The principles spell out the need for developers and organizations deploying generative AI to protect children and groups that have historically experienced discrimination or bias.

They state that developers, providers, and organizations must actively work to ensure that generative AI systems are fair, and that they do not replicate or amplify biases, or introduce new biases. If they fail to do so, the use of generative AI models and applications may be more likely to result in discriminatory outcomes based on race, gender, sexual orientation, disability, or other protected characteristics.

This is, in part, why I stress the need for organizations that use AI to be transparent about it and accountable for any AI-generated decisions about individuals, such as whether to grant someone immigration status or employment insurance benefits. Such bias has also been referenced, for example, in the context of government staffing and recruitment.

This is a particular risk where AI is used in contexts such as health care, employment, education, policing, immigration, criminal justice, housing, or access to finance.

This is an issue of some concern because we expect to see an explosion in the use of AI over the coming years – we see predictions that by 2026, 90% of all online content could be at least partially generated by AI.

In other words, AI could soon be involved in almost everything that we touch online.

I expect in the coming months to release the findings of my joint investigation with my counterparts in Quebec, British Columbia, and Alberta, into OpenAI, the company behind ChatGPT. The results of this investigation will help inform our recommendations to both the public and private sectors with respect to the use of the technology.

When I discuss AI governance with my domestic and international partners, we talk about ensuring that this technology is designed and built on a solid foundation, and adaptable over time.

We call this privacy by design.

Privacy by design

Privacy by design is intertwined with human-centered design: it contributes to users’ perceptions of care, competence, and integrity, which are key drivers of trust in government services.

Privacy by design means building in privacy concepts at the beginning of an initiative, making them part of the foundation of what you plan to do.

For example, deciding at the beginning to limit the collection of personal information to just what is needed for a new program, defining a retention period, and planning how the data will be protected are all ways to build in privacy principles at the outset and reduce the risk of later privacy issues.
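To make this concrete, here is a minimal sketch in Python of what such up-front decisions might look like; the program, field names, and retention period are entirely hypothetical, and a real initiative would enforce these rules through policy and governance as well as code.

  from datetime import date, timedelta

  # Hypothetical illustration only: a program-level privacy specification,
  # written down before any data is collected. All field names are invented.
  PROGRAM_SPEC = {
      "allowed_fields": {"applicant_id", "postal_region", "self_identification"},
      "retention_days": 730,           # retention period decided at the outset
      "storage": "encrypted-at-rest",  # protection planned before launch
  }

  def intake(submission: dict) -> dict:
      """Keep only the fields the program needs, and stamp a deletion date."""
      record = {k: v for k, v in submission.items()
                if k in PROGRAM_SPEC["allowed_fields"]}
      record["delete_after"] = date.today() + timedelta(days=PROGRAM_SPEC["retention_days"])
      return record

  # A submission containing an unneeded field (date of birth) is trimmed on intake.
  print(intake({"applicant_id": "A-123", "postal_region": "K1A",
                "date_of_birth": "1990-01-01"}))

The point is that the choices about what to collect, how long to keep it, and how to protect it are made explicitly before the first record arrives, rather than retrofitted after a privacy issue surfaces.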

Your organization’s privacy officer and my Office are resources that can be leveraged to advance privacy protective approaches to data management.

Privacy impact assessments, or PIAs, are another important risk mitigation tool. They can help institutions demonstrate that they are accountable for the personal information under their control, ensure that they are complying with the law, and limit their risk of privacy breaches. They should be completed prior to launching a new initiative, ideally in the early stages, to ensure that privacy protections are baked in from the start.

Last fall the Treasury Board Secretariat updated its policy instruments for PIAs.

Among other things, the new standard emphasizes privacy by design and introduces a checklist that must be completed before initiating a privacy impact assessment, to help document whether a PIA is necessary.

Privacy and EDI

I would like to focus now on “data for equity,” a term that encompasses a general trend that we are seeing in Canada and around the world. There are important things to consider when collecting data for equity purposes.

In the 2020 Speech from the Throne, the Government of Canada identified the need for disaggregated data to address systemic racism and capture the lived experiences of racialized Canadians and Indigenous Peoples. This was also embedded in the 2020 Call to Action on Anti-Racism, Equity and Inclusion in the Federal Public Service by the former Clerk of the Privy Council, who challenged federal government leaders to take an active role in ending all forms of discrimination and oppression.

Many government institutions collect and use personal information to help address social inequities in service and product delivery. Such initiatives take different forms, including Gender-Based Analysis Plus collection, race-based data collection in law enforcement, employment equity data collection, and voluntary self-identification questionnaires.

Data can be used to reduce the risk of bias, or to demonstrate accountability more generally. Data can also inform operational decision-making, whether for resource allocation or program performance management.

Because data for equity is so important, it is essential that those undertaking collection first consider privacy risks, in order to avoid further marginalizing equity-deserving groups and negatively affecting trust.

Certain types of personal information are considered sensitive because of the specific risks associated with the collection, use or disclosure of these categories of information, such as health and financial data, ethnic and racial origins, political opinions, genetic and biometric data, information on an individual’s sex life or sexual orientation, and religious or philosophical beliefs.

An important way to collect data on equity-deserving groups is through self-identification. Organizations can only gather this data if members of these groups voluntarily provide it.

To successfully obtain identity data that is complete enough to be effective, organizations must build and maintain trust. Demonstrating due diligence and care for privacy is an important element of trust-building.

To that end, when an organization sets out to collect personal information, it must be able to clearly explain why the information it intends to collect is necessary, and to show that it has the legal authority to collect that information.

Once this is done, any privacy risks must be identified and mitigated.

For example, a privacy impact assessment will be necessary if an institution offers grants to a specific equity-deserving group, because decisions about who receives funding directly affect individuals.

In many cases, entities collect identity information for more general, non-administrative, purposes, including using such information in aggregate for research, statistical, audit or evaluation purposes.

Even for these uses, institutions must still ensure that any privacy risks are identified and mitigated.

There are a number of other privacy principles that should also be considered, beginning with efficacy and limiting collection.

Efficacy and limiting collection

Ensuring that you have high-quality and relevant data is key to the effectiveness of equity initiatives. On the federal public sector side, two key instruments of law and policy underscore this point:

The Privacy Act states that government institutions must take all reasonable steps to ensure that any personal information used to make administrative decisions (that is, decisions that directly affect an individual) is as accurate, up-to-date, and complete as possible.

Therefore, if the information is to be used for administrative purposes, institutions should implement measures to verify, validate, and update the information they use.

The TBS Directive on Privacy Practices says that institutions must show that the personal information they collect is directly related to and necessary for the program.

You must also regularly assess whether these collection measures contribute to evidence-based policy making. If the data is not meeting its intended purpose, collection should be stopped.

Other privacy principles that should be considered before institutions collect, use, or analyze personal information for equity initiatives are:

  1. Necessity: Is the information essential for the purpose?
  2. Proportionality: Is the amount of information collected proportionate to the need?
  3. Minimal Intrusiveness: Is the collection method the least intrusive option?
  4. Effectiveness: Will the information effectively contribute to the program’s goals?

When the goal is to reduce societal inequities, the data required to measure efficacy can be vast. In many cases of data for equity, institutions “don’t know what they don’t know,” potentially leaving the door wide open for over-collection.

Limiting collection involves making choices about the breadth of personal information collected; it also means being deliberate about the depth of that information.

For example, an organization may determine that it has the legal authority and the need to collect information about whether a person is Indigenous, but it may not need, or have the legal authority, to ask for further information about membership in more specific Indigenous communities.

De-identification

One way to enhance privacy protection is to apply effective de-identification measures.

De-identification is the process of removing or transforming information in a dataset to make it less identifiable. There are a variety of approaches, algorithms, and tools that can be used to de-identify and anonymize different kinds of data with varying levels of effectiveness.

In terms of de-identification, the OPC recommends that organizations consider:

  • Suppressing, masking or redacting direct identifiers such as names or social insurance numbers in situations where they have no analytical value;
  • Determining the appropriate minimum group size when releasing data. A minimum group size of 10 is often cited as a best practice for public releases of less sensitive data, while a minimum group size of 20 is cited for more sensitive data (a simple check of this kind is sketched after this list); and
  • Being transparent about de-identification, considering the risks of re-identification, and mitigating those risks through a privacy impact assessment.
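To illustrate the minimum group size recommendation, here is a minimal sketch in Python; the record layout and field names are hypothetical, and small-cell suppression of this kind is only one element of a real de-identification strategy:

  from collections import Counter

  # Hypothetical illustration only: small-cell suppression before a public release.
  # "Quasi-identifiers" are the fields whose combination defines a group in the
  # released table; the field names here are invented for the example.
  MIN_GROUP_SIZE = 10            # often cited for less sensitive public releases
  MIN_GROUP_SIZE_SENSITIVE = 20  # often cited for more sensitive data

  def suppress_small_groups(records, quasi_identifiers, min_group_size):
      """Aggregate records into group counts, withholding groups below the threshold."""
      counts = Counter(tuple(record[field] for field in quasi_identifiers)
                       for record in records)
      released = {group: n for group, n in counts.items() if n >= min_group_size}
      suppressed = len(counts) - len(released)  # small cells are withheld, not published
      return released, suppressed

  # Example: 12 respondents in one group clear the threshold; 3 in another do not.
  records = ([{"region": "NCR", "self_id": "Group A"}] * 12
             + [{"region": "NCR", "self_id": "Group B"}] * 3)
  released, suppressed = suppress_small_groups(records, ["region", "self_id"], MIN_GROUP_SIZE)
  print(released)    # {('NCR', 'Group A'): 12}
  print(suppressed)  # 1

Direct identifiers such as names would already have been suppressed or masked before any such aggregation, in line with the first recommendation above.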

When individuals have confidence in an institution’s data handling practices, they are more likely to share personal information. By effectively de-identifying data, organizations can maintain its utility and relevance while also encouraging greater participation, ultimately improving the efficacy of equity initiatives.

Conclusion

Privacy is an essential element of trust, which is crucial to the success of equity, diversity, and inclusion initiatives.

The work of the OPC intersects with your work at the level of fundamental human rights.

When the Clerk of the Privy Council released his Call to Action on Anti-Racism, Equity, and Inclusion in the Federal Public Service four years ago, he said, “Being a leader means taking an active role in ending all forms of discrimination and oppression, consciously and constantly challenging our own biases, and creating an environment in which our employees feel empowered and safe to speak up when they witness barriers to equity and inclusion.”

My Office can provide advice and guidance to support organizations in their efforts to respect privacy as they build a culture of equity, diversity, inclusion, and accessibility.

We can work together to ensure that individuals in all segments of our population, and particularly those in equity-deserving groups, feel empowered to make informed privacy decisions.

As champions for equity, you can advocate for privacy by encouraging your organizations to consult with my Office before collecting and using personal information to identify and eliminate barriers.

Thank you again for inviting me here today. I believe that we now have time for questions.
