Protecting privacy in a digital age
Keynote remarks at Canadian Access and Privacy Association (CAPA) Annual Conference
November 27, 2023
Address by Philippe Dufresne
Privacy Commissioner of Canada
(Check against delivery)
Thank you for that kind introduction, and thanks also to the members of the Canadian Access and Privacy Association for the invitation to speak to you today. CAPA is an important stakeholder for my Office and it is an honour to be here with people from both the public and private sectors who play an increasingly critical role in protecting the privacy rights of Canadians.
It is great to see the different perspectives that you will be covering throughout this conference today – including hearing from esteemed journalist Jim Bronskill, my colleague Caroline Maynard, Information Commissioner of Canada, and this afternoon’s expert panels that are focussed on artificial intelligence, as well as key legal decisions that affect access to information and privacy.
This is an exciting – and challenging – time to be working in privacy. It is an era that is characterized by fast-paced information flows, massive amounts of data exchange and collection, and rapid advancement and adoption of generative artificial intelligence, to a point where technology has permeated into all aspects of our lives to an unprecedented degree.
If ever there was a time to prioritize privacy as a fundamental right, this is it.
As the people who champion privacy, you are key to creating a culture of privacy within your organizations. This will help ensure that the government can harness the power of new technologies to the benefit of the public, in a privacy-respectful way.
Take, for example, this building (the Shaw Centre), where we can draw inspiration from how a goal – in this instance, environmental sustainability – can be integrated into so many elements of a design.
In this case, the goal, environmental sustainability, was determined well before construction started. Every step that followed had that goal in mind. A cistern below the building holds rainwater from the roof, which is used to flush the toilets. The roof trusses are made of recycled steel. Even the art is environmentally sustainable – the piece you may have noticed on the first floor as you came in – called the Wall of Three Rivers – uses logs reclaimed from the bottom of the Ottawa River.
When I talk about privacy by design, I have a similar vision: just as this building’s award-winning architect embedded a broader objective into every element of the design, privacy can be built in from the outset in a way that enhances, rather than limits, creativity and innovation.
When I took on this job 18 months ago, I laid out three pillars of my vision for privacy. They are:
- Privacy is a fundamental right;
- Privacy supports the public interest and Canada’s innovation and competitiveness; and
- Privacy accelerates the trust that Canadians have in their institutions and in their participation as digital citizens.
I am sure this resonates with this audience of privacy champions.
These pillars reflect the reality that Canadians want to be active and informed digital citizens, and should not have to choose between this participation and their fundamental right to privacy.
Through my work and interactions over the past year, three strategic privacy priorities have crystallized:
- Addressing the privacy impacts of new technologies, including generative AI;
- Protecting the privacy of young people; and
- Maximizing the impact of my Office in fully and effectively promoting and protecting the fundamental right to privacy.
Today I would like to discuss each of these priorities, and how they relate to your work and the work of my Office.
I will also share an update on some of the activities that my Office has carried out in the past year, highlight key investigations, and talk about how we can work together to support Canadians’ fundamental right to privacy.
In this digital age, it seems that all aspects of our lives, from socializing online to shopping, up to and including our democratic rights and the rule of law, have privacy implications. Digital technology is also at the heart of government service delivery.
This is why addressing the privacy impacts of the fast-moving pace of technological advancement, especially in the world of artificial intelligence and generative AI, is one of my Office’s strategic priorities.
Many government departments are looking to leverage technology, such as digital identification or generative AI, to work smarter and better. An important part of my Office’s role is to help support you in ensuring that these innovations respect Canadians’ right to privacy.
Last year we received 110 privacy impact assessments and 73 advisory consultation requests from federal government institutions. We also conducted several outreach and capacity-building sessions for public servants.
These days when we talk about technological advances, I am sure ChatGPT and generative AI come to mind.
Fei-Fei Li, the co-founder of Stanford’s Institute for Human-Centered Artificial Intelligence, has been called the godmother of AI. She says that she views AI as a tool. And she says that “like other tools, our relationship with it is messy. Tools are invented by and large to deliver good but there are unintended consequences, and we have to understand and mitigate their risks well.”
The possibilities and challenges of generative AI demand a global response. Earlier this month, Canada was one of more than two dozen countries that signed on to the Bletchley Declaration at the AI Safety Summit in the U.K., committing to greater cooperation on ensuring that artificial intelligence is made and used responsibly.
My Office has been working with its domestic and international counterparts on the regulatory response to this game-changing technology.
Next week, we will host a privacy symposium as well as a meeting of the International Working Group on Data Protection in Technology, also known as the Berlin Group. Both events will focus on AI, and I am looking forward to what are sure to be productive and interesting conversations with leading experts in this domain and representatives from other data protection organizations from across the country and around the world.
This past summer, with my G7 Data Protection and Privacy Authorities colleagues, I issued a joint statement on generative AI, where we called on developers and providers to embed privacy in new technologies at the ground level, in the design and conception of these new products and services.
AI was also the subject of resolutions that were passed by my DPA colleagues all over the world during the Global Privacy Assembly meetings last month. We jointly urged developers and providers of AI to recognize data protection as a fundamental right and called for the creation of responsible and trustworthy generative AI technologies.
In May, I announced that we had launched a joint investigation with three provincial counterparts into OpenAI, the company behind ChatGPT, to determine whether its practices comply with Canadian privacy laws. While our privacy laws need to be modernized, they do currently apply in this space, and I am committed to their implementation.
That investigation is ongoing, and we are continuing to monitor these and other new technologies so that we can anticipate how they may impact privacy, recommend best practices to ensure compliance with privacy laws and promote the use of privacy-enhancing technologies.
In September, the Treasury Board Secretariat released a guide for the use of generative AI in the public service. The guide urged a balanced approach: federal institutions should explore how generative AI tools could support their operations, while also assessing and mitigating risks and restricting their use to activities where those risks can be managed effectively.
For those of you in the federal public service, if your department is considering adopting generative AI tools – or any other new technology or approach, such as public-private partnerships – I encourage you to work with my Office to identify and resolve any privacy concerns. Privacy impact assessments are an important tool for protecting Canadians’ privacy rights.
Another issue related to technology and privacy is the growing use of biometrics, such as facial recognition and genetic information. In October my Office released two draft biometrics guidance documents for consultation, one for the private sector and one for the public sector. We are seeking input to ensure that organizations use these technologies in a privacy-protective way.
I invite you to take the time to review the document that is relevant to your sector, and provide your valuable input before the January 12 deadline.
The online world brings with it a host of possibilities for connection and creativity, but it also carries potential for significant harm, especially for young people, which is why protecting the privacy of this group is another of my top priorities.
Young people have a right to privacy both on- and off-line, but growing up in the digital age presents significant new challenges for their privacy.
As children and youth embrace new technologies and experience much of their lives online, we need strong safeguards to protect their personal information and the ways in which it may be collected, used, and disclosed. Increasingly, their information is being used to create personalized content and advertising profiles that are ultimately aimed at influencing their behaviours.
Children have a right to be children, and to do so safely, even in the digital world.
As UNICEF notes in its policy guidance on artificial intelligence for children, young people are affected by digital technologies to a greater extent than adults. They may also face greater long-term implications of consenting to their data collection.
Earlier this year, I launched a joint investigation with my counterparts in Québec, Alberta and British Columbia into the privacy practices of TikTok, an app that is particularly popular among young people. I expect that our findings will be informative for many organizations that collect and handle children’s sensitive personal information.
At our annual meeting of federal, provincial and territorial privacy regulators last month, my colleagues and I issued a joint resolution calling on organizations in both the private and public sectors to put the best interests of young people first when dealing with that group’s sensitive information. There are many ways they can do this, for example, by:
- providing privacy tools and consent mechanisms that are appropriate for young people;
- rejecting the kind of deceptive practices that influence young people to make poor privacy decisions or to engage in harmful behaviours; and
- allowing for the deletion and de-indexing of information that was collected when users were children – something that I am happy to see was included in Bill C-27.
It is critical that government and private-sector organizations take action to ensure that young people can benefit from technology and be active online without the risk of being targeted, manipulated, or harmed as a result. The joint resolution can be found on our OPC website.
My Office will also soon release a Privacy Act Bulletin for the public service with information and advice about how to proactively protect children’s privacy.
I would encourage you to read these documents, and to consider what initiatives your organization has that should be further examined through the lens of the best interests of young people.
Maximizing the OPC’s impact
Addressing technological advances and protecting young people’s right to privacy are broad-based issues that require a broad-based, global and collaborative approach. They are an important part of my third strategic priority, which is to maximize the OPC’s impact in promoting and protecting the fundamental right to privacy.
One way we work toward the goal of protecting Canadians is by providing advice to Parliament on privacy law reform and other areas with an impact on privacy.
Undertaking important investigations and publicizing our findings are other ways that our work has an impact, as is providing advice and recommendations to government institutions and organizations in the private sector. We also have recourse to the courts when necessary.
This fall, I had the opportunity to present my views on Bill C-27, the Digital Charter Implementation Act, before a Parliamentary committee. The Bill includes the Consumer Privacy Protection Act, or CPPA, which would essentially replace PIPEDA.
It addresses concerns that were previously raised by my Office and others. For example, it requires that information used to obtain consent be in understandable language; it provides my Office with order-making powers; and it includes an expanded list of contraventions to which administrative monetary penalties may apply in cases of violations.
Overall, the Bill is a step in the right direction, but it can and must go further to protect fundamental privacy rights.
Last month, the Minister of Innovation, Science and Industry proposed certain amendments to the Bill that would address some of the 15 key recommendations that were put forward by my Office to strengthen the legislation. These include explicitly recognizing privacy as a fundamental right; strengthening privacy protections for young people; and providing more flexibility for my Office to use compliance agreements to correct privacy behaviours, including through the use of financial penalties. The Minister also stated that he would propose amendments allowing for greater cooperation between regulators.
Of course, Bill C-27 also introduces the Artificial Intelligence and Data Act, or AIDA. The adoption of AIDA could make Canada one of the first countries to regulate AI, which is important given the technology’s potential risks. Although AIDA does not specifically address privacy risks, the Consumer Privacy Protection Act would apply to the processing of personal information within AI systems, and I have recommended important ways to improve this.
Among them is a recommendation that organizations be required to conduct privacy impact assessments to ensure that privacy risks are identified and mitigated for high-risk activities. We know that privacy harms are one of the top three risks of AI, according to the G7 Digital Ministers.
Given the concerns around how AI systems reach decisions, as well as issues of fairness, accuracy, bias and discrimination, organizations should also be required to explain, on request, all predictions, recommendations, decisions and profiling made using automated decision systems. Such decisions can have a profound impact on individuals’ lives and Canadians should have the right to request an explanation if they find themselves on the receiving end of an automated decision.
Privacy Act reform is also a pressing need, as the legislation has not been significantly updated since it was passed 40 years ago. The government’s discussion paper on Privacy Act modernization, introduced in 2021, is still undergoing consultations. I also welcome the consultations on Indigenous perspectives that the Department of Justice launched last month.
Another important privacy consideration and one of the recurring themes in our investigative findings is the issue of consent. I would like to mention two investigations that were highlighted in our annual report that underscore the need for institutions and organizations to obtain meaningful consent before collecting and using personal information.
One involved Canada Post’s Smartmail Marketing Program.
Our investigation revealed that Canada Post builds marketing lists with information gleaned from the envelopes and packages that it delivers to homes across Canada. It then makes these lists available to advertisers for a fee.
This contravened the Privacy Act because the postal service had been acting without the knowledge and consent of Canadians. We recommended that Canada Post stop its practice of using and disclosing personal information without first seeking authorization.
Canada Post committed to reviewing its data services program following the findings of our investigation. We look forward to hearing what measures it will propose to ensure that the privacy of Canadians is protected in accordance with the Privacy Act.
Another investigation involved the Canada Border Services Agency’s use of genetic genealogy. The CBSA obtained an individual’s consent to collect a DNA sample and sent it to the genetic genealogy company Family Tree DNA in an unsuccessful attempt to confirm his nationality in order to deport him. The complainant argued that this contravened his rights under the Privacy Act.
Our investigation found a number of issues, including that the CBSA did not provide him with key information, and therefore lacked valid authorization to collect information about him from Family Tree DNA.
Although it is a Treasury Board Secretariat policy to conduct a privacy impact assessment before undertaking this kind of activity, the agency did not do so. The CBSA has since put a moratorium on its use of genetic genealogy services, and has either closed the Family Tree DNA accounts or released them to the individuals.
Consent must be meaningful under Canada’s privacy legislation.
Canada Post asserted that individuals implicitly authorized it to collect information from their mail by accepting mail delivery from Canada Post in general, and by not using the opt-out feature on the Crown corporation’s website. We disagreed because, in our view, neither constituted authorization: most Canadians would not be aware of this practice, nor would they reasonably expect it to be taking place. The CBSA, for its part, did obtain consent, but that consent was not meaningful because the agency failed to disclose key information.
Obtaining meaningful consent is not just a legal obligation, it is also about trust: the results of my Office’s most recent survey of Canadians suggest that only 4 in 10 respondents felt that businesses protect their privacy. Meanwhile, not quite 6 in 10 believe that government will do so, down 5% from the previous poll in 2020.
These figures tell us that Canadians want and need to trust that their privacy rights and personal information are being protected so that they can feel confident about participating freely in the digital economy.
My Office has an important role to play in helping support organizations to innovate and operate in a privacy-protective manner that will generate trust.
I would now like to discuss breaches.
As you may know, last week I decided to launch investigations into a cyberattack that has resulted in a breach affecting the personal information of federal government personnel who used government-contracted relocation services over the past 24 years.
The investigation will assess compliance with the Privacy Act, the federal public sector privacy law, by Public Services and Procurement Canada and the Treasury Board Secretariat (TBS), the two government departments that contracted with the companies.
It will also assess compliance with the Personal Information Protection and Electronic Documents Act or PIPEDA, Canada’s federal private sector law, by Brookfield Global Relocation Services, a relocation management company, and Sirva, a household goods transportation company that contracted with government departments.
My intent is that this investigation will help us to understand why this happened, and also what must be done to remedy the situation and prevent such things from happening again.
My Office continues to be concerned about the possible under-reporting of breaches in the public sector. Last year, the number of breaches reported to my Office dropped by 36 per cent, to 298. Most of the reported breaches come from the same federal institutions every year. Many of the institutions that are subject to the Privacy Act and handle sensitive personal information have never reported a breach.
Only one of the public-sector breaches reported last year involved a cyberattack. That compares to 278 reported cyberattack breaches in the private sector. The Communications Security Establishment reported earlier this year that it blocks billions of cyber attempts against Government of Canada networks each day. It seems incredible that more government departments would not be affected.
One reason why breaches might be under-reported is uncertainty about whether a breach created what the law calls a “real risk of significant harm”.
My Office has created guidance to help organizations assess real risk of significant harm, which is a broad concept that includes: bodily harm, humiliation, damage to reputation or relationships, loss of employment, financial loss, identity theft, negative effects on credit record and damage or loss of property. In determining whether there is a real risk of significant harm, it is necessary to consider the degree of sensitivity of the personal information involved and the probability that the information has been, is being, or will be misused.
We have also developed a desktop app to guide risk assessments by asking a series of questions to help determine whether it is reasonable to believe that a privacy breach creates a risk of significant harm. It does not replace human judgment, but it does provide data to inform that judgment.
The first phase of the tool was launched internally in March 2022. In a recent pilot project, 20 organizations were given access to try the tool. The feedback has been very positive, and will inform a future public-facing version of the tool.
In the meantime, I encourage you to err on the side of caution, and report breaches to my Office even if you are unsure whether they qualify. Your organization will be able to benefit from our expertise in addressing and mitigating breaches involving personal information.
In closing, all of us in the privacy community have our work cut out for us, as technology continues to advance at an unparalleled pace. But the good news is that this is a committed community, full of expert, principled and dedicated professionals. So we will meet this challenge to protect the fundamental right to privacy, achieve public and private interests, and build the trust of Canadians.
Thank you for your attention and for everything that you do every day.