Contributions Program projects underway
On August 28, 2023, the Office of the Privacy Commissioner of Canada (OPC) announced funding for a new round of independent research and knowledge translation projects under its Contributions Program. These projects will be completed by March 31, 2024. Once the projects are completed and reviewed, the OPC will post a summary of each, along with links to its outcomes.
2023-24 Contributions Program funding recipients
Organization: York University (Ontario)
Amount awarded: $50,000
Project leader: Jonathan Obar
Organization: Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic – CIPPIC (Ontario)
Project title: Making Privacy More than a Virtual Reality: The Challenges of Extending Canadian Privacy Law to Extended Reality
Amount awarded: $49,450
Project leader: Vivek Krishnamurthy
Extended reality (“XR”) technologies – which include augmented, mixed, and virtual reality – promise to revolutionize many aspects of our lives. Yet the nature of these technologies’ data collection has profound implications for personal privacy. This project will begin with a survey of current XR technologies and anticipated developments to contextualize the privacy concerns surrounding XR hardware. Following the survey, the researchers will undertake a comprehensive study evaluating whether PIPEDA and the proposed CPPA are up to the task of protecting Canadians’ privacy in XR environments. The researchers will explore whether alternative approaches may be required to protect the privacy of Canadians in immersive environments, offering suggestions for how Canadian privacy laws could address the challenges posed by XR technologies.
Organization: University of Ottawa (Ontario)
Project title: Benchmarking Differential Privacy and Existing Anonymization or Deidentification Guidance
Amount awarded: $47,370
Project leader: Rafal Kulik
Government and private industry, including official statistics organizations and health institutions, collect information from individuals and publish aggregate data to serve the public interest. Organizations have long collected this information under a promise of confidentiality, on the understanding that it will be used for statistical purposes only and that no released or shared information can be traced back to a specific individual. Differential privacy provides a means of limiting the information that is released so that an individual’s contribution remains hidden in a statistical release of a single query (or a small number of queries).
Recently there has been a significant push to establish differential privacy as a standard in emerging AI technologies. Though the technique is increasingly used by tech companies and government agencies, challenges remain before it can be fully adopted for deidentification and anonymization. This project will help develop a framework for implementing differential privacy in practice, as well as a decision-making protocol for choosing between differential privacy, other privacy technologies, and current guidance.
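The idea of hiding an individual’s contribution to a single query can be illustrated with the classic Laplace mechanism: a counting query changes by at most 1 when one person is added or removed (sensitivity 1), so adding Laplace noise with scale 1/ε yields an ε-differentially-private release. The sketch below is purely illustrative (the function names are ours, not the project’s), and a production system would use a vetted library rather than hand-rolled noise:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling (Laplace is symmetric about 0)."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_count(values, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so Laplace noise with
    scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means stronger privacy but noisier answers; answering many queries consumes a cumulative “privacy budget,” which is one of the practical challenges the project refers to.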
Organization: Queen's University (Ontario)
Project title: Large Language Models and the Disappearing Private Sphere
Amount awarded: $50,000
Project leader: Catherine Stinson
This project will examine possible futures for large language models (LLMs) and privacy in the private sector in the age of immersive and embeddable technologies. The project will produce a report, guidelines for institutional review boards (IRBs), and a public website to increase knowledge and understanding about the actual and potential future implications of LLMs and the collection of data used to train them. The project will also examine the differential effects of LLMs on the privacy of marginalized Canadians and members of minority language groups.
The researchers will focus on five questions: 1. What is the de facto status of web scraping in Canada, according to IRBs? 2. How much data about individuals can be retrieved from LLMs? 3. Are marginalized groups and minority language groups more susceptible to privacy leakage from LLMs? 4. Are Canada’s privacy laws and regulations capable of dealing with these models and their privacy implications? 5. What changes to law and regulation might be needed?
Organization: University of Western Ontario (Ontario)
Project title: Identifying and Responding to Privacy Dark Patterns
Amount awarded: $49,717
Project leader: Jacquelyn Burkell
The aim of this project is to help minimize the impact of privacy dark patterns on Canadian youth by informing the development of effective regulatory frameworks and educational materials that help users resist these tactics. Privacy dark patterns are interface design strategies intended to “nudge” users into revealing personal information, either directly or by enabling (or failing to disable) privacy-invasive platform or profile settings. Teens are especially vulnerable to the effects of dark patterns on privacy choices, both as avid users of the Internet and social media and because of their limited awareness of commercial surveillance online.
The researchers will conduct focus groups with teen users of social media sites to determine whether they are able to identify privacy dark patterns and how they respond to these strategies. The researchers will also review current regulatory responses to privacy dark patterns, identifying both the variety of approaches and challenges to effectiveness. The results of this research will inform the development of educational materials in collaboration with MediaSmarts that teach teens how to resist privacy dark patterns on social media.
Organization: Option consommateurs (Quebec)
Project title: In the Matrix – Consumer Privacy in the Metaverse
Amount awarded: $49,758
Project leader: Alexandre Plourde
The advent of the metaverse—a three-dimensional virtual reality where users can interact with others—poses privacy risks for consumers. The metaverse’s immersive capabilities allow for the unprecedented collection of personal information. It is conceivable that the analysis of data generated in the metaverse could infer thoughts, emotions or other sensitive information about consumers. As a result, the scope of the metaverse’s data-collection abilities raises multiple issues related to the legal framework for personal information protection.
In this research project, Option consommateurs will outline the various metaverse models Canadians have access to, as well as those that could emerge in the coming years. Option consommateurs will also analyze the privacy policies, user agreements and informational content of a representative sample of three types of companies in the metaverse environment: metaverse developers, companies that are doing business in the metaverse and companies that create the devices used to access the metaverse. Lastly, Option consommateurs will look at what legislation applies to personal information protection in these new environments—in Canada and abroad.
Organization: The Centre for International Governance Innovation – CIGI (Ontario)
Project title: Hacking the Human Mind: Lessons for Canada’s Democracy
Amount awarded: $30,000
Project leader: Aaron Shull
Companies are now deploying technological, psychological, and sociological methods to get inside the minds of users, collecting data on millions of people, many of whom may not be aware of it. This approach embodies a form of behavioural manipulation that threatens the right to freedom of thought and opinion and invades one’s mental privacy. Given this new reality, there is an urgent need to implement strategies to protect Canadians’ autonomy.
This research project will draw on diverse subject matter perspectives to prepare an explainer video and policy brief for the Canadian public, exploring questions such as: How does privacy protect our inner freedom? In our data-driven world, where do we draw the line between legitimate influence and unlawful manipulation of thought? How are challenges to the freedom of thought present in the Canadian information ecosystem context? What are the risks to privacy and cognitive freedom posed by advances in neurotechnology? How can immersive and embeddable technologies enable individuals to thrive while protecting their privacy? How can we chart a path to effective protection? What should privacy legislation and policies address in response to the risks and challenges that arise from technologies?
Organization: Law Commission of Ontario (Ontario)
Project title: Privacy and Human Rights Impact Assessment in Canadian AI Systems
Amount awarded: $39,600
Project leader: Nye Thomas
Notwithstanding AI’s potential, private sector use of AI is often controversial. There are many examples of private-sector AI systems that have violated privacy protections or proven to be biased or discriminatory. Privacy and human rights compliance are the foundations of trustworthy AI: Canadian AI systems must comply with both to ensure Canadians can trust them and to unlock the extraordinary economic potential of this technology. To date, initiatives that promote privacy and human rights in Canadian AI systems appear to have developed on distinct tracks.
This project will produce a comprehensive report discussing the relationship between privacy and human rights in AI governance and regulation, and identifying law and policy reform recommendations. It will also consider whether an integrated “Trustworthy AI” impact assessment tool addressing both privacy and human rights is desirable, achievable, and practical.
Organization: University of Waterloo (Ontario)
Project title: A Pan-Canadian Data Governance Framework for Health Synthetic Data
Amount awarded: $48,875
Project leader: Anindya Sen
Health data, especially electronic medical records, are often stored in disparate systems and formats, rendering integration and standardization difficult. Researchers and developers often depend on de-identified or aggregated data to test theories, data models, algorithms, or prototype innovations, but it takes a substantial amount of time and resources to retrieve, aggregate, and deidentify relevant data before it can be used. One proposed approach to these challenges is the creation of realistic, high-quality synthetic health datasets that capture as many of the complexities of the original datasets as possible, but do not include any real patient data.
However, compared to other countries, Canada’s regulations give health synthetic data no explicit treatment. In addition, there is no universal framework to govern health synthetic data and assess its impacts. This project aims to develop a data governance framework for health synthetic data, ethical guidelines for research involving such data, recommended policy changes, and a cost impacts framework.
Organization: University of Calgary (Alberta)
Project title: Mitigating Race, Gender and Privacy Impacts of AI Facial Recognition Technology
Amount awarded: $49,772
Project leader: Gideon Christian
The increasing use of Artificial Intelligence Facial Recognition Technology (AI-FRT) in the private and public sectors has been plagued by issues of privacy as well as racial and gender bias, as the technology regularly misidentifies or fails to identify individuals of a particular gender or race.
This research project seeks to examine and identify the race, gender and privacy issues related mainly to the development of AI-FRT by the private sector in Canada, and to its use by both the private and public sectors. Other objectives include: developing a framework and guidelines to address race, gender and privacy impacts arising from the development and deployment of AI-FRT by private sector developers in Canada; identifying possible reforms to the Personal Information Protection and Electronic Documents Act (PIPEDA) to legislatively address those impacts; collaborating with the Alberta Civil Liberties Research Center (ACLRC) to increase public understanding and awareness of the race, gender and privacy impacts of AI-FRT through webinars, workshops and the publication of research papers; and strengthening research capacity in academia by training graduate students to research race, gender and privacy issues related to AI-FRT.
Organization: Concordia University (Quebec)
Project title: Privacy Analysis of Virtual Reality/Augmented Reality Online Shopping Applications
Amount awarded: $35,454.50
Project leaders: Mohammad Mannan (principal investigator) and Amr Youssef
This project will investigate the ecosystem of virtual reality/augmented reality (VR/AR) e-commerce, retail apps and websites, including virtual try-on, virtual makeup and beauty apps or websites, and the technological tools that are used by developers of these systems. The researchers will design and implement a privacy and security analysis framework to find and analyze these apps and websites as well as a selected set of their software development tools and libraries.
The researchers will produce a public report summarizing the findings of their investigation and presenting recommendations for improving the security and privacy of VR/AR systems. The report will also include some easy-to-follow guidelines for Canadian shoppers who use these apps. The findings of this report will be provided online for free. The researchers will also produce a technical paper detailing their full methodology and results, as well as their technical recommendations.
2022-23 Contributions Program funding recipients
Organization: York University
Project title: Privacy Evaluation of Virtual Classrooms
Amount awarded: $49,679.00
Project leader: Yan Shvartzshnaider
The COVID-19 pandemic has forced universities to transition to online platforms, which has exposed them to greater privacy challenges and threats. What makes the situation more complex is that the information handling practices of these platforms often go beyond the educational context. This project seeks to understand the privacy implications of using online platforms for educational purposes.
Specifically, the project will measure the extent to which the functionalities and information-handling practices of these platforms align with privacy regulations, users' expectations, and ethical concerns. The project will also explore how the pandemic has changed established norms as remote learning becomes more pervasive, focusing on technological practices such as facial recognition and location-based tracking used to monitor student attendance or in-class attention. At the end of the project, the research team hopes to provide informative guidance to stakeholders in designing effective privacy-preserving online systems.
Organization: University of Guelph
Project title: Securing Privacy: Examining the Tension Between Push and Pull of Cybersecurity Adoption
Amount awarded: $49,450.00
Project leader: Davar Rezania
This research project seeks to examine the privacy protection practices of small and medium-sized enterprises (SMEs). Specifically, it looks at how the interaction of external factors (for example, government policies, technological advances and market forces) and internal factors (such as an organization's characteristics, privacy practices and reputation) affects cybersecurity adoption within those organizations. The researchers hope that identifying the factors that contribute to an SME's decision to adopt cybersecurity measures can form the basis of recommendations for future amendments to the Personal Information Protection and Electronic Documents Act (PIPEDA).
Organization: York University
Project title: Deconstructing/Performing the Amazon Ring Security Apparatus
Amount awarded: $27,020.00
Project leader: Evan Light
This project will examine how the Amazon Ring home security system watches and tracks people, where the data generated by this surveillance goes, and the laws and policies that govern these interactions. A key component of the project is the construction of an Amazon Ring show home within an exhibition space at York University. Within and outside the home, a number of technical components will perform and display a real-time analysis of what is happening behind the scenes.
Organization: Pinnguaq Association
Project title: Privacy, AI, and Machine Learning Through a Rural, Remote, and Indigenous Lens: A Resource and Toolkit
Amount awarded: $50,000.00
Project leader: Ryan Oliver
Who is impacted by the privacy and digital safety risks associated with artificial intelligence (AI) and machine learning in rural, remote and Indigenous communities across Canada? This project will explore that question and produce a free, responsive educator toolkit to be distributed across Canada that supports the assessment and mitigation of privacy risks, as well as an understanding of the barriers and inequalities for Canadians living in these communities.
Organization: First Nations Information Governance Centre (FNIGC)
Project title: First Nations Data Sovereignty and the Personal Information Protection and Electronic Documents Act (PIPEDA)
Amount awarded: $48,740.45
Project leader: Albert Armieri
This project intends to support First Nations’ awareness and understanding of the Personal Information Protection and Electronic Documents Act (PIPEDA). FNIGC will develop a plain language guide to the legislation with a focus on practical information for First Nations governments, organizations, and individuals responsible for its application. By enabling First Nations to better understand and apply the legislation, this project also seeks to help First Nations businesses to thrive.
Organization: Open North
Project title: The Intersectional Privacy Risks of Law Enforcement Influence and Involvement in Smart City Projects
Amount awarded: $50,000.00
Project leader: Merlin Chatwin
This project focuses on the link between the private and public sectors in the development of smart cities. Specifically, it will investigate how smart city projects in Canada have been influenced by law enforcement agencies, or have taken steps to accommodate law enforcement's interest in their projects. The project will also develop an intersectional analysis of the differential privacy harms caused by such influence. It will conclude with a final report, as well as a set of field guides and workshop templates, to help existing or aspiring smart city projects navigate law enforcement influence and make informed decisions about project design.
Organization: University of Regina
Project title: Public Perspectives on Facial Recognition Technology: Attitudes, Preferences, Hopes, and Concerns
Amount awarded: $49,450.00
Project leader: Justin Longo
This research proposes to test empirically what Canadians deem acceptable in the context of facial recognition technology (FRT) applications used by private sector actors. A survey of Canadian residents from all provinces and territories will gather information on attitudes towards FRT used in a variety of settings, focusing on safety, privacy, fairness, and discrimination concerns. The researchers hope their work will have implications for the adoption of FRT by the private sector and the development of legislation and regulation in response to its use.
Organization: Queen's University
Project title: Proof of Erasure: Secure Personal Data Deletion with Public Verifiability
Amount awarded: $50,000.00
Project leader: Jianbing Ni
Customers are increasingly asking service providers that maintain their digital information to delete their data. However, the current legislative landscape raises a number of questions, notably with regard to Canadians' right to erasure. This project aims to build knowledge and understanding of data deletion policies, and to explore effective approaches to secure data deletion and advanced designs for proof of erasure. In particular, the project will study the extent to which the right of data deletion is recognized by various privacy laws and regulations, such as the Personal Information Protection and Electronic Documents Act (PIPEDA), and analyze how that right is implemented on existing data platforms. It will also identify, through an online survey and a case study, the secure personal data deletion policies the public expects.
Organization: Centre de documentation sur l’éducation des adultes et la condition féminine (CDEACF)
Project title: Protecting victims’ privacy to prevent spousal homicide
Amount awarded: $50,000.00
Project leader: Lise Chovino
This project aims to help us understand and document the need for digital security and privacy protection when evaluating the safety net of women in second-stage housing (i.e. shelters for women who are at risk of being victims of spousal homicide after leaving an emergency shelter). It will allow us to evaluate security breaches due to digital service provider practices, gauge the level of safety felt by housed women with access to these services and better protect them from the impacts of information collection by developing adapted teaching tools.
Organization: Canadian National Institute for the Blind (CNIB)
Project title: Consent and Inclusion, Diversity, Equity and Accessibility
Amount awarded: $48,944.00
Project leader: Mahadeo Sukhai
This project will research the relationship between individual consent as it relates to personal information and the factors that make up a person’s identity. Specifically, the project team will seek to better understand the ways in which people who are blind in Canada understand, provide and conceptualize consent, while also considering other factors in the course of their research, such as mother tongue, ethnicity, race, educational background, additional disability, age, and employment. This will in turn allow the CNIB to shape its practices in ways that respect the concepts of inclusion, diversity, equity and accessibility. These findings and practices will be shared with other Canadian organizations in the hopes of shaping their practices for the better.
Organization: Concordia University
Project title: Privacy Analysis of Technologies Used in Intimate Partner Abuse
Amount awarded: $26,716.55
Project leader: Mohammad Mannan
Technology plays a major role in facilitating intimate partner violence (IPV), and invasion of privacy has become a significant form of IPV. Such violations can occur through monitoring a victim's movements using stalkerware and hidden Wi-Fi cameras, or by using drones to film or harass the victim without consent. Manipulating and distributing intimate images obtained through creepshots and deepfakes are other forms of such abuse. The aim of this project is to investigate the cyber-IPV ecosystem, including the technological tools used by abusers as well as the computer security tools and apps that can help victims. The researchers will produce a public report that summarizes their findings, presents recommendations for solutions, and includes guidelines for victims. The team will also produce a technical paper.