Appearance on Bill C-11 Issue Sheets
Scope of application
Key Messages
- During the Commons Heritage Committee study and the current Senate study, academics and industry posed questions about the scope of the law and its coverage of social media and user-generated content.
- A lack of clarity in the legislation – specifically, the exceptions to exclusions in the scope of application – creates legal ambiguity and uncertainty for companies and regulators.
- We recommend that [redacted] any personal information at issue be de-identified or anonymized.
Background
- Public perception – While certain domestic industries and associations remain largely supportive of C-11, both Members of Parliament and Senators noted high volumes of correspondence voicing concern that their online activities will be scrutinized by CRTC regulators. Opinions vary on whether this concern is justified, but the issue continues to lie at the crux of the privacy / freedom of expression debate.
- Government response – Response to date is that C-11 “does not apply to individual users of social media” and no “online creators will be regulated.”
- Clarification of exclusion – Section 4.1 excludes “programs that are uploaded to an online undertaking that provides a social media service by a user of the service for transmission over the Internet and reception by other users of the service”.
- Exception to exclusion – There is an exception to the exclusion at section 4.1(2) to allow the Act to apply to programs prescribed in regulation, and a new section 4.2 that imposes conditions.
Prepared by: PRPA
Collection and disclosure provisions
Key Messages
- The bill creates several powers for collection and disclosure of information by the CRTC to regulate online streaming services.
- Clause 10 also includes an order-making power to impose conditions on persons carrying on broadcasting undertakings, including conditions for provision of information to CRTC.
- The CRTC may disclose information to Statistics Canada, the Minister of Heritage, or the Competition Bureau, in certain cases.
- In many instances, we do not believe personal information will be at issue, given that C-11 stipulates under subparagraph 9.1(1)(o)(iv) that personally identifiable information is not to be included.
Background
- Provision of information to CRTC by online streaming services: Clause 10 allows the CRTC to order provision of information from broadcasting undertakings considered necessary for the administration of the Act, including information regarding audience measurement. Importantly, information that could identify any individual audience member is excluded (9.1(1)(o)(iv)).
- Disclosure from CRTC to Statistics Canada and Heritage: Clause 22 enacts three new provisions governing the disclosure of information by or under the direction of the CRTC. A new s. 25.1 requires the CRTC to provide the Chief Statistician and the Minister, on request, any information submitted to the CRTC.
- Disclosure to the Competition Bureau: Exceptions in new sections 25.3(4), (5) and (7) allow the CRTC, while protecting the privacy of Canadians, to disclose confidential information where it determines, after receiving submissions from interested parties on the potential disclosure, that the disclosure is in the public interest, as well as to the Commissioner of Competition where the information is relevant to competition issues.
- CRTC public disclosure: A new s. 25.2 provides that CRTC shall proactively make available for public inspection any information submitted to it during proceedings. This is subject to a new s. 25.3(1), which allows a person who submits information to CRTC to designate information as confidential.
Prepared by: PRPA
Algorithmic manipulation
Key Messages
- While Bill C-11 would restrict the CRTC from making orders requiring the use of a specific algorithm, its discoverability requirements enable the CRTC to manipulate the outcomes of those algorithms. The extent of these interventions will depend on how discoverability is defined and enforced.
- The inclusion of an explicit policy objective protecting the privacy of persons would have a meaningful impact on how the Act’s requirements are implemented in practice and would create the legal expectation that privacy-protective methods are favoured.
- Privacy risks would be reduced [redacted] if there was a requirement for personal information to be de-identified or anonymized.
- While Bill C-27 would impose obligations on automated decision-making, these apply only to decisions with “significant impacts”, which is unlikely to increase the transparency of recommendation algorithms.
Background
- Bill C-11 provisions for CRTC regulation of discoverability: s. 3(7)(q) of C-11 declares, as part of the broadcasting policy of Canada, that online undertakings should increase the discoverability of Canadian content, and s. 9.1(1)(e) provides the CRTC with order-making powers related to the presentation of programs and the discoverability of Canadian programs.
- Bill C-27 requirements on automated decision-making (ADM): the CPPA includes a requirement for organizations to explain, to individuals upon their request, predictions, recommendations or decisions that could have a “significant impact” (s. 63(3)).
Prepared by: PRPA
Potential right to opt-out of algorithmic recommendations
Key Messages
- Bill C-11 seeks to shape how content is individually targeted, in contrast with approaches taken in Europe, where the Digital Services Act will require large online platforms to provide an option to turn off recommendations based on profiling.
- A recent study funded by my Office on recommendation algorithms used in online streaming platforms found that more than 80% of Canadians felt their online activities are being watched most or all the time to deliver such recommendations.
- Much of this happens without sufficient transparency or meaningful consent, and so the Committee may wish to consider such an opt-out as part of Canada’s approach to digital regulation.
Background
- Background on Digital Services Act: The EU Parliament passed the Digital Services Act in July 2022, and it is expected to come into force in 2024.
- Article 29 of the Digital Services Act states “providers of very large online platforms that use recommender systems shall provide at least one option for each of their recommender systems which is not based on profiling”.
- Views of Canadians cited: The OPC-funded research project is entitled “Alter Algo” and was completed by the Internet Society – Quebec Chapter.
- Benefits of an opt-out: Under PIPEDA (and the CPPA), organizations must have a specific purpose for data collection. If individuals can opt out of profiling-based recommendations, the data that would have been collected for that purpose is no longer necessary and could not be collected for it, resulting in stronger privacy protection.
- An opt-out may bring other desirable impacts, as researchers and whistleblowers have pointed to broader issues with profiling-based recommendation systems on social media platforms, such as negative reinforcement and adverse effects on the mental health of young people.
Prepared by: PRPA
Section 8 of the Charter
Key Messages
- The Charter Statement for C-11 identified s.8 rights of “online undertakings” (e.g., streaming services, social media platforms) as potentially engaged, as opposed to rights of individual users.
- S.8 considerations may be engaged regarding CRTC’s collection of information from online undertakings to ensure compliance with the Act and Canadian broadcasting policy.
- Bill C-11’s information requirements (e.g., the CRTC’s authority to disclose information to the Chief Statistician of Canada, Minister of ISED, Competition Bureau and the public) are for an administrative purpose, not to determine penal liability.
- On that basis, the Statement finds C-11 consistent with the Charter.
Background
- Information provisions: The CRTC could require any broadcasting undertaking to provide any information that the CRTC considers necessary for the administration of the Act (s. 9.1(1)(o)). CRTC would have the authority to audit or examine records and books of accounts of any person carrying on a broadcasting undertaking (s. 10(8)(j)).
- CRTC processes: The CRTC typically makes orders or policy directions following hearings or other consultative processes. Broadcasters, industry organizations, consumer protection groups, academics and the public can make submissions. In the past, the OPC has made written submissions with respect to privacy-related matters under the Telecommunications Act and the Broadcasting Act.
- Penalties: The CRTC would have authority to issue administrative monetary penalties for violations of certain provisions of the Act (cl. 28, s. 34.4(1)), along with related information requirements that could require any person to provide information in their possession relevant to determining whether a violation had been committed. Violations would expressly not be considered offences under the Criminal Code (s. 34.991(2)).
- Parallel with CASL: In 2020, the FCA upheld the CASL regime, a regulatory scheme analogous to Bill C-11, as constitutional and consistent with s. 8 (the Compufinder case: 3510395 Canada Inc. v. Canada (Attorney General), 2020 FCA 103).
Prepared by: Legal Services
Engagement in Set-Top Box Working Group
Key Messages
- Between 2015 and 2020, the OPC held several meetings with the broadcast industry’s Set-Top Box Working Group. This group was directed by the CRTC to develop a set-top-box-based audience measurement system that included privacy protections.
- The group provided our Office with details on how it planned to implement its audience measurement system.
- As we were not provided with the relevant Privacy Impact Assessments, it was not possible for our Office to confirm the full range of data elements involved or the governance and technical measures in place.
- We wrote to the Chair of the Working Group in September 2020 and provided key privacy considerations, including those pertaining to de-identification practices, safeguards, and meaningful consent.
Background
- Origin of Working Group: In 2015, the CRTC mandated Canadian broadcasters to set up a Set-Top Box Working Group (STB WG). The STB WG, composed of private sector broadcasters, was tasked with addressing how to measure TV audience metrics to take advantage of innovation and provide aggregated TV metrics to smaller broadcasters.
- OPC consultation: The STB WG was required to meet with our Office and to address privacy in its implementation. Our Office had four meetings with the group between 2015 and 2020.
- [redacted]
- While the STB WG believed only de-identified data would be involved, we were not able to verify this. We sent a letter to the STB WG in September 2020 providing privacy advice to consider during implementation, namely on PIAs, de-identification, consent, openness and safeguards.
Prepared by: PRPA
Amendments to Canada’s Anti-Spam Legislation
Key Messages
- CASL does not apply to broadcasters. Clause 32 of C-11 amends section 5 of CASL to maintain this exclusion but clarifies that “online undertakings” are covered by CASL.
- C-11 introduces an exception (cl. 33) to CASL (s. 6) excluding electronic messages from an online undertaking if an individual has “expressly or implicitly consented” to a program’s transmission, and “the message is or forms part of that program or is sent in the course of the transmission”.
- As a result, it appears that CASL will apply only to electronic messages sent by online undertakings to individuals outside of such programming or streaming, and we do not expect these amendments to have an impact on OPC CASL investigations.
Background
- New exception: Clause 33 of C-11 introduces an exception to section 6 of CASL that excludes companies’ electronic messages to individuals, if:
- the person to whom the message is sent has expressly or implicitly consented to the transmission of a program, as defined in that subsection, from that online undertaking to an electronic address; and
- the message is or forms part of that program or is sent in the course of the transmission of that program to the electronic address to which that program is transmitted.
- CASL enforcement: The OPC, CRTC and Competition Bureau share enforcement duties under CASL and related provisions of PIPEDA and the Competition Act. The CRTC enforces organizations’ compliance with sections 6 to 9 of CASL. Section 6 addresses, among other things, the sending of electronic messages, the content of such messages and exceptions to the application of the section.
- OPC role: The OPC enforces CASL’s amendments to PIPEDA [subsections 7.1(2) and (3)]. These sections did not introduce new contraventions of PIPEDA. Rather, they clarify that where organizations harvest electronic addresses or use spyware or other illicit means to collect and use individuals’ personal information, they cannot rely upon consent exemptions under PIPEDA.
Prepared by: Compliance
Collaboration with the CRTC and the Competition Bureau
Key Messages
- At present, the OPC, CRTC and Competition Bureau can only share information and collaborate on enforcement matters set out in Canada’s Anti-Spam Legislation.
- We saw the impact of this limitation in OPC’s investigation of adult dating website Ashley Madison, where we were able to share information with the US Federal Trade Commission, but not with Canada’s own Competition Bureau.
- We welcome the addition under Bill C-27 of provisions enabling cooperation with the CRTC and the Competition Bureau beyond CASL.
Background
- Shared enforcement: The OPC, CRTC and Competition Bureau share responsibility for enforcing CASL and related amendments to PIPEDA and the Competition Act.
- Consultation and collaboration: Under sections 57 and 58 of CASL, our agencies must consult with each other, to the extent appropriate, and may share information obtained during, or otherwise related to, our respective CASL-related enforcement activities. In 2014, the OPC, CRTC and Competition Bureau entered into a CASL Memorandum of Understanding to formalize such collaboration.
- Joint work: Examples of mutual collaboration under CASL include:
- Regular meetings to discuss CASL enforcement matters of mutual interest.
- Common access to the Spam Reporting Centre, the repository of online reports from Canadians about spam and other electronic communications.
- The sharing of information in specific investigations, e.g., OPC PIPEDA Report of Findings #2016-003: investigation into the personal information handling practices of Compu-Finder (3510395 Canada Inc.)
- In November 2020, our agencies conducted a CASL awareness-raising campaign: a joint letter to 36 mobile app publishers operating in Canada.
- Improving collaboration: While acknowledging improvements in C-27, in the past we have recommended clarifying provisions for cooperation between our organizations on investigations, inquiries and other formal compliance matters.
Prepared by: Compliance
Privacy Protective Measures / Anonymization
Key Messages
- Effective anonymization techniques exist for aggregate statistics, and this represents a well-established privacy-protective measure.
- If anonymized aggregate statistics are provided to the CRTC, the privacy of individuals would be protected.
- Importantly, the bill excludes any identifiable information about individual audience members from what streaming companies may be required to share with the CRTC.
- In effect, this means that audience metrics provided by broadcasters to the CRTC must first be anonymized.
Background
- Anonymization of audience measurement information: Clause 10 amends the Broadcasting Act to allow the CRTC to make orders respecting the provision of information by persons carrying on broadcasting undertakings, including information regarding audience measurement. Importantly, information that could identify any individual audience member is excluded (9.1(1)(o)(iv)).
- Aggregate statistics and anonymization: Claims of anonymization are generally controversial, except when it comes to aggregate statistics. Of all the types of information to which anonymization techniques may be applied, aggregate statistics are the one type for which a well-defined standard of anonymization is generally agreed to exist. The main issue to address is small count totals, which should be avoided to protect against the risk of re-identification (see the illustrative sketch following this list).
- Types of aggregate statistics: Examples of types of information in aggregate form include summary statistics such as counts, percentages, and averages.
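- Illustrative sketch: As a hypothetical illustration only (not drawn from Bill C-11, the CRTC or any prescribed standard), the short Python sketch below shows one way small count totals might be suppressed when producing aggregate audience statistics; the threshold value, record format and program names are assumptions made for the example.

```python
from collections import Counter

# Minimal illustrative sketch: suppressing small counts when producing
# aggregate audience statistics. The threshold of 5 and the record format
# are assumptions for this example, not values prescribed by Bill C-11.
SMALL_COUNT_THRESHOLD = 5

def aggregate_viewership(records, threshold=SMALL_COUNT_THRESHOLD):
    """Count viewers per program and withhold any total below the threshold,
    reducing the risk that small counts could be used to re-identify viewers."""
    counts = Counter(record["program"] for record in records)
    return {program: count for program, count in counts.items() if count >= threshold}

# Example: only programs with at least `threshold` viewers appear in the output.
sample = [{"program": "News at Six"}] * 12 + [{"program": "Niche Documentary"}] * 2
print(aggregate_viewership(sample))  # {'News at Six': 12}
```

In practice, the appropriate threshold and suppression method would depend on the dataset and on an assessment of re-identification risk.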
Prepared by: Technology Analysis Directorate
Privacy Impact Assessments
Key Messages
- To the extent that personal information is collected by the CRTC, the Privacy Act would apply. We would therefore expect the CRTC to handle any personal information in accordance with the Act’s requirements.
- Additionally, the CRTC would need to follow the TBS Directive on Privacy Impact Assessment and provide the OPC with PIAs for relevant programs and activities stemming from this legislation.
- My office should be consulted and receive PIAs early enough in the process so that we can give meaningful advice and course corrections can be made. This is currently not always the case.
- New technologies and tools such as algorithms have the potential to be privacy invasive; the necessity and risks of using such methods should be carefully assessed.
Background
- Value of PIAs: PIAs are valuable tools for helping institutions ensure that privacy is considered at the outset, before implementation of the programs and activities envisioned in this legislation. These checks and balances protect privacy and can reassure Canadians that their personal information rights are being respected.
- Legislative requirement for PIAs: We would like to see a requirement to complete PIAs included as a binding legal obligation in a modernized Privacy Act and were pleased to see this proposed by the Department of Justice in its recent consultation paper on Privacy Act reform.
- Expectations on legislative consultations: C-11 raises privacy concerns around the use of algorithms by online platforms to make content discoverable and/or to move users towards specific content based on their online viewing habits.
- Past CRTC GA work: Although we have had more recent consultations with the Commission about initiatives under development, we have not received a PIA from the CRTC since 2014.
- Linkages to PIA work and privacy-by-design: Including a provision in C-11 establishing that one of the objectives of the Act is to “contribute to the protection of the privacy of persons” may help ensure the CRTC focuses on taking privacy into account in the design of its processes and activities.
Prepared by: Government Advisory Directorate
Business Consultation and Advice
Key Messages
- To the extent that private sector organizations subject to PIPEDA will need to collect, use or disclose personal information as part of the bill’s implementation, our Business Advisory Directorate is able to provide advice to help address privacy risks proactively.
- Engaging with businesses early helps ensure that personal information practices comply with the law and that privacy risks are mitigated, and provides regulatory predictability for responsible innovation.
- Most organizations using our advisory services, irrespective of sector, have data- and technology-intensive initiatives or practices.
Background
- Engagement with small and medium-sized enterprises (SMEs): This BA work remains critical, given SMEs’ resource constraints and pivotal role in post-pandemic economic recovery.
- 57% of the newly initiated business advisory engagements over the last year involved SMEs.
- BA Activities: In 2021-22, we initiated 14 new advisory engagements and conducted 24 promotional activities, including exhibits, presentations, and stakeholder meetings.
- Privacy Clinics for SMEs: In 2021-2022, we hosted Privacy Clinics in collaboration with an Innovation Hub in Waterloo, Ontario to provide privacy advice to start-up and scale-up organizations with data practices having potentially significant impact on the privacy of Canadians. We continue to explore and host such Clinics across sectors and regions.
- Key Consultation:
- Our Business Advisory Directorate provided recommendations for privacy improvements in a number of significant areas, such as: accountability, purpose specification, consent, limiting collection and use of personal information, as well as safeguards.
- Business Advisory consultations are conducted on a confidential basis.
- Privacy and AVA: A digital services company approached BA for advice on its Anonymous Video Analytics (AVA) product. This engagement followed the OPC’s report of findings in the Cadillac Fairview Corporation Limited matter.
Prepared by: Business Advisory Directorate