The (in)finite life of personal information in a digital age


Address given to the Yukon Bench and Bar Seminar

Whitehorse, Yukon
September 10, 2015

Address by Patricia Kosseim
Senior General Counsel and Director General, Legal Services, Policy, Research and Technology Analysis Branch

(Check against delivery)


Introduction

Some of you will remember the American crime drama Donnie Brasco, based on the true story of an undercover cop, Joseph Pistone, who infiltrates a Mafioso family in New York during the 1970s and finds himself gradually taking on the persona of a criminal.

In a lighter scene, Donnie Brasco is asked by one of his fellow wiretapping federal agents what "fuhgeddaboudit" means.

Fuhgeddaboudit, he says, is when you agree with someone, like if someone says Raquel Welch is really somethin’, you say fuhgeddaboudit.

Sometimes it means you disagree, like "a Lincoln is better than a Cadillac?", fuhgeddaboudit!

Sometimes it means the greatest thing in the world, like "Minghia! Those peppers, fuhgeddaboudit!"

It’s also like telling someone where to go.

And sometimes, fuhgeddaboudit just means fuhgeddaboudit!

The question I would like to raise today is whether we can "just fuhgeddaboudit" at all anymore in a digital age where personal information posted on the Internet is there for everyone to see forever more.

It's a question which University of Oxford Professor Viktor Mayer-Schönberger asked more seriously six years ago in his book Delete: The Virtue of Forgetting in the Digital Age. In it he describes the profound human and societal implications of mass digitization and cheap online storage, and how these have shifted our behavioral default from forgetting most things to remembering everything.

This is a profound change indeed. Think about:

  • politicians who had to give up their political aspirations because of something they regrettably tweeted years prior to seeking office;
  • people who lost their jobs because of online videos showing them shouting obscenities while inebriated over a weekend;
  • professionals who lost the ability to practice in their field because of something they've done in the past which has come back to haunt them;
  • young adults who were jeered and taunted by the masses on social media because of an admittedly stupid post; and
  • most tragically, teenagers cyberbullied into depression and even suicide because of cruel comments, photos or videos that others posted about them and that just won’t go away.

Privacy, once protected by the sheer practical difficulty of finding or remembering things, no longer enjoys this veil of obscurity. How are we, as a society, grappling with this new phenomenon of a permanent and pervasive memory that is painfully unforgiving?

Historically, the law has provided remedies to make whole the victims of defamatory statements published by others. More recently, the criminal law has created new offences to address the non-consensual distribution of intimate images and cyberbullying.

But these legal remedies do not cover all situations — for instance, the unwise post, the embarrassing photo or the news story which, although factually true, no longer seems relevant.

Seeking redress is not always practicable when the information is widely dispersed across the Internet. Nor is it feasible when the individuals who post the information — or the websites that publish it — have done nothing legally wrong, or worse, cannot be found. As a result, attention is increasingly turning to intermediaries, such as search engines, given their pivotal role in directing Internet traffic to the material in question.

The Google Spain ruling

The case “that started the song that started the whole world singing”, so to speak, was the May 2014 ruling of Google Spain v AEPD and Mario Costeja González. In a seminal ruling against Google, the Court of Justice of the European Union required the search engine to honor a lawyer’s request to remove links to old newspaper articles about financial debts he had long since repaid. The source articles themselves would remain accessible through the newspaper’s website, but they would no longer be listed among the results of a Google search using the lawyer’s name.

In a sense, the ruling ascribed to search engines the responsibility of determining when privacy rights of an individual outweigh the public interest in accessing information, including where search results “appear to be inadequate, irrelevant or no longer relevant or excessive in relation to the purposes for which they were processed and in light of the time that had elapsed."

Data subjects whose delisting requests are refused may bring a complaint to the relevant data protection authorities. In an effort to guide data protection authorities through difficult decisions, the Article 29 Working Party, set up under the EU Data Protection Directive, issued guidelines on the implementation of the Google Spain decision.

The Article 29 Working Party sets out a formula for balancing privacy interests of an individual with the public interest in having access to information, including considerations such as:

  • whether the person is a natural person, plays a role in public life, or is a minor;
  • whether the data is accurate, relevant, up to date and not excessive;
  • whether the personal information is sensitive;
  • if the information is causing prejudice to the data subject or putting them at risk; and
  • the context in which the material was originally published — whether it was for journalistic, legal or other reasons, or related to a criminal offence.

The Working Party strongly encouraged search engines “to publish their own delisting criteria, and make more detailed statistics available”.

Google has since done so. As of August 30th, the company had received over 300,000 requests and evaluated over one million URLs. According to the Guardian newspaper, more than 95% of these requests have come from everyday members of the public, and less than 5% from public figures, politicians or criminals. Of the total, the company agreed to delist about 42% of the URLs requested.

Some have hailed the EU ruling as a victory for privacy, or at least an incremental one. Others are more critical. Harvard law professor, Jonathan Zittrain, has called the judgement “a bad solution to a very real problem”.

Jules Polonetsky of the US think tank, Future of Privacy Forum, says asking Google to stop linking to something illegal is one thing. “But for the Court to outsource to Google complicated case-specific decisions about whether to publish or suppress something is wrong. Requiring Google to be a court of philosopher kings shows a real lack of understanding about how this will play out in reality.”  

A live issue now is the scope of application of the Google Spain ruling outside the EU. In a recent showdown with the company, France’s Commission nationale de l'informatique et des libertés (CNIL) is holding firm to its position that the ruling requires search engines to remove URL links from all domains, not just those in the EU. If the information remains searchable under google.com or google.ca, it argues, this would circumvent the very purpose of delisting and effectively strip the ruling of any meaning.

Google’s position, confirmed by the view of its Advisory Council of independent experts, is that the ruling is limited to Europe and does not apply so as to block URLs from search results worldwide. It takes the position that one region should not export its values and principles to other sovereign states that may have different views of the appropriate balance.

Indeed, several commentators have strongly argued that the balance, as struck in the Google Spain ruling, would not measure up in the U.S., particularly under the First Amendment right of free speech. Writing for the New Yorker, Jeffrey Toobin describes “the anxiety felt keenly on both sides of the Atlantic. In Europe, the right to privacy trumps the freedom of speech; [whereas] the reverse is true in the United States.”

Application in the Canadian Context

What about here in Canada?

While we do not have case law directly on point yet, a couple of recent decisions from British Columbia are instructive on how courts might view the responsibility of search engines to remove links to offending information and how far they will go in ordering them to do so, depending on the circumstances.

Niemela v. Malamas stemmed from an underlying action in defamation brought by a lawyer against one of his former clients. The lawyer applied for an injunction against Google (a non-party to the action), seeking to enjoin the search engine from providing links to the defamatory comments in its worldwide search results — both URLs and accompanying ‘snippets’.

Google had voluntarily de-indexed URLs pointing to the defamatory materials from its google.ca website but refused the lawyer’s request to go further.  

The plaintiff was a Vancouver lawyer, the vast majority of his clients were Canadian, and more than 90% of the searches of his name were shown to have originated from Canadian IP addresses.

Justice Fenlon of the British Columbia Supreme Court found that the voluntary efforts made by Google up to that point were proving effective in preventing people in Canada from linking to the offending websites. There was insufficient evidence of the harm that would come to the plaintiff if a worldwide injunctive order were not granted.

Interestingly, another factor the judge considered was the reality that Google could not likely comply with a worldwide order anyway, given its statutory obligations in the U.S. not to block search results that would offend the First Amendment right of free speech.

The plaintiff-lawyer also brought a separate action against Google itself for its role in displaying links and snippets that pointed users to webpages containing the defamatory content. This separate action was based on several claims which Google sought to have summarily dismissed.

On the claim of breach of privacy under the BC Privacy Act, the court held that in the specific circumstances of this case, the plaintiff could have no reasonable expectation of privacy in the manner in which he performs his professional work.  As per Justice Fenlon, “Mr. Niemela’s complaint is not that the articles describe an area of his life that he expects to be kept private and sheltered from public view, but rather that they are unflattering and false descriptions of his work performance.”

In rejecting the plaintiff’s claim of defamation against Google for publishing the snippets in its search results, the Court, inspired by the Supreme Court’s reasoning in Crookes v. Newton and a line of recent English jurisprudence, found that Google was “a passive instrument and not a publisher of the snippets”.

It is interesting to contrast this decision with an earlier decision of Justice Fenlon upheld by the Court of Appeal in Equustek Solutions Inc. v. Jack. This case involved a similar application for an injunctive order against Google to cease referencing URLs associated with certain websites in its search results worldwide.

The underlying proceedings in this case were for trademark infringement and misappropriation of trade secrets, against a defendant who had since gone AWOL. Refusing to abide by several court orders, the defendant ceased operating in Vancouver, but then went underground to continue marketing and selling the counterfeit products through a number of “clandestine” websites.

Google had volunteered to take down links to specific web pages from its google.ca search results; however, each time it did, the defendant would simply move the objectionable content to a new webpage, in a game the plaintiffs referred to as "Whack-a-Mole".

The plaintiffs requested Google to block the defendant’s entire websites from its search results worldwide, but Google refused to go that far.

Accepting that “an order with international scope should not be made lightly”, and should not be granted when an order with domestic consequences will suffice to avert harm, the Court of Appeal upheld Fenlon J.’s finding that, in this case, an order limited to search results on google.ca would not be effective given the defendant’s work-arounds. The Court upheld the worldwide order against Google.

It is interesting to note the earlier backdrop leading up to this. After careful consideration of in personam jurisdiction and principles of comity, which ultimately led Justice Fenlon to assert territorial competence over the matter, the Court of Appeal agreed that there was “no realistic assertion that the judge’s order (would) offend the sensibilities of any other nation.”

As for the possibility that the defendants might wish to use their websites for legitimate expression other than unlawful marketing, the Court of Appeal found this to be “entirely speculative”. If such were the case, however, it remained open for the defendant to seek to vary the order based on free speech.

How a court would decide breach of privacy claims against Internet intermediaries in different circumstances remains to be seen.  However, what these two cases show us so far, is that a court will be sensitive to the nature of the offending material and how closely it bumps up against freedom of speech — both here and elsewhere. These cases also demonstrate that courts, while cognizant of the global nature of the Internet, will require concrete evidence of the insufficiency of a domestic order before it will order search engines to go further.

Analysis

In public commentary, Google has been likened to the world's largest library. Some have argued that no reasonable person would expect librarians to censor and destroy books in a free and democratic society, and so far no one is vigorously arguing that the original material be removed from the Internet altogether.

But would a reasonable person expect the librarian to amass and catalogue everything we've ever written and everything others have ever written about us — including all non-published works, never vetted by publishers’ lawyers or peer reviewers? Would they expect it to be neatly organized on a shelf, or an entire rack, labelled by individual name and accessible to everyone in the world, 24/7?

While Google has attracted the most attention as the world's most powerful search engine, responsible for nearly 75% of all Internet searches, it is not the only actor involved in this debate about the right to be forgotten.

Several website operators have also taken on the self-proclaimed role of librarian, sometimes with questionable motives. Globe24h is an interesting case in point.

Globe24h is a website operated out of Romania that has taken it upon itself to republish court and tribunal decisions — including those from Canada. Unlike legal websites such as CanLII, which use the web’s robots exclusion standard to limit indexing of decisions by name and thereby minimize the privacy impact on individuals, the Globe24h site is indexed and searchable by individual name.
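The robots exclusion standard mentioned above is simply a plain-text file served at a site's root. As a minimal sketch (the directory path shown is hypothetical), a robots.txt asking search engines not to index full-text decision pages might read:

```text
# robots.txt served at the site root, e.g. https://example.org/robots.txt
# Ask all crawlers not to index the directory holding full-text decisions
User-agent: *
Disallow: /decisions/
```

Compliance is voluntary: well-behaved crawlers honour these directives, but a site republishing the same decisions elsewhere is under no technical obligation to do the same.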

While the website’s claim is “to make law accessible for free on the internet”, our Office found, after an investigation this past year, that the website was generating revenue through paid advertisements and charging a fee to individuals who wish to have their personal information removed.

Ultimately, we found that Globe24h’s purposes of making available Canadian court and tribunal decisions by individual name and charging individuals to have the information taken down were not ones that a reasonable person would consider to be appropriate in the circumstances. While the information was publicly available at its source from Canadian legal directories, it had not been made publicly available for the purposes for which Globe24h was using it and the website had not obtained consent from individuals to use it for these different purposes.   

We therefore recommended that Globe24h delete from its servers Canadian court and tribunal decisions that contain personal information made searchable by name. Unfortunately, the website refused to implement this recommendation and one of the individuals who brought the complaint to our Office has since initiated an Application in Federal Court against the website.

As we consider next steps in this matter, including potential litigation options, we have undertaken other practical measures in parallel effort to minimize potential harm to individuals. We have reached out to major search engines requesting that they voluntarily remove links to the Globe24h website, or otherwise reduce the company’s prominence in search results, and we are beginning to see some success in this regard.

Potential Solutions

This example is a good segue into a broader discussion of potential solutions being considered in the current right to be forgotten debate.

Legislative Solutions

Among possible solutions are legislative initiatives. Right to be Forgotten laws have emerged recently in several countries, including: Spain, Germany, Iceland, Norway, Liechtenstein, Switzerland, and most recently Russia, whose law will take effect at the beginning of 2016.

Under California’s new “Minors in the Digital World” law — also known as the “Erasure Law” — minors can now have removed any personal information they posted about themselves online.

Canada has introduced anti-cyberbullying legislation in Nova Scotia and federally which makes it an offence to post intimate images online without consent.

Practical solutions

Practical solutions have also been developed to respond to requests for removal of information from the Internet.

Several search engines and social media sites have voluntary mechanisms in place to delist or take down illegal content such as copyright infringements, defamatory information, stolen financial data and revenge porn.

How these mechanisms operate in practice, and how far they should go, raises some challenges.

For example, an investigation we concluded in 2013 involved a teen victim of online impersonation. On confirming that the imposter Facebook account was indeed fake, Facebook deleted it. However, despite her mother’s plea, Facebook would not go as far as notifying all of the imposter’s friends of the deception. The account had by then been deleted, and Facebook took the general view that notification might have the unintended effect of re-stigmatizing the teen, and that it would not be appropriate, practical or beneficial for Facebook to intervene in personal relations between individuals.

Nonetheless, after consultation with our Office, the company did undertake, in appropriate situations, to facilitate a process whereby non-Facebook users — like the teen in this case — could send out such a notice on their own behalf.

As another example, Twitter, like others, allows users to delete their own tweets. Until recently, a third-party network of sites called Politwoops, operated by the Open State Foundation in more than 30 countries, had an arrangement with Twitter which allowed it to access Twitter’s application programming interface (API) so as to resurrect deleted tweets from politicians.

In late August, Twitter announced that it was calling off this arrangement with Politwoops Canada (as it had done for the U.S. version of Politwoops earlier in May). In other words, Twitter will henceforth treat everyone the same.

This move has sparked media debate about, on the one hand, respecting the right of individuals to speak freely without fear that everything they say will be immutable and irrevocable and, on the other hand, conserving what politicians say as a matter of public record for which they should be held publicly accountable and to which the public should have right of access.

Technological Solutions

Add to the legal and practical approaches the technological solutions that are also contributing to a way forward.

Take for example the Web robot exclusion protocols I mentioned earlier that have been adopted by courts and administrative tribunals, and legal websites like CanLII.

Blurring technologies have been developed and refined over time to allow persons captured by Google Streetview cars, for example, to request that their picture be rendered non-identifiable in the company’s street-level images and maps. UK media have also begun blurring children’s faces.

Viktor Mayer-Schönberger, among others, has mused about the concept of information ecology, where best-before dates, upon reaching expiry, would automatically delete content from the Internet.
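That concept can be sketched in a few lines of code. The following is a minimal illustration of the idea, not any proposed implementation — all names here are hypothetical. Each stored item carries a best-before date, and anything past its date is treated as forgotten:

```python
from datetime import datetime, timedelta

class ExpiringStore:
    """Hypothetical store where every item has a 'best before' date."""

    def __init__(self):
        self._items = {}  # item id -> (content, expiry timestamp)

    def post(self, item_id, content, ttl_days):
        """Store content with an expiry date set ttl_days from now."""
        expiry = datetime.utcnow() + timedelta(days=ttl_days)
        self._items[item_id] = (content, expiry)

    def get(self, item_id, now=None):
        """Return content only if its best-before date has not passed."""
        now = now or datetime.utcnow()
        record = self._items.get(item_id)
        if record is None or now >= record[1]:
            self._items.pop(item_id, None)  # forget expired content
            return None
        return record[0]

store = ExpiringStore()
store.post("tweet-1", "an unwise post", ttl_days=30)
print(store.get("tweet-1"))  # prints: an unwise post
# Simulate a lookup 31 days later: the content has expired and is forgotten
print(store.get("tweet-1", now=datetime.utcnow() + timedelta(days=31)))  # prints: None
```

The design choice the proposal turns on is visible even in this toy: forgetting is the default behaviour of the system, and remembering requires an explicit, renewable decision.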

The Office of the Privacy Commissioner of Canada has identified reputation and privacy as one of its strategic priorities over the next few years. To help advance the debate, we will publish a discussion paper later this year as a launch pad for a call for proposals on potential solutions — including innovative technological ones. Informed by these, we hope to develop a policy position on the Right to Be Forgotten in the Canadian context.

Conclusion

To conclude, I leave you with two poignant stories that starkly illustrate the need for a path forward.   

Jeffrey Toobin, in a New Yorker piece aptly called “The Solace of Oblivion”, chronicled the sad case of a family who saw gruesome images of their deceased daughter, taken following a car crash, circulate widely across the Internet. The photos were posted by employees of the California Highway Patrol “for pure shock value” on Halloween. The parents tried in vain to have the photos removed.

Google would not delist links to the offensive photographs and the family had no legal means available in the U.S. to force Google to do so. The family tried negotiating with the Highway Patrol to transfer ownership of the photos so they could assert a legal claim of copyright over the images and have them delisted by Google on that basis, but this too was unsuccessful.

The best the family could do was secure the paid services of a company called Reputation.com. Short of removing the links entirely, which it cannot do, the company manipulates Google searches so as to push down the undesirable search results as far as possible, providing at least some partial relief for those who can afford it.

In a recent TED talk, Monica Lewinsky emerged from decades of self-imposed obscurity to describe her experience as Patient Zero of the digital age. News of her affair with the President of the United States, along with all the personal ridicule and salacious commentary, spread pervasively and internationally, like wildfire, primarily via email at the time.

When one considers that case study against the criteria set out by the European Court of Justice or the Article 29 Working Party Guidelines, any take down request she could have made at the time would certainly, and universally, have been refused on all counts. This was after all a story which nearly took down the President of the most powerful country in the world!

But this legal conclusion, while perfectly rational, feels cold and unsatisfactory when one hears Monica Lewinsky describe the painful and permanent impact she suffered over a mistake she made as a young intern who fell in love with her boss.

Is there not, she asks, a role for societal compassion or some convention of ethical behavior on the Internet?

She is not alone. President Obama has called for a Consumer Privacy Bill of Rights for the Digital Age. Recently, the UN’s first Special Rapporteur on privacy, Joseph Cannataci, has publicly called for a Geneva-style Convention for safeguarding data on the Internet.

Last year, at the Canadian Bar Association Annual Conference, I participated on a panel that was asked to comment on whether the time has come for a digital bill of rights, as proposed by Tim Berners-Lee, the inventor of the World Wide Web.

Perhaps the stars are aligning. Perhaps the time has come for a universal societal pact about the kind of online world we and our children — both present and future — want to live in.

As members of the Judiciary and the Bar, your role in helping design and develop such a pact will be instrumental.

Thank you.
