Don’t Sweep Privacy under the Internet of Things Rug

Address given to the Connected+ Conference 2016

Toronto, Ontario
October 12, 2016

Address by Luk Arbuckle
Director of Technology Analysis

(Check against delivery)

I’m very excited to have recently joined the Office of the Privacy Commissioner of Canada. I’ve worked as a data scientist for over a decade, in academia and industry. I eventually specialized in de-identification methods and re-identification risk measurement tools, leading a team of expert analysts.

Now let me explain a bit about our office, in case you’re not familiar with our work. The Office of the Privacy Commissioner of Canada oversees the laws that govern privacy, both for federal institutions and for commercial organizations that collect, use and disclose personal information in Canada (with some exceptions). Commercial activities that involve personal information generally fall under the Personal Information Protection and Electronic Documents Act (PIPEDA). We have a guide that goes over what businesses and organizations need to know, which I’ll briefly mention later.

Privacy is a close cousin to security, but may be overlooked when designing IoT devices. My focus here will be to talk about privacy, but we can’t escape the subject of security altogether. Our office has done a variety of work in this area, from background research to guidelines that touch on some of the privacy-related issues relevant to IoT. We are currently engaging stakeholders in a consultation process on the subject of consent that is also relevant to IoT. In the future you will see more from us on this topic!

So that’s what this talk is about, all that work and how it relates to IoT. We’ll look at privacy issues specific to IoT, possible solutions to design security and privacy into devices, where it fits with respect to current legislation in Canada, and some results from a recent privacy sweep of IoT devices that our office was involved with. Our 2016 privacy sweep found that privacy communications did not adequately explain how personal information was collected, used, and disclosed, how it was safeguarded, or how an individual could delete their data. We’ll discuss this in detail later in this presentation.

Now being part of a highly-skilled team of analysts with backgrounds in IT security, privacy, data analytics, and research, I just have to tell you a bit more about our particular work in this area. Generally speaking our team assists in investigations when there’s a technology component involved, which can be quite often given the digital world in which we live. We also do research in our tech lab, and so I’ll talk a bit about some of the things we’re working on as it relates to our most recent privacy sweep of IoT devices.

That will make up the first half of this presentation. In the second half we’ll focus on the workshop piece, by going over the guidance we provide on Good Privacy Practices for Developing Mobile Apps, along with a walkthrough of our guidance on Identification and Authentication. We won’t go over device-to-device authentication, as that’s mostly a security issue. Instead we’ll discuss identification and authentication of individuals, which uses personal information.

Massive growth in the IoT, but who is responsible?

Last year Gartner estimated that over 5 million new things would get connected every day in 2016, and forecast that over 20 billion connected things will be in use by 2020. I’m sure this growth is a major reason you’re here today, hoping to take advantage of this growing market. It’s a huge opportunity for producers and developers of IoT devices and services, there’s no question about that.

The growth in IoT has a lot to do with cheaper hardware that is both smaller and more energy efficient than the previous generation of products. But it’s also because of the potential interconnectedness of these devices, the opportunity to deploy many devices across many areas of our lives that find a way to communicate and work together to deliver new and exciting services, some of which I’m sure we haven’t even thought of yet. I’m sure you’ll see a lot of this potential here at this conference. Food tagged with RFID tags so our smart fridge can tell us what’s missing when we’re at the grocery store. Or a toilet that tells us, well, I don’t even want to imagine what my toilet is going to tell me. But hey, the potential is there to revolutionize the bathroom, among many other parts of our lives!

But with all this growth and interconnectedness, companies will be collecting, using, and disclosing masses of information about us, in real time. Some government agencies have already indicated a strong interest in using the IoT for surveillance purposes. Hackers may also take an interest, possibly to steal personal information or to hold devices or data under lock and key until a ransom is paid.

As Spider-Man said, with great power comes great responsibility. You don’t want your device to be the one that is targeted and breached by malicious hackers. That’s bad for business. And not only yours, but for the industry as a whole. The success of IoT will rest in part on your ability to secure it and gain the trust of users so that they know their privacy will be protected.

Security and Privacy

By now I hope most of you have already heard of the attack on the website of journalist and cybersecurity expert Brian Krebs. Poorly secured IoT devices were taken over so they would take part in a massive attack, directing all their traffic at Krebs’s website, resulting in the largest flow of data ever seen in such an event. The ease with which this was done is the scary part. Krebs called it “the democratization of censorship”, because it has become so easy to get the tools necessary to attack and shut down a website. The attack on Krebs was carried out with 68 terrible passwords, and the source code used to launch it was publicly released. It wasn’t even the passwords of the IoT devices themselves, but of the product firmware used to build those devices. Unfortunately it doesn’t matter that the underlying product was insecure to begin with, because the companies building IoT devices with these products have a due diligence requirement to ensure they are using secure products.

For our purposes it’s worth noting that this security threat also poses a threat to privacy, be it through surveillance efforts or through unsecured IoT devices leaking personal information. That was the case with a website that was providing a livestream of webcams from around the world. The webcams were not secured, and included footage from within offices and homes. Our Office, in partnership with many other data protection authorities, issued a letter to the operators of the website. The website was then shut down, and when it came back online at a later date certain screening criteria had been put in place to reduce the risk of including home or office webcams. Manufacturers were also sent a letter urging them to include appropriate security measures in their devices by default.

Of course we can’t talk about IoT and privacy without mentioning one of the biggest recent media stories on the subject. This is the story of the infamous We-Vibe, which collected user data that one person deemed far too personal to be shared with any organization. She alleges her usage data was collected along with her email address, which would make it identifiable personal information. As a result she decided to launch a class action lawsuit in the US.

Now our office has not been involved in this case, and as far as I know we haven’t received any complaints about it. I honestly don’t know how the service operates in Canada, so I can’t comment on specifics. But the story does align with the trends found in our most recent privacy sweep on IoT health-related devices, which we’ll discuss later in this presentation. So although security and privacy are close cousins, they’re also different.

An Introduction to Privacy Issues in IoT

Personal Information

In many cases the data gathered by an individual IoT device may not seem to be personal information when taken in isolation and out of context. The challenge is that the data may be combined with the data from other IoT devices and then reveal personal information. We can imagine many circumstances in which we’ll want to combine data from multiple IoT devices, to get a fuller picture. It could be the smart thermostats in your house, combined with the proximity sensors, smart windows that adjust their tint based on sun conditions and inside temperature, and so on and so forth! This is in fact one of the main features many tout when it comes to IoT, the interconnectedness and opportunity to learn and function as a whole.

But if the intent is to combine the data from several IoT devices, will providers need to consider whether the data is deemed personal information when it’s considered all together? Well, in fact, the international data protection authorities adopted the position that data from connected devices is “high in quantity, quality and sensitivity” and therefore “should be regarded and treated as personal data.” We’re not talking about machine-to-machine (M2M) devices, the kinds of devices used in business applications. But if those M2M devices are connecting to IoT devices, by which I mean devices that capture data about people and their things, then the data is likely to be considered personal information.

Transparency and Consent

The subject of consent itself is a difficult one, even with a single device that may or may not have an appropriate user interface. As a rule of thumb, don’t assume that burying the details of what is being collected and how it will be used in a long license or user agreement is acceptable. It’s pretty much understood that most people aren’t reading these. To obtain meaningful consent, it will be increasingly necessary to adopt more flexible strategies to inform users of what data is being collected and why, at the time it is being collected.

Just keep in mind that any collection, use or disclosure of personal information must only be for purposes that a reasonable person would consider appropriate in the circumstances. No one should be surprised by the collection, use, or disclosure of personal information. It’s a fundamental concept of privacy that we have a right to decide when and how our personal information is used (noted by the Supreme Court of Canada).

The time to ask if a device or service can collect location data is when the data itself is needed, so that the user can make an informed choice at that point in time. You’ll see mention of this when we look at the Privacy Sweep. The decision to consent may carry forward, provided the purpose of collection hasn’t changed, but users also need an opportunity to withdraw consent for that data type and purpose. This will be difficult to do with an IoT device that doesn’t have an interface. In that case the device itself should be fit for purpose, and users shouldn’t be surprised about what data is being collected and why. The labelling should clearly indicate the features and data being collected to support those features. If you buy a webcam it should be a webcam, and be streaming video only when you expect it to and for the purposes you want. If it collects geo-location data, that had better be a feature that is clearly described so that users know what it’s doing and why.
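To make this concrete, here is a minimal sketch of what just-in-time consent could look like in code. It is purely illustrative: the ConsentStore class and the prompt_for_consent and start_navigation helpers are hypothetical names invented for this example, not part of any real framework or of our guidance.

    from datetime import datetime

    class ConsentStore:
        """Hypothetical record of what the user agreed to, for which purpose, and when."""

        def __init__(self):
            self.records = {}  # (data_type, purpose) -> time consent was given

        def grant(self, data_type, purpose):
            self.records[(data_type, purpose)] = datetime.utcnow()

        def withdraw(self, data_type, purpose):
            self.records.pop((data_type, purpose), None)

        def has_consent(self, data_type, purpose):
            return (data_type, purpose) in self.records

    def prompt_for_consent(data_type, purpose):
        # Stand-in for a real interface prompt shown at the moment of use,
        # explaining what is collected and why.
        answer = input(f"Allow collection of {data_type} data to {purpose}? [y/n] ")
        return answer.strip().lower() == "y"

    def start_navigation(consent):
        # Ask only when the location-based feature is actually invoked.
        if not consent.has_consent("location", "provide turn-by-turn directions"):
            if not prompt_for_consent("location", "provide turn-by-turn directions"):
                return "Feature unavailable without location data."
        consent.grant("location", "provide turn-by-turn directions")
        return "Navigation started; location is collected only for this feature."

The point of the sketch is the timing: the prompt appears when the feature is first used, and withdraw() gives users a way to change their mind for that specific data type and purpose.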

There may even be ways to use machine-based rules or machine-learning approaches that allow for proxy decision-making, but these are forward-looking ideas that we haven’t fully investigated yet, either at the OPC or as a society. It’s likely such a machine-driven consent mechanism would itself require consent! We are in the process of discussing the challenges of consent with stakeholders. We received 51 written submissions from academics, industry, and the public and are now holding consultations across the country to further the conversation. Although this is an ongoing effort, you can expect us to update Canadians on the outcomes of this work once this phase is done.

Securing the IoT

Our privacy laws require that security safeguards be proportionate to the sensitivity of the personal information. How will developers do this when they don’t know what other devices may be connected to theirs? Data that doesn’t seem sensitive now may become sensitive when it’s combined with other IoT device data, and the last thing you want is to be the weak link in the IoT chain.

This does raise interesting questions around responsibility when there is a breach incident. The possibility of having many devices interacting and working together complicates issues of accountability. The more moving parts you have in a system, the greater the risk of unforeseen events leading to cascading failures. If one device’s developers didn’t consider the risks of collecting sensitive personal information, and its safeguards were therefore less robust than those of other devices, it could end up being the linchpin in a successful cyberattack.

There are so many examples to choose from. Just imagine the connected car that takes data from other IoT devices, like street lights, other cars, and pedestrians. We have already seen examples of hackers taking control of cars, such as the Jeep and the Tesla. And we are still in the early days of these technologies. Software has vulnerabilities, and connection to the Internet means access to outsiders. The IoT could be a hacker’s dream come true.

Online Trust Alliance (OTA) Trust Framework

Let’s now consider specific guidance. The Online Trust Alliance is a charitable organization in the US with the mission to “enhance online trust” by “advancing best practices and tools to enhance the protection of users’ security, privacy and identity.” In the January 2016 issue of the Canadian Privacy Law Review (Vol. 13, No. 2, pp. 14-16) it was suggested that “the Trust Framework is at least conceptually aligned with the applicable Canadian legal principles, since both derive from the same underlying” fair information principles. The Trust Framework has 31 principles covering security; user access and credentials; and privacy, disclosure, transparency, and regulatory requirements.

But don’t take that as an endorsement or legal advice on how the Trust Framework maps to PIPEDA requirements. It’s just a starting point for practical guidance to ensure IoT devices meet basic security and privacy principles that are more or less accepted globally.

The OTA found that the most recently reported IoT vulnerabilities were easily avoidable, and this was before the record-setting DDoS attack on Brian Krebs’s website! In a nutshell, what they found was:

  • Insecure credential management
  • Not properly explaining how personal information is collected, used, and disclosed
  • A lack of rigorous security testing
  • Poor reporting of observed vulnerabilities
  • Insecure or no network pairing control options
  • Not testing for common code injection exploits
  • A lack of transport security and encrypted storage
  • No clear way to address vulnerabilities during the lifetime of the device

This last point is an interesting one, probably forgotten or ignored by many. It’s hard to think far into the future, as we tend to discount those problems that are a long time in coming. But the reality is that many IoT devices may be used for several years, especially big-ticket items like home appliances. How do you ensure that users have a device that is future-proofed, especially when we know encryption technologies deployed today have a shelf life? One option is to build a “kill switch” into those devices, so that the IoT functionality can be turned off while the main functions of the device continue to operate. So a fridge still keeps food cold, and a washing machine still washes clothes, even though the “smart” functionality is turned off.
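One way to picture that kill switch is as a simple software flag that cuts off the connected features while leaving the core appliance functions untouched. The sketch below is purely conceptual; the class and method names are invented for illustration and don’t reflect any particular product.

    class SmartFridge:
        """Conceptual appliance whose connected features can be disabled outright."""

        def __init__(self):
            self.iot_enabled = True

        def kill_switch(self):
            # Disable all connected ("smart") functionality in one step.
            self.iot_enabled = False

        def keep_food_cold(self):
            # Core function works regardless of connectivity.
            return "Compressor running; food stays cold."

        def report_inventory_to_cloud(self):
            # Connected feature refuses to transmit once the kill switch is thrown.
            if not self.iot_enabled:
                return "IoT functionality disabled; nothing is transmitted."
            return "Inventory uploaded to the companion service."

    fridge = SmartFridge()
    fridge.kill_switch()
    print(fridge.keep_food_cold())             # still works
    print(fridge.report_inventory_to_cloud())  # no longer transmits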

A Guide for Businesses and Organizations

The fair information principles outlined in Schedule 1 of PIPEDA provide the ground rules for how an organization can work with personal information, and the control given to the people whose personal information is being handled by the private sector. We won’t go through the list of 10 principles, except to say that they exist and you can find more detail about them in our guidance. We’ve already touched on many of the principles throughout this presentation. A more practical way to understand these principles, in the context of IoT, may be to go through the guidance we provide on Good Privacy Practices for Developing Mobile Apps. So that’s what we’ll do in the second half of this workshop, along with a walkthrough of our guidance on Identification and Authentication.

Global Privacy Enforcement Network (GPEN) Sweep

Earlier, I mentioned our recent privacy sweep of IoT devices. This was part of the Global Privacy Enforcement Network’s annual privacy sweep, in which 25 data protection authorities from around the world looked at 314 devices. The OPC focused on health devices, like fitness trackers and heart rate monitors, while the other partners looked at everything from connected toys to smart TVs and cars.

Authorities could use a variety of methods during the sweep to assess privacy. They could look at the documentation that came in the box with the device, or the documentation found online for that company, to see what the documentation said about privacy. They could interact with the device to see how well that documentation matched their experience of using the product, or contact the company directly with follow-up privacy questions. The OPC used a combination of all of these.

It’s important to know, however, that the Sweep was not an investigation looking to identify compliance issues or violations. The goals of the Sweep are to

  • increase public and business awareness of privacy rights, responsibilities and best practices;
  • encourage compliance with privacy legislation;
  • and enhance cooperation among privacy enforcement authorities.

The results presented here are just a summary of the global results, and more specific details about the OPC’s findings on health devices can be found on our blog. As you’ll see, there was a lot of interest in how companies communicate what they are collecting and why.

Indicator 1: Do privacy communications adequately explain how personal information is collected, used and disclosed? (41% yes, 59% no)

Users typically want to know why certain information is being collected about them. The best way to inform them may be at the moment the data, such as location data, is actually needed, along with an explanation of why it’s needed. In the moment, while using the product for the first time, this information gathering will make more sense. Otherwise, why collect the data in the first place? Users have a right to know, and a right to make a meaningful decision about whether or not to consent.

Indicator 2: Are users fully informed about how personal information collected by the device is stored and safeguarded? (32% yes, 68% no)

With every breach and attack on data there is heightened awareness about security. You might as well be upfront and provide users with this information. It may be in the box or on a website, but make it relatively easy for them to find and understand. That being said, you want to avoid the mistakes made in the Ashley Madison case, claiming to have security credentials that did not exist!

Indicator 3: Do privacy communications include contact details for individuals wanting to contact the company about a privacy-related matter? (62% yes, 38% no)

People are sensitive to the issue of privacy, and it’s therefore best to give them access to someone trained and well informed to discuss their concerns. We actually recommend people contact a company when something privacy related doesn’t feel right. Most companies are sensitive to consumer concerns and are glad to provide the information that will put a person’s mind at ease.

Indicator 4: Do privacy communications explain how a user can delete their information? (28% yes, 72% no)

It’s quite frustrating to have to dig and search for a way to withdraw consent or delete personal information. Users shouldn’t have to contact anyone to have their information deleted. The technology is there to make this easy and seamless: a click of a button and a warning message to confirm their choice. It’s their right, and your obligation to provide it. The Sweep actually found one service that still had not deleted an account or its data two months after the request, even after customer service had been contacted to confirm the deletion. Your goal should be to build trust and credibility with customers.
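To illustrate how simple this can be, here is a rough sketch of a self-serve deletion flow: one action, one confirmation, and the data is gone. The function name and the dict-based datastore are hypothetical, chosen only to keep the example short.

    def delete_account(user_id, datastore, confirmed=False):
        """Hypothetical self-serve deletion: one click plus one confirmation."""
        if not confirmed:
            # A real interface would show a warning dialog here and ask the user to confirm.
            return "Please confirm: this will permanently delete your account and data."
        datastore.pop(user_id, None)  # remove the profile and any usage data
        return "Your account and associated data have been deleted."

    # Usage: the first call returns the warning, the second performs the deletion.
    store = {"user-42": {"email": "someone@example.com", "readings": [61, 72, 65]}}
    print(delete_account("user-42", store))
    print(delete_account("user-42", store, confirmed=True))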

Indicator 5: Did the company provide a timely, adequate and clear response to follow up questions? (57% yes, 43% no)

Here we found that answers to questions about privacy were generally timely, clear, and forthright. That being said, there were some cases in which the responses avoided the issues raised, or simply redirected us back to their privacy policy. Not an ideal way to build trust, and it would certainly be better to have trained staff with the knowledge and information needed to answer questions to the satisfaction of users.

Network Analysis

The privacy sweep involved briefly interacting with IoT devices, to get a feel for the consumer experience with an eye towards the privacy documentation provided. But we can and most likely will do more, and that’s where my team comes into play. Over recent months we’ve been building up our lab with the tools needed to conduct more in-depth network analysis of devices and services. We plan to analyze a select number of IoT devices, to look under the hood so we can better understand how they work, what information they’re sharing, and with whom they’re sharing it. We want to know what happens to our personal information when we use these devices.
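To give a sense of what looking under the hood can involve, here is a minimal sketch of the kind of traffic summary such an analysis might start with. It assumes you already have a packet capture from the device’s network segment (the file name capture.pcap is hypothetical) and the scapy library installed; it isn’t a description of our lab’s actual tooling.

    from collections import Counter
    from scapy.all import rdpcap, IP, DNSQR

    packets = rdpcap("capture.pcap")  # hypothetical capture of the device's traffic

    destinations = Counter()  # which hosts the device talks to, and how often
    dns_queries = Counter()   # which domain names it looks up

    for pkt in packets:
        if pkt.haslayer(IP):
            destinations[pkt[IP].dst] += 1
        if pkt.haslayer(DNSQR):
            dns_queries[pkt[DNSQR].qname.decode(errors="replace")] += 1

    print("Top destinations:", destinations.most_common(10))
    print("Top DNS lookups:", dns_queries.most_common(10))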

From this work we eventually hope to develop guidance, especially geared towards small and medium enterprises and app developers, so that they have the tools necessary to build privacy protections into their products and services. We also hope to contribute to the development of global technology standards in this space. We may contribute directly to standards bodies in drafting guidance, or our papers and sweeps may inform them of the concerns regulators have, which may therefore need to be addressed.

We also want to educate and inform Canadians about the relevant privacy risks so that they can make informed decisions. This will involve different types of publications, in the form of new blog posts and research papers that raise awareness and understanding of how our personal information is making its way into every facet of our lives.

Connected+ Workshop

In this part of the workshop we walked through some of the material in our guidance documents, with a discussion of the key points and some relevant cases.

Key Privacy Considerations for Developing Mobile Apps

  • You are accountable for your conduct and your code.
  • Be open and transparent about your privacy practices.
  • Collect and keep only what your app needs to function, and secure it.
  • Obtain meaningful consent despite the small-screen challenge.
  • Timing of user notice and consent is critical.

Guidelines for Identification and Authentication

  • Only identify when necessary.
  • Determine what identity attributes are necessary to authorize a transaction.
  • Inform individuals and obtain the appropriate form of consent before identification.
  • Only authenticate when necessary.
  • Ensure the level of authentication is commensurate with risks (see the sketch after this list).
  • Protect personal information.
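The sketch below illustrates the principle that the level of authentication should match the risk of the transaction: low-risk actions proceed on an existing session, while higher-risk ones trigger a step-up check. The action names, risk levels, and session fields are hypothetical and only meant to show the shape of the idea.

    # Hypothetical transaction types and the risk level assigned to each.
    RISK_LEVELS = {
        "view_dashboard": "low",    # an existing session is enough
        "change_email": "medium",   # ask the user to re-enter their password
        "delete_account": "high",   # require a second factor
    }

    def authenticate_for(action, session):
        """Require stronger authentication as the risk of the transaction increases."""
        risk = RISK_LEVELS.get(action, "high")  # treat unknown actions as high risk
        if not session.get("logged_in", False):
            return False
        if risk == "low":
            return True
        if risk == "medium":
            return session.get("password_reentered", False)
        return session.get("second_factor_ok", False)  # high risk

    session = {"logged_in": True, "password_reentered": False, "second_factor_ok": False}
    print(authenticate_for("view_dashboard", session))  # True
    print(authenticate_for("delete_account", session))  # False until a second factor is provided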