As the Internet of Things Grows, Is Privacy Possible?

The Internet of Things will instrument our world and, in doing so, simplify many aspects of our lives. Yet the flip side of gathering so much data about our physical world is the risk that events and data points can be correlated in ways that harm consumer privacy.

The Internet of Things will give us extraordinary insights into the mechanics of our world, our environment, our infrastructures and even ourselves. But companies that ignore or mishandle the following three privacy issues are undermining the future of the IoT.

Challenge #1: The Balance Between Features and Privacy

At the core of the Internet of Things is data. The more data a given IoT system obtains from its end users, the more value it produces for application development, improved customer experiences and increased operational efficiency. But the secret to the IoT’s success will be striking the right balance between the near-limitless personalized data that connected devices generate (between end users and their personal products, and between multiple, interconnected devices) and the preservation of privacy.

Remember that just a few pieces of data from different sources can digitally “fingerprint” individual consumers. For example, a 2013 study of mobile phone users showed that, based on hourly records of which antenna each phone connected to, just four spatio-temporal data points were enough to uniquely identify 95 percent of the individuals in the dataset. Today, both Google Now and Apple’s iBeacons use unique user identities and location to provide seamless services that feel like magic.

The problem is that personalization via consumer-generated data and profiles is key to the IoT truly taking off in consumer-facing applications. To accelerate progress, we need IoT systems that not only manage and protect this data, but can also glean powerful insights from just a few data points. The key will be ensuring that consumers control what information is shared, and how, and that they can share it with whomever they feel offers a positive cost-benefit balance.

Challenge #2: Granular Data Sharing

The true value of the IoT comes from sharing data between systems (for example, cross-referencing information across devices and applications rather than looking at “one device, one protocol, one application” use cases). But this can significantly amplify privacy issues if it’s not managed properly.

I recently purchased a Jawbone activity tracker that could sync with my Withings smart bathroom scale, so I could gain a more holistic view of my fitness and receive better training advice. However, there was no way for me to set up or restrict my data permissions: I had to share all of my personal data between the products, or none at all.

EPCglobal’s own evolution of standards foreshadowed the IoT’s need for sharing schemes with greater granularity. Since then, a number of innovative solutions and standards have been applied, such as sharing proxies based on fine-grained Web application programming interfaces (APIs) built in the Representational State Transfer (REST) style, or social graphs combined with delegated Web authentication mechanisms such as OAuth. The EVRYTHNG IoT smart products platform is architected on this Social Web of Things approach with open data-sharing APIs, based on many years of research. For the IoT to progress in 2015, the industry must understand that the absence of clear and simple controls for granular data management can kill the promise of data sharing.
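The shift from all-or-nothing sharing to fine-grained permissions can be sketched with OAuth-style scopes, where each delegated access token unlocks only the specific data streams a user chose to share. This is a minimal illustrative sketch; the scope names, token values and resource types are hypothetical, not any real platform’s API:

```python
# Sketch of scope-based (OAuth-style) access control for IoT data sharing.
# Each delegated token carries only the scopes the user granted, so a
# connected app can read one data stream without seeing the others.
# All tokens, scopes and resources below are hypothetical examples.

TOKEN_SCOPES = {
    "fitness-app-token": {"activity:read", "sleep:read"},   # no weight access
    "coach-app-token": {"activity:read", "weight:read"},
}

def can_access(token: str, resource: str) -> bool:
    """Return True only if the token was granted a read scope for `resource`."""
    required_scope = f"{resource}:read"
    return required_scope in TOKEN_SCOPES.get(token, set())

# Granular sharing: the fitness app may read activity data,
# but its token does not authorize access to weight readings.
```

With scopes like these, the Jawbone-and-Withings scenario above would let a user delegate activity data to one service while keeping weight data private, instead of an all-or-nothing choice.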

Challenge #3: Providing Transparency and Controlling Access to Data

“Who owns the data?” The answer to this fundamental question will become more convoluted as more devices become connected, and it may differ depending on who is being asked. So creating solutions that provide transparency into who has access to IoT data will be central to policing privacy.

The ideal scenario would be to provide transparency and educate consumers about what they are truly giving away, creating a data democracy. The reality, however, is that analytical techniques combined with incidental data logging (your mobile phone connecting to a nearby antenna, for example) make true data democracy a utopia unless clear governmental regulations are put in place.

Privacy Precedents to Come for the IoT

The question shouldn’t be “Will the IoT respect our privacy?” but “Will we find enough value to embrace IoT technologies even if we need to feed them with private data?” Private data, inevitably, will be exchanged, exposed and leveraged—there’s no going back from where the Web, social-media networks and smartphones have already taken us. However, we should ensure that these exchanges happen inside certain frameworks:
• Standardization of data-exchange protocols to ensure a transparent transport and a fine-grained sharing of private data
• Regulations to specify the boundaries of what companies can and can’t do with private data
• Simplification, standardization and regulation of data-sharing models to ensure that customers understand what they share without reading the small print

It is now time for privacy to become a new currency that lies in the hands of the people or enterprises to whom it belongs: a currency that can be exchanged in an open market in which benefits and costs can be assessed easily and efficiently.

Luckily, we have history to learn from. Through the evolution of technologies such as social networks and RFID systems, we came to understand the crucial balance between features and the hunger for private data, and as a consequence technologists were able to craft innovative ways to mitigate growing data concerns. There is no doubt we will be able to triumph again.



Moving past the illusion of data democracy

Individuals can no longer be their own privacy enforcers

Choice permeates nearly every facet of American life. We celebrate our freedom to voice a preference with every election and every episode of American Idol. We also want choice in information privacy: the power to dictate exactly how our personal data is collected and used.

In a recent Foreign Affairs note, Ann Cavoukian, former Information and Privacy Commissioner of Ontario, Canada, put it this way: “When it comes to regulating privacy, let the people decide.” This concept of privacy as choice originated with Alan Westin, in his groundbreaking 1967 book Privacy and Freedom. He defined privacy as “…the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”

The reality, however, is that pervasive data gathering and analytical techniques fueled by advances in communication and information technology are rendering true data democracy obsolete.

The idea of letting the people decide is appealing. We each have different ideas of what should and should not be done with our information, and we all seek some level of protection from prying eyes. To that end, policymakers developed a “notice and choice” framework, requiring people to decide at the point of data collection whether they accept the specified uses of their information.

The limitations of this approach are, by now, obvious. In practice, it means presenting consumers with incomprehensible legalistic privacy policies that no one reads, yet that are treated as “informed consent” for companies to do as they please. One study estimated that if an average consumer read the privacy policies of every website they visited, it would take 224 hours a year.

The problem is only getting worse as the Internet of Things continues its rapid expansion. When refrigerators, automobiles, smartphones, and just about every object in daily life are all equipped with communications capabilities, it will be impossible to execute a privacy framework in which consumers can examine all the possible data uses before information is collected.

In today’s world of pervasive data collection and use, this blind insistence on a data democracy provides only the illusion of individual control. It is a fake mechanism of autonomy, offering no real consumer protection.

There is an alternative. Years ago, former Obama Administration official Danny Weitzner put it this way: “Consumers should not have to agree in advance to complex policies with unpredictable outcomes. Instead, they should be confident that there will be redress if they are harmed by improper use of the information they provide…”

A reform movement is gaining steam. The goal is to prevent consumer harm—to ensure that information is not used in a way that is adverse to an individual’s legitimate interests. The basic concept is to make companies and institutions responsible for how they use collected data. There will clearly be some role for consumers in this regime, but they will not be the only, or even the primary, agent of enforcement.

The Obama Administration’s recent big data report moved in this direction, recommending that policymakers “look closely at the notice and consent framework that has been a central pillar of how privacy practices have been organized….” The accompanying report from the President’s Council of Advisors on Science and Technology is more direct, urging that “policy attention should focus more on the actual uses of big data and less on its collection and analysis.”

The answer is to focus on vigorous enforcement of existing laws that have proven effective at governing the collection and use of personal information. Policymakers and regulators must be vigilant in monitoring business activity to make certain our legal framework offers adequate consumer protection, and they should consider new restrictions on data collection and use only when real consumer harm is proven.

While it may sound paternalistic to have consumers protected instead of actively protecting themselves, the era of privacy notices has passed. Every new innovation in the Internet of Things provides another crack in the illusion of data democracy. It’s time to move beyond this outdated notion — just as it would make no sense for each of us to become our own meat inspector or bank examiner, it no longer makes sense to expect each of us to be our own privacy enforcer.

