UPDATED 18:29 EDT / JANUARY 28, 2017


When a doll rats out a parent: Tech firms struggle with thorny privacy issues

The rise of intelligent devices, from wearables to smart home sensors to Internet-connected Barbie dolls, is confronting technology companies with a host of new ethical issues involving privacy and security.

That’s forcing them to think carefully in advance about principles they need to apply to a wide range of products for the emerging Internet of Things, said Jules Polonetsky, chief executive of the Future of Privacy Forum, a Washington, D.C.-based think tank with 130 members from around the world, many of whom serve as chief privacy officers at big tech companies.

Polonetsky laid out the emerging ethical and business issues at a Data Privacy Day event held Thursday by the National Cyber Security Alliance at the San Francisco headquarters of Twitter Inc., a sponsor of the event. Data Privacy Day is an annual celebration to recognize the Jan. 28, 1981, signing of Convention 108, the first legally binding international treaty concerning privacy and data protection.

Polonetsky sat down with Jeff Frick, co-host of theCUBE, SiliconANGLE Media’s mobile video studio, to discuss the privacy challenges his organization sees in the age of big data and how companies need to respond to them. This is one of a series of interviews with top executives and thought leaders at the event, and Polonetsky is also our featured guest of the week. The rest of the series will run in coming days.

A need for basic principles

Polonetsky said an overriding dilemma is that America, at least, is of two minds when it comes to privacy. “We don’t have clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or not getting into our stuff because we don’t trust them,” he stated. As the opportunities to mine data from self-driving cars, wearable devices and more continue to grow, establishing fundamental principles for private companies to use in developing their technology becomes paramount.

“We say, ‘Listen, how can we have data that’ll make cars safer, how can we have wearables that’ll help improve fitness, but also have reasonable, responsible rules in place so that we don’t end up with discrimination or data breaches and all the problems that can come along?’” Polonetsky said.

The problem is that the very data that makes many new technologies work can also make them threatening. Advertising technology, for example, makes it possible to offer a lot of media to consumers for free, but it can also collect more data about people than they would like.

“We can’t have that when it comes to microphones in my home,” Polonetsky said. “I don’t want to be nervous that if I go into the bedroom, suddenly that’s shared across the ad tech ecosystem. I don’t know that we want [extremely personal data] being out there and available to data brokers.”

And very few makers of newer technologies, such as wearables, have even a rudimentary privacy policy or a commitment not to do certain things with users’ data, according to the Future of Privacy Forum. What if a wearable notices a potential health problem, for example? Should it offer advice, or would that be seen as creepy or invasive?

New ethical issues

That said, Polonetsky thinks tech companies are aware of the problem and are trying to be careful.

“I think the big tech companies, by a lot of pain and suffering over the years of being criticized, and with the realization that government goes to them for data, they don’t want that,” Polonetsky said. “They don’t want to be fighting the government and people being nervous that the IRS is gonna try to find out information about what you’re doing, which bedroom you’re in, what time you came home.”

Then there are new ethical issues. When does a self-driving car decide to put itself in a ditch to avoid hitting a pedestrian, even though it knows your children are in the car? Polonetsky asked. Should police be able to use data from smart-home sensors showing that a murder suspect used a lot of water one evening to suggest he was washing away bloodstains? And what if a device overhears what could be child abuse, such as Hello Barbie hearing a child screaming, or a child telling the doll that a parent touched her inappropriately?

“Do we want dolls ratting out parents?” he asked. “The machines will be providing clues that in some cases are going to incriminate us. Companies that don’t want to be in the middle need to think about designing for privacy so as to avoid creating a world where all data is available to be used against us.”

Polonetsky said these issues need to be addressed collectively, with input from advocates, civil libertarians and companies. “If we can chart a path forward that lets us use these new technologies in ways that advance society, I think we’ll succeed,” he said. “If we don’t think about it, we’ll wake up and we’ll learn that we’ve really constrained ourselves and narrowed our lives in ways that we may not be very happy with.”

Here’s the complete video interview with Polonetsky. You can watch the rest of theCUBE’s coverage of the event here.
