Jules Polonetsky, Future of Privacy Forum | Data Privacy Day 2017

[Jeff] - Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at Twitter's world headquarters for Data Privacy Day, a full-day event of sessions and breakouts really talking about privacy. "Privacy is dead, get over it" may have been the line back in 1999, but it's not really true, and certainly a lot of people here beg to differ. We're excited to have our next guest, Jules Polonetsky, CEO of the Future of Privacy Forum. Welcome.

[Jules] - Thank you, great to be here. Exciting times for data, exciting times for privacy.

[Jeff] - Yeah, no shortage of opportunity, that's for sure. Job security in the privacy space is pretty high, I'm gathering, after a few of these interviews.

[Jules] - Every day there's a researcher coming up with some new way we can use data that is both exciting, curing diseases, studying genes, but also sometimes Orwellian. Microphones are in my home, self-driving cars, and so getting that right is hard. We don't have clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or staying out of our stuff because we don't trust them.

[Jeff] - Right.

[Jules] - So, challenging times.

[Jeff] - So, before we jump into it, the Future of Privacy Forum: a little bit about the organization, your mission...

[Jules] - We're eight years old at the Future of Privacy Forum. We're a think tank in Washington, D.C. Many of our members are the chief privacy officers of companies around the world, about 130 companies, ranging across many of the big tech companies. And as new sectors start becoming tech and data sectors, they join us: the auto industry dealing with self-driving cars, connected cars, all those issues; wearables; student data. So about 130 of those companies. But the other half of our group are advocates and academics who are a little bit skeptical or worried. They want to engage, but they're worried about an Orwellian future. So we bring those folks together and we say, 'Listen, how can we have data that will make cars safer? How can we have wearables that'll help improve fitness? But also have reasonable, responsible rules in place so that we don't end up with discrimination, or data breaches, and all the problems that can come along?'

[Jeff] - Right, because it's really two sides of the same coin, and it's always two sides of the same coin. Typically with new technology we race ahead on the positive, because everybody's really excited, and we lag on the negative impacts and on creating rules and regulations, because new technology is very hard to keep up with.

[Jules] - You know, the stakes are high. Think about adtech, right? We've got tons of adtech. It's fueling free content, but we've got problems of adware, and spyware, and fake news, and people being nervous about cookies and tracking. And every year it seems to get more stressful and more complicated. We can't have that when it comes to microphones in my home. I don't want to be nervous that if I go into the bedroom, suddenly that's shared across the adtech ecosystem. Right? I don't know that we want how much we sweat, or when it's somebody's time of the month, or other data like that being out there and available to data brokers. But we did a study recently of some of the wearables, the more sensitive ones: sleep trackers, apps that people use to track their periods. Many of them didn't even have a privacy policy to say 'I don't do this, or I don't do that with your data.' So, stakes are high.
This isn't just about, you know, are ads tracking me, and do I find that intrusive? This is about making sure that if I'm driving my car, and it's helping me navigate, giving me directions, keeping me from drifting out of my lane, or self-parking, that data doesn't automatically go to all sorts of places where it might be used to deny me benefits, or discriminate, or raise my insurance rates.

[Jeff] - Right, right. Well, there are so many angles on this. One is, you know, I got an Echo Dot for Christmas, for the family, to try it out, and it's interesting to think that she's listening all the time.

[Jules] - So she's not.

[Jeff] - And you push the little button, you know. Or is she not?

[Jules] - Let's talk about this. This is a great topic, because a sheriff recently wanted to investigate a crime and realized there was an Amazon Echo in the home, and said, 'Well, maybe Amazon will have data about what happened. Maybe there'll be clues, people shouting,' you know. And Amazon's fighting it, because they don't want to hand it over. But what Amazon did, and what Google Home did, and the Xbox did, is design these things so that they don't have that data, and I think with actually a lot of care. So the Echo is listening for its name. It's listening for 'Alexa,' and it keeps deleting. It listens, it hears background noise, and if it didn't hear 'Alexa,' it drops it, drops it, drops it. Nothing is sent out of your home. When you say 'Alexa, what's the weather?' the blue light glows, it opens up the connection to Amazon, and now it's just like you're typing in a search or going to Amazon directly.

[Jeff] - Right, right.

[Jules] - And so that's done quite carefully. Google Home works like that, Siri works like that. So I think the big tech companies, despite a lot of pain and suffering over the years of being criticized, and with the realization that government goes to them for data, they don't want that. They don't want to be fighting the government, and people being nervous that the IRS is going to try to find out what you're doing, which bedroom you're in, and what time you came home.

[Jeff] - Although the Fitbit has all that information.

[Jules] - Exactly.

[Jeff] - Even though Alexa doesn't.
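[Editor's note: below is a minimal sketch, in Python, of the on-device wake-word gating Jules describes. The chunk format and the detect_wake_word and stream_to_cloud functions are hypothetical placeholders, not Amazon's actual implementation; the point is the pattern: audio lives only in a short rolling buffer that is continually discarded, and nothing leaves the home until the local detector fires.]

```python
from collections import deque

BUFFER_CHUNKS = 20  # roughly 2 seconds of audio held on-device, nothing more

def detect_wake_word(chunk):
    """Hypothetical local detector: fires only on the wake word."""
    return chunk == "alexa"

def stream_to_cloud(chunk):
    """Hypothetical uplink, used only after the wake word has fired."""
    print("-> cloud:", chunk)

def run(microphone):
    buffer = deque(maxlen=BUFFER_CHUNKS)  # rolling window; old audio falls away
    awake = False
    for chunk in microphone:
        if not awake:
            buffer.append(chunk)             # held briefly, never transmitted
            awake = detect_wake_word(chunk)  # "blue light glows" on a match
        elif chunk == "<end of query>":
            awake = False                    # request done; back to deleting
            buffer.clear()
        else:
            stream_to_cloud(chunk)           # only now does audio leave home

# Simulated stream: the background noise is dropped, only the query is sent.
run(["tv noise", "footsteps", "alexa",
     "what's", "the", "weather", "<end of query>"])
```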
[Jules] - So the wearables are another exciting, interesting challenge. We had a project that was funded both by the Robert Wood Johnson Foundation, which wants wearables to be used for health and so forth, and by a lot of major tech companies, because everybody was aware that we needed some sort of rules in place. So if Fitbit, or Jawbone, or one of the other wearables can detect that maybe I'm coming down with Parkinson's, or I'm about to fall, or other data, what's their responsibility to do something with that? On one hand, that would be a bit frightening. You get a phone call or an email saying, 'Hey, this is your friendly friends at your wearable, and we think...'

[Jeff] - Showing up at your front door.

[Jules] - 'You should seek medical help.' You would be like, whoa, wait a second, right? On the other hand, what do you do with the fact that maybe we can help you? Take student data. Edtech is very exciting, there are such opportunities for personalized learning, and colleges are getting in on the act. They're trying to do big data analytics to understand how to make sure you graduate. Well, what happens when a guidance counselor sits down and says, 'Look, based on the data we have, your grades, your family situation, whether you've been to the gym, your cafeteria usage, data we took off your social media profile, you're really never going to make it in physics. I mean, the data says people with your particular attributes rarely succeed in graduating with a degree in four years. You need to change your scholarship. You need to change your career path. Or you can do what you want, but we're not going to give you that scholarship. Or simply, we advise you.' Now, what did we just tell Einstein? Maybe not to take physics, right? But on the other hand, don't I have some responsibility if I'm a guidance counselor who would be looking at your records today, sort of shuffling some papers and saying, 'Well, maybe you want to consider something else?' So we talk about this as privacy, but increasingly, many of my members, again, the chief privacy officers of these companies, are facing what are really ethical issues. There may be risks, there may be benefits, and they need to help their companies decide when the benefit outweighs the risk. Consider self-driving cars, right? When does the self-driving car say, 'I'm going to put this car in the ditch because I don't want to run somebody over?' But now it knows your kids are in the back seat. What sort of calculations do we want this machine making? Do we know the answers ourselves? If the microphone in my home hears child abuse, if Hello Barbie hears a child screaming, or 'Hey, I swallowed poison,' or 'My dad touched me inappropriately,' what should it do? Do we want dolls ratting out parents, and the police showing up saying, 'Barbie says your child's being abused'? I mean, my gosh, I can see times when my kids thought I was a big Grinch, and if the doll was reporting 'Hey, Dad is being mean to me,' you know, who knows. So these are challenges that we're going to have to figure out, collectively, with stakeholders, advocates, civil libertarians, and companies. And if we can chart a path forward that lets us use these new technologies in ways that advance society, I think we'll succeed. If we don't think about it, we'll wake up and learn that we've really constrained ourselves and narrowed our lives in ways that we may not be very happy with.

[Jeff] - Fascinating topic. And on the child abuse thing, you know, there are very strict rules for people in occupations dealing with children, whether it's a doctor, a teacher, or even a school administrator: if they have some evidence of, say, child abuse, they're obligated...

[Jules] - They're obligated.

[Jeff] - Not only are they obligated morally, they're obligated professionally and legally, right, to report that. Do you see those laws just getting translated onto the machine? You could even argue that the machine probably has better data and evidence, based on time and frequency, than a teacher who happens to see maybe a bruise, or a kid acting a little bit different on the schoolyard.

[Jules] - You can see a number of areas where law is going to have to rethink how it fits. Today, if I get into an accident, we want to know whose fault it is. What happens when my self-driving car gets into an accident? Right? I didn't do it, the car did it. So, do the manufacturers take responsibility?
If I have automated systems in my home, robots and so forth, again, am I responsible for what goes wrong? Or do these things, or their companies, have some sort of responsibility? So thinking these things through is where I think we are first. I don't think we're ready for legal changes. I think what we're ready for is an attitude change, and I think that's happened. When I was the chief privacy officer at AOL, many years ago, we were so proud of our cooperation with the government. If somebody was kidnapped, we were going to help. If somebody was involved in a terrorism thing, we were going to help. And companies, I think, still recognize their responsibility to cooperate when there's, you know, criminal activity. But they also recognize that it's their responsibility to push back when government says, 'Give me data about that person.' 'Well, do you have a warrant? Do you have a basis? Can we tell them so they can object? Right? Is it encrypted? Well, sorry, we can't put all of our users at risk by cracking encryption for you because you're following up on one particular crime.' So there's been a big sea change in understanding that if you're a company, there's data you don't want to have to hand over. Data about immigrants, today: lots of companies, in the Valley and around the country, are thinking, 'Wait a second, could I be forced to hand over data that could lead to someone being deported? Or tortured? Or who knows what?' Given that these things seem to be back on the table. And, you know, years ago you were a good actor if you participated with law enforcement, and people still participate, but they also recognize a strong obligation to either not have the data, like Amazon, which will not have the data this sheriff wants. Now, their smart meter, and how much water they're using, and all kinds of other information about their activity at home, since so many things in our homes are now smart, may indeed be available. How much water did you use at this particular time? Maybe you were washing bloodstains away. That sort of information is...

[Jeff] - Wild.

[Jules] - ...going to be out there. So the machines will be providing clues that in some cases are going to incriminate us. And companies that don't want to be in the middle need to think about designing for privacy, so as to avoid creating a world where all that data is available to be used against us.

[Jeff] - Right, and then there's the whole factor of the devices being in place and, whether or not the company is using them, bad actors taking advantage of cameras and microphones everywhere, hacking into these devices to do things. It's one thing to take a look at me while I'm on my PC; it's another thing to take control of my car. Right? And this is where there are some really interesting challenges ahead. As IoT continues to grow and everything becomes connected, the security people always like to say, you know, the attack surface grows exponentially.

[Jules] - Yeah. Well, cars are going to be an exciting opportunity. We have released today a guide that the National Automobile Dealers Association is providing to auto dealers around the country. Because when you buy a car today, and you sell it or you lend it, there's information about you in that vehicle.
Your location history, maybe your contacts, your music history. We would never give a phone away without clearing it, and you wouldn't give your computer away, but you don't think about your car as a computer. So this guide has all kinds of advice for people. Listen, your car is a computer. There are things you want to do to take advantage of that...

[Jeff] - Right.

[Jules] - New services, safety. But there are also things you want to do to manage your privacy. Delete. Make sure you're not sharing your information in a way you don't want to.

[Jeff] - Jules, we could go on all day, but I've got to let you get back to the sessions. Thanks for taking a few minutes out of your busy day.

[Jules] - Really good to be with you.

[Jeff] - Absolutely. Jeff Frick, you're watching theCUBE. See you next time.

(closing music)

Published: Jan 28, 2017
