Denelle Dixon, Mozilla | Data Privacy Day 2017
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. It is Data Privacy Day, which I just found out has been going on for about 20 or 30 years, but we're happy to be at our very first one. We're in downtown San Francisco at the Twitter headquarters. It's a full-day event that's actually happening around the world, but we're here in San Francisco, and we're excited to have some of the guests come down that are doing the panels and the discussions and the breakout sessions. We're excited for our next guest, Denelle Dixon, Chief Legal and Business Officer at Mozilla. Welcome!
>> Thank you, happy to be here.
>> So there was a spirited panel to kick off the day. I wonder if you could share some of your thoughts as to some surprises that came out of that conversation?
>> So not so many surprises, but we talked a lot about IoT, the Internet of Things, the web of things, whatever we're going to call it, and the data that's available as a result of that to companies, to governments, to lots of different entities, and whether consumers understand that, and the responsibilities that both the consumers and the technology companies have with respect to that data.
>> And Mozilla, obviously, was right there at the big change to go to, you know, a graphical web interface, which was a sea change really in the internet and how it would interact with people. IoT represents that same kind of thing, and oh, by the way, people are things too, as we like to say on theCUBE. So as you look at the new challenges posed by IoT, what are some of the things that bubble onto your priority list, in terms of things that really need to be thought of that people aren't thinking enough about now?
>> I think one of the most important things about IoT is the idea that this is information that's collected and used by devices and technology companies, because it can be wearable, it can be things that you have in your house that collect data as you're talking to them. One of the most important things, keeping Data Privacy Day in mind, is that we make sure consumers are aware that this is actually happening, that data is being collected and sent, and how that data is being used. It used to be, back in the day, that we could have privacy policies, so we put them up, 15 pages long, and assumed that users understood them. Well, that can't work with these kinds of devices, so we need to be innovative, we need to be creative, we need to be able to ask questions of these devices and have them tell us what's going on with the data they collect and how they're doing that. So it's just as incumbent upon the technology companies that create these devices to ensure that users understand that, as it is upon the users to understand that these kinds of actions are happening, along with the trade-offs that come with them. Really interesting, crazy, exciting in terms of the different technologies that we can use, but really important that we get this right.
>> It just strikes me that, I think, so many people just click, yes I accept.
Are people really paying attention? I mean, I'm sure some people are, but it just seems that most people just click and accept, click and accept, click and accept, especially if you've gotten into that behavior pattern. They haven't really thought about the way these applications are evolving, haven't really thought about how Facebook on your laptop or on your PC at home is different from Facebook on your mobile, haven't really thought about, wow, what are these connected devices now collecting, data that, as you said, you didn't even get the chance to opt in to. So how do you educate people to make intelligent choices, and how do we, like, break the EULA up, maybe, so that I can opt in if I want to share A, B, and C, but not D, E, and F, and oh, I forgot, I really need F to make this thing function? It seems like a really complicated kind of disclosure problem.
>> It is complicated, and that doesn't mean we don't have to crack it. So you said the word EULA, that's the End User License Agreement, and I don't think we can live in a world of EULAs. I think we live in a world of in-context notices, which we have to actually create so that your interface, or whatever small device you have, is able to alert you that this data collection is actually happening. So it has to be in context, it has to be creative, it has to be part of product development, it can't be an afterthought. It used to be that they would hand this over to the lawyers and say, hey, can you help us figure out how to notify our users. This has to be part of our innovation process today. We're seeing more and more of it. We're seeing technology companies take this seriously and include privacy by design in their product development, making these in-context notices part of the way they think about the product, not just an afterthought. The more we do this, the better it's going to be for all of us. Just because it's hard means that it's a creative, thoughtful, amazing process that we all need to engage in.
>> So one of the hot topics that we cover a lot is diversity in tech, and women in tech specifically. Not only is it the right thing to do, but there are very clearly defined positive business outcomes when you have a diversity of opinions when you're making decisions. Is there a corollary to what you're describing, in terms of being more forthright in your privacy policy? It's really not only a right-thing-to-do question, which is fine, but is there a real business benefit that you can see, or that you project, that's going to be an even better motivator for people to start changing their behavior in the way they disclose or interact with people on the privacy issue?
>> Yeah, I love the way you introduced that, because from my standpoint, one of the things that we don't like to be in life is surprised. So one of the most important things, if you think about everything, is a no-surprises rule. If we start thinking about business and our engagement with our users as creating a no-surprises opportunity, it actually creates trust, it fosters deeper engagement, it makes it so that we are all going to be happier in terms of that relationship. Maybe the users actually give more to the product, maybe the product can actually give more back to the user. So this no-surprises rule, and the way that we can operate, creates really nice business cycles and really nice, interesting dynamics between consumers and the businesses they use.
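Frick's idea of breaking the EULA into per-purpose choices can be made concrete. Here is a minimal sketch in Python of what such a granular consent model might look like; the purpose names and the split between required and optional purposes are illustrative assumptions, not any real product's policy:

```python
# A minimal sketch of granular, per-purpose consent, replacing an
# all-or-nothing EULA checkbox. Purpose names are hypothetical.

REQUIRED_PURPOSES = {"core_functionality"}          # the "F" you can't opt out of
OPTIONAL_PURPOSES = {"analytics", "ad_targeting", "third_party_sharing"}

class ConsentPreferences:
    def __init__(self):
        # Required purposes are on by default and disclosed up front;
        # optional purposes start OFF until the user opts in.
        self.granted = set(REQUIRED_PURPOSES)

    def opt_in(self, purpose: str) -> None:
        if purpose not in REQUIRED_PURPOSES | OPTIONAL_PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.granted.add(purpose)

    def opt_out(self, purpose: str) -> None:
        if purpose in REQUIRED_PURPOSES:
            # Surface this honestly instead of burying it in the EULA.
            raise PermissionError(f"{purpose} is required for the product to work")
        self.granted.discard(purpose)

    def allows(self, purpose: str) -> bool:
        return purpose in self.granted

prefs = ConsentPreferences()
prefs.opt_in("analytics")            # share A, but not B and C
assert prefs.allows("analytics")
assert not prefs.allows("ad_targeting")
```

The design mirrors Dixon's no-surprises rule: optional purposes default to off, required ones are disclosed rather than hidden, and every data use is checked against an explicit grant.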
>> It's great, the trust word in it. It also plays into the services angle, in that everything is a service. Because when everything is a service, you have to maintain a solid, ongoing relationship. It's not a one-time purchase, adios, we're never going to see you again. So that really plays into this. If it's a trusted service provider that you feel good about, you continue to pay that $9.95 to Spotify or whoever that service provider is. So it's a really different way of looking at the world.
>> It is, and it's one of the things that we actually encouraged from the very outset, this kind of creation of trust. Trust is really easy to lose with respect to your consumer base, and it's the most important thing as you're engaging. We created these initiatives called the lean data practices, and then we also have privacy initiatives that we put out there for startups and other entities that they can utilize and hopefully build into their businesses. Part of it is the no-surprises rule, but it's also: think about what data you want to collect, so that you're collecting only what you need; throw away what you don't; anonymize it. Really create that trusted relationship, because you can always grow. If you find you actually need more data today than you did when you started a year ago, that's a great way to have a conversation with your consumer base. So trust starts it all. From Mozilla's standpoint, we operate that way through our products, because we definitely have that in our Firefox browser and the other products that we have on mobile, but one of the things we care about is creating this awesome opportunity for the web to continue to grow, and so we care about how other companies are approaching this too.
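The lean data practices Dixon describes, collect only what you need, throw away what you don't, anonymize the rest, translate directly into code. A minimal sketch follows; the event fields and the specific pseudonymization choice are assumptions for illustration, not Mozilla's actual pipeline:

```python
import hashlib

# Hypothetical event record from a product client.
raw_event = {
    "user_id": "u-82731",
    "action": "page_view",
    "page": "/pricing",
    "email": "alice@example.com",     # not needed for this analysis
    "gps_location": (37.78, -122.40), # not needed either
    "timestamp": "2017-01-28T10:15:00Z",
}

NEEDED_FIELDS = {"user_id", "action", "page", "timestamp"}

def lean_record(event: dict) -> dict:
    """Keep only the fields the analysis needs, and pseudonymize the ID."""
    kept = {k: v for k, v in event.items() if k in NEEDED_FIELDS}
    # One-way hash so analysts can count distinct users without holding
    # the real identifier. (A production system would likely use a keyed
    # hash or tokenization, plus a retention schedule.)
    kept["user_id"] = hashlib.sha256(kept["user_id"].encode()).hexdigest()[:16]
    return kept

print(lean_record(raw_event))
```

Dropping fields at ingestion, rather than filtering later, is what makes the practice "lean": data that was never collected cannot be breached, subpoenaed, or abused.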
>> So you mentioned Firefox, and you guys have a new product coming out today, Firefox Focus. So explain to folks, what is Firefox Focus, why should they care, and what's different about it from traditional Firefox?
>> Right, so we've had Focus on iOS before, and today we actually launched it in 27 languages, in 27 different areas where you can get it. It's a privacy-focused browser, but it can also be performance-focused. You have content you can exclude, some content doesn't get pushed through, so your performance is faster, and you can really focus on what kind of data you want to share with companies. So try it out. I think it's an awesome experience, certainly from the standpoint of privacy, but also from performance.
>> So Denelle, 2017, we just flipped the calendar a few weeks ago. As you look forward in the year, you probably went through your annual planning process. What are some of your priorities for 2017? What are you looking forward to at the top of your list for the next 12 months?
>> So at the top, I run the policy, business, and legal teams at Mozilla. From a policy standpoint, we're really focused on encryption, security, and privacy, looking at the new administration here in the US as well as what's happening in Europe. I think it's a really important area for us to focus on. From a business standpoint, I want to see us really dive into growth with respect to Firefox as our desktop browser. I want to see our mobile space grow, and grow even outside the browser. So I'm really excited about what we can do there. And then from the legal side, I want to continue to push the envelope on this no-surprises idea, doing that in more areas that we can with respect to our products, and pushing the idea outside too.
>> I love that, no surprises, it's like a bumper sticker. (laughs) She's Denelle, I'm Jeff, you're watching theCUBE, see you next time.
Eve Maler | Data Privacy Day 2017
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We are in downtown San Francisco at the Twitter headquarters for a big event, the Data Privacy Day that's been going on for years and years. It's our first visit, and we're excited to be here. And our next guest is going to talk about something that is near and dear to all of our hearts. Eve Maler, she's the VP of Innovation and Emerging Technology for ForgeRock. Welcome.
>> Thank you so much.
>> Absolutely. So for people who aren't familiar with ForgeRock, give us a little background on the company.
>> Sure. So, of course, the digital journey for every customer and consumer and patient and citizen in the world is so important, because trust is important. And so what ForgeRock is about is creating that seamless digital identity journey across cloud, mobile, and Internet of Things devices, across all of their experiences, in a trustworthy and secure way.
>> So one of the topics that we had down in getting ready for this was OAuth.
>> Yes.
>> And as the proliferation of SaaS applications continues to grow, both in our home life as well as our work life, we have these pesky things called passwords, which no one can remember and which they force you to change all the time. So along comes OAuth.
>> Yes. So OAuth is one of those technologies... I'm kind of a standards wonk. I actually had a hand in creating XML, for those people who remember XML.
>> Jeff: That's right.
>> OAuth took a tack of saying, let's get rid of what's called the password anti-pattern. Let's not give out our passwords to third-party services and applications; instead, we can just give those applications what's called an access token that's meant just for that application. In fact, Twitter, and we're here at Twitter headquarters, uses that OAuth technology. And I'm involved in a standard, being a standards wonk, that builds on top of OAuth, called User-Managed Access. It uses this so that we can share access with applications in the same way, and so that we can also share access with other people using applications. So for example, the same way we hit a share button in Google, where Alice hits a share button to share access to a document with Bob, we want to allow every application in the world to be able to do that, not just Google Docs, Google Sheets, and so on. So OAuth is powerful, and User-Managed Access is powerful for privacy in the same way.
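To illustrate the pattern Maler describes, here is a minimal sketch of the difference between the password anti-pattern and an OAuth-style access token, in Python. The endpoint URL, scope, and token value are placeholders, not any real provider's API:

```python
import requests  # third-party HTTP library

# The password anti-pattern: the third-party app collects, stores, and
# replays YOUR password, so it can do anything your account can do.
#   requests.get("https://api.example.com/timeline",
#                auth=("alice", "alices-real-password"))   # don't do this

# The OAuth pattern: the app never sees the password. It holds a token
# issued just for it by the authorization server, limited in scope and
# revocable without a password change.
ACCESS_TOKEN = "token-issued-by-authorization-server"  # placeholder value

resp = requests.get(
    "https://api.example.com/timeline",              # hypothetical endpoint
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
print(resp.status_code)
```

User-Managed Access extends the same machinery to person-to-person sharing: Alice's authorization server can issue Bob's client a token scoped to exactly the resources Alice chose to share, and she can revoke it later.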
>> Now there's OAuth, and I use my Twitter OAuth all the time. Or with Google.
>> That's right.
>> And then there's these other kinds of third-party tools, which add kind of another layer.
>> So you might use, like, Tweetbot, which is something I like to use on my phone to tweet.
>> Jeff: Right, right.
>> And so there's...
>> Well, there's the Tweetbot. But then there's these pure identity password manager applications, where you know, you load it into there and then...
>> LastPass or something like that.
>> Right, right, right.
>> 1Password, people use, yeah.
>> To me it's just like, wow, that just seems like it's adding another layer. And oh my gosh, if I forget the LastPass password, I'm really in bad shape.
>> You are.
>> Not just for one application, but a whole bunch. I mean, how do you see the space evolving to where we've got to now? And how is it going to change going forward? It just fascinates me that you still have passwords when our phones have fingerprint readers.
>> TouchID.
>> Why can't it just work off my finger?
>> More and more, SaaS services and applications are actually becoming more sensitive to multifactor authentication, strong authentication, what we at ForgeRock would actually call contextual authentication, and that's a great way to go. They're leveraging things like TouchID, like device fingerprint, for example, recognizing that the device kind of represents you and your unique way of using the device. And in that way, we can start to do things like what's called a passwordless flow, where most of the time, or all of the time, you don't actually use a password at all. And so, I don't know, I used to be an industry analyst, and 75 percent of my conversations with folks like you would be about passwords. More frequently now, I would say, people are more password savvy. More of the time people are turning on things like multifactor authentication, and more of the time the service knows the context: that I'm using my corporate WiFi, which is safer, or that I'm using a familiar device. And that means I don't have to use the password as often. So that's contextual authentication, meaning I don't have to use that insecure password so often.
>> Jeff: Right.
>> So I think the world has gotten actually a little bit smarter about authentication, I'm hoping. And actually, technologies like OAuth and the things that are based on OAuth, like OpenID Connect, which is a modern, federated identity technology, and things like User-Managed Access, are leveraging the fact that OAuth is getting away from password-based authentication, not flinging the password around the internet, which is the problem.
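A rough sketch of the contextual authentication idea Maler outlines: score risk from signals like network and device familiarity, and only step up to a password or second factor when the context looks unusual. The signal names and thresholds here are invented for illustration, not ForgeRock's actual policy engine:

```python
def auth_decision(ctx: dict) -> str:
    """Decide how much authentication friction to apply, given context."""
    risk = 0
    if not ctx.get("known_device"):       # device fingerprint not seen before
        risk += 2
    if not ctx.get("corporate_wifi"):     # unfamiliar network
        risk += 1
    if ctx.get("new_country"):            # impossible-travel style signal
        risk += 3

    if risk == 0:
        return "allow"                    # passwordless: context vouches for you
    elif risk <= 2:
        return "touch_id"                 # local biometric, still no password
    else:
        return "password_plus_otp"        # full step-up for a risky context

print(auth_decision({"known_device": True, "corporate_wifi": True}))  # allow
print(auth_decision({"known_device": False}))  # password_plus_otp
```

The point of the design is that the insecure shared secret becomes the exception path rather than the default, which is exactly the passwordless flow described above.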
>> Right, right. Okay, so that's good, that's getting better. But now we have this new thing: the Internet of Things.
>> Yes indeed.
>> And people are things. But now we've got connected devices, not necessarily ones that I purchased, that I authorized, that I'm even maybe aware of.
>> Like a beacon on a wall, just observing you.
>> Like a beacon on a wall, and sensors, and the proliferation is just now really starting to run. So from a privacy point of view, how does the IoT that I'm not directly involved with compare to IoT with my Alexa, compare to applications that I'm actively participating in? How do those lines start to blur? And how do the privacy issues spill over into managing this wild world of IoT?
>> Yeah, there are a couple of threads with the Internet of Things. I'm here today at this Data Privacy Day event to participate on a panel about the IoT tipping point, and there are a couple of threads that are just really important. One is that the security of these devices is in large part a device identity theft problem. The Dyn attack, in fact, was an identity theft problem of devices. We had poorly authenticated devices. Devices have identities, they have identifiers, and they have secrets, and it was a matter of their own passwords being easily taken over. It was account takeover, essentially, for devices. That was the problem, and that's something we have to be aware of. So just like applications and services can have identities, just like people, and we've always known that, it's something our platform can handle, we need to authenticate our devices better, and that's something manufacturers have to take responsibility for.
>> Jeff: Right.
>> And we can see the government agencies starting to crack down on that, which is a really good thing. The second thing is, there's a saying in the healthcare world, among people who are working on patient privacy rights, for example. The saying is: no data about me without me. So there's got to be that kind of pressure. You know, we used to see a front-page news article about the latest password breach all the time. We don't actually see so many password breaches anymore as this multifactor authentication comes into play. So that's the industry pressure at work, where passwords become less important because we have multifactor. Now we're starting to see consumer pressure saying, I want to be a part of this. I want you to tell me what you shared. I want more transparency, and I want more control. And that's got to be part of the equation now when it comes to these devices. It's got to be not just more transparent, but: what is it you're sharing about me?
>> Jeff: Right.
>> Last year I actually bought, maybe this is TMI, I always have this habit of sharing too much information...
>> That's okay, we're on theCUBE, we like...
>> Being honest here.
>> To go places other companies don't go.
>> I bought one of those adjustable beds that actually has an air pump that...
>> What's your number? Your sleep number.
>> It is, it's a Sleep Number bed, and it has a feature that connects to an app that tells you how well you slept. You look at the terms and conditions, and it says: we own your biometric data, we are free to do whatever we want.
>> Where did you even find the terms and conditions?
>> They're right there in the app, to use the app.
>> Oh, in the app.
>> You have to say yes.
>> So you actually read before just clicking on the box.
>> Hey, I'm a privacy pro, I've got to.
>> Right, right, right.
>> And of course, I saw this, and to use the feature, you have to opt in.
>> Right.
>> This is the way it is. There's no choice, and they probably got some lawyer... This is the risk management view of privacy. It's no longer tenable to have just a risk management view, because the most strategic and most robust way to see your relationship with your customers is to realize there are two sides to the bargain, because businesses are commoditized now. There are low switching costs for almost anything. I mean, I bought a bed, but I don't have to have that feature.
>> Do you think they'll break it up? So you want the bed, and you're using a Fitbit or something else to tell you whether you got a good night's sleep. Do you see businesses starting to break up the units of information that they're taking, and can they deliver an experience based on a fragmented selection?
>> I do believe so. So, User-Managed Access and certain technologies and standards like it... there's a standard called consent receipts. They're based on the premise of delivering convenient control to users. And there are regulations coming, like the General Data Protection Regulation in the EU. It's bearing down on pretty much every multinational, every global enterprise that monitors or sells to an EU citizen, and that's pretty much every enterprise.
>> Jeff: Right, right.
>> It demands that individuals get some measure of the ability to withdraw consent in a convenient fashion. So we've got to have consent tech that measures up to the policy that these...
>> Right.
>> ...organizations have to have. So this is coming whether we like it or not. But we should have a robust and strategic way of exposing to people the kind of control that they want anyway.
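Maler mentions consent receipts, a standards effort to give the user a record of what they agreed to, which in turn makes withdrawal practical. Here is a minimal sketch of what such a record might carry; the field names are a simplification invented for this example, not the normative receipt schema:

```python
import json
import uuid
from datetime import datetime, timezone

def issue_consent_receipt(subject: str, controller: str, purposes: list) -> dict:
    """Hand the user a receipt for the consent they just gave."""
    return {
        "receipt_id": str(uuid.uuid4()),
        "issued_at": datetime.now(timezone.utc).isoformat(),
        "data_subject": subject,          # who consented
        "data_controller": controller,    # who collects and is accountable
        "purposes": purposes,             # what the data will be used for
        "withdrawal_uri": "https://example.com/consent/withdraw",  # placeholder
    }

receipt = issue_consent_receipt(
    "alice", "SleepCo", ["sleep_quality_feedback"]
)
print(json.dumps(receipt, indent=2))
# Withdrawing consent then means presenting the receipt_id at the
# withdrawal endpoint, rather than hunting through a EULA.
```

A receipt flips the usual asymmetry: the company keeps a record that you clicked yes, and under this pattern, so do you.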
>> Jeff: Right.
>> They all tell us they want it. So in essence, personal data is becoming a joint asset. We have to conceive of it that way.
>> So that's in your sleep app, but what about the traffic cameras and the public facilities?
>> Yeah.
>> I mean, they say in London you're basically on camera all the time. I don't know if that's fact or not, but clearly there are a lot...
>> That's true, CCTV, yeah.
>> ...of cameras that are tracking your movements. You don't get a chance to opt in or out.
>> That is actually true. That's a tough case.
>> You don't know.
>> The class of... Yeah. The class of beacons.
>> And security, right. Obviously, in a post-9/11 world, that's usually the justification: we want to make sure something bad doesn't happen again, we want to keep track.
>> Yeah.
>> So how does the government's role in that play out? And even within the government, you have all these different agencies, whether it's the traffic agency, or even just a traffic camera that maybe KCBS puts up to keep track of, you know, it says slow down between two exits. How does that play into this conversation?
>> Yeah, where you don't have an identified individual, and not even an identifiable individual... these are actually terms, if you look at GDPR, which I've read closely. It is a tougher case. I have worked on this, though: one of the members of my User-Managed Access working group is one of the experts on UK CCTV matters. And it is a very big challenge to figure out, and governments do have a special duty of care to figure it out. The toughest cases are when you have beacons that just observe passively, especially because of the incentives: I will grant you, how do they go and identify somebody who's hard to identify, and then go inform them and be transparent about what they're doing?
>> Jeff: Right, right.
>> So in those cases, even heuristically identifying somebody is very, very tough. However, there is a case where iBeacons in, say, retail stores do have a very high incentive to identify their consumers and their retail customers.
>> Right.
>> And in those cases, the incentives flip in the other direction, toward transparency and reaching out to the customer.
>> Yeah. On the tech of these things: someone who I will not name recently got a drive-through red light ticket.
>> Yep.
>> And the clarity of the images that came on that piece of paper was unbelievable.
>> Yes.
>> So I mean, if you're using any kind of monitoring equipment, the ability to identify is pretty much there.
>> Now we have a case... so this just happened. Actually, I'm not going to say whether it was to me or to my husband. It was in a non-smart car, in a non-smart circumstance, where a red light camera simply takes a picture of an identified car. So you've got a license plate, and that binds it to the registered owner of the car.
>> Right.
>> Now, I have a car that's registered in the name of a trust. They didn't get a picture of the driver; they got a picture of the car. It's a dumb-car circumstance, registered to a trust, not to an individual, so they sent us what amounted to a parking ticket, because they couldn't identify the driver. So now that gives us an opportunity to map this to an IoT circumstance, because if you've got a smart device, you've got a person, and you've got a cloud account.
What you need is the ability, in a responsible, secured fashion, to bind a smart device to a person and their cloud account, and the ability to unbind. So now we're back to having an identity-centric architecture for security and privacy. I'll give you a concrete example. Let's say you've got a fleet vehicle in a police department. You assign it to whatever cop is on the beat, and at the end of their shift, you assign the car to another cop. What happens on one shift and what happens on another shift are completely different matters. And it's a smart car, and maybe the cop has a uniform with some sort of camera, a body cam. That's another smart device, and those body cams also get reassigned. So whatever was recorded, in the car, on the body cam, you want the data to go with the cop and their online account only when that cop is using the smart devices they've been assigned, and you want the data for somebody else to go with that somebody else. So in these cases, the binding of identities and the unbinding of identities is critical to the privacy of that police officer.
>> Jeff: Right, right.
>> And to the integrity of the data. So this is why I think of identity-centric security and privacy as being so important. At ForgeRock, we say identity relationship management is so key.
>> And whether you use it or not, it is really kind of after the fact, being able to effectively tie the two together.
>> You have to look at the relationships in order to know whether it's viable to associate the police officer's identity with the car's identity. Did something happen to the car on the shift? Did something happen through the view of the camera on the shift?
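A minimal sketch of the bind/unbind idea Maler describes: device-generated records are attributed to a person only for the interval during which the device was bound to them. The data model and names are invented for illustration:

```python
from datetime import datetime

# (device, officer, start, end) -- end is None while the device is still bound.
bindings = []

def bind(device, officer, when):
    bindings.append({"device": device, "officer": officer,
                     "start": when, "end": None})

def unbind(device, when):
    for b in bindings:
        if b["device"] == device and b["end"] is None:
            b["end"] = when   # close out the current assignment

def owner_of_record(device, recorded_at):
    """Attribute a recording to whoever held the device at that moment."""
    for b in bindings:
        if (b["device"] == device and b["start"] <= recorded_at
                and (b["end"] is None or recorded_at < b["end"])):
            return b["officer"]
    return None  # unbound: data should not be attributed to anyone

bind("bodycam-7", "officer-lee", datetime(2017, 1, 28, 8, 0))
unbind("bodycam-7", datetime(2017, 1, 28, 16, 0))
bind("bodycam-7", "officer-diaz", datetime(2017, 1, 28, 16, 0))

print(owner_of_record("bodycam-7", datetime(2017, 1, 28, 10, 30)))  # officer-lee
print(owner_of_record("bodycam-7", datetime(2017, 1, 28, 17, 0)))   # officer-diaz
```

The same time-boxed grant pattern covers the consumer cases that come up next in the conversation, such as giving an Airbnb renter access to a smart home only for the length of their stay.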
>> Right, right. And all this is underlaid by trust, which has come up in a number of these interviews today. And unfortunately, if you read all the surveys, we're in a tough situation now. The government particularly, these are the harder cases, because businesses can choose to do this or not, and they've got a relationship with the customer. But on the government side, there's really no choice, right? They're there. Right now, I think we're at a low point on the trust factor.
>> Indeed.
>> So how is that... And if you don't trust, then these things are seen as really bad, as opposed to, if you do trust, maybe they're just inconvenient, or not quite worked out all the way. So as this trust changes, with fake news and all this other stuff going on right now, how is that impacting the implementation of these technologies?
>> Well, ask me if I said yes to the terms and conditions. (laughter) Of the sleep app, right. I mean, I said yes, I said yes. And I didn't even ask for the app; my husband signed up for the free trial. It just showed up on my phone, because I was in proximity to the bed, right? I said this one on stage at RSA, so this is not news, I'm not breaking news here. But you know, consumers want the features, they want convenience, they want value. So it's unreasonable, I believe, to simply mount an education campaign and thereby change the world. I do think it's good to have general awareness of what to demand, and that's why I say: no data about me without me. That's what people should be demanding, to be let into the loop, because that gives them more convenience and value.
>> Right.
>> They want share buttons. I mean, we saw that with the initial introduction of CareKit from Apple, because that enabled... Well, people who are involved in User-Managed Access, we call ourselves UMAnitarians. And UMAnitarians like to call the use case Alice-to-Bob sharing.
>> Jeff: Okay.
>> CareKit enabled Alice-to-Dr.-Bob sharing. That's a real use case, and IoT kind of made that use case real. With web and mobile and APIs, I don't think we thought about it so much as a positive use case, although in healthcare it's been a very real thing with EHR. You know, you can go into your EHR system and share your allergy record or something with a spouse; it's there.
>> Right, right, right.
>> But with IoT, it's a really positive thing. I've talked to folks in my day job about sharing access to a connected car with a remote user. We've seen the experiments with letting somebody deliver a package into the trunk of my car but not get access to driving the car. These are real. That's better than saving...
>> I've heard that one, actually.
>> Saving a little money with smart light bulbs is not as good as this: you've got an Airbnb renter, and you want to share limited access to all your stuff while you're away, and then shut down access after they leave. That's an UMA use case, actually. And that's good stuff. I could make money...
>> Jeff: Right.
>> ...off of sharing that way. That's convenience and value.
>> It's only... I just heard the other day that Airbnb is renting a million rooms a night.
>> There you go.
>> So, not insignificant.
>> Once you have a home that's bristling with smart stuff, that's when it really makes sense to have a share button on all of it. It's not just data you're sharing.
>> Well, Eve, we could go on and on and on.
>> Apparently.
>> Are you going to be at RSA in a couple of weeks?
>> Absolutely. I'm actually speaking about consent management.
>> Alright, well, maybe we'll see you there.
>> That would be great.
>> But I want to thank you for stopping by.
>> It's a pleasure.
>> And I really enjoyed the conversation.
>> Me too, thanks.
>> Alright, she's Eve, I'm Jeff, you're watching theCUBE. We'll catch you next time, thanks for watching. (upbeat music)
Michael Kaiser | Data Privacy Day 2017
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for Data Privacy Day, an interesting collection of people coming together here at Twitter to talk about privacy and the implications of privacy. And I can't help but think back to the classic Scott McNealy quote, right: "Privacy is dead, get over it." And that was in 1999. Oh, how the world has changed, most significantly, obviously, with mobile phones, with the release of the iPhone in 2007. So we're excited to really have the spearhead of this event, Michael Kaiser. He's the executive director of the National Cyber Security Alliance, in from Washington, D.C. Michael, great to see you.
>> Thanks for having us in.
>> For the folks that aren't here, what is the agenda today? What's the purpose, the mission? Why are we having this day?
>> Well, Data Privacy Day actually comes to us from Europe, from the EU, which created privacy as a human right back in 1981. We've been doing it here in the United States since around 2008, and NCSA took over the effort in 2011. The goal here really is to help educate people, and businesses as well, about the importance of respecting privacy and of safeguarding people's personal data, and then hopefully, as an end goal, building a lot more trust in the ecosystem around the handling of personal data, which is so vital to the way the internet works right now.
>> Right, and it seems like, obviously, companies figured out the value of this data long before individuals did, and there's a trade for service. You use Google Maps, you use a lot of these services, but is the value exchange necessarily equal? Is it at the right level? And that seems to be the theme of some of these privacy conversations: you're giving up a lot more value than you're getting back in exchange for some of these services.
>> Yeah, and we actually have a very simple way that we talk about that. We like to say that personal information is like money, and that you should value it and protect it. So we're trying to encourage and educate people to understand that their personal information does have value, and that there is an exchange going on. They should make sure that those transactions are ones they're comfortable with, in terms of what information they give and what they get back.
>> Right, which sounds great, Michael, but then, you know, you get the EULA, you sign up for these things, and they don't really give you the option. You can kind of read it, but who reads it? Who goes through? You check the box and you move on. And then you get these announcements: we changed our policy, we changed our policy, we changed our policy. So, I don't know if realistic is the right word, but how do people navigate that? Because, let's face it, my friends told me about Uber, I want to get an Uber. I download Uber, I'm stuck on a rainy corner in D.C., I hit go, and here comes the car. I don't really dig into the meat. Is there an option? I mean, there's not really. I opt in for privacy one, two, three, and I'm opting out of five, six, seven.
>> Yeah, I think we're seeing a little more granular control for people on some of these things now, and I think that's what we'd advocate for more. When we talk to consumers, they tell us mostly that they want better clarity about what's being collected about them, better clarity about how that information is being used, or how it's being shared.
Equally importantly, if there are controls, where are they, and how easy are they to use? Making them more prominent means people can engage in tailoring the services to their own privacy profile. I think we'd like to see more of that, for sure, more companies being a little more forthcoming. Yes, you have the big privacy policy, that long, complicated legal document, but there may be other ways to create interfaces with your customers that make some of the key pieces more apparent.
>> And do you see a trend where, because you mentioned in some of the notes that we prepared that privacy is good for business and potentially a competitive differentiator... Are you starting to see people surface privacy more brightly, so that they can gain the customer, gain the respect and the business of the customer, over a rival that keeps it buried? Is that really a competitive lever, as you see it?
>> Well, I think you see some extremes. You see some companies that say, we don't collect any information about you at all, and I think they're marketing to people who have extreme concerns about this. But I also think we're seeing some places where there is a higher-profile ability to control some of this data, right? Even in places like the mobile setting, where sometimes you'll just get a little warning saying, oh, this is about to use your location, is that okay? Or, your location is turned off, you need to turn it back on in order to use this particular app. And I think those kinds of interfaces with the user of the technology are really important going forward. We don't want people overwhelmed, like every time you turn on your phone you have to answer 17 things in order to do x, y, and z. But making people more aware of how apps are using the information they collect is actually good for business. I think consumers sometimes get confused, because they'll see a whole list of permissions that need to be granted, and they don't understand how those permissions apply to what the app or service is really going to do.
>> Right, right. Yeah, that's an interesting one. We were at Grace Hopper in October, and one of the keynote speakers was talking about how mobile data has really changed this thing, right? Because once you're on your mobile phone, it uses all the capabilities that are native to the phone: geolocation, the accelerometer, et cetera. A lot of people probably didn't know those things were different on the mobile Facebook app than on the desktop Facebook app. And let's face it, most of this stuff is mobile these days, certainly with the younger kids. As you said, that's an interesting tack: why do you need access to my contacts? Why do you need access to my pictures? Why do you need access to my location? And then the piece I'm curious to get your opinion on: will some of the value come back to the consumer, in terms of, I'm not just selling your stuff, I'm not monetizing it via ads, I'm going to give some of that back to you?
>> Yeah, I think there are a couple of things there. One quick point on the first issue: without naming names, I was looking at an app, and it said it had to have access to my phone, and I'm like, why would this app need access to my phone? And then I realized later, well, it needs access to my phone because if the phone rings, it needs to turn itself off so I can answer the call.
But that wasn't apparent, right? And so I think it can be confusing to people. Maybe it's innocuous in some ways, and in some ways it might not be, but in that case it was like, okay, yeah, because if the phone rings, I'd rather answer my phone than be looking at the app.
>> Right. Can I read it, or can I just see it? You know, the degree of the access is very confusing too.
>> Yeah. And in terms of the other issue you're raising, about the value exchange on data, I think the Internet of Things is really going to play a big role in this. In the current world, it's about data delivering ads, those kinds of things, making the experience more customized. But in IoT, where you're talking about wearables or fitness, or thermostats in your home, your data really drives the product. In order for those devices to work well, they have to have data about you. And that's where I think customers will really have to give great thought: is that a good value proposition? I mean, do I want to share data about when I come and go every day just so my thermostat can turn on and off? Those can be conscious decisions about when you're implementing that kind of technology.
>> Right. So there's another interesting tack I'd love to get your opinion on. You know, we see Flo from the Progressive commercials advertising: stick the USB device in your cigarette lighter and we'll give you cheaper rates, because now we know whether you stop at stop signs or not. What's funny to me is that the phone already knows whether you stop at stop signs, and it already knows that you take 18 trips to 7-Eleven on a Saturday afternoon and sit on your couch the balance of the time. As that information gets exposed, it potentially runs into, say, a healthcare-mandated requirement from your company that you must wear a Fitbit, so now they know you're spending too much time at the 7-Eleven and on your couch, and that impacts your health insurance. And that's going to crash right into HIPAA. It just seems like there's this huge collision coming: you can provide better service to people at the good end of the scale in aggregated risk models, but then what happens to the poor people at the other end?
>> Well, I think that's why you have to have opt-in, right? I think you can't make these things mandatory, necessarily. And people have to be extremely aware of when their data is being collected and how it's being used. So, in the example of the car insurance, they really should only be able to access that data about where you're going if you sign up for that, right? And if they want to say, hey Michael, we might give you a better rate if we can track your driving habits for a couple of weeks, then it should be my choice to give that data. Maybe my rates are impacted if I don't, but I can make that choice myself, and I should be allowed to make that choice myself.
>> So it's funny, the opt-in and opt-out. Right now, from your point of view, what do you see in terms of the percentage of opt-in versus opt-out on these privacy issues? Where is it, and where should it be?
>> Well, I would like to see more granular controls for the consumer in general, right? I would like to see...
And as I said a little earlier, a lot more transparency and ease of access to what's being collected about you and how it's being used. You know, outside of the formal legal process, companies obviously have to follow the law. They have to comply. They have to write these long EULAs or privacy policies in order to really reflect what they're doing. But they should be talking to their customers and asking: what's the most important thing you want to know about my service before you sign up for it? And then help people understand that and navigate their way through it. I think in a lot of cases consumers will click, yeah, let's do it, but they should do that knowingly. If you're opting in, it should be done with true consent.
>> Okay, so before I let you go, just share some best practices, tips and tricks, at least at the top level: what should people be thinking about, and what should they be doing?
>> Yeah, so in this kind of space we look at a couple of things. One, personal information is like money: value it and protect it. That really means being thoughtful about what information you share, when you share it, and who you share it with. Two, own your online presence. This is really important; consumers have an active role in how they interact with the internet. Use the settings that are there, the safety, security, and privacy settings in the services you use. And then, a lot of this is behavioral. What you share is really important, so share with care. Be thoughtful about the kinds of information you put out there about yourself, and be thoughtful about the kind of information you put out about your friends and family. Realize that every single one of us in this digital world is entrusted with personal information about other people, much more than we used to be in the past. We have the responsibility to safeguard what other people give to us, and that should be the common goal around the internet.
>> I think we have to have you at the bullying and harassment convention down the road. Great insight, Michael, and I really appreciate it. Have a great day today. I'm sure there's going to be a lot of terrific content that comes out. And for more information, go to the National Cyber Security Alliance. Thanks for stopping by.
>> Thank you for having us.
>> Absolutely. He's Michael Kaiser, I'm Jeff Frick. You're watching theCUBE, thanks for watching.
Lisa Ho | Data Privacy Day 2017
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for the Data Privacy Day event. It's a full-day event with a lot of seminars and presentations, really talking about data privacy, something that's getting increasingly important every day, especially as we know RSA's coming up in a couple of weeks, with a lot of talk about phishing and an increased surface area of attack, et cetera, et cetera. So privacy is really important, and we're excited to have Lisa Ho, Campus Privacy Officer at UC Berkeley. Welcome, Lisa.
>> Thank you, glad to be here.
>> So what does the Campus Privacy Officer do?
>> Well, really anything that has to do with privacy that comes across. So, making sure that we're in compliance, or doing what I can to help the campus keep in compliance with privacy laws. But beyond that, also making sure that we stay aligned with our privacy values. And when I say that, I mean privacy is really important: it's critical for creativity and for intellectual freedom. So at the university, we need to make sure we hold on to those when we're dealing with the new ideas and new scenarios that are going to come up. We have to balance privacy with all the other priorities and obligations we have.
>> Yeah, I don't know if Berkeley gets enough credit, Berkeley and Stanford, as really being two of the big drivers of Silicon Valley. It attracts a lot of smart people. They come, they learn, and then, more importantly, they stay. So you've got a lot of cutting-edge innovation, you've got a ton of open source technology coming out of Berkeley over the years: Spark, et cetera. So you guys are really at the leading edge, but at the same time, you're an old, established academic institution. So what role do you have, formally, as an academic institution of higher education, to help set some of these standards and norms as the world changes around it so very, very quickly?
>> Yeah, well, as I say, the environment needs to be set for creativity and for allowing that intellectual freedom. So when we think about the university, the things that we do there are pretty much what we want to have in the community as a whole, in our culture and environment. Some of the things we think about particularly: first, if you think about school, you think about grades, or the letters of evaluation that you get. Learning, when you come down to it, is a personal endeavor, a development that happens internally. It's a transformation that's internal. And so the feedback you get, the critical evaluation, those need to happen in a space where you have the privacy not to have a reputation to either live up to or live down. Those are things that you keep private, and that's why we've agreed as a society that school information and student data need to stay private. So that's one thing: learning is personal, and that's why the university is so important in this discussion. Secondly, as we talked about, creativity requires time to develop, and it requires freedom to take risks. Whether you're working on a book, or a piece of art, or, if you're a scientist, a formula, an algorithm, a theory, those are things you need time to set aside and be in your own head with, without the eyes of others, until you're ready, without judgment before it's ready for release.
Those are the kinds of things you want to have space for, creativity, so that you can move beyond the status quo and take those risks to get to the next space and beyond.
>> Jeff: Right.
>> And lastly, I'd say, and this is not specific to the university, but it's something we hold particularly strongly at Berkeley, there are the fundamental rights that we have, and privacy is one of those fundamental rights. As Ed Snowden said so famously, saying "I don't care about privacy because I have nothing to hide" is like saying "I don't care about freedom of speech because I have nothing to say." Just because you may not have something to say doesn't mean you can take away the rights of someone else, and you may find that you need those rights at some point in your life, and no one has to justify why they need a fundamental right. So those essentials that come up in our university environment are applicable beyond the learning space of the university, to the kind of society we want to build. That's why the university is in a position to lead in these areas.
>> Right, 'cause Berkeley's got a long history, right, of activism, and this goes back decades and decades. I mean, is privacy starting to get elevated to the level where you're going to see more active, vocal points of view and statements, and I don't want to say marches, but potentially marches, in terms of making sure this is taken care of? Because unfortunately, I think most privacy implementations, at least historically, maybe it's changing, are really opt-out, not opt-in. So do you see this? Is it becoming an important policy area, versus just an execution detail in an application?
>> Yeah, we have a lot of really great professors working on these ideas around privacy and cybersecurity. Those working on security and other things also have privacy in their backgrounds and are advocating in that area as well. As far as marches, we pretty much rely on the students for that, and you can't dictate what the students are going to find important. But there is definitely a cadre of students that care about and are interested in these topics, and when you tie them together with fundamental rights like free speech and academic freedom and creativity, that's where they become important and people get interested.
>> Right. One of the real sticky areas this bounces into is security. Unfortunately, there have been way too many instances at campuses over the last several years of crazy people grabbing a gun and shooting people, which, you know, hopefully won't happen again. And that's really where the privacy and security thing runs up against: should we have known? Should we have seen this person coming? If we had had access to whatever they were doing, maybe we would have known and been able to prevent it. So when you look at, I don't want to say balance, but really the conflict between security and privacy, what are some of the rules coming out? How do you execute that, to both provide a safe environment for people to study and learn and grow, as you mentioned, and at the same time keep an eye out, because unfortunately there are bad characters in the world?
>> Right, yeah. Well, I don't want to say that there's a dichotomy.
I don't want to create a false dichotomy of it's either privacy or it's security; that's not the frame of mind we want to be in. Both are important, and security is clearly important. Preventing unauthorized access to your personal information is clearly a part of privacy, so security is necessary for privacy, and those are things that you would do to protect privacy. Two-factor authentication, antivirus, network segmentation: those are all important parts of protecting privacy as well. So it's not a dichotomy of one or the other. But there are things that you do for security purposes, whether it's cybersecurity or personal security, that may, in a conflict, have a different purpose than what you would do for privacy, and monitoring is one of those areas specifically. When you're monitoring for attacks in particular: we now have continuous monitoring for any kind of attack, or we use that monitoring data as a forensic source to look for information after the fact. That lies in contrast with the privacy idea of least perusal, of not looking for information until you need it, and having that distance, the privacy of not being under surveillance. So here's where we've come to: the University of California has outlined a privacy balancing analysis that's necessary for these kinds of scenarios that are new and untested, where we don't have laws around them, to balance the many priorities and obligations. What you need to do is look at what the security provides, look at the benefits together with the risks, and do that balancing. And so you need to go through a series of questions. What is the utility that you're really getting out of that monitoring, and not just in the normal scenario, the way you're expecting to use it? What about the use cases you maybe didn't expect, but can anticipate it'll be wanted for? What about when you're required to turn it over for a subpoena or another kind of legal demand? What are the use cases there? What are the privacy impacts in those cases? What are the privacy impacts if it's hacked, or of abuse by an employee? What are the privacy impacts of sharing it with partners? So you need to balance the utility together with the impact, and look at those differences. Then also look at the scope: if you change the scope of what you're monitoring, does it change the privacy impact? Does it change the utility? You look at those kinds of factors and keep them all in line, not just looking at the utility of what you're trying to do, but at the other impacts in the privacy analysis. And then, what are the alternatives that could accomplish the same thing, and are they appropriate? Do they give you the same kind of value that the proposed monitoring provides? Staying transparent about, and accountable for, what you're doing really comes down to being the key once you've done that analysis: making sure you've looked through those questions, that you're doing the least amount of perusal necessary to achieve the goals you're trying to accomplish with that monitoring, and then coming back to transparency and accountability for whatever your decisions are, making those available to the community that's being monitored.
>> Wow, well, one, you've got job security, I guess, for life, because that was amazing. Two, as you're talking, balance is the word I was looking for before, so that is the right word. But you're balancing on so many axes, and even once you get through the list of axes you just went through, which is phenomenal, you still need to look at the alternatives, right? And do the same kind of analysis for each. So really, that was a great explanation. So I want to shift gears a little bit and talk about wearables. You're going to give a talk later on today about wearables. Wearables are a whole new kind of interesting factor now that provide a whole bunch more data, really kind of the cutting edge of the internet of things with sensor data. People are things too, we like to say on theCUBE. So as you look at wearables and the impact of wearables on this whole privacy conversation, what are some of the big gotcha issues that are really starting to surface as these things get more popular? >> Yeah, I think a lot of the same kinds of questions, around what kind of monitoring you're doing, what's the utility, what is the privacy impact, and how do you balance those in the various scenarios and use cases that come up; really the same kinds of questions apply to wearables as they do to cybersecurity monitoring. We're finding, I think, that college athletics and the university-sponsored use of wearable technology is really just in its infancy right now. It's not a big thing that we're working on. But it very much parallels the other kinds of questions that we're talking about around learning data. How you jump, or how your body functions, is very private, very intimate. How you think, how you learn, that's right up there on that privacy and intimacy scale. So we're looking very much, and we've been talking quite a bit in the university space, about learning data and how we protect that. Some of the questions are: who owns that data? It's about me; should I, the student, for example, have control over how that information is used? With learning data about the average student, there may not be outside folks that are interested in that information. But when you're talking about student athletes potentially going pro, that's very valuable data that people may want, that people may want to pay for; maybe the student should have some say in the use of that data, in monetizing that data. Who owns it? Is it the student, is it the university, is it the company that we work with to provide that kind of monitoring and the analytics on it? >> Jeff: Right, right. >> Even if we have a contract right now, if it's through the university, we'd hopefully have made really clear who has ownership, where the uses lie, what kinds of things we can do with it. But as we move into kind of a consumer space, where it's just you clicking the box, students may be asked: oh, use this technology, it's free, and we'll be able to handle it; because of course, how much it costs is important in the university space. >> Give you free slices at the pizza store.
>> Right, well, once we get into that consumer realm, where you either don't have to click the box at all or the box is already clicked, that's where students may be giving away data for reasons or for uses that they didn't intend, that they're not getting any compensation for; and in particular cases, when you talk about student athletes, that could be something that would be very meaningful for their career and beyond. >> Yeah, or is it the guy that's come up with the unique and innovative training methodology that they're testing, or is it Berkeley's information, to see how people are learning so you can incorporate that into your lesson plans and the way that you teach 'em? There are so many kinds of angles, but it always comes back, as you said, really to the context. What's the context for the specific application you're trying to use, and should you or should you not have rights in that context? It's a really interesting space, a lot of interesting challenges, and like I said, job security for you for the foreseeable future. >> Yeah, we're not going to run out of new and exciting applications and things to be thinking about in terms of privacy. It's just non-stop. >> Right, 'cause these are not technology questions, right? These are policy questions and rules questions. We heard a thing last night with the center, and one of the topics was that we need a lot more rules around these types of things, because the technology's outpacing the governance rules, and really the thought processes, the ways that these things can all be used. >> It's a culture question, really. It's more than just what you allow or not, but how we feel about it, and the idea that privacy is dead is only true if we don't care about it anymore. So if we care about it and we pay attention to it, then privacy is not dead. >> Alright, well, Lisa, we'll leave it there. Lisa Ho from UC Berkeley, fantastic. Thank you for stopping by, and good luck at your wearables panel later this afternoon. >> Thank you. >> Alright, I'm Jeff Frick. You're watching theCUBE, thanks for watching. (upbeat music)
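(An aside for technically minded readers: the balancing analysis Lisa walks through is essentially a structured checklist. You tally the utility of a proposed monitoring activity across its expected and unexpected use cases, tally the privacy impacts, compare against alternatives, and publish the result. The sketch below is purely illustrative; the field names and scores are invented for this page, not the University of California's actual framework.)

from dataclasses import dataclass, field

@dataclass
class MonitoringProposal:
    # A proposed monitoring activity, scored against the balancing questions above.
    name: str
    utility: dict = field(default_factory=dict)          # benefit per use case, expected and anticipated
    privacy_impacts: dict = field(default_factory=dict)  # impact per scenario: subpoena, hack, insider abuse, sharing
    alternatives: list = field(default_factory=list)     # other MonitoringProposal options to compare

def balance(p: MonitoringProposal) -> dict:
    # Weigh total utility against total privacy impact; flag alternatives that score better.
    utility = sum(p.utility.values())
    impact = sum(p.privacy_impacts.values())
    better = [a.name for a in p.alternatives
              if sum(a.utility.values()) - sum(a.privacy_impacts.values()) > utility - impact]
    return {"utility": utility, "privacy_impact": impact,
            "net": utility - impact, "better_alternatives": better}

# Invented example: continuous network monitoring kept for forensics
proposal = MonitoringProposal(
    name="continuous network monitoring",
    utility={"attack detection": 8, "post-incident forensics": 6},
    privacy_impacts={"subpoena exposure": 5, "hacked logs": 4,
                     "insider abuse": 3, "sharing with partners": 2},
)
print(balance(proposal))

Note that Lisa's closing point, least perusal plus transparency and accountability, sits outside any scoring function: whatever the numbers say, the completed analysis goes back to the community being monitored.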
Andreas S Weigend, PhD | Data Privacy Day 2017
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's world headquarters in downtown San Francisco, and we're really excited to get into it with our next guest, Dr. Andreas Weigend. He is now at the Social Data Lab, used to be at Amazon, and is a recently published author. Welcome. >> Good to be here, morning. >> Absolutely. So give us a little about what the Social Data Lab is, for people who aren't that familiar with it, and what are you doing over at Berkeley? >> Alright, so let's start with what social data is. Social data is the data people create and share, whether they know it or not, and what that means: Twitter is explicit, but it's also your geolocation, or maybe even just photos about you. I was in Russia all day during election day in the United States, with Putin, and I have to say that people now share on Facebook what the KGB wouldn't have gotten out of them under torture. >> So did you ever see the Saturday Night Live sketch where they had a congressional hearing, and the CIA guy says: Facebook is the most successful project that we've ever launched; people tell us where they are, who they're with, and what they're going to do, share pictures, location. It's a pretty interesting sketch. >> Only topped by Black Mirror; some of those episodes are absolutely amazing. >> People can't even watch it. It's one I have not seen, and I have to see it, but they're like, that's just too crazy. Too real, too close to home. >> Yeah. So what was the question? >> So let's talk about your new book. >> Oh, that was social data. >> Yeah, social data. >> Yeah, and so I actually call it the social data revolution. Because if you think back 10, 20 years, we, and 'we' doesn't mean just you and me, it means a billion people, think about who we are differently from 20 years ago; think Facebook, as you mentioned. How we buy things: we buy things based on social data, we buy things based on what other people say, not on what some marketing department says. And even, you know, the way we think about information. I mean, could you do a day without Google? >> No. >> No. >> Could you go an hour without Google? >> An hour, yes, when I sleep. But some people actually Google in their sleep. >> Well, and they have their health tracker turned on while they sleep, to tell them if they slept well. >> I actually find this super interesting: how dependent I am, in the morning when I wake up, on first seeing how I slept, before I can push the smiley face or the okay face or the frowny face. And if the cycles were nice, up and down, then it must have been a good night. >> So it's interesting, because the concept of all these kind of biometric feedback loops is that if you have the data, you can change your behavior based on the data. But on the other hand, there's so much data; do we really change our behavior based on the data? >> I think the question is a different one. The question is: alright, we have all this data, but how can we make sure that this data is used for us, not against us? Within a few hundred meters of here there's a company where employees were asked to wear a Fitbit, or tracking devices more generally. And then one morning, one employee came in after, you know, not having had an exactly solid night of sleep, shall we say, and his boss said: I'm sorry, but I just looked at your Fitbit; you know, this is an important meeting, we can't have you at that meeting. Sorry about that. >> True story? >> Yeah. >> Now that's interesting.
So I think the Fitbit angle is interesting when that becomes a requirement to have company-issued health insurance, and they see you've been sitting on your couch too much. Now how does that run into the HIPAA regulations? >> You know, they have dog walkers here. I'm not sure where you live in San Francisco, but in the area many people have dogs. And I know that a couple of my neighbors, when the dog walker comes to take the dog, they also give their phone to the dog walker, so now it looks like they're taking regular walks, and they're waiting for the discount from their health insurance. >> Yeah, it's interesting. Works great for the person that does walk, or gives their phone to the dog walker. But what about the person that doesn't? What about the person that doesn't stop at stop signs? What happens in a world of business models based on aggregated risk pooling, when you can segment the individual? >> That is a very, very, very biased question. It's a question of fairness. So if we know everything about everybody, what would it mean to be fair? As you said, insurance is built on pooling risk, and that means by nature that there are things that we don't know about people. So maybe we should propose a data lobotomy, so people actually have some part of the data chopped off. So now we can pool again. >> Interesting. >> Of course not; the answer is that we as a society should come up with objective functions: how do we weigh the person, you know, taking a walk? It's easy once you agree on the function: then you get the data, and you rank whatever insurance premium, whatever you're talking about here, accordingly. So I think it's a really important concept, which actually goes back to my time at Amazon, where we came up with what we called fitness functions. And it takes a lot of work; I probably spent 50 hours on that, going through groups and groups and groups, figuring out: what do we want the fitness function to be? You have to have the buy-in of the groups; if they just think, you know, that is some random management thing imposed on us, it's not going to happen. But if they understand that's the output they're managing for, then, not bad. >> So I want to follow up on the Amazon piece, because we're big fans of Jeff Hamilton and Jeff Bezos, who we go to AWS for, and it's interesting, excuse me, James Hamilton, when he talks about the resources that AWS can bring to bear around privacy and security and networking, and all this massive infrastructure being built, in terms of being able to protect privacy once you're in the quote-unquote public cloud, versus people trying to execute that at the individual company level. And you know, RSA is in a couple of weeks, and there's an amount of crazy, scary stuff coming in from people that want interviews around some of this security stuff. When you look at kind of public cloud versus private cloud, and privacy, you know, supported by a big heavy infrastructure like what AWS has, versus a Joe Blow company, you know, trying to implement it themselves, how do you see that challenge? I mean, I don't know how a person can compete with the aggregated resource pool that James Hamilton has to bring to bear on this problem. >> So I think we really need to distinguish two things, which is security versus privacy. So for security, there's no question in my mind that Joe Blow, with his little PC, has not a chance against our Chinese or Russian friends.
There's no question for me that Amazon or Google have way better security teams than anybody else can afford, because it is really their bread and butter. And if there's a breach at that level, then I think it is terrible for them; just think about the Sony breach, on a much smaller scale. That's a very different point from the point of privacy, and from the point about companies deliberately giving data about you to other companies for targeting purposes, for instance. So I think for the cloud, there I trust, I trust Google, I trust Amazon, that they are doing hopefully a better job than the Russian hackers. I am more interested in the discussion on the value of data over the privacy discussion; after all, this is the world privacy day, and there the question is: what do people understand as the trade-off they have, what they give in order to get something? People have talked about Google having this impossibly irresistible value proposition, that for all of those little bits of data, you get something. For instance, I took Google Maps to get here; of course Google needs to know where I am to tell me to turn left at the intersection, and of course Google has to know where I want to be going. And Google knows that a bunch of other people are going there today, and it can probably figure out that something interesting is happening here. >> Right. >> And so those are the interesting questions for me. What do we do with data? What is the value of data? >> But, A, I don't really think people understand the amount of data that they're giving over, and B, I really don't think that they understand, I mean, now maybe they're starting to understand the value, because of the value of companies like Google and Facebook that have the data. But do you see a shift in, A, the awareness? And I think it's even worse with younger kids who have just lived on their mobile phones since the day they were conscious, practically. Or will there be a value to-- >> Or were they even mobile before they were born? Children now come pre-loaded, because the parents take pictures of their children before they are born. >> That's true. And you're right, the sonogram, et cetera. But then how has mobile changed this whole conversation? Because when I was on Facebook on my PC at home, that's a very different set of information than when it's connected to all the sensors in my mobile phone. When Facebook is on my mobile phone, it really changes things: where I am, how fast I'm moving, who I'm in proximity to. It completely changed the privacy game. >> Yes, so geolocation; the ACLU chapter here in Northern California has a very good quote on that: "Geolocation is a really extremely powerful variable." Now what was the question? >> How has this whole privacy thing changed with the proliferation of mobile? And the other thing I would say: when you have kids that grew up with mobile and sharing, and the young ones don't use Facebook anymore, it's Instagram, Snapchat, just the notion of sharing and privacy, relative to folks that, you know, wouldn't even give their credit card over the telephone not that long ago, much less type it into a keyboard. Do they really know the value, do they really understand the value, do they really get the implications, when that's the world in which they've lived? Most of them, you know, are just starting to enter the workforce and haven't really felt the implications of that.
So for the side of the individual, if I have data about the restaurant, and that makes me decide whether to go there or to not go there. That is having an impact on my decision thus the data is valuable. For a company a decision whether to show me this offer or that offer that is how data is valued from the company. So that kind of should be quantified The value of the picture of my dog when I was a child. That is you know so valuable, I'm not talking about this. I'm very sort of rational here in terms of value of data as the impact is has on decisions. >> Do you see companies giving back more of that value to the providers of that data? Instead of you know just simple access to useful applications but obviously the value exceeds the value of the application they're giving you. >> So you use the term giving back and before you talked about kids giving up data. So I don't think that it is quite the right metaphor. So I know that metaphor come from the physical world. That sometimes has been data is in your oil and that indeed is a good metaphor when it comes to it needs to be refined to have value. But there are other elements where data is very different from oil and that is that I don't really give up data when I share and the company doesn't really give something back to me but it is much interesting exchange like a refinery that I put things in and now I get something not necessarily back I typically get something which is very different from what I gave because it has been combined with the data of a billion other people. And that is where the value lies, that my data gets combined with other peoples data in some cases it's impossible to actually take it out it's like a drop of ink, a drop in the ocean and it spreads out and you cannot say, oh I want my ink back. No, it's too late for that. But it's now spread out and that is a metaphor I think I have for data. So people say, you know I want to be in control of my data. I often think they don't have deep enough thought of what they mean by that. I want to change the conversation of people saying You what can I get by giving you the data? How can you help me make better decisions? How can I be empowered by the data which you are grabbing or which you are listening to that I produce. That is a conversation which I want to ask here at the Privacy Day. >> And that's happening with like Google Maps obviously you're exchanging the information, you're walking down the street, you're headed here they're telling you that there's a Starbucks on the corner if you want to pick up a coffee on the way. So that is already kind of happening right and that's why obviously Google has been so successful. Because they're giving you enough and you're giving them more and you get in this kind of virtuous cycle in terms of the information flow but clearly they're getting a lot more value than you are in terms of their you know based on their market capitalization you know, it's a very valuable thing in the aggregation. So it's almost like a one plus one makes three >> Yes. >> On their side. >> Yes, but it's a one trick pony ultimately. All of the money we make is rats. >> Right, right that's true. But in-- >> It's a good one to point out-- >> But then it begs the question too when we no longer ask but are just delivered that information. >> Yes, I have a friend Gam Dias and he runs a company called First Retail, and he makes the point that there will be no search anymore in a couple of years from now. What are you talking about? 
I search every day! But, yes: you will get the things before you even think about them, and with Google Now a few years ago, and other things, I think he is quite right. >> We're starting to see that, right, where the cards come to you with a guess as to-- >> And it's not so complicated. Let's say you go to the symphony; you know, my phone knows that I'm at the symphony even if I turn it off, it knows where I turned it off. And it knows when the symphony ends, because there are like a thousand other people there, so why not get Ubers and Lyfts closer, and amaze people: wow, your car is there already. That is always a joke we have in Germany. In Germany we have a joke that says: hey, go for vacation in Poland, your car is there already. But maybe I shouldn't tell those jokes. >> Let's talk about your book. So you've got a new book that came out. >> Yeah. >> Just recently released, it's called Data for the People. What's in it, what should people expect, what motivated you to write the book? >> Well, I'm actually excited: yesterday I got my first free copies, not from the publisher and not from Amazon, because they're going by the embargo, which lifts next week, but from Barnes and Noble-- >> They broke the embargo. Barnes and Noble, breaking news. >> It's three years of work, and basically it is about trying to get people to embrace the data they create, and to be empowered by the data they create. Lots of stories from companies I've worked with; lots of stories also from China. I have a house in China, I've spent a month or two months there every year for the last 15 years, and the Chinese ecosystem is quite different from the US ecosystem; and you of course know that the EU regulations are quite different from the US regulations. So I wrote about what I think is interesting, and I'm looking forward to actually rereading it, because they told me I should reread it before I talk about it. >> Because when did you submit it? You probably submitted it-- >> Half a year. >> Half a year ago, so yeah. So it's available at Barnes and Noble, and now Amazon. >> It is available. I mean, if you order it now, you'll get it by Monday. >> Alright, well, Dr. Andreas Weigend, thanks for taking a few minutes. We could go on forever and ever, but I think we've got to let you go back to the rest of the sessions. >> Thank you for having me. >> Alright, a pleasure. Jeff Frick, you're watching theCUBE, see you next time.
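(A quick illustration of the "fitness function" idea Andreas describes: the group, or society, agrees on an objective function first, and only then is anyone's data scored against it, for example to rank an insurance premium. The weights and features below are invented for this sketch; they are not anything Amazon or any insurer actually uses.)

# Agreeing on the function is the hard, 50-hour part; applying it is just arithmetic.
WEIGHTS = {
    "daily_walks": -0.3,     # more walking lowers the premium score
    "sleep_hours": -0.1,
    "speeding_events": 2.0,  # riskier behavior raises it
}
BASE_PREMIUM = 100.0

def premium(features: dict) -> float:
    # Score one person's (consented) data against the agreed fitness function.
    adjustment = sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return max(0.0, BASE_PREMIUM + adjustment)

print(premium({"daily_walks": 14, "sleep_hours": 49}))                       # 90.9, the dog walker's client
print(premium({"daily_walks": 0, "sleep_hours": 35, "speeding_events": 4}))  # 104.5

The fairness debate in the interview is entirely about the weights, who agrees on them and what they may include, not about the arithmetic.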
Jules Polonetsky, Future of Privacy Forum | Data Privacy Day 2017
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at Twitter's world headquarters at the Data Privacy Day event, a full day of sessions and breakout sessions really talking about privacy. Although privacy was declared dead in 1999, "get over it," that's not really true, and certainly a lot of people here beg to differ. We're excited to have our next guest, Jules Polonetsky, CEO of the Future of Privacy Forum. Welcome. >> Thank you, great to be here. Exciting times for data, exciting times for privacy. >> Yeah, no shortage of opportunity, that's for sure. The job security in the privacy space is pretty high, I'm gathering, after a few of these interviews. >> There's always a researcher coming up with some new way we can use data that is both exciting, curing diseases, studying genes, but also sometimes Orwellian. Microphones are in my home, self-driving cars, and so getting that right is hard. We don't have clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or not getting into our stuff because we don't trust them. >> Right. [Jules] - So, challenging times. [Jeff] - So, before we jump into it, the Future of Privacy Forum, kind of a little bit about the organization, kind of your mission... [Jules] - We're eight years old at the Future of Privacy Forum. We're a think tank in Washington, D.C. Many of our members are the chief privacy officers of companies around the world, so about 130 companies, ranging across many of the big tech companies. And as new sectors start becoming tech and data, they join us. So the auto industry's dealing with self-driving cars, connected cars, all those issues; wearables; student data. So about 130 of those companies. But the other half of our group are advocates and academics who are a little bit skeptical or worried. They want to engage, but they are worried about an Orwellian future. So we bring those folks together and we say: listen, how can we have data that will make cars safer? How can we have wearables that'll help improve fitness? But also have reasonable, responsible rules in place, so that we don't end up with discrimination, or data breaches, and all the problems that can come along? [Jeff] - Right, 'cause it's really two sides of the same coin, and it's always two sides of the same coin. And typically with new technology, we kind of race ahead on the positive, 'cause everybody's really excited, and lag on what the negative impacts are, and on the creation of rules and regulations, because with new technology it's very hard to keep up. [Jules] - You know, the stakes are high. Think about adtech, right? We've got tons of adtech. It's fueling free content, but we've got problems of adware, and spyware, and fake news, and people being nervous about cookies and tracking. And every year it seems to get more stressful and more complicated. We can't have that when it comes to microphones in my home. I don't want to be nervous that if I go into the bedroom, suddenly that's shared across the adtech ecosystem. Right? I don't know that we want how much we sweat, or when it's somebody's time of the month, or other data like that, being out there and available to data brokers. But we did a study recently of some of the wearables, the more sensitive ones: sleep trackers, apps that people use to track their periods. Many of them didn't even have a privacy policy, to say, I don't do this, or I don't do that with your data. So, stakes are high.
This isn't just about, you know, are ads tracking me, and do I find that intrusive? This is about: if I'm driving my car, and it's helping me navigate better, and it's giving me directions, and it's making sure I don't drift out of my lane, or it's self-parking, that that data doesn't automatically go to all sorts of places where it might be used to deny me benefits, or discriminate, or raise my insurance rates. [Jeff] - Right, right. Well, there's so many angles on this. One is, you know, since I got an Alexa Dot for Christmas, for the family, to try it out, and you know, it's interesting to think that she's listening all the time. [Jules] - So she's not. >> And you push the little button, you know. Or is she not? [Jules] - Let's talk about this; this is a great topic to talk about, because a sheriff recently wanted to investigate a crime and realized that they had an Amazon Echo in the home. And said: well, maybe Amazon will have data about what happened. >> Right. >> Maybe there'll be clues, people shouting, you know. And Amazon's fighting, because they don't want to hand it over. But what Amazon did, and what Google Home did, and the Xbox did: they don't want to have that data. And so they've designed these things, I think, with actually a lot of care. So the Echo is listening for its name. It's listening for Alexa... >> Right. >> And it keeps deleting. It listens, right, it hears background noise, and if it didn't hear Alexa, it drops it, drops it, drops it. Nothing is sent out of your home. When you say, Alexa, what's the weather? The blue light glows, it opens up the connection to Amazon, and now it's just like you're typing in a search or going directly. >> Right, right. [Jules] - And so that's done quite carefully. Google Home works like that, Siri works like that. So I think the big tech companies, despite a lot of pain and suffering over the years of being criticized, and with the realization that government goes to them for data, they don't want that. They don't want to be fighting the government, and people being nervous that the IRS is going to try to find out information about what you're doing, which bedroom you're in, and what time you came home. >> Although the Fitbit has all that information. >> Exactly. >> Even though Alexa doesn't. [Jules] - So the wearables are another exciting, interesting challenge. We had a project that was funded both by the Robert Wood Johnson Foundation, which wants wearables to be used for health and so forth, and by a lot of major tech companies, because everybody was aware that we needed some sort of rules in place. So if Fitbit, or Jawbone, or one of the other wearables can detect that maybe I'm coming down with Parkinson's, or I'm about to fall, or other data, what's their responsibility to do something with that? On one hand, that would be a bit frightening. Right, you got a phone call or an email saying, hey, this is your friendly friends at your wearable, and we think >> showing up at your front door >> you should seek medical, you know, help. You would be like, whoa, wait a second, right? On the other hand, what do you do with the fact that maybe we can help you? Take student data, alright. Edtech is very exciting; there are such opportunities for personalized learning, and colleges are getting in on the act. They're trying to do big data analytics to understand how to make sure you graduate.
Well, what happens when a guidance counselor sits down and says: look, based on the data we have, your grades, your family situation, whether you've been to the gym, your cafeteria usage, data we took off your social media profile, you're really never going to make it in physics. I mean, the data says people with your particular attributes never, never... rarely succeed in graduating with a degree in four years. You need to change your scholarship, you need to change your career path. Or, you can do what you want, but we're not going to give you that scholarship. Or simply: we advise you. Now, what did we just tell Einstein? Maybe not to take physics, right? But on the other hand, don't I have some responsibility, if I'm a guidance counselor who would be looking at your records today, sort of shuffling some papers and saying: well, maybe you want to consider something else? So we talk about this as privacy, but increasingly, many of my members, again, who are chief privacy officers at these companies, are facing what are really ethical issues. There may be risks, there may be benefits, and they need to help decide, or help their companies decide: when does the benefit outweigh the risk? Consider self-driving cars, right? When does the self-driving car say, I'm going to put this car in the ditch because I don't want to run somebody over? But now it knows that your kids are in the backseat; what sort of calculations do we want this machine making? Do we know the answers ourselves? If the microphone in my home hears child abuse, if Hello Barbie hears a child screaming, or, hey, I swallowed poison, or, my dad touched me inappropriately, what should it do? Do we want dolls ratting out parents? And the police showing up saying, Barbie says your child's being abused? I mean, my gosh, I can see times when my kids thought I was a big Grinch, and if the doll was reporting, hey, dad is being mean to me, you know, who knows. So these are challenges that we're going to have to figure out, collectively, with stakeholders, advocates, civil libertarians, and companies. And if we can chart a path forward that lets us use these new technologies in ways that advance society, I think we'll succeed. If we don't think about it, we'll wake up and learn that we've really constrained ourselves and narrowed our lives in ways that we may not be very happy with. [Jeff] - Fascinating topic. And like on the child abuse thing, you know, there are very strict rules for people that are involved in occupations dealing with children, whether it's a doctor, or a teacher, or even a school administrator: if they have some evidence of, say, child abuse, they're obligated >> they're obligated. [Jeff] - Not only are they obligated morally, but they're obligated professionally, and legally, right, to report that in. I mean, do you see those laws just getting translated onto the machine? Clearly, God, you could even argue that the machine probably has better data and evidence, based on time and frequency, than the teacher who happens to see maybe a bruise, or a kid acting a little bit different on the schoolyard. [Jules] - You can see a number of areas where law is going to have to rethink how it fits. Today, I get into an accident, we want to know whose fault it is. What happens when my self-driving car gets into an accident? Right? I didn't do it, the car did it. So, do the manufacturers take responsibility?
If I have automated systems in my home, robots and so forth, again, am I responsible for what goes wrong? Or do these things, or their companies, have some sort of responsibility? So thinking these things through is where I think we are first. I don't think we're ready for legal changes; I think what we're ready for is an attitude change. And I think that's happened. When I was the chief privacy officer at AOL, many years ago, we were so proud of our cooperation with the government. If somebody was kidnapped, we were going to help. If somebody was involved in a terrorism thing, we were going to help. And companies, I think, still recognize their responsibility to cooperate with, you know, criminal investigations. But they also recognize that it is their responsibility to push back when government says: give me data about that person. Well, do you have a warrant? Do you have a basis? Can we tell them so they can object? Right? Is it encrypted? Well, sorry, we can't risk all of our users by cracking encryption for you because you're following up on one particular crime. So there's been a big sea change in understanding that if you're a company, there's data you don't want to have to hand over. Data about immigrants, today: lots of companies, in the Valley and around the country, are thinking, wait a second, could I be forced to hand over some data that could lead to someone being deported? Or tortured? Or who knows what? Given that these things seem to be back on the table. And, you know, again, years ago, you were a good actor, you participated in law enforcement; and now people participate, but they also recognize that they have a strong obligation to either not have the data, like Amazon, which will not have the data that this sheriff wants. Now, their smart meter and how much water they're using, and all kinds of other information, frankly, about their activity at home, since many other things about our homes are now smarter, may indeed be available. How much water did you use at this particular time? Maybe you were washing blood stains away. That sort of information is >> Wild. [Jules] - going to be out there. So the machines will be providing clues that in some cases are going to incriminate us. And companies that don't want to be in the middle need to think about designing for privacy, so as to avoid creating a world where, you know, all our data is available to be used against us. [Jeff] - Right, and then there's the whole factor of the devices that are in place, not necessarily the company using them or not, but, you know, bad actors taking advantage of cameras, microphones, all over, and hacking into these devices to do things. And it's one thing to take a look at me while I'm on my PC; it's another thing to take control of my car. Right? And this is where, you know, there's some really interesting challenges ahead, as IoT continues to grow. Everything becomes connected. The security people always like to say, you know, the attack surface area, it grows exponentially. [Jules] - Yeah. Well, cars are going to be an exciting opportunity. We have released today a guide that the National Auto Dealers Association is providing to auto dealers around the country. Because when you buy a car today, and you sell it or you lend it, there's information about you in that vehicle.
Your location history, maybe your contacts, your music history. We would never give our phone away without clearing it, and you wouldn't give your computer away, but you don't think about your car as a computer. And so this has all kinds of advice for people: listen, your car is a computer. There are things you want to do to take advantage of >> Right. [Jules] - new services, safety. But there are also things you want to do to manage your privacy: delete, make sure you're not sharing your information in ways you don't want. [Jeff] - Jules, we could go on all day, but I think I've got to let you go get back to the sessions. So, thanks for taking a few minutes out of your busy day. [Jules] - Really good to be with you. [Jeff] - Absolutely. Jeff Frick, you're watching theCUBE. See you next time. (closing music)
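(For readers curious how the "listening but not sending" design Jules describes for the Echo and Google Home might look, here is a conceptual sketch: audio is buffered and discarded on the device until the wake word is heard, and only then does anything leave the home. This is an illustration only, not Amazon's or Google's actual code; the frames, wake-word detector, and cloud call are all stand-ins.)

from collections import deque

BUFFER_FRAMES = 50  # only a short rolling window of audio ever exists on the device

def run_device(frames, is_wake_word, send_to_cloud):
    # Local loop: frames are buffered and discarded until the wake word is heard.
    buffer = deque(maxlen=BUFFER_FRAMES)  # old frames fall off: "drops it, drops it, drops it"
    awake = False
    for frame in frames:  # continuous listening
        if not awake:
            buffer.append(frame)         # stays local; nothing is sent out of the home
            awake = is_wake_word(frame)  # on-device model listens only for its name
        else:
            send_to_cloud(frame)         # blue light on: the request streams to the cloud
            # (a real device would go back to sleep once the utterance ends)

# Toy demo with stand-in components
audio = ["noise", "noise", "alexa", "what's", "the", "weather"]
run_device(audio,
           is_wake_word=lambda f: f == "alexa",
           send_to_cloud=lambda f: print("sent:", f))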
Michelle Dennedy, Cisco | Data Privacy Day 2017
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's World Headquarters in downtown San Francisco. Full-day event, a lot of seminars and sessions talking about the issue of privacy. Even though Scott McNealy in 1999 said, "Privacy's dead, get over it," everyone here would beg to differ; and it's a really important topic. We're excited to have Michelle Dennedy. She's the Chief Privacy Officer from Cisco. Welcome, Michelle. >> Indeed, thank you. And when Scott said that, I was his Chief Privacy Officer. >> Oh, you were? >> I'm well acquainted with my young friend Scott's feelings on the subject. >> It's pretty interesting, 'cause that was eight years before the iPhone, so a completely different world. And actually, one of the prior guests and I were talking about privacy being an issue in the Harvard Business Review 125 years ago. So this is not new. >> Absolutely. >> So how have things changed? I mean, that's a great perspective, that you were there. What was he kind of thinking about, and really, what are the privacy challenges now compared to 1999? >> So different. Such a different world. I mean, it's fascinating that when that statement was made, the discussion was a press conference where we were introducing connectivity. It was an offshoot of Java, and it basically allowed you to send a wireless message from your personal computer to your printer, so that a document could come out (gasp). >> That's what it was? >> Yeah. >> Wireless printing? >> Wireless printing. And really it was Jini technology, so anything could wirelessly start talking to anything else in an internet of things world. >> Right. >> So, good news, bad news. The world has exploded from there, obviously; but the base premise is: can I be mobile, can I live in a world of connectivity, and still have control over my story, who I am, where I am, what I'm doing? And it was really a reframing moment. When you say privacy is dead, if what you mean by that is secrecy and hiding away and not being connected to the world around you, I may agree with you. However, privacy as a functional definition of how we define ourselves, how we live in a culture, what we can expect in terms of morality, ethics, respect, and security: alive and well, baby. Alive and well. >> (laughs) No shortage of opportunity to keep you busy. We talk to a lot of people, and we go to a lot of tech conferences. I have to say I don't know that we've ever talked to a Chief Privacy Officer. >> You're missing out. >> I know, so now you get to define the role, I love it. So what are your priorities as Chief Privacy Officer? What are you keeping an eye on day to day, and what are your more strategic objectives? >> It's a great question. So, the rise of the Chief Privacy Officer: actually, Scott was a big help in that, and gave me exactly the right amount of rope to hang myself with. The way I look at it, probably the simplest analogy is: should you have a Chief Financial Officer? >> Yeah. >> I would guess yeah, right? That didn't exist about 100 years ago. We just kind of loped along, and whoever had the biggest bag of money at the end was deemed to be successful. Whereas somebody else who had no money left at the end but had bought another store, you would have no way of measuring that. So the Chief Privacy Officer is that person for your digital currency. I look at the pros and the cons, the profit and the loss, of data and the data footprint for our company and for all the people to whom we sell.
We think about: what are those control mechanisms for data? So think of me as your data financial officer. >> Right, right. But the data in and of itself is just stagnant, right? It's really the data in the context of all these other applications: how it's used, where it's used, when it's used, what it's combined with, that really starts to trip into areas of value as well as potential problems. >> I feel like we scripted this before, but we didn't. >> Jeff: We did not script it, we don't script the-- >> So if I took a rectangle out of my wallet, and it had a number on it, and it was green, what would you say that thing probably is? >> Probably Andrew Jackson on the front. >> Yeah, probably Andrew Jackson. What is that? >> A 20 dollar bill. >> Why is that a 20 dollar bill? >> Because we agree that you're going to give it to me and it has that much value, and thankfully the guy at Starbucks will give me 20 bucks worth of coffee for it. >> (laughs) Exactly. Which could be one cup the way we're going. >> Which could be a cup. >> But that's exactly right. So is that 20 dollar bill stagnant? Yes. That 20 dollar bill just sitting on the table between us is nothing. I could burn it up, I could put it in my pocket and lose it and never see it again. I could flush it down the toilet. That's how we used to treat our data. If you recognize instead the story that we share about that piece of currency: we happen to be in a place where it's really easy to alienate that currency. I could go downstairs here and spend it. If I was in Beijing, I probably would have to go and convert it into a different currency, and we'd tell a story about that conversion, because our standards interface is different. Data is exactly the same way. The story that we share together today is a valuable story, because we're communicating out, we're here for a purpose. >> Right. >> We're making friends. I'm liking you because you're asking me all these great questions that I would have fed you, had I been able to feed you questions. >> Jeff: (laughs) >> But it's only that context, it's only that communicability, that brings it value. We now assume as a populace that paper currency is valuable. It's just paper. It's only as good as the story that enlivens it. So now we're looking at smaller and smaller micro-data transactions: how am I tweeting out information to people who follow me? How do I share that with your following public, and does that give me a greater opportunity to educate people about security and privacy? Does that allow my company to sell more of my goods and services, because we're building ethics and privacy into the fabric of our networks? I would say that's as valuable or more valuable than that Andrew Jackson. >> So it's interesting, 'cause you talk about building privacy into the products. We often hear about building security into the products, right? Because the old way of security, of building a bigger wall, doesn't work anymore, and you really have to bake it in at all steps of the application: development, the data layer, the database, et cetera, et cetera. When you look at privacy versus security, and especially 'cause Cisco's sitting on, I mean, you guys are sitting on the pipes, everything is running through your machines: how do you separate the two, how do you prioritize, and how do you make sure the privacy discussion gets the right amount of relevance within the context of the security conversation?
>> It's a glib answer that's much more complicated, but security is really, in many instances, the what. I can secure almost any batch of data; it can be complete gobbledygook, zeroes and ones, it could be something really critical, it could be my medical records. The privacy, the data about what that context is, that's the why. I don't see them as one or the other at all. I see security not as a technology, but as a series of verbs, things that you actually do: people, process, technologies. That enactment should be addressed to a why. So it's kind of Peter Drucker's management idea: you manage what you measure. That was like incendiary advice when it first came out. Well, I want to say that you secure what you treasure. So if you treasure a digital interaction with your employees, your customers, and your community, you should probably secure that. >> Right. But it seems like there's a little bit of a disconnect about maybe what should be treasured and what the value is with folks that have grown up with this. Let's pick on the young kids: not really thinking through, or having the time, or knowing the impact of a negative event, in terms of just clicking and accepting the EULA and using that application on their phone. They just look at it in a different way. Is that valid? How do they change that behavior? How do you look at this new generation, when there's this sea of data which is far larger than it used to be, coming off all these devices, internet of things, obviously. People are things too. The mobile devices with all that geolocation data, and the sensor data, and then, oh by the way, it's all going to be in our cars and everything else shortly. How's that landscape changing and challenging you in new ways, and what are you doing about it? >> The speed and dynamics are astronomical. How do you count the stars, right? >> Jeff: (laughs) >> And should you? Isn't that kind of a waste of time? >> Jeff: Right, right. >> It used to be that knowledge, when I was a kid, was knowing what was in A to Z of the Encyclopedia Britannica. Now facts are cheap. Facts used to be expensive: you had to take time and commit to them, and physically find them, and be smart enough to read, and on, and on, and on. The dumbest kid today is smarter than I was with my Encyclopedia Britannica, because we have search engines. Now their commodity is: how do I think critically? How do I make my brand and make my way? How do I ride and surf on a wave of untold quantities of information to create a quality brand for myself? So the young people are actually in a much better position than, well, I'll still count us as young. >> Jeff: Yeah, uh huh. >> But maybe less young. >> Less young, less young than we were yesterday. >> We are digital natives, but I am hugely optimistic that the kids coming up are really starting to understand the power of brand: personal brand, family brand, cultural brand. And they're feeling very activist about the whole thing. >> Yeah, which is interesting, 'cause that was never a factor when there was no personal brand, right? You were part of-- >> No way. >> whatever entity that you were in. >> Well, you were in a clique. >> Right. >> Right? You identified as... when I was home, I was the third of four kids. I was a Roman Catholic girl in the Midwest. I was a total dork with a bowl haircut. Now kids can curate who and what and how they are over the network. Young professionals can connect with people with experience. Or they can decide... I get this all the time on Twitter, actually.
How did you become a Chief Privacy Officer? I'm really interested in taking a pivot in my career. And I love talking to those people 'cause they always educate me, and I hope that I give them a little bit of value too. >> Right, right. Michelle, we could go on and on and on. But, unfortunately, I think you got to go cover a session. So we're going to let you go. >> Thank you. >> Michelle Dennedy, thanks for taking a few minutes of your time. >> Thank you, and don't miss another Data Privacy Day. >> I will not. We'll be back next year as well. I'm Jeff Frick. You're watching theCUBE. See you next time.
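Dennedy's split between security as the what and privacy as the why maps neatly onto how a data inventory can be modeled in code. Here is a minimal sketch of that idea in Python; the record layout, field names, and example values are all invented for illustration and don't reflect any actual Cisco tooling:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One entry in a data inventory: the 'what' and the 'why'."""
    name: str
    controls: list = field(default_factory=list)  # the "what": encryption, access control, ...
    purposes: list = field(default_factory=list)  # the "why": what this data exists to do

def may_process(asset: DataAsset, purpose: str) -> bool:
    # "You secure what you treasure": processing without a declared
    # purpose fails the "why" test, no matter how well it's locked down.
    return purpose in asset.purposes

# Medical records are treasured, so they carry both controls and purposes.
records = DataAsset(
    name="medical_records",
    controls=["encrypted_at_rest", "role_based_access"],
    purposes=["treatment", "billing"],
)
print(may_process(records, "treatment"))  # True
print(may_process(records, "marketing"))  # False: no "why", no processing
```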
Eva Casey Velasquez | Data Privacy Day 2017
(soft click) >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco, at Twitter's World Headquarters. It's a beautiful building. Find a reason to get up here and check it out. But they have Data Privacy Day here today. It's an all day seminar session, series of conversations about data privacy. And even though Scott McNealy said, "Data privacy is dead, get over it," everyone here would beg to differ. So we're excited to have our next guest Eva Velasquez. She's the President and CEO of ITRC, welcome. >> Thank you, thank you for having me and for covering this important topic. >> Absolutely, so what is ITRC? >> We are the Identity Theft Resource Center. And the name is exactly what it is. We're a resource for the public when they have identity theft or fraud, privacy data breach issues, and need help. >> So this begs an interesting question. How do people usually find out that their identity has been compromised? And what is usually the first step they do take? And maybe what's the first step they should take? >> Well, it's interesting because there isn't one universal pathway that people discover it. It's usually a roadblock. So, they're trying to move forward in their lives in some manner. Maybe trying to rent an apartment, get a new job, buy a car or a house. And during that process they find out that there's something amiss. Either in a background check or a credit report. And at that point it creates a sense of urgency because they must resolve this issue. And prove to whoever they're trying to deal with that actually wasn't me, somebody used my identity. And that's how they find out, generally speaking. >> So, you didn't ask for their credit scores. It's something where they had no idea, and this is how they find out. What usually triggers it? >> Right, right, or a background check. You know, appearing in a database. It's just, when we think about how pervasive our identity is out there in the world now. And how it's being used by a wide swath of different companies. To do these kind of background checks and see who we are. That's where that damage comes in. >> Talking about security and security breaches at a lot of shows, you know. It's many hundreds of days usually before companies know that they've been breached. Or a particular breach, I think now we just assume they're breached all the time. And hopefully they'd minimize damage. But in identity theft, what do you find is kind of the average duration between the time something was compromised before somebody actually figures it out? Is there kind of an industry mean? >> It's really wildly inconsistent from what we see. Because sometimes if there is an issue. Let's say that a wallet is stolen and they're on high alert, they can often discover it within a week or 10 days. Because they are looking for those things. But sometimes if it's a data breach that they were unaware of or have no idea how their information was compromised. And especially in the case of child identity theft, it can go on for years and years before they find out that something's amiss. >> Child identity theft? >> Mhmm. >> And what's going on with that? I've never heard of child identity theft. They usually don't have credit cards. What's kind of the story on child identity theft? Which is their PayPal account or their Snapchat account (laughs). >> Well, you're right, children don't have a credit file or a credit history. But they do have a social security number.
And that is being issued within the first year of their life because their parents need to use it on their tax returns and other government documents. Well, because the Social Security Administration and the credit reporting agencies don't interface, if a thief gets ahold of that social security number, that first record that's created is what the credit bureaus will use. So they don't even need a legitimate name or date of birth. Obviously, the legitimate date of birth isn't going to go through those filters because it is for someone who's under 18. So, kid goes all through life, maybe all through school. And as they get out and start doing things like applying for student loans. Which is one of the really common ways we see it in our call center. Then they come to find out, I have this whole credit history. And guess what? It's a terrible credit history. And they have to clean that up before they can even begin to launch into adulthood. >> (chuckles) Okay, so, when people find out, what should they do? What's the right thing to do? I just get rejected on a credit application. Some weird thing gets flagged. What should people do first? >> There's a couple things and the first one is don't panic. Because we do have resources out there to help folks. One of them is the Identity Theft Resource Center. All of our services are completely free to the public. We're a charity, non-profit, funded by grants, donations, and sponsorships. They should also look into what they might have in their back pocket already. There are a lot of insurance policy riders for things like your homeowners insurance, sometimes even your renters insurance. So, you might already have a benefit that you pay for in another way. There are a lot of plans within employee benefit packages. So, if you work for a company that has a reasonably robust package, you might have that help there as well. And then the other thing is if you really feel like you're overwhelmed and you don't have the time, you can always look into hiring a service provider, and that's a legitimate thing to do as long as you know who you're doing business with. And realize you're going to be paying for that convenience. But there are plenty of free resources out there. And then the last one is the Federal Trade Commission. They have some wonderful remediation plans online. That you can just plug in right there. >> And which is a great segue, 'cause you're doing a panel later today, you mentioned, with the FTC. Around data privacy and identity theft. You know, what role does the federal government have? And what does cleaning up my identity theft involve? What actually happens? >> Well, the federal government is one of the many stakeholders in this process. And we really believe that everybody has to be involved. So, that includes our government, that includes industry, and the individual consumers or victims themselves. So, on the government end, things like frameworks for how we need to treat data, have resources available to folks, build an understanding in a culture in our country that really understands the convenience versus security conundrum. Of course industry needs to protect and safeguard that data. And be good stewards of it, when people give it to them. And then individual consumers really need to pay attention and understand what choice they're making. It's their choice to make but it should be an educated one. >> Right, right. And it just, the whole social security card thing, is just, I find fascinating.
It's always referenced as kind of the anchor data point of your identity. At the same time, you know, it's a paper card that comes after you're born. And people ask for the paper card. I mean, I got a chip on my ATM card. It just seems so archaic, the amount of times it's asked in kind of common everyday, kind of customer service engagements with your bank or whatever. Just seems almost humorous in the fact that this is supposed to be such an anchor point of security. Why? You know, when is the Social Security Administration or that record, either going to come up to speed or do you see is there a different identity thing? With biometrics or a credit card? Or your fingerprint or your retina scan? I mean, I have Clear, your Portican, look at my... Is that ever going to change or is it just always? It's such a legacy that's so embedded in who we are that it's just not going to change? It just seems so bizarre to me. >> Well, it's a classic case of we invented a tool for one purpose. And then industry decided to repurpose it. So the social security number was simply to entitle you to social security benefits. That was the only thing it was created for. Then, as we started building the credit and credit file industry, we needed an initial authenticator. And hey, look at this great thing. This is a number, it's issued to one individual. We know that there's some litmus test that they have to pass in order to get one. There's a great tool, let's use it. But nobody started talking about that. And now that we're looking at things like other types of government benefits being offered. And now, you know, credit is issued based on this number. It really kind of got away from everybody. And think about it, it used to be your military ID. And you would have your social security number painted on your rucksack, there for the world to see. It's still on our Medicare cards. It used to be on our checks. A lot of that has changed. >> That's right, it was on our checks. >> It was, it was. So, we have started shifting into this. At least the thought process of, "If we're going to use something as an initial authenticator, we probably should not be displaying it, ready for anyone to see." And the big conversation, you know, you were talking about biometrics and other ways to authenticate people. That's one of the big conversations we're having right now is, "What is the solution?" Is it a repurposing of the social security number? Is it more sharing within government agencies and industry of that data, so we can authenticate people through that? Is it a combination of things? And that's what we're trying to wrestle with and work out. But it is moving forward, albeit very, very slowly. >> Yeah, the two-factor authentication seems to have really taken off recently. >> Thankfully. >> You get the text and here's your secret code and you know, at least it's another step that's relatively simple to execute. >> Something you are, something you have, something you know. >> There you go. >> That's kind of the standard we're really trying to push. >> So, on the identity theft bad guys, how has their behavior changed since you've been in this business? Has it changed dramatically? Is the pattern of theft pretty similar? You know, how's that world evolving? 'Cause generally these things are a little bit of an arms race, you know. And often times the bad guys are one step ahead of the good guys. 'Cause the good guys are reacting to the last thing that the bad guys do. How do you see that world kind of changing?
>> Well, I've been in the fraud space for over 20 years. Which I hate to admit but it's the truth. >> Jeff: Ooh, well, tell me about it. >> And we do look at it sort of like a treadmill and I think that's just the nature of the beast. When you think about the fact that the thieves are, you know, doing penetration testing, and we, as the good guys, trying to prevent it, have to be right a hundred percent of the time. The thieves only have to be right once, they know it. They also spend an extraordinary amount of time being creative about how they're going to monetize our information. The last big wave of new types of identity theft was tax identity theft. And the federal government never really thought that that would be a thing. So when we went to online filing, there really weren't any fraud analytics. There wasn't any verification of it. So, that first filing was the one that was processed. Well, fast forward to now, we've started to address that; it's still a huge problem and the number one type of identity theft. But if you had asked me ten years ago, if that would be something, I don't think I would have said yes. It seems, you know, so... how do you create money out of something like that? And so, to me, what is moving forward is that I think we just have to be really vigilant for when we leave that door unlocked, the thieves are going to push it open and burst through. And we just have to make sure we notice when it's cracked. So that we can push it closed. Because that's really I think the only way we're going to be able to address this. Is just to be able to detect and react much more quickly than we do now. >> Right, right, 'cause they're going to come through, right? >> Exactly they are. >> There's no wall thick enough, right? Right, and like you said, they only have to be right once. >> Nothing's impenetrable. >> Right, crazy. Alright Eva, we're going to leave it there and let you go off to your session. Have fun at your session and thanks for spending a few minutes with us. >> Thank you. >> Alright, she's Eva Velasquez, President and CEO of the ITRC. I'm Jeff Frick, you're watching theCUBE. Catch you next time. (upbeat electronic music)
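The "first record wins" mechanics Velasquez describes, where the Social Security Administration and the credit bureaus never compare notes, are easy to see in miniature. A toy sketch in Python follows; every store and function here is hypothetical, and no bureau exposes anything like this interface:

```python
# Toy model: the first application seen for a Social Security number
# becomes the credit file, because nothing checks the number against
# its issuance record. All data structures here are hypothetical.

credit_files = {}       # ssn -> first profile submitted
issuance_registry = {}  # ssn -> year issued (data the bureaus can't query today)

def apply_for_credit(ssn, name, birth_year):
    if ssn not in credit_files:
        # First record wins: a made-up name and adult birth year sail through.
        credit_files[ssn] = {"name": name, "birth_year": birth_year}
    return credit_files[ssn]

def apply_with_issuance_check(ssn, name, birth_year, current_year=2017):
    # The fix the interview points toward: share issuance data, so a number
    # issued to an infant can't quietly anchor an adult's credit profile.
    issued = issuance_registry.get(ssn)
    if issued is not None and current_year - issued < 18:
        raise ValueError("SSN issued to a minor; flag as possible synthetic identity")
    return apply_for_credit(ssn, name, birth_year)

issuance_registry["123-45-6789"] = 2013              # issued to a child
apply_for_credit("123-45-6789", "Fake Adult", 1990)  # accepted: synthetic file
# apply_with_issuance_check("123-45-6789", "Fake Adult", 1990)  # would raise
```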
Eva Velasquez, Identity Theft Resource Center | Data Privacy Day 2018
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day 2018, I still can't believe it's 2018, in downtown San Francisco, at LinkedIn's headquarters, the new headquarters, it's a beautiful building just down the road from the Salesforce building, from the new Moscone that's being done, there's a lot of exciting things going on in San Francisco, but that's not what we're here to talk about. We're here to talk about data privacy, and we're excited to have a return visit from last year's Cube alumni, she's Eva Velasquez, president and CEO, Identity Theft Resource Center. Great to see you again. >> Thank you for having me back. >> Absolutely, so it's been a year, what's been going on in the last year in your world? >> Well, you know, identity theft hasn't gone away-- >> Shoot. >> And data-- >> I thought you told me it was-- >> I know, I wish, and in fact, unfortunately we just released our data breach information, and there was a tremendous growth. It was a little over 1,000 the previous year, and over 1,500 data breaches... in 2017. >> We're almost immune, they're like every day. And it used to be like big news. Now it's like, not only was Yahoo breached at some level, which we heard about a while ago, but then we hear they were actually breached like 100%. >> There is some fatigue, but I can tell you that it's not as pervasive as you might think. Our call center had such a tremendous spike in calls during the Equifax breach. It was the largest number of calls we'd had in a month, since we'd been measuring our call volume. So people were still very, very concerned. But a lot of us who are in this space are feeling, I think we may be feeling the fatigue more than your average consumer out there. Because for a lot of folks, this is really the first exposure to it. We're still having a lot of first exposures to a lot of these issues. >> So the Equifax one is interesting, because most people don't have a direct relationship with Equifax, I don't think. I'm not a direct paying customer, I did not choose to do business with them. But as one of the two or three main reporting agencies, right, they've got data on everybody for their customers who are the banks, financial institutions. So how does that relationship get managed? >> Oh my gosh, there's so much meat there. There's so much meat there. Okay, so, while it feels like you don't have a direct relationship with the credit reporting agencies, you actually do, you get a benefit from the services that they're providing to you. And every time you get a loan, I mean this is a great conversation for Data Privacy Day. Because when you get a loan, get a credit card, and you sign those terms and conditions, guess what? >> They're in there? >> You are giving that retailer, that lender, the authority to send that information over to the credit reporting agencies. And let's not forget that the intention of forming the credit reporting agencies was for better lending practices, so that your creditworthiness was not determined by things like your gender, your race, your religion, and those types of really, I won't say arbitrary, but just not pertinent factors. Now your creditworthiness is determined by your past history of, do you pay your bills? What is your income, do you have the ability to pay? So it started with a good, very good purpose in mind, and we definitely bought into that as a society.
And I don't want to sound like I'm defending the credit reporting agencies and all of their behavior out there, because I do think there are some changes that need to be made, but we do get a benefit from the credit reporting agencies, like instant credit, much faster turnaround when we need those financial tools. I mean, that's just the reality of it. >> Right, right. So, who is the person that's then... been breached, I'm trying to think of the right word for the relationship between those who've had their data hacked and the entity that was hacked. If it's this kind of indirect third party relationship through an authorization through the credit card company. >> No, the-- Equifax is absolutely responsible. >> So who would be the litigant, just maybe that's the word that's coming to me in terms of feeling the pain, is it me as the holder of the Bank of America Mastercard? Is it Bank of America as the issuer of the Mastercard? Or is it Mastercard, in terms of retribution back to Equifax? >> Well you know, I can't really comment on who actually would have the strongest legal liability, but what I can say is, this is the same thing I say when I talk to banks about identity theft victims. There's some discussion about, well, no, it's the bank that's the victim in existing account identity theft, because they're the ones that are absorbing the financial losses. Not the person whose data it belongs to. Yet the person who owns that data, it's their identity credentials that have been compromised. They are dealing with issues as well, above and beyond just the financial compromise. They have to deal with cleaning up other messes and other records, and there's time spent on the phone, so it's not mutually exclusive. They're both victims of this situation. And with data breaches, often the breached entity, again, I hate to sound like an apologist, but I am keeping this real. A breached entity, when they're hacked, they are a victim, a hacker has committed that crime and gone into their systems. Yes, they have a responsibility to make those security systems as robust as possible, but the person whose identity credentials those are, they are the victim. Any entity or institution, if it's payment card data that's compromised, and a financial services institution has to replace that data, guess what, they're a victim too. That's what makes this issue and this crime so terrible, is that it has these tentacles that reach down and touch more than one person for each incident. >> Right. And then there's a whole 'nother level, which we talked about before we got started that we want to dig into, and that's children. Recently, a little roar was raised with these IoT connected toys. And just a big, giant privacy hole, into your kid's bedroom. With eyes and ears and everything else. So I wonder if you've got some specific thoughts on how that landscape is evolving. >> Well, we have to think about the data that we're creating. That does comprise our identity. And when we start talking about these toys and other... internet connected, IoT devices that we're putting in our children's bedrooms, it actually does make the advocacy part of me, it makes the hair on the back of my neck stand up. Because the more data that we create, the more that it's vulnerable, the more that it's used to comprise our identity, and we have a big enough problem with child identity theft just now, right now as it stands, without adding the rest of these challenges.
Child and synthetic identity theft are a huge problem, and that's where a specific Social Security number is submitted and has a credit profile built around it, when it can either be completely made up, or it belongs to a child. And so you have a four year old whose Social Security number is now having a credit profile built around it. Obviously they're not, so the thieves are not submitting this belongs to a four year old, it would not be issued credit. So they're saying it's a, you know, 23 year old-- >> But they're grabbing the number. >> They're grabbing the number, they're using the name, they build this credit profile, and the biggest problem is we really haven't modernized how we're authenticating this information and this data. I think it's interesting and fitting that we're talking about this on Data Privacy Day, because the solution here is actually to share data. It's to share it more. And that's an important part of this whole conversation. We need to be smart about how we share our data. So yes, please, have a thoughtful conversation with yourself and with your family about what are the types of data that you want to share and keep, and what do you want to keep private, but then culturally we need to look at smart ways to open up some data sharing, particularly for these legitimate uses, for fraud detection and prevention. >> Okay, so you said way too much there, 'cause there's like 87 followup questions in my head. (Eva laughs) So we'll step back a couple, so is that synthetic identity, then? Is that what you meant when you said a synthetic identity problem, where it's the Social Security number of a four year old that's then used to construct this, I mean, it's the four year old's Social Security number, but a person that doesn't really exist? >> Yes, all child identity theft is synthetic identity theft, but not all synthetic identity theft is child identity theft. Sometimes it can just be that the number's been made up. It doesn't actually belong to anyone. Now, eventually maybe it will. We are hearing from more and more parents, I'm not going to say this is happening all the time, but I'm starting to hear it a little bit more often, where the Social Security number is being issued to their child, they go to file their taxes, so this child is less than a year old, and they are finding out that that number has a credit history associated with it. That was associated years ago. >> So somebody just generated the number. >> Just made it up. >> So are we ready to be done with Social Security numbers? I mean, for God's sake, I've read numerous things, like the nine-digit number that's printed on a little piece of paper is not protectable, period. And I've even had a case where they say, bring your little paper card that they gave you at the hospital, and I won't tell you what year that was, a long time ago. I'm like, I mean come on, it's 2018. Should that still be the anchor-- >> You super read my mind. >> Data point that it is? >> It was like I was putting that question in your head. >> Oh, it just kills me. >> I've actually been talking quite a bit about that, and it's not that we need to get, quote unquote, get rid of Social Security numbers. Okay, Social Security numbers were developed as an identifier, because we have, you can have John Smith with the same date of birth, and how do we know which one of those 50,000 John Smiths is the one we're looking for? So that unique identifier, it has value. And we should keep that. It's not a good authenticator, it is not a secret. 
It's not something that I should pretend only I know-- >> Right, I write it on my check when I send my tax return in. Write your number on the check! Oh, that's brilliant. >> Right, right. So it's not, we shouldn't pretend that this is, I'm going to, you, business that doesn't know me, and wants to make sure I am me, in this first initial relationship or interaction that we're having, that's not a good authenticator. That's where we need to come up with a better system. And it probably has to do with layers, and more layers, and it means that it won't be as frictionless for consumers, but I'm really challenging, this is one of our big challenges for 2018, we want to flip that security versus convenience conundrum on its ear and say, no, I really want to challenge consumers to say... I'm happier that I had to jump through those hoops. I feel safer, I think you're respecting my data and my privacy, and my identity more because you made it a little bit harder. And right now it's, no, I don't want to do that because it's a little too, nine seconds! I can't believe it took me nine seconds to get that done. >> Well, yeah, and we have all this technology, we've got fingerprint readers that we're carrying around in our pocket, I mean there's, we've got geolocation, you know, is this person in the place that they generally, and having 'em, there's so many things-- >> It's even more granular >> Beyond a printed piece of >> Than that-- >> paper, right? >> It's the angle at which you look at your phone when you look at it. It's the tension with which you enter your passcode, not just the passcode itself. There are all kinds of very non-invasive biometrics, for lack of a better word. We tend to think of them as just, like our face and our fingerprint, but there are a lot of other biometrics that are non-invasive and not personal. They're not private, they don't feel secret, but we can use them to authenticate ourselves. And that's the big discussion we need to be having. If I want to be smart about my privacy. >> Right. And it's interesting, on the sharing, 'cause we hear that a lot at security conferences, where one of the best defenses is that teams at competing companies, security teams, share data on breach attempts, right? Because probably the same person who tried it against you is trying it against that person, is trying it against that person. And really an effort to try to open up the dialogue at that level, as more of just an us against them versus we're competing against each other in the marketplace 'cause we both sell widgets. So are you seeing that? Is that something that people buy into, where there's a mutual benefit of sharing information to a certain level, so that we can be more armed? >> Oh, for sure, especially when you talk to the folks in the risk and fraud and identity theft mitigation and remediation space. They definitely want more data sharing. And... I'm simply saying that that's an absolutely legitimate use for sharing data. We also need to have conversations with the people who own that data, and who it belongs to, but I think you can make that argument, people get it when I say, do you really feel like the angle at which you hold your phone, is that personal? Couldn't that be helpful, that combined with 10 other data points about you, to help authenticate you? Do you feel like your personal business and life is being invaded by that piece of information? Or compare that to things like your health records. And medical conditions-- >> Mom's maiden name. 
>> That you're being treated for, well, wow, for sure that feels super, super personal, and I think we need to do that nuance. We need to talk about what data falls into which of these buckets, and on the bucket that isn't super personal, and feeling invasive and that I feel like I need to protect, how can I leverage that to make myself safer? >> Great. Lots of opportunity. >> I think it's there. >> Alright. Eva, thanks for taking a few minutes to stop by. It's such a multi-layered and kind of complex problem that we still feel pretty much early days at trying to solve. >> It's complicated, but we'll get there. More of this kind of dialogue gets us just that much closer. >> Alright, well thanks for taking a few minutes of your day, great to see you again. >> Thanks. >> Alright, she's Eva, I'm Jeff, you're watching The Cube from Data Privacy Days, San Francisco. (techno music)
Eve Maler, ForgeRock | Data Privacy Day 2018
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day 2018 here at LinkedIn's brand new, downtown San Francisco headquarters, not in Sunnyvale. And we're excited to be here for the second time. And we've got Eve Maler back, she's a VP of innovation and emerging tech at ForgeRock, we caught up last year, so great to see you. >> Likewise. >> So what's different in 2018 than 2017? >> Well, GDPR, the general data protection regulation. Also, we didn't talk about it much here today, but the payment services directive version two is on the lips of everybody in the EU who's in financial services, along with open banking, and all these regulations are actually as much about digital transformation, I've been starting to say hashtag digital transformation, as they are about regulating data protection and privacy, so that's big. >> So why aren't those other two being talked about here do you think? >> To a certain extent they are for the global banks and the multinational banks, and they have as much impact on things like user consent as GDPR does, so that's a big thing. >> Jeff: Same penalties? >> They do have some penalties, but they are as much about, okay, I'm starting to say hashtag in front of all these cliches, but you know they are as much about trying to do the digital single market as GDPR is, so what they're trying to do is have a level playing field for all those players. In the same way, GDPR is trying to make sure that all of the countries have the same kind of regulations to apply so that they can get to the business of doing business. >> Right, right, and so it's the same thing trying to have this kind of unified platform. >> Yup, absolutely, and so that affects companies here if they play in that market as well. >> So there's a lot of talk on the security side when you go to these security conferences about baking security in everywhere, right? It can't be old guard anymore, there is no such thing as keeping the bad guys out, it's more all the places you need to bake in security, and so you're talking about that really needs to be on the privacy side as well, it needs to go hand-in-hand, not be counter to innovation. >> Yes, it is not a zero sum game, it should be a positive sum game; in fact, GDPR would call it data protection by design and by default. And so, you have to bake it in, and I think the best way to bake it in is to see this as an opportunity to do better business with your customers, your consumers, your patients, your citizens, your students, and the way to do that is to actually go for a trust mark instead of, I shouldn't say a trust mark, but go for building trusted digital relationships with all those people instead of just saying "Well I'm going to go for compliance" and then say "Well I'm sorry if you didn't feel that action on my part was correct." >> Well hopefully it's more successful than we've seen on the security side, right? Because data breaches are happening constantly, no one is immune and I don't know, we're almost kind of getting immune to it. I mean Yahoo's, it was their entire database of however many billions of people, and some will say it's not even when you get caught, it's more about how you react when you do get caught, both from a PR perspective, as well as mending faith, like the old Tylenol issue back in the day, so, on the privacy side do you expect that to be the same?
Are these regulations in such a way where it's relatively easy to implement, so we won't have kind of this never ending breach problem on the security side, or is it going to be kind of the same? >> I think I just made a face when you said easy, the word easy, okay. >> Not easy but actually doable, 'cause sometimes it feels like some of the security stuff, again on the breaches specifically, yeah it seems like it should be doable, but man oh man we just hear over and over again in the headlines that people are getting compromised. >> Yeah, people are getting compromised, and I think they are sort of getting immune to the stories when it's a security breach. We do identity at my company, ForgeRock, so I have this identity lens that I see everything through, and I think especially in the internet of things, which we've talked about in the past, there's a recognition that digital identity is a way that you can start to attack security and privacy problems, because if you want to, for example, save off somebody's consent to let information about them flow, you need to have persistent storage that they did consent, you need to have persistent storage of the information about them, and if they want to withdraw consent, which is a thing GDPR requires you to be able to do, and prove that they're able to do, you need to have persistent storage of their digital identity. So identity is actually a solution to the problem, and what you want to do is have an identity and access management solution that actually reduces the friction of solving those problems, so it's basically a way to have consent life cycle management, if you will, and have that solution be able to solve your problems of security and privacy. >> And to come at it from the identity point of view versus coming at it from the data point of view. >> That's right, and especially when it comes to internet of things, but not even just internet of things, you're starting to need to authenticate and identify everything: services, applications, piles of data, and smart devices, and people, and keep track of the relationships among them. >> We just like to say people are things too, so you can always include the people in the IT conversation. But it is pretty interesting, the identity tack, 'cause we see that more and more, security companies coming at the problem from an identity angle, because now you can test the identity against applications, against data, against usage, change what's available, not available to them, versus trying to build that big wall. >> Yes, there's no perimeters anymore. >> Unless you go to China and walk the old Great Wall. >> Yes, you're taking your burner devices with you, aren't you? (laughs) >> Yes. >> Good, good to hear.
>> Well you know, less reliance on passwords either is the only factor or sometimes a factor, and more sophisticated authentication that has less impact, well less negative impact on your life, and so I'm kind of hopeful that they're getting it, and these things are rolling up faster than GDPR, so I think those are kind of easier. They're aware of the API economy, they get it. They get all the standards that are needed. >> 'Cause the API especially when you get the thing to thing and you got multi steps and everything is depending on the connectivity upstream, you've got some significant issues if you throw a big wrench into there. But it's interesting to see how the two factor authentication is slowly working its way into more and more applications, and using a phone now without the old RSA key on the keychain, what a game changer that is. >> Yeah I think we're getting there. Nice to hear something's improving right? >> There you go. So as you look forward to 2018 what are some of your priorities, what are we going to be talking about a year from now do you think? >> Well I'm working on this really interesting project, this is in the UK, it has to do with Affintech, the UK has a mandate that it's calling the Pensions Dashboard Project, and I think that this has got a great analogy in the US, we have 401ks. They did a study there where they say the average person has 11 jobs over their lifetime and they leave behind some, what they call pension pots, so that would be like our 401ks, and this Pensions Dashboard Project is a way for people to find all of their left behind pension pots, and we talked last year about the technology that I've worked on called user managed access, UMA, which is a way where you can kind of have a standardized version of that Google Docs share button where you're in control of how much you share with somebody else, well they're using UMA to actually manage this pension finder service, so you give access first of all, to yourself, so you can get this aggregated dashboard view of all your pensions, and then you can share, one pension pot, you know one account, or more, with financial advisors selectively, and get advice on how to spend your newly found money. It's pretty awesome and it's an Affintech use case. >> How much unclaimed pension pot money, that must just be. >> In the country, in the UK, apparently it's billions upon billions, so imagine in the US, I mean it's probably a trillion dollars. I'm not sure, but it's a lot. We should do something here, I'm wondering how much money I have left behind. >> All right check your pension pot, that's the message from today's interview. All right Eve, well thanks for taking a few minutes, and again really interesting space and you guys are right at the forefront, so exciting times. >> It's a pleasure. >> All right she's Eve Maylar I'm Jeff Frigg you're watching theCUBE from Data Privacy Day 2018, thanks for watching, catch you next time. (upbeat music)