
Russell Schrader, National Cyber Security Alliance | Data Privacy Day 2018


 

(soft click) >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day 2018 here in downtown San Francisco, at the LinkedIn headquarters, gracious enough to host this event. Bigger than last year; last year we were here for the first time at Twitter. And really the momentum continues to grow, 'cause there's some big regulations coming down the pike that are really going to be in place, and have significant financial penalties if you don't get your act together. So, we're excited to have Russell Schrader, the new Executive Director of the National Cyber Security Alliance, the organization behind this event. Russ, great to see you. >> Thank you very much for coming today, it was a great event. >> Absolutely, so, you've been on the job, this job, you said like less than two weeks. >> It's true. >> What do you think? I mean then they throw you right into the big event. >> Well, I've known the organization, I've known the event. But the staff really has done an outstanding job. They made it so easy for me, everything that they've done has just been terrific. They lined up fantastic speakers, they picked cutting edge topics, they put together a really well paced program, and it was just a terrific day for all of us to get in and really have some good discussions. >> You're off to a great start. (chuckles) >> Thank you. (both laugh) >> So you said you're familiar with the organization. You know, why are you here? Why did you take advantage of this opportunity? What do you see as the role of this organization? And where do you see the opportunities to really make some significant impact going forward? >> Sure, the National Cyber Security Alliance is a who's who of people who really care about cyber security, who see it as part of their social obligation. And it was a wonderful group that I'd worked with before, when I was at Visa, and I see now, coming in as Executive Director, a chance to really take it to the next level. We really are pushing, I think, on four separate areas where I think there's a lot of opportunity. Doing more corporate work. Serving more consumers: more consumer education, more consumer awareness. I think working with and educating staffers on the Hill and in regulatory agencies in D.C. on changes and technological changes, and the cutting edge stuff. But also, I think, working with academia, sort of getting involved and getting some of the scholarly, the cutting edge, the new ideas, and just preparing for what's going to happen in the next few years. >> Right, that's interesting, 'cause you guys are National Cyber Security, and security is often used as a reason to have less privacy. Right? It's often the excuse that the government, big brother, would use to say, you know, "We need to know what you're up to; we've got red light cameras all over the place to make sure you're not running red lights." So, it's an interesting relationship between privacy, security, and then what we're hearing more and more, really, a better linchpin to drive all this, which is identity. So I wonder if you can share your perspective on the security versus privacy trade off and debate. Or am I completely off base and they really need to run in parallel? >> Well, they do intersect a whole lot. People have talked about them being two sides of the same coin. Another speaker today said that security is a science but privacy is an art.
Part of it is, you know, security is keeping the data in one place, the same as when you put it in. Sort of an integrity piece. You know, it isn't being misused, it's not being manipulated, and it's just not being changed. So that's the security piece. The privacy piece is people choosing what is done with that data. You know, is it to help me with an app? Is it to give me more information? Is it to give me games to play and things like that? And that leads into a lot of different advantages on the web and on the internet. Now, identity, since you put in a trifecta of big terms. >> Everything's got to be in threes, right? >> And there are three reasons for that. I think that, you know, the identity part is part of who you are. Now on the internet you can be a lot of people, right? The old cartoon was, you know, on the internet no one knows you're a dog. Well, on the internet, you can be a dog, you can be, you know, the person who you are at school, you can be the person who you are among your friends, you can be the person who you are at work. And those different selves, those different identities, are the internet of me. And we just need to make sure that you are curating your identities and sharing the information that you feel comfortable with, and making sure that those are reaching the right people and not the wrong people. >> Right. So, there's an interesting kind of conundrum; we cover a lot of big data shows. And, you know, there is kind of a fiduciary, moral, and now legal responsibility, as you're collecting this data to drive some algorithm, some application, that you know what you're using it for, and it's a good use of that data, and you have an implicit agreement with the people providing you the data. But one of the interesting things that comes up is then there's this thing where you've got that data, and there's an application down the road that was not part of the original agreement, that no one even had an idea would ever happen. How does that fit in? Because more and more of this data's getting stored, and there's actually a lot of value that can be unlocked, applying it in different ways, different applications. But that wasn't the explicit reason that I gave it to you. >> Right, right. And that's really tricky, because people have to be really vigilant. There is that education piece. That is the personal responsibility piece: to do business with companies and with apps that you feel comfortable with. But you still have to trust but verify. And you do want to look into your phone, look into your PC, look into your other devices, and figure out where things have changed, where things are moving. That's one of the great things about being in the Bay Area today: innovation. But innovation, you just want to make sure that you are participating in it, and you're in the part of innovation that's best for you. >> Okay, so, you mentioned academia, which is great; we do a lot of stuff at Stanford, we do a lot of stuff at MIT. So, as you look at the academic opportunities, where is some of the cutting edge research? Where are some of the academic focus areas that are helping advance the science of privacy? >> Well, you named two of the most forward thinking ones right there. So, I'll add to that just because we're talking about Stanford, we have to talk about Berkeley. >> Jeff: Yes. >> Right, and Berkeley does have the whole group in privacy and law. On the east coast, in addition to MIT, you see George Washington is doing some things.
George Mason is doing some things. And so you want to reach out to different areas. Cornell is doing things as well. So, we want to be able to figure out, where are the best ideas coming from? There are conferences already there. And maybe we can convene some papers, convene some people, and source out and give a little bit more push and publicity to people who otherwise wouldn't be getting that kind of publicity, and encourage that kind of research in privacy and in cyber security. Because there is the business and the consumer educational component, not just, you know, the tech component to the academic work. >> So, before I let you go, last question. Where do you see the biggest opportunity? Where's the biggest, either gap that needs to be filled, you know, kind of a positive that's filling in a negative, or an untapped positive that we've just barely scraped the surface of? >> Well, I think it's all about the consumer, to a large extent. You've got to figure out, how do you make your life easier? Right? Go back to the iPad introduction: nobody knew that they needed an iPad until they realized they couldn't live without it. You look at what's happened with mobile, right? Now, the idea of having a wallet is on your phone. So, while I'm waiting in line at the grocery store, I'm checking my messages, I'm texting back and forth. And I just point my phone and I pay. Those kinds of areas are the kind of innovations that are consumer facing that I think are really terrific. There's a lot of business work as well being done. But you have to figure out where that's going to go, and I think the consumer just has a fantastic opportunity. >> Alright, well good opportunity; look forward to catching up a year from now and seeing how much progress you make. >> I think we had such a great program this year, I can't wait til next year, thank you. >> He's Russ Schrader, he's the Executive Director. I'm Jeff Frick, you're watching theCUBE, we're at Data Privacy Day 2018 in San Francisco. Thanks for watching, we'll catch you next time. (soft electronic music)

Published Date : Jan 27 2018


Jerrod Chong, Yubico | Data Privacy Day 2018


 

>> Hey welcome back everybody, Jeff Frick here with The Cube. We're in downtown San Francisco at LinkedIn's headquarters at Data Privacy Day 2018. Second year we've been at the event; pretty interesting, you know, there's a lot of stuff going on in privacy. It kind of follows the security track, gets less attention, but with the impending changes in regulation it's getting much more play, much more media. So we're excited to be joined by our next guest. He's Jerrod Chong, the Vice President of Product at Yubico. Jerrod, welcome. >> Thank you Jeff. >> So for folks that aren't familiar with Yubico, what are you guys all about? >> We're all about protecting people's identities and privacy, and letting them authenticate securely to online accounts. >> So identity, that's an increasingly important strategy for security. Don't worry about the wall; can we really figure out who this person is? So how has that been changing over the last couple years? >> Yes, there's definitely a lot of things been changing. So we can think of identity as, some companies want to know who you are. But some companies actually are okay with you being anonymous, but then they want to still know that the person that they talk to is still the same person. And so what we see in the world of data is-- >> An anonymous person as opposed to a not-- >> Someone else. We want to make sure the anonymous person is the same anonymous person. >> Oh okay, okay, right. >> And that's important, right? You can think of, like, a journalist, and they need to talk to the informer, so they need to know that this is the real informer. And they don't want to have the fake informer tell them the wrong story. And so they need a way to actually strongly authenticate themselves. And so identity is a very interesting intersection of strong authentication, but at the same time, real identities as well as anonymous identities. And there are actually real life applications for both, that can protect citizens, can protect dissidents, but also at the same time can help governments do the right things when they know who you are. >> Right, so we're so far behind that I still can't understand why you dial into the customer service person and you put in your account number and they still want to know your mom's maiden name. And we've told them all a thousand times; that can't be much of a secret anymore. And then I read something else that said the ability to use a nine digit Social Security number and keep that actually private, the chances of doing that are basically zero. So we're well past that stage in terms of some of these more sophisticated systems, but we still kind of have regulations that are still asking you to put in your Social Security number. So what are the ways that you guys are kind of addressing that? And you're kind of taking a novel approach with an open source solution, which is pretty cool. >> Yes, we've created the open standard which is the FIDO U2F standard, and we actually co-created this with Google. And one of the key things is that what we call knowledge-based systems are just a thing of the past. Knowledge-based is anything that you try to remember, including passwords, and what we call recovery questions. You know, you name the recovery question that you want to put in. >> Right right, your dog, your pet, you know, your street. >> And you can get everything online from LinkedIn or Facebook. So why are we doing those systems? And obviously they are, we need to change that.
But this open standard that we've created really allows you to physically prove yourself with a physical device. So you want to tell who you are, and there are a couple of ways you can tell who you are online. You can tell by remembering something, by something that you have, and by something that you are, right? So these are the basics in how you can identify yourself over the wire. And what we've really focused on is the combination of something you have and something you know. But the something you know is not revealed to the world. The something you know is revealed to the device that you have. So it's kind of like your ATM card. You're not going to tell the PIN to the world. Nobody really has your PIN, nobody asks you for the PIN. Even the banks don't know what your PIN is, and you can change that and only you know about it. And it's only on the card. And so we take that same concept and make it available for companies to implement these types of authentication systems for their own services. So today Google supports this open standard. Actually, today Facebook supports it as well, and Salesforce, and a host of other services. Which means that you can actually authenticate yourself with a device and something you know. And that really allows you as an individual to not have to think about all these different things that you have to remember for every single site, because that's what people are doing today. And the beauty of this protocol as well, and this is what the developers think about, is that these systems don't know that you have the same authenticator. Which is a great thing, so they can't collude and share and then pinpoint that it was you. If you take this authenticator you can use it with many different things, but all of them don't know that you have what we call the YubiKey. And so this is the YubiKey that we-- >> So it's like the old RSA key, what we think a lot of people are familiar with. >> What people think of, though obviously it's way beyond the RSA key. >> Right, but it's the same kind of concept, you've got a USB, a little device-- >> And that's what you bring with you, and that's who you are. And you can strongly authenticate to the servers that you want. And I think that's really the foundation, which is people want to take back the way that they authenticate through the systems, and they want to own it. And that's really a big difference that we see, rather than the banks saying you must have this or you must have that, and you can only use it with me, you can't use it with somebody else. I want to bring my authenticator anywhere. >> So you said Google's using that. I'm a huge Google user; I don't have one of those things. So where's the application? Is that something that I choose because I want to add another layer of protection, or is that something that Google says, hey Jeff, you're such and such a level of customer, user, et cetera, we think you should take this to the next level? How does that happen? >> So it's actually been available since the end of 2014. It's part of the step up authentication. The latest iteration of the work that Google has done is the Google advanced protection program. Which means that you can enable one of these devices as part of your account. And one of the things they've done is that for those users at risk, you can only log in with these devices. Which really restricts-- >> So they define you as a high risk person because of whatever reason. >> And they encourage you, hey, please protect yourself with additional security measures.
And the old additional security measures used to be, like, you know, send me an SMS text. But that's actually pretty broken right now. We've seen it being breached everywhere because of what we call phone hijacking. You know, I pretend to be you and I've got your phone number, and you know, now I've got your phone. >> Shoot, I thought that was a good one. >> That is known; there's lots of video on how you can do that. And so this is available now for everyone. Everyone has a Gmail account; you can go into your account and say, I want step up authentication. They call it two step verification. And then they walk you through the process. >> And then you get one of these in the mail? >> You actually have to buy these, but Google has been providing them within different communities, they've been seeding the market, and we've been also doing a lot of advocacy work. Many different types; even here today we've distributed a lot of YubiKeys for all of the journalists to use. But in general, users will go online to Amazon or something and buy one of these devices. >> So then once I have that key, and I bought into that system, you're saying then I can use that key for not just Google but my Amazon account-- >> Anyone that supports-- >> Anyone that supports that standard? >> Exactly, anybody that supports the standard. And that standard is growing extremely rapidly, and it's users, it's big companies using it, developers of sites are using it. So the thing that we created for the world back in 2014 is now actually being accelerated because of all these breaches. They are very relevant to data breaches, identity breaches, and people want to take control. >> Right. I'm just curious, I'm sure you have a point of view: why haven't the phone companies implemented more use of the biometric data piece that they have, whether, now they're talking about the face recognition or your finger recognition, and tied that back to the apps on my phone? I still am befuddled by the lack of that integration. >> There are definitely solutions in that area. But one of the challenges is that, just like a computer, a phone is a complicated piece of software. There are a lot of dependencies. All it takes is one piece of software to get it wrong and the entire phone can be compromised. So you're back into complicated systems, complex systems; people write these systems, people write these apps. It takes one bad developer to mess it up for everybody else. So it's actually pretty hard unless you control every single ecosystem that you build, which is vastly difficult now in the mobile space. The mobile carriers are not just, it's not just from AT&T; you've got the OS, you've got, you know, Google, the Android phone. You've got AT&T, you've got the apps on the phone, you've got all the, you know, the various processes, the components that talk to different apps, and you've got the calling app, you've got all of these other games. So because it's such a complicated device, getting it right from a security perspective is actually pretty difficult. But there are definitely applications that have been working over the years to leverage the built in capabilities. We actually see it as, the YubiKey can actually be used with this device. And then you can use these devices after you bootstrap them, with what we deem, what we call, a trusted device. So you can use multiple different things. And the standard doesn't always define that you just use the hardware device of the YubiKey.
You can use a phone if you trust the phone. We want to give flexibility to the ecosystem. >> So I'm just curious, in terms of the open standards approach to this problem, how that's gaining traction. Because clearly, you know, open source has done very, very well, you know, far beyond Linux as an operating system. So many apps and stuff run open source software, components of open source. So in terms of market penetration, and kind of adoption of this technology versus the one single vendor key that you used to have, how is the uptake, how is the industry responding? Is this something that a lot of people are getting behind? >> It's definitely getting a lot of traction in the industry. So we started the journey with Google, and what was happening was that in order to work at this kind of scale, you have to believe that, you know, just Yubico and Google between them can't solve this problem. And if the answer is, you've got to do my thing, no one's going to play in this game. Just at a high level. So I think what we've done is that the open standard is the catalyst for other big players to participate, without any one vendor necessarily going to win. So today there's a big plenary going on at FIDO, and it's really an iteration of what we've developed with Google. And now we're taking it to the next level with, actually, Microsoft. And we've called it FIDO 2. So from U2F, FIDO Universal Second Factor, to FIDO 2. And that entire work that we've done with Google is now being evolved into the Microsoft ecosystem. And we'll see in a couple of months, you will start to see real Microsoft products being able to support the same standard. Which is really excellent, because what do you use every day? There are three major platform players that you have today, right? You either use a Google type of device, Chrome or Android. You use a Microsoft device, you've got Windows everywhere. Or you use an Apple device. So the only way these large internet companies are going to collaborate is if it's open. If it's closed, if it's my stuff, Google's not going to implement it because it's Microsoft stuff, Microsoft's not going to implement Apple stuff. So the only way you can-- >> I dunno about the Apple part of that analogy, but that's okay. >> That's true, that's true, but I think it's important that the security industry, working with the identity issue, work together. And we need to move away from all these one-off, proprietary things. Because it makes it really difficult for the users and the people to implement things. And if everybody's collaborating on an open standard, then you actually can make a dent in the problem that you see today. >> And to your point, right, with BYOD, which used to be a thing, it's not a thing now, obviously everybody's bringing their own devices. To have an open standard, so people at different types of companies, with different types of ecosystems, with different types of users, using different types of devices, have a standard by which they can build these things. >> Absolutely. >> Exciting times. >> Exciting times. >> Alright Jerrod, well thanks for taking a few minutes out of your day. We look forward to watching the Yubico story unfold. >> Exactly, thank you very much. >> Alright, very good. He's Jerrod, I'm Jeff, you're watching The Cube at Data Privacy Day 2018; thanks for watching.
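The flow Chong describes, a key pair bound to a device with nothing shared across sites that would let two services link the same authenticator, is what the FIDO2/WebAuthn work standardized in browsers. Below is a minimal registration sketch using the browser's WebAuthn API; the relying-party name, domain, and user handle are illustrative placeholders, and a real service would mint the challenge on its server and verify the response there rather than doing everything client-side.

```typescript
// Minimal WebAuthn (FIDO2) registration sketch. All relying-party
// values are placeholder assumptions, not any real service's config.
async function registerSecurityKey(): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    // In practice this nonce comes from the server, which later
    // verifies the signed response against it.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { name: "Example Service", id: "example.com" },
    user: {
      id: new TextEncoder().encode("user-1234"), // opaque handle, not a secret
      name: "user@example.com",
      displayName: "Example User",
    },
    // Request an ES256 public-key credential; the private key never
    // leaves the authenticator (e.g. a YubiKey).
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],
    authenticatorSelection: {
      authenticatorAttachment: "cross-platform", // a roaming key, not the built-in platform authenticator
      userVerification: "preferred",
    },
    timeout: 60_000,
  };

  // The browser prompts the user to touch the key. The site receives
  // only a public key and a credential ID, so two sites cannot compare
  // notes and discover that the same YubiKey was used with both.
  return navigator.credentials.create({ publicKey });
}
```

On each later login the site sends a fresh challenge and the key signs it: the "something you have" factor Chong contrasts with SMS codes, and the reason a phished password alone is no longer enough.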

Published Date : Jan 27 2018


Eva Velasquez, Identity Theft Resource Center | Data Privacy Day 2018


 

>> Hey, welcome back everybody, Jeff Frick here with The Cube. We're at Data Privacy Day 2018, I still can't believe it's 2018, in downtown San Francisco, at LinkedIn's headquarters, the new headquarters; it's a beautiful building just down the road from the Salesforce building, from the new Moscone that's being done. There's a lot of exciting things going on in San Francisco, but that's not what we're here to talk about. We're here to talk about data privacy, and we're excited to have a return visit from last year's Cube alumni: she's Eva Velasquez, president and CEO, Identity Theft Resource Center. Great to see you again. >> Thank you for having me back. >> Absolutely, so it's been a year; what's been going on in the last year in your world? >> Well, you know, identity theft hasn't gone away >> Shoot. >> And data-- >> I thought you told me it was going away last time. >> I know, I wish, and in fact, unfortunately we just released our data breach information, and there was tremendous growth. It was a little over 1,000 the previous year, and over 1,500 data breaches... in 2017. >> We're almost immune, they're like every day. And it used to be big news. Now it's like, not only was Yahoo breached at some level, which we heard about a while ago, but then we hear they were actually breached like 100%. >> There is some fatigue, but I can tell you that it's not as pervasive as you might think. Our call center had such a tremendous spike in calls during the Equifax breach. It was the largest number of calls we'd had in a month since we'd been measuring our call volume. So people were still very, very concerned. But a lot of us who are in this space are feeling, I think we may be feeling the fatigue more than your average consumer out there. Because for a lot of folks, this is really the first exposure to it. We're still having a lot of first exposures to a lot of these issues. >> So the Equifax one is interesting, because most people don't have a direct relationship with Equifax, I don't think. I'm not a direct paying customer, I did not choose to do business with them. But as one of the two or three main reporting agencies, right, they've got data on everybody for their customers, who are the banks, financial institutions. So how does that relationship get managed? >> Oh my gosh, there's so much meat there. There's so much meat there. Okay, so, while it feels like you don't have a direct relationship with the credit reporting agencies, you actually do; you get a benefit from the services that they're providing to you. And every time you get a loan, I mean, this is a great conversation for Data Privacy Day. Because when you get a loan, get a credit card, and you sign those terms and conditions, guess what? >> They're in there? >> You are giving that retailer, that lender, the authority to send that information over to the credit reporting agencies. And let's not forget that the intention of forming the credit reporting agencies was for better lending practices, so that your creditworthiness was not determined by things like your gender, your race, your religion, and those types of really, I won't say arbitrary, but just not pertinent factors. Now your creditworthiness is determined by your past history of, do you pay your bills? What is your income, do you have the ability to pay? So it started with a good, very good purpose in mind, and we definitely bought into that as a society.
And I don't want to sound like I'm defending the credit reporting agencies and all of their behavior out there, because I do think there are some changes that need to be made, but we do get a benefit from the credit reporting agencies, like instant credit, much faster turnaround when we need those financial tools. I mean, that's just the reality of it. >> Right, right. So, who is the person that's then... been breached? I'm trying to think of the right word for the relationship between those who've had their data hacked and the entity that was hacked, if it's this kind of indirect third party relationship through an authorization through the credit card company. >> No, Equifax is absolutely responsible. >> So who would be the litigant, maybe that's the word that's coming to me, in terms of feeling the pain? Is it me as the holder of the Bank of America Mastercard? Is it Bank of America as the issuer of the Mastercard? Or is it Mastercard, in terms of retribution back to Equifax? >> Well, you know, I can't really comment on who actually would have the strongest legal liability, but what I can say is, this is the same thing I say when I talk to banks about identity theft victims. There's some discussion about, well, no, it's the bank that's the victim in existing account identity theft, because they're the ones that are absorbing the financial losses, not the person whose data it belongs to. Yet the person who owns that data, it's their identity credentials that have been compromised. They are dealing with issues as well, above and beyond just the financial compromise. They have to deal with cleaning up other messes and other records, and there's time spent on the phone, so it's not mutually exclusive. They're both victims of this situation. And with data breaches, often the breached entity, again, I hate to sound like an apologist, but I am keeping this real: a breached entity, when they're hacked, they are a victim; a hacker has committed that crime and gone into their systems. Yes, they have a responsibility to make those security systems as robust as possible, but the person whose identity credentials those are, they are the victim. Any entity or institution, if it's payment card data that's compromised and a financial services institution has to replace that data, guess what, they're a victim too. That's what makes this issue and this crime so terrible: it has these tentacles that reach down and touch more than one person for each incident. >> Right. And then there's a whole 'nother level, which we talked about before we got started, that we want to dig into, and that's children. Recently, a little roar was raised with these IoT connected toys. And just a big, giant privacy hole, into your kid's bedroom. With eyes and ears and everything else. So I wonder if you've got some specific thoughts on how that landscape is evolving. >> Well, we have to think about the data that we're creating. That does comprise our identity. And when we start talking about these toys and other... internet connected, IoT devices that we're putting in our children's bedrooms, it actually does make the advocacy part of me, it makes the hair on the back of my neck stand up. Because the more data that we create, the more that it's vulnerable, the more that it's used to comprise our identity, and we have a big enough problem with child identity theft right now as it stands, without adding the rest of these challenges.
Child and synthetic identity theft are a huge problem, and that's where a specific Social Security number is submitted and has a credit profile built around it, when it can either be completely made up, or it belongs to a child. And so you have a four year old whose Social Security number is now having a credit profile built around it. Obviously the thieves are not submitting that this belongs to a four year old; it would not be issued credit. So they're saying it's a, you know, 23 year old-- >> But they're grabbing the number. >> They're grabbing the number, they're using the name, they build this credit profile, and the biggest problem is we really haven't modernized how we're authenticating this information and this data. I think it's interesting and fitting that we're talking about this on Data Privacy Day, because the solution here is actually to share data. It's to share it more. And that's an important part of this whole conversation. We need to be smart about how we share our data. So yes, please, have a thoughtful conversation with yourself and with your family about what are the types of data that you want to share and keep, and what do you want to keep private, but then culturally we need to look at smart ways to open up some data sharing, particularly for legitimate uses like fraud detection and prevention. >> Okay, so you said way too much there, 'cause there's like 87 followup questions in my head. (Eva laughs) So we'll step back a couple. So is that synthetic identity, then? Is that what you meant when you said a synthetic identity problem, where it's the Social Security number of a four year old that's then used to construct this, I mean, it's the four year old's Social Security number, but a person that doesn't really exist? >> Yes, all child identity theft is synthetic identity theft, but not all synthetic identity theft is child identity theft. Sometimes it can just be that the number's been made up. It doesn't actually belong to anyone. Now, eventually maybe it will. We are hearing from more and more parents, I'm not going to say this is happening all the time, but I'm starting to hear it a little bit more often, where the Social Security number is being issued to their child, they go to file their taxes, so this child is less than a year old, and they are finding out that that number has a credit history associated with it, that was associated years ago.
It's not something that I should pretend only I know-- >> Right, I write it on my check when I send my tax return in. Write your number on the check! Oh, that's brilliant. >> Right, right. So we shouldn't pretend that, for a business that doesn't know me and wants to make sure I am me in this first initial relationship or interaction that we're having, it's a good authenticator. That's where we need to come up with a better system. And it probably has to do with layers, and more layers, and it means that it won't be as frictionless for consumers. But I'm really challenging, this is one of our big challenges for 2018: we want to flip that security versus convenience conundrum on its ear and say, no, I really want to challenge consumers to say... I'm happier that I had to jump through those hoops. I feel safer; I think you're respecting my data and my privacy and my identity more because you made it a little bit harder. And right now it's, no, I don't want to do that because it's a little too, nine seconds! I can't believe it took me nine seconds to get that done. >> Well, yeah, and we have all this technology, we've got fingerprint readers that we're carrying around in our pockets, we've got geolocation, you know, is this person in the place that they generally are, there's so many things-- >> It's even more granular than that-- >> Beyond a printed piece of paper, right? >> It's the angle at which you hold your phone when you look at it. It's the tension with which you enter your passcode, not just the passcode itself. There are all kinds of very non-invasive biometrics, for lack of a better word. We tend to think of them as just, like, our face and our fingerprint, but there are a lot of other biometrics that are non-invasive and not personal. They're not private, they don't feel secret, but we can use them to authenticate ourselves. And that's the big discussion we need to be having, if I want to be smart about my privacy. >> Right. And it's interesting, on the sharing, 'cause we hear that a lot at security conferences, where one of the best defenses is that security teams at competing companies share data on breach attempts, right? Because probably the same person who tried it against you is trying it against that person, and that person. And really an effort to try to open up the dialogue at that level, as more of an us against them, versus we're competing against each other in the marketplace 'cause we both sell widgets. So are you seeing that? Is that something that people buy into, where there's a mutual benefit of sharing information to a certain level, so that we can be more armed? >> Oh, for sure, especially when you talk to the folks in the risk and fraud and identity theft mitigation and remediation space. They definitely want more data sharing. And... I'm simply saying that that's an absolutely legitimate use for sharing data. We also need to have conversations with the people who own that data, and who it belongs to, but I think you can make that argument. People get it when I say, do you really feel like the angle at which you hold your phone, is that personal? Couldn't that be helpful, that combined with 10 other data points about you, to help authenticate you? Do you feel like your personal business and life is being invaded by that piece of information? Or compare that to things like your health records, and medical conditions-- >> Mom's maiden name.
>> That you're being treated for; well, wow, for sure that feels super, super personal, and I think we need to draw that nuance. We need to talk about what data falls into which of these buckets, and for the bucket that isn't super personal and doesn't feel invasive, that I don't feel like I need to protect, how can I leverage that to make myself safer? >> Great. Lots of opportunity. >> I think it's there. >> Alright. Eva, thanks for taking a few minutes to stop by. It's such a multi-layered and complex problem, and we still feel pretty much early days at trying to solve it. >> It's complicated, but we'll get there. More of this kind of dialogue gets us just that much closer. >> Alright, well thanks for taking a few minutes of your day; great to see you again. >> Thanks. >> Alright, she's Eva, I'm Jeff, you're watching The Cube from Data Privacy Day, San Francisco. (techno music)
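Velasquez's example, the angle at which you hold your phone combined with ten other data points, is essentially risk-based authentication: many weak, non-invasive signals combined into one decision, with step-up to a stronger factor only when they disagree. The toy sketch below illustrates the layering idea; the signal names, weights, and threshold are all invented for illustration, not any vendor's actual model.

```typescript
// Toy risk-based authentication: combine weak behavioral signals into
// one confidence score. Signals, weights, and threshold are invented.
interface Signal {
  name: string;
  weight: number;  // how much this signal contributes
  match: boolean;  // did the observation match the user's usual profile?
}

function confidenceScore(signals: Signal[]): number {
  const total = signals.reduce((sum, s) => sum + s.weight, 0);
  const matched = signals
    .filter((s) => s.match)
    .reduce((sum, s) => sum + s.weight, 0);
  return total === 0 ? 0 : matched / total; // 0..1
}

const observed: Signal[] = [
  { name: "known device fingerprint", weight: 3, match: true },
  { name: "typical geolocation", weight: 2, match: true },
  { name: "phone-holding angle", weight: 1, match: false },
  { name: "typing cadence", weight: 1, match: true },
];

const score = confidenceScore(observed); // 6/7 ≈ 0.86
if (score < 0.8) {
  // Signals disagree: add friction only now, e.g. require a hardware key.
  console.log("Step-up authentication required");
} else {
  console.log("Authenticated with confidence", score.toFixed(2));
}
```

The design matches her point about friction: most logins sail through on signals the user never notices, and the nine seconds of extra work appears only when the cheap signals fail.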

Published Date : Jan 27 2018


Eve Maler, ForgeRock | Data Privacy Day 2018


 

>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day 2018 here at LinkedIn's brand new, downtown San Francisco headquarters, not in Sunnyvale. And we're excited to be here for the second time. And we've got Eve Maler back; she's a VP of innovation and emerging tech at ForgeRock. We caught up last year, so great to see you. >> Likewise. >> So what's different in 2018 than 2017? >> Well, GDPR, the General Data Protection Regulation. Also, we didn't talk about it much here today, but the payment services directive version two is on the lips of everybody in the EU who's in financial services, along with open banking, and all these regulations are actually as much about digital transformation, I've been starting to say hashtag digital transformation, as they are about regulating data protection and privacy, so that's big. >> So why aren't those other two being talked about here, do you think? >> To a certain extent they are, for the global banks and the multinational banks, and they have as much impact on things like user consent as GDPR does, so that's a big thing. >> Jeff: Same penalties? >> They do have some penalties, but they are as much about, okay, I'm starting to say hashtag in front of all these cliches, but you know, they are as much about trying to do the digital single market as GDPR is. So what they're trying to do is have a level playing field for all those players. So the way that GDPR works, they're trying to make sure that all of the countries have the same kind of regulations to apply, so that they can get to the business of doing business. >> Right, right, and so it's the same thing, trying to have this kind of unified platform. >> Yup, absolutely, and so that affects companies here if they play in that market as well. >> So there's a lot of talk on the security side, when you go to these security conferences, about baking security in everywhere, right? It can't be old guard anymore; there is no such thing as keeping the bad guys out, it's more all the places you need to bake in security. And so you're talking about that really needs to be on the privacy side as well; it needs to go hand-in-hand, not be counter to innovation. >> Yes, it is not a zero sum game, it should be a positive sum game; in fact, GDPR would call it data protection by design and by default. And so, you have to bake it in, and I think the best way to bake it in is to see this as an opportunity to do better business with your customers, your consumers, your patients, your citizens, your students, and the way to do that is to actually go for a trust mark instead of, I shouldn't say a trust mark, but go for building trusted digital relationships with all those people, instead of just saying "Well I'm going to go for compliance" and then saying "Well I'm sorry if you didn't feel that action on my part was correct." >> Well hopefully it's more successful than we've seen on the security side, right? Because data breaches are happening constantly, no one is immune, and I don't know, we're almost kind of getting immune to it. I mean Yahoo, it was their entire database of however many billions of people. And some will say it's not even when you get caught, it's more about how you react when you do get caught, both from a PR perspective, as well as mending faith, like the old Tylenol issue back in the day. So, on the privacy side do you expect that to be the same?
Are these regulations in such a way where it's relatively easy to implement, so we won't have kind of this never ending breach problem on the security side, or is it going to be kind of the same? >> I think I just made a face when you said easy, the word easy, okay. >> Not easy, but actually doable, 'cause sometimes it feels like some of the security stuff, again on the breaches specifically, yeah, it seems like it should be doable, but man oh man, we just hear over and over again in the headlines that people are getting compromised. >> Yeah, people are getting compromised, and I think they are sort of getting immune to the stories when it's a security breach. At my company, at ForgeRock, we're in identity, so I have this identity lens that I see everything through, and I think especially in the internet of things, which we've talked about in the past, there's a recognition that digital identity is a way that you can start to attack security and privacy problems. Because if you want to, for example, save off somebody's consent to let information about them flow, you need to have persistent storage of the fact that they did consent, you need to have persistent storage of the information about them, and if they want to withdraw consent, which is a thing GDPR requires you to be able to do, and prove that they're able to do, you need to have persistent storage of their digital identity. So identity is actually a solution to the problem, and what you want to do is have an identity and access management solution that actually reduces the friction of solving those problems, so it's basically a way to have consent life cycle management, if you will, and have that solution be able to solve your problems of security and privacy. >> And to come at it from the identity point of view versus coming at it from the data point of view. >> That's right, and especially when it comes to the internet of things, but not even just the internet of things; you're starting to need to authenticate and identify everything: services, applications, piles of data, and smart devices, and people, and keep track of the relationships among them. >> We just like to say people are things too, so you can always include the people in the IoT conversation. But it is a pretty interesting identity task, 'cause we see that more and more: security companies coming at the problem from an identity angle, because now you can test the identity against applications, against data, against usage, change what's available, not available to them, versus trying to build that big wall. >> Yes, there are no perimeters anymore. >> Unless you go to China and walk the old Great Wall. >> Yes, you're taking your burner devices with you, aren't you? (laughs) >> Yes. >> Good, good to hear. >> Yeah, but it's a very different way to attack the problem, from an identity point of view. >> Yeah, and one of the interesting things actually about PSD2 and this open banking mandate, and I was talking about how they want to get digital business to be more attractive, is that they're demanding strong customer authentication, SCA they call it, and so I think we're going to see, I think we talked about passwords last time we met, less reliance. >> Jeff: And I still have them and I still can't remember them.
>> Well, you know, less reliance on passwords as either the only factor or sometimes a factor, and more sophisticated authentication that has less impact, well, less negative impact, on your life. And so I'm kind of hopeful that they're getting it, and these things are rolling out faster than GDPR, so I think those are kind of easier. They're aware of the API economy, they get it. They get all the standards that are needed. >> 'Cause the API, especially when you get to thing-to-thing and you've got multi steps and everything is depending on the connectivity upstream, you've got some significant issues if you throw a big wrench into there. But it's interesting to see how two factor authentication is slowly working its way into more and more applications, and using a phone now, without the old RSA key on the keychain, what a game changer that is. >> Yeah, I think we're getting there. Nice to hear something's improving, right? >> There you go. So as you look forward to 2018, what are some of your priorities? What are we going to be talking about a year from now, do you think? >> Well, I'm working on this really interesting project, this is in the UK, it has to do with fintech. The UK has a mandate that it's calling the Pensions Dashboard Project, and I think that this has got a great analogy in the US; we have 401ks. They did a study there where they say the average person has 11 jobs over their lifetime, and they leave behind some, what they call pension pots, so that would be like our 401ks, and this Pensions Dashboard Project is a way for people to find all of their left behind pension pots. And we talked last year about the technology that I've worked on called user managed access, UMA, which is a way where you can kind of have a standardized version of that Google Docs share button, where you're in control of how much you share with somebody else. Well, they're using UMA to actually manage this pension finder service, so you give access, first of all, to yourself, so you can get this aggregated dashboard view of all your pensions, and then you can share one pension pot, you know, one account, or more, with financial advisors selectively, and get advice on how to spend your newly found money. It's pretty awesome, and it's a fintech use case. >> How much unclaimed pension pot money, that must just be... >> In the country, in the UK, apparently it's billions upon billions, so imagine in the US, I mean, it's probably a trillion dollars. I'm not sure, but it's a lot. We should do something here; I'm wondering how much money I have left behind. >> All right, check your pension pot, that's the message from today's interview. All right Eve, well thanks for taking a few minutes, and again, really interesting space, and you guys are right at the forefront, so exciting times. >> It's a pleasure. >> All right, she's Eve Maler, I'm Jeff Frick, you're watching theCUBE from Data Privacy Day 2018. Thanks for watching, catch you next time. (upbeat music)
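Maler's requirement that consent be persistently stored, withdrawable, and provable can be made concrete with a small data-model sketch. This is illustrative only, not UMA itself and not ForgeRock's API, and every field and function name here is an assumption.

```typescript
// Illustrative consent-lifecycle model: grant, withdraw, and check
// consent, keeping withdrawn records as an audit trail rather than
// deleting them (GDPR requires being able to prove withdrawal).
interface ConsentRecord {
  id: string;
  subjectId: string;   // the person the data is about
  purpose: string;     // e.g. "share pension account with advisor"
  resource: string;    // what is being shared
  grantedAt: Date;
  withdrawnAt?: Date;  // set when consent is revoked
}

const consents = new Map<string, ConsentRecord>(); // a real system uses durable storage

function grantConsent(subjectId: string, purpose: string, resource: string): ConsentRecord {
  const record: ConsentRecord = {
    id: crypto.randomUUID(),
    subjectId,
    purpose,
    resource,
    grantedAt: new Date(),
  };
  consents.set(record.id, record);
  return record;
}

function withdrawConsent(id: string): void {
  const record = consents.get(id);
  if (record && !record.withdrawnAt) record.withdrawnAt = new Date();
}

function hasActiveConsent(subjectId: string, resource: string): boolean {
  for (const r of consents.values()) {
    if (r.subjectId === subjectId && r.resource === resource && !r.withdrawnAt) {
      return true;
    }
  }
  return false;
}
```

In the pensions example, each advisor the user shares a pension pot with would get one such record, and revoking an advisor's access is a timestamp rather than a deletion, so the audit trail of who consented to what, and when, survives.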

Published Date : Jan 27 2018


Craig Goodwin, CDK Global | Data Privacy Day

>> Welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at LinkedIn's brand new headquarters up here, at Data Privacy Day 2018. We were here last year, the conference is growing, a lot more people here, a lot more activity. We're excited to have a sponsor, Craig Goodwin, he's the Chief Security Officer of CDK Global. Great to see ya. >> Great to be here. >> Absolutely. So for people who aren't familiar, give us a quick kind of overview of what CDK Global is. >> Sure, so CDK Global runs automotive technology. So we enable technology for automotive dealerships, original equipment manufacturers, and we run a lot of the technology across the U.S. and the rest of the world. So, I think the last estimate's about $500 billion worth of automotive transactions, whether buying a car, servicing a car, all went through CDK's systems. >> Okay, so it's the systems, it's the business systems for those automotive companies. It's not like we were just at an autonomous vehicle company the other day, it's not those types of systems. >> Yeah, correct, I mean we're helping with that, right? So a lot of our technology is connecting, with IoT and connected vehicles, helping to take in data from those vehicles, to help automotive dealerships, to service the vehicles, or to sell the vehicles. So we ingest that data, and we ingest that technology, but essentially we're talking about the data in the dealerships. >> Okay. So how have you seen things evolve over the last couple years? >> Well, definitely with the extra regulation, right? With people and the way that their privacy dynamic is changing, consumers are becoming much more aware of where their data's going, and who's using their data. So we've heard an awful lot today about the privacy of people's data, and how the industry needs to change. And I think consumers generally are getting much more educated on that, and therefore they're asking companies like ourselves, who deal with their data, to be much more robust in their practices. And we've also seen that from a regulation point of view, right? So governments, the industry, are pushing businesses to be more aware of how they're using consumers' data, which has got to be a positive move in the right direction. >> Jeff: Right, but it's kind of funny, 'cause on the flip side of that coin is people who are willing to give up their privacy to get more services, so you've got kind of the older folks, who've been around for a while, who think of privacy, and then you've got younger folks, who maybe haven't thought about it as much, are used to downloading the app, clicking on the EULA in their phone-- >> Absolutely. >> Follows them everywhere they go. So, is it really more the regulation that's driving the change? Or is it just kind of an ongoing maturation process? >> Well I think-- >> Stewardship is, I guess, what I was saying. >> Yeah, it's a combination of both, I would say. And you make a great point there, so if you look at car buying, right? Say 10 years ago, pick a number randomly, but 10 years ago, people wouldn't have been comfortable buying a car online, necessarily. Or definitely not all online. They'd have to touch it somewhere else, feel it physically, right? That's changing, and we're starting to enable that automotive commerce, so that it starts from the online and ends up at a dealership still. So they actually sign the paperwork, but essentially they start that process online. And that's making people more aware, as you say.
I think some of the regulation, you look at GDPR in Europe, we spoke of that a lot today, naturally. And some of that regulation is helping to drive companies to be more aware. But where I see the biggest problem is with small to medium sized businesses. So I think if you talk to larger businesses, you were speaking to Michelle from Cisco, some of those larger businesses, this privacy thing's been built in from the beginning. Companies like CDK, where we were aware we were dealing with a lot of data, and therefore the GDPR regulation is more of an incremental change. It just ramps up that focus on privacy that was already there from the outset. The biggest problem, and where we see the biggest kind of change here, is in the smaller to medium sized businesses, and that's talking about dealerships, smaller dealership groups, where perhaps they haven't been so aware of privacy, they've been focused on the sales and not necessarily the data and technology, and GDPR for them is a significant step change. And it's down to industry, and larger vendors like ourselves, to reach out to those smaller dealerships, those smaller, medium sized businesses, and help them to work with GDPR to do better. >> But can they fulfill most of their obligations by working with companies such as yours, who have it baked into the product? I would imagine-- >> Yeah! >> I mean, that's the solution, right? >> Absolutely. >> If you're a little person, you don't have a lot of resources-- >> Yep. And I would say it's about sharing in the industry, right? So it's about reaching out. We talked to Cisco today about how they're building it into their technologies. A lot of the smaller businesses use companies like CDK to enable their technology. So there's an awful lot we can do to help them, but it's not everything, right? So there are areas where we need to educate consumers a lot better, where they need to work with the data and work with where the data goes, in order to understand that full end-to-end data flow within their systems. We work with a lot of the dealerships who perhaps don't understand the data they're collecting, don't understand the gravity of the information that they're collecting, and what that truly means to the consumer themselves. So we need to educate better, we need to reach out as bigger organizations, and teach smaller businesses about what they're doing with the data. >> And were there specific kinds of holes in process, or in data management, that the GDPR addresses that made a sea change? Or is it really just kind of ramping up the penalties, so you need to really ramp up your compliance? >> Well, it really is incremental, right? So if you look at things that we've had in Europe for a long time, the Data Protection Act that was around since 1999, for example, or 1998, apologies. It's a ramp-up of that, so it's just increasing the effectiveness. If you look at the 12 points that exist within GDPR, about what you need to know, or a consumer should know, about their data, rather than just who's collecting it, it now includes things like when you change that data, when it moves, who it goes to from a third-party perspective, so really it's just about ramping up that awareness. Now, what that means for a business is that they need to know that they can gather that data quickly. So they need to be clear and understand where their data is going, and CDK's a great example of that.
They need to know what data they're sharing with CDK, on what systems it exists, and in fact how they would remove that data if a consumer asked for that to happen. >> Jeff: (laughs) Who knows, we know in the cloud there is no deleting, right? >> Absolutely. >> It's in the cloud, it's there for everyone. >> That's rough (laughs). >> I mean, it really drives home kind of the ASP, application service provider, model of services, because there's just, I could just see the auto dealership, right? Some guy's got his personal spreadsheet that he keeps track of his favorite customers on, and clearly I don't think that's probably falling in compliance. >> Absolutely, yeah, and it can, right? You can work really hard, so it is a process problem. You identified that before, right? There is a lot of process here, technology isn't a golden bullet, it's not going to solve everything, right? And a lot of it is process and mentality driven. So we need to work with people to educate them, and then there's a big emphasis on the consumer as well. I think we focused on business here, but there's a big emphasis on the consumer, for them to begin to understand and be better educated. We heard from some government representatives today, about educating consumers, right? And you mentioned millennials, and the various other groups that exist, and it's important for them to understand where their data is going, and where it's being shared. 'Cause quite honestly we had a couple of really good stories today about privacy and security professionals really not having a genuine understanding of where their data is going. So a regular consumer, someone that goes to buy a car, how can we expect them, without education, to really understand about their data? >> Just to jump on it, obviously you're from the U.K., and we hear all the time that there's more closed circuit cameras in London (laughter) than probably any other city-- >> Yep. >> So, don't answer if you don't want to, but, (laughter) from a government point of view, and let's just take public red light cameras, there's so much data. >> Absolutely. >> Is the government in a position? Do they have the same requirements as a commercial institution in how they keep, manage and stay on top of this data? >> Yeah, absolutely. So I think, having come from a government background initially, I think the rules and regulations there are much more constrained, constrictive, than perhaps the commercial side is. And I think what you find is a lot of the government regulations are now filtering through into the commercial world. But actually what we're seeing is a bit of a step change. So previously, maybe 15, 20 years ago, the leader in the industry was the government, right? So the government did the regulations, and it would filter through to commercial. Actually, what we've seen in the industry now is that it's flipped on its head. So a lot of the stuff is originating in the corporate world. We're close to Silicon Valley here, the Facebooks of the world, you know a lot of that stuff is now originating on the commercial side. And we heard from some government people today, you know. The government are having to run pretty fast to try and keep up with that changing world. And a lot of the legislation and regulation now, actually, is a bit historic, right? It's set in the old days, we talk today about data, and watching you move around, and geolocation data, and a lot of the legislation dealing with that is probably 10, 15 years old now.
And it was written in a time before you could track your phone all over the world, right? And so, governments have to do some more work, I think ultimately. Look at GDPR; I think ultimately the way to change the industry is from a basis of regulation, but then as we move through, it's got to be up to the companies and the commercial businesses to take heed of that and do the right thing, ultimately. >> Jeff: It's just so interesting to watch, I mean my favorite is the car insurance ads where they want to give you the little USB gizmo to plug in, to watch you, and it's like, "Well, you already have a phone in your pocket"-- >> Yep. >> You know? >> They don't really see it. >> You don't really need to plug it in, and all your providers know what's going on, so, exciting times, nothing but opportunity for you. >> Yeah, absolutely, absolutely, I hope so (laughs). >> Well Craig Goodwin, thanks for taking a few minutes-- >> No, thank you. >> And sharing your insights, appreciate it. >> Appreciate it, thank you. >> Alright, he's Craig, I'm Jeff, you're watching theCUBE, we're at Data Privacy Day 2018, I can't believe it's 2018. Thanks for watching, we'll catch you next time. (bright electronic music)
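Craig's checklist (know what data you're sharing, on which systems it lives, who received it, and how you would remove it) is essentially a data inventory. Below is a minimal sketch of that bookkeeping, assuming a hypothetical in-memory registry with made-up system and partner names; a real dealership platform would back this with its actual databases and vendor APIs.

```python
# Sketch of a GDPR-style data inventory: record where each consumer's data
# lives and which third parties received it, so an access report or an
# erasure request can be fanned out. All names here are hypothetical.

from collections import defaultdict

class DataInventory:
    def __init__(self):
        self.records = defaultdict(set)      # consumer_id -> {(system, category)}
        self.shared_with = defaultdict(set)  # consumer_id -> {third parties}

    def record(self, consumer_id, system, category):
        self.records[consumer_id].add((system, category))

    def share(self, consumer_id, third_party):
        self.shared_with[consumer_id].add(third_party)

    def subject_access_report(self, consumer_id):
        # What a consumer should be able to see: what is held, where,
        # and with whom it has been shared.
        return {"held": sorted(self.records[consumer_id]),
                "shared_with": sorted(self.shared_with[consumer_id])}

    def erase(self, consumer_id):
        # Fan the deletion out to every system and recipient on record.
        targets = {system for system, _ in self.records.pop(consumer_id, set())}
        targets |= self.shared_with.pop(consumer_id, set())
        return sorted(targets)  # each of these must confirm deletion

inv = DataInventory()
inv.record("cust-42", "dealership-crm", "contact details")
inv.record("cust-42", "service-history-db", "vehicle telemetry")
inv.share("cust-42", "finance-partner")

print(inv.subject_access_report("cust-42"))
print(inv.erase("cust-42"))  # ['dealership-crm', 'finance-partner', 'service-history-db']
```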

Published Date : Jan 26 2018


Michelle Dennedy, Cisco | Data Privacy Day 2018

(screen switch sound) >> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at the place that you should be. Where is that, you say? LinkedIn's new downtown San Francisco headquarters, at Data Privacy Day 2018. It's a small but growing event. Talking really a lot about privacy. You know we talk a lot about security all the time. But privacy is this kind of other piece of security, and ironically it's often security that's used as a tool to kind of knock privacy down. So it's an interesting relationship. We're really excited to be joined by our first guest, Michelle Dennedy. We had her on last year, terrific conversation. She's the Chief Privacy Officer at Cisco and a keynote speaker here. Michelle, great to see you again. >> Great to see you, and happy privacy day. >> Thank you, thank you. So it's been a year, what has kind of changed on the landscape from a year ago? >> Well, we have this little thing called GDPR. >> Jeff: That's right. >> You know, it's this little old thing, the General Data Protection Regulation. It's been, it was enacted almost two years ago. It will be enforced May 25, 2018. So everyone's getting ready. It's not Y2K, it's the beginning of a whole new era in data. >> But the potential penalties, direct penalties. Y2K had a lot of indirect penalties if the computers went down that night. But this has significant potential financial penalties that are spelled out very clearly. Multiples of revenue. >> Absolutely. >> So what are people doing? How are they getting ready? Obviously, Y2K, great example. It was a scramble. No one really knew what was going to happen. So what are people doing to get ready for this? >> Yeah, I think it's... I like that the analogy ends, because January one, after 2000, we figured it out, right? Or it didn't happen because of our prep work. In this case, we have had 20 years of lead time. 1995, 1998, we had major pieces of legislation saying know thy data, know where it's going, value it and secure it, and make sure your users know where and what it is. We didn't do a whole lot about it. There are niche market people, like myself, who said, "Oh my gosh, this is really important," but now the rest of the world has to wake up and pay attention, because four percent of global turnover is not chump change in a multi-billion dollar business, and in a small business it could be the only available revenue stream that you wanted to spend innovating-- >> Right, right >> rather than recovering. >> But the difficulty again, as we've talked about before, is not as much the companies. I mean obviously the companies have a fiduciary responsibility. But the people-- >> Yes. >> On the end of the data, will hit the EULA as we talked about before without thinking about it. They're walking around sharing all this information. They're logging in to public WiFis, and we actually even just got a note at theCUBE the other day asking us what our impact is, are we getting personal information when we're filming stuff that's going out live over the internet. So I think this is a kind of weird implication. >> I wish I could, like, feel sad for that, but there's a part of my privacy soul that's like, "Yes! People should be asking. "What are you doing with my image after this? "How will you repurpose this video? "Who are my users looking at it?" I actually, I think it's difficult at first to get started. But once you know how to do it, it's like being a nutritionist and a chef all in one. Think about the days before nutrition labels for food.
When it was first required, with very high penalties of the same quanta as the GDPR and some of these other Asian countries' laws, people simply didn't know what they were eating. >> Right. >> People couldn't take care of their health and look for gluten free, or vitamin E, or vitamin A, or omega whatever. Now, it's a differentiator. Now to get there, people had to test food. They had to understand sources. They had to look at organics and pesticides and say, "This is something that the populace wants." And look at the innovation: even something as basic and integral to your humanity as food, now we're looking at what is the story that we're sharing with one another, and can we put the same effort in to get the same benefits out. Putting together a nutrition label for your data, understanding the mechanisms, understanding the life cycle flow. It's everything, and is it a pain in the tuckus sometimes? You betcha. Why do it? A: You're going to get punished if you don't. But more importantly, B: It's the gateway to innovation. >> Right. It's just funny. We talked to a gal at a security show and she's got a 100% hit rate. She did this at Black Hat, social engineering access to anything. Basically by calling, being a sweetheart, asking the right questions and getting access to people's-- >> Exactly. >> So where does that fit in terms of the company responsibility, when they are putting everything, as much as they can, in place? Here, like at AWS too, you'll hear, "Somebody has a security breach at AWS." Well it wasn't the security of the AWS system, it was somebody didn't hit a toggle switch in the right position. >> That's right. >> So it's pretty complex, versus if you're a food manufacturer, hopefully you have pretty good controls as to what you put in the food and then you can come back and define. So it's a really complicated problem when it's the users who you're trying to protect that are often the people that are causing the most problems. >> Absolutely. And every analogy has its failures, right? >> Right, right. >> We'll stick with food for a while. >> Oh no, I like the food one. >> Alright, it's something you can understand. >> Y2K is kind of old, right. >> Yeah, yeah. But think about, like, have we made, I was going to use a brand name, a spray-on cheese chip, have we made that illegal? That stuff is terrible for your body. We have an obesity crisis here in North America certainly, and other parts of the world, and yet we let people choose what they're putting into their bodies. At the same time we're educating consumers about what the new food chart should look like, we're listening to maybe sugar isn't as good as we thought it was, maybe fat isn't as bad. So giving people some modicum of control doesn't mean that people are always going to make the right choices, but at least we give them a running chance by being able to test and separate and be accountable for at least what we put into the ingredients. >> Right, right. Okay, so what are some of the things you're working on at Cisco? I think you said before we went on the air you have a new report published, study, what's going on? >> I do. I'm ashamed, Jeff, to be so excited about data, but I'm excited about data. (laughter) >> Everybody's excited about data. >> Are they? >> Absolutely. >> Alright, let's geek out for a moment. >> So what did you find out?
>> So we actually did the first metrics reporting correlating data privacy maturity models, and asking customers, 3,000 customers plus in 20 different countries, from companies of all sizes, SMBs to very large corps: are you experiencing a slowdown based on the fears of privacy and security problems? We found that 68 percent of these respondents said yes indeed we are, and we asked them what is the average timing of slowing down closing business based on these fears. We found a big spread, from over 16 and a half weeks all the way down to two weeks. We said that's interesting. We asked that same set of customers, where would you put yourself on a zero to five, ad hoc to optimized, privacy maturity model. What we found was if you were even middle of the road, a three or a four, having some awareness, having some basic tools, you can lower your risk of loss by up to 70 percent. I'm making it sound like it's causation; it's just a correlation, but it was such a strong one that when we ran the data last year I didn't run the report, because we weren't sure enough. So we ran it again and got the same quantum with a larger sample size. So now I feel pretty confident that the self-reporting of data maturity is related to closing business more efficiently and faster on the up side, and limiting your losses on the down side. >> Right, so where are the holes? What's the easiest way to get from a zero or one to a three or a four, I don't even want to say three or four, two or three, in terms of behaviors, actions, things that people do? >> So you're scratching on my geeky legal underbelly here. (laughter) I'm going to say it depends, Jeff. >> Of course, of course. >> Couching this, and I'm not your lawyer. >> No forward-looking statements. >> No forward-looking statements. Well, what the heck. We're looking forward, not back. It really does depend on your organization. So, Cisco, my company, we are known for engineering. In fact, on the down side of our brand, we're known for having trouble letting go until everything is perfect. So, sometimes it's slower than you want 'cause we want to be so perfect. In that culture, my coming into the engineering org, with their bona fides and their pride in their brand, that's where I start to attack with privacy engineering education, and looking at specs and requirements for the products and services. So hitting my company where it lives, in engineering, was a great place to start to build in maturity. In a company like a large telco or healthcare or a highly regulated industry, come at it from the legal aspect. Start with compliance if that's what is effective for your organization. >> Right, right. >> So look at where you are in your organization and then hit it there first, and then you can fill up, document those policies, make sure training is fun. Don't be afraid to embarrass yourself. It's kind of my mantra these days. Be a storyteller, make it personal to your employees and your customers, and actually care. >> Right, hopefully, hopefully. >> It's a weird thing to say, right, you actually should give a beep >> Have a relationship with people. When you look at how companies moved that curve from last year to this year, was it a significant movement? Was it more than you thought, less than you thought? Is it appropriate for what's coming up? >> We haven't tracked individual companies time after time 'cause it's a double-blind study. So it's survey data. The survey numbers are statistically relevant.
That when you have a greater level of less ad hoc and more routinized systems, more privacy policies that are stated and transparent, more tools and technologies that are implemented, measured, tested, and more board-level engagement, you start to see that even if you have a cyber risk, the chances that it's over $500,000 per event go down precipitously. If you are at that kind of mid-range level of maturity, you can take off 70 percent of the lag time and go from about four months of closing a deal that has privacy and security implications to somewhere around two to three weeks. That's a lot of time. Time in business is everything. We run by the quarter. >> Yeah, well if you don't sell it today, you never get today back. You might sell it tomorrow, but you never get today back. Alright, so we just flipped the calendar. I can't believe it's 2018. That's a whole different conversation. (laughter) What are your priorities for 2018 as you look forward? >> Oh my gosh. I am hungry for privacy engineering to become a non-niche topic. We're going out to universities. We're going out to high schools. We're doing innovation challenges within Cisco to make innovating around data a competitive advantage for everyone, and come up with a common language. So that if you're a user interface guy, you're thinking about data control and the stories that you're telling about what the real value is behind your thing. If you are a compliance guy or girl, how do I efficiently measure? How do I come back again in three months without having compliance fatigue? Because after the first couple days of enforcement of GDPR, and some of these other laws come into force, it's really easy to say whew, it didn't hit me. I've got no problem now. >> Right. >> That is not the attitude I want people to take. I want them to take real ownership over this information. >> It's very analogous to what's happening in security. >> Very much so. >> Just baking it in all the way. It's not a walled garden. You can't defend the perimeter anymore, but it's got to be baked into everything. >> It's no mistake that it's like the security world. They're about 25 years ahead of us in data privacy and protection. My boss is our chief trust officer, who formerly was our CISO, and I am absolutely free-riding on all the progress the security people have made. We're just really complementing each other's skills, and getting out into other parts of the business in addition to the technical part of the business. >> Exciting times. >> Yeah, it's going to be fun. >> Well, great to catch up and >> Yeah, you too. >> We'll let you go. Unfortunately we're out of time. We'll see you in 2019. >> Data Privacy Day. >> Data Privacy Day. She's Michelle Dennedy, I'm Jeff Frick. You're watching theCUBE. Thanks for tuning in from Data Privacy Day 2018. (music)
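As a rough illustration of the correlation Michelle describes, the sketch below bins self-reported maturity scores against weeks of sales delay and compares the means. The sample rows are invented placeholders purely to show the computation; the real findings come from Cisco's survey, not from this code.

```python
# Illustrative only: bin self-reported privacy-maturity scores (0-5,
# ad hoc to optimized) against weeks of sales delay and compare means.
# The rows below are invented placeholders, not Cisco's survey data.

from statistics import mean

responses = [(0, 16.5), (1, 14.0), (1, 12.0), (2, 8.0),
             (3, 4.0), (3, 5.5), (4, 3.0), (5, 2.0)]

def mean_delay_by_band(rows):
    low = [weeks for score, weeks in rows if score <= 2]   # ad hoc end
    high = [weeks for score, weeks in rows if score >= 3]  # defined and up
    return mean(low), mean(high)

low, high = mean_delay_by_band(responses)
print(f"mean delay, maturity 0-2: {low:.1f} weeks")
print(f"mean delay, maturity 3+:  {high:.1f} weeks")
# A gap like this is a correlation, not causation, which is exactly the
# caveat Michelle applies to the survey itself.
```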

Published Date : Jan 26 2018


Denelle Dixon, Mozilla | Data Privacy Day 2017

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. It is Data Privacy Day, which I just found out has been going on for about 20 years, or 30 years, but we're happy to be at our very first one. We're in downtown San Francisco at the Twitter headquarters, it's a full-day event that's actually happening around the world, but we're here in San Francisco and excited to have some of the guests come down that are doing the panels and the discussions and the breakout sessions, and we're excited for our next guest, Denelle Dixon, Chief Legal and Business Officer from Mozilla, welcome! >> Thank you, happy to be here. >> So there was a spirited panel to kick off the day, I wonder if you could share some of your thoughts as to some surprises that came out of that conversation? >> So not so many surprises, but we talked a lot about IoT and just the Internet of Things, the web of things, whatever we're going to call it, and the data that's available as a result of that to companies, to governments, to lots of different entities, and whether consumers understand that, and the responsibilities that both the consumers and the technology companies have with respect to that data. >> And Mozilla, obviously, was right there at the big change to go to, you know, a graphical web interface, which was a sea change really in the internet and how it would interact with people. IoT represents that same kind of thing, and oh, by the way, people are things too, as we like to say on theCUBE, so as you kind of look at the new challenges faced by IoT, what are some of the things that bubble onto your priority list in terms of things that need to really be thought of that really people aren't thinking enough of now? >> I think that one of the most important things about IoT is the idea that this is information that's collected and used by devices and technology companies, because of the fact that it can be wearable, it can be things that you have in your house that collect data as you talk to them. One of the most important things, and just keeping Data Privacy Day in mind, is that we make sure that consumers are aware that this is actually happening, that data is being collected and sent, and how that data is being used. It used to be, back in the day, we could have privacy policies, so we put them up, 15 pages long, and assumed that users understood that. Well, that can't be used with respect to these kinds of devices, so we need to be innovative, we need to be creative, we need to be able to ask questions of these devices and have them tell us what's going on with the data that they collect and how they're doing that. So it's just as incumbent upon the technology companies that create these devices to ensure that users understand that, as it is upon the users to understand that these kinds of actions are happening and these trade-offs with respect to it. Really interesting, crazy, exciting in terms of the different technologies that we can use, but really important that we get this right. >> It just strikes me that, I think, so many people just click, yes I accept.
Are people really, I mean I'm sure some people are that are paying attention, but it just seems that most people just click and accept, click and accept, click and accept, especially if you've kind of got into that behavior pattern and haven't really thought about the way these applications are evolving, haven't really thought about Facebook on your laptop or on your PC at home, is different from Facebook on your mobile, they haven't really thought about, wow, what are these connected devices now collecting data, that as you said didn't even get the chance to opt in, so how do you educate people to make intelligent choices, and how do we, like, break the EULA up, maybe, so that I can opt in for if I want to share A, B, and C, but not D, E and F, and oh, I forgot, I really need F to make this thing function. It seems like a really complicated kind of disclosure problem. >> It is complicated, and that doesn't mean that we don't have to crack it. So you said the word EULA, that's the End User License Agreement, and I don't think we can live in a world of EULAs. I think we live in a world where we put in-context notices in place. We have to actually create them so that your interface, or whatever small thing that you have, is able to alert you that this data sharing is actually transpiring, so it has to be in context, it has to be creative, it has to be part of product development, it can't be an afterthought. Before, it used to be that they would hand this over to the lawyers and say, hey, can you help us figure out how to notify our users. This has to be part of our innovative process today. We're seeing more and more of it. We're seeing technology companies take this seriously and include privacy by design in their product development, make these in-context notices part of the way that they think about the product, and not just an afterthought, and so the more we do this, the better it's going to be for all of us, and just because it's hard, it means that it's a creative, thoughtful, amazing process that we all need to engage in. >> So one of the hot topics that we cover a lot is diversity in tech, and women in tech specifically, and not only is it the right thing to do, but there's very clearly defined positive business outcomes when you have a diversity of opinions when you're making decisions. Is there a corollary to what you're describing in terms of being more forthright in your privacy policy that's really not only the right thing to do question, which is fine, but is there a real business benefit that you can see, or that you project, that's going to be even a better motivator for people to start changing the behavior in the way in which they disclose or interact with people on the privacy issue? >> Yeah, I love the way you introduced that, because from my standpoint one of the things that we don't like to do, that we don't like to be in life, is surprised. And so, one of the most important things, if you think about everything, is a no-surprises rule. So if we start thinking about business and our engagement with our users as creating a no-surprises opportunity, it actually creates trust, it fosters deeper engagement, it makes it so that we are all going to be happier in terms of that relationship, maybe the users actually give more to the product, maybe the product can actually give more back to the user, so this no-surprises rule, and the way that we can operate, creates really nice business cycles and really nice interesting dynamics between consumers and the businesses that they use.
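One way to read Denelle's in-context notices is as a small piece of product plumbing: the notice fires the first time a feature touches a data category, instead of sitting in a 15-page agreement. Here is a minimal sketch under that assumption; the prompt wording and category names are hypothetical.

```python
# Sketch of an in-context notice: ask at the moment a feature first uses a
# data category, remember the answer, and make withdrawal just as easy.
# Prompt wording and categories are hypothetical.

class ConsentLedger:
    def __init__(self, ask):
        self.ask = ask      # callable that shows the in-context prompt
        self.choices = {}   # category -> True/False

    def allowed(self, category, why):
        if category not in self.choices:
            # First touch: surface a short, specific notice right now.
            self.choices[category] = self.ask(
                f"Share your {category}? We use it to {why}.")
        return self.choices[category]

    def withdraw(self, category):
        # No surprises: consent can be taken back as easily as it was given.
        self.choices[category] = False

def demo_prompt(message):
    print("[in-context notice]", message)
    return True  # simulate the user tapping "allow"

ledger = ConsentLedger(ask=demo_prompt)
if ledger.allowed("location", "show nearby results"):
    print("fetching nearby results...")
ledger.withdraw("location")
print(ledger.allowed("location", "show nearby results"))  # False now
```

The design point is the no-surprises rule in miniature: the question arrives at the moment of use, and taking the answer back is as easy as giving it.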
>> It's great, the trust word, and it also plays into kind of the services angle, in that everything is a service. Because when everything is a service, you have to maintain a solid, ongoing relationship, it's not a one-time purchase, adios, we're never going to see you again, and so that really plays into this. If it's a trusted service provider that you feel good about, you continue to pay that $9.95 to Spotify or whomever that service provider is, so it's a really different way of looking at the world. >> It is, and it's one of the things that we actually encouraged from the very outset, is this kind of creation of trust. Trust is really easy to lose with respect to your consumer base, and it's the most important thing as you're engaging. We created these initiatives called the lean data practices, and then we also have privacy initiatives that we put out there for startups and other entities that they can utilize and hopefully create for their businesses. Part of it is the no-surprises rule, but it's also: think about what data you want to collect, so that you actually are collecting what you need, throw away what you don't, anonymize it. Like really create that trusted relationship, because you can always grow. If you think, I actually need more data today than I did when I started a year ago, then it's a great way to have that conversation with your consumer base. So it's one of the things, trust starts it all. So from Mozilla's standpoint, we operate that through our products, because we definitely have that in our Firefox browser and the other products that we have on mobile, but one of the things that we care about is creating this awesome opportunity for the web to continue to grow, and so we care about how other companies are approaching this too. >> So you mentioned Firefox, and you guys have a new product coming out today, Firefox Focus, so explain to folks what is Firefox Focus, why should they care, what's different than just kind of traditional Firefox? >> Right, so we've had Focus on iOS before, and today we actually launched it in 27 languages in 27 different markets where you can get it. It's a privacy-focused browser, but it can also be performance-focused. So you actually have content you can exclude, some content doesn't get pushed through so that your performance is faster, and you can really focus on what kind of data you want to share with companies. So try it out, I think that it's an awesome experience, certainly from the standpoint of privacy but also from performance. >> So Denelle, 2017, we just flipped the calendar a few weeks ago, as you look forward in the year you probably went through your annual planning process, what are some of your priorities for 2017, what are you looking forward to that are top of your list for the next 12 months? >> So really, at the top, I run the policy, business, and legal teams at Mozilla. From a policy standpoint, really focused on encryption, security, privacy, looking at the new administration here in the US as well as what's happening in Europe. I think it's a really important area for us to focus on from a business standpoint. I want to see us really dive into growth with respect to Firefox as our desktop browser. I want to see our mobile space grow, and grow even outside the browser. So I'm really excited about what we can do there.
And then from the legal side, I want to continue to push the envelope on this no-surprises idea, with respect to doing that in more areas that we can with respect to our products, and pushing that idea out there too. >> I love that, no surprises, it's like a bumper sticker. (laughs) She's Denelle, I'm Jeff, you're watching theCUBE, see you next time.
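The lean data practices Denelle mentions boil down to a pipeline discipline: collect only the fields a feature needs, drop the rest before anything is stored, and pseudonymize direct identifiers. A minimal sketch of that discipline, with hypothetical field names:

```python
# Sketch of lean data practice: keep only the fields the feature actually
# needs, silently discard everything else, and hash direct identifiers
# before storage. All field names are hypothetical.

import hashlib

NEEDED_FIELDS = {"user_id", "page_views", "timestamp"}

def lean_ingest(raw_event: dict) -> dict:
    # Whitelist, don't blacklist: unneeded fields never reach storage.
    event = {k: v for k, v in raw_event.items() if k in NEEDED_FIELDS}
    # Pseudonymize the direct identifier so analytics never see it raw.
    event["user_id"] = hashlib.sha256(event["user_id"].encode()).hexdigest()[:12]
    return event

raw = {"user_id": "alice@example.com",
       "page_views": 7,
       "timestamp": "2017-01-28T07:00:00Z",
       "search_history": ["privacy day", "firefox focus"],  # dropped
       "home_address": "123 Main St"}                       # dropped

print(lean_ingest(raw))
```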

Published Date : Jan 30 2017


Eve Maler | Data Privacy Day 2017

>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We are in downtown San Francisco at the Twitter headquarters for a big event, the Data Privacy Day that's been going on for years and years and years. It's our first visit and we're excited to be here. And our next guest is going to talk about something that is near and dear to all of our hearts. Eve Maler, she's the VP Innovation and Emerging Technology for ForgeRock. Welcome. >> Thank you so much. >> Absolutely. So for people who aren't familiar with ForgeRock, give us a little background on the company. >> Sure. So, of course, the digital journey for every customer and consumer and patient and citizen in the world is so important, because trust is important. And so what ForgeRock is about is creating that seamless digital identity journey throughout cloud, mobile, internet of things, devices, across all of their experiences, in a trustworthy and secure way. >> So one of the topics that we had down in getting ready for this was OAuth. >> Yes. >> And as the proliferation of SaaS applications continues to grow, both within our home life as well as our work life, we have these pesky things called passwords, which no one can remember and they force you to change all the time. So along comes OAuth. >> Yes. So OAuth is one of those technologies... I'm kind of a standards wonk. I actually had a hand in creating XML, for those people who remember XML. >> Jeff: That's right. >> OAuth took a tack of saying, "Let's get rid of what's called the password anti-pattern. Let's not give out our passwords to third party services and applications." Instead we can just give those applications what's called an access token, meant just for that application. In fact, Twitter... We're here at Twitter headquarters. Twitter uses that OAuth technology. And I'm involved in a standard, being a standards wonk, that builds on top of OAuth, called user managed access. And it uses this so that we can share access with applications in the same way. And we can share access also with other people using applications. So for example, the same way we hit a share button in Google, Alice hits a share button to share access to a document with Bob. We want to allow every application in the world to be able to do that, not just GoogleDocs, GoogleSheets, and so on. So OAuth is powerful, and user managed access is powerful for privacy in the same way. >> Now there's OAuth, and I use my Twitter OAuth all the time. Or with Google. >> That's right. >> And then there's these other kind of third party tools which add kind of another layer. >> So you might use, like, Tweetbot is something I like to use on my phone to tweet. >> Jeff: Right, right. >> And so there's... >> Well, there's the Tweetbot. But then there's these pure, like, identity password manager applications which, you know, you load it into there and then... >> LastPass or something like that. >> Right, right, right. >>1Password people use, yeah. >> To me it's just like wow, that just seems like it's adding another layer. And if, oh my gosh, if I forget the LastPass password, I'm really in bad shape. >> You are. >> Not just the one application, but a whole bunch. I mean, how do you see the space kind of evolving to where we got to now? And how is it going to change going forward? It just fascinates me that you still have passwords when our phones have fingerprint. >> TouchID. >> Why can't it just work off my finger?
More and more, SaaS services and applications are actually becoming more sensitive to multifactor authentication, strong authentication, what we at ForgeRock would actually call contextual authentication, and that's a great way to go. So they're leveraging things like TouchID, like device fingerprint, for example. Recognizing that the device kind of represents you and your unique way of using the device. And in that way, we can start to do things like what's called a passwordless flow. Where it can, most of the time, or all of the time, actually not even use a password. And so, I don't know, I used to be an industry analyst and 75 percent of my conversations with folks like you would be about passwords. And more frequently, I would say now, we're getting into the topic of... people are more password savvy, and more of the time people are turning on things like multifactor authentication, and more of the time it knows the context, that I'm using my corporate WiFi, which is safer, or I'm using a familiar device. And that means I don't have to use the password as often. So that's contextual authentication. Meaning I don't have to use that insecure password so often. >> Jeff: Right. >> So I think the world has gotten actually a little bit smarter about authentication. I'm hoping. And actually, technologies like OAuth and the things that are based on OAuth, like OpenID Connect, which is an identity technology, a modern identity, federated identity technology. And things like user managed access are leveraging the fact that OAuth is getting away from having to use, if it was a password based authentication, not flinging the password around the internet, which is the problem. >> Right, right. Okay, so that's good, that's getting better, but now we have this new thing. Internet of things. >> Yes indeed. >> And people are things. But now we've got connected devices, they're not necessarily ones that I purchased, that I authorized, that I even maybe am aware of. >> Like a beacon on a wall, just observing you. >> Like a beacon on a wall and sensors, and the proliferation is just now really starting to run. So from a privacy point of view, how does kind of IoT that I'm not directly involved with compare to IoT with my Alexa, compare to applications that I'm actively participating in? How do those lines start to blur? And how do the privacy issues kind of spill over now into managing this wild world of IoT? >> Yeah, there's a couple of threads with the Internet of Things. And so I'm here today at this Data Privacy Day event to participate on a panel about the IoT tipping point. And there's a couple of threads that are just really important. One is the security of these devices is in large part a device identity theft problem, with this Dyn attack. In fact, that was an identity theft problem of devices. We had poorly authenticated devices. We had devices that have identities, they have identifiers, and they have secrets. And it was a matter of their own passwords being easily taken over. It was account takeovers, essentially for devices, that was the problem. And that's something we have to be aware of. So, you know, just like applications and services can have identities, just like people, we've always known that. It's something our platform can handle. We need to authenticate our devices better, and that's something manufacturers have to take responsibility for. >> Jeff: Right. >> And we can see the government agencies starting to crack down on that, which is a really good thing.
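As an aside, the contextual authentication Eve describes can be sketched as a scoring rule: combine familiarity signals and fall back to an explicit second factor only when the context looks unfamiliar. The signals, weights, and threshold below are hypothetical illustrations, not ForgeRock's actual engine.

```python
# Sketch of contextual authentication: score how familiar the request's
# context is, and only demand a step-up factor when the score is low.
# Signals, weights, and the threshold are hypothetical.

def context_score(request, known_devices, trusted_networks, usual_locations):
    score = 0
    if request["device_fingerprint"] in known_devices:
        score += 2   # a device we've seen this user on before
    if request["network"] in trusted_networks:
        score += 1   # e.g., the corporate WiFi
    if request["geo"] in usual_locations:
        score += 1
    return score

def authenticate(request, **ctx):
    if context_score(request, **ctx) >= 3:
        return "allow"     # passwordless flow: context alone is enough
    return "step-up"       # ask for TouchID or a one-time code

req = {"device_fingerprint": "fp-abc123", "network": "corp-wifi",
       "geo": "San Francisco"}
print(authenticate(req,
                   known_devices={"fp-abc123"},
                   trusted_networks={"corp-wifi"},
                   usual_locations={"San Francisco"}))  # allow
```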
The second thing is there's a saying in the healthcare world for people who are working on patient privacy rights, for example. And the saying is, no data about me without me. So there's got to be a kind of a pressure, you know, we see whenever there's a front page news article about the latest password breach. We don't actually see so many password breaches anymore as we see this multifactor authentication come into play. So that's the industry pressures coming into play. Where passwords become less important because we have multifactor. We're starting to see consumer pressure saying, I want to be a part of this. I want you to tell me what you shared. I want more transparency, and I want more control. And that's got to be part of the equation now when it comes to these devices. It's got to be not just more transparent, but what is it you're sharing about me? >> Jeff: Right. >> Last year I actually bought, maybe this is TMI, I always have this habit of sharing too much information, >> That's okay, we're on theCUBE, we like >> Being honest here. >> To go places other companies don't go. >> I bought one of those adjustable beds that actually has an air pump that... >> What's your number? Your sleep number. >> It is, it's a Sleep Number bed and it has a feature that connects to an app that tells you how well you slept. You look at the terms and conditions and it says we own your biometric data, we are free to do whatever we want. >> Where did you even find the terms and conditions? >> They're right there on the app, to use the app. >> Oh in the app, in the app. >> You have to say yes. >> So you actually read before just clicking on the box. >> Hey, I'm a privacy pro, I've got to. >> Right, right, right. >> And of course, I saw this, and to use the feature, you have to opt in. >> Right. >> This is the way it is. There's no choice, and they probably got some lawyer... This is the risk management view of privacy. It's no longer tenable to have just a risk management view, because the most strategic and the most robust way to see your relationship with your customers is to realize there are two sides to the bargain, because businesses are commoditized now. There are low switching costs to almost anything. I mean, I bought a bed, but I don't have to have that feature. >> Do you think, do you think they'll break it up? So you want the bed, you're using a FitBit or something else to tell you whether you got a good night's sleep or not. Do you see businesses starting to kind of break up the units of information that they're taking, and can they deliver an experience based on a fragmented selection? >> I do believe so. So, user managed access and certain technologies like it, standards like it, there's a standard called consent receipts. They're based on a premise of being able to now deliver convenient control to users. There's even, so there's regulations that are coming, like the General Data Protection Regulation in the EU. It's bearing down on pretty much every multinational, every global enterprise that monitors or sells to an EU citizen. That's pretty much every enterprise. >> Jeff: Right, right. >> That demands that individuals get some measure of the ability to withdraw consent in a convenient fashion. So we've got to have consent tech that measures up to the policy that these >> Right. >> organizations have to have. So this is coming whether we sort of like it or not. But we should have a robust and strategic way of exposing to these people the kind of control that they want anyway.
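A consent receipt, as Eve describes it, is a durable record of who agreed to what, for which purpose, plus a convenient way to take it back. Below is a minimal sketch of that idea; the field names are hypothetical and only loosely inspired by the consent receipts standard she mentions, not the actual specification.

```python
# Sketch of a consent receipt registry: record each grant with its purpose
# and data categories, give the subject a receipt ID as their copy, and
# make withdrawal one call. Field names are hypothetical.

import uuid
from datetime import datetime, timezone

class ConsentRegistry:
    def __init__(self):
        self.receipts = {}

    def record_consent(self, subject, controller, purpose, data_categories):
        receipt_id = str(uuid.uuid4())
        self.receipts[receipt_id] = {
            "subject": subject, "controller": controller,
            "purpose": purpose, "data_categories": data_categories,
            "granted_at": datetime.now(timezone.utc).isoformat(),
            "withdrawn_at": None}
        return receipt_id  # the subject keeps this as their copy

    def withdraw(self, receipt_id):
        # GDPR-style requirement: withdrawing must be as convenient
        # as granting was.
        self.receipts[receipt_id]["withdrawn_at"] = \
            datetime.now(timezone.utc).isoformat()

    def active(self, receipt_id):
        receipt = self.receipts.get(receipt_id)
        return bool(receipt and receipt["withdrawn_at"] is None)

registry = ConsentRegistry()
rid = registry.record_consent("alice", "sleep-app-vendor",
                              "sleep quality reports", ["biometric data"])
print(registry.active(rid))  # True
registry.withdraw(rid)
print(registry.active(rid))  # False
```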
>> Jeff: Right. >> They all tell us they want it. So in essence, personal data is becoming a joint asset. We have to conceive of it that way. >> So that's in your... So that's in your sleep app, but what about the traffic cameras and the public facility? >> Yeah. >> I mean, they say in London, right, you're basically on camera all the time. I don't know if that's fact or not, but clearly there's a lot >> That's true, CCTV, yeah. >> Of cameras that are tracking your movements. You don't get a chance to opt in or out. >> That is actually true, that's a tough case. >> You don't know. >> The class of... Yeah. The class of beacons. >> And security, right. Obviously, post 9/11 world, that's usually the justification for we want to make sure something bad doesn't happen again. We want to keep track. >> Yeah. >> So how does kind of the government's role in that play? And even in the government, then you have, you know, all these different agencies, whether it's the traffic agency or even just a traffic camera that maybe KCBS puts up to keep track of, you know, it says slow down >> Yeah. >> Between two exits. How does that play into this conversation? >> Yeah, where you don't have an identified individual. And not even an identifiable individual, these are actually terms if you look at GDPR, which I've read closely. It is a tougher case, although I have worked... One of the members of my user managed access working group is one of the sort of experts on UK CCTV stuff. And it is a very big challenge to figure out. And governments do have a special duty of care to figure this out. And so the toughest cases are when you have beacons that just observe passively. Especially because the incentives are such that, I will grant you, the incentives are such that, well, how do they go and identify somebody who's hard to identify, and then go inform them and be transparent about what they're doing? >> Jeff: Right, right. >> So in those cases, even heuristically identifying somebody is very, very tough. However, there is a case where iBeacons in, say, retail stores do have a very high incentive to identify their consumers and their retail customers. >> Right. >> And in those cases, the incentives flip in the other direction, towards transparency and reaching out to the customer. >> Yeah. Speaking of these things, someone who I will not name recently got a drive-through-a-red-light ticket. >> Yep. >> And the clarity of the images that came in that piece of paper that I saw was unbelievable. >> Yes. >> So I mean, if you're using any kind of monitoring equipment, the ability to identify is pretty much there. >> Now we have cases... So this just happened, actually I'm not going to say, do I say it was to me or to my husband? It was in a non-smart car in a non-smart circumstance, where simply a red light camera takes a picture of an identified car, so you've got a license plate and that binds it to a registered owner of a car. >> Right. >> Now I have a car that's registered in the name of a trust. They didn't get a picture of the driver. They got a picture of the car. So now here we can talk about, let's translate that from a dumb car circumstance, registered to a trust, not to an individual; they sent us what amounted to a parking ticket, 'cause they couldn't identify the driver. So now that gives us an opportunity to map that to an IoT circumstance. Because if you've got a smart device, you've got a person, you've got a cloud account.
What you need is the ability, in a responsible, secured fashion, to bind a smart device to a person and their cloud account. And the ability to unbind. So now we're back to having an identity centric architecture for security and privacy that knows how to... I'll give you a concrete example, let's say you've got a fleet vehicle in a police department. You assign it to whatever cop is on the beat. And at the end of their shift, you assign the car to another cop. What happens on one shift and what happens on another shift is a completely different matter. And it's a smart car, maybe it's a cop who has a uniform with some sort of camera, you know, a body cam. That's another smart device, and those body cams also get reassigned. So you want whatever was recorded, in the car, on the body cam, with the cop, and with whatever online account it is, you want the data to go with the cop, only when the cop is using the smart devices that they've been assigned, and you want the data for somebody else to go with the somebody else. So in these cases, the binding of identities and the unbinding of identities is critical to the privacy of that police person. >> Jeff: Right, right. >> And to the integrity of the data. So this is why I think of identity centric security and privacy as being so important. And we actually say, at ForgeRock, that identity relationship management is so key. >> And whether you use it or not, it is really kind of after the fact, being able to effectively tie the two together. >> You have to look at the relationships in order to know whether it's viable to associate the police person's identity with the car identity. Did something happen to the car on the shift? Did something happen in the view of the camera on the shift? >> Right, right. And all this is underlaid by trust, which has come up in a number of these interviews today. And unfortunately we're in a situation now, if you read all the surveys. And the government particularly, these are kind of the more crazy cases, 'cause businesses can choose to or not to, and they've got a relationship with the customer. But on the government side, where there's really no choice, right, they're there. Right now, I think we're at a low point on the trust factor. >> Indeed. >> So how is that, and if you don't trust, then these things are seen as really bad, as opposed to if you do trust, and then maybe they're just inconvenient or they're not quite worked out all the way. So as this trust changes, and fake news and all this other stuff going on right now, how is that impacting the implementation of these technologies? >> Well, ask me if I said yes to the terms and conditions. (laughter) Of the sleep app, right. I mean I said yes, I said yes. And I didn't even ask for the app, you know, my husband signed up for the free trial. >> Just showed up on my phone. 'Cause I was in proximity to the bed, right? >> I said this one on stage at RSA, so this is not news. I'm not breaking news here. But you know, consumers want the features, they want convenience, they want value. So it's unreasonable, I believe, to simply mount an education campaign and thereby change the world. I do think it's good to have general awareness of what to demand, and that's why I say no data about me without me. That's what people should be demanding, is to be let in to the loop. Because that gives them more convenience and value. >> Right. >> They want share buttons. I mean, we saw that with the initial introduction of CareKit with Apple.
Because that enabled what people who are involved in user managed access (we call ourselves Umanitarians) like to call Alice to Bob sharing, that's the use case. >> Jeff: Okay. >> And it enabled Alice to Dr. Bob sharing. That's a real use case. And IoT kind of made that use case real. With web and mobile and APIs, I don't think we thought about it so much as a positive use case, although in healthcare it's been a very real thing with EHR. You know, you can go into your EHR system and you can see it, you can share with a spouse your allergy record or something, it's there. >> Right, right, right. >> But with IoT, it's a really positive thing. I've talked to folks in my day job about sharing access to a connected car with a remote user. You know, we've seen the experiments with let somebody deliver a package into the trunk of my car, but not get access to driving the car. These are real. That's better than... >> I've heard that one, actually. >> Saving a little money by having smart light bulbs is not as good as, you've got an Airbnb renter and you want to share limited access to all your stuff while you're away with your renter, and then shut down access after you leave. That's a UMA use case, actually. And that's good stuff. I could make money >> Jeff: Right. >> off of sharing that way. That's convenience and value. >> I just heard the other day that Airbnb is renting a million rooms a night. >> There you go. >> So not insignificant. >> So once you have... You have a home that's bristling with smart stuff, you know. That's when it really makes sense to have a share button on all that stuff. It's not just data you're sharing. >> Well, Eve, we could go on and on and on. >> Apparently. >> Are you going to be at RSA in a couple of weeks? >> Absolutely. >> Absolutely. >> I'm actually speaking about consent management. >> Alright, well maybe we'll see you there. >> That would be great. >> But I want to thank you for stopping by. >> It's a pleasure. >> And I really enjoyed the conversation. >> Me too, thanks. >> Alright, she's Eve, I'm Jeff, you're watching theCUBE. We'll catch you next time, thanks for watching. (upbeat music)
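Two ideas from this conversation fit one sketch: the bind/unbind of a smart device to a person (the fleet car and body cam), and the scoped, time-limited Alice-to-Bob grant (the package in the trunk, the Airbnb renter). This is a toy model under stated assumptions, not ForgeRock's product API and not the actual UMA protocol, which is an OAuth-based standard involving authorization servers and resource servers; every name below is invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Grant:
    """One Alice-to-Bob delegation: a resource owner gives a specific
    party limited scopes on a device, with an expiry."""
    grantee: str
    device: str
    scopes: frozenset          # e.g. {"trunk:open"}, but not {"drive"}
    expires: datetime

class DeviceRegistry:
    """Toy registry that binds devices to people, supports the
    end-of-shift unbinding from the fleet-car example, and hands out
    scoped, time-limited grants."""

    def __init__(self) -> None:
        self.bindings: dict = {}     # device -> currently bound person
        self.grants: list = []

    def bind(self, device: str, person: str) -> None:
        self.bindings[device] = person

    def unbind(self, device: str) -> None:
        # End of shift: data recorded after this no longer attaches
        # to the previously bound person.
        self.bindings.pop(device, None)

    def share(self, device: str, grantee: str, scopes: set, hours: float) -> Grant:
        grant = Grant(grantee, device, frozenset(scopes),
                      datetime.now(timezone.utc) + timedelta(hours=hours))
        self.grants.append(grant)
        return grant

    def allowed(self, grantee: str, device: str, scope: str) -> bool:
        now = datetime.now(timezone.utc)
        return any(g.grantee == grantee and g.device == device
                   and scope in g.scopes and g.expires > now
                   for g in self.grants)

registry = DeviceRegistry()
registry.bind("car-123", "alice")
# Package delivery: trunk access only, for two hours -- not driving.
registry.share("car-123", "courier-bob", {"trunk:open"}, hours=2)
assert registry.allowed("courier-bob", "car-123", "trunk:open")
assert not registry.allowed("courier-bob", "car-123", "drive")
registry.unbind("car-123")   # shift over, or the renter checks out
```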

Published Date : Jan 28 2017


Michael Kaiser | Data Privacy Day 2017


 

>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for Data Privacy Day. An interesting collection of people coming together here at Twitter to talk about privacy, the implications of privacy... And I can't help but think back to the classic Scott McNealy quote, right, "Privacy is dead, get over it", and that was in 1999. Oh, how the world has changed, most significantly obviously mobile phones, with the release of the iPhone in 2007. So we're excited to really kind of have the spearhead of this event, Michael Kaiser. He's the executive director of the National Cyber Security Alliance, in from Washington D.C. Michael, great to see you. >> Thanks for having us in. >> For the folks that aren't here, what is kind of the agenda today? What's kind of the purpose, the mission? Why are we having this day? >> Well, Data Privacy Day actually comes to us from Europe, from the EU, which created privacy as a human right back in 1981. We've been doing it here in the United States since around 2008. NCSA took over the effort in 2011. The goal here really is just to help educate people, people and businesses as well, about the importance of respecting privacy, the importance of safeguarding information, people's personal data. And then really hopefully with an end goal of building a lot more trust in the ecosystem around the handling of personal data, which is so vital to the way the internet works right now. >> Right, and it seems like obviously companies figured out the value of this data long before individuals did, and there's a trade for service. You use Google Maps, you use a lot of these services, but does the value exchange necessarily, is it equal? Is it at the right level? And that seems to be kind of the theme of some of these privacy conversations. You're giving up a lot more value than you're getting back in exchange for some of these services. >> Yeah, and we actually have a very simple way that we talk about that. We like to say that personal information is like money and that you should value it and protect it. And so, trying to encourage people and educate people to understand that their personal information does have value and there is an exchange that's going on. They should make sure that those transactions are ones that they're comfortable with, in terms of giving their information and what they get back. >> Right, which sounds great, Michael, but then you know you get the EULA, you know you sign up for these things and they don't really give you the option. You can kind of read it, but who reads it? Who goes through? You check the box and you move on. And/or you get this announcement, we changed our policy, we changed our policy, we changed our policy. So, I don't know if realistic is the right word, but how do people kind of navigate that? Because, let's face it, my friends told me about Uber, I want to get an Uber. I download Uber. I'm stuck in a rainy corner in D.C. and I hit go and here comes the car. I don't really dig into the meat. Is there an option? I mean there's not really, I opt for privacy one, two, three and I'm opting out of five, six, seven. >> Yeah, I think we're seeing a little bit more granular controls for people on some of these things now, but I think that's what we'd advocate for more. When we talk to consumers they tell us mostly that they want to have better clarity about what's being collected about them, better clarity about how that information's being used, or if it's, how it's being shared.
Equally importantly, if there are controls, where are they, how easy are they to use, and making them more prominent so people can engage in sort of making the services tailored to their own sort of privacy profile. I think we'd like to see more of that for sure, more companies being a little more forthcoming. Yeah, you have the big privacy policy that's a long complicated legal document, but there may be other ways to create interfaces with your customers that make some of the key pieces more apparent. >> And do you see a trend where, because you mentioned in some of the notes that we prepared that privacy is good for business and potentially is a competitive differentiator. Are you starting to see where people are surfacing privacy more brightly so that they can potentially gain the customer, gain respect of the customer, the business of the customer, over potentially a rival that's got that buried down? Is that really a competitive lever that you see? >> Well, I think you see some extremes. So you see some companies that say we don't collect any information about you at all, so that's part of, out there, and I think they're marketing to people who have extreme concerns about this. But I also think we're seeing again some places where there are more higher profile abilities to control some of this data, right. Even in, you know, places like the mobile setting, where sometimes you'll just get a little warning saying oh, this is about to use your location, is that okay, or your location is turned off, you need to turn it back on in order to use this particular app. And I think those kinds of interfaces with the user of the technology are really important going forward. We don't want people overwhelmed, like every time you turn on your phone you're going to have to answer 17 things in order to get to do x, y, and z, but making people more aware of how the apps are using the information they collect about you I think is actually good for business. I think actually sometimes consumers get confused because they'll see a whole list of permissions that need to be provided and they don't understand how those permissions apply to what the app or service is really going to do. >> Right, right. >> Yeah, that's an interesting one. I was at a, we were at Grace Hopper in October, and one of the keynote speakers was talking about how mobile data has really changed this thing, right, because once you're on your mobile phone it uses all the capabilities that are native in the phone in terms of geolocation, accelerometer, etc. All these things that a lot of people probably didn't know were different on the mobile Facebook app than were on the desktop Facebook app. Let's face it, most of this stuff is mobile these days, certainly with the younger kids. As you said, and that's an interesting tack, why do you need access to my contacts? Why do you need access to my pictures? Why do you need access to my location? And then the piece that I'm curious to get your opinion on: will some of the value come back to the consumer in terms of I'm not just selling your stuff, I'm not monetizing it via ads, I'm going to give some of that back to you? >> Yeah, I think there's a couple things there. One quick point on the other issue there, without naming names, I was looking at an app and it said it had to have access to my phone, and I'm like, why would this app need access to my phone? And then I realized later, well, it needs access to my phone because if the phone rings it needs to turn itself off so I can answer the phone.
But that wasn't apparent, right? And so I think it can be confusing to people, like maybe it's innocuous in some ways. Some ways it might not be, but in that case it was like, okay, yeah, because if the phone rings I'd rather answer my phone than be looking at the app. >> Right, can I read it or can I just see it? You know, the degree of the access too is very confusing. >> Yeah, and I think in terms of the other issues that you're raising here about the value exchange on data, I think the internet of things is really going to play a big role in this, because it's really... You know, in the current world it's about, you know, data, delivering ads, those kinds of things, making the experience more customized. But in IoT, where you're talking about wearables or fitness or those kinds of things, or thermostats in your home, your data really drives that. So in order for those devices to really work well they have to have data about you. And that's where I think customers will really have to give great thought to, you know, is that a good value proposition, right? I mean, do I want to share my data about when I come and leave every day just so my thermostat, you know, can turn on and off. And I think those, you know, can be conscious decisions about when you're implementing that kind of technology. >> Right, so there's another interesting tack I'd love to get your opinion on. You know, we see Flo from the Progressive commercials advertising to stick the USB in your cigarette lighter and we'll give you cheaper rates because now we know if you stop at stop signs or not. What's funny to me is that phone already knows whether you stop at stop signs or not, and it already knows that you take 18 trips to 7-Eleven on a Saturday afternoon and you're sitting on your couch the balance of the time. As that information that's there somehow gets exposed and potentially runs into, say, a healthcare mandated requirement from the company that you must wear Fitbits, so now we know you're spending too much time at the 7-Eleven and on your couch, and how that impacts your health insurance and stuff. And that's going to crash right into HIPAA. It just seems like there's this huge kind of collision coming from, you know, I can provide better service to people at the good end of the scale, and say aggregated risk models, but then what happens to the poor people at the other end? >> Well, I think that's why you have to have opt in, right? I think you can't make these things mandatory necessarily. And I think people have to be extremely aware of when their data is being collected and how it's being used. And so, you know, the example of like the car insurance, I mean they can only, really should only be able to access that data about where you're going if you sign up to do that, right? And if they want to say to you, hey Michael, we might give you a better rate if we can track your, you know, driving habits for a couple of weeks, then that should be my choice, right, to give that data. Maybe my rates might be impacted if I don't, but I can make that choice myself and should be allowed to make that choice myself. >> So it's funny, the opt in and opt out, so right now from your point of view, what do you see in terms of the percentage of kind of opt in, opt out on these privacy issues? Where is it and where should it be? >> Well, I would like to see some more granular controls for the consumer in general, right. I would like to see...
And as I said a little bit earlier, a lot more transparency and ease of access to what's being collected about you and what's being used. You know, outside of the formal legal process, obviously, you know, companies have to follow the law. They have to comply. They have to, you know, write these long EULAs or privacy policies in order to really reflect what they're doing. But they should be talking to their customers and understanding, what's the most important thing that you want to know about my service before you sign up for it? And help people understand that and navigate their way through it. And I think in a lot of cases consumers will click yeah, let's do it, but they should do that really knowingly. If you're opting in, it should be done with true consent, right. >> Okay, so before I let you go, just share some best practices, tips and tricks, you know, kind of at least the top level what people should be thinking about, what they should be doing. >> Yeah, so we really, you know, in this kind of space we look at a couple things. One, personal information is like money: value it and protect it. That really means being thoughtful about what information you share, when you share it, who you share it with. Own your online presence, this is really important. Consumers have an active role in how they interact with the internet. Use the settings that are there, right. Use the safety and security or privacy and security settings that are in the services that you have. And then, actually, a lot of this is behavioral. What you share about yourself is really important, so share with care, right. I mean, be thoughtful about the kinds of information that you put out there about yourself. Be thoughtful about the kind of information that you put about your friends and family. Realize that every single one of us in this digital world is entrusted with personal information about people much more than we used to be in the past. We have that responsibility to safeguard what other people give to us, and that should be the common goal around the internet. >> I think we have to have you at the bullying and harassment convention down the road. Great insight, Michael, and really appreciate it. Have a great day today. I'm sure there's going to be a lot of terrific content that comes out. And for people to get more information, go to the National Cyber Security Alliance. Thanks for stopping by. >> Thank you for having us. >> Absolutely. He's Michael Kaiser. I'm Jeff Frick. You're watching theCUBE, thanks for watching.

Published Date : Jan 28 2017


Lisa Ho | Data Privacy Day 2017


 

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters at the Data Privacy Day Event. It's a full day event with a lot of seminars and presentations, really talking about data privacy, something that's getting increasingly important every day, especially as, we know, RSA's coming up in a couple of weeks, and there's a lot of talk about phishing and increased surface area of attack, et cetera, et cetera. So privacy is really important, and we're excited to have Lisa Ho, Campus Privacy Officer at UC Berkeley. Welcome, Lisa. >> Thank you, glad to be here. >> So what does the Campus Privacy Officer do? >> Well, really anything that has to do with privacy that comes across. So making sure that we're in compliance, or doing what I can to help the campus keep in compliance with privacy laws. But beyond that, also making sure that we stay aligned with our privacy values, and when I say that, I mean, privacy is really important. It's critical for creativity and for intellectual freedom. So at the university, we need to make sure we hold on to those when we're dealing with new ideas and new scenarios that are going to come up. We have to balance privacy with all the other priorities and obligations we have. >> Yeah, I don't know if Berkeley and Stanford get enough credit as really being two of the real big drivers of Silicon Valley. It attracts a lot of smart people. They come, they learn, and then more importantly, they stay. So you've got a lot of cutting edge innovation, you've got a ton of open source technologies that have come out of Berkeley over the years. Spark, et cetera. So you guys are really at the leading edge, but at the same time, you're an old, established academic institution, so what role do you have formally as an academic institution of higher education to help set some of these standards and norms as the world is changing around it so very, very quickly? >> Yeah, well, so as I say, the environment needs to be set for creativity and for allowing that intellectual freedom. So when we think about the university, the things that we do there are pretty much what we want to have in the community as a whole, and in our culture and environment. So some of the things that we think about particularly, first, if you talk about, think about school, you think about grades, or you think about the letter evaluations that you get. Learning, when you come down to it, is a personal endeavor, you developing internally. It's a transformation that's internal. And so what kind of feedback you get, what kind of critical evaluation, those need to be done in an area where you have the privacy to not have a reputation to either live up to or live down. Those are things that you keep secret or keep private, and that's why school information and student data is so, as we've agreed as a society, that that's something that needs to stay private. So that's one area: learning is personal. That's why the university is so important in that discussion. And secondly, I'd say, as we talked about, creativity requires time to develop and it requires freedom for taking risks. So whether you're working on a book, or whether it's a piece of art, or if you're a scientist, a formula, any kind of algorithm, a theory. Those are things that you need time to set aside and to be in your own head, without the eyes of others until you're ready, not being judged before it's ready for release.
And those are the kinds of things where you want to have space for creativity, so that you can move beyond the status quo and take those risks to go somewhere, to the next space and beyond. >> Jeff: Right. >> And I think lastly, I'd say that, this is not specific to the university, but where we hold particularly at Berkeley, are the fundamental rights that we have, and privacy is one of those fundamental rights. As Ed Snowden said so famously, saying I don't care about privacy because I have nothing to hide is like saying I don't care about freedom of speech because I have nothing to say. So just because you may not have something to say doesn't mean that you can take away the rights of someone else, and you may find that you need those at some point in your life in the future, and no one has to justify why they need a fundamental right. So those things that are essential, that come out in our university environment, that we think of a lot, are things that are applicable beyond just the learning space of the university, to the kind of society that we want to build. That's why the university's in the space to lead in these areas. >> Right, 'cause Berkeley's got a long history, right, of activism, and this goes back for decades and decades. I mean, is privacy starting to get elevated to the level that you're going to see more active, vocal points of view and statements, and I don't want to say marches, but potentially marches, in terms of making sure this is taken care of? Because unfortunately, I think most privacy applications, at least historically, maybe it's changing, are really opt out, you know, not opt in. So do you see this? Is it becoming a more important kind of policy area versus just kind of an execution detail on an application? >> Yeah, we have a lot of really great professors working on these ideas around privacy and cybersecurity; those that are working on security and other things also have privacy in their background and are advocating in that area as well. As far as marches, well, you pretty much rely on the students for that, and you can't dictate what the students are going to find as important. But there are. There's definitely a cadre of students that care and are interested in these topics, and when you tie them together with the fundamental rights like free speech and academic freedom and creativity, that's where it becomes important and people get interested in that. >> Right. One of the real sticky areas that this bounces into is security, and unfortunately there's been way too many instances at campuses over the last several years of crazy people grabbing a gun and shooting people, which, you know, hopefully won't happen today. And that's really kind of where the privacy and security thing runs up against: should we have known? Should we have seen this person coming? If we had had access to whatever they're doing, maybe we would have known and been able to prevent it. So when you look at kind of the, I don't want to say balance, but really the conflict between security and privacy, what are some of the rules coming out? How do you guys execute that to both provide a safe environment for people to study and learn and grow, as you mentioned, but at the same time keep an eye out for, unfortunately, the bad characters in the world? >> Right, yeah, well, I don't want to say that there's a dichotomy.
I don't want to create a false dichotomy of it's either privacy or it's security; that's not the frame of mind that we want to be in. It's important for both, and security is clearly important. Preventing unauthorized access to information, or your personal information, is clearly a part of privacy, and so that's necessary for privacy, and those are things that you would do to protect privacy. The two factor authentication and the antivirus and the network segmentation, those are all things that are important parts of protecting privacy as well. So it's not a dichotomy of one or the other, but there are things that you do for security purposes, whether it's cybersecurity or the other kind of security, personal security, that maybe, in a conflict, have a different purpose than what you would do for privacy, and monitoring is one of those areas specifically. When you're monitoring for attacks in particular, now we have continuous monitoring for any kind of attacks, or we use that monitoring data as a forensic place to look for information after the fact. Those are things that really lie in contrast with the idea in privacy of least perusal, of not looking, and not looking for information until you need it, so having that distance, in privacy, of not having surveillance. So what we're coming to: the University of California has outlined a privacy balancing analysis that's necessary for these kinds of scenarios that are new and untested, when we don't have laws around them, to balance the many priorities and obligations, and what you need to do is look at what the security provides, look at the benefits together with the risks, and do that balancing. And so you need to go through a series of questions. What is the utility that you're really getting out of that monitoring, and not just in the normal scenario of how you're expecting to use it. But what about the use cases that maybe you didn't expect, but you can anticipate that it'll be wanted for those reasons, or what about when we're required to turn it over for a subpoena or another kind of letter? What are the use cases in that? What are the privacy impacts in those cases? What are the privacy impacts if it's hacked, or what are the privacy impacts of an abuse by an employee? What are the privacy impacts of sharing it with partners? So you need to balance the utility together with the impact, and look at those differences, and then also look at, what's the scope of that? Does the scope change? If you change the scope of what you're monitoring, does it change the privacy impact? Does it change the utility? When you look at those kinds of factors and keep them all in line, not just looking at what's the utility of what you're trying to do, but what are the other impacts in the privacy analysis, and then what are the alternatives, where you could do the same thing, and are they appropriate? Do they give you the same kind of value that the proposed monitoring provides? Keeping transparent about and keeping accountable to what you're doing is really, when it comes down to it, the key once you've done that analysis, making sure that you've looked through those questions of, are you doing the least amount of perusal necessary to achieve the goals that you're trying to accomplish with that monitoring? And what about transparency and accountability, coming back to whatever your decisions are, making those available to the community that's being monitored.
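The balancing analysis Lisa walks through is a policy instrument, not software, but its questions are structured enough to lay out as a checklist. A hedged sketch only: the field names below are invented here to organize the questions she lists (utility, privacy impact across expected and unexpected use cases, scope, alternatives, transparency, accountability), and this is not UC's actual form.

```python
from dataclasses import dataclass, field

@dataclass
class BalancingAnalysis:
    """Checklist-style record of the questions in the analysis."""
    proposal: str
    utility: str                     # what the monitoring actually provides
    privacy_impacts: dict = field(default_factory=dict)
    scope_notes: str = ""            # does narrowing scope change either side?
    alternatives: list = field(default_factory=list)
    transparency_plan: str = ""      # publishing decisions to the monitored community
    accountability_plan: str = ""    # auditing that perusal stays minimal

analysis = BalancingAnalysis(
    proposal="continuous network monitoring for attack forensics",
    utility="detect attacks; forensic record after an incident",
    privacy_impacts={
        "expected use": "...",
        "subpoena or legal demand": "...",
        "breach of the monitoring data itself": "...",
        "abuse by an employee": "...",
        "sharing with partners": "...",
    },
    alternatives=["shorter retention", "narrower capture", "on-demand only"],
    transparency_plan="publish the analysis to the monitored community",
    accountability_plan="periodic review of least-perusal compliance",
)
```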
>> Wow, well, one, you've got job security, I guess, for life, because that was amazing. Two, as you're talking, balance is the word I was looking for before, so that is the right word. But you're balancing on so many axes, and even once you get through the axes, that list that you just went through is phenomenal, then you still need to look at the alternatives, right? And do the same kind of analysis for each. So really, that was a great explanation. So I want to shift gears a little bit and talk about wearables. You're going to give a talk later on today about the wearables. Wearables are a whole new kind of interesting factor now that provide a whole bunch more data, really kind of the cutting edge of the internet of things with sensor data. People are things too, we like to say on theCUBE. So as you look at the wearables and the impact of wearables on this whole privacy conversation, what are some of the big gotcha issues that are really kind of starting to be surfaced as these things get more popular? >> Yeah, I think a lot of the same kinds of questions, around what kind of monitoring you're doing, what's the utility, what is the privacy impact, and how do you balance those in the various scenarios, the use cases that come up; really the same kinds of questions apply here as they do to cybersecurity monitoring. We're finding, I think, in college athletics, that the university sponsored use of wearable technology is really just in its infancy right now. It's not a big thing that we're working on. But it ties in so much, as it very much parallels the other kinds of questions that we are talking about around learning data, and how you jump or how your body functions is very private, very intimate. How you think, how you learn, that's right up there on that privacy and intimacy scale. So we're looking very much, and we've been talking quite a bit in the university space, about learning data and how we protect that. Some of the questions are, who owns that data? It's about me, should I be, you know, it's about the student, for example. Should I have control over how that information is used? When it's around learning data, maybe the average student, there may not be outside folks that are interested in that information, but when you're talking about student athletes, potentially going pro, that's very valuable data that people may want, that people may want to pay for, so maybe the student should have some say in the use of that data, monetizing that data, who owns that? Is it the student, is it the university, is it the company that we work with to provide that kind of monitoring and the analytics on that? >> Jeff: Right, right. >> Even if we have a contract, right now, if it's through the university, we'd hopefully have made really clear where the ownership is, where the uses lie, what kind of things we can do with it, but as we move into kind of a consumer space, where it's just you clicking the box, students may be asked, oh, use this technology, it's free and we'll be able to handle it, because of course, how much it costs is important in the university space >> Give you free slices at the pizza store.
>> Right, well, once we get into that consumer realm, where it's just either not even having to click the box, or the box is already clicked, can you say okay to that? That's where we come to where students may be giving away data for reasons or for uses that they didn't intend, that they are not getting any compensation for, and in particular cases, when you talk about student athletes, that could be something that would be very meaningful for their career and beyond. >> Yeah, or is it the guy that's come up with the unique and innovative training methodology that they're testing, is it Berkeley's information to see how people are learning so you can incorporate that into your lesson plans and the way that you teach 'em, and there's so many kinds of angles, but it always comes back, as you said, really to the context. Kind of, what's the context for the specific application that you're trying to use that in, and should you or should you not have rights for that context. It's a really interesting space, a lot of interesting challenges, and like I said, job security for you for the foreseeable future. >> Yeah, we're not going to run out of new and exciting applications and things to be thinking about in terms of privacy. It's just nonstop. >> Right, 'cause they're not, these are not technology questions, right? These are policy questions and rules questions. We heard a thing last night with the center, and one of the topics was, we need a lot more rules around these types of things, because the technology's outpacing kind of the governance rules and really the thought processes, the ways that these things can all be used. >> It's a culture question, really. It's more than just what you allow or not, but how we feel about it, and the kind of idea that privacy is dead is only true if we don't care about it anymore. So if we care about it and we pay attention to it, then privacy is not dead. >> Alright, well, Lisa, we'll leave it there. Lisa Ho from UC Berkeley, fantastic. Thank you for stopping by, and good luck at your wearables panel later this afternoon. >> Thank you. >> Alright, I'm Jeff Frick. You're watching theCUBE, thanks for watching. (upbeat music)

Published Date : Jan 28 2017


Andreas S Weigend, PhD | Data Privacy Day 2017


 

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's world headquarters in downtown San Francisco, and we're really excited to get into it with our next guest, Dr. Andreas Weigend. He is now at the Social Data Lab, used to be at Amazon, and is a recently published author. Welcome. >> Good to be here, morning. >> Absolutely, so give us a little about what is Social Data Lab for people who aren't that familiar with it, and what are you doing over at Berkeley? >> Alright, so let's start with what is social data. Social data is the data people create and share, whether they know it or not, and what that means is, Twitter is explicit, but also a geolocation, or maybe even just having photos about you. I was in Russia all day during the election day in the United States with Putin, and I have to say that people now share on Facebook what the KGB wouldn't have gotten out of them under torture. >> So did you ever see the Saturday Night Live sketch where they had a congressional hearing and the guy, the CIA guy, says, Facebook is the most successful project that we've ever launched, people tell us where they are, who they're with, and what they're going to do, share pictures, location, it's a pretty interesting sketch. >> Only to be topped by Black Mirror, some of these episodes are absolutely amazing. >> People can't even watch it. It's one I have not seen, I have to see it, but they're like, that's just too crazy. Too real, too close to home. >> Yeah, so what was the question? >> So let's talk about your book. >> Oh, that was social data. >> Yeah, social data. >> Yeah, and so I call it actually the social data revolution. Because if you think back 10, 20 years ago, we, and we doesn't mean just you and me, it means a billion people, they think about who they are differently from 20 years ago, think Facebook, as you mentioned. How we buy things, we buy things based on social data, we buy things based on what other people say, not on what some marketing department says. And even, you know, the way we think about information, I mean, could you do a day without Google? >> No >> No. >> Could you go an hour without Google? >> An hour, yes, when I sleep. But some people, actually, they Google in their sleep. >> Well, and they have their health tracker turned on while they sleep to tell them if they slept well. >> I actually find this super interesting. How dependent I am, to know in the morning when I wake up, before I can push a smiley face or the okay face or the frowny face, to first see, how did I sleep? And if the cycles were nice up and down, then it must have been a good night. >> So it's interesting, because the concept from all of these kind of biometric feedback loops is if you have the data, you can change your behavior based on the data, but on the other hand, there is so much data, and do we really change our behavior based on the data? >> I think the question is a different one. The question is, alright, we have all this data, but how can we make sure that this data is used for us, not against us. Within a few hundred meters of here, there's a company where employees were asked to wear a Fitbit, or tracking devices more generally. And then one morning one employee came in after, you know, not having had an exactly solid night of sleep, shall we say, and his boss said, I'm sorry, but I just looked at your Fitbit, you know, this is an important meeting, we can't have you at that meeting. Sorry about that. >> True story? >> Yeah >> Now that's interesting.
So I think the Fitbit angle is interesting when that is a requirement to have company-issued health insurance and they see you've been sitting on your couch too much. Now how does that then run into the HIPAA regulations? >> You know, they have dog walkers here. I'm not sure where you live in San Francisco. But in the area many people have dogs. And I know that a couple of my neighbors, when the dog walker comes to take the dog, they also give their phone to the dog walker, so now it looks like they are taking regular walks, and they're waiting for the discount from health insurance. >> Yeah, it's interesting. Works great for the person that does walk or gives their phone to the dog walker. But what about the person that doesn't, what about the person that doesn't stop at stop signs. What happens in a world of business models based on aggregated risk pooling when you can segment the individual? >> That is a very, very, very biased question. It's a question of fairness. So if we know everything about everybody, what would it mean to be fair? As you said, insurance is built on pooling risk, and that means by nature that there are things that we don't know about people. So maybe we should propose a data lobotomy. So people actually have some part of the data chopped off. So now we can pool again. >> Interesting >> Of course not, the answer is that we as a society should come up with ways of coming up with objective functions. How do we weigh the person, you know, taking a walk? And then it's easy: agree on the function, then get the data, and rank, whatever insurance premium, whatever you're talking about here, rank that accordingly. So I really think it's a really important concept, which actually goes back to my time at Amazon, where we came up with fitness functions, as we call them. And it takes a lot of work; I probably spent 50 hours on that, going through groups and groups and groups, figuring out, what do we want the fitness function to be like? You have to have the buy-in of the groups; if they just think, you know, that is some random management thing imposed on us, it's not going to happen. But if they understand that's the output they're managing for, then not bad. >> So I want to follow up on the Amazon piece because we're big fans of Jeff Hamilton and Jeff Bezos, who we go to AWS and it's interesting, excuse me, James Hamilton, when he talks about the resources that AWS can bring to bear around privacy and security and networking, and all this massive infrastructure being built in terms of being able to protect privacy once you're in the quote un-quote public cloud, versus people trying to execute that at the individual company level, and you know RSA is in a couple of weeks, the amount of crazy scary stuff that is coming in for people that want interviews around some of this crazy security stuff. When you look at kind of public cloud versus private cloud and privacy, you know, supported by a big heavy infrastructure like what AWS has, versus a Joe Blow company, you know, trying to implement them themselves, how do you see that challenge? I mean, I don't know how a person can compete with, again, the aggregated resource pool that James Hamilton has to bring to bear on this problem. >> So I think we really need to distinguish two things. Which is security versus privacy. So for security, there's no question in my mind that Joe Blow, with his little PC, has not a chance against our Chinese or Russian friends.
There's no question for me that Amazon or Google have way better security teams than anybody else can afford, because it is really their bread and butter. And if there's a breach on that level, then I think it is terrible for them. Just think about the Sony breach, on a much smaller scale. That's a very different point from the point of privacy, and from the point about companies deliberately giving the data about you for targeting purposes, for instance, and targeting purposes to other companies. So I think for the cloud there, I trust, I trust Google, I trust Amazon, that they are doing hopefully a better job than the Russian hackers. I am more interested in the discussion on the value of data over the privacy discussion. After all, this is the world privacy day, and there the question is, what do people understand as the trade-off they have, what they give in order to get something. People have talked about Google having this impossibly irresistible value proposition, that for all of those little data you get; for instance, I took Google Maps to get here, and of course Google needs to know where I am to tell me to turn left at the intersection. And of course Google has to know where I want to be going. And Google knows that a bunch of other people are going there today, and you probably figure out that something interesting is happening here. >> Right >> And so those are the interesting questions for me. What do we do with data? What is the value of data? >> But, A, I don't really think people understand the amount of data that they're giving over, and, B, I really don't think that they understand, I mean, now maybe they're starting to understand the value because of the value of companies like Google and Facebook that have the data. But do you see a shifting in, A, the awareness, and I think it's even worse with younger kids who just have lived on their mobile phones since the day they were conscious, practically, these days. Or will there be a value to >> Or were they even mobile before they were born? Children now come pre-loaded, because the parents take pictures of their children before they are born >> That's true. And you're right, and the sonogram, et cetera. But then how has mobile changed this whole conversation? Because when I was on Facebook on my PC at home, it was a very different set of information than when it's connected to all the sensors in my mobile phone. When Facebook is on my mobile phone, it really changes: where I am, how fast I'm moving, who I'm in proximity to. It completely changed the privacy game. >> Yes, so geolocation, and the ACLU Northern California chapter here has a very good quote on that. "Geolocation is really an extremely powerful variable." Now what was the question? >> How has this whole privacy thing changed now with the proliferation of the mobile, and the other thing I would say, when you have kids that grew up with mobile and sharing, and the young ones don't use Facebook anymore, Instagram, Snapchat, just kind of the notion of sharing and privacy relative to folks that, you know, wouldn't even give their credit card over the telephone not that long ago, much less type it into a keyboard, um, do they really know the value, do they really understand the value, do they really get the implications, when that's the world they've lived in? Most of them, you know, they're just starting to enter the workforce and haven't really felt the implications of that.
So for the side of the individual, if I have data about the restaurant, and that makes me decide whether to go there or to not go there. That is having an impact on my decision thus the data is valuable. For a company a decision whether to show me this offer or that offer that is how data is valued from the company. So that kind of should be quantified The value of the picture of my dog when I was a child. That is you know so valuable, I'm not talking about this. I'm very sort of rational here in terms of value of data as the impact is has on decisions. >> Do you see companies giving back more of that value to the providers of that data? Instead of you know just simple access to useful applications but obviously the value exceeds the value of the application they're giving you. >> So you use the term giving back and before you talked about kids giving up data. So I don't think that it is quite the right metaphor. So I know that metaphor come from the physical world. That sometimes has been data is in your oil and that indeed is a good metaphor when it comes to it needs to be refined to have value. But there are other elements where data is very different from oil and that is that I don't really give up data when I share and the company doesn't really give something back to me but it is much interesting exchange like a refinery that I put things in and now I get something not necessarily back I typically get something which is very different from what I gave because it has been combined with the data of a billion other people. And that is where the value lies, that my data gets combined with other peoples data in some cases it's impossible to actually take it out it's like a drop of ink, a drop in the ocean and it spreads out and you cannot say, oh I want my ink back. No, it's too late for that. But it's now spread out and that is a metaphor I think I have for data. So people say, you know I want to be in control of my data. I often think they don't have deep enough thought of what they mean by that. I want to change the conversation of people saying You what can I get by giving you the data? How can you help me make better decisions? How can I be empowered by the data which you are grabbing or which you are listening to that I produce. That is a conversation which I want to ask here at the Privacy Day. >> And that's happening with like Google Maps obviously you're exchanging the information, you're walking down the street, you're headed here they're telling you that there's a Starbucks on the corner if you want to pick up a coffee on the way. So that is already kind of happening right and that's why obviously Google has been so successful. Because they're giving you enough and you're giving them more and you get in this kind of virtuous cycle in terms of the information flow but clearly they're getting a lot more value than you are in terms of their you know based on their market capitalization you know, it's a very valuable thing in the aggregation. So it's almost like a one plus one makes three >> Yes. >> On their side. >> Yes, but it's a one trick pony ultimately. All of the money we make is rats. >> Right, right that's true. But in-- >> It's a good one to point out-- >> But then it begs the question too when we no longer ask but are just delivered that information. >> Yes, I have a friend Gam Dias and he runs a company called First Retail, and he makes the point that there will be no search anymore in a couple of years from now. What are you talking about? 
I search every day, but is it. Yes. But You know, you will get the things before you even think about it and with Google now a few years ago when other things, I think he is quite right. >> We're starting to see that, right where the cards come to you with a guess as to-- >> And it's not so complicated If let's see you go to the symphony you know, my phone knows that I'm at the symphony even if I turn it off, it know where I turned it off. And it knows when the symphony ends because there are like a thousand other people, so why not get Ubers, Lyfts closer there and amaze people by wow, your car is there already. You know that is always a joke what we have in Germany. In Germany we have a joke that says, Hey go for vacation in Poland your car is there already. But maybe I shouldn't tell those jokes. >> Let's talk about your book. So you've got a new book that came out >> Yeah >> Just recently released, it's called Data for the People. What's in it what should people expect, what motivated you to write the book? >> Well, I'm actually excited yesterday I got my first free copies not from the publisher and not from Amazon. Because they are going by the embargo by which is out next week. But Barnes and Noble-- >> They broke the embargo-- Barnes and Noble. Breaking news >> But three years of work and basically it is about trying to get people to embrace the data they create and to be empowered by the data they create. Lots of stories from companies I've worked with Lots of stories also from China, I have a house in China I spend a month or two months there every year for the last 15 years and the Chinese ecosystem is quite different from the US ecosystem and you of course know that the EU regulations are quite different from the US regulations. So, I wrote on what I think is interesting and I'm looking forward to actually rereading it because they told me I should reread it before I talk about it. >> Because when did you submit it? You probably submitted it-- >> Half a year >> Half a year ago, so yeah. Yeah. So it's available at Barnes and Noble and now Amazon >> It is available. I mean if you order it now, you'll get it by Monday. >> Alright, well Dr. Andreas Weigin thanks for taking a few minutes, we could go forever and ever but I think we've got to let you go back to the rest of the sessions. >> Thank you for having me. >> Alright, pleasure Jeff Frick, you're watching theCUBE see you next time.

Published Date : Jan 28 2017



Jules Polonetsky, Future of Privacy Forum | Data Privacy Day 2017


 

>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Francisco at Twitter's world headquarters for Data Privacy Day, a full-day event of sessions and breakout sessions really talking about privacy. Although "privacy is dead, get over it" goes back to 1999, it's not really true, and certainly a lot of people here beg to differ. We're excited to have our next guest, Jules Polonetsky, CEO of the Future of Privacy Forum. Welcome. >> Thank you, great to be here. Exciting times for data, exciting times for privacy. >> Yeah, no shortage of opportunity, that's for sure. The job security in the privacy space is pretty high, I'm gathering, after a few of these interviews. >> There's a researcher coming up with some new way we can use data that is both exciting, curing diseases, studying genes, but also sometimes Orwellian. Microphones are in my home, self-driving cars, and so getting that right is hard. We don't have clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or not getting into our stuff because we don't trust them >> Right. [Jules] - So, challenging times. [Jeff] - So, before we jump into it, the Future of Privacy Forum, kind of a little bit about the organization, kind of your mission... [Jules] - We're eight years old at the Future of Privacy Forum. We're a think tank in Washington, D.C. Many of our members are the chief privacy officers of companies around the world, so about 130 companies, ranging across many of the big tech companies. And as new sectors start becoming tech and data, they join us. So, the auto industry's dealing with self-driving cars, connected cars, all those issues. Wearables, student data; so about 130 of those companies. But then the other half of our group are advocates and academics who are a little bit skeptical or worried. They want to engage, but they are worried about an Orwellian future. So we bring those folks together and we say, 'Listen, how can we have data that will make cars safer? How can we have wearables that'll help improve fitness? But also have reasonable, responsible rules in place so that we don't end up with discrimination, or data breaches, and all the problems that can come along?' [Jeff] - Right, 'cause it's really two sides of the same coin, and it's always two sides of the same coin. And typically with new technology, we kind of race ahead on the positive, 'cause everybody's really excited, and lag on kind of what the negative impacts are, and on the creation of rules and regulations, because with new technology it's very hard to keep up. [Jules] - You know the stakes are high. Think about adtech, right? We've got tons of adtech. It's fueling free content, but we've got problems of adware, and spyware, and fake news, and people being nervous about cookies and tracking. And every year, it seems to get more stressful and more complicated. We can't have that when it comes to microphones in my home. I don't want to be nervous that if I go into the bedroom, suddenly that's shared across the adtech ecosystem. Right? I don't know that we want how much we sweat, or when it's somebody's time of the month, or other data like that, being out there and available to data brokers. But, we did a study recently of some of the wearables, the more sensitive ones. Sleep trackers, apps that people use to track their periods; many of them didn't even have a privacy policy to say 'I don't do this, or I don't do that, with your data.' So, stakes are high.
This isn't just about, you know, are ads tracking me? And do I find that intrusive? This is about, if I'm driving my car, and it's helping me navigate better and it's giving me directions, and it's making sure I don't shift out of my lane, or it's self-parking, that that data doesn't automatically go to all sorts of places where it might be used to deny me benefits, or discriminate, or raise my insurance rates. [Jeff] - Right, right. Well, there's so many angles on this. One is, you know, since I got an Alexa Dot for Christmas, for the family, to try it out, and you know, it's interesting to think that she's listening all the time. [Jules] - So she's not-- >> And you push the little >> Let's talk about this >> button, you know. >> Or is she not? >> This is a great topic to [Jules] - talk about, because a sheriff recently wanted to investigate a crime and realized that they had an Amazon Echo in the home. And said, 'Well, maybe Amazon will have data about what happened >> Right >> Maybe there'll be clues, people shouting,' you know. And Amazon's fighting it, because they don't want to hand it over. But what Amazon did, and what Google Home did, and the Xbox did: they don't want to have that data. And so they've designed these things, I think, with actually a lot of care. So... the Echo is listening for its name. It's listening for Alexa... >> Right. >> And it keeps deleting. It listens, right, it hears background noise, and if it didn't hear Alexa, drops it, drops it, drops it. Nothing is sent out of your home. When you say 'Alexa, what's the weather?' the blue light glows, it opens up the connection to Amazon, and now it's just like you're typing in a search or going directly >> Right, right. [Jules] - And so that's done quite carefully. Google Home works like that, Siri works like that [a rough sketch of this wake-word pattern follows the interview]. So I think the big tech companies, despite a lot of pain and suffering over the years of being criticized, and with the realization that government goes to them for data: they don't want that. They don't want to be fighting the government, and people being nervous that the IRS is going to try to find out information about what you're doing, which bedroom you're in, and what time you came home. >> Although the Fitbit has all that information. >> Exactly >> Even though Alexa doesn't. [Jules] - So the wearables are another exciting, interesting challenge. We had a project that was funded both by the Robert Johnson Foundation, which wants wearables to be used for health and so forth, but also by a lot of major tech companies. Because everybody was aware that we needed some sort of rules in place. So if Fitbit, or Jawbone, or one of the other wearables can detect that maybe I'm coming down with Parkinson's, or I'm about to fall, or other data, what's their responsibility to do something with that? On one hand, that would be a bit frightening. Right, you got a phone call or an email saying 'Hey, this is your friendly friends at your wearable and we think >> showing up at your front door >> You should seek medical, you know, help.' You would be like, whoa, wait a second, right? On the other hand, what do you do with the fact that maybe we can help you? Take student data, alright. Edtech is very exciting, there's such opportunities for personalized learning; colleges are getting in on the act. They're trying to do big data analytics to understand how to make sure you graduate.
Well, what happens when a guidance counselor sits down and says, 'Look, based on the data we have, your grades, your family situation, whether you've been to the gym, your cafeteria usage, data we took off your social media profile, you're really never going to make it in physics. I mean, the data says, people with your particular attributes... never, never... rarely succeed in four years at graduating with a degree. You need to change your scholarship. You need to change your career path. Or, you can do what you want, but we're not going to give you that scholarship. Or simply, we advise you.' Now, what did we just tell Einstein? Maybe not to take physics, right? But on the other hand, don't I have some responsibility, if I'm a guidance counselor who would be looking at your records today, and sort of shuffling some papers and saying, 'Well, maybe you want to consider something else?' So, either we talk about this as privacy, but increasingly, many of my members, again, who are chief privacy officers of these companies, are facing what are really ethical issues. And there may be risks, there may be benefits, and they need to help decide, or help their companies decide, when does the benefit outweigh the risk? Consider self-driving cars, right? When does the self-driving car say 'I'm going to put this car in the ditch because I don't want to run somebody over?' But now it knows that your kids are in the backseat; what sort of calculations do we want this machine making? Do we know the answers ourselves? If the microphone in my home hears child abuse, if 'Hello Barbie' hears a child screaming, or 'Hey, I swallowed poison,' or 'My dad touched me inappropriately,' what should it do? Do we want dolls ratting out parents? And the police showing up saying, 'Barbie says your child's being abused.' I mean, my gosh, I can see times when my kids thought I was a big Grinch, and if the doll was reporting 'Hey, dad is being mean to me,' you know, who knows. So, these are challenges that we're going to have to figure out, collectively, with stakeholders, advocates, civil libertarians, and companies. And if we can chart a path forward that lets us use these new technologies in ways that advance society, I think we'll succeed. If we don't think about it, we'll wake up and we'll learn that we've really constrained ourselves and narrowed our lives in ways that we may not be very happy with. [Jeff] - Fascinating topic. And like on the child abuse thing, you know, there are very strict rules for people that are involved in occupations dealing with children. Whether it's a doctor, or whether it's a teacher, or even a school administrator, that if they have some evidence of, say, child abuse, they're obligated >> they're obligated. [Jeff] - Not only are they obligated morally, but they're obligated professionally, and legally, right, to report that in. I mean, do you see those laws just getting translated onto the machines? Clearly, God, you could even argue that the machine probably has got better data and evidence, based on time and frequency, than the teacher who happens to see, maybe, a bruise or a kid acting a little bit different on the school yard. [Jules] - You can see a number of areas where law is going to have to rethink how it fits. Today, I get into an accident, we want to know whose fault it is. What happens when my self-driving car gets into an accident? Right? I didn't do it, the car did it. So, do the manufacturers take responsibility?
If I have automated systems in my home, robots and so forth, again, am I responsible for what goes wrong? Or do these things have, or their companies have, some sort of responsibility? So, thinking these things through is where I think we are first. I don't think we're ready for legal changes. I think what we're ready for is an attitude change. And I think that's happened. When I was the chief privacy officer at AOL, many years ago, we were so proud of our cooperation with the government. If somebody was kidnapped, we were going to help. If somebody was involved in a terrorism thing, we were going to help. And companies, I think, still recognize their responsibility to cooperate on, you know, criminal investigations. But they also recognize that it is their responsibility to push back when government says, 'Give me data about that person.' 'Well, do you have a warrant? Do you have a basis? Can we tell them so they can object? Right? Is it encrypted? Well, sorry, we can't risk all of our users by cracking encryption for you because you're following up on one particular crime.' So, there's been a big sea change in understanding that if you're a company, there's data you don't want to have to hand over. Data about immigrants today: lots of companies, in the Valley and around the country, are thinking, 'Wait a second, could I be forced to hand over some data that could lead to someone being deported? Or tortured? Or who knows what?' Given that these things seem to be back on the table. And, you know, again, years ago, you were a good actor if you participated in law enforcement requests, and now people participate, but they also recognize that they have a strong obligation to either not have the data, like Amazon, which will not have the data that this sheriff wants. Now, their smart meter, and how much water they're using, and all kinds of other information, frankly, about their activity at home, since many other things about our homes are now smarter, may indeed be available. How much water did you use at this particular time? Maybe you were washing blood stains away. That sort of information is >> Wild [Jules] - going to be out there. So, the machines will be providing clues that in some cases are going to incriminate us. And companies that don't want to be in the middle need to think about designing for privacy, so as to avoid creating a world where, you know, whole data is available to be used against us. [Jeff] - Right, and then there's the whole factor of the devices being in place, and not necessarily whether the company is using them or not, but, you know, bad actors taking advantage of cameras, microphones, all over, and hacking into these devices to do things. And, it's one thing to take a look at me while I'm on my PC; it's another thing to take control of my car. Right? And this is where, you know, there's some really interesting challenges ahead. As IoT continues to grow, everything becomes connected. The security people always like to say, you know, the attack surface area, it grows exponentially. [Jules] - Yeah. Well, cars are going to be an exciting opportunity. We have released, today, a guide that the National Auto Dealers Association is providing to auto dealers around the country. Because, when you buy a car today, and you sell it or you lend it, there's information about you in that vehicle.
Your location history, maybe your contacts, your music history. And we never would give our phone away without clearing it, or you wouldn't give your computer away, but you don't think about your car as a computer. And so, this has all kinds of advice for people. Listen, your car is a computer. There are things you want to do to take advantage of >> Right. [Jules] - New services, safety. But there are things you want to also do to manage your privacy: delete. Make sure you're not sharing your information in a way you don't want to. [Jeff] - Jules, we could go on all day, but I think I've got to let you go to get back to the sessions. So, thanks for taking a few minutes out of your busy day. [Jules] - Really good to be with you. [Jeff] - Absolutely. Jeff Frick, you're watching theCUBE. See you next time. (closing music)
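Earlier in the conversation, Jules describes how the Echo treats audio: hold only a short rolling buffer on the device, discard it continuously, and send nothing until the wake word is heard. A minimal sketch of that pattern, assuming stand-in names throughout; detect_wake_word and stream_to_cloud are placeholders, not Amazon's actual implementation.

import collections

BUFFER_SECONDS = 2        # only ever hold a couple of seconds of audio
CHUNKS_PER_SECOND = 10

def detect_wake_word(audio: bytes) -> bool:
    """Placeholder for an on-device keyword-spotting model."""
    return False

def stream_to_cloud(audio_stream) -> None:
    """Placeholder for the connection that opens only after the wake word."""
    pass

def listen_loop(microphone):
    # Rolling buffer: old audio falls off the end and is never persisted.
    buffer = collections.deque(maxlen=BUFFER_SECONDS * CHUNKS_PER_SECOND)
    for chunk in microphone:              # endless stream of short audio chunks
        buffer.append(chunk)
        if detect_wake_word(b"".join(buffer)):
            # Wake word heard: the light glows, a connection opens,
            # and only what follows leaves the device.
            stream_to_cloud(microphone)
            buffer.clear()

The privacy property lives in the maxlen: everything the microphone hears before the trigger is structurally incapable of leaving the device, which is the design choice Jules credits the big platforms with getting right.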

Published Date : Jan 28 2017



Michelle Dennedy, Cisco | Data Privacy Day 2017


 

>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day at Twitter's World Headquarters in downtown San Francisco. Full-day event, a lot of seminars and sessions talking about the issue of privacy. Even though Scott McNealy in 1999 said, "Privacy's dead, get over it," everyone here would beg to differ; and it's a really important topic. We're excited to have Michelle Dennedy. She's the Chief Privacy Officer from Cisco. Welcome, Michelle. >> Indeed, thank you. And when Scott said that, I was his Chief Privacy Officer. >> Oh, you were? >> I'm well acquainted with my young friend Scott's feelings on the subject. >> It's pretty interesting, 'cause that was eight years before the iPhone, so a completely different world. And actually, with one of the prior guests we were talking about privacy being an issue in the Harvard Business Review 125 years ago. So this is not new. >> Absolutely. >> So how have things changed? I mean, that's a great perspective, that you were there. What was he kind of thinking about, and really, what are the privacy challenges now compared to 1999? >> So different. Such a different world. I mean, fascinating that when that statement was made, the discussion was a press conference where we were introducing connectivity. It was an offshoot of Java, and it basically allowed you to send from your personal computer a wireless message to your printer so that a document could come out (gasp). >> That's what it was? >> Yeah. >> Wireless printing? >> Wireless printing. And really it was Jini technology, so anything wireless could start talking to each other in an internet of things world. >> Right. >> So, good news, bad news. The world has exploded from there, obviously; but the base premise of, can I be mobile, can I live in a world of connectivity, and still have control over my story, who I am, where I am, what I'm doing? And it was really a reframing moment: when you say privacy is dead, if what you mean by that is secrecy and hiding away and not being connected to the world around you, I may agree with you. However, privacy as a functional definition of how we define ourselves, how we live in a culture, what we can expect in terms of morality, ethics, respect, and security: alive and well, baby. Alive and well. >> (laughs) No shortage of opportunity to keep you busy. We talk to a lot of people who go to a lot of tech conferences. I have to say I don't know that we've ever talked to a Chief Privacy Officer. >> You're missing out. >> I know, so now you get to define the role, I love it. So what are your priorities as Chief Privacy Officer? What are you keeping an eye on day to day, as well as what are your more strategic objectives? >> It's a great question. So, the rise of the Chief Privacy Officer: actually, Scott was a big help in that, and gave me exactly the right amount of rope to hang myself with. The way I look at it is, probably the simplest analogy is, should you have a Chief Financial Officer? >> Yeah. >> I would guess yeah, right? That didn't exist about 100 years ago. We just kind of loped along, and whoever had the biggest bag of money at the end was deemed to be successful. Whereas somebody else who had no money left at the end but had bought another store, you would have no way of measuring that. So the Chief Privacy Officer is that person for your digital currency. I look at the pros and the cons, the profit and the loss, of data and the data footprint for our company and for all the people to whom we sell.
We think about, what are those control mechanisms for data? So think of me as your data financial officer. >> Right, right. But the data in and of itself is just stagnant, right? It's really the data in the context of all these other applications. How it's used, where it's used, when it's used, what it's combined with: that really starts to tip into areas of value as well as potential problems. >> I feel like we scripted this before, but we didn't. >> So if I took a rectangle out of my wallet, and it had a number on it, and it was green, what would you say that thing probably is? >> Probably Andrew Jackson on the front. >> Yeah, probably Andrew Jackson. What is that? >> A 20 dollar bill. >> Why is that a 20 dollar bill? >> Because we agree that you're going to give it to me and it has that much value, and thankfully the guy at Starbucks will give me 20 bucks worth of coffee for it. >> (laughs) Exactly. Well, which could be a cup the way we're going. >> Which could be a cup. >> But that's exactly right. So is that 20 dollar bill stagnant? Yes. That 20 dollar bill just sitting on the table between us is nothing. I could burn it up, I could put it in my pocket and lose it and never see it again. I could flush it down the toilet. That's how we used to treat our data. If you recognize instead the story that we share about that piece of currency, we happen to be in a place where it's really easy to alienate that currency. I could go downstairs here and spend it. If I was in Beijing I probably would have to go and convert it into a different currency, and we'd tell a story about that conversion because our standards interface is different. Data is exactly the same way. The story that we share together today is a valuable story because we're communicating out, we're here for a purpose. >> Right. >> We're making friends. I'm liking you because you're asking me all these great questions that I would have fed you had I been able to feed you questions. >> Jeff: (laughs) >> But it's only that context, it's only that communicability, that brings it value. We now assume as a populace that paper currency is valuable. It's just paper. It's only as good as the story that enlivens it. So now we're looking at smaller and smaller microdata transactions: how am I tweeting out information to people who follow me? >> Jeff: Right, right. >> How do I share that with your following public, and does that give me a greater opportunity to educate people about security and privacy? Does that allow my company to sell more of my goods and services because we're building ethics and privacy into the fabric of our networks? I would say that's as valuable or more valuable than that Andrew Jackson. >> So it's interesting, 'cause you talk about building privacy into the products. We often hear about building security into the products, right? Because the old way of security, of building a bigger wall, doesn't work any more, and you really have to bake it in at all steps of the application: development, the data layer, the database, et cetera, et cetera. When you look at privacy versus security, and especially 'cause Cisco's sitting on, I mean, you guys are sitting on the pipes, everything is running through your machines. >> That's right. >> How do you separate the two, how do you prioritize, and how do you make sure the privacy discussion gets the right amount of relevance within the context of the security conversation?
>> It's a glib answer for something that's much more complicated, but security is really, in many instances, the what. I can really secure almost any batch of data. It can be complete gobbledygook, zeroes and ones. It could be something really critical. It could be my medical records. The privacy, and the data about what that context is, that's the why. I don't see them as one or the other at all. I see security not as a technology but as a series of verbs, things that you actually do: people, process, technologies. That enactment should be addressed to a why. So it's kind of Peter Drucker's 'you manage what you measure.' That was like incendiary advice when it first came out. Well, I want to say that you secure what you treasure. So if you treasure a digital interaction with your employees, your customers, and your community, you should probably secure that. >> Right. But it seems like there's a little bit of a disconnect about maybe what should be treasured and what the value is with folks that have grown up with this. Let's pick on the young kids: not really thinking through, or having the time, or knowing the impact of a negative event, in terms of just clicking and accepting the EULA and using that application on their phone. They just look at it in a different way. Is that valid? How do they change that behavior? How do you look at this new generation, and there's this sea of data which is far larger than it used to be, coming off all these devices, internet of things, obviously. People are things too. The mobile devices with all that geolocation data, and the sensor data, and then, oh by the way, it's all going to be in our cars and everything else shortly. How's that landscape changing and challenging you in new ways, and what are you doing about it? >> The speed and dynamics are astronomical. How do you count the stars, right? >> Jeff: (laughs) >> And should you? Isn't that kind of a waste of time? >> Jeff: Right, right. >> It used to be that knowledge, when I was a kid, was knowing what was in A to Z of the Encyclopedia Britannica. Now facts are cheap. Facts used to be expensive. You had to take time and commit to them, and physically find them, and be smart enough to read, and on, and on, and on. The dumbest kid today is smarter than I was with my Encyclopedia Britannica, because we have search engines. Now their commodity is, how do I critically think? How do I make my brand and make my way? How do I ride and surf on a wave of untold quantities of information to create a quality brand for myself? So the young people are actually in a much better position than, I'll still count us as young. >> Jeff: Yeah, uh huh. >> But maybe less young. >> Less young, less young than we were yesterday. >> They are digital natives, and I am hugely optimistic that the kids coming up are really starting to understand the power of brand: personal brand, family brand, cultural brand. And they're feeling very activist about the whole thing. >> Yeah, which is interesting, 'cause that was never a factor when there was no personal brand, right? You were part of >> No way. >> whatever entity that you were in. >> Well, you were in a clique. >> Right. >> Right? You identified as: when I was home, I was the third of four kids. I was a Roman Catholic girl in the Midwest. I was a total dork with a bowl haircut. Now kids can curate who and what and how they are over the network. Young professionals can connect with people with experience. Or they can decide, I get this all the time on Twitter actually.
How did you become a Chief Privacy Officer? I'm really interested in taking a pivot in my career. And I love talking to those people, 'cause they always educate me, and I hope that I give them a little bit of value too. >> Right, right. Michelle, we could go on and on and on. But, unfortunately, I think you've got to go cover a session. So we're going to let you go. >> Thank you. >> Michelle Dennedy, thanks for taking a few minutes of your time. >> Thank you, and don't miss another Data Privacy Day. >> I will not. We'll be back next year as well. I'm Jeff Frick. You're watching theCUBE. See you next time.

Published Date : Jan 28 2017



Eva Casey Velasquez | Data Privacy Day 2017


 

(soft click) >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco, at Twitter's World Headquarters. It's a beautiful building. Find a reason to get up here and check it out. But they have Data Privacy Day here today. It's an all-day seminar session, a series of conversations about data privacy. And even though Scott McNealy said, "Data privacy is dead, get over it," everyone here would beg to differ. So we're excited to have our next guest, Eva Velasquez. She's the President and CEO of the ITRC. Welcome. >> Thank you, thank you for having me and for covering this important topic. >> Absolutely, so what is the ITRC? >> We are the Identity Theft Resource Center. And the name says exactly what it is. We're a resource for the public when they have identity theft or fraud, privacy, or data breach issues, and need help. >> So this begs an interesting question. How do people usually find out that their identity has been compromised? And what is usually the first step they do take? And maybe, what's the first step they should take? >> Well, it's interesting, because there isn't one universal pathway by which people discover it. It's usually a roadblock. So, they're trying to move forward in their lives in some manner. Maybe trying to rent an apartment, get a new job, buy a car or a house. And during that process they find out that there's something amiss, either in a background check or a credit report. And at that point it creates a sense of urgency, because they must resolve this issue and prove to whoever they're trying to deal with that it actually wasn't me, somebody used my identity. And that's how they find out, generally speaking. >> So, it wasn't that they checked their credit score; it's something where they had no idea this was happening. What usually triggers it? >> Right, right, or a background check. You know, appearing in a database. It's just, when we think about how pervasive our identity is out there in the world now, and how it's being used by a wide swath of different companies to do these kinds of background checks and see who we are, that's where that damage comes in. >> We talk about security and security breaches at a lot of shows, you know. It's many hundreds of days, usually, before companies know that they've been breached. Or for a particular breach; I think now we just assume they're breached all the time, and hopefully they minimize the damage. But in identity theft, what do you find is kind of the average duration between the time something was compromised and when somebody actually figures it out? Is there kind of an industry mean? >> It's really wildly inconsistent from what we see. Because sometimes, if there is an issue, let's say a wallet is stolen and they're on high alert, they can often discover it within a week or 10 days, because they are looking for those things. But sometimes, if it's a data breach that they were unaware of, or they have no idea how their information was compromised, and especially in the case of child identity theft, it can go on for years and years before they find out that something's amiss. >> Child identity theft? >> Mhmm. >> And what's going on with that? I've never heard of child identity theft. They usually don't have credit cards. What's kind of the story on child identity theft? What is it, their PayPal account or their Snapchat account? (laughs) >> Well, you're right, children don't have a credit file or a credit history. But they do have a social security number.
And that is being issued within the first year of their life, because their parents need to use it on their tax returns and other government documents. Well, because the Social Security Administration and the credit reporting agencies don't interface, if a thief gets ahold of that social security number, that first record that's created is what the credit bureaus will use. So they don't even need a legitimate name or date of birth. Obviously, the legitimate date of birth isn't going to go through those filters, because it is for someone who's under 18. So, the kid goes all through life, maybe all through school. And as they get out and start doing things like applying for student loans, which is one of the really common ways we see it in our call center, then they come to find out: I have this whole credit history. And guess what? It's a terrible credit history. And they have to clean that up before they can even begin to launch into adulthood. >> (chuckles) Okay, so, when people find out, what should they do? What's the right thing to do? I just got rejected on a credit application, some weird thing got flagged. What should people do first? >> There's a couple of things, and the first one is don't panic. Because we do have resources out there to help folks. One of them is the Identity Theft Resource Center. All of our services are completely free to the public. We're a charity, a non-profit, funded by grants, donations, and sponsorships. They should also look into what they might have in their back pocket already. There are a lot of insurance policy riders for things like your homeowners insurance, sometimes even your renters insurance. So, you might already have a benefit that you pay for in another way. There are a lot of plans within employee benefit packages. So, if you work for a company that has a reasonably robust package, you might have that help there as well. And then the other thing is, if you really feel like you're overwhelmed and you don't have the time, you can always look into hiring a service provider, and that's a legitimate thing to do as long as you know who you're doing business with, and realize you're going to be paying for that convenience. But there are plenty of free resources out there. And then the last one is the Federal Trade Commission. They have some wonderful remediation plans online that you can just plug in right there. >> Which is a great segue, 'cause you're doing a panel later today, you mentioned, with the FTC, around data privacy and identity theft. You know, what role does the federal government have? And what is cleaning up my identity theft? What actually happens? >> Well, the federal government is one of the many stakeholders in this process. And we really believe that everybody has to be involved. So, that includes our government, that includes industry, and the individual consumers or victims themselves. So, on the government end, things like frameworks for how we need to treat data, having resources available to folks, building an understanding in a culture in our country that really gets the convenience versus security conundrum. Of course, industry needs to protect and safeguard that data, and be good stewards of it when people give it to them. And then individual consumers really need to pay attention and understand what choice they're making. It's their choice to make, but it should be an educated one. >> Right, right. And it's just, the whole social security card thing is just, I find, fascinating.
It's always referenced as kind of the anchor data point of your identity. At the same time, you know, it's a paper card that comes after you're born. And people ask for the paper card. I mean, I've got a chip on my ATM card. It just seems so archaic, the number of times it's asked for in kind of common, everyday customer service engagements with your bank or whatever. It just seems almost humorous that this is supposed to be such an anchor point of security. Why? You know, when is the Social Security Administration, or that record, either going to come up to speed, or do you see there being a different identity mechanism? With biometrics or a credit card? Or your fingerprint or your retina scan? I mean, I have Clear, you know, it looks at my... Is that ever going to change, or is it just always, it's such a legacy that's so embedded in who we are, that it's just not going to change? It just seems so bizarre to me. >> Well, it's a classic case of: we invented a tool for one purpose, and then industry decided to repurpose it. So the social security number was simply to entitle you to social security benefits. That was the only thing it was created for. Then, as we started building the credit and credit file industry, we needed an initial authenticator. And hey, look at this great thing. This is a number, it's issued to one individual. We know that there are some litmus tests that they have to pass in order to get one. Here's a great tool, let's use it. But nobody started talking about that. And now that we're looking at things like other types of government benefits being offered, and now, you know, credit being issued based on this number, it really kind of got away from everybody. And think about it, it used to be your military ID. And you would have your social security number painted on your rucksack, there for the world to see. It's still on our Medicare cards. It used to be on our checks. A lot of that has changed. >> That's right, it was on our checks. >> It was, it was. So, we have started shifting into this, at least the thought process of, 'If we're going to use something as an initial authenticator, we probably should not be displaying it, ready for anyone to see.' And the big conversation, you know, you were talking about biometrics and other ways to authenticate people: that's one of the big conversations we're having right now. What is the solution? Is it a repurposing of the social security number? Is it more sharing within government agencies and industry of that data, so we can authenticate people through that? Is it a combination of things? And that's what we're trying to wrestle with and work out. But it is moving forward, albeit very, very slowly. >> Yeah, the two-factor authentication seems to have really taken off recently. >> Thankfully. >> You get the text, and here's your secret code, and you know, at least it's another step that's relatively simple to execute. >> Something you are, something you have, something you know. [a minimal sketch of the 'something you have' code follows this interview] >> There you go. >> That's kind of the standard we're really trying to push. >> So, on the identity theft bad guys: how has their behavior changed since you've been in this business? Has it changed dramatically? Are the patterns of theft pretty similar? You know, how's that world evolving? 'Cause generally these things are a little bit of an arms race, you know. And oftentimes the bad guys are one step ahead of the good guys, 'cause the good guys are reacting to the last thing that the bad guys did. How do you see that world kind of changing?
>> Well, I've been in the fraud space for over 20 years, which I hate to admit, but it's the truth. >> Jeff: Ooh, well, tell me about it. >> And we do look at it sort of like a treadmill, and I think that's just the nature of the beast. When you think about it, the thieves are, you know, doing penetration testing, and we, as the good guys trying to prevent it, have to be right a hundred percent of the time. The thieves only have to be right once, and they know it. They also spend an extraordinary amount of time being creative about how they're going to monetize our information. The last big wave of new types of identity theft was tax identity theft. And the federal government never really thought that that would be a thing. So when we went to online filing, there really weren't any fraud analytics. There wasn't any verification of it. So, that first filing was the one that was processed. Well, fast forward to now: we've started to address that, but it's still a huge problem and the number one type of identity theft. But if you had asked me ten years ago if that would be something, I don't think I would have said yes. It seemed, you know, how do you create money out of something like that? And so, to me, what is moving forward is that I think we just have to be really vigilant, because when we leave that door unlocked, the thieves are going to push it open and burst through. And we just have to make sure we notice when it's cracked, so that we can push it closed. Because that's really, I think, the only way we're going to be able to address this: just to be able to detect and react much more quickly than we do now. >> Right, right, 'cause they're going to come through, right? >> Exactly, they are. >> There's no wall thick enough, right? Right, and like you said, they only have to be right once. >> Nothing's impenetrable. >> Right, crazy. Alright, Eva, we're going to leave it there and let you go off to your session. Have fun at your session, and thanks for spending a few minutes with us. >> Thank you. >> Alright, she's Eva Velasquez, President and CEO of the ITRC. I'm Jeff Frick, you're watching theCUBE. Catch you next time. (upbeat electronic music)
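The 'something you have' factor Eva and Jeff touch on above, the one-time code from your phone, is standardized for authenticator apps as TOTP (RFC 6238). A minimal sketch of how both ends independently compute the same six-digit code; the base32 secret shown is a textbook example value, not a real credential.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """One-time code from a shared secret (RFC 6238 TOTP over HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period            # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and phone share the secret once; after that, each side computes the
# same code independently, so nothing secret travels until the user types it.
print(totp("JBSWY3DPEHPK3PXP"))

Tolerating a step or two of clock drift between the two sides is the usual refinement; the core is just a keyed hash of the current half-minute.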

Published Date : Jan 28 2017



Robert Waitman, Cisco | Cisco Live EU Barcelona 2020


 

>> Announcer: Live from Barcelona, Spain, it's theCUBE, covering Cisco Live 2020, brought to you by Cisco and its ecosystem partners. >> Okay, welcome back everyone. It's theCUBE's live coverage here in Barcelona, Spain, for Cisco Live 2020. I'm John Furrier, host of theCUBE, with my co-host Stu Miniman. We've been talking about the value of data, and privacy, for many, many years, and today is Data Privacy Day, which is super important. We've been here every year for the past couple of years, and the team at Cisco has some answers for us. Our next guest is Robert Waitman, Director, Data Privacy and Economics, Security and Trust Organization at Cisco. Robert, CUBE alumni, welcome back, good to see you. Thanks for coming back on. >> Thank you. Great to see you again. >> John: So, you know, we've had great chats in the past. You know my favorite topic, the value of data, the role of data; we all believe in data-driven organizations. You guys just put out your annual report, which is 'From Privacy to Profit.' We ask the question here on theCUBE: what's the value of data? That's the Holy Grail. But you guys have actually got some progress on this, narrowly defining it around privacy: what's it worth? If you invest in privacy, there is an ROI. We've seen similar reports on diversity; investing in these areas that look like impact, mission-based items actually has economic value. You guys have new data on a return on investment for privacy; share the results with us. >> Happy to do so. And we've been on this journey for three years, trying to understand where the value is coming from with privacy, from putting protections in place. We first saw that it was showing up in terms of better sales motions: we're having fewer sales delays because organizations put privacy in place. Last year, we started looking at some of the security benefits: those organizations that invested in privacy were seeing fewer and less costly breaches, for example, and fewer records exfiltrated. So the idea of getting your data house in order is translating into business value. This year, we've not only validated those results from the past two years, but we've now taken it to the next step to have an actual return on investment on those privacy investments. So our survey this year, which we put out yesterday, was based on 2800 companies, 2500 of which knew about privacy at their organizations. And we asked them about their investments, and we asked them about the benefits that they thought they were getting, some in tangible ways and also some in intangible ways, like competitive advantage or operational efficiency, things that are hard to quantify. Overall results on that: the average organization spends $100 on privacy and is getting $270 back. It is a great investment. I don't know how many investments they have with that kind of return. But in this environment, and this is what we're seeing, the customers want these kinds of protections; it's a great investment.
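To make that arithmetic concrete: reading 'getting $270 back' as gross benefit per $100 spent (our reading of the figure; the $310 number for the more mature programs comes up later in the conversation), the ROI works out as below.

def privacy_roi(spend: float, benefit: float) -> float:
    """Net gain as a fraction of spend: (benefit - spend) / spend."""
    return (benefit - spend) / spend

print(f"{privacy_roi(100, 270):.0%}")  # 170% -- the average organization
print(f"{privacy_roi(100, 310):.0%}")  # 210% -- more mature, accountable programs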
Is there areas that pop out in the survey, where the ROI was a must have, in terms of privacy or sort of categorical? >> Well, the this idea of building your loyalty and trust of your customers, is something that we had explored. If like, there's a companion piece that we just put out a few weeks ago, exactly on this issue of the consumer interest in having that available to them. And I would say, wouldn't take it for granted. Until recently, most people have said, privacy is dead, and I don't know who has access to my data, and I don't know what's controlling it. But the combination of GDPR, which swung the pendulum a little bit back so that users again had the ability to know what data companies had about them, and in some cases to modify or delete it, started that tread, the CCPA in California, carries out a little bit further. And what we saw in this companion survey around individuals was fascinating because we saw people that are more active. They're saying not only do I care about privacy, which most people will say that I will spend time and money, which many people may say that, but the real test was here that they've made a change, that they've changed a provider or someone that they work with over their data practices or data policies. And what that saying to us is, there's an active community, we're calling them privacy actives. It's a third of the population today, who are standing up to say, I now know that I have some control over how my data is used. Therefore, think about the companies and how they relate to that their customers are saying to them, I'm not going to work with you. And I'm not going to do business with you. And I want to only work with companies who I know how the data is being used. It's now become an important priority. It's part of the brand. It's part of the overall customer experience. So customers aren't going to-- >> John: I think you are understanding the numbers too. I think you I believe what you just said, is only going to be amplified because with social networking and what we've seen with virality and even just with fake news and disinformation. There's also information that could go viral, like, hey, this company, the buyer swing, the influence that these groups could have could be a force multiplier on impact negatively and positively. >> Robert: Right. And I think that actually, we would bear that out as well. So even though I described the third of people who already have made that change, there's another 30 plus percent, who said the first two, they just haven't made that change yet, maybe they aren't comfortable with doing it yet or they haven't had the opportunity. So again, this is something that all companies A need to pay attention to, and B it's going to be fundamentally part of the overall experience. If you don't have the privacy right, you're like not in business. And again, I think that's a positive trend, getting to the creating the conditions in the world that I think we all want to live in where, where when I share my information with somebody who uses it well, I'm happy with that. If I share it with somebody who misuses it, I don't want anything to do with them. And that's, I think, what we all think how it should work. >> Yeah, that's really fascinating and I love what you're saying about how the consumers are getting involved. I was a little bit concerned that things like GDPR and CCPA were going to be like the old, software accept it to use it. Nobody reads it, nobody pays any attention to it, I just opt in to anything. 
So, what advice do you have for users? How do you make sure that you're working with companies that are going to be using your data correctly, and get involved if they're not? >> Robert: Well, the first thing they should do is be aware of the regulations and the rights that they have. I mean, the awareness even of GDPR in Europe runs under two thirds, right? So it's not something to take for granted, that everybody knows what they can do. So the first thing is, know what you can do; ask for the data if you're not sure. And ask the questions about how your data is being used. If the company is not completely upfront and transparent with how the data is being used, and I don't mean a 20-page consent document where you can't figure out what they're doing, then you should either not be doing it or be asking those questions, and you should have comfort that there are a lot of other consumers out there doing the same. So make sure you're doing that. Cisco tries to work very hard to share with our customers exactly how data is used in all of our products; that's why we published the data privacy maps and the data privacy sheets, to kind of make that easy on our customers. But in any business, that's something that a consumer should be asking, a customer should be asking, and the company should explain, simply and transparently. The number one complaint that individuals still have today is that they don't understand what companies are doing with their data. >> Yeah. >> I mean, it's just mind-boggling, and that's, I think, again, the advice I give them: you've got to get that right. >> How does Cisco do it? What do you guys do? What do you offer people? I mean, let's just say people want to check. What is the mechanism that you guys are putting in place? Because I have no idea whether WebEx is running facial recognition on my video, or whether my packets being routed through Cisco routers are being sniffed. How do you guys put that transparency out there? >> Robert: Well, you, like many customers, ask those questions. And so we started creating and publishing these privacy data sheets, which are relatively streamlined, fairly short documents that you can go through and say, okay, I understand where the data is going. And we've done that on a whole bunch of the most requested products. We've taken another step to make them now very visual. I think we talked about that; we just launched it a year ago, where we tried to make them look like subway maps, where you have sort of color-coded ways the data flows through the system. And those are available. Anybody can come get them from trust.Cisco.com on the website; they're publicly available for customers who are interested in a product. They don't have to go down the road and wonder whether it's just going to meet their needs; they can get almost all of their questions answered through that. Yes, there may be some additional questions we want to answer later, like through the lawyers and through the conversations, but we at least have a mechanism for giving most of that information up front. >> Stu: Yeah, I love that trust was something that was front and center in the keynote this morning. I'm curious, Robert, with Cisco's position in the marketplace and the ecosystem you have: is this something Cisco can lead, or is the industry considering having kind of like a Better Business Bureau? I should be able to go there and say, is this a reputable company? Am I okay doing business with them?
From a privacy standpoint, are there any initiatives in the works, or is that something you might foresee going forward, so that I know, oh hey, this is somebody it makes sense for me to work with?
>> It's an interesting idea; something could be created around that. I mean, I think where we are today, there's still huge value in the government playing a role. The idea of GDPR and other regulations, if you have too much of it, may not be helpful. But in today's environment, because the consumer can't always trust the company to do what they say they're going to do, and you may not even be able to figure it out from the policy to begin with, the government's role is to make sure that they're doing what they say they're going to do, and therefore consumers want government involved in that. So that, again, there's a role for fines and penalties, which means that some of the guys are at least being--
>> Stu: Well, I wonder, even as you look at some of the fragmentation of the internet today, is there something that government, or something intergovernmental, kind of like the organization that runs the internet today, where there would be some room for them to be involved in something like that? I know it's a big audacious thing, but the general public, they don't trust most corporations with their information.
>> Right. And it's a nice idea, especially in an environment where we want to avoid 50 different state legislative environments that companies are going to have to comply with. I mean, so far, going back to our study, we see this very positive return on privacy investment. If we get 50 more state laws that people have to comply with, that's very quickly going to turn negative, right? So although consumers are demanding more, and it's more part of the brand, if we have too much regulation we'll start to see that turn around. So your idea of consolidation, having a single way, is a very positive idea.
>> Stu: In your report, I saw that GDPR and CCPA, and oh, China's doing something, Brazil's doing something; it's going to become onerous on both the supplier and the consumer side if there isn't some commonality between them.
>> Robert: Fully agree. That's right.
>> John: Well, I've got the report here, folks, check it out. It is an amazing report. Every year the team does an amazing job. This year it's about privacy ROI. This proves that good hiring works; privacy hiring practices, diversity and inclusion, have paid off, and this is the new modern era. I want to switch gears on that note, because Robert, we always love to talk about your role; you're in data privacy and economics. So privacy economics, ROI, the security and trust organization. The economic value is a big part of your study here. I think it's just scratching the surface. And I want to give you an example and have you react to it. I was having a conversation with a big-time venture capitalist who just changed jobs to start a fund for impact investing for profit. And one of his focus areas is the economics around self-governing communities, around policing some of these regulation issues. There's so much regulation, business could get stunted. There's a trend going on now, Stu kind of led into it, where communities are going to start to govern the brands as a check, based on buyer behavior.
So there are real signals that users are reacting to companies' policies, whether it's data or the environment or whatever; people are making purchasing decisions and organizing, and that's going to change the economics. That's a top-line impact, not just a cost structure for having certain regulatory policies. This is a venture capitalist. What's your reaction? He's investing in this direction. He thinks it's going to be big. Your thoughts?
>> Well, I think there's evidence for that. Again, it's the idea that a company is more valuable because of some of the things we're talking about. We actually asked that question: did respondents think that their company was more valuable because they had progressed along the privacy dimension, because of the loyalty and trust they'd built with their customers, aside from the operational benefits, and maybe the compliance benefits as well. So I would say there's evidence for companies thinking that they themselves are further along, and those companies that have gone beyond just the minimum, that have become a little more mature, a little more accountable in their privacy programs, are getting the best returns. We talked about that $270 on 100. If you're investing a little more and going up that curve, it's $310 on that 100. So again, a better return on your investment, more loyalty and value, and you see your company as being more valuable. So I think there's strong evidence for that happening. Now, how that actually works operationally is another question, but there's something there.
>> John: Stu and I were talking about how advertising, social networks, and media are all changing. And one of the things you're driving at is that advertising used to be an attention game: get on TV, spread the message around. What you're teasing out, and what Stu was talking about earlier in our other session, is that influence and reputation is the new benchmark. So it's not so much know my brand, my key rating or brand impression; it's reputation. You're getting at something really interesting around reputation, which is swinging buying behavior. This is a new dynamic.
>> Robert: Yeah, I think that reputational aspect is such an important part of the brand, and even of doing business, and why this whole issue-- I mean, the idea of privacy becoming a central tenet of the company and the brand and the overall experience is kind of what we're seeing as that pendulum swings back to the consumer and the ability to make those choices. It's becoming more and more important for companies to get that right and have that be part of it. That's the value of the company, again, the value of the overall relationship. So I think that's a positive direction.
>> John: We really appreciate you coming on. I want to get your thoughts, last question. What's your vision of where we are today in the world? You look around, and you'll be happy with some of the things. You look at things like Facebook going through a change, Jeff Bezos' phone being hacked via a video on WhatsApp. You've got the political environment, you have this entire trust equation. And it's just a dynamic time. Your vision of how trust and data privacy and the economics all fit with your role: what do you see happening? What's your vision?
>> I'm very optimistic about where it's going. I mean, I think we see ups and downs, and we see setbacks. We see millions of records get exposed on users, and they get concerned about things.
But I think we're trying to put the right processes and controls in place so that the right things do happen with data, all trying to create that world we all want to live in, where, when our data is shared, it's used appropriately. So it's not going to be a smooth upward curve. But again, there's the legislative process, where our governments are seeing that consumers need these protections, that we can't go it alone, that we need help with the companies we work with; and there's the fact that companies are willing to take more of it onto themselves. I mean, that governments, companies who are concerned about the regulations, and individuals themselves would share the responsibility for creating all of those protections makes me very positive about where it's going.
>> John: And as politicians from all around the world, whether it's the United States or other countries, have to figure out how much regulation to put on the tech companies, this is a flashpoint where industry could do its part and be part of the solution, not just be regulated, hopefully. Too much regulation kills entrepreneurship, in my opinion, but that's my opinion.
>> Robert: It would kill our ROI, right?
>> ROI.
>> Down the toilet.
>> Okay, theCUBE, bringing all the great conversations here at Barcelona, Data Privacy Day. This is a big part of our society now, and there's now evidence that it's worth investing in privacy, thanks to Cisco's report. Good ROI. Of course, great ROI if you stay with theCUBE for more action after this short break.
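The return figures Robert cites in this conversation ($270 back for every $100 of privacy spend, rising to $310 on $100 for more mature programs) come down to simple benefit-to-spend arithmetic. A minimal sketch of that calculation, assuming the survey's benefit and spend figures are directly comparable dollar amounts; the function name and numbers below are illustrative, taken from the conversation rather than from the underlying report:

    def privacy_roi(benefit, spend=100.0):
        # ROI as a percentage of spend: (benefit - spend) / spend.
        return (benefit - spend) / spend * 100

    # Figures cited in the interview: $270 back on $100 invested for the
    # average program, $310 on $100 for more mature privacy programs.
    for label, benefit in [("average program", 270.0), ("mature program", 310.0)]:
        print(f"{label}: {privacy_roi(benefit):.0f}% ROI "
              f"(${benefit:.0f} returned per $100 spent)")

Run as written, this prints 170% ROI for the average program and 210% for the mature one, which is the "better return as you go up the curve" point in plainer numbers.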

Published Date : Jan 28 2020

Michelle Dennedy & Robert Waitman, Cisco | Cisco Live EU 2019


 

>> Live from Barcelona, Spain, it's theCUBE! Covering Cisco Live! Europe, brought to you by Cisco and its ecosystem partners.
>> Hello everyone, welcome back to theCUBE's live coverage here in Barcelona, Spain for Cisco Live! Europe 2019. We're at day three of three days of coverage. I'm John Furrier with Dave Vellante. Our next two guests, we're going to talk about privacy data: Michelle Dennedy, VP and Chief Privacy Officer at Cisco, and Robert Waitman, who is the Director of Security and Trust. Welcome back, we had them last year, and everything we talked about kinda's happening on steroids here this year.
>> Yep.
>> Welcome back.
>> Thank you, glad to be here.
>> Thanks for having us.
>> So security, privacy, they all go hand in hand. A lot going on. You're seeing more breaches, you're seeing more privacy challenges. Certainly GDPR's going to the next level. People are, quote, complying: here's a gig of data, go figure it out. So there's a lot happening, give us the update.
>> Well, as we suggested last year, it was privacypalooza all year long, running up to the enforcement deadline of May 25, 2018. There were sort of two kinds of companies. There's one kind that ran up to that deadline and said, woohoo, we're ready to drive this baby forward! And then there's a whole nother set of people who are still sort of, oh my gosh. And then there's a third category of people who still don't understand. I had someone come up to me several weeks ago and say, what do I do? When is this GDPR going to be a law? I thought, oh honey, you need a hug.
>> Two years ago, you needed some help.
>> And some companies in the US, at least, were turning off their websites. Some media companies were in the news for actually shutting down their sites and not making them available because they weren't ready. So a lot of people were caught off guard, some were prepared, but still, you said people would be compliant, kind of, and they did that, but still more work to do.
>> Lots more work to do, and as we said when the law was first promulgated two and a half years ago, GDPR and the deadline: A, it's just one region, but as you'll hear as we talk about our study, it's impacting the globe. And it's also not the end of anything; it's the beginning of the information economy at long last. So I think we all have a lot to do. Even if you feel rather confident of your base-level compliance, now it's time to step up your game and keep on top of it.
>> Before we get into some of the details of the new findings you guys have, I want you to take a minute to explain how your role is now centered in the middle of Cisco, because if you look at the keynotes, data's in the center of a lot of things: in this intent-based network on one side, and you've got cloud and edge on the other. Data is the new ingredient that's feeding applications and certainly collective intelligence for security. So the role of data is critical. This is a big part of the Cisco tech plan, never mind policy and privacy and these other things. You're in the middle of it. Explain your role within Cisco and how that shapes you.
>> How we sort of fit in. Well, it's such a good question, and actually, if you watch our story through theCUBE, we announced, actually on Data Privacy Day several years ago, that data is the new currency, and this is exactly what we're talking about. The only way that you can operationalize your data currency is to really think about it throughout the platform. You're not just pleasing a regulator, you're not just pleasing your shareholders, you're not just pleasing your employee base.
So, as such, the way we organize our group is that my role sits under the COO's office, our Chief Operations Office, under the office of John Stewart, who is our Chief Trust Officer. So security, trust, and advanced research all live together in operations. We have sister organizations in places like public policy, legal, marketing, the sales groups; the people who are actually operationalizing come together as a group. My role really is to provide two types of strategy. One, rolling out privacy engineering and getting it adopted inside and outside of the company as quickly as possible. It's something new. As soon as we have set processes, I put them into my sister organizations, and they send them out as routine and hopefully automated things. The other side, the work Robert and I do together, is looking at data valuation models: working on the economics of data, where it drives up revenue and business and speeds time to closure, and how we use data not just to be compliant on the privacy risk, but to really control our overall risk and the quality of our information overall. It's a mouthful.
>> So that's interesting, and Robert, that leads me to a question. We've seen these unfunded mandates before; we saw it with Y2K, the Enron backlash, certainly the United States' Federal Rules of Civil Procedure. And the folks in the corner office would say, oh, here we go again. Is there any way to get more value beyond just reducing risk and complying? And have you seen companies be able to take data and value and apply it based on the compliance and governance and privacy policies?
>> Dave, that's a great question. It's sort of the thought that we had; the hypothesis was that this was going to be more valuable than just for the compliance reasons, and one of the big findings of the study that we just released this week was that, in fact, those investments-- you know, we're saying that good privacy is very good for business. It was painful; some firms stuck their heads in the sand and said, I don't even want to do this. But still, going through the GDPR preparation process, or the process for any of the privacy regulations, has pushed people to get their data houses in order, and it's important to communicate that. We wanted to find out what benefits were coming to those organizations that had made those investments, and that's really what came out in our study this week for international Data Privacy Day; we got into that quite a bit.
>> What is this study? Can you give us some details on it?
>> It's the Data Privacy Benchmark Study, which we published this week for international Data Privacy Day. It's sort of an opportunity to focus on data privacy issues, both for consumers and for businesses; it's the one day a year, kind of like Mother's Day, that you should always think of your mom-- well, you should always think of your mom, but Mother's Day is a good day for it, and likewise you should always think of privacy when you're making decisions about your data, but it's a chance to raise awareness. So we published our study this year, and it was based on over thirty-two hundred responses from companies around the world, from 18 countries, across all sorts of sizes of companies. And the big findings were in fact around that: privacy has become a serious, boardroom-level issue, and the awareness has really skyrocketed among companies' customers, who are saying, before I do business with you, I want to know how you're using my data. What we saw this year is that seven out of eight companies are actually seeing some sales delay from their customers asking those kinds of questions.
But those that have made the investment, getting ready for GDPR or being more mature on privacy, are seeing shorter delays. If you haven't gotten ready, you're seeing 60% longer delays. And even more interesting for us, too, is when you have data breaches, and a lot of companies have them, as we've talked about, those breaches are not nearly as impactful. The organizations that aren't ready for GDPR are seeing three times as many records impacted by the breach. They're seeing system downtime that's 50% longer, and so the cost of the whole thing is much more. So, to the question of whether this is still something good to do: it's not only because you have to do it, when you want to avoid 4% penalties from GDPR and everything else, but because it's something that's so important for my business that it drives value.
>> So the upshot there is that you do the compliance. Okay, check the box, we don't want to get fined, so you're taking your medicine, basically. It turns into an upside with the data you're seeing from your report: sales benefit, and then just preparedness, readiness for breaches.
>> Right, I mean it's a nice--
>> Is that right?
>> That's exactly right, John, you've got it right. Then you've got your data house in order; I mean, there's a logic to this. So then, if you've figured out where your data is, how to protect it, who has access to it, you're able to deal with these questions. When customers ask you questions about that, you're ready to answer them. And when something bad goes wrong, let's say there is a breach, you've already done the right things to control your data. You've gotten rid of the data you don't need anymore. I mean, 50% of your data isn't used for anything, and of course we suggest that people get rid of it; that makes it less exposed when and if a breach occurs.
>> So I've got to ask you a question on the data valuation, because a lot of the data geeks and data nerds like myself saw this coming. We saw data, mostly on the tech side: if you invested in data, it was going to feed applications, and I think I wrote a blog post in 2007 that data's going to be part of the development kits, or the development environment. You're seeing that now here. Data's now part of application development; it's part of network intelligence for security. Okay, so yes, check, that's happening. At the CFO level, can you value the data so it's a balance sheet item? Can you say we're investing in this? So you start to see movement. You almost project, maybe in a few years, or now, how do you guys see the valuation? Is it going to be another kind of financial metric?
>> Well John, it's a great point, seeing where we're developing around this. I think we're still in somewhat early days of that issue. I think the organizations that are thinking about data as an asset and monetizing its value are certainly ahead on this; we're trying to do that ourselves. We probed on that a little bit in the survey, just to get a sense of where organizations are, and only about a third of organizations are doing those data-mature things. Do they have a complete data map of where their stuff is? Do they have a Chief Data Officer? Are they starting to monetize their data in appropriate ways? So, there's a long way to go before organizations are really getting the value out of that data.
>> But the signals are showing that there's value in the data. Obviously the number of sales; there's some upside to compliance. Not just doin' it to check the box; there are actually business benefits.
So how are you guys thinking about this, 'cause you guys are early adopters or leaders in this? How are you thinking about the data measurement of it? Can you share your insights on that?
>> Yeah, so you know, data on the balance sheet: Grace Hopper, 1965, right? Data will one day be on the corporate balance sheet, because in most cases it's more valuable than the hardware that processes it. This is the woman who was making software and hardware work for us, in 1965! Here we are in 2019. It's coming onto the balance sheet. She was right then; I believe it now. What we're doing, even starting out, is a study of correlation rather than causation. So now we have at least the artifacts to say to our legal teams, go back and look: when you have one of our new, improved, streamlined privacy sheets and you're telling a deal's story in a more transparent fashion, mark the time that you're getting the question. Mark the time that you're finishing. Let's really be much more stiletto-like, measuring time to close and efficiency. Then we're adding that capability across our businesses.
>> Well, one use case we heard on theCUBE this week was around privacy and security in the network versus on top of the network, and one point that was referenced was that when a salesperson leaves, they take the contacts with them. So that's an asset, and people get sued over it. So this again, this is a business policy thing. So business policy sounds like...
>> Well, a lot of the solutions that exist in the marketplace, or have existed-- I've sat on three encrypted email companies before encrypted email was something the market desired. I've sat on two advisory boards of-- the hope that you could sell your own data to the marketers. Every time someone gets an impression, you get a micro-cent or a bitcoin. We haven't really gotten that, because we're looking at the periphery. What we're really trying to do is look at what the actual business flows and processes are in general, and say things like, can we put a metric on having fewer records, higher impact, and higher quality? The old data quality issue in the CDO world is rising up again: get that higher quality, then correlate it with speed to innovation, speed to close, launch times, the things that make your business run anyway. Now correlate it, and eventually find causal connections to data. That's how we're going to get that data on the balance sheet.
>> You know, that's a great point. The data quality issue used to be kind of a back-office records management function, and now it's coming to the fore. And I'll just make an observation: if you look at what were, before the Facebook fake news, the top five companies in the United States in terms of market value: Amazon, Google, Facebook was up there, Microsoft, Apple. They're all data companies, and so the market has valued them beyond the banks, beyond the oil companies. So you're starting to see clearer, quantifiable evidence that there's value there. I want to ask you about-- we have Guillermo Diaz coming up shortly, Michelle, and I want to ask you your thoughts on the technical function. You mentioned it's a board-level issue now, privacy. How should the CIO be communicating to the board about privacy? What should that conversation be like?
>> Oh my gosh. So we now report quarterly to the board, so we're getting a lot of practice. We'll put it that way.
I think we're on the same journey the security teams took: you used to walk into the board and go, here's what ransomware is, and all of these former CFOs and sales guys would look at you and go, ah, okay, on to the financials, because there wasn't anything for them to do strategically. Today's board metrics are a little soft. It's more activity-driven. Have you done your PIAs? Have you passed some sort of a third-party audit? Are you getting rejected from third-party value chains in your partner communities? That's the have and have-not, and da da da. To me, I don't want my board telling us how to do operations; that's what we do. To really give the board a more strategic view, what we're really trying to do is study things like time to close, and then show trending impacts. The one conversation with John Chambers that's always stuck in my head is that he doesn't want to know what today's snapshot is, 'cause today's already over; give me something over time, Michelle, that will trend. And so even though it sounds like, you know, who cares if your sales force is a little annoyed that it takes longer to get this deal through legal, well, it turns out when you multiply that in a multi-billion-dollar environment, you're talking about hundreds of millions of dollars, probably a week, lost to inefficiency. So, if we believe in efficiency in the tangible supply chain, that's the more strategic view I want to take, and then you add on things like, here's a risk portfolio, a potential fair-risk-reporting type of thing: if we want to do a new business, do we light up a business in Ukraine right now, versus Barcelona? That is a strategic conversation; that is board-level. We've forgotten that by giving them activity.
>> Interesting what you say about Chambers. John, you just interviewed John Chambers, and he was the first person, in the mid-90s, to talk about a virtual close, if you remember that. So, obviously, what you're talking about is way beyond that.
>> Yeah, and you're exactly right. Let's go back to those financial roots. One of the things we talk about in privacy engineering is getting into people's heads the concept that the data changes. So, the day before your earnings, that data will send Chuck Robbins to jail if someone is leaking it and causing people to invest accordingly. The day after, it's news; we want everyone to have it. Look at how you have to process and handle and operationalize it in 24 hours. Figuring out those data stories helps turn it on its head and make it more valuable.
>> You know, you mentioned John Chambers. One of the things that I noticed was he really represented Silicon Valley well in Washington DC, and there's been a real void there since he retired. You guys still have a presence there and are doing stuff there, and you see Amazon with Theresa Carlson doing some great work there, and you've still got Oracle and IBM in there doing their thing. How is your presence and leadership translating into DC now? Can you give us an update of what's happening at--
And so one of the things that we are doing in privacy is really looking at what does a solid bill look like so at long last we can get a solid piece of federal legislation and Chuck is talking about it at Davos as was everyone else, which was amazing and now you're going to hear his voice very loudly ringing through the halls of DC >> So he's upping his game in leadership in DC >> Have you seen the size of Chuck Robbins? Game upped, privacy on! >> It's a great opportunity because we need leadership in technology in DC so-- >> To affect public policy, no doubt >> Absolutely. >> And globally too. It's not just DC and America but also globally. >> Yeah, we need to serve our customers. We win when they win. >> Final question, we got to get wrapped up here but I want to get you guys a chance to talk about what you guys announced here at the show what's going on get the plug in for what's going on Cisco Trust. What's happening? >> Do you want to plug first? >> Well, I think a few things we can add. So, in addition to releasing our benchmark study this week and talking about that with customers and with the public we've also announced a new version of our privacy data sheets. This was a big tool to enable salespeople and customers to see exactly how data is being used in all of our products and so the new innovation this week is we've released these very nice, color created like subway maps, you know? They make it easy for you to navigate around it just makes it easy for people to see exactly how data flows. So again, something up on our site at trust.cisco.com where people can go and get that information and sort of make it easy. We're pushing towards simplicity and transparency in everything we do from a privacy standpoint and this is really that trajectory of making it as easy as possible for anyone to see exactly how things go and I think that's the trajectory we're on that's where the legislation both where GDPR is heading and federal legislation as well to try to make this as easy as reading the nutrition label on the food item. To say what's actually here? Do I want to buy it? Do I want to eat it? And we want to make that that easy >> Trust, transparency accountability comes into play too because if you have those things you know who's accountable. >> It's terrifying. I challenge all of my competitors go to trust.cisco.com not just my customers, love you to be there too go and look at our data subway maps. You have to be radically transparent to say here's what you get customer here's what I get, Cisco, here's where my third party's. It's not as detailed as a long report but you can get the trajectory and have a real conversation. I hope everybody gets on board with this kind of simplification. >> Trust.cisco.com we're going to keep track of it. Great work you guys are doing. I think you guys are leading the industry, Congratulations. >> Thank you. >> This is not going to end, this conversation continues will continue globally. >> Excellent >> Thanks for coming on Michelle, appreciate it. Robert thanks for coming on. CUBE coverage here day three in Barcelona. We'll be back with more coverage after this break.

Published Date : Jan 31 2019
