Russell Schrader, National Cyber Security Alliance | Data Privacy Day 2018
(soft click) >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at Data Privacy Day 2018 here at downtown San Francisco, the LinkedIn headquarters, gracious enough to host this event. Bigger than last year, last year we were here for the first time at Twitter. And really the momentum continues to grow 'cause there's some big regulations coming down the pike that are really going to be put into place. And have significant financial penalties, if you don't get your act together. So, we're excited to have the new Russell Schrader, the Executive Director of the National Cyber Security Alliance, the organization behind this event. Russ, great to see you. >> Thank you very much for coming today, it was a great event. >> Absolutely, so, you've been on the job, this job, you said like less than two weeks. >> It's true. >> What do you think? I mean then they throw you right into the big event. >> Well, I've known the organization, I've known the event. But the staff really has done an outstanding job. They made it so easy for me, everything that they've done has just been terrific. They lined up fantastic speakers, they picked cutting edge topics, they put together a really well paced program, and it was just a terrific day for all of us to get in, really have some good discussions. >> You're off to a great start. (chuckles) >> Thank you. (both laugh) >> So you said you're familiar with the organization. You know, why are you here? Why did you take advantage of this opportunity? What do you kind of see as the role of this organization? And where do you see the opportunities to really make some significant impact going forward? >> Sure, the National Cyber Security Alliance is a who's who in the organization. People who really care about cyber security. Who see it as part of their social obligation. And it was a wonderful group that I'd worked with before. When I was at Visa and I see now, coming in as Executive Director, to really take it to the next level. We really are pushing, I think, on four separate areas that I think there's a lot of opportunity for it. Doing more corporate work. Serving more consumers, more consumer education, more consumer awareness. I think working with educating staffers on the Hill and in regulatory agencies in D.C. on changes and technological changes. And the cutting edge stuff. But also, I think, working with academia, sort of getting involved and getting some of the scholarly, the cutting edge, the new ideas. And just preparing for what's going to happen in the next few years. >> Right, that's interesting 'cause you guys are National Cyber Security, security is often used as a reason to have less privacy. Right? It's often the excuse that the government, Big Brother, would use to say, you know, "We need to know what you're up to, we've got red light cameras all over the place to make sure you're not running red lights." So, it's an interesting relationship between privacy, security, and then what we're hearing more and more, really, a better linchpin to drive all this, which is, identity. So I wonder if you can share your kind of perspective on kind of the security versus privacy. Kind of trade off and debate. Or am I completely off base and they really need to run in parallel? >> Well, they do intersect a whole lot. People have talked about them being two sides of the same coin. Another speaker today said that security is a science but privacy is an art.
Part of it is, you know, security is keeping the data in one place, the same way in as when you put it out. Sort of an integrity piece. You know, it isn't being misused, it's not being manipulated in a way and it's just not being changed. So that's a security piece. The privacy piece is people choosing what is used with that data. You know, is it to help me with an app? Is it to give me more information? Is it to give me games to play and things like that? So and that leads into a lot of different advantages in the web and on the internet. Now, identity, since you put in a trifecta of big terms. >> Everything's got to be in threes, right? >> And there's three reasons for that. I think that, you know, the identity part is part of who you are. Now on the internet you can be a lot of people, right? The old cartoon was, you know, on the internet no one knows you're a dog. Well, on the internet, you can be a dog, you can be, you know, the person who you are at school, you can be the person who you are among your friends, you can be the person who you are at work. And those different selves, those different identities, are the internet of me. And we just need to make sure that you are curating your identities and sharing the information that you feel comfortable with. And making sure that those are reaching the right people and not the wrong people. >> Right. So, there's an interesting kind of conundrum, we cover a lot of big data shows. And, you know, and there is kind of a fiduciary, moral, and now legal responsibility as you're collecting this data to drive some algorithm, some application that you know what you're using it for. And it's a good use of that. And you have an implicit agreement with the people providing you the data. But one of the interesting things that comes up is then there's this thing where you've got that data and there's an application down the road that was not part of the original agreement. That no one even had an idea whatever happened. How does that fit in? Because as more and more of this data's getting stored. And there's actually a lot of value that can be unlocked, applying it in different ways, different applications. But, that wasn't the explicit reason that I gave it to you. >> Right, right. And that's really tricky because people have to be really vigilant. There is that education piece. That is the personal responsibility piece to do business with companies and with apps that you feel comfortable with. But, you still have to trust but verify. And you do want to look into your phone, look into your PC, look into your other device. And figure out where things have changed, where things are moving. That's one of the great things about being in the Bay Area today is innovation. But innovation, you just want to make sure that you are participating in it and you're in the part of innovation that's best for you. >> Okay, so, you mentioned academe, which is great, we do a lot of stuff at Stanford, we do a lot of stuff at MIT. So, as you look at kind of the academic opportunities. Kind of, where is some of the cutting edge research? Where are some of the academe focus areas that are helping advance the science of privacy? >> Well, you named two of the most forward thinking ones right there. So, I'll add to that just because we're talking about Stanford, we have to talk about Berkeley. >> Jeff: Yes. >> Right, and Berkeley does have the whole group in privacy and law. On the East Coast, in addition to MIT, you see George Washington is doing some things.
George Mason is doing some things. And so you want to reach out to different areas. Cornell is doing things as well. So, we want to be able to figure out, where are the best ideas coming from? There are conferences already there. And maybe we can convene some papers, convene some people. And source out and give a little bit of more push and publish to people who otherwise wouldn't be getting the kind of publicity and encourage the kind of research. In privacy and in cyber security. Because there is the business and the consumer educational component. Not just, you know, the tech component to the academic work. >> So, before I let you go, last question. Where do you see is the biggest opportunity? Where's the biggest, either gap that needs to be filled, you know, kind of positive that's filling in negative, or an untapped positive that we've just barely scraped the surface of? >> Well, I think it's all about the consumer, to a large extent, to large one. You've got to figure out, how do you make your life easier. Right? Go back to the iPad introduction, nobody knew that they needed an iPad until they realized they couldn't live without it. You look at what's happened with mobile, right? Now, the idea of having a wallet, is on your phone. So, while I'm waiting in line at the grocery store, I'm checking my messages, I'm texting back and forth. And I just point my phone and I pay. Those kinds of areas are the kind of innovations that are consumer facing, that I think are really terrific. There's a lot of business work as well being done. But you have to figure out where that's going to go and I think the consumer just has a fantastic opportunity. >> Alright, well good opportunity, look forward to catching up a year from now and seeing how much progress you make. >> I think we had such a great program this year, I can't wait til next year, thank you. >> He's Russ Schrader, he's the Executive Director. I'm Jeff Frick, you're watching theCUBE, we're at Data Privacy Day 2018 in San Francisco. Thanks for watching, we'll catch you next time. (soft electronic music)
SUMMARY :
And really the momentum continues to grow Thank you very much for coming today, you said like less than two weeks. I mean then they throw you right into the big event. Well, I've known the organization, I've known the event. You're off to a great start. Thank you. And where do you see the opportunities And the cutting edge stuff. So I wonder if you can share your kind of perspective the same way in as when you put it out. and sharing the information that you feel comfortable with. And you have a implicit agreement And you do want to look into your phone, So, as you look at kind of the academic opportunities. Well, you named two of the And so you want to reach out to different areas. Where's the biggest, either gap that needs to be filled, You've got to figure out, how do you make your life easier. and seeing how much progress you make. I think we had such a great program this year, Thanks for watching, we'll catch you next time.
SENTIMENT ANALYSIS :
ENTITIES
Jerrod Chong, Yubico | Data Privacy Day 2018
>> Hey welcome back everybody, Jeff Frick here with The Cube. We're in downtown San Francisco at LinkedIn's headquarters at Data Privacy Day 2018. Second year we've been at the event, pretty interesting, you know there's a lot of stuff going on in privacy. It kind of follows the security track, gets less attention but with the impending changes in regulation it's getting much more play, much more media. So we're excited to be joined by our next guest. He's Jerrod Chong, the Vice President of Product at Yubico. Jerrod, welcome. >> Thank you Jeff. >> So for folks that aren't familiar with Yubico, what are you guys all about? >> We're all about protecting people's identities and privacy and letting them authenticate securely to online accounts. >> So identity, that's so, an increasingly important strategy for security. Don't worry about the wall, can we really figure out who this person is? So how has that been changing over the last couple years? >> Yes, there's definitely a lot of things that have been changing. So we can think of identity as, some companies want to know who you are. But some companies actually are okay with you being anonymous but then they want to still know that the person that they talk to is still the person. And so what we see in the world of data is-- >> An anonymous person as opposed to a not-- >> Someone else. We want to make sure the anonymous person is the same anonymous person. >> Oh okay, okay, right. >> And that's important, right? If you can think of like a journalist and you think of, they need to talk to the informer so they need to know that this is the real informer. And they don't want to have the fake informer tell them the wrong story. And so they need a way to actually strongly authenticate themselves. And so identity is a very interesting intersection of strong authentication. But at the same time, real identities as well as anonymous identities. And there are actually real life applications for both that can protect citizens, can protect dissidents but also at the same time can help governments do the right things when they know who you are. >> Right, so we're so far behind that I still can't understand why you dial into the customer service person and you put in your account number and they still want to know your mom's maiden name. And we've told them all a thousand times that can't be much of a secret anymore. And then I read something else that said the ability to use a nine-digit Social Security number and keep that actually private is basically, the chances of doing that are basically zero. So we're well past that stage in terms of some of these more sophisticated systems but we still kind of have regulations that are still asking you to put in your Social Security number. So what are the ways that you guys are kind of addressing that? And you're kind of taking a novel approach with an open source solution which is pretty cool. >> Yes, we've created the open standard, which is the FIDO U2F standard, and we actually co-created this with Google. And one of the key things is that what we call knowledge-based systems are just a thing of the past. Knowledge-based is anything that you try to remember, including passwords. And what we call recovery questions. You know, you name the recovery question that you want to put in. >> Right, right, your dog, your pet, you know your street. >> And you can get everything online from LinkedIn or Facebook. So why are we doing those systems? And obviously they are, we need to change that.
But this open standard that we've created really allows you to physically prove yourself with a physical device. Like, so you want to tell who you are and there are a couple ways you can tell who you are online. You can tell by remembering something, by something that you have, and something that you are, right? So these are the basics in how you can identify yourself over the wire. And what we've really focused on is the combination of something you have and something you know. But the something you know is not revealed to the world. The something you know is revealed to the device that you have. So it's kind of like your ATM card. You're not going to tell the PIN to the world. Nobody really has your ATM PIN, nobody asks you for the ATM PIN. Even the banks don't know what your ATM PIN is and you can change that and only you know about it. And it's only on the card. And so we take that same concept and make it available for companies to implement these types of authentication systems for their own services. So today Google supports this open standard. Actually today Facebook supports it as well. And Salesforce and a host of other services. Which means that you can actually authenticate yourself with a device and something you know. And that really allows you as an individual to not have to think about all these different things that you have to remember for every single site because that's what people are doing today. And so the beauty about this protocol as well, and this is what the developers think, is that these systems, they don't know that you have the same authenticator. Which is a great thing, so they can't collude and share and then pinpoint it was you. If you take this authenticator you can use it with many different things but all of them don't know that you have what we call the YubiKey. And so this is, the YubiKey that we-- >> So it's like the old RSA key, what we think a lot of people are familiar with. >> What people think, obviously we've, it's way beyond RSA key. >> Right, but it's the same kind of concept, you've got a USB, a little device-- >> And that's what you bring with you and that's who you are. And you can strongly authenticate to the servers that you want. And I think that's really the foundation, which is people want to take back the way that they authenticate through the systems and they want to own it. And that's really a big difference that we see rather than the banks saying that you must have this or you must have that and you can only use it with me, you can't use it with somebody else. I want to bring my authenticator anywhere. >> So you said Google's using that. I'm a huge Google user, I don't have one of those things. So where's the application? Is that something that I choose because I want to add another layer of protection or is that something that Google says hey Jeff, you're such and such a level of customer user et cetera, we think you should take this to the next level. How does that happen? >> So it's actually been available since the end of 2014. It's part of the step-up authentication. The latest iteration of the work that Google has done is the Google Advanced Protection Program. Which means that you can enable one of these devices as part of your account. And one of the things they've done is that for those users at risk you can only log in with these devices. Which really restricts-- >> So they define you as a high-risk person because of whatever reason. >> And they encourage you, hey please protect yourself with additional security measures.
And the old additional security measures used to be like, you know, send me an SMS text. But that's actually pretty broken right now. We've seen it being breached everywhere because of what we call phone hijacking. You know, I pretend to be you and I've got your phone number and you know, now I've got your phone. >> Shoot, I thought that was a good one. >> That is known, there's lots of video on how you can do that. And so this is available now for everyone. Everyone has a Gmail account, you can go into your account, it says I want step-up authentication. They call it two-step verification. And then they walk you through the process. And then you get one of these in the mail? >> You actually have to buy these but Google has been providing within different communities, they've been seeding the market, we've been also doing a lot of advocacy work. Many different types, even here today we've distributed a lot of YubiKeys for all of the journalists to use. But in general users will go online to Amazon or something and you would buy one of these devices. >> So then and then once I have that key and I bought into that system you're saying then I can use that key for not just Google but my Amazon account-- >> Anyone that supports-- >> Anyone that supports that standard? >> Exactly, anybody that supports the standard. And that standard is growing extremely rapidly and it's users, it's big companies using it, developers of sites are using it. So the thing that we created for the world back in 2014 is now being actually accelerated because of all these breaches. They are very relevant to data breaches, identity breaches, and people want to take control. >> Right, I'm just curious, I'm sure you have a point of view, you know why haven't the phone companies implemented more use of the biometric data piece that they have whether, now they're talking about the face recognition or your finger recognition and tied that back to the apps on my phone? I still am befuddled by the lack of that integration. >> There's definitely, there are definitely solutions in that area. And I think, but one of the challenges is that just like a computer, just like a phone, it's a complicated piece of software. There's a lot of dependencies. All it takes is one piece of software to get it wrong and the entire phone can be compromised. So you're back into complicated systems, complex systems, people write these systems, people write these apps. It takes one bad developer to mess it up for everybody else. So it's actually pretty hard unless you control every single ecosystem that you build which is vastly difficult now in the mobile space. The mobile carriers are not just, it's not just from AT&T, you've got the OS, you've got you know, Google, the Android phone. You've got AT&T, you've got the apps on the phone, you've got all the, you know, the various processes, the components that talk to different apps and you've got the calling app, you've got all of these other games. So because it's such a complicated device getting it right from a security perspective is actually pretty difficult. So, but there are definitely applications that have been working over the years that have been trying to leverage the built-in capabilities. We actually see it as the YubiKey can actually be used with this device. And then you can use these devices after you bootstrap them. What we deemed as, what we call blasted device. So you can use multiple different things. And the standard doesn't always define that you just use the hardware device of the YubiKey.
You can use a phone if you trust the phone. We want to give flexibility to the ecosystem. >> So I'm just curious in terms of the open standard's approach for this problem, how that's gaining traction. Because clearly, you know, open source has done very, very well, you know far beyond Linux as an operating system. But you know so many apps and stuff run open source software, components of open source. So in terms of market penetration and kind of adoption of this technology versus the one single vendor key that you used to have, how is the uptake, how is the industry responding? Is this something that a lot of people are getting behind? >> It's definitely getting a lot of traction in the industry. So we started the journey with Google and what was happening was that in order to work with this problem at scale you have to believe that just between, you know, Yubico and Google we can't solve this problem. And if the answer is you got to do my thing, no one's going to play in this game. Just a high level. So I think what we've done is that the open standard is the catalyst for other big players to participate. Without any one vendor necessarily going to win. So today, there's a big plenary going on at FIDO and it's really an iteration of what we've developed with Google. And now we're taking it to the next level with, actually, Microsoft. And we've called it FIDO 2. So from U2F, FIDO Universal Second Factor, to FIDO 2. And that entire work that we've done with Google is now being evolved into the Microsoft ecosystem. So, and we'll see in a couple months, you will start to see real Microsoft products being able to support the same standard. Which is really excellent because what do you use every day? You either use, there's three major platform players that you have today, right, you either use a Google type of device, Chrome or Android. You use a Microsoft device, you've got Windows everywhere. Or you use an Apple device. So, and the only way these large internet companies are going to collaborate is if it's open. If it's closed, if it's my stuff, Google's not going to implement it because it's Microsoft stuff, Microsoft's not going to implement Apple stuff. So the only way you can-- >> I dunno about the Apple part of that analogy but that's okay. >> That's true, that's true, but I think it's important that the security industry, working with the identity issue, work together. And we need to move away from all these one-off, proprietary things. Because it makes it really difficult for the users and the people to implement things. And if everybody's collaborating like an open standard, then you actually can make a dent in the problem that you see today. >> And to your point, right, with BYOD, which is now, used to be a thing, it's not a thing obviously everybody's bringing their own devices. To have an open standard so people at different types of companies with different types of ecosystems with different types of users using different types of devices have a standard by which they can build these things. >> Absolutely. >> Exciting times. >> Exciting times. >> Alright Jerrod, well thanks for taking a few minutes out of your day. We look forward to watching the Yubico story unfold. >> Exactly, thank you very much. >> Alright, very good. He's Jerrod, I'm Jeff, you're watching The Cube at Data Privacy Day 2018, thanks for watching.
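The model Chong describes — a site-scoped key pair on a physical device, a secret that never leaves that device, and no common identifier that services could use to collude — is what the FIDO U2F work grew into with FIDO2 and the WebAuthn browser API. A minimal registration sketch might look like the following; the relying-party name, user fields, and option values are illustrative assumptions, not Yubico's or Google's actual integration:

```typescript
// Rough sketch of WebAuthn registration in a browser (illustrative values only).
// In a real deployment the challenge comes from the server; it is generated
// locally here just to keep the example self-contained.
async function registerSecurityKey(): Promise<void> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    // Random challenge the authenticator signs, proving freshness.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    // The credential is scoped to this relying party; a key minted for
    // "example.com" cannot be seen or reused by any other site, which is
    // why services cannot collude to correlate the same YubiKey.
    rp: { name: "Example Service", id: "example.com" },
    user: {
      id: new TextEncoder().encode("user-1234"), // opaque handle, not a name
      name: "pat@example.com",
      displayName: "Pat Example",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
    authenticatorSelection: { userVerification: "preferred" },
    timeout: 60_000,
  };

  // The browser prompts for the security key; the private key never leaves
  // the device, and only a public key plus attestation is returned.
  const credential = await navigator.credentials.create({ publicKey });
  console.log("Registered site-scoped credential:", credential);
}
```

Sign-in is the mirror-image call, navigator.credentials.get(), which is where the "something you have" is checked on every login; the "something you know" — the PIN — stays between the user and the key itself.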
SUMMARY :
pretty interesting, you know there's a lot what are you guys all about? the authenticate securely to online accounts. So how has that been changing over the last couple years? that is the person that they talk to is the same anonymous person. do the right things when they know who you are. So what are the ways that you guys Knowledge-base is anything that you try to remember And that really allows you as an individual So it's like the old RSA key, what we think it's way beyond RSA key. And that's what you bring with you and that's who you are. So you said Google's using that. Which means that you can enable one of these devices So they define you as a high risk person You know, I pretend to be you and I've got your phone number And then they walk you through the process. to Amazon or something and you would So the thing that we created for the world back in 2014 I'm sure you have a point of view, And then you can use these devices after you bootstrap them. But you know so many apps and stuff And if the answer is you got to do my thing, of that analogy but that's okay. can make a dent in the problem that you see today. And to your point, right, with BYOD, We look forward to watching the Yubico story unfold. He's Jerrod, I'm Jeff, you're watching The Cube
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Jerrod | PERSON | 0.99+ |
Jerrod Chong | PERSON | 0.99+ |
Jeff | PERSON | 0.99+ |
Jeff Frick | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
2014 | DATE | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
Apple | ORGANIZATION | 0.99+ |
Yubico | ORGANIZATION | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
AT&T | ORGANIZATION | 0.99+ |
FIDO 2 | TITLE | 0.99+ |
ORGANIZATION | 0.99+ | |
ORGANIZATION | 0.99+ | |
end of 2014 | DATE | 0.99+ |
today | DATE | 0.99+ |
Second year | QUANTITY | 0.99+ |
one | QUANTITY | 0.98+ |
zero | QUANTITY | 0.98+ |
both | QUANTITY | 0.98+ |
The Cube | TITLE | 0.98+ |
Android | TITLE | 0.98+ |
Linux | TITLE | 0.98+ |
one software | QUANTITY | 0.97+ |
Windows | TITLE | 0.97+ |
Data Privacy Day 2018 | EVENT | 0.97+ |
YubiKey | ORGANIZATION | 0.96+ |
nine digit | QUANTITY | 0.96+ |
two step | QUANTITY | 0.96+ |
The Cube | ORGANIZATION | 0.96+ |
Chrome | TITLE | 0.93+ |
one bad developer | QUANTITY | 0.89+ |
FIDO Universal Second Factor | TITLE | 0.88+ |
FIDO | TITLE | 0.86+ |
single site | QUANTITY | 0.83+ |
last couple years | DATE | 0.83+ |
single ecosystem | QUANTITY | 0.83+ |
U2F | ORGANIZATION | 0.83+ |
three major platform players | QUANTITY | 0.82+ |
FIDO U2F | TITLE | 0.8+ |
San Francisco | LOCATION | 0.78+ |
YubiKey | OTHER | 0.76+ |
one single vendor | QUANTITY | 0.76+ |
a thousand times | QUANTITY | 0.75+ |
RSA | OTHER | 0.72+ |
one of | QUANTITY | 0.71+ |
couple ways | QUANTITY | 0.7+ |
Yubico | PERSON | 0.7+ |
one vendor | QUANTITY | 0.69+ |
RSA key | OTHER | 0.66+ |
Eva Velasquez, Identity Theft Resource Center | Data Privacy Day 2018
>> Hey, welcome back everybody, Jeff Frick here with The Cube. We're at Data Privacy Day 2018, I still can't believe it's 2018, in downtown San Francisco, at LinkedIn's headquarters, the new headquarters, it's a beautiful building just down the road from the sales force building, from the new Moscone that's being done, there's a lot of exciting things going on in San Francisco, but that's not what we're here to talk about. We're here to talk about data privacy, and we're excited to have a return visit from last year's Cube alumni, she's Eva Velasquez, president and CEO, Identity Theft Resource Center. Great to see you again. >> Thank you for having me back. >> Absolutely, so it's been a year, what's been going on in the last year in your world? >> Well, you know, identity theft hasn't gone away >> Shoot. >> And data-- >> I thought you told me it was last time. >> I know, I wish, and in fact, unfortunately we just released our data breach information, and there was a tremendous growth. It was a little over 1000, previous year, and over 1500 data breaches... in 2017. >> We're almost immune, they're like every day. And it used to be like big news. Now it's like, not only was Yahoo breached at some level, which we heard about a while ago, but then we hear they were actually breached like 100%. >> There is some fatigue, but I can tell you that it's not as pervasive as you might think. Our call center had such a tremendous spike in calls during the Equifax breach. It was the largest number of calls we'd had in a month, since we'd been measuring our call volume. So people were still very, very concerned. But a lot of us who are in this space are feeling, I think we may be feeling the fatigue more than your average consumer out there. Because for a lot of folks, this is really the first exposure to it. We're still having a lot of first exposures to a lot of these issues. >> So the Equifax one is interesting, because most people don't have a direct relationship with Equifax, I don't think. I'm not a direct paying customer, I did not choose to do business with them. But as one of the two or three main reporting agencies, right, they've got data on everybody for their customers who are the banks, financial institutions. So how does that relationship get managed? >> Oh my gosh, there's so much meat there. There's so much meat there. Okay, so, while it feels like you don't have a direct relationship with the credit reporting agencies, you actually do, you get a benefit from the services that they're providing to you. And every time you get a loan, I mean this is a great conversation for Data Privacy Day. Because when you get a loan, get a credit card, and you sign those terms and conditions, guess what? >> They're in there? >> You are giving that retailer, that lender, the authority to send that information over to the credit reporting agencies. And let's not forget that the intention of forming the credit reporting agencies was for better lending practices, so that your creditworthiness was not determined by things like your gender, your race, your religion, and those types of really, I won't say arbitrary, but just not pertinent factors. Now your creditworthiness is determined by your past history of, do you pay your bills? What is your income, do you have the ability to pay? So it started with a good, very good purpose in mind, and we definitely bought into that as a society. 
And I don't want to sound like I'm defending the credit reporting agencies and all of their behavior out there, because I do think there are some changes that need to be made, but we do get a benefit from the credit reporting agencies, like instant credit, much faster turnaround when we need those financial tools. I mean, that's just the reality of it. >> Right, right. So, who is the person that's then... been breached, I'm trying to think of the right word of the relationship between those who've had their data hacked from the person who was hacked. If it's this kind of indirect third party relationship through an authorization through the credit card company. >> No, the, Equifax is absolutely responsible. >> So who would be the litigant, just maybe that's the word that's coming to me in terms of feeling the pain, is it me as the holder of the Bank of America Mastercard? Is it Bank of America as the issuer of the Mastercard? Or is it Mastercard, in terms of retribution back to Equifax? >> Well you know, I can't really comment on who actually would have the strongest legal liability, but what I can say is, this is the same thing I say when I talk to banks about identity theft victims. There's some discussion about, well, no, it's the bank that's the victim in existing account identity theft, because they're the ones that are absorbing the financial losses. Not the person whose data it belongs to. Yet the person who owns that data, it's their identity credentials that have been compromised. They are dealing with issues as well, above and beyond just the financial compromise. They have to deal with cleaning up other messes and other records, and there's time spent on the phone, so it's not mutually exclusive. They're both victims of this situation. And with data breaches, often the breached entity, again, I hate to sound like an apologist, but I am keeping this real. A breached entity, when they're hacked, they are a victim, a hacker has committed that crime and gone into their systems. Yes, they have a responsibility to make those security systems as robust as possible, but the person whose identity credentials those are, they are the victim. Any entity or institution, if it's payment card data that's compromised, and a financial services institution has to replace that data, guess what, they're a victim too. That's what makes this issue and this crime so terrible, is that it has these tentacles that reach down and touch more than one person for each incident. >> Right. And then there's a whole 'nother level, which we talked about before we got started that we want to dig into, and that's children. Recently, a little roar was raised with these IOT connected toys. And just a big, giant privacy hole, into your kid's bedroom. With eyes and ears and everything else. So wonder if you've got some specific thoughts on how that landscape is evolving. >> Well, we have to think about the data that we're creating. That does comprise our identity. And when we start talking about these toys and other... internet connected, IOT devices that we're putting in our children's bedroom, it actually does make the advocacy part of me, it makes the hair on the back of my neck stand up. Because the more data that we create, the more that it's vulnerable, the more that it's used to comprise our identity, and we have a big enough problem with child identity theft just now, right now as it stands, without adding the rest of these challenges. 
Child and synthetic identity theft are a huge problem, and that's where a specific Social Security number is submitted and has a credit profile built around it, when it can either be completely made up, or it belongs to a child. And so you have a four year old whose Social Security number is now having a credit profile built around it. Obviously they're not, so the thieves are not submitting this belongs to a four year old, it would not be issued credit. So they're saying it's a, you know, 23 year old-- >> But they're grabbing the number. >> They're grabbing the number, they're using the name, they build this credit profile, and the biggest problem is we really haven't modernized how we're authenticating this information and this data. I think it's interesting and fitting that we're talking about this on Data Privacy Day, because the solution here is actually to share data. It's to share it more. And that's an important part of this whole conversation. We need to be smart about how we share our data. So yes, please, have a thoughtful conversation with yourself and with your family about what are the types of data that you want to share and keep, and what do you want to keep private, but then culturally we need to look at smart ways to open up some data sharing, particularly for these legitimate uses, for fraud detection and prevention. >> Okay, so you said way too much there, 'cause there's like 87 followup questions in my head. (Eva laughs) So we'll step back a couple, so is that synthetic identity, then? Is that what you meant when you said a synthetic identity problem, where it's the Social Security number of a four year old that's then used to construct this, I mean, it's the four year old's Social Security number, but a person that doesn't really exist? >> Yes, all child identity theft is synthetic identity theft, but not all synthetic identity theft is child identity theft. Sometimes it can just be that the number's been made up. It doesn't actually belong to anyone. Now, eventually maybe it will. We are hearing from more and more parents, I'm not going to say this is happening all the time, but I'm starting to hear it a little bit more often, where the Social Security number is being issued to their child, they go to file their taxes, so this child is less than a year old, and they are finding out that that number has a credit history associated with it. That was associated years ago. >> So somebody just generated the number. >> Just made it up. >> So are we ready to be done with Social Security numbers? I mean, for God's sake, I've read numerous things, like the nine-digit number that's printed on a little piece of paper is not protectable, period. And I've even had a case where they say, bring your little paper card that they gave you at the hospital, and I won't tell you what year that was, a long time ago. I'm like, I mean come on, it's 2018. Should that still be the anchor-- >> You super read my mind. >> Data point that it is? >> It was like I was putting that question in your head. >> Oh, it just kills me. >> I've actually been talking quite a bit about that, and it's not that we need to get, quote unquote, get rid of Social Security numbers. Okay, Social Security numbers were developed as an identifier, because we have, you can have John Smith with the same date of birth, and how do we know which one of those 50,000 John Smiths is the one we're looking for? So that unique identifier, it has value. And we should keep that. It's not a good authenticator, it is not a secret. 
It's not something that I should pretend only I know-- >> Right, I write it on my check when I send my tax return in. Write your number on the check! Oh, that's brilliant. >> Right, right. So it's not, we shouldn't pretend that this is, I'm going to, you, business that doesn't know me, and wants to make sure I am me, in this first initial relationship or interaction that we're having, that's not a good authenticator. That's where we need to come up with a better system. And it probably has to do with layers, and more layers, and it means that it won't be as frictionless for consumers, but I'm really challenging, this is one of our big challenges for 2018, we want to flip that security versus convenience conundrum on its ear and say, no, I really want to challenge consumers to say... I'm happier that I had to jump through those hoops. I feel safer, I think you're respecting my data and my privacy, and my identity more because you made it a little bit harder. And right now it's, no, I don't want to do that because it's a little too, nine seconds! I can't believe it took me nine seconds to get that done. >> Well, yeah, and we have all this technology, we've got fingerprint readers that we're carrying around in our pocket, I mean there's, we've got geolocation, you know, is this person in the place that they generally, and having 'em, there's so many things-- >> It's even more granular >> Beyond a printed piece of >> Than that-- >> paper, right? >> It's the angle at which you look at your phone when you look at it. It's the tension with which you enter your passcode, not just the passcode itself. There are all kinds of very non-invasive biometrics, for lack of a better word. We tend to think of them as just, like our face and our fingerprint, but there are a lot of other biometrics that are non-invasive and not personal. They're not private, they don't feel secret, but we can use them to authenticate ourselves. And that's the big discussion we need to be having. If I want to be smart about my privacy. >> Right. And it's interesting, on the sharing, 'cause we hear that a lot at security conferences, where one of the best defenses is that teams at competing companies, security teams, share data on breach attempts, right? Because probably the same person who tried it against you is trying it against that person, is trying it against that person. And really an effort to try to open up the dialogue at that level, as more of just an us against them versus we're competing against each other in the marketplace 'cause we both sell widgets. So are you seeing that? Is that something that people buy into, where there's a mutual benefit of sharing information to a certain level, so that we can be more armed? >> Oh, for sure, especially when you talk to the folks in the risk and fraud and identity theft mitigation and remediation space. They definitely want more data sharing. And... I'm simply saying that that's an absolutely legitimate use for sharing data. We also need to have conversations with the people who own that data, and who it belongs to, but I think you can make that argument, people get it when I say, do you really feel like the angle at which you hold your phone, is that personal? Couldn't that be helpful, that combined with 10 other data points about you, to help authenticate you? Do you feel like your personal business and life is being invaded by that piece of information? Or compare that to things like your health records. And medical conditions-- >> Mom's maiden name. 
>> That you're being treated for, well, wow, for sure that feels super, super personal, and I think we need to do that nuance. We need to talk about what data falls into which of these buckets, and on the bucket that isn't super personal, and feeling invasive and that I feel like I need to protect, how can I leverage that to make myself safer? >> Great. Lots of opportunity. >> I think it's there. >> Alright. Eva, thanks for taking a few minutes to stop by. It's such a multi-layered and kind of complex problem that we still feel pretty much early days at trying to solve. >> It's complicated, but we'll get there. More of this kind of dialogue gets us just that much closer. >> Alright, well thanks for taking a few minutes of your day, great to see you again. >> Thanks. >> Alright, she's Eva, I'm Jeff, you're watching The Cube from Data Privacy Days, San Francisco. (techno music)
SUMMARY :
Great to see you again. I thought you told me it was and there was a tremendous growth. but then we hear they were actually breached like 100%. the first exposure to it. I did not choose to do business with them. that they're providing to you. And let's not forget that the intention of the relationship between those who've had above and beyond just the financial compromise. that we want to dig into, and that's children. Because the more data that we create, the more We need to be smart about how we share our data. Is that what you meant when you said Sometimes it can just be that the number's been made up. at the hospital, and I won't tell you is the one we're looking for? Write your number on the check! And it probably has to do with layers, It's the tension with which you enter your passcode, Because probably the same person who tried it against you the angle at which you hold your phone, is that personal? and that I feel like I need to protect, Lots of opportunity. problem that we still feel pretty much early days just that much closer. of your day, great to see you again. Alright, she's Eva, I'm Jeff, you're watching The Cube
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Jeff Frick | PERSON | 0.99+ |
Eva Velasquez | PERSON | 0.99+ |
Equifax | ORGANIZATION | 0.99+ |
2017 | DATE | 0.99+ |
nine seconds | QUANTITY | 0.99+ |
Eva | PERSON | 0.99+ |
Bank of America | ORGANIZATION | 0.99+ |
Yahoo | ORGANIZATION | 0.99+ |
Jeff | PERSON | 0.99+ |
2018 | DATE | 0.99+ |
ORGANIZATION | 0.99+ | |
four year | QUANTITY | 0.99+ |
Mastercard | ORGANIZATION | 0.99+ |
two | QUANTITY | 0.99+ |
Identity Theft Resource Center | ORGANIZATION | 0.99+ |
San Francisco | LOCATION | 0.99+ |
The Cube | TITLE | 0.99+ |
100% | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
first | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
Cube | ORGANIZATION | 0.99+ |
first exposure | QUANTITY | 0.99+ |
10 other data points | QUANTITY | 0.98+ |
each incident | QUANTITY | 0.98+ |
a month | QUANTITY | 0.98+ |
less than a year old | QUANTITY | 0.98+ |
more than one person | QUANTITY | 0.98+ |
over 1000 | QUANTITY | 0.97+ |
first exposures | QUANTITY | 0.97+ |
both victims | QUANTITY | 0.97+ |
nine-digit | QUANTITY | 0.97+ |
three main reporting agencies | QUANTITY | 0.97+ |
over 1500 data breaches | QUANTITY | 0.97+ |
87 followup questions | QUANTITY | 0.96+ |
The Cube | ORGANIZATION | 0.96+ |
both | QUANTITY | 0.96+ |
Data Privacy Day | EVENT | 0.95+ |
Data Privacy Day 2018 | EVENT | 0.94+ |
Data Privacy Days | TITLE | 0.94+ |
four year old | QUANTITY | 0.93+ |
Moscone | LOCATION | 0.9+ |
previous year | DATE | 0.88+ |
50,000 | QUANTITY | 0.85+ |
a year | QUANTITY | 0.82+ |
John Smith | PERSON | 0.81+ |
23 year old | QUANTITY | 0.81+ |
about a while ago | DATE | 0.68+ |
couple | QUANTITY | 0.68+ |
privacy | ORGANIZATION | 0.66+ |
IOT | ORGANIZATION | 0.61+ |
years | DATE | 0.56+ |
John Smiths | COMMERCIAL_ITEM | 0.4+ |
Eve Maler, ForgeRock | Data Privacy Day 2018
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at Data Privacy Day 2018 here at LinkedIn's brand new, downtown San Francisco headquarters, not in Sunnyvale. And we're excited to be here for the second time. And we've got Eve Maler back, she's a VP in innovation and emerging tech at ForgeRock, we caught up last year, so great to see you. >> Likewise. >> So what's different in 2018 than 2017? >> Well, GDPR, the General Data Protection Regulation. Also, we didn't talk about it much here today, but the Payment Services Directive version two is on the lips of everybody in the EU who's in financial services, along with open banking, and all these regulations are actually as much about digital transformation, I've been starting to say hashtag digital transformation, as they are about regulating data protection and privacy, so that's big. >> So why aren't those other two being talked about here do you think? >> To a certain extent they are for the global banks and the multinational banks and they have as much impact on things like user consent as GDPR does, so that's a big thing. >> Jeff: Same penalties? >> They do have some penalties, but they are as much about, okay, I'm starting to say hashtag in front of all these cliches, but you know they are as much about trying to do the digital single market as GDPR is, so what they're trying to do is have a level playing field for all those players. So the way that GDPR is trying to make sure that all of the countries have the same kind of regulations to apply so that they can get to the business of doing business. >> Right, right, and so it's the same thing trying to have this kind of unified platform. >> Yup, absolutely, and so that affects companies here if they play in that market as well. >> So there's a lot of talk on the security side when you go to these security conferences about baking security in everywhere, right? It can't be old guard anymore, there is no such thing as keeping the bad guys out, it's more all the places you need to bake in security, and so you're talking about that really needs to be on the privacy side as well, it needs to go hand-in-hand, not be counter to innovation. >> Yes, it is not a zero sum game, it should be a positive sum game in fact, GDPR would call it data protection by design and by default. And so, you have to bake it in, and I think the best way to bake it in is to see this as an opportunity to do better business with your customers, your consumers, your patients, your citizens, your students, and the way to do that is to actually go for a trust mark instead of, I shouldn't say a trust mark, but go for building trusted digital relationships with all those people instead of just saying "Well I'm going to go for compliance" and then say "Well I'm sorry if you didn't feel that action on my part was correct." >> Well hopefully it's more successful than we've seen on the security side, right? Because data breaches are happening constantly, no one is immune and I don't know, we're almost kind of getting immune to it. I mean Yahoo, it was their entire database of however many billions of people, and some will say it's not even when you get caught, it's more about how you react when you do get caught, both from a PR perspective, as well as mending faith like the old Tylenol issue back in the day, so, on the privacy side do you expect that to be the same?
Are these regulations in such a way where it's relatively easy to implement so we won't have kind of this never ending breach problem on the security side, or is it going to be kind of the same? >> I think I just made a face when you said easy, the word easy, okay. >> Not easy but actually doable, 'cause sometimes it feels like some of the security stuff again on the breaches specifically, yeah it seems like it should be doable, but man oh man we just hear over and over again on the headlines that people are getting compromised. >> Yeah people are getting compromised and I think they are sort of getting immune to the stories when it's a security breach. We try to do, at my company at ForgeRock we're identities, so I have this identity lens that I see everything through, and I think especially in the internet of things which we've talked about in the past there's a recognition that digital identity is a way that you can start to attack security and privacy problems, because if you want to, for example, save off somebody's consent to let information about them flow, you need to have a persistent storage of the fact that they did consent, you need to have persistent storage of the information about them, and if they want to withdraw consent, which is a thing like GDPR requires you to be able to do, and prove that they're able to do, you need to have a persistent storage of their digital identity. So identity is actually a solution to the problem, and what you want to do is have an identity and access management solution that actually reduces the friction to solving those problems so it's basically a way to have consent life cycle management if you will and have that solution be able to solve your problems of security and privacy. >> And to come at it from the identity point of view versus coming at it from the data point of view. >> That's right, and especially when it comes to internet of things, but not even just internet of things, you're starting to need to authenticate and identify everything; services, applications, piles of data, and smart devices, and people, and keep track of the relationships among them. >> We just like to say people are things too so you can always include the people in the IT conversation. But it is pretty interesting the identity tack 'cause we see that more and more, security companies coming at the problem from an identity problem because now you can test the identity against applications, against data, against usage, change what's available, not available to them, versus trying to build that big wall. >> Yes, there's no perimeters anymore. >> Unless you go to China and walk the old Great Wall. >> Yes you're taking your burner devices with you aren't you? (laughs) >> Yes. >> Good, good to hear. >> Yeah but it's a very different way to attack the problem from an identity point of view. >> Yeah, and one of the interesting things actually about PSD2 and this open banking mandate, and I was talking about they want to get digital business to be more attractive, is that they're demanding strong customer authentication, SCA they call it, and so I think we're going to see, I think we talked about passwords last time we met, less reliance. >> Jeff: And I still have them and I still can't remember them.
>> Well you know, less reliance on passwords as either the only factor or sometimes a factor, and more sophisticated authentication that has less impact, well, less negative impact on your life, and so I'm kind of hopeful that they're getting it, and these things are rolling up faster than GDPR, so I think those are kind of easier. They're aware of the API economy, they get it. They get all the standards that are needed. >> 'Cause the API especially when you get the thing to thing and you got multi steps and everything is depending on the connectivity upstream, you've got some significant issues if you throw a big wrench into there. But it's interesting to see how the two-factor authentication is slowly working its way into more and more applications, and using a phone now without the old RSA key on the keychain, what a game changer that is. >> Yeah I think we're getting there. Nice to hear something's improving, right? >> There you go. So as you look forward to 2018 what are some of your priorities, what are we going to be talking about a year from now do you think? >> Well I'm working on this really interesting project, this is in the UK, it has to do with fintech, the UK has a mandate that it's calling the Pensions Dashboard Project, and I think that this has got a great analogy in the US, we have 401ks. They did a study there where they say the average person has 11 jobs over their lifetime and they leave behind some, what they call pension pots, so that would be like our 401ks, and this Pensions Dashboard Project is a way for people to find all of their left-behind pension pots, and we talked last year about the technology that I've worked on called user-managed access, UMA, which is a way where you can kind of have a standardized version of that Google Docs share button where you're in control of how much you share with somebody else, well they're using UMA to actually manage this pension finder service, so you give access first of all, to yourself, so you can get this aggregated dashboard view of all your pensions, and then you can share one pension pot, you know one account, or more, with financial advisors selectively, and get advice on how to spend your newly found money. It's pretty awesome and it's a fintech use case. >> How much unclaimed pension pot money, that must just be. >> In the country, in the UK, apparently it's billions upon billions, so imagine in the US, I mean it's probably a trillion dollars. I'm not sure, but it's a lot. We should do something here, I'm wondering how much money I have left behind. >> All right, check your pension pot, that's the message from today's interview. All right Eve, well thanks for taking a few minutes, and again really interesting space and you guys are right at the forefront, so exciting times. >> It's a pleasure. >> All right, she's Eve Maler, I'm Jeff Frick, you're watching theCUBE from Data Privacy Day 2018, thanks for watching, catch you next time. (upbeat music)
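Both halves of Maler's answer — a persistent, provable record of consent that can be withdrawn, as GDPR demands, and UMA-style selective sharing of a single resource such as one pension pot with one advisor — come down to the same kind of record keeping. A rough sketch of such a record follows; the field names and the in-memory store are assumptions for illustration, not ForgeRock's implementation or the UMA protocol itself:

```typescript
// Illustrative consent/share record with a grant-withdraw-prove lifecycle.
// A real system would keep these in a durable, auditable store; a Map stands in.
interface ConsentRecord {
  id: string;
  resourceOwner: string;   // e.g. the pension holder
  resource: string;        // e.g. "pension-pot:acme-1998"
  grantedTo: string;       // e.g. one specific financial advisor
  scopes: string[];        // e.g. ["view-balance"], deliberately not ["transfer"]
  grantedAt: Date;
  withdrawnAt?: Date;      // set when consent is revoked
}

const consentStore = new Map<string, ConsentRecord>();

function grantConsent(owner: string, resource: string, to: string, scopes: string[]): ConsentRecord {
  const record: ConsentRecord = {
    id: crypto.randomUUID(),
    resourceOwner: owner,
    resource,
    grantedTo: to,
    scopes,
    grantedAt: new Date(),
  };
  consentStore.set(record.id, record);
  return record;
}

function withdrawConsent(id: string): void {
  const record = consentStore.get(id);
  if (record && !record.withdrawnAt) record.withdrawnAt = new Date();
}

// "Prove that they're able to do it": every access check consults the live
// record, so withdrawal takes effect immediately and leaves an audit trail.
function isAccessAllowed(id: string, scope: string): boolean {
  const record = consentStore.get(id);
  return !!record && !record.withdrawnAt && record.scopes.includes(scope);
}
```

Sharing "one pension pot, or more, with financial advisors selectively" then amounts to one record per grant, and withdrawing consent is a single timestamp update that changes what isAccessAllowed() returns from then on.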
Michelle Dennedy, Cisco | Data Privacy Day 2018
(screen switch sound) >> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're at the place that you should be. Where is that you say? LinkedIn's new downtown San Francisco headquarters at Data Privacy Day 2018. It's a small, but growing event. Talking, really a lot about privacy. You know we talk a lot about security all the time. But privacy is this kind of other piece of security and ironically it's often security that's used as a tool to kind of knock privacy down. So it's an interesting relationship. We're really excited to be joined by our first guest Michelle Dennedy. We had her on last year, terrific conversation. She's the Chief Privacy Officer at Cisco and a keynote speaker here. Michelle, great to see you again. >> Great to see you and happy Privacy Day. >> Thank you, thank you. So it's been a year, what has kind of changed on the landscape from a year ago? >> Well, we have this little thing called GDPR. >> Jeff: That's right. >> You know, it's this little old thing the General Data Protection Regulation. It's been, it was enacted almost two years ago. It will be enforced May 25, 2018. So everyone's getting ready. It's not Y2K, it's the beginning of a whole new era in data. >> But the potential penalties, direct penalties. Y2K had a lot of indirect penalties if the computers went down that night. But this has significant potential financial penalties that are spelled out very clearly. Multiples of revenue. >> Absolutely >> So what are people doing? How are they getting ready? Obviously, the Y2K, great example. It was a scramble. No one really knew what was going to happen. So what are people doing to get ready for this? >> Yeah, I think it's, I like the analogy, it ends because January one, after 2000, we figured it out, right? Or it didn't happen because of our prep work. In this case, we have had 20 years of lead time. 1995, 1998, we had major pieces of legislation saying know thy data, know where it's going, value it and secure it, and make sure your users know where and what it is. We didn't do a whole lot about it. There are niche market people, like myself, who said "Oh my gosh, this is really important," but now the rest of the world has to wake up and pay attention because four percent of global turnover is not chump change in a multi-billion dollar business and in a small business it could be the only available revenue stream that you wanted to spend innovating-- >> Right, right >> rather than recovering. >> But the difficulty again, as we've talked about before is not as much the companies. I mean obviously the companies have a fiduciary responsibility. But the people-- >> Yes. >> On the end of the data, will hit the EULA as we talked about before without thinking about it. They're walking around sharing all this information. They're logging in to public WiFis and we actually even just got a note at theCUBE the other day asking us what our impact is, are we getting personal information when we're filming stuff that's going out live over the internet. So I think this is a kind of weird implication. >> I wish I could like feel sad for that but there's a part of my privacy soul that's like, "Yes! People should be asking. What are you doing with my image after this? How will you repurpose this video? Who are my users looking at it?" I actually, I think it's difficult at first to get started. But once you know how to do it, it's like being a nutritionist and a chef all in one. Think about the days before nutrition labels for food.
When it was first required, and very high penalties of the same quanta as the GDPR and some of these other Asiatic countries are the same, people simply didn't know what they were eating. >> Right. >> People couldn't take care of their health and look for gluten-free, or vitamin E, or vitamin A, or omega whatever. Now, it's a differentiator. Now to get there, people had to test food. They had to understand sources. They had to look at organics and pesticides and say, "This is something that the populace wants." And look at the innovation and even something as basic and integral to your humanity as food now we're looking at what is the story that we're sharing with one another and can we put the same effort in to get the same benefits out. Putting together a nutrition label for your data, understanding the mechanisms, understanding the life cycle flow. It's everything and is it a pain in the tuckus sometimes? You betcha. Why do it? A: You're going to get punished if you don't. But more importantly, B: It's the gateway to innovation. >> Right. It's just funny. We talked to a gal in a security show and she's got 100% hit rate. She did this at Black Hat, social engineering access to anything. Basically by calling, being a sweetheart, asking the right questions and getting access to people's-- >> Exactly. >> So where does that fit in terms of the company responsibility, when they are putting everything, as much as they can in their place. Here like at AWS too you'll hear, "Somebody has a security breach at AWS." Well it wasn't the security of the AWS system, it was somebody didn't hit a toggle switch in the right position. >> That's right. >> So it's pretty complex versus if you're a food manufacturer, hopefully you have pretty good controls as to what you put in the food and then you can come back and define. So it's a really complicated problem when it's the users who you're trying to protect that are often the people that are causing the most problems. >> Absolutely. And every analogy has its failures right? >> Right, right. >> We'll stick with food for a while. >> Oh no I like the food one. >> Alright it's something you can understand. >> Y2K is kind of old, right. >> Yeah, yeah. But think about like, have we made, I was going to use a brand name, a spray-on cheese chip, have we made that illegal? That stuff is terrible for your body. We have an obesity crisis here in North America certainly, and other parts of the world, and yet we let people choose what they're putting into their bodies. At the same time we're educating consumers about what the new food chart should look like, we're listening to maybe sugar isn't as good as we thought it was, maybe fat isn't as bad. So giving people some modicum of control doesn't mean that people are always going to make the right choices but at least we give them a running chance by being able to test and separate and be accountable for at least what we put into the ingredients. >> Right, right, okay so what are some of the things you're working on at Cisco? I think you said before we go on the air you have a new report published, study, what's going on? >> I do, I'm ashamed, Jeff, to be so excited about data but, I'm excited about data. (laughter) >> Everybody's excited about data. >> Are they? >> Absolutely. >> Alright let's geek out for a moment. >> So what did you find out?
>> So we actually did the first metrics reporting correlating data privacy maturity models and asking customers, 3,000 customers plus in 20 different countries from companies of all sizes, SMBs to very large corps, are you experiencing a slowdown based on the fears of privacy and security problems? We found that 68 percent of these respondents said yes indeed we are, and we asked them what is the average timing of slowing down closing business based on these fears. We found a big spread from over 16 and a half weeks all the way down to two weeks. We said that's interesting. We asked that same set of customers, where would you put yourself on a zero to five, ad hoc to optimized, privacy maturity model. What we found was if you were even middle of the road, a three or a four, to having some awareness, having some basic tools, you can lower your risk of loss by up to 70 percent. I'm making it sound like it's causation, it's just a correlation but it was such a strong one that when we ran the data last year I didn't run the report, because we weren't sure enough. So we ran it again and got the same quantum with a larger sample size. So now I feel pretty confident that the self-reporting of data maturity is related to closing business more efficiently and faster on the up side and limiting your losses on the down side. >> Right, so where are the holes? What's the easiest way to get from a zero or one to a three or a four, I don't even want to say three or four, two or three in terms of behaviors, actions, things that people do? >> So you're scratching on my geeky legal underbelly here. (laughter) I'm going to say it depends, Jeff. >> Of course of course. >> Couching this and I'm not your lawyer. >> No forward-looking statements. >> No forward-looking statements. Well, for a reason what the heck. We're looking forward not back. It really does depend on your organization. So, Cisco, my company, we are known for engineering. In fact on the down side of our brand, we're known for having trouble letting go until everything is perfect. So, sometimes it's slower than you want 'cause we want to be so perfect. In that culture my coming into the engineering with their bona fides and their pride in their brand, that's where I start to attack with privacy engineering education, and looking at specs and requirements for the products and services. So hitting my company where it lives in engineering was a great place to start to build in maturity. In a company like a large telco or healthcare or highly regulated industry, come from the legal aspect. Start with compliance if that's what is effective for your organization. >> Right, right. >> So look at where you are in your organization and then hit it there first, and then you can fill up, document those policies, make sure training is fun. Don't be afraid to embarrass yourself. It's kind of my mantra these days. Be a storyteller, make it personal to your employees and your customers, and actually care. >> Right, hopefully, hopefully. >> It's a weird thing to say right, you actually should give a beep. >> Have a relationship with people. When you look at how companies moved that curve from last year to this year was it a significant movement? Was it more than you thought, less than you thought? Is it appropriate for what's coming up? >> We haven't tracked individual companies time after time 'cause it's a double-blind study. So it's survey data. The survey numbers are statistically relevant.
That when you have a greater level of less ad hoc and more routinized systems, more privacy policies that are stated and transparent, more tools and technologies that are implemented, measured, tested, and more board-level engagement, you start to see that even if you have a cyber risk the chances that it's over 500 thousand per event goes down precipitously. If you are at that kind of mid-range level of maturity you can take off 70 percent of the lag time and go from about four months of closing a deal that has privacy and security implications to somewhere around two to three weeks. That's a lot of time. Time in business is everything. We run by the quarter. >> Yeah well if you don't sell it today, you never get today back. You might sell it tomorrow, but you never get today back. Alright so we just flipped the calendar. I can't believe it's 2018. That's a whole different conversation. (laughter) What are your priorities for 2018 as you look forward? >> Oh my gosh. I am hungry for privacy engineering to become a non-niche topic. We're going out to universities. We're going out to high schools. We're doing innovation challenges within Cisco to make innovating around data a competitive advantage for everyone, and come up with a common language. So that if you're a user interface guy you're thinking about data control and the stories that you're telling about what the real value is behind your thing. If you are a compliance guy or girl, how do I efficiently measure? How do I come back again in three months without having compliance fatigue, because after the first couple days of enforcement of GDPR and some of these other laws come into force it's really easy to say whew, it didn't hit me. I've got no problem now. >> Right. >> That is not the attitude I want people to take. I want them to take real ownership over this information. >> It's very analogous to what's happening in security. >> Very much so. >> Just baking it in all the way. It's not a walled garden. You can't defend the perimeter anymore, but it's got to be baked into everything. >> It's no mistake that it's like the security world. They're about 25 years ahead of us in data privacy and protection. My boss is our Chief Trust Officer who formerly was our CISO. I am absolutely free-riding on all the progress the security people have made. We're just really complementing each other's skills, and getting out into other parts of the business in addition to the technical part of the business. >> Exciting times. >> Yeah, it's going to be fun. >> Well great to catch up and >> Yeah you too. >> We'll let you go. Unfortunately we're out of time. We'll see you in 2019. >> Data Privacy Day. >> Data Privacy Day. She's Michelle Dennedy, I'm Jeff Frick. You're watching theCUBE. Thanks for tuning in from Data Privacy Day 2018. (music)
Michelle Dennedy & Robert Waitman, Cisco | Cisco Live EU 2019
>> Live from Barcelona, Spain it's theCUBE! Covering Cisco Live! Europe brought to you by Cisco and its ecosystem partners. >> Hello everyone, welcome back to theCUBE's live coverage here in Barcelona, Spain for Cisco Live! Europe 2019. We're at day three of three days of coverage. I'm John Furrier with Dave Vellante. Our next two guests, we're going to talk about privacy data with Michelle Dennedy, VP and Chief Privacy Officer at Cisco, and Robert Waitman, who is the Director of Security and Trust. Welcome back, we had them last year and everything we talked about kinda's happening on steroids here this year >> Yep. >> Welcome back >> Thank you glad to be here >> Thanks for having us >> So security, privacy all go hand in hand. A lot going on. You're seeing more breaches, you're seeing more privacy challenges. Certainly GDPR's going to the next level. People are, quote, complying, here's a gig of data go figure it out. So there's a lot happening, give us the update. >> Well, as we suggested last year it was privacypalooza all year long running up to the enforcement deadline of May 25, 2018. There were sort of two kinds of companies. There's one that ran up to that deadline and said woohoo we're ready to drive this baby forward! And then there's a whole nother set of people who are still sort of oh my gosh. And then there's a third category of people who still don't understand. I had someone come up to me several weeks ago and say what do I do? When is this GDPR going to be a law? I thought oh honey you need a hug >> Two years ago, you need some help. >> And some companies in the US, at least, were turning off their websites. Some media companies were in the news for actually shutting down their site and not making it available because they weren't ready. So a lot of people were caught off guard, some were prepared but still, you said people would be compliant, kind of, and they did that but still more work to do. >> Lots more work to do and as we said when the law was first promulgated two and a half years ago, GDPR and the deadline, A, it's just one region but as you'll hear as we talk about our study it's impacting the globe but it's also not the end of anything, it's the beginning of the information economy at long last. So, I think we all have a lot to do even if you feel rather confident of your base-level compliance, now it's time to step up your game and keep on top of it. >> Before we get into some of the details of the new findings you guys have I want you to take a minute to explain how your role is now centered in the middle of Cisco because if you look at the keynotes data's in the center of a lot of things in this intent-based network on one side and you've got cloud and edge on the other. Data is the new ingredient that's feeding applications and certainly collective intelligence for security. So the role of data is critical. This is a big part of the Cisco tech plan, never mind policy and/or privacy and these other things, you're in the middle of it. Explain your role within Cisco and how that shapes you. >> How we sort of fit in. Well it's such a good question and actually if you watch our story through theCUBE we announced, actually on Data Privacy Day several years ago, that data is the new currency and this is exactly what we're talking about: the only way that you can operationalize your data currency is to really think about it throughout the platform. You're not just pleasing a regulator, you're not just pleasing your shareholders, you're not just pleasing your employee base.
So, as such, the way we organize our group is my role sits under the COO's office, our Chief Operations Office, under the office of John Stewart, who is our Chief Trust Officer. So security, trust, advanced research all live together in operations. We have sister organizations in places like public policy, legal, marketing, the sales groups, the people who are actually operationalizing come together for a group. My role really is to provide two types of strategy. One, rolling out privacy engineering and getting it across inside and outside of the company as quickly as possible. It's something new. As soon as we have set processes I put them into my sister organization and they send them out as routine and hopefully automated things. The other side is the work Robert and I do together is looking at data valuation models. Working on the economics of data: where does it drive up revenue and business and speed time to closure, and how do we use data to not just be compliant in the privacy risk but really control our overall risk and the quality of our information overall. It's a mouthful. >> So that's interesting and Robert, that leads me to a question. When we've seen these unfunded mandates before, we saw it with Y2K, the Enron backlash, certainly in the United States the Federal Rules of Civil Procedure. And the folks in the corner office would say oh, here we go again. Is there any way to get more value beyond just reducing risk and complying, and have you seen companies be able to take data and value and apply it based on the compliance and governance and privacy policies? >> Dave that's a great question. It's sort of the thought that we had and the hypothesis was that this was going to be more valuable than just for the compliance reasons and one of the big findings of the study that we just released this week was that in fact those investments, you know we're saying that good privacy is very good for business. It was painful, some firms stuck their head in the sand and said I don't want to even do this but still, going through the GDPR preparation process or for any of the privacy regulations has taken people to get their data house in order and it's important to communicate. We wanted to find out what benefits were coming from those organizations that had made those investments and that's really what came out in our study this week for International Data Privacy Day, we got into that quite a bit. >> What is this study? Can you give us some details on it? >> It's the Data Privacy Benchmark Study we published this week for International Data Privacy Day. It's sort of an opportunity to focus on data privacy issues both for consumers and for businesses, sort of the one day a year, kind of like Mother's Day, that you should always think of your mom but Mother's Day's a good day, so you should always think of privacy when you're making decisions about your data but it's a chance to raise awareness. So we published our study this year and it was based on over thirty-two hundred responses from companies around the world from 18 countries, all sorts of sizes of companies, and the big findings were in fact around that. Privacy has become a serious and a boardroom-level issue and the awareness has really skyrocketed for companies who are saying before I do business with you I want to know how you're using my data. What we saw this year is that seven out of eight companies are actually seeing some sales delay from their customers asking those kinds of questions.
But those that have made the investment getting ready for GDPR or being more mature on privacy are seeing shorter delays. If you haven't gotten ready you're seeing 60% longer delays. And even more interestingly for us too is when you have data breaches, and a lot of companies have them as we've talked about, those breaches are not nearly as impactful. The organizations that aren't ready for GDPR are seeing three times as many records impacted by the breach. They're seeing system downtime that's 50% longer and so the cost of the whole thing is much more. So, kind of the question of is this still something good to do? Not only because you have to do it when you want to avoid 4% penalties from GDPR and everything else but it's something that's so important for my business that drives value. >> So the upshot there is that you do the compliance. Okay, check the box, we don't want to get fined. So you're taking your medicine basically. Turns into an upside with the data you're seeing from your board. Sales benefit and then just preparedness, readiness for breaches. >> Right, I mean it's a nice-- >> Is that right? >> That's exactly right John, you've got it right. Then you've got your data house in order, I mean there's a logic to this. So then if you figured out where your data is, how to protect it, who has access to it, you're able to deal with these questions. When customers ask you questions about that you're ready to answer it. And when something bad goes wrong, let's say there is a breach, you've already done the right things to control your data. You've got rid of the data you don't need anymore. I mean 50% of your data isn't used for anything and of course we suggest that people get rid of that; that makes it less available when and if a breach occurs. >> So I got to ask you a question on the data valuation because a lot of the data geeks and data nerds like myself saw this coming. We saw data, mostly on the tech side, if you invested in data it was going to feed applications and I think I wrote a blog post in 2007 data's going to be part of the development kits or development environment, you're seeing that now here. Data's now part of application development, it's part of network intelligence for security. Okay, so yes, check, that's happening. At the CFO level, can you value the data so it's a balance sheet item? Can you say we're investing in this? So you start to see movement you almost project, maybe, in a few years, or now how do you guys see the valuation? Is it going to be another kind of financial metric? >> Well John, it's a great point. Seeing where we're developing around this. So I think we're still in somewhat early days of that issue. I think the organizations that are thinking about data as an asset and monetizing its value are certainly ahead of this, we're trying to do that ourselves. We probed on that a little bit in the survey just to get a sense of where organizations are and only about a third of organizations are doing those data mature things. Do they have a complete data map of where their stuff is? Do they have a Chief Data Officer? Are they starting to monetize in appropriate ways, their data? So, there's a long way to go before organizations are really getting the value out of that data. >> But the signals are showing that there's value in the data. Obviously the number of sales, there's some upside to compliance, not just doing it to check the box, there's actually business benefits.
So how are you guys thinking about this 'cause you guys are early adopters or leaders in this, how are you thinking about the data measurement of it? Can you share your insights on that? >> Yeah, so you know, data on the balance sheet, Grace Hopper, 1965, right? Data will one day be on the corporate balance sheet because it's in most cases more valuable than the hardware that processes it. This is the woman who's making software and hardware work for us, in 1965! Here we are in 2019. It's coming on the balance sheet. She was right then, I believe in it now. What we're doing is, even starting, this is a study of correlation rather than causation. So now we have at least the artifacts to say to our legal teams go back and look at when you have one of our new improved streamlined privacy sheets and you're telling in a more transparent fashion a deal. Mark the time that you're getting the question. Mark the time that you're finishing. Let's really be much more stiletto-like measuring time to close and efficiency. Then we're adding that capability across our businesses. >> Well one use case we heard on theCUBE this week was around privacy and security in the network versus on top of the network and one point that was referenced was when a salesperson leaves they take the contacts with them. So that's an asset and people get sued over it. So this again, this is a business policy thing. So business policy sounds like... >> Well in a lot of the solutions that exist in the marketplace or have existed, I've sat on three encrypted email companies before encrypted email was something the market desired. I've sat on two advisory boards of-- a hope that you could sell your own data to the marketers. Every time someone gets an impression you get a micro cent or a bitcoin. We haven't really got that because we're looking on the periphery. What we're really trying to do is let's look at what the actual business flow and processes are in general and say things like can we put a metric on having fewer records, higher impact, and higher quality. The old data quality in the CDO is rising up again: get that higher quality, now correlate it with speed to innovation, speed to close, launch times, the things that make your business run anyway. Now correlate it and eventually find causal connections to data. That's how we're going to get that data on the balance sheet. >> You know, that's a great point, the data quality issue used to be kind of a back office records management function and now it's coming to the fore, and I just make an observation: if you look at what were, before Facebook fake news, the top five companies in the United States in terms of market value, Amazon, Google, Facebook was up there, Microsoft, Apple. They're all data companies and so the market has valued them beyond the banks, beyond the oil companies. So you're starting to see clearer evidence, quantifiable evidence, that there's value there. I want to ask you about, we have Guillermo Diaz coming up shortly, Michelle, and I want to ask you your thoughts on the technical function. You mentioned it's a board-level issue now, privacy. How should the CIO be communicating to the board about privacy? What should that conversation be like? >> Oh my gosh. So we now report quarterly to the board so we're getting a lot of practice. We'll put it that way.
I think we're on the same journey as the security teams. You used to walk into the board and go here's what ransomware is and all of these former CFOs and sales guys would look at you and go ah, okay, onto the financials, because there wasn't anything for them to do strategically. Today's board metrics are a little soft. It's more activity-driven. Have you done your PIAs? Have you passed some sort of a third-party audit? Are you getting rejected for third-party value chain in your partner communities? That's the have not and da da da. To me I don't want my board telling us how to do operations, that's how we do. To really give the board a more strategic view, what we're really trying to do is study things like time to close and then showing trending impacts. The one conversation with John Chambers that's always stuck in my head is he doesn't want to know what today's snapshot is, 'cause today's already over, give me something over time, Michelle, that will trend. And so even though it sounds like, you know, who cares if your sales force is a little annoyed that it takes longer to get this deal through legal, well it turns out when you multiply that in a multi-billion dollar environment you're talking about hundreds of millions of dollars, probably a week, lost to inefficiency. So, if we believe in efficiency in the tangible supply chain, that's the more strategic view I want to take, and then you add on things like here's a risk portfolio, a potential fair risk reporting type of thing, if we want to do a new business: do we light up a business in the Ukraine right now versus Barcelona? That is a strategic conversation that is board level. We've forgotten that by giving them activity. >> Interesting what you say about Chambers. John you just interviewed John Chambers and he was the first person, in the mid 90s, to talk about a virtual close, if you remember that. So, obviously, what you're talking about is way beyond that. >> Yeah and you're exactly right. Let's go back to those financial roots. One of the things we talk about in privacy engineering is getting people's heads-- the concept that the data changes. So, the day before your earnings that data will send Chuck Robbins to jail if someone is leaking it and causing people to invest accordingly. The day after, it's news, we want everyone to have it. Look at how you have to process and handle and operationalize in 24 hours. Figuring out those data stories helps turn it on its head and make it more valuable. >> You know, you mentioned John Chambers, one of the things that I noticed was he really represented Silicon Valley well in Washington DC and there's been a real void there since he retired. You guys still have a presence there and are doing stuff there and you see Amazon with Theresa Carlson doing some great work there and you still got Oracle and IBM in there doing their thing. How is your presence and leadership translating into DC now? Can you give us an update of what's happening at-- >> So, I don't know if you caught a little tweet from a little guy named Chuck Robbins this week but Chuck is actually actively engaged in the debate for US federal legislation for privacy. The last thing we want is only the lobbyists, as you say, and I love my lobbyists wherever you are, we need them to help give information, but the strategic advisors to what a federal bill looks like for an economy as large and complex and dependent on international structure, we have to have the network in there.
And so one of the things that we are doing in privacy is really looking at what does a solid bill look like so at long last we can get a solid piece of federal legislation, and Chuck is talking about it at Davos as was everyone else, which was amazing, and now you're going to hear his voice very loudly ringing through the halls of DC >> So he's upping his game in leadership in DC >> Have you seen the size of Chuck Robbins? Game upped, privacy on! >> It's a great opportunity because we need leadership in technology in DC so-- >> To affect public policy, no doubt >> Absolutely. >> And globally too. It's not just DC and America but also globally. >> Yeah, we need to serve our customers. We win when they win. >> Final question, we got to get wrapped up here but I want to give you guys a chance to talk about what you guys announced here at the show, what's going on, get the plug in for what's going on with Cisco Trust. What's happening? >> Do you want to plug first? >> Well, I think a few things we can add. So, in addition to releasing our benchmark study this week and talking about that with customers and with the public, we've also announced a new version of our privacy data sheets. This was a big tool to enable salespeople and customers to see exactly how data is being used in all of our products and so the new innovation this week is we've released these very nice, color-coded, like subway maps, you know? They make it easy for you to navigate around, it just makes it easy for people to see exactly how data flows. So again, something up on our site at trust.cisco.com where people can go and get that information and sort of make it easy. We're pushing towards simplicity and transparency in everything we do from a privacy standpoint and this is really that trajectory of making it as easy as possible for anyone to see exactly how things go and I think that's the trajectory we're on, that's where the legislation, both where GDPR is heading and federal legislation as well, to try to make this as easy as reading the nutrition label on the food item. To say what's actually here? Do I want to buy it? Do I want to eat it? And we want to make that that easy. >> Trust, transparency, accountability comes into play too because if you have those things you know who's accountable. >> It's terrifying. I challenge all of my competitors: go to trust.cisco.com, not just my customers, love you to be there too, go and look at our data subway maps. You have to be radically transparent to say here's what you get, customer, here's what I get, Cisco, here's where my third parties are. It's not as detailed as a long report but you can get the trajectory and have a real conversation. I hope everybody gets on board with this kind of simplification. >> Trust.cisco.com, we're going to keep track of it. Great work you guys are doing. I think you guys are leading the industry, congratulations. >> Thank you. >> This is not going to end, this conversation continues, will continue globally. >> Excellent >> Thanks for coming on Michelle, appreciate it. Robert thanks for coming on. CUBE coverage here day three in Barcelona. We'll be back with more coverage after this break.
Craig Goodwin, CDK Global | Data Privacy Day
>> Welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at LinkedIn's brand new headquarters up here, at Data Privacy Day 2018. We were here last year, the conference is growing, a lot more people here, a lot more activity. We're excited to have a sponsor, Craig Goodwin, he's the Chief Security Officer of CDK Global. Great to see ya. >> Great to be here. >> Absolutely. So for people who aren't familiar, give us a quick kind of overview of what is CDK Global. >> Sure, so CDK Global runs automotive technology. So we enable technology for automotive dealerships, original equipment manufacturers, and we run a lot of the technology across the U.S. and the rest of the world. So, I think last estimate's about $500 billion worth of automotive transactions, whether buying a car, servicing a car, all went through CDK's systems. >> Okay, so it's the systems, it's the business systems for those automotive companies. It's not like we were just at an autonomous vehicle company the other day, it's not those types of systems. >> Yeah, correct, I mean we're helping with that, right? So a lot of our technology is connecting, with IoT and connected vehicles helping to take in data from those vehicles, to help automotive dealerships, to service the vehicles, or to sell the vehicles. So we ingest that data, and we ingest that technology, but essentially we're talking about the data in the dealerships. >> Okay. So how have you seen things evolve over the last couple years? >> Well definitely with the extra regulation, right? With people and the way that their privacy dynamic is changing, consumers are becoming much more aware of where their data's going, and who's using their data. So we've heard an awful lot today, about the privacy of people's data, and how the industry needs to change. And I think consumers generally are getting much more educated on that, and therefore they're asking companies like ourselves, who deal with their data, to be much more robust in their practices. And we've also seen that from a regulation point of view, right? So governments, the industry, are pushing businesses to be more aware of how they're using consumers' data, which has got to be a positive move in the right direction. >> Jeff: Right, but it's kind of funny, 'cause on the flip side of that coin is people who are willing to give up their privacy to get more services, so you've got kind of the older folks, who've been around for a while, who think of privacy, and then you've got younger folks, who maybe haven't thought about it as much, are used to downloading the app, clicking on the EULA in their phone-- >> Absolutely. >> Follows them everywhere they go. So, is it really more the regulation that's driving the change? Or is it just kind of an ongoing maturation process? >> Well I think-- >> Stewardship is I guess what I was saying. >> Yeah, it's a combination of both I would say. And you make a great point there, so if you look at car buying, right? Say 10 years ago, pick a number randomly, but 10 years ago, people wouldn't have been comfortable buying a car online, necessarily. Or definitely not all online. They'd have to touch it somewhere else, feel it physically, right? That's changing, and we're starting to enable that automotive commerce, so that it starts from the online and ends up at a dealership still. So they actually sign the paperwork, but essentially they start that process online. And that's making people more aware, as you say.
I think some of the regulation, you look at GDPR in Europe, spoke of that a lot today, naturally. And some of that regulation is helping to drive companies to be more aware. But where I see the biggest problem is with small to medium sized businesses. So I think if you talk to larger business, you were speaking to Michelle from Cisco, some of those larger businesses, this privacy thing's been built in from the beginning. Companies like CDK, where we were aware we were dealing with a lot of data, and therefore the GDPR regulation is more of an incremental change. It just ramps up that focus on privacy that was already there from the outset. The biggest problem, and where we see the biggest kind of change here, is in the smaller to medium sized businesses, and that's talking about dealerships, smaller dealership groups, where perhaps they haven't been so aware of privacy, they've been focused on the sales and not necessarily the data and technology, and GDPR for them is a significant step change. And it's down to industry, and larger vendors like ourselves, to reach out to those smaller dealerships, those smaller, medium sized businesses, and help them to work with GDPR to do better. >> But can they fulfill most of their obligations by working with companies such as yours, who have it baked into the product? I would imagine-- >> Yeah! >> I mean, that's the solution, right? >> Absolutely. >> If you're a little person, you don't have a lot of resources-- >> Yep. And I would say it's about sharing in the industry, right? So it's about reaching out. We talked to Cisco today, about how they're building it into their technologies. A lot of the smaller businesses use companies like CDK to enable their technology. So there's an awful lot we can do to help them, but it's not everything, right? So there are areas where we need to educate consumers a lot better, where they need to work with the data and work with where the data goes, in order to understand that full end-to-end data flow within their systems. We work with a lot of the dealerships who perhaps don't understand the data they're collecting, don't understand the gravity of the information that they're collecting, and what that truly means to the consumer themselves. So we need to educate better, we need to reach out as bigger organizations, and teach smaller businesses about what they're doing with the data. >> And were there specific kinds of holes in process, or in data management, that the GDPR addresses that made a sea change? Or is it really just kind of ramping up the penalties, so you need to really ramp up your compliance? >> Well it really is incremental, right? So if you look at things that we've had in Europe for a long time, the Data Protection Act that was around since 1999, for example, or 1998, apologies. It's a ramp-up of that, so it's just increasing the effectiveness. If you look at the 12 points that exist within GDPR, about what you need to know, or a consumer should know about their data, rather than just who's collecting it, it now includes things like when you change that data, when it moves, who it goes to from a third-party perspective, so really it's just about ramping up that awareness. Now, what that means for a business, is that they need to know that they can gather that data quickly. So they need to be clear and understand where their data is going, and CDK's a great example of that.
They need to know what data they're sharing with CDK, on what systems it exists, and in fact how they would remove that data if a consumer asked for that to happen. >> Jeff: (laughs) Who knows, we know in the cloud there is no deleting, right? >> Absolutely. >> It's in the cloud, it's there for everyone. >> That's rough (laughs). >> I mean, it really drives home kind of an ASP, application service provider, type of service, because there's just, I could just see the auto dealership, right? Some guy's got his personal spreadsheet, that he keeps track of his favorite customers, clearly I don't think that's probably falling in compliance. >> Absolutely, yeah, and it can, right? You can work really hard, so it is a process problem. You identified that before, right? There is a lot of process here, technology isn't a golden bullet, it's not going to solve everything, right? And a lot of it is process and mentality driven. So we need to work with people to educate them, and then there's a big emphasis on the consumer as well. I think we focused on business here, but there's a big emphasis on the consumer, for them to begin to understand and be better educated. We heard from some government representatives today, about educating consumers, right? And you mentioned millennials, and the various other groups that exist, and it's important for them to understand where their data is going, and where it's being shared. 'Cause quite honestly we had a couple of really good stories today about privacy and security professionals really not having a genuine understanding of where their data is going. So a regular consumer, someone that goes to buy a car, how can we expect them, without education, to really understand about their data? >> Just to jump on it, obviously you're from the U.K., and we hear all the time that there's more closed-circuit cameras in London (laughter) than probably any other city-- >> Yep. >> So, don't answer if you don't want to, but, (laughter) from a government point of view, and let's just take public red light cameras, there's so much data. >> Absolutely. >> Is the government in a position? Do they have the same requirements as a commercial institution in how they keep, manage and stay on top of this data? >> Yeah, absolutely. So I think, having come from a government background initially, I think the rules and regulations there are much more constrained, constrictive, than perhaps the commercial side is. And I think what you find is a lot of the government regulations are now filtering through into the commercial world. But actually what we're seeing is a bit of a step change. So previously, maybe 15, 20 years ago, the leader in the industry was the government, right? So the government did the regulations, and it would filter through commercial. Actually, what we've seen in the industry now is that it flipped on its head. So a lot of the stuff is originating in the corporate world. We're close to Silicon Valley here, the Facebooks of the world, you know a lot of that stuff is now originating in the commercial side. And we heard from some government people today, you know. The government are having to run pretty fast to try and keep up with that changing world. And a lot of the legislation and regulation now, actually, is a bit historic, right? It's set in the old days, we talk today about data, and watching you move around, and geolocation data, a lot of that legislation dealing with that is probably 10, 15 years old now.
And it exists in a time before you could track your phone all over the world, right? And so, governments have to do some more work, I think ultimately, look at GDPR, I think ultimately the way to change the industry is from a basis of regulation, but then as we move through it's got to be up to the companies and the commercial businesses to take heed of that and do the right thing, ultimately. >> Jeff: It's just so interesting to watch, I mean my favorite is the car insurance ads where they want to give you the little USB gizmo to plug in, to watch you, and it's like, "Well, you already have a phone in your pocket"-- >> Yep. >> You know? >> They don't really see it. >> You don't really need to plug it in, and all your providers know what's going on, so, exciting times, nothing but opportunity for you. >> Yeah, absolutely, absolutely, I hope so (laughs). >> Well Craig Goodwin, thanks for taking a few minutes-- >> No, thank you. >> And sharing your insights, appreciate it. >> Appreciate it, thank you. >> Alright, he's Craig, I'm Jeff, you're watching theCUBE. We're at Data Privacy Day 2018, I can't believe it's 2018. Thanks for watching, we'll catch you next time. (bright electronic music)