Eve Maler | Data Privacy Day 2017
>> Hey, welcome back everybody. Jeff Frick here with theCUBE. We are in downtown San Francisco at the Twitter headquarters for a big event, Data Privacy Day, which has been going on for years and years. It's our first visit and we're excited to be here. And our next guest is going to talk about something that is near and dear to all of our hearts. Eve Maler, she's the VP of Innovation and Emerging Technology for ForgeRock. Welcome. >> Thank you so much. >> Absolutely. So for people who aren't familiar with ForgeRock, give us a little background on the company. >> Sure. So, of course, the digital journey for every customer and consumer and patient and citizen in the world is so important because trust is important. And so what ForgeRock is about is creating that seamless digital identity journey across cloud, mobile, and Internet of Things devices, across all of their experiences, in a trustworthy and secure way. >> So one of the topics that we had down in getting ready for this was OAuth. >> Yes. >> And as the proliferation of SaaS applications continues to grow, both in our home life as well as our work life, we have these pesky things called passwords, which no one can remember and which they force you to change all the time. So along comes OAuth. >> Yes. So OAuth is one of those technologies... I'm kind of a standards wonk. I actually had a hand in creating XML, for those people who remember XML. >> Jeff: That's right. >> OAuth took a tack of saying, let's get rid of what's called the password anti-pattern. Let's not give out our passwords to third-party services and applications; instead, we can just give those applications what's called an access token. It's meant just for that application. In fact, Twitter... We're here at Twitter headquarters. Twitter uses that OAuth technology. And I'm involved in a standard, being a standards wonk, that builds on top of OAuth, called User-Managed Access. 
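The core idea Eve describes, replacing the password anti-pattern with a narrowly scoped access token, can be sketched in a few lines. This is a minimal, hypothetical illustration, not a real OAuth library: the class and method names are made up, and a real flow would involve redirects, client registration, and token expiry per RFC 6749.

```python
# Hedged sketch: a third-party app never sees the user's password;
# it only holds a token limited to the scopes the user granted.
import secrets


class AuthorizationServer:
    """Issues scoped access tokens on the user's behalf (illustrative only)."""

    def __init__(self):
        self._tokens = {}  # token string -> (user, set of granted scopes)

    def issue_token(self, user: str, scopes: list) -> str:
        token = secrets.token_urlsafe(16)
        self._tokens[token] = (user, set(scopes))
        return token

    def check(self, token: str, scope: str) -> bool:
        entry = self._tokens.get(token)
        return entry is not None and scope in entry[1]


auth = AuthorizationServer()
# Alice authorizes a hypothetical client to read her tweets, nothing more.
token = auth.issue_token("alice", ["tweets:read"])
assert auth.check(token, "tweets:read")       # the granted scope works
assert not auth.check(token, "tweets:write")  # anything else is refused
```

The point of the pattern is in the last two lines: revoking or limiting the token never requires changing, or ever revealing, Alice's password.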
And it uses this so that we can share access with applications in the same way. And we can share access also with other people using applications. So for example, the same way we hit a share button in Google... Alice hits a share button to share access to a document with Bob. We want to allow every application in the world to be able to do that, not just Google Docs, Google Sheets, and so on. So OAuth is powerful, and User-Managed Access is powerful for privacy in the same way. >> Now there's OAuth, and I use my Twitter OAuth all the time. Or with Google. >> That's right. >> And then there's these other kind of third-party tools which add kind of another layer. >> So you might use, like... Tweetbot is something I like to use on my phone to tweet. >> Jeff: Right, right. >> And so there's... >> Well, there's the Tweetbot. But then there's these pure identity, password-manager applications, which, you know, you load it into there and then... >> LastPass or something like that. >> Right, right, right. >>1Password people use, yeah. >> To me it's just like, wow, that just seems like it's adding another layer. And oh my gosh, if I forget the LastPass password, I'm really in bad shape. >> You are. >> Not just the one application, but a whole bunch. I mean, how do you see the space kind of evolving to where we got to now? And how is it going to change going forward? It just fascinates me that we still have passwords when our phones have fingerprint... >> Touch ID. >> Why can't it just work off my finger? >> More and more, SaaS services and applications are actually becoming more sensitive to multifactor authentication, strong authentication, what we at ForgeRock would actually call contextual authentication, and that's a great way to go. So they're leveraging things like Touch ID, like device fingerprinting, for example. Recognizing that the device kind of represents you and your unique way of using the device. 
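The Alice-to-Bob sharing Eve describes, a user-controlled policy that any application can consult, can be sketched as below. This is an illustrative toy, not the UMA protocol itself: real UMA involves resource registration, permission tickets, and claims gathering at an authorization server, and all names here are invented.

```python
# Hedged sketch of UMA-style sharing: the resource owner records a policy
# centrally, and any application can then ask "may Bob read this document?"
class SharingAuthority:
    def __init__(self):
        # (resource, grantee) -> set of permitted actions
        self._policies = {}

    def share(self, resource: str, grantee: str, actions: list) -> None:
        # Alice hits the "share" button: her policy lives here, not in the app.
        self._policies[(resource, grantee)] = set(actions)

    def is_permitted(self, resource: str, grantee: str, action: str) -> bool:
        return action in self._policies.get((resource, grantee), set())


uma = SharingAuthority()
uma.share("doc-42", "bob", ["read"])
assert uma.is_permitted("doc-42", "bob", "read")        # Bob may read
assert not uma.is_permitted("doc-42", "bob", "delete")  # but nothing else
assert not uma.is_permitted("doc-42", "carol", "read")  # and only Bob
```

The design point is that the policy is held at one user-controlled place, so "every application in the world" can offer the same share button without each one reinventing access control.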
And in that way, we can start to do things like what's called a passwordless flow, where it can, most of the time, or all of the time, actually not even use a password. And so, I don't know, I used to be an industry analyst, and 75 percent of my conversations with folks like you would be about passwords. More frequently now, I would say, people are more password savvy, and more of the time people are turning on things like multifactor authentication, and more of the time it knows the context: I'm using my corporate WiFi, which is safer, or I'm using a familiar device. And that means I don't have to use the password as often. So that's contextual authentication, meaning I don't have to use that insecure password so often. >> Jeff: Right. >> So I think the world has gotten actually a little bit smarter about authentication. I'm hoping. And technologies like OAuth, and the things that are based on OAuth, like OpenID Connect, which is a modern, federated identity technology, and things like User-Managed Access, are leveraging the fact that OAuth gets away from password-based authentication, away from flinging the password around the internet, which is the problem. >> Right, right. Okay, so that's good, that's getting better, but now we have this new thing: the Internet of Things. >> Yes indeed. >> And people are things. But now we've got connected devices. They're not necessarily ones that I purchased, that I authorized, that I even maybe am aware of. >> Like a beacon on a wall, just observing you. >> Like a beacon on a wall, and sensors, and the proliferation is just now really starting to run. So from a privacy point of view, how does kind of the IoT that I'm not directly involved with compare to the IoT with my Alexa, compare to applications that I'm actively participating in? How do those lines start to blur? 
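The contextual authentication idea above, using signals like a familiar device or the corporate network to decide whether a password or a second factor is needed at all, can be sketched as a simple risk score. The signals and thresholds here are invented for illustration; a real product like ForgeRock's would weigh many more signals.

```python
# Hedged sketch of contextual authentication: low-risk context means a
# passwordless flow; higher risk steps up to password, then multifactor.
def required_factors(known_device: bool, corporate_network: bool,
                     biometric_ok: bool) -> list:
    risk = 0
    if not known_device:
        risk += 2      # an unfamiliar device is the strongest risk signal
    if not corporate_network:
        risk += 1      # off the safer corporate WiFi
    if biometric_ok:
        risk -= 1      # a local biometric (e.g. Touch ID) lowers residual risk

    if risk <= 0:
        return []                       # passwordless flow: nothing to type
    if risk == 1:
        return ["password"]
    return ["password", "otp"]          # step up to multifactor


assert required_factors(True, True, True) == []                     # all familiar
assert required_factors(True, False, False) == ["password"]         # mild risk
assert required_factors(False, False, False) == ["password", "otp"] # step-up
```

The design point matches Eve's: the insecure password is demanded less often, and only when the context itself looks risky.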
And how do the privacy issues kind of spill over now into managing this wild world of IoT? >> Yeah, there's a couple of threads with the Internet of Things. And so I'm here today at this Data Privacy Day event to participate on a panel about the IoT tipping point. And there's a couple of threads that are just really important. One is that the security of these devices is, in large part, a device identity theft problem. With this Dyn attack, in fact, that was an identity theft problem of devices. We had poorly authenticated devices. We had devices that have identities, that have identifiers, and that have secrets. And it was a matter of their own passwords being easily taken over. It was account takeover, essentially, for devices; that was the problem. And that's something we have to be aware of. So, you know, just like applications and services can have identities, just like people, we've always known that. It's something our platform can handle. We need to authenticate our devices better, and that's something manufacturers have to take responsibility for. >> Jeff: Right. >> And we can see the government agencies starting to crack down on that, which is a really good thing. The second thing is, there's a saying in the healthcare world, for people who are working on patient privacy rights, for example. And the saying is: no data about me without me. So there's got to be a kind of pressure. You know, we see, whenever there's a front page news article about the latest password breach... We don't actually see so many password breaches anymore as we see this multifactor authentication come into play. So that's the industry pressures coming into play, where passwords become less important because we have multifactor. We're starting to see consumer pressure say: I want to be a part of this. I want you to tell me what you shared. I want more transparency, and I want more control. And that's got to be part of the equation now when it comes to these devices. 
It's got to be not just more transparent, but: what is it you're sharing about me? >> Jeff: Right. >> Last year I actually bought... maybe this is TMI, I always have this habit of sharing too much information... >> That's okay, we're on theCUBE, we like >> Being honest here. >> To go places other companies don't go. >> I bought one of those adjustable beds that actually has an air pump that... >> What's your number? Your Sleep Number. >> It is, it's a Sleep Number bed, and it has a feature that connects to an app that tells you how well you slept. You look at the terms and conditions, and it says: we own your biometric data, we are free to do whatever we want. >> Where did you even find the terms and conditions? >> They're right there on the app; to use the app >> Oh, in the app, in the app. >> You have to say yes. >> So you actually read before just clicking on the box. >> Hey, I'm a privacy pro, I've got to. >> Right, right, right. >> And of course, I saw this, and to use the feature, you have to opt in. >> Right. >> This is the way it is. There's no choice, and they probably got some lawyer... This is the risk-management view of privacy. It's no longer tenable to have just a risk-management view, because the most strategic and the most robust way to see your relationship with your customers is to realize there are two sides to the bargain, because businesses are commoditized now. There are low switching costs to almost anything. I mean, I bought a bed, but I don't have to have that feature. >> Do you think they'll break it up? So you want the bed, but you're using a Fitbit or something else to tell you whether you got a good night's sleep or not. Do you see businesses starting to kind of break up the units of information that they're taking, and can they deliver an experience based on a fragmented selection? >> I do believe so. So, User-Managed Access and certain technologies like it, standards like it... there's a standard called consent receipts. 
They're based on a premise of being able to deliver convenient control to users. There are also regulations that are coming, like the General Data Protection Regulation in the EU. It's bearing down on pretty much every multinational, every global enterprise that monitors or sells to an EU citizen. That's pretty much every enterprise. >> Jeff: Right, right. >> It demands that individuals get some measure of the ability to withdraw consent in a convenient fashion. So we've got to have consent tech that measures up to the policy that these >> Right. >> organizations have to have. So this is coming whether we sort of like it or not. But we should have a robust and strategic way of exposing to these people the kind of control that they want anyway. >> Jeff: Right. >> They all tell us they want it. So in essence, personal data is becoming a joint asset. We have to conceive of it that way. >> So that's in your sleep app, but what about the traffic cameras and the public facility? >> Yeah. >> I mean, they say in London, right, you're basically on camera all the time. I don't know if that's fact or not, but clearly there's a lot >> That's true, CCTV, yeah. >> Of cameras that are tracking your movements. You don't get a chance to opt in or out. >> That is actually true; that's a tough case. >> You don't know. >> The class of... Yeah. The class of beacons. >> And security, right. Obviously, in a post-9/11 world, that's usually the justification: we want to make sure something bad doesn't happen again. We want to keep track. >> Yeah. >> So how does kind of the government's role in that play in? And even within the government, you have all these different agencies, whether it's the traffic agency, or even just a traffic camera that maybe KCBS puts up to keep track of... you know, it says slow down >> Yeah. >> Between two exits. How does that play into this conversation? >> Yeah, where you don't have an identified individual. 
And not even an identifiable individual; these are actually terms, if you look at GDPR, which I've read closely. It is a tougher case, although I have worked... One of the members of my User-Managed Access working group is one of the sort of experts on UK CCTV stuff. And it is a very big challenge to figure out. And governments do have a special duty of care to figure this out. And so the toughest cases are when you have beacons that just observe passively. Especially because the incentives are such that... I will grant you, the incentives are such that, well, how do they go and identify somebody who's hard to identify, and then go inform them and be transparent about what they're doing? >> Jeff: Right, right. >> So in those cases, even heuristically identifying somebody is very, very tough. However, there is a case where iBeacons in, say, retail stores do have a very high incentive to identify their consumers and their retail customers. >> Right. >> And in those cases, the incentives flip in the other direction, towards transparency and reaching out to the customer. >> Yeah. On the tech of these things: someone who I will not name recently got a drive-through red-light ticket. >> Yep. >> And the clarity of the images that came on that piece of paper that I saw was unbelievable. >> Yes. >> So I mean, if you're using any kind of monitoring equipment, the ability to identify is pretty much there. >> Now we have cases... So this just happened. Actually, I'm not going to say... do I say it was to me or to my husband? It was in a non-smart car, in a non-smart circumstance, where simply a red-light camera takes a picture of an identified car. So you've got a license plate, and that binds it to a registered owner of a car. >> Right. >> Now, I have a car that's registered in the name of a trust. They didn't get a picture of the driver. They got a picture of the car. 
So now, here we can talk about... let's translate that from a dumb-car circumstance, registered to a trust, not to an individual: they sent us what amounted to a parking ticket, because they couldn't identify the driver. So now that gives us an opportunity to map that to an IoT circumstance. Because if you've got a smart device, you've got a person, and you've got a cloud account, what you need is the ability to, in a responsible, secured fashion, bind a smart device to a person and their cloud account. And the ability to unbind. So now we're back to having an identity-centric architecture for security and privacy that knows how to... I'll give you a concrete example: let's say you've got a fleet vehicle in a police department. You assign it to whatever cop on the beat, and at the end of their shift, you assign the car to another cop. What happens on one shift and what happens on another shift is a completely different matter. And it's a smart car; maybe it's a cop who has a uniform with some sort of camera, you know, a body cam. That's another smart device, and those body cams also get reassigned. So you want whatever was recorded, in the car, on the body cam, with the cop, and with whatever online account it is... you want the data to go with the cop, only when the cop is using the smart devices that they've been assigned, and you want the data for somebody else to go with the somebody else. So in these cases, the binding of identities and the unbinding of identities is critical to the privacy of that police person. >> Jeff: Right, right. >> And to the integrity of the data. So this is why I think of identity-centric security and privacy as being so important. And we actually say, at ForgeRock, we say identity relationship management is so key. >> And whether you use it or not, it is really kind of after the fact of being able to effectively tie the two together. 
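The bind-and-unbind idea Eve walks through with the fleet car and body cam can be sketched as a small registry. This is an illustrative toy under assumed names, not ForgeRock's product: it only shows that data attribution follows the current binding, and that an unbound device records nothing.

```python
# Hedged sketch of identity-centric binding: a device is bound to one
# officer per shift, so each recording is attributed to the right person.
class DeviceRegistry:
    def __init__(self):
        self._bound_to = {}  # device -> currently bound officer
        self._records = []   # (device, officer, data) attribution log

    def bind(self, device: str, officer: str) -> None:
        self._bound_to[device] = officer

    def unbind(self, device: str) -> None:
        self._bound_to.pop(device, None)

    def record(self, device: str, data: str) -> None:
        officer = self._bound_to.get(device)
        if officer is None:
            raise RuntimeError("unbound device: refusing to record")
        self._records.append((device, officer, data))

    def records_for(self, officer: str):
        return [d for (_, o, d) in self._records if o == officer]


reg = DeviceRegistry()
reg.bind("bodycam-7", "officer-ann")
reg.record("bodycam-7", "clip-1")   # shift one: attributed to Ann
reg.unbind("bodycam-7")
reg.bind("bodycam-7", "officer-ben")
reg.record("bodycam-7", "clip-2")   # shift two: same device, attributed to Ben
assert reg.records_for("officer-ann") == ["clip-1"]
assert reg.records_for("officer-ben") == ["clip-2"]
```

One shift's data never leaks into another officer's record, which is the privacy and data-integrity property the unbinding step protects.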
>> You have to look at the relationships in order to know whether it's viable to associate the police person's identity with the car's identity. Did something happen to the car on the shift? Did something happen in the view of the camera on the shift? >> Right, right. And all this is underlaid by trust, which has come up in a number of these interviews today. And unfortunately, we're in a situation now, if you read all the surveys... And the government particularly; these are kind of the more crazy cases, 'cause businesses can choose to or not to, and they've got a relationship with the customer. But on the government side, where there's really no choice, right, they're there. Right now, I think we're at a low point on the trust factor. >> Indeed. >> So how is that... and if you don't trust, then these things are seen as really bad, as opposed to, if you do trust, then maybe they're just inconvenient or not quite worked out all the way. So as this trust changes, and with fake news and all this other stuff going on right now, how is that impacting the implementation of these technologies? >> Well, ask me if I said yes to the terms and conditions. (laughter) Of the sleep app, right. I mean, I said yes, I said yes. And I didn't even ask for the app; you know, my husband signed up for the free trial. >> Just showed up on my phone. >> 'Cause I was in proximity >> I said this one on stage >> to the bed, right? >> at RSA, so this is not news. I'm not breaking news here. But you know, consumers want the features, they want convenience, they want value. So it's unreasonable, I believe, to simply mount an education campaign and thereby change the world. I do think it's good to have general awareness of what to demand, and that's why I say: no data about me without me. That's what people should be demanding: to be let in on the loop. Because that gives them more convenience and value. >> Right. >> They want share buttons. I mean, we saw that with the initial introduction of CareKit with Apple. 
Because that enabled what... people who are involved in User-Managed Access, we call ourselves UMAnitarians. So UMAnitarians like to call it Alice-to-Bob sharing; that's the use case. >> Jeff: Okay. >> And it enabled Alice-to-Dr.-Bob sharing. That's a real use case. And IoT kind of made that use case real. With web and mobile and API, I don't think we thought about it so much as a positive use case, although in healthcare it's been a very real thing with EHRs. You know, you can go into your EHR system and you can share with a spouse your allergy record or something; it's there. >> Right, right, right. >> But with IoT, it's a really positive thing. I've talked to folks in my day job about sharing access to a connected car with a remote user. You know, we've seen the experiments with: let somebody deliver a package into the trunk of my car, but not get access to driving the car. These are real. That's better than saving >> I've heard that one, actually. >> Saving a little money by having smart light bulbs. It's not as good as: you've got an Airbnb renter, and you want to share limited access to all your stuff while you're away with your renter, and then shut down access after you leave. That's a UMA use case, actually. And that's good stuff. I could make money >> Jeff: Right. >> Off of sharing that way. That's convenience and value. >> I just heard the other day that Airbnb is renting a million rooms a night. >> There you go. >> So not insignificant. >> So once you have... You have a home that's bristling with smart stuff, you know. That's when it really makes sense to have a share button on all that stuff. It's not just data you're sharing. >> Well, Eve, we could go on and on and on. >> Apparently. >> Are you going to be at RSA in a couple of weeks? >> Absolutely. >> Absolutely. >> I'm actually speaking about consent management. >> Alright, well, maybe we'll see you there. >> That would be great. 
>> But I want to thank you for stopping by. >> It's a pleasure. >> And I really enjoyed the conversation. >> Me too, thanks. >> Alright, she's Eve, I'm Jeff, you're watching theCUBE. We'll catch you next time, thanks for watching. (upbeat music)