Dale Skivington, Dell EMC & Nick Curcuru, MasterCard - Dell EMC World 2016


 

Live from Austin, Texas, it's theCUBE, covering Dell EMC World 2016, brought to you by Dell EMC. Now here are your hosts, Dave Vellante and Stu Miniman.

Welcome back to Dell EMC World 2016 in Austin, Texas. This is theCUBE, the worldwide leader in live tech coverage. Dale Skivington is here; she's the chief privacy officer at Dell. She's joined by Nick Curcuru, a vice president of the big data practice at MasterCard. Folks, welcome to theCUBE. Thanks for coming on.

Thank you for having us.

Very important topic, privacy and security. I like to talk about them as two sides of the same coin. Dale, why don't you start it off: tell us what you two are talking about here at Dell EMC World.

Thanks. Well, you're right, privacy and security are often treated as two really different topics. Nick will cover a lot this afternoon about the importance of securing data in order to have a successful big data program, but privacy is also a concern to our shareholders and stakeholders. Privacy deals with what information you collect, how you use that information, and with whom you share it, and that's a little different from securing the data. Our regulators and our customers are getting increasingly concerned about those issues, so it requires some governance, some thought to be put into those programs, and that's what we're going to talk about today.

It's interesting, Nick, because in 2006, when the Federal Rules of Civil Procedure required organizations to retain and produce electronic material, data instantly came to be seen as a liability, and everybody wanted to understand: when can I delete it, when can I get rid of it? Then, when the big data meme occurred, all of a sudden data became an asset in a big way. It has always been an asset, we know that, but it was almost like a bit flip that changed the attitude. Is that a reasonable description, and how did it affect how you approached privacy?

Well, part of it is that you're absolutely right: data became an asset, and everyone wanted to monetize the data they were carrying, because there were great nuggets sitting inside that data. When we first started talking about security, the original focus was personally identifiable information: name, address, phone numbers, email addresses. But as we started to bring in other sources of data, such as Facebook, Twitter, all that data that sits out there in social media, we started to realize other pieces of information needed to be secured as well. So you have to broaden the way you look at security, because all this unstructured data starts to come in where you can identify people through a photograph or a Twitter feed. What you want to be able to say is: how do I protect those signals as much as I protect someone's credit card or someone's name, address, and phone number?

Dale, talk about your role. It's interesting to have a chief privacy officer at Dell, and now of course Dell EMC opens up a whole new can of worms, if I can say that.

Yes. Together with our chief information security officer, who looks after the policies and procedures around securing data, my team is responsible for the policies, procedures, and controls relating to the use of the data. The reason our session today is called the ethical use of data is that the laws are lagging a little bit in terms of requiring certain things to be put in place about use; they're starting to develop. But what regulators have said, in the US and Europe and elsewhere, is that they've given companies, and technology companies in particular, a chance to put good governance in place, and they've asked companies to put in internal review boards and accountable, responsible individuals in those organizations to make good decisions
about the use of data. That's what a chief privacy officer helps the organization do: develop the governance structure and help with the accountability of decisions around using data.

So obviously there's a big discussion like this going on inside of MasterCard, and Nick, everybody wants to monetize the data, or figure out how data can help them monetize. How do you deal with that in analytics? You guys talk about the creepy factor; I always worry that Amazon knows more about me than I do, when I'm out of something and I'm reordering, and my patterns. That's kind of creepy, so how do you deal with it?

Part of what we do on my side of the house is anonymize the data, in many cases, for that type of analysis. We try to take the personally identifiable information out of the analysis. We call it anonymization: on the front end we say, I don't care who you are; what I care about are your patterns, and whether I can figure out what those patterns are to create affinities. So by anonymizing the data on the front end, doing the analysis on it, and then, potentially, having our customers re-identify on the back end the people we anonymized on the front end, it's no longer a creepy factor per se, because when you work with someone like Dale on what the usage of that data is, in many cases the analysis is being done for the good of that person: the person either gets a healthier lifestyle, or gets to see the products and services they want to see or purchase. For us, it's about understanding how we protect the individual through the entire analysis chain, and that's what we do on the advisory side with our customers.

That's cool, but the chief marketing officer, he or she wants you to identify that individual, the customer of one, that one-to-one personal interaction. How do you square that circle?

That's actually how we work with the marketing teams. They always say, well, we have a population of 5 million in our database and I want to look at all 5 million. Yes, you can look at all 5 million, but anonymize them, because in most cases you're going to hand the data to your data scientists, and there may be 20 or 30 data scientists working on those 5 million records to create your campaigns. They don't need to know names, phone numbers, or addresses. Secure the data so you're not carrying identifiable information through the ecosystem. Only at the very end, when you say, out of that population of 5 million, Mr. Marketer, here's the half a million with a high propensity to do what you're asking, is when you re-identify. At that point you haven't put 5 million people at risk; you've put half a million people into the action you want, which is the propensity to purchase or to take an action. So again, the end is when you re-identify and say: these are the people we should be sending a mailer, an email, or an offer to.

And that narrows the threat matrix, if I can use that term, and reduces the risk to the consumer, and obviously to the organization.

Very much so. That's why, when we work with people like our privacy officers, the question is: what are you trying to do in the analysis? We need to understand the data usage, because that determines what data is carried through the analysis phase. You may not have to carry gender; you may not have to carry ethnic background; you may not have to carry other markers that could identify someone. If we can keep those out, it comes down to how you're using the data in the analysis at the end.
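The anonymize-then-re-identify flow Nick describes, tokenizing PII on the front end, analyzing patterns, and mapping only the high-propensity tokens back to people at the very end, can be sketched roughly as follows. The field names and the keyed-tokenization scheme here are illustrative assumptions, not a claim about MasterCard's actual implementation:

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-me"  # hypothetical key; in practice held in a vault/HSM

def pseudonymize(records, pii_fields=("name", "email", "card_number")):
    """Replace PII with stable tokens; return tokenized records plus a
    re-identification map that is kept apart from the analysis environment."""
    reid_map = {}
    tokenized = []
    for rec in records:
        clean = dict(rec)
        for fld in pii_fields:
            if fld in clean:
                token = hmac.new(SECRET_KEY, str(clean[fld]).encode(),
                                 hashlib.sha256).hexdigest()[:16]
                reid_map[token] = clean[fld]
                clean[fld] = token
        tokenized.append(clean)
    return tokenized, reid_map

def reidentify(tokens, reid_map):
    """At the very end, map the selected high-propensity tokens back to people."""
    return [reid_map[t] for t in tokens if t in reid_map]
```

Keeping the re-identification map separate from the analysis environment is what lets 20 or 30 data scientists work on the full population without ever seeing a name, address, or card number.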
To follow up on that: when the business is envisioning a particular use of data, an application, or a product that's going to do some of these analytics, the privacy office works with them to design that product to avoid some of these risks. Sometimes you can; sometimes the answer is, we absolutely need that personal information, because that's the purpose of the project. In those cases we look at whether you had permission from the data subject to do what you want to do with the data, and if not, whether the societal good outweighs the risks and whether you can mitigate those risks in certain ways. That's the balancing act we do, and that's how we decide when something is past that creepy line and when it isn't, because my role within the company is to advocate for the data subject, to make sure their expectations are being met by Dell.

I wonder if we can unpack another use case, which is fraud detection, which has advanced so rapidly in the last 10 years. It used to be that six months might pass before you found out something happened, when you finally looked at your own statements; now you're getting texts, it's very proactive. Certainly a lot of information has to be accessible, but it's very narrow in terms of the individual. Can you talk about that use?

Yes. The one thing we find from our customers, the people we work with, is that when you talk about fraud, people don't mind that you're watching, because you're reducing their liability; you're keeping someone from stealing their credit card or running up charges. When you talk about protecting someone's digital persona, their wallet, they're willing to give and take a little on what information they provide to you. They don't mind that you know Pam is in Austin, Texas, today while someone is trying to charge a guitar somewhere else the same day. They understand that that's not a privacy issue.

But I want to ask you about the pendulum. Like I said, it used to take forever to find out if there was some kind of fraud, and then it became this flood of false positives, and it seems to be getting better, presumably because of big data analytics. I wonder if you could talk about that.

Absolutely. Our fraud teams at MasterCard work very hard to reduce false positives, because those create a bad experience for both the user and the issuer of the card. What we try to do all the time is continue learning, with machine learning and artificial intelligence, to reduce false positives as you also look at people's patterns. Is this person a professional traveler, always traveling? That goes into the algorithms that evaluate a possible false positive around fraud. Do they buy these types of goods with their credit cards? You start to look at the protections, you add those rules in, and you actually start to reduce the false positives. It's all about learning; it's not one and done. Those algorithms have to be constantly updated, in real time in some cases, so that you're constantly in a learning phase.

You're building models and iterating those models, and that's always a challenge; I'd love to talk about that if we have time. But I wanted to ask you, Dale, about deep learning. Michael was talking a lot about machine learning and deep learning as part of his visionary discussion this morning. What's the role of transparency, and how do you guide your constituents on it: what are the guidelines, how transparent to be, when to be transparent?

That's a great question. Transparency was where the privacy profession lived 10 years ago: it was all about giving consumers notice about why you're collecting the data, using it consistently with that notice, and being very visible with privacy statements; there are lots of laws around that now, where you have to give specific notices. The problem with big data is that its power lies in using the data in ways you didn't envision when you collected it, and that is the dilemma for privacy and big data.
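The false-positive reduction Nick describes a moment earlier, folding a cardholder's own patterns (frequent traveler, usual purchase categories) into the fraud rules and continually updating them, might be sketched like this. The thresholds and profile fields are illustrative assumptions, a toy rule-based score rather than MasterCard's actual models:

```python
def fraud_score(txn, profile):
    """Toy fraud score: flag anomalies, but discount them when they match
    the cardholder's own learned patterns (which reduces false positives)."""
    score = 0.0
    if txn["country"] != profile["home_country"]:
        # A foreign charge is suspicious -- unless this person travels constantly.
        score += 0.2 if profile["frequent_traveler"] else 0.6
    if txn["category"] not in profile["usual_categories"]:
        score += 0.3
    if txn["amount"] > 3 * profile["avg_amount"]:
        score += 0.4
    return min(score, 1.0)

def update_profile(profile, txn):
    """The 'not one and done' part: fold each legitimate transaction back
    into the profile so the rules keep learning."""
    profile["usual_categories"].add(txn["category"])
    profile["avg_amount"] = 0.9 * profile["avg_amount"] + 0.1 * txn["amount"]
    return profile
```

The same structure generalizes to learned models: the profile is the feature store, and `update_profile` stands in for the retraining loop that runs continuously, in some cases in real time.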
That's where the privacy community is trying to develop tools for organizations to do a balancing act: OK, the consumer didn't know, when they gave you that data, that it was going to be used for this purpose, but it's tangential to that use, so it would be an acceptable use; whereas if the use would so surprise the consumer, you really need to go back and get re-permissioned, and in some countries that's an opt-in permission.

Makes the spam laws and do-not-call laws seem trivial, doesn't it? You were mentioning off camera that your CISO participates in public policy through the Obama administration, is that right?

Yes. It's part of our DNA, security and securing the data. Our CEO has made a tremendous commitment to applying our best practices and helping the community understand how to keep data secure, because that's someone's digital persona. We consider ourselves stewards of data, not owners of data. Someone has entrusted us with it, and we want to make sure we're constantly contributing back on how to keep it secure and used right.

How about regional nuances, local laws? Describe what you're seeing there and how you address those complexities.

A good example is the new European regulation going into effect in May of 2018. It has a new, specific requirement about profiling: for automated decisions used for marketing purposes, you have to have an opt-in for using that data. Companies are going to struggle with how to implement that, but nonetheless it's a new law, and it carries four percent of annual revenue as a potential penalty.

Wow. So let me get this straight: you have to opt in to automated profiling where it's going to be used for certain types of decisions?

Yes, and what they're really trying to avoid is what the Obama administration's big data report also called out: discrimination, automated decisions about insurance and credit and so on, and then marketing decisions made with that data. The law now requires very specific opt-in and transparency.

Boy, that's going to be tricky.

The other thing for us, which goes back to what we just described about working with people, is the ability to tag data as it's being brought in. As you think about big data ingestion, there's the tagging of the data and the carrying of the metadata: what types of data need to be tagged, what types of data you have to watch out for, whether it was an opt-in versus an opt-out. All of that adds to the power of what big data can do to protect both the individual and the company from doing something wrong with information. The nice part is that with big data you can do that. So again, we're working with our customers and with the privacy officers to understand how you do your data classifications, what data needs to be tagged, and then to follow the full lineage through the entire ecosystem.

And obviously that has to be done at the point of creation, correct? Otherwise it's not going to scale. And technology helps you solve that problem; that's been a challenge for years, but today it actually works.

Yes, there are a lot of great partners, and they're here at Dell EMC World as well, to help with that ingestion of data as it's coming in, to start to tag it and to index and catalog it. That's the power of what big data can help you with: before, you had to do it individually; now you can actually use the tools, you can use AI to understand the information coming in, to do that tagging and create that lineage. It's very important and very powerful, especially as we start looking at what's coming down the road.
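The tag-at-ingestion idea, classification metadata, consent status, and lineage traveling with each record from the point of creation, can be sketched like this. The class and the rule set are hypothetical; real systems use data catalogs and, as Nick notes, AI-assisted classification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TaggedRecord:
    """Wrap each ingested record with classification metadata so sensitivity
    and consent travel with the data through the whole pipeline."""
    payload: dict
    tags: set = field(default_factory=set)       # e.g. {"pii"}
    consent: str = "opt-out"                     # "opt-in" or "opt-out"
    lineage: list = field(default_factory=list)  # every step that touched it

def ingest(record, classifier):
    """Tag a record at the point of creation and start its lineage trail."""
    tagged = TaggedRecord(payload=record)
    for fld, value in record.items():
        tagged.tags |= classifier(fld, value)
    tagged.lineage.append(("ingested", datetime.now(timezone.utc).isoformat()))
    return tagged

def simple_classifier(fld, value):
    # Hypothetical rule set standing in for a catalog/AI-driven classifier.
    pii_fields = {"name", "email", "phone"}
    return {"pii"} if fld in pii_fields else set()
```

Downstream stages can then refuse to process records whose tags or consent flag don't match the declared use, which is what makes the full-lineage governance described above enforceable rather than aspirational.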
Dale, do you get involved in helping guide solutions?

Yes, we have a process called the privacy impact assessment, and it's in the lifecycle development of our products and services. Much like the security reviews that are done when we commercialize a product, we now interject ourselves with a privacy review. If a project, product development, or application intends to use big data analytics, we help guide the business on whether they need to build in opt-in consents, on what they want to do with the product, and on what they need to build in from a compliance perspective, so that we are at the table with our business partners.

All right, we've got to wrap. Nick, I'll give you the last word: as the big data analytics visionary, what does the future hold? Where's your focus in the near to mid term?

Stay right with the ethics. What we're telling people now is: just because you have the data doesn't mean you have to use the data. Just because you have that information, you've got to become a parent and put some parameters around how that data is used. Bring the people in the privacy world to the table. So again, just because you have it doesn't mean you should be using it, and it's better to be a parent than to let people run crazy.

Right. Nick, Dale, thanks very much for coming on; I love this conversation, it's fascinating.

Thank you for having us.

All right, keep it right there, everybody; we'll be back. This is Dell EMC World from Austin, Texas. This is theCUBE. We'll be right back.

Published Date: Oct 19, 2016
