Dr. Stuart Madnick, MIT | MIT CDOIQ 2019

>> From Cambridge, Massachusetts, it's theCUBE, covering the MIT Chief Data Officer and Information Quality Symposium 2019. Brought to you by SiliconANGLE Media.

>> Welcome back to MIT in Cambridge, Massachusetts, everybody. You're watching theCUBE, the leader in live tech coverage. This is MIT CDOIQ, the Chief Data Officer and Information Quality conference. I'm Dave Vellante with my co-host, Paul Gillin. Professor Dr. Stuart Madnick is here, a longtime CUBE alum and a longtime professor at MIT, soon to be retired, but we're really grateful that you're taking your time to come on theCUBE. It's great to see you again.

>> It's great to see you again. It's been a long time since we worked together, and I really appreciate the opportunity to share our experiences here at MIT with your audience.

>> Well, it's really been fun to watch this conference evolve. We're full, and it's really amazing; we have to move to a new venue next year, I understand. And data: we talk about the data explosion all the time, but one of the areas you're focused on, and that you're going to talk about today, is ethics and privacy, and data causes so many concerns in those two areas. So give us the highlights of what you're going to discuss with the audience today, and we'll get into it.

>> One of the things that makes it so challenging is that data has so many implications to it, and that's why the issue of ethics is so hard to get people to reach agreement on. Take people in medicine and the idea of big data and AI: to be able to really identify causes, you need massive amounts of data. That means more data has to be made available, as long as it's somebody else's data, not mine. It's "not in my backyard." So you have this issue where, on the one hand, people are concerned about sharing their data, while on the other hand there are so many valuable things we would gain by sharing it, and getting people to reach agreement is a challenge.

>> One of the things I wanted to explore with you is how things have changed. Back in the day, you were very familiar, Paul, you as well, with Microsoft and the Department of Justice and FTC issues regarding Microsoft, and it wasn't so much about data; it was really about browsers and bundling. But today you see Facebook, Google, and Amazon coming under fire, and it's largely data related. Liz Warren, last night, again: break up big tech. What are your thoughts on the similarities and differences between the monopolies of yesterday and the data monopolies of today? Should they be broken up?

>> Let me broaden the issue a little bit. I don't know the demographics of this audience, but I often refer to the characteristics of millennials in general. I ask my students this question: how many of you have a Facebook account? In almost every class, nearly everyone does. With Facebook, you've given away a lot of information about yourself, and it doesn't really occur to them that that may be an issue. I was told by someone that in some countries where Facebook is very popular, that's how they coordinate the kidnappings of teenagers from rich families. They track them; they know they're going to this basketball game or that soccer match, they know exactly where they're going, and that's the perfect spot to kidnap them. So I don't know whether students think about the fact that when they're putting things on Facebook they're putting so much of their life at risk. On the other hand, it makes their life richer and more enjoyable.
And so that's why these things are so challenging. Now, getting back to the issue of breaking up the big tech companies: one of the big challenges there is that in order to do the great things that big data has been doing, and the things that AI promises to do, you need lots of data. Having organizations that can gather it all together in a relatively systematic and consistent manner is so valuable. Breaking up the tech companies, and there are some reasons why people want to do that, also interferes with that benefit. And that's why I think it has to be looked at really carefully, to see not only what gains we might get from breaking them up, but also what losses and disadvantages we're creating for ourselves.

>> So an example might be that perhaps it makes the United States less competitive vis-a-vis China in the area of machine intelligence, as one example. The flip side of that is, you know, Facebook has every incentive to appropriate our data to sell ads. So it's not an easy equation.

>> Well, even ads are a funny situation. For some people, having a product called to your attention that's something you actually really want could be viewed as a feature, right? So in some cases the ads could be viewed as a feature by some people and, of course, as a bit of an intrusion by others.

>> Well, sometimes when we use Google search, we used to look for the ad on the side. No longer; it's all ads, you know.

>> I wonder if you see public sentiment changing in this respect. There are a lot of concerns, certainly at the legislative level now, about misuse of data, but Facebook usership is not going down and Instagram membership is not going down. The indication is that ordinary citizens don't really care.

>> That's been my impression. I don't have all the data, and maybe you've seen some, but just anecdotally, and from talking to people in the work we're doing, I agree with you. It may be a bit dramatic, but at a conference once someone made the comment that there has not been the digital Pearl Harbor yet: there's not been some event so onerous, so appalling, that people remember the day it happened. So these things happen, there's a little bit of press coverage, and you're back on your Facebook or your Instagram account the next day. Nothing is really dramatic. Individuals may change now and then, but I don't see massive changes.

>> But you had the Equifax hack two years ago, 145 million records. Capital One just this week, 100 million records. I mean, that seems pretty Pearl Harbor-ish to me.

>> Well, it's funny, we were talking about that earlier today regarding different parts of the world. In Europe, they really seem to care about privacy. In the United States, they kind of care about privacy. In China, they know they have no privacy. But even in the US, where they care about privacy, exactly how much they care about it is really an issue, and in general it's not enough to move the needle. If it does, it moves it a little bit. Around the time when they showed that smart TVs could be broken into, smart TV sales did not budge an inch. Not many people even remember that big scandal a year ago.

>> Well now, to your point about Equifax, just this week I think Equifax came out with a website where you could check whether or not your credentials were compromised.

>> It's a new product!

>> And as it turns out, mine had been, and my wife says hers was too.
>> So you had a choice, you know: free monitoring or $125. So that's the way it went. Okay, now what? Life goes on. It doesn't seem like anything really changes. And we were talking earlier about your 1972 book about computer security, and how many of the principles you outlined in that book are still valid today. Why are we not making more progress against cybercriminals?

>> Well, two things. One thing is, you have to realize, as I said before, that the caveman had no privacy problems and no break-in problems, but I'm not sure any of us want to go back to the caveman era. You have to realize that for all these bad things, there are so many good things happening, things you can now do with a smartphone that you couldn't even visualize doing a decade or two ago. There's so much excitement and so much momentum, autonomous cars and so on, that these minor bumps in the road are easy to ignore amid the enthusiasm.

>> Well, and now as we head into the 2020 election: it was fake news in 2016, and now we've got deep fakes, the ability to really use video in new ways. Do you see a way out of that problem? A lot of people are looking at blockchain. You wrote an article recently on blockchain: you think it's unhackable? Well, think again. What are you seeing?

>> I think one of the things we always talk about when we talk about improving privacy and security in organizations is that the first thing is awareness. Most people are aware that there's an issue for only a small moment of time, and then it quickly passes from the mind. The analogy I use is industrial safety. You go into almost any factory and you'll see a sign over the door that says "520 days since the last industrial accident," and then a subline, "Please do not be the one to reset it this year." And I often say, when was the last time you went into a data center and saw a sign that said "50 milliseconds since the last cyber data breach"? It needs to be something that is really front of mind for people, and we talk about how to build awareness activities across companies and households. That's one of our major efforts here: trying to create more awareness, because if you're not aware that you're putting things at risk, you're not going to do anything about it.

>> Last year at SiliconANGLE we contacted 22 leading security experts and asked one simple question: are we winning or losing the war against cybercriminals? Unanimously, they said we're losing. What is your opinion on that question?

>> I have a great quote I like to use: the good news is the good guys are getting better, with better firewalls and cryptographic codes, but the bad guys are getting better faster. There are a lot of reasons for that, and we won't go through all of them, but we came out with an article talking about the dark web, and the reason it's fascinating is that if you go to most companies that have suffered a data breach or a cyber attack, they'll be very reluctant to say much about it unless they're really compelled to do so. On the dark web, they love to brag; it enhances their reputation: "I'm the one who broke into Capital One." There's much more information sharing, much more organization, much more discipline. The criminal ecosystem is far superior to the chaotic mess we have here on the good guys' side of the table.

>> Do you see any hope for that? There are services, IBM has one and there are others, that anonymize security data to enable organizations to share sensitive information without risk to their company.
Do you see any hope on the collaboration front?

>> As I said before, the good guys are getting better. The trouble is, at first I thought the issue was that there was not enough sharing going on. It turns out we identified over 120 sharing organizations. That's the good news. The bad news is: 120. So IBM is one, and there are another 119 to go. It's not very well-coordinated sharing; that's just one example of the challenges. Do I see any hope in the future? Well, in the more distant future, because the challenge we have is that there will be a cyber attack next week of some form or shape that we've never seen before, and therefore we're probably not well prepared for it. At some point I'll no longer be able to say that, but I think the cyber attackers and criminals are so creative that they've got another decade or more to go before they run out of steam.

>> We've gone from hacktivists to organized crime and now nation-states, and you start thinking about the future of war. I was talking to Robert Gates, the former defense secretary, and my question was, why don't we, with the best cyber capability, go on the offense? He said, yeah, but we also have the most to lose: our critical infrastructure, and the value of that to our society is much greater than it is for some of our adversaries. So we have to be very careful. It's kind of mind-boggling to think about. Autonomous vehicles are another one. I know that you have some visibility on that, and you were saying that the technical challenges of actually achieving quality autonomous vehicles are so daunting that security is getting pushed to the back burner.

>> The irony is, I had a conversation when I was a visiting professor at the University of Nice about 12 or 14 years ago. That was before the time of autonomous vehicles; what they were doing was big automotive telematics, and I realized at that time that security wasn't really their top priority. I happened to visit an organization doing really autonomous vehicles now, 14 years later, and the conversation is almost identical. The problems they're trying to solve now are harder, much more challenging problems, and as a result those problems dominate their mindset, and security issues are kind of, you know, "we'll get around to them." If we can't get the car to drive correctly, why worry about security?

>> Well, what about the ethics of autonomous vehicles? We're talking about how you program it: if you're going to hit a baby or a woman, or kill your passengers and yourself, what do you tell the machine to do? It seems like an unsolvable problem.

>> Well, I'm an engineer by training, and possibly many people in the audience are too. I'm the kind of person who likes nice, clear, clean answers. Two plus two is four, not 3.94; that's the school up the street, they deal with that. The trouble with ethics issues is that they don't tend to have a nice, clean answer. In almost every study we've done that has these kinds of issues, when we have people vote, the votes are spread across the board, because every one of the choices is a bad decision. So which bad decision is the least bad?

>> What's an example that you use?

>> The example I use in my class, and we've been using it for well over a year now in the class I teach on ethics, is that you are the designer of an autonomous vehicle, so you must program it to do everything. The particular case is this: you're in the vehicle, and it's driving around a mountain in the Swiss Alps. You go around a corner, and the vehicle, using all of its sensors, realizes that straight ahead in the right lane is a woman pushing a baby carriage; on the left, just entering the roadway, are three gentlemen; and both sides of the road have concrete barriers. So you can stay on your path and hit the woman and the baby carriage, veer to the left and hit the three men, or take a sharp right or a sharp left, hit the concrete wall, and kill yourself. The trouble is, every one of those is unappealing. Imagine the headline: "Car kills woman and baby." That's not a very good thing. There actually is a theory of ethics called utility theory that says it is better to save three people than one, so definitely don't hit the three men; killing three is the worst. And then the idea of hitting the concrete wall may feel magnanimous, I'm only killing myself, but as the designer of the car, shouldn't your number one duty be to protect the owner of the car? So what people basically do is close their eyes and flip a coin, because they don't want any of those outcomes on their hands.

>> So it's not an algorithmic response; it doesn't leave you with a clean answer.
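[An aside from the editor, not part of the interview: the utility-theory rule Madnick describes, choose the action that minimizes total harm, can be contrasted with an owner-protective rule in a few lines of code. The scenario values, casualty counts, and function names below are illustrative assumptions for the Swiss Alps example, not anything the speakers specified.]

```python
# Illustrative sketch only: two naive decision rules for the Swiss Alps scenario
# discussed above. The options, casualty counts, and rule definitions are
# hypothetical assumptions, not a real autonomous-vehicle policy.
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    casualties: int        # expected lives lost if this option is taken
    occupant_killed: bool  # whether the car's owner/occupant dies

def utilitarian_choice(options: list[Option]) -> Option:
    # Pure utility theory: minimize total casualties, nothing else considered.
    return min(options, key=lambda o: o.casualties)

def owner_protective_choice(options: list[Option]) -> Option:
    # The alternative duty raised in the interview: protect the owner first,
    # then minimize casualties among whatever options remain.
    surviving = [o for o in options if not o.occupant_killed]
    candidates = surviving if surviving else options
    return min(candidates, key=lambda o: o.casualties)

if __name__ == "__main__":
    scenario = [
        Option("stay in lane: hit the woman and baby carriage", 2, False),
        Option("veer left: hit the three men", 3, False),
        Option("sharp turn: hit the concrete barrier", 1, True),
    ]
    print(utilitarian_choice(scenario).name)       # picks the barrier (fewest casualties)
    print(owner_protective_choice(scenario).name)  # picks the carriage option (owner survives)
```

The two rules are each internally consistent yet disagree, which is the point being made in the interview: there is no clean answer, only a choice of which bad outcome to accept.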
>> I want to come back, before we close here, to the subject of this conference. You've been involved with this conference since the very beginning. How have you seen the conversation change over that time?

>> I think it's changing in a couple of ways. First, as you know, this is a record-breaking group of people we're expecting here, close to 500 registered, I think, so it has clearly grown over the years. But also the extent to which, whether it was called big data then or AI now or whatever, something that was not quite on the radar when we started the conference series, I think about 15 years ago, has become something that is not just talked about in the academic world but is becoming mainstream business for corporations, more and more. And I think it's just going to keep increasing. So much of our society and so much of business is so dependent on data, in whatever way, shape, or form we use it.

>> Well, it's come full circle. As Paul and I were saying at our open, this conference kind of emerged from the ashes of the back office, information quality, and then came big data and now AI, and guess what? It's all coming back to information.

>> Lots of data that's no good, or that you don't understand what to do with, is not very healthy.

>> Well, Dr. Madnick, thank you so much. It has been a real pleasure over all these years. We really want to thank you.

>> Thank you, guys, for joining us and helping to spread the word.

>> Thank you. A pleasure. All right, keep it right there, everybody. Paul and I will be back at MIT CDOIQ right after this short break. You're watching theCUBE.

Published Date : Jul 31 2019

SUMMARY :

Dave Vellante and Paul Gillin interview MIT professor Stuart Madnick at the MIT Chief Data Officer and Information Quality Symposium 2019 in Cambridge, Massachusetts. Madnick explains why data ethics and privacy are so hard to reach agreement on, weighs the trade-offs in proposals to break up the big tech companies, and observes that public sentiment barely shifts after breaches such as Equifax and Capital One. He argues that the good guys are getting better but the bad guys are getting better faster, aided by a well-organized dark web and by poorly coordinated sharing among more than 120 security information-sharing organizations. He also describes how security takes a back seat in autonomous-vehicle development, uses a Swiss Alps driving scenario to show why ethical choices resist clean algorithmic answers, and reflects on how the conference has grown to nearly 500 registrants as data has become mainstream business.
