Dr Prakriteswar Santikary, ERT | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's the Cube, covering the 12th Annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome back to the Cube's coverage of MIT CDOIQ here in Cambridge, Massachusetts. I'm your host, Rebecca Knight, along with my co-host, Peter Burris. We're joined by Dr. Santikary, he is the vice president and chief data officer at ERT. Thanks so much for coming on the show. >> Thanks for inviting me. >> We're going to call you Santi, that's what you go by. So, start by telling our viewers a little bit about ERT. What you do, and what kind of products you deliver to clients. >> I'll be happy to do that. ERT is a clinical trial company: a global data and technology company that minimizes risks and uncertainties within clinical trials for our customers. Our customers are top pharma companies, biotechnology companies, and medical device companies, and they trust us to run their clinical trials so that they can bring their life-saving drugs to market on time, every time. So we have a huge responsibility in that regard, because they put their trust in us; we serve as the custodians of their data and their processes, and of the therapeutic expertise that we bring to the table, as well as the compliance-related expertise that we have. So not only do we provide data and technology expertise, we also provide science expertise and regulatory expertise, and that's one of the reasons they trust us. And we have been around since 1977, so over 40 years now, so we have this collective wisdom that we have gathered over the years. And we have really earned that trust, because we deal with the safety and efficacy of drugs, and these are the two big components that help the FDA, or any regulatory authority for that matter, approve a drug. So we have a huge responsibility in this regard as well.
In terms of products, as I said, we are on the safety and efficacy side of the clinical trial process, and as part of that, we have multiple product lines. We have respiratory product lines, we have cardiac safety product lines, we have imaging. As you know, imaging is becoming more and more important for every clinical trial, particularly in the oncology space, to measure the growth of a tumor and that kind of thing. So we have a business that focuses exclusively on the imaging side. And then we have the data and analytics side of the house, because we provide real-time information about the trial itself, so that our customers can really measure risks and uncertainties before they become a problem. >> At this symposium, you're going to be giving a talk about clinical trials and the problems, the missteps, that can happen when the data is not accurate. Lay out the problem for our viewers, and then we're going to talk about the best practices that have emerged. >> I think the clinical trial space is very complex by its own nature, and the process itself is very lengthy. To cite one statistic, for example, it takes about 10 to 15 years to really develop and commercialize a drug. And it usually costs about $2.5 to $3 billion. Per drug. So think about the enormity of this. The challenges are many. One is data collection itself. Clinical trials are becoming more and more complex, and more and more global. Getting patients to the sites is another problem. Patient selection and retention is another one. Regulatory guidelines are another big issue, because not every regulatory authority follows the same set of rules and regulations. And cost. Cost is a big imperative in the whole thing, because the development life cycle of a drug is so lengthy. And as I said, it takes about $3 billion to commercialize a drug, and that cost comes down to the consumers; that means patients. So the cost of health care is sky-rocketing.
And in terms of data collection, there are lots of devices in the field, as you know: wearables, mobile handhelds, so the data volume is a tremendous problem. And the vendors. Each pharmaceutical company uses so many vendors to run its trials: CROs, the clinical research organizations; EDC systems; labs. You name it. So they outsource all of this to different vendors. Now, how do you coordinate them, and how do you make them collaborate? And that's where the data plays a big role, because now the data is everywhere, across different systems, and those systems don't talk to each other. So how do you really do real-time decisioning when you don't know where your data is, and data is the primary ingredient that you use to make decisions? So that's where data and analytics, and bringing that data together in real time, is a very, very critical service that we provide to our customers. >> When you look at medicine, obviously, the whole notion of evidence-based medicine has been around for 15 years now, and it's becoming a seminal feature of how we think about the process of delivering medical services, and ultimately paying it forward to everything else, and partly that's because doctors are scientists and they have an affinity for data. But if we think about going forward, it seems to me as though learning more about the genome and genomics is catalyzing additional need for, and additional understanding of, the role that drugs play in the human body, and it almost becomes an information problem. I don't want to say that a drug is software, but a drug is delivering something that, ultimately, is going to be known at a genomic level. So does that catalyze additional need for data? Is that changing the way we think about clinical trials? Especially when we think about, as you said, it's getting more complex, because we have to make sure that a drug has the desired effect with men and women, with people from here, people from there.
Are we going to push the data envelope even harder over the next few years? >> Oh, you bet. And that's where real-world evidence is playing a big role. So, instead of patients coming to the clinical trials, the clinical trial is going to the patient. It is becoming more and more patient-centric. >> Interesting. >> And the early part of protocol design, for example, the study design, that is step one. So more and more, real-world evidence data is being used to design the protocol, the very first stage of the clinical trial. Another thing that is pushing the envelope is artificial intelligence and other data-mining techniques, and now these can be used to really mine that data: the EMR data, prescription data, claims data. Those are real-world evidence data coming from real patients. So now you can use these artificial intelligence and machine learning techniques to mine that data to really design the protocol and the study design, instead of flipping through the EMR data manually. And patient recruitment, for example: no patients, no trials, right? So gathering patients, and the right set of patients, is one of the big problems. It takes a lot of time to bring in those patients, and even more troublesome is to retain those patients over time. These, too, are big, big things that take a long time. And site selection as well: which site is going to really be able to bring the right patients for the right trials? >> So, two quick comments on that. One of the things, when you say the patients: when someone has a chronic problem, a chronic disease, and they start to feel better as a consequence of taking the drug, they tend to not take the drug anymore. And that creates this ongoing cycle. But going back to what you're saying, does it also mean that clinical trial processes, because we can gather data more successfully over time... it used to be really segmented. We did the clinical trial and it stopped.
Then the drug went into production and maybe we caught some data. But now, because we can do a better job with data, the clinical trial concept can be sustained a little bit more. That data becomes even more valuable over time, and we can add additional volumes of data back in, to improve the process. >> Is that shortening clinical trials? Tell us a little bit about that. >> Yes, as I said, it takes 10 to 15 years if we follow the current process: Phase One, Phase Two, Phase Three, and then post-marketing, that is Phase Four. I'm not even taking the pre-clinical side of these trials into the picture. That's about 10 to 15 years, about $3 billion. So when you use these kinds of AI techniques and the real-world evidence data and all this, the projection is that it will reduce the cycle by 60 to 70%. >> Wow. >> The whole study, beginning-to-end time. >> So from 15 down to four or five? >> Exactly. So think about it, there are two advantages. One is, obviously, you are creating efficiency within the system, and this drug discovery industry is ripe for disruption, because it has been using the same process over and over for a long time. It's like, it is working, so why fix it? But unfortunately, it's not working, because the health care cost has sky-rocketed. So these inefficiencies are going to get solved when we bring real-world evidence into the mixture. Real-time decision making. Risk analysis before risks become problems. Instead of spending one year to recruit patients, you use AI techniques to get to the right patients in minutes, so think about the efficiency again. And also the home monitoring, or mHealth, type of program, where the patients don't need to come to the clinical sites for check-ups anymore. They can wear wearables that are FDA-regulated and approved, and then they're going to do all the work from within the comfort of their home. So think about that.
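As a toy illustration of the patient-matching idea mentioned above: in practice this is where AI and machine learning techniques mine EMR, prescription, and claims data, but even a rule-based screen conveys the concept. The criteria, field names, and patient records here are invented for the sketch, not taken from any real system.

```python
# Hypothetical eligibility criteria for a trial (invented for illustration).
criteria = {"min_age": 18, "max_age": 65, "diagnosis": "asthma"}

# Hypothetical patient records, as they might be extracted from an EMR.
patients = [
    {"id": "P1", "age": 34, "diagnoses": {"asthma"}},
    {"id": "P2", "age": 71, "diagnoses": {"asthma", "copd"}},
    {"id": "P3", "age": 45, "diagnoses": {"diabetes"}},
]

def eligible(patient, c):
    """Return True if the patient satisfies the age range and diagnosis."""
    return (c["min_age"] <= patient["age"] <= c["max_age"]
            and c["diagnosis"] in patient["diagnoses"])

print([p["id"] for p in patients if eligible(p, criteria)])  # ['P1']
```

A real recruitment engine would replace the hand-written rule with learned models and fuzzy matching over free-text clinical notes, but the shape of the problem, screening a large population against protocol criteria in minutes rather than months, is the same.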
And the other thing is very sick, terminally ill patients, for example. They don't have the time, nor do they have the energy, to come to the clinical site for a check-up, because every day is important to them. So this is the paradigm shift that is going on: instead of patients coming to the clinical trials, clinical trials are coming to the patients. And that shift, that paradigm shift, is happening because of these AI techniques. Blockchain. Precision medicine is another one. You don't run a big clinical trial anymore; you run micro-trials, you group a small number of patients. You don't run a trial on breast cancer in general anymore; you say, breast cancer for these patients, so it's micro-trials. And that needs -- >> Well, that can still be aggregated. >> Exactly. It still needs to be aggregated, but you can get the RTDs quickly, so that you can decide whether you need to keep investing in that trial or not, instead of waiting 10 years only to find out that your trial is going to fail. So you are wasting not only your time, but also preventing patients from getting the right medicine on time. So you have that responsibility as a pharmaceutical company as well. So yes, it is a paradigm shift, and this whole industry is ripe for disruption, and ERT is right at the center. We have not only data and technology experience but, as I said, deep domain experience within the clinical domain, as well as regulatory and compliance experience. You need all of these to navigate through the turbulent waters of clinical research. >> Revolutionary changes taking place. >> It is, and the satisfaction is, you are really helping the patients. You know? >> And helping the doctor. >> Helping the doctors. >> At the end of the day, the drug company does not prescribe the drug. >> Exactly. >> The doctor is prescribing, based on knowledge that she has about that patient and that drug and how they're going to work together.
>> And one of the good statistics: in 2017, just last year, 60% of the FDA-approved drugs were supported through our platform. 60 percent. So there were, I think, 60 drugs that got approved, and I think 30 or 35 of them used our platform to run their clinical trials, so think about the satisfaction that we have. >> A job well done. >> Exactly. >> Well, thank you for coming on the show, Santi, it's been really great having you on. >> Thank you very much. >> Yes. >> Thank you. >> I'm Rebecca Knight. For Peter Burris, we will have more from MIT CDOIQ, and the Cube's coverage of it, just after this. (techno music)
Dr Prakriteswar Santikary, ERT | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's the Cube, covering the 12th annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome back to the Cube's coverage of MIT CDOIQ here in Cambridge, Massachusetts. I'm your host, Rebecca Knight, along with my co-host, Peter Burris. We're welcoming back Dr. Santikary, who is the Vice President and Chief Data Officer of ERT, thanks for coming back on the program. >> Thank you very much. >> So, in our first interview, we talked about the why and the what, and now we're really going to focus on the how. What are the kinds of imperatives that ERT needs to build into its platform to accomplish the goals that we talked about earlier? >> Yeah, it's a great question. So, that's where our data and technology pieces come in. As we were talking about, you know, there's the frustration with the complexity of clinical trials. On our platform, we are just drowning in data, because the data is coming from everywhere. There is real-time data, there is unstructured data, there is binary data such as image data, and they normally don't fit in one data store; they are different types of data. So what we have come up with is a unique way to really gather the data in real time in a data lake, and we implemented that platform on Amazon Web Services Cloud, and it has the ability to ingest as well as integrate data of any volume, of any type, coming to us at any velocity. So it's a unique platform and it is already live. The press release came out in the early part of June, and we are very excited about that, and it is commercial right now. >> But you're more than just a platform. There are products and services on top of that platform; one might say that the services are, in many respects, what you're really providing to the customers. The services that the platform provides. Have I got that right? >> Yes, yes.
So, on top of that platform we build different kinds of services; we call them data products. One of the data products is business intelligence, where you do real-time decisioning, and another product is RBM, risk-based monitoring, where you surface all the risks that a clinical trial may be facing and really expose those risks preemptively. >> So, give us an example. >> An example would be a patient visit. A patient may be noncompliant with the protocol, and if that happens, the FDA is not going to like it. So before it gets there, our platform warns the sponsors that, hey, there is something going on, can you take preemptive action? Instead of just waiting until the 11th hour, only to find out that you have really missed out on some major things. That's just one example. Another could be data quality issues, right? So, let's say there's a gap in the data, and/or inconsistent data, or the data is not statistically significant, so you raise some of these with the sponsors so that they can start gathering data that makes sense. Because at the end of the day, data quality is vital for the approval of the drug. If the quality of the data that you are collecting is not good, then what good is the drug? >> So, that also suggests that data governance has got to be a major feature of some of the services associated with the platform. >> Yes, data governance is key, because that's where you get to know who owns which data, and how you really maintain the quality of data over time. So, we use tools, technologies, and processes to really govern the data. And as I was telling you in session one, we are the custodians of this data, so we have a fiduciary responsibility, in some sense, to really make sure that the data is ingested properly, gathered properly, integrated properly.
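A minimal sketch of the kind of risk-based monitoring check described above. The schema, the visit window, and the records are all invented for illustration; this is not taken from ERT's RBM product, which would apply many more signals than missed and out-of-window visits.

```python
from datetime import date

# Hypothetical visit records for one trial site (schema invented for the sketch).
visits = [
    {"patient": "P001", "scheduled": date(2018, 6, 1), "completed": date(2018, 6, 2)},
    {"patient": "P002", "scheduled": date(2018, 6, 1), "completed": None},
    {"patient": "P003", "scheduled": date(2018, 6, 8), "completed": date(2018, 6, 20)},
]

def visit_risks(records, window_days=7):
    """Flag protocol-compliance risks: missed visits and out-of-window visits."""
    risks = []
    for r in records:
        if r["completed"] is None:
            risks.append((r["patient"], "missed visit"))
        elif (r["completed"] - r["scheduled"]).days > window_days:
            risks.append((r["patient"], "visit out of window"))
    return risks

print(visit_risks(visits))
# [('P002', 'missed visit'), ('P003', 'visit out of window')]
```

The value of running a check like this continuously over integrated, real-time data is exactly the preemptive warning described: the sponsor sees the risk while there is still time to act, not at the 11th hour.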
And then, we make it available in real time for real-time decision making, so that our customers can really make the right decisions based on the right information. So, data governance is key. >> One of the things that I believe about the medical profession is that it's always been at the vanguard of ethics: social ethics, and increasingly, well, there's always been a correspondence between social ethics and business ethics. Ideally, they're very closely aligned. Are you finding that medical ethics, the social medical ethics of privacy and how you handle data, are starting to inform a broader understanding of the issues of privacy and the ethical use of data? And how are you guys pushing that envelope, if you think that has an important future? >> Yes, that is a great question. We have data security in place in our platform, right? And the data security in our case plays at multiple levels. We don't co-mingle one sponsor's data with another's; they're always partitioned. We partition the data in the technical sense, and then we have permissions and roles, so users will see what they're supposed to be seeing, depending on their roles. So yeah, data security is very critical to what we do. We also anonymize the data; we don't really store the PII, the personally identifiable information, like e-mail address, or first name or last name, you know? Or social security number, for that matter. When we do analysis, we de-identify the data. >> Are you working with, say, European pharmaceuticals as well, Bayer and others? >> Yeah, we have, as I said -- >> So, you have GDPR issues that you have satisfied? >> We have GDPR issues, we have HIPAA issues, you name it. Data privacy, data security, data protection, they're all a part of what we do, and that's why technology is one piece that we do very well.
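The de-identification step described above can be sketched as follows. The field names, the salt handling, and the pseudonym scheme are invented for illustration; a production pipeline would use a managed secret and a vetted de-identification standard rather than this toy.

```python
import hashlib

# Placeholder salt; in practice this would be a managed secret, never a literal.
SALT = "study-123-secret-salt"

def de_identify(record):
    """Drop direct identifiers and key the subject by a salted hash,
    so analysts never see names, e-mail addresses, or SSNs."""
    pseudonym = hashlib.sha256((SALT + record["ssn"]).encode()).hexdigest()[:12]
    return {
        "subject_id": pseudonym,
        # Keep only the analysis fields; name, e-mail, and SSN are dropped.
        "age": record["age"],
        "systolic_bp": record["systolic_bp"],
    }

raw = {"name": "Jane Doe", "email": "jane@example.com",
       "ssn": "123-45-6789", "age": 54, "systolic_bp": 131}
clean = de_identify(raw)
print(sorted(clean))  # ['age', 'subject_id', 'systolic_bp']
```

The salted-hash pseudonym is deterministic, so records for the same patient can still be joined across systems without any PII ever reaching the analysis layer.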
The other pieces are compliance and science, because you need all three in order to be really, you know, trustworthy to your ultimate customers, and in our case they are pharmaceutical companies, medical device companies, and biotechnology companies. >> Where there are lives at stake. >> Exactly. >> So, I know you have worked, Santi, in a number of different industries. I'd love to get your thoughts on what differentiates ERT from your competitors, and then, more broadly, what will separate the winners from the losers in this area. >> Yeah, obviously, before joining ERT I was the Head of Engineering at eBay. >> Who? (panel members laughing) >> So, that's the bidding platform, so obviously we were dealing with consumer data, right? So, we were applying artificial intelligence, machine learning, and predictive analytics, all kinds of things, to drive the business. In this case, while we are still doing predictive analytics, the idea of predictive analytics is very different, because here at ERT we can't recommend anything. We can't say, hey, don't take aspirin, take Tylenol; that needs to be driven by doctors. Whereas at eBay, we were just talking to the end consumers, and we would just predict. >> Again, different ethical considerations. >> Exactly, but in our domain, ERT is the best of breed in terms of what we do: driving clinical trials and helping our customers. And the things that we do best are these three: data collection, obviously; data custodianship, which includes privacy, security, you name it; and real-time decisioning, which allows our customers, in this case pharmaceutical companies, to have this integrated data set in one place, almost like a cockpit, where they can see which data is where, what the risks are, and how to mitigate those risks. Because remember, these trials are happening globally.
So, your sites, some sites are here, some sites are in India, who knows where? >> So, the mission control is so critical. >> Critical, time critical. And cost effective as well, because if you can mitigate those risks before they become problems, you save not only cost, but you shorten the timeline of the study itself. So your time to market, you know? You reduce that time to market, so that you can go to market faster. >> And you mentioned that the process can be a $3 billion process, so reducing time to market could be a billion dollars of cost, and a few billion dollars of revenue, because you get your product out before anybody else. >> Exactly, plus you're helping your end goal, which is to help the ultimate patients, right? Because if you can bring the drug five years earlier than what you had intended, then you save lots of lives there. >> So, the one question I had is, we've talked a lot about these various elements, but we haven't once mentioned master data management. >> Yes. >> So, give us a little sense of the role that master data management plays within ERT and how you see it changing, because it used to be a very metadata-oriented, technical thing, and it's becoming much more something that is almost a reflection of the degree to which an institution has taken up the role that data plays within decision-making and operations. >> Exactly, a great question. Master data management has people, process, and technology; all three co-mingle with each other to drive master data management. It's not just about technology. So, in our case, our master data are, for example, site, or customers, or vendors, or study; they're master data because they live in each system. Now, the definition of those entities and the semantics of those entities are different in each system. So in our platform, when you bring data together from these disparate systems, somehow we need to harmonize these master entities.
That's why master data management comes into play. >> While complying with regulatory and ethical requirements. >> Exactly. So, customers for example aren't worried as once said. Or, pick any other name, can be spared 20 different ways in 20 different systems, but when you are bringing the data together, into a called platform, we want nobody to be spared only one way. So that's how you mental the data quality of those master entities. And then obviously we have the technology side of things, we have master data management tools, we have data governance that is allowing data qualities to be established over time. And then that is also allowing us to really help our ultimate customers, who are also seeing the high-quality data set. That's the end goal, whether they can trust the number. And that's the main purpose of our integrated platform that we have just launched on AWS. >> Trust, it's been such a recurring theme in our conversation. The immense trust that the pharmaceutical companies are putting in you, the trust that the patients are putting in the pharmaceutical companies to build and manufacture these drugs. How do you build trust, particularly in this environment? On the main stage they were talking this morning about, how just this very notion of data as an asset. It really requires buy-in, but also trust in that fact. >> Yeah, trust is a two-way street, because it has always been. So, our customers trust us- we trust them. And the way you build the trust is through showing, not through talking, right? So, as I said, in 2017 alone, 60% of the FDA approval went through our platform, so that says something. So customers are seeing the results, they're seeing their drugs are getting approved, we are helping them with compliance, we're artists with science, obviously with tools and technologies. 
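The harmonization step described here can be sketched as a toy lookup; the alias table and company names are invented for illustration, and real MDM tooling adds survivorship rules, fuzzy matching, and stewardship workflows on top of this idea.

```python
# Toy alias table mapping normalized source-system spellings of one sponsor
# to a single canonical name (all names invented for the sketch).
ALIASES = {
    "acme pharma": "Acme Pharmaceuticals",
    "acme pharmaceuticals": "Acme Pharmaceuticals",
    "acme pharm. inc.": "Acme Pharmaceuticals",
}

def canonical(name):
    """Normalize case, punctuation, and spacing, then map to the canonical name."""
    key = " ".join(name.lower().replace(",", " ").split())
    return ALIASES.get(key, name)  # unknown names fall through unchanged

for variant in ("ACME Pharma", "Acme  Pharmaceuticals", "acme pharm. inc."):
    print(canonical(variant))
# all three print: Acme Pharmaceuticals
```

Once every source-system spelling resolves to one canonical entity, the integrated platform can aggregate across systems without double-counting sponsors, sites, or studies.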
So that's how you build trust, over time, and we have been around since 1977, that helps as well because it says that true and tried methods, we know the procedures, we know the water as they say, and obviously folks like us, we know the modern tools and technologies to expedite the clinical trials. To really gain efficiency within the process itself. >> I'll just add one thing to that, trust- and test you on this- trust is a social asset. At the end of the day it's a social asset. There are a lot of people in the technology industry continuously forget is that they think trust is about your hardware, or it's about something in your infrastructure, or even your applications. You can say you have a trusted asset, but if your customer says you don't, or a partner says you don't, or some group of your employees say you don't, you don't have a trusted asset. Trust is where the technological, the process, and the people really come together, that's the test of whether or not you've really got something the people want. >> Yes, and your results will show that, right. Because at the end of the day, your ultimate test is the results. Everything hinges on that. And the experience helps, as your experience with tools and technologies, signs, regulatories, because it's a multidimensional venn diagram almost, and we are very good at that, and we have been for the past 50 years. >> Well Santi, thank you so much for coming on the program again, it's really fun talking to you. >> Thank you very much, thank you. >> I'm Rebecca Knight for Peter Burris, we will have more from M.I.T CDOIQ in just a little bit.
Ilana Golbin, PwC | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's The Cube, covering the 12th annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by Silicon Angle Media. >> Welcome back to The Cube's coverage of MIT CDOIQ, here in Cambridge, Massachusetts. I'm your host, Rebecca Knight, along with my cohost Peter Burris. We're joined by Ilana Golbin. She is a manager in the artificial intelligence accelerator at PwC... >> Hi. >> Based out of Los Angeles. Thanks so much for coming on the show! >> Thank you for having me. >> So I know you were on the main stage, giving a presentation, really talking about fears, unfounded or not, about how artificial intelligence will change the way companies do business. Lay out the problem for us. Tell our viewers a little bit about how you see the landscape right now. >> Yeah, so I think... We've really all experienced this: we're generating more data than we ever have in the past. So there's all this data coming in. A few years ago that was the hot topic: big data. Big data is coming, and how are we going to harness big data? And big data, coupled with this increase in computing power, has really enabled us to build stronger models that can provide more predictive power for a variety of use cases. So this is a good thing. The problem is that we're seeing these really cool models come out that are black boxes: very difficult to understand how they're making decisions. And it's not just us as end users, but also developers; we don't really know 100% why some models are making the decisions that they are. And that can be a problem for auditing. It can be a problem for regulation, if that comes into play. And a problem for us, as end users, trusting the model. It comes down to the use case, to why we're building these models. But ultimately we want to ensure that we're building models responsibly, so the models are in line with our mission as a business, and they also don't do any unintended harm.
And so because of that, we need some additional layers to protect ourselves. We need to build explainability into models and really understand what they're doing. >> You said two really interesting things. Let's take one and then the other. >> Of course. >> We need to better understand how we build models and we need to do a better job of articulating what those models are. Let's start with the building of models. What does it mean to do a better job of building models? Where are we in the adoption of better? >> So I think right now we're at the point where we just have a lot of data and we're very excited about it, and we just want to throw it into whatever models we can and see what we can get that has the best performance. But we need to take a step back and look at the data that we're using. Is the data biased? Does the data match what we see in the real world? Do we have a variety of opinions in both the data collection process and also the model design process? Diversity is not just important for opinions in a room, but it's also important for models. So we need to take a step back and make sure that we have that covered. Once we're sure that we have data that's sufficient for our use case, and the bias isn't there or the bias is there to the extent that we want it to be, then we can go forward and build these better models. So I think we're at the point where we're really excited, and we're seeing what we can do, but businesses are starting to take a step back and see how they can do that better. >> Now, part B, the tooling: where is the tooling? >> The tooling... If you follow any of the literature, you'll see new publications come out sometimes every minute on the different applications for these really advanced models. Some of the hottest models on the market today are deep learning models and reinforcement learning models.
They may not have an application for some businesses yet, but they definitely are building those types of applications, so the techniques themselves are continuing to advance, and I expect them to continue to do so. Mostly because the data is there and the processing power is there, and there's so much investment coming in from various institutions and governments in these types of models. >> And the way these things typically work is the techniques and the knowledge of techniques advance, and then we turn them into tools. So the tools are still lagging a little bit behind the techniques, but it's catching up. Would you agree? >> I would agree with that. Just because commercial tools can't keep up with the pace of the academic environment, we wouldn't really expect them to, but once you've invested in a tool you want to try and improve that tool rather than reformat that tool with the best technique that came out yesterday. So there is some kind of iteration that will continue to happen to make sure that our commercially available tools match what we see in the academic space. >> So a second question is, now we've got the model, how do we declare the model? What is the state of the art in articulating metadata, what the model does, what its issues are? How are we doing a better job, and what can we do better, to characterize these models so they can be more applicable while at the same time maintaining the fidelity that was originally intended and embedded? >> I think the first step is identifying your use case. The extent to which we want to explain a model really is dependent on this use case. For instance, if you have a model that is going to be navigating a self-driving car, you probably want to have a lot more rigor around how that model is developed than with a model that targets mailers. There's a lot of middle ground there, and most of the business applications fall into that middle ground, but there are still business risks that need to be considered.
So to the extent that we can clearly articulate and define the use case for an AI application, that will help inform what level of explainability or interpretability we need out of our tool. >> So are you thinking in terms of what it means, how do we successfully define use cases? Do you have templates that you're using at PwC? Or other approaches to ensure that you get the rigor in the definition or the characterization of the model that then can be applied both to a lesser case, you know, who are you mailing, versus a life-and-death situation like, is the car behaving the way it's expected to? >> And yet with the mailing, we have the example, the very famous Target example that outed a young teenage girl who was pregnant before her family knew. So these can have real-life implications. >> And they can, but that's a very rare instance, right? And you could also argue that that's not the same as missing a stop sign and potentially injuring someone in a car. So there are always going to be extremes, but usually when we think about use cases we think about criticality, which is the extent to which someone could be harmed. And vulnerability, which is the willingness for an end user to accept a model and the decision that it makes. A high-vulnerability use case could be... Like a few years ago or a year ago, I was talking to a professor at UCSD, the University of California San Diego, and he was talking to a medical device company that manufactures devices for monitoring your blood sugar levels. So this could be a high-vulnerability case. If you have an incorrect reading, someone's life could be in danger. This medical device was intended to read blood sugar levels by noninvasive means, just by scanning your skin. But the metric that was used to calculate this blood sugar was correct, it just wasn't the one that an end user was expecting. Because that didn't match, these end users did not accept this device, even though it did operate very well. >> They abandoned it? >> They abandoned it.
It didn't sell. And what this comes down to is this is a high-vulnerability case. People want to make sure that their lives, the lives of their kids, whoever's using these devices, are in good hands, and if they feel like they can't trust it, they're not going to use it. So the use case I do believe is very important, and when we think about use cases, we think of them on those two metrics: vulnerability and criticality. >> Vulnerability and criticality. >> And we're always evolving our thinking on this, but this is our current thinking, yeah. >> Where are we, in terms of the way in which... From your perspective, the way in which corporations are viewing this, do you believe that they have the right amount of trepidation? Or are they too trepidatious when it comes to this? What is the mindset? Speaking in general terms. >> I think everybody's still trying to figure it out. What I've been seeing, personally, is businesses taking a step back and saying, "You know, we've been building all these proofs of concept, or deploying these pilots, but we haven't done anything enterprise-wide yet." Generally speaking. So what we're seeing are businesses coming back and saying, "Before we go any further, we need a comprehensive AI strategy. We need something central within our organization that tells us, that defines, how we're going to move forward and build these future tools, so that we're not then moving backwards, and making sure everything aligns." So I think this is really the stage that businesses are in. Once they have a central AI strategy, I think it becomes much easier to evaluate regulatory risks or anything like that. Just because it all reports to a central entity. >> But I want to build on that notion. 'Cause generally we agree. But I want to build on that notion, though. We're doing a good job in the technology world of talking about how we're distributing processing power. We're doing a good job of describing how we're distributing data.
And we're even doing a good job of just describing how we're distributing known process. We're not doing a particularly good job of what we call systems of agency. How we're distributing agency. In other words, the degree to which a model is made responsible for acting on behalf of the brand. Now in some domains, medical devices, there is a very clear relationship between what the device says it's going to do and who ultimately is decided to be culpable. But in the software world, we use copyright law. And copyright law is a speech act. How do we ensure that this notion of agency, that we're distributing agency appropriately, so that when something is being done on behalf of the brand, there is a lineage of culpability, a lineage of obligations associated with that? Where are we? >> I think right now we're still... And I can't speak for most organizations, just my personal experience. In the companies, or the instances, I've seen, we're still really early on in that. Because AI is different from traditional software, but it still needs to be audited. So we're at the stage where we're taking a step back and we're saying, "We know we need a mechanism to monitor and audit our AI." We need controls around this. We need to accurately provide auditing and assurance around our AI applications. But we recognize it's different from traditional software. For a variety of reasons. AI is adaptive. It's not static like traditional software. >> It's probabilistic and not categorical. >> Exactly. So there are a lot of other externalities that need to be considered. And so this is something that a lot of businesses are thinking about. One of the reasons why having a central AI strategy is really important is that you can also define a central controls framework, some type of centralized assurance and auditing process that's mandated from a high level of the organization that everybody will follow. And that's really the best way to get AI widely adopted.
Because otherwise, I think we'll be seeing a lot of challenges. >> So I've got one more question. And one question I have is, if you look out in the next three years, as someone who is working with customers, working with academics, trying to match the need to the expertise, what is the next conversation that's going to pop to the top of the stack in this world, in, say, within the next two years? >> Yeah, what will we be talking about next year, or five years from now, too, at the next CDOIQ? >> I think this topic of explainability will persist. Because I don't think we will necessarily tick all the boxes in the next year. I think we'll uncover new challenges and we'll have to think about new ways to explain how models are operating. Other than that, I think customers will want to see more transparency in the process itself. So not just the model and how it's making its decisions, but what data is feeding into that. How are you using my data to impact how a model is making decisions on my behalf? What is feeding into my credit score? And what can I do to improve it? Those are the types of conversations I think we'll be having in the next two years, for sure. >> Great, well Ilana, thanks so much for coming on The Cube. It was great having you. >> Thank you for having me. >> I'm Rebecca Knight for Peter Burris. We will have more from the MIT Chief Data Officer Symposium 2018 just after this. (upbeat electronic music)
Wrap | MIT CDOIQ
>> Live from the MIT campus in Cambridge, Massachusetts, it's theCUBE covering the 12th annual MIT Chief Data Officer and Information Quality Symposium, brought to you by SiliconANGLE Media. >> We are wrapping up a day of coverage here at theCUBE for MIT CDOIQ here in Cambridge, Massachusetts. I'm Rebecca Knight, along with Peter Burris. We've been here all day, folks. We've learned a lot, we've had a lot of great conversations here, a lot of lively debate and interest. So, Peter, this morning, you were talking about this fundamental idea that data needs to be viewed as an asset within an organization. Obviously we're here with a bunch of people who are drinking that Kool-Aid, but-- >> Living that Kool-Aid. >> Living that Kool-Aid, embodying that Kool-Aid. So based on what we heard today, do you think that business has caught up? >> Well, I would say two things. First of all, this has been, as you said, an absolutely marvelous series of conversations in many respects. This is what theCUBE is built for, right? Smart people in conversation on camera. And we've had some smart people here today. What I got out of it on that particular issue is that there is general agreement among CDOs that they have to start introducing this notion of asset and what that means in their business. There's not general agreement, or there's a general, I guess not agreement, but there's general concern that we still aren't there yet. I think that everybody we talked to would come back and say, yes, we grew those practices, but the conventions are not as established and mature as they need to be for everybody in our business to agree so that we can acculturate. Now we did hear some examples of folks that have done it. So that great BBVA case we talked about was an example.
There was a company that is actually becoming, is really truly institutionalizing, acculturating that notion of data as an asset that performs work, but I think we've got general agreement that that's the right way of thinking about it, but also a recognition that more work needs to be done, and that's why conferences like this are so important. >> Well, one of the things that really struck me about what BBVA did was this education campaign of its 130,000 employees, and as you said, really starting from the ground and saying, this is how we're going to do things. This is who we are as an organization. >> Yeah, and it was a great conversation because one of the points I made was, specifically, that BBVA is a bank. It is an information-based business that has very deep practices and principles associated with information, and when they decided that they needed to move beyond that, they were able to get the entire bank to adopt a set of practices that are leading to new types of engagement models, product orientations, service capabilities. That's a pretty phenomenal feat. So, it's happening and it can get done, and there are examples of it happening. Another thing we talked about was the fact that over the course of the next few years, one of the big, one of the most exciting things about digital business is not just digital business and digital, what people call digital maintenance, but the transformation practices. That way forward. And we talked about the idea of how you wrapper existing goods and services and offerings with data to turn them into something else, and the incumbents are going to find ways of doing that so they can re-establish themselves as leaders in a lot of different markets. >> And that's what will separate the people who really get this from the people who, or from the organizations that are going to lag. >> Yeah, we're starting to hear that a lot more from clients, is that the idea increasingly is, okay, I've already got customers.
I've already got offers. How do I wrapper them? Using a term we heard from a professor at MIT. How do I wrapper them to improve them utilizing data? And that's a big challenge, but it's happening. >> One of the other fun interviews we had was all about clinical trials, and the use of data in these clinical trials. There are so many challenges about, with clinical trials because of the time it takes to conduct one of these, the cost that it takes, and then at the end you are dealing with patients who just say, "oh, I think I'm not going to take that drug today." Or other factors that take place here. I mean, what do you see, I know your dad is a physician, what do you see as the most exciting thing about the use of data in clinical trials, but also just in the healthcare industry in general? >> Well, so what we heard, and it was a great combination of interviews, but what we heard is that to bring a new drug to market can cost $4 billion and take 15 years. And the question is, can data, first off, reduce the cost of bringing a new drug to market? And we heard numbers like, yeah, by $1 billion or even more. So imagine having the cost of bringing a new drug to market, but also reducing the time by as much as two thirds. That's very, very powerful stuff when we come down to it. And as you said, the way you do that is you have to protect your data to make sure you're complying with various regulations, but as you said, for example, sustaining someone in the trial even though they're starting to feel better because the drug's working. Well, people opt out. They abandon the trial. Well can you use data to keep them tied in, to provide new types of benefits and new types of capabilities so they want to sustain their participation in the trial. >> Or at least the pharma company, hey, this person's dropping out, you need to explain that to the FDA, and that's going to become a point, yeah. >> Or you need to provide an incentive to keep them in. >> Right. 
>> Or another example that was used was, if we can compress the amount of time, but then recognize that we can sustain an engagement with a patient and collect data longer, that even though we can satisfy the specific regulatory mandates of a trial sooner, we can still be collecting data, because we have a digital engagement model as part of this whole process, subject to keeping privacy in place and ownership notions in place, and everything else, complying with regulatory notions. So that is, I think, a very powerful example. And again, Santi, Dr. Santi, was talking specifically about how ERT is helping to accelerate this whole process, because over the course of the next dozen years, we're going to learn more about people, the genome is going to become better understood. Genomics is going to continue to evolve. Data is going to become increasingly central to how we think about defining disease and disease processes, and one of the key responses is to learn from that and apply data so that we can more rapidly build the new procedures, devices, and drugs that are capable of responding. >> When we're thinking about what keeps the chief data officers up at night, we know about data security, data fidelity, privacy; the other thing we really heard about, from Ilana Golbin of PwC's AI accelerator, is the idea of bias, and that is a real concern. From the way she was talking about it, it sounded as though companies are more aware of this. It really is an organizational challenge that they recognize not just matters for social reasons but really for business reasons too, frankly. It affects your bottom line. Where do you come out on that? Do you think we're moving in the right direction? >> First of all, it was a great interview, and a lot of what Ilana said was illuminating to me, and I agree with virtually everything she said. We're doing a piece of research on that right now.
I would say that, in fact, most companies are not fully factoring in the role that bias plays in a lot of different ways. That's one of the things that absolutely must happen as part of the acculturation process, as what's known as evidence-based management starts to take grip within businesses: to understand not only what bias is introduced into data now, but, as you create derivatives of that data, how that bias changes and overlays that data. And that is a relatively poorly understood problem. >> But it's a big problem. >> Oh, it's going to be even bigger, because we're going to utilize AI and it's actually going to limit the range of options that people consider as they make a decision, or make the decisions directly for the individual, act on behalf of the brand, what we call agency, a system of agency. And not understanding that range, not having it be auditable, not understanding what the inherent bias is, can very quickly send a business off the rails in unexpected ways. So we're devoting a lot of time and energy into understanding that right now. But here's the challenge: we've got business decision makers who are very familiar with certain kinds of information. Nobody gets to be the CEO or the COO or a senior person in business if they don't have a pretty decent understanding of finance. So financial information is absolutely adopted within the board room and the senior ranks of management in virtually all businesses of any consequential size today. What we're asking them to do is to learn about wholly new classes of data. New data conventions, what it means, how to apply it, how you should factor it, how to converge agreement around things, that allows them to be as mature in their use of customer data or production data or partner data or any other number of metrics as they are with financial data. That's a real tall order. It's one of the significant challenges that a lot of businesses face today.
So it's not that they don't get data or they don't understand data. It's that the sources of data and therefore the range of options that are going to be shaped by data are becoming that much more significant in business. >> And it's how they need to think about data too. I mean I was really struck by Tom Sasala at the very beginning saying, one of the reasons the intelligence community didn't predict 9/11 is that we didn't have people who were thinking like Hollywood people, thinking audaciously enough about what could happen and that similarly we need to have business leaders and executives, who may be very good at crunching numbers, really think much more broadly about the kinds of-- >> And Tom is absolutely right. We also, cuz I was very close to the DoD at the time, there was serious confirmation bias that was going on at that time too. >> Exactly. But clearly he's right, that the objective is for executives to, as a group, acknowledge the powerful role that data can play, have a data-first mentality as opposed to a bias or experience-first mentality. Because my experience is very private relative to your experience. And it takes a lot of time for us to negotiate that before we can make a very, very consequential move. That's not going to go away. We're human beings. But we increasingly need to look at data, which can provide a common foundation for us to build our biases upon so that we can be more specific and more transparent about articulating my interpretations. You can't start doing that until you are better, more willing to utilize data as a potentially unifying tool and mechanism for thinking about, thinking about how we move forward with something. >> That's great. And it's a great way to end our day of coverage here at M.I.T CDOIQ. Thank you so much. It's been a pleasure, >> As always, Rebecca. >> hosting with you. And thanks to the crew and everyone here. It's been really a lot of fun. I'm Rebecca Knight for Peter Burris. 
We will see you next time on theCUBE. (techno music)
Cortnie Abercrombie & Carl Gerber | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's theCUBE, covering the 12th Annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome back to theCUBE's coverage of MIT CDOIQ here in Cambridge, Massachusetts. I'm your host Rebecca Knight along with my cohost Peter Burris. We have two guests on this segment. We have Cortnie Abercrombie, she is the founder of the nonprofit AI Truth, and Carl Gerber, who is the managing partner at Global Data Analytics Leaders. Thanks so much for coming on theCUBE, Cortnie and Carl. >> Thank you. >> Thank you. >> So I want to start by just having you introduce yourselves to our viewers, what you do. So tell us a little bit about AI Truth, Cortnie. >> So this was born out of a passion. In the last gig I had at IBM, everybody knows me for chief data officer work and what I did with that, but the more recent role that I had was developing custom offerings for the Fortune 500 in the AI solutions area. So as I would go meet and see different clients, and talk with them and start to look at different processes for how you implement AI solutions, it became very clear that not everybody is attuned, just because they're the ones funding the project or even initiating the purpose of the project; the business leaders don't necessarily know how these things work or run, or what can go wrong with them. And on the flip side of that, we have very ambitious up-and-comer-type data scientists who are just trying to fulfill the mission, you know, the talent at hand, and they get really swept up in it. To the point where you can even see that data's getting bartered back and forth without any real governance over it or policies in place to say, "Hey, is that right? Should we have gotten that kind of information?" Which leads us into things like the creepy factor. Like, you know, Target (laughs) and some of these cases that are well-known.
And so, as I saw some of these mistakes happening that were costing brand reputation, or return on investment, or possibly even creating opportunities for risk for the companies and for the business leaders, I felt like someone's got to take one for the team here and go out and start educating people on how this stuff actually works, what the issues can be and how to prevent those issues, and then also what do you do when things do go wrong, how do you fix it? So that's the mission of AI Truth, and I have a book. Yes, power to the people, but you know, really my main concern was concerned individuals, because I think we've all been affected when we've sent an email and all of a sudden we get a weird ad, and we're like, "Hey, what, they should not, is somebody reading my email?" You know, and we feel this, just, offense-- >> And the answer is yes. >> Yes, and they are, they are. So I mean, we, but we need to know, because the only way we can empower ourselves to do something is to actually know how it works. So, that's what my mission is to try and do. So, for the concerned individuals out there, I am writing a book to kind of encapsulate all the experiences that I had, so people know where to look and what they can actually do, because you'll be less fearful if you know, "Hey, I can download DuckDuckGo for my search engine, I mean, and Epic for my browser, and some private, you know, private offerings instead of the typical free offerings." There's not an answer for Facebook yet though. >> So, (laughs) we'll get there. Carl, tell us a little bit about Global Data Analytics Leaders. >> So, I launched Analytics Leaders and CDO Coach after a long career in corporate America. I started building an executive information system when I was in the military for a four-star commander, and I've really done a lot in data analytics throughout my career.
Most recently, starting a CDO function at two large multinational companies and leading global transformation programs. And what I've experienced is, even though the industries may vary a little bit, the challenges are the same and the patterns of behavior are the same, both the good and bad behavior, bad habits around the data. And through the course of my career, I've developed these frameworks and playbooks and just ways to get a repeatable outcome and bring these new technologies like machine learning to bear to really overcome the challenges that I've seen. And what I've seen is a lot of the current thinking is we're solving these data management problems manually. You know, we all hear the complaints about the people who are analysts and data scientists spending 70, 80% of their time being a data gatherer and not really generating insight from the data itself and making it actionable. Well, that's why we have computer systems, right? But that large-scale technology and automation hasn't really served us well, because we think in silos, right? We fund these projects based on departments and divisions. We acquire companies through mergers and acquisitions. And the CDO role has emerged because we need to think about this, all the data that an enterprise uses, horizontally. And with that, I bring a high degree of automation, things like machine learning, to solve those problems. So, I'm now bottling that and advising my clients. And at the same time, the CDO role is where the CIO role was 20 years ago. We're really in its infancy, and so you see companies define it differently, have different expectations. People are filling the roles that may have not done this before, and so I provide the coaching services there. It's like a professional golfer who has a swing coach. So I come in and I help the data executives with upping their game. >> Well, it's interesting, I actually said the CIO role 40 years ago. But here's why.
If we look back in the 1970s, hardcore financial systems were made possible by the technology, which allowed us to run businesses like a portfolio: Jack Welch, the GE model. That was not possible if you didn't have a common asset management system, if you didn't have a common cash management system, etc. And so, when we started creating those common systems, we needed someone who could describe how that shared asset was going to be used within the organization. And we went from the DP manager in HR, the DP manager within finance, to the CIO. And in many respects, we're doing the same thing, right? We're talking about data in a lot of different places, and now the business is saying, "We can bring this data together in new and interesting ways into more of a shared asset, and we need someone who can help administer that process and, you know, navigate between different groups and different needs and whatnot." Is that kind of what you guys are seeing? >> Oh yeah. >> Yeah. >> Well, you know, once I get to talking (laughs). For me, I can go right back to the newer technologies like AI and IoT that are coming from externally into your organization, and also the fact that we're seeing bartering of data at an unprecedented level. And yet, what the chief data officer role originally did was look at data internally, and structured data mostly. But now, we're asking them to step out of their comfort zone and start looking at all these unknown, niche data broker firms that may or may not be ethical in how they're... I mean, look, I tell people, "If you hear the word scrape, you run." No scraping, we don't want scraped data, no, no, no (laughs). But that's what we're talking about-- >> Well, what do you mean by scraped data, 'cause that's important? >> Well, this is a well-known data science practice. And it's not that...
nobody's being malicious here, nobody has mal-intent, but I think data scientists are just scruffy; they roll up their sleeves and they get data however they can. And so, the practice emerged. Look, they're built off of open-source software and everything's free, right, for them, for the most part? So they just start reading in screens and things that are available that you could see; they can optical-character-read it in, or they can do it however, without having to have a subscription to any of that data, without having to have permission to any of that data. It's, "I can see it, so it's mine." But you know, that doesn't work in candy stores. We can't just go, or jewelry stores in my case, I mean, you can't just say, "I like that diamond earring, or whatever, I'm just going to take it because I can see it." (laughs) So, I mean, yeah, we've got to... that's scraping, though. >> And the implications of that are suddenly now you've got a great new business initiative and somebody finds out that you used their private data in that initiative, and now they've got a claim on that asset. >> Right. And this is where things start to get super hairy, and you just want to make sure that you're being on the up-and-up with your data practices and your data ethics, because, in my opinion, 90% of what's gone wrong in AI, or the fear factor of AI, is that your privacy's getting violated and then you're labeled with data that you may not even know exists half the time. I mean. >> So, what's the answer? I mean, as you were talking about, these data scientists are scrappy, scruffy, roll-up-your-sleeves kind of people, and they are coming up with new ideas, new innovations that sometimes are good-- >> Oh yes, they are. >> So what is the answer? Is it a code of ethics? Is it... sort of similar to a Hippocratic Oath? I mean, what do you think? >> So, it's a multidimensional problem.
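To make concrete what "scraping" means in the exchange above, here is a minimal, hypothetical sketch: reading the visible text straight out of a page's markup, with no API, subscription, or permission involved. The page content is invented for illustration.

```python
from html.parser import HTMLParser

class VisibleTextScraper(HTMLParser):
    """Collects a page's visible text -- the 'I can see it, so it's mine' pattern."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

def scrape_visible_text(html):
    parser = VisibleTextScraper()
    parser.feed(html)
    return " ".join(parser.chunks)

# Hypothetical page: nothing here marks the data as licensed or restricted,
# which is exactly the problem the speakers are pointing at.
page = "<html><body><h1>Member Directory</h1><p>Jane Doe, jane@example.com</p></body></html>"
print(scrape_visible_text(page))  # → Member Directory Jane Doe, jane@example.com
```

Note that nothing in the code records where the data came from or under what terms, which is why a later rights claim on the scraped data is so hard to answer.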
Cortnie and I were talking earlier that you have to have more transparency into the models you're creating, and that means a significant validation process. And that's where the chief data officer partners with folks in risk and other areas and the data science team around getting more transparency and visibility into: what's the data that's feeding into it? Is it really the authoritative data of the company? And as Cortnie points out, do we even have the rights to that data that's feeding our models? And so, by bringing that transparency and a little more validation before you actually start making key, bet-the-business decisions on the outcomes of these models, you need to look at how you're vetting them. >> And the vetting process is part technology, part culture, part process; it goes back to that people-process-technology triangle. >> Yeah, absolutely. Know where your data came from. Why are you doing this model? What are you going to do with the outcomes? Are you actually going to do something with it, or are you going to ignore it? Under what conditions will you empower a decision-maker to use the information that is the output of the model? A lot of these things you have to think through when you want to operationalize it. It's not just, "I'm going to go get a bunch of data wherever I can, I put a model together. Here, don't you like the results?" >> But this is the Silicon Valley way, right? An MVP for everything, and you just let it run until... you can't. >> That's a great point, Cortnie (laughs). I've always believed, and I want to test this with you: we talk about the people, process, and technology of information; we never talk about the people, process, and technology of information about information. In many respects, what we're talking about is making explicit the information about...
information, the metadata, and how we manage that, and how we treat that, and how we diffuse that, and how we turn that, the metadata itself, into models to try to govern and guide utilization of this. That's especially important in the AI world, isn't it? >> I start with this. For me, it's simple, I mean, and everything he said was true. But I try to keep it to this: it's about free will. If I said you can do that with my data, to me it's always my data. I don't care if it's on Facebook, I don't care where it is, and I don't care if it's free or not, it's still my data. Even if it's 23andMe and they've taken the swab, or whether it's Facebook or I did a Google search, I don't care, it's still my data. So if you ask me if it's okay to do a certain type of thing, then maybe I will consent to that. But I should at least be given an option. And, you know, be given the transparency. So it's all about free will. So in my mind, as long as you're always providing some sort of free will (laughs), the ability for me to make a decision to say, "Yes, I want to participate in that," or, "Yes, you can label me as whatever label I'm getting, pro-Trump or pro-Hillary or Obama-whatever, name whatever the issue of the day is," then I'm okay with that, as long as I get a choice. >> Let's go back to it, I want to build on that if I can, because, and then I want to ask you a question about it, Carl, the issue of free will presupposes that both sides know exactly what's going into the data. So for example, if I have a medical procedure, I can sit down with that form and I can say, "Whatever happens is my responsibility." But if bad things happen because of malfeasance, guess what? That piece of paper's worthless and I can sue. Because the doctor and the medical provider are supposed to know more about what's going on than I do. >> Right. >> Does the same thing exist? You talked earlier about governance and some of the culture imperatives and transparency; doesn't that same thing exist?
And I'm going to ask you a question: is that part of your nonprofit's mission, to try to raise the bar for everybody? But doesn't that same notion exist, that at the end of the day, you don't... You do have information asymmetries; both sides don't know how the data's being used, because of the nature of data? >> Right. That's why you're seeing the emergence of all these data privacy laws. And so what I'm advising executives and the board and my clients is we need to step back and think bigger about this. We need to think about it as not just GDPR, the European scope; it's global data privacy. And if we look at the motivation, why are we doing this? Are we doing it just because we have to be regulatory-compliant 'cause there's a law on the books, or should we reframe it and say, "This is really about the user experience, the customer experience"? This is a touchpoint that my customers have with my company. How transparent should I be with what data I have about you, how I'm using it, how I'm sharing it, and is there a way that I can turn this into a positive instead of just, "I'm doing this because I have to for regulatory compliance"? And so, I believe if you really examine the motivation and look at it as more of the carrot and less of the stick, you're going to find that you're more motivated to do it, you're going to be more transparent with your customers, you're going to share, and you're ultimately going to protect that data more closely because you want to build that trust with your customers. And then lastly, let's face it, this is the data we want to analyze, right? This is the authenticated data we want to give to the data scientists. So I just flip that whole thing on its head: we do it for these reasons, and we increase the transparency and trust. >> So Cortnie, let me bring it back to you. >> Okay. >> That presupposes, again, an up-leveling of knowledge about data privacy, not just for the executive but also for the consumer. How are you going to do that?
>> Personally, I'm going to come back to free will again, and I'm also going to add: harm impacts. We need to start thinking impact assessments instead of governance, quite frankly. We need to start looking at, if I, you know, start using a FICO score as a proxy for another piece of information, like a crime record in a certain district or whatever, as a way to understand how responsible you are and whether or not your car is going to get broken into, and now you have to pay more. Well, if you always use a FICO score, for example, as a proxy for responsibility, which, let's face it, once a data scientist latches onto something, they share it with everybody, 'cause that's how they are, right? They love that, and I love that about them, quite frankly. But what I don't like is it propagates, and then before you know it, for the people who are of lesser financial means, it's getting propagated, because now they're going to be... Every AI pricing model is going to use FICO score as a-- >> And they're priced out of the market. >> And they're priced out of the market, and how is that fair? And there's a whole group, I think you know about the Fairness, Accountability, and Transparency group that, you know, kind of watchdogs this stuff. But I think business leaders as a whole don't really think through to that level, like, "If I do this, then this, this, and this could occur--" >> So what would be the one thing you could say, if corporate America's listening? >> Let's do impact. Let's do impact assessments. If you're going to cost someone their livelihood, or you're going to cost them thousands of dollars, then let's put more scrutiny, let's put more governance validation. To your point, let's put some... 'cause not everything needs the nth level. Like, if I present you with a blue sweater instead of a red sweater on Google or whatever (laughs), you know, that's not going to harm you.
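The impact-assessment idea Cortnie proposes above can be sketched as a simple disparate-outcome check run before a pricing model ships. The quote amounts, the group labels, and the 1.25 threshold below are invented for illustration; a real assessment would be far more involved.

```python
def average_price(quotes):
    return sum(quotes) / len(quotes)

def disparate_impact_ratio(group_a_quotes, group_b_quotes):
    """Ratio of average quoted prices between two customer groups.
    A ratio far from 1.0 flags the model for deeper human review."""
    return average_price(group_a_quotes) / average_price(group_b_quotes)

# Hypothetical insurance quotes for two FICO bands
low_fico_quotes = [180.0, 210.0, 195.0]
high_fico_quotes = [95.0, 105.0, 100.0]

ratio = disparate_impact_ratio(low_fico_quotes, high_fico_quotes)
REVIEW_THRESHOLD = 1.25  # a policy choice the business must own, not a statute
if ratio > REVIEW_THRESHOLD:
    print(f"Flag for impact review: low-FICO customers pay {ratio:.2f}x more on average")
```

The point of the sketch is that the check is cheap to run; the hard part is deciding, as policy, what threshold triggers review and who is accountable for the answer.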
But it will harm you if I give you a teacher assessment that's based on something that you have no control over, and now you're fired because you've been laid off 'cause your rating was bad. >> This is a great conversation. Let me... Let me add something different, 'cause... Or say it a different way, and tell me if you agree. In many respects, it's: does this practice increase inclusion or does this practice decrease inclusion? This is not some goofy, social thing, this is: are you making your market bigger or are you making your market smaller? Because the last thing you want is that the participation by people ends with: you can't play because of some algorithmic response we had. So maybe the question of inclusion becomes a key issue. Would you agree with that? >> I do agree with it, and I still think there's levels even to inclusion. >> Of course. >> Like, you know, being a part of the blue sweater club versus (laughs) versus, "I don't want to be a convict," you know, suddenly, because of some record you found, or an association with someone else. And let's just face it, a lot of these algorithmic models do do these kinds of things, where they... They use n+1, you know, a lot... you know what I'm saying. And so you're associated naturally with the next person closest to you, and that's not always the right thing to do, right? So, in some ways, and I'm positing just a little bit of a new idea here, you're creating some policies, whether you're being, and we were just talking about this, but whether you're being implicit about them or explicit; more likely you're being implicit, because you're just summarily deciding. Well, okay, I have just decided, in the credit score example, that if you don't have a good credit threshold... But where in your policies and your corporate policy did it ever say that people of lesser financial means should be excluded from being able to have good car insurance? 'Cause now, the same goes with, like, Facebook.
Some people feel like they're going to have to opt out of life, I mean, if they don't-- >> (laughs) Opt out of life. >> I mean, like, seriously, when you think about grandparents who are excluded, you know, out in whatever Timbuktu place they live, and all their families are somewhere else, and the only way that they get to see them is, you know, on Facebook. >> Go back to the issue you raised earlier about "Somebody read my email." I can tell you, as a person with a couple of more elderly grandparents, they inadvertently shared some information with me on Facebook about a health condition that they had. You know how grotesque the response of Facebook was to that? And it affected me too, because they had my name in it. They didn't know any better. >> Sometimes there's a stigma. Sometimes things become a stigma as well. There's an emotional response. When I put the article out about why I left IBM to start this new AI Truth nonprofit, the responses I got back that were so immediate were emotional responses about how this stuff affects people. That they're scared of what this means. Can people come after my kids or my grandkids? And if you think about how genetic information can get used, you're not just hosing yourself. I mean, breast cancer genes, I believe, aren't they, like... They run through families, so, I-- >> And they're pretty well-understood. >> If someone swabs my DNA, and uses it and combines it with other data, you know, all of a sudden not just me is affected, but my whole entire lineage, I mean... It's hard to think of that, but... it's true (laughs). >> These are real life and death... these are-- >> Not just today, but for the future. And in many respects, it's that notion of inclusion...
Going back to it, now I'm making something up, but not entirely, but going back to some of the stuff that you were talking about, Carl: the decisions we make about data today, we want to ensure that we know there's value in the options for how we use that data in the future. So the issue of inclusion is not just about people; it's also about other activities, or other things that we might be able to do with data, because of the nature of data. I think we always have to have an options approach to thinking about... as we make data decisions. Would you agree with that? >> Yes, because, you know, data's not absolute. So you can measure something, and you can look at the data quality, you can look at the inputs to a model, whatever, but you still have to have that human element of, "Are we doing the right thing?" You know, the data should guide us in our decisions, but I don't think it's ever an absolute. It's a range of options, and we chose this option for this reason. >> Right, so are we doing the right thing, and do no harm too? Carl, Cortnie, we could talk all day; this has been a really fun conversation. >> Oh yeah, and we have. (laughter) >> But we're out of time. I'm Rebecca Knight for Peter Burris; we will have more from MIT CDOIQ in just a little bit. (upbeat music)
Dr Prakriteswar Santikary, ERT | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's theCUBE, covering the 12th Annual MIT Chief Data Officer and Information Quality Symposium, brought to you by SiliconANGLE Media. >> Welcome back to theCUBE's coverage of MIT CDOIQ here in Cambridge, Massachusetts. I'm your host, Rebecca Knight, along with my co-host, Peter Burris. We're welcoming back Dr. Santikary, who is the Vice President and Chief Data Officer of ERT. Thanks for coming back on the program. >> Thank you very much. >> So in our first interview we talked about the why and the what, and now we're really going to focus on the how. What are the kinds of imperatives that ERT needs to build into its platform to accomplish the goals that we talked about earlier? >> Yeah, it's a great question. That's where our data and technology pieces come in. As we were talking about in our first session, clinical trials are complex. In our platform, we are just drowning in data, because the data is coming from everywhere. There is real-time data, there is unstructured data, there is binary data such as image data, and they normally don't fit in one data store; they are different types of data. So what we have come up with is a unique way to really gather the data in real time, in a data lake, and we implemented that platform on the Amazon Web Services cloud, and it has the ability to ingest as well as integrate data of any volume, of any type, coming to us at any velocity. So it's a unique platform, and it is already live; the press release came out in the early part of June, and we are very excited about that. And it is commercial right now. So, yeah. >> But you're more than just a platform; you're products and services on top of that platform. One might say that the services in many respects are what you're really providing to the customers, the services that the platform provides. Have I got that right? >> Yes, yes.
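The ingest-anything behavior Santi describes above, with structured, unstructured, and binary data landing in one data lake, can be illustrated with a toy sketch. This is not ERT's actual pipeline; the record shape, sources, and payloads are all invented.

```python
import time

def ingest(lake, source, payload):
    """Toy data-lake ingest: accept a payload of any type, store it raw,
    and attach the metadata needed to find and govern it later."""
    record = {
        "source": source,
        "received_at": time.time(),
        "type": type(payload).__name__,  # structured, text, binary -- all welcome
        "payload": payload,
    }
    lake.append(record)
    return record

lake = []
ingest(lake, "ecg-device", {"patient": "anon-01", "qt_interval_ms": 410})  # structured
ingest(lake, "site-notes", "Patient reported mild dizziness.")             # unstructured
ingest(lake, "imaging", b"\x89PNG...")                                     # binary
print(len(lake))  # → 3
```

The design point is that ingestion does not force one schema up front; each record carries enough metadata that later products (BI, risk monitoring) can find and interpret it.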
So on that platform you build different kinds of services; we call them data products. One of the data products is business intelligence, where you do real-time decisioning. Another product is RBM, Risk-Based Monitoring, where you come up with all the risks that a clinical trial may be facing and really expose those risks preemptively. >> So give us some examples. >> An example would be patient visits. A patient may be non-compliant with the protocol. If that happens, then the FDA is not going to like it. So before it gets there, our platform warns the sponsor that, hey, there is something going on; can you take preemptive action? Instead of waiting until the 11th hour, only to find out that you have really missed out on some major things. That's just one example. Another could be data quality issues. Let's say there is a gap in the data, or inconsistent data, or the data is not statistically significant. You have to raise some of these with the sponsors so that they can start gathering data that makes sense, because at the end of the day, data quality is vital for the approval of the drug. If the quality of the data that you are collecting is not good, then what good is the trial? >> So that also suggests that data governance has got to be a major feature of some of the services associated with the platform. Have I got that right? >> Yes, data governance is key, because that's where you get to know who owns which data, and how you really maintain the quality of data over time. So we use tools, technologies, and processes to really govern the data, and as I was telling you in session one, we serve as the custodian of these data.
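The risk-based-monitoring idea described above, flagging problems such as missed patient visits before they become regulatory findings, can be sketched in a few lines. The site records and the threshold below are invented for illustration, not ERT's actual rules.

```python
def flag_site_risks(site_visits, expected_visits, max_missed=1):
    """Return a risk flag per site whose recorded patient visits
    fall too far below the protocol's expected count."""
    flags = {}
    for site, visits in site_visits.items():
        missed = expected_visits - len(visits)
        if missed > max_missed:
            flags[site] = f"{missed} missed visits out of {expected_visits}"
    return flags

# Hypothetical sites: each list holds the visit dates actually recorded so far
sites = {
    "Boston-01": ["2018-01-05", "2018-02-05", "2018-03-05"],  # fully compliant
    "Mumbai-07": ["2018-01-09"],                              # falling behind
}
print(flag_site_risks(sites, expected_visits=3))  # flags only Mumbai-07
```

Running a check like this continuously, rather than at database lock, is what lets a sponsor act before the 11th hour.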
So we have a fiduciary responsibility, in some sense, to really make sure that the data is ingested properly, gathered properly, integrated properly, and then we make it available in real time for real-time decision making, so that our customers can really make the right decisions based on the right information. So data governance is key. >> One of the things that I believe about the medical profession is that it's always been at the vanguard of ethics, social ethics, and increasingly, well, there has always been a correspondence between social ethics and business ethics. I mean, ideally they're very closely aligned. Are you finding that the medical ethics, the social medical ethics of privacy and how you handle data, are starting to inform a broader understanding of the issues of privacy and the ethical use of data, and how are you guys pushing that envelope, if you think that that is an important feature? >> Yeah, that's a great question. We have data security in place in our platform, right? And the data security in our case operates at multiple levels. We don't co-mingle one sponsor's data with another's; they are always partitioned. We partition the data in a technical sense, and then we have permissions and roles, so users will see only what they are supposed to see, depending on their roles. So yeah, data security is very critical to what we do. We also de-identify the data. We don't really store PII, Personally Identifiable Information, such as email address, first name, last name, or social security number for that matter. When we do analysis, we de-identify the data. >> Are you working with European pharmaceuticals as well, Bayer and others? >> Yeah, we are, as I said. >> So you have GDPR issues (crosstalk). >> We have GDPR issues. We have HIPAA issues. So you name it: data privacy, data security, data protection. They are all a part of what we do, and that's why technology is one piece that we do very well.
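One common way to implement the de-identification step Santi describes is to replace direct identifiers with salted one-way hashes, so records stay linkable across datasets without exposing who they belong to. The field names and salting scheme here are assumptions for illustration, not ERT's actual method.

```python
import hashlib

PII_FIELDS = {"first_name", "last_name", "email", "ssn"}

def deidentify(record, salt):
    """Replace direct identifiers with salted SHA-256 digests; pass clinical fields through."""
    out = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[field] = digest[:16]  # truncated pseudonym; same input maps to same token
        else:
            out[field] = value
    return out

patient = {"first_name": "Jane", "ssn": "123-45-6789", "heart_rate": 72}
clean = deidentify(patient, salt="per-study-secret")
# clinical data survives; direct identifiers are reduced to opaque tokens
```

Keeping the salt per study, and away from the analysts, is what prevents the tokens from being trivially reversed by re-hashing guessed names.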
The other pieces are compliance and science, because you need all three of those in order to be really trustworthy to your ultimate customers, and in our case they are pharmaceutical companies, medical device companies, and biotechnology companies. >> Where there are lives at stake. >> Exactly. >> So I know you have worked, Santi, in a number of different industries. I'd like to get your thoughts on what differentiates ERT from your competitors, and then, more broadly, what will separate the winners from the losers in this area. >> Yeah, obviously before joining ERT, I was the head of data engineering at eBay. >> Who? (laughing) >> So that's the bidding platform, so obviously we were dealing with consumer data, right? So we were applying artificial intelligence, machine learning, and predictive analytics, all kinds of things to drive the business. In this case, while we are still doing predictive analytics, the kind of predictive analytics is very different, because in our case here at ERT we can't recommend anything; we can't say, hey, don't take Aspirin, take Tylenol. We can't do that. It has to be driven by doctors. Whereas at eBay, we were just talking to the end consumers, and we would just predict. >> Different ethical considerations. >> Exactly. But in our domain, ERT is the best of breed in terms of what we do, driving clinical trials and helping our customers, and the things that we do best are those three areas, like data collection. Obviously the data custodianship, which includes privacy, security, you name it. Another thing we do very well is real-time decisioning. That allows our customers, in this case pharmaceutical companies, to have this integrated dataset in one place, almost like a cockpit where they can see which data is where, where the risks are, and how to mitigate those risks. Because remember that these trials are happening globally. So some sites are here, some sites are in India. Who knows where?
>> So the mission control is so critical. >> Critical, time critical. >> Hmm. >> And cost-effective as well, because if you can mitigate those risks before they become problems, you save not only cost but you shorten the timeline of the study itself. You reduce that time to market so that you can go to market faster. >> And you mentioned that the process could be a 3 billion dollar process. So reducing time to market could be a billion dollars of cost and a few billion dollars of revenue, because you get your product out before anybody else. >> Exactly. Plus you are helping your end goal, which is to help the ultimate patients, right? >> And that too. >> Because if you can bring the drug five years earlier than what- >> Save lives. >> What you had intended, then, you know, you'd save lots of lives there. Definitely. >> So the one question I have is, we've talked a lot about these various elements, and we haven't once mentioned master data management. >> Yes. >> So give us a little sense of the role that master data management plays within ERT and how you see it changing. Because it used to be a very metadata-oriented, technical thing, and it's becoming much more something that is almost a reflection of the degree to which an institution has taken up the role that data plays within decision making and operations. >> Exactly, a great question. Master data management has people, process, and technology; all three co-mingle to drive master data management. So it's not just about technology. In our case, our master data is, for example, site, or customers, or vendors, or study. They're master data because they live in each system. Now, the definitions and semantics of those entities are different in each system. So in our platform, when you bring data together from disparate systems, somehow we need to harmonize these master entities.
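The harmonization step just described, where the same customer arrives spelled many different ways across systems, is often implemented as normalization plus a canonical-record lookup. A minimal sketch, with invented aliases and a one-entry golden record:

```python
def normalize(name):
    """Collapse case, punctuation, and common corporate suffixes before lookup."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ")
    words = [w for w in cleaned.split() if w not in {"inc", "ag", "ltd", "pharma"}]
    return " ".join(words)

# The golden record: one agreed spelling per master entity, owned by data governance
CANONICAL = {"novartis": "Novartis"}

def master_name(raw):
    """Resolve a raw source-system spelling to the canonical spelling, if known."""
    return CANONICAL.get(normalize(raw), raw)

for raw in ["NOVARTIS AG", "Novartis, Inc.", "novartis pharma"]:
    print(master_name(raw))  # each line prints: Novartis
```

The code is the easy part; the governance work is deciding who owns `CANONICAL`, how new aliases get added, and how conflicts between source systems are adjudicated.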
That's why master data management- >> While complying with regulatory and ethical requirements. >> Exactly. So customers, for example, Novartis, let's say, or any other name, can be spelled 20 different ways in 20 different systems. But when we are bringing the data together into our core platform, we want Novartis to be spelled only one way. So that's how you maintain the data quality of those master entities. And then obviously we have the technology side of things. We have master data management tools. We have data governance that allows data quality to be established over time, and that is also allowing us to really help our ultimate customers, who are then seeing a high-quality dataset. That's the end goal: whether they can trust the numbers. And that's the main purpose of our integrated platform that we have just launched on AWS. >> Trust, it's been such a recurring theme in our conversation. The immense trust that the pharmaceutical companies are putting in you, the trust that the patients are putting in the pharmaceutical companies to build and manufacture these drugs. How do you build trust, particularly in this environment? On the main stage they were talking this morning about how just this very notion of data as an asset really requires buy-in, but also trust in that fact. >> Yeah, yeah. Trust is a two-way street, right? It has always been. So our customers trust us, and we trust them. And the way you build trust is through showing, not through talking, right? So, as I said, in 2017 alone, 60% of FDA approvals went through our platform. So that says something. Customers are seeing the results. They are seeing their drugs getting approved. We are helping them with compliance, with audits, with science, and obviously with tools and technologies. So that's how you build trust over time. And we have been around since 1977; that helps as well, because it's a tried and true method.
We know the procedures. We know the waters, as they say. And obviously, folks like us, we know the modern tools and technologies to expedite clinical trials, to really gain efficiency within the process itself. >> I'll just add one thing to that and test you on this. Trust is a social asset. >> Yeah. >> At the end of the day it's a social asset, and I think what a lot of people in the technology industry continuously forget is that they think trust is about your hardware, or it's about something in your infrastructure, or even in your applications. You can say you have a trusted asset, but if your customer says you don't, or a partner says you don't, or some group of your employees says you don't, you don't have a trusted asset. >> Exactly. >> Trust is where the technology, the process, and the people really come together. >> And the people come together. >> That's the test of whether or not you've really got something that people want. >> Yes. And your results will show that, right? Because at the end of the day, your ultimate test is the results, right? Everything hinges on that. And then experience helps, as you're experienced with tools and technologies, science, and regulations. Because it's almost a multidimensional Venn diagram. And we are very good at that, and we have been for the past 50 years. >> Great. Well, Santi, thank you so much for coming on the program again. >> Okay, thank you very much. >> It was really fun talking to you. >> Thank you. >> I'm Rebecca Knight for Peter Burris. We will have more from MIT CDOIQ in just a little bit. (upbeat futuristic music)
Jeff McMillan, Morgan Stanley | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's theCUBE, covering the 12th annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome back to theCUBE's coverage of MIT's CDOIQ. I'm your host, Rebecca Knight, along with my cohost, Peter Burris. We're joined by Jeff McMillan. He is the managing director at Morgan Stanley. Well, thanks so much for coming on theCUBE, Jeff. >> Thanks for having me, it's great to be here. >> So you were just on a panel that was discussing the challenges and the opportunities of the CDO today. I mean, it is a mark of where the CDO role is, just by virtue of the fact that so many corporations are putting it front and center in their organizations. >> Yeah, I think what's interesting, though, is it is a bit of a solution in search of a problem, and what I find is that the biggest challenge many of these people are facing is that data in and of itself solves nothing, right? Unless you actually say, what business problem am I trying to solve, is it a risk problem, is it an efficiency play, is it a customer service issue, and then build your data solutions in support of that. Too many people start their journey by hiring 400 people, and they create data lineages and they have to create a dictionary and they put all these structures in place, and most of them fail, because they actually didn't figure out what they're solving for, and oftentimes very elegant and small solutions can actually drive a lot of positive outcomes. As we actually just discussed on the panel, knowing what you're solving for is the first step to being a successful chief data officer. >> Well, investments in infrastructure before outcomes fail no matter what they are, right?
So whether it's an infrastructure for doing data analytics better, as you said, a whole bunch of clusters and a whole bunch of metadata management and other stuff, if it's not applied to some end, it's not going to get adopted. So we like to think, as we were saying in the opening segment, that one of the things a chief data officer needs to do is acculturate the business to the idea of data being an asset, something that can be applied to work. And it's interesting in part because data can also help you choose what work you should apply it to. So talk a little bit about that. Does that resonate with you? >> I would totally agree with that, and it's not different, like when the first person created a business 2,000 years ago, somewhere along the line they said they needed somebody to keep track of the money, right? And the chief financial officer role sort of emerged, and then we had this thing where people actually came to work every day and they weren't really well trained and didn't understand their responsibilities, so we created the head of human resources. And I think these functions have evolved because as the business model grows, you need people to drive specific skills and competencies around these areas. And the truth is, in most organizations, we don't treat data like an asset. And part of it is the machinery, it's getting your Hadoop clusters up and putting your metadata in and all that stuff. >> Or we confuse the assets of the technology with the assets that drive business value.
>> So there is this lack of consensus about where the CDO should sit, what his or her responsibilities mandate, scope, what do you think is the answer here? >> Well, we just got off the panel, and this was actually hotly debated, and there were two views on this that were highly divergent. >> But none of the other panelists are here today. >> Yeah, so my view's the right view. (laughter) Actually, I'll lay out both arguments. One of my colleagues on the panel was really driving this tech-focused approach, and her argument, which has some matter in fairness, is that so much of data is about the technology and the interplay and also the knowledge and the expertise and appreciation. You know, technology's been dealing with this problem for 25 years. No one was actually listening to them, right? So there is tremendous knowledge and expertise built up there. I took the other side of the equation, and I worked for the co-heads of our business, because it's not about the technology. And again, the challenges and the barriers to success are not technical in nature, it's leadership. And one thing that's interesting about data, and the reason that people have such a hard time with it, is that the problem and the solution to the problem often sit in two different cost centers. So getting somebody else to care about a problem that impacts you, when it actually doesn't drive your outcomes, is really hard, and that requires leadership and it requires collaboration. 
And sitting in a technology organization, by the way, I work with terrific technology folks, so this is not a disparagement on them, but sitting under the co-heads of the business, I am able to have those conversations with the other leaders of the business, and say listen, I know that you don't care about this, but for the best interest of the organization, we have to make these investments and let me explain, and those people think more holistically 'cause they're solving for the enterprise as opposed to their individual piece of technology. >> Which really is kind of you said, it requires leadership and it requires collaboration, but that also is one of the fundamental orientation of what great strategy should be. It's a way of cohering the mental model, getting everybody to agree on what the outcome and what the objective needs to be. >> Totally, and by the way, for those of us who are around in the late 90s. >> Not me! (laughter) >> When everyone hired the head of Internet strategy. This feels very much the same way, right? Everyone built websites and they had straight through processing and they sort of woke up a year and a half later, and they said, how has this gotten better? And they said oh, maybe we actually need to connect it to our infrastructure. >> I'll date myself. I remember when these conversations about whether or not we had a CIL, when we had a head of DP within HR, we had a head of DP within accounting, and there was whoa, what are we, the chief is responsible from my perspective, and I'd like to hear what you have to say, a chief anything is responsible for getting a return on the assets that are entrusted to them. >> Yeah, and that is 100% true. That being said, where you make your money is in the businesses, and I think to be really good at this job, you have to be very humble. 
And you can't make it about you and your goals and objectives, 'cause I have no goals and objectives outside of the goals and objectives of the business that I support. And part of what a lot of the challenge that people have, is they want to build empires, and I actually, I said to my boss, I have declared success when I'm an organization of one, because what I've been able to done is I've been able to set up the right controls, I've got the right people on the right jobs who understand, and the right technology, but the innovation is happening. It doesn't happen to my group. It happens away from my group. It happens when that 23-year-old who has got, with six weeks of visualization training, is sitting at 10 o'clock at night, figures out a better way to sell a municipal bond, because they spent 100 of their hours working on that. It's democratizing access to that, and it's really finding that right balance between control, ensuring the right data quality's in place, but also giving people the ability to innovate, and I think that's the perfect inflection point where you want to be. >> So what is the answer here? How would you give this remedy to other organizations in terms of the best practices that have emerged, and how to do this and do it right? >> Well, first and foremost, you got to know what your strategy is. I was on a panel with GE and General Motors. Their goals and objectives are very different than my goals and objectives. So don't leave this conference because Jeff McMillan did it this way at Morgan Stanley, and assume that that's the right answer for you. I think you have to first ask yourself, what are the most important objectives, and what is your strategy, 'cause the other thing I find is, you ask that question, a lot of businesses, even in this world in which, 2018, we talk about all the time, they don't have a clearly articulated strategy. And unless you have a strategy, putting data on the back end of that is not going to solve the problem. 
So first and foremost, you got to have a strategy. And then secondly, you got to put the right technical infrastructure in, there's a lot of plumbing that goes into this, and I'm going to gloss over it, but it's really important, and then you got to put the right organizational structure in place. I actually don't believe that you create a different parallel committee around this. The way we do it at our firm is we actually, the existing executive committee, is responsible for this, it's an additional function of them. We report into that function, and then you say, what is your business goals and objectives? Figure out where the gaps are, and then spend the time, money, and resources to solve and focus on that, and do it one problem at a time, and in doing that, you start to build this, what I'll describe as a data-centric or decision-centric culture. >> We call it data first, and so the way we tend to think about it, and I want to bounce this off of you is, you know, what's your business, what are the activities, the outcomes that are necessary to perform that business, what activities are necessary to achieve those outcomes, what data is necessary to perform those activities? >> That's right. >> Does that kind of follow? >> 100%, and also what processes, 'cause the other thing is that you talk to the data consultants, it's all about the data. And then you talk to the process consultants about the process, it's all about all of those things, and the point is that the data is the piece that sits, but there are many factors that influence that. Sometimes it's a data quality program. Sometimes it's a training program. Sometimes it's a technology issue. Sometimes it's a vendor supply issue. 
There's a whole host of reasons, and really the question is how do you use the data as the rallying point to say, this is the objective source of truth, and where is that objective source of truth, either not from a quality perspective, or from a business perspective, how does it impact those business, and always going back to that thing, 'cause there's truth in that attribute. >> And is that a culture issue? >> Well, it's a process of the technology, and it ultimately is a culture. And it's going back to the original comment, is do you see data as a problem, or do you see data as an opportunity? And I would argue, and I'm not going to speak for other companies, but in the world of finance, we live in bits, zeroes and ones, right? We are an information based business at the core, that happens to be delivering a financial services product. And in that world, that is our competitive advantage. I have a database of every single transaction that every client has ever given with us at Morgan Stanley. I know your risk tolerance level. I know where you live, I know whether you have children. That is a powerful source of knowledge, that if harnessed appropriately, allows to deliver a far, far superior solution to our clients and what they were getting previously. >> Great, well Jeff, thanks so much for coming on the show. It's really fun talking to you. >> Yeah, thank you so much. >> I'm Rebecca Knight. For Peter Burris, we will have more from MIT's CDOIQ coming up just after this.
SUMMARY :
Brought to you by SiliconANGLE Media. He is the managing it's great to be here. of the CDO today. and they have to create a dictionary the business to the idea And part of it is the machinery, assets of the technology and the right hand talking and this was actually hotly debated, But none of the other and the barriers to success is one of the fundamental orientation Totally, and by the and they said, how has this gotten better? and I'd like to hear what you have to say, and objectives of the and then you got to put and the point is that the Well, it's a process of the technology, much for coming on the show. For Peter Burris, we will
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Rebecca Knight | PERSON | 0.99+ |
Peter Burris | PERSON | 0.99+ |
Jeff McMillan | PERSON | 0.99+ |
GE | ORGANIZATION | 0.99+ |
Jeff | PERSON | 0.99+ |
100 | QUANTITY | 0.99+ |
General Motors | ORGANIZATION | 0.99+ |
Morgan Stanley | ORGANIZATION | 0.99+ |
25 years | QUANTITY | 0.99+ |
six weeks | QUANTITY | 0.99+ |
2018 | DATE | 0.99+ |
100% | QUANTITY | 0.99+ |
MIT | ORGANIZATION | 0.99+ |
two views | QUANTITY | 0.99+ |
400 people | QUANTITY | 0.99+ |
Cambridge, Massachusetts | LOCATION | 0.99+ |
10 o'clock | DATE | 0.99+ |
theCUBE | ORGANIZATION | 0.99+ |
SiliconANGLE Media | ORGANIZATION | 0.99+ |
a year and a half later | DATE | 0.99+ |
late 90s | DATE | 0.98+ |
first | QUANTITY | 0.98+ |
first step | QUANTITY | 0.98+ |
today | DATE | 0.97+ |
one problem | QUANTITY | 0.96+ |
2,000 years ago | DATE | 0.95+ |
both arguments | QUANTITY | 0.95+ |
one | QUANTITY | 0.95+ |
first person | QUANTITY | 0.95+ |
MIT Chief Data Officer and Information Quality Symposium | EVENT | 0.94+ |
one thing | QUANTITY | 0.94+ |
single transaction | QUANTITY | 0.88+ |
23-year-old | QUANTITY | 0.81+ |
CDOIQ | ORGANIZATION | 0.8+ |
two different | QUANTITY | 0.76+ |
CDO | ORGANIZATION | 0.75+ |
secondly | QUANTITY | 0.72+ |
One of my colleagues | QUANTITY | 0.71+ |
CDOIQ | EVENT | 0.69+ |
12th annual | QUANTITY | 0.66+ |
night | DATE | 0.56+ |
Kickoff | MIT CDOIQ 2018
>> Live from the MIT Campus in Cambridge, Massachusetts, it's theCUBE. Covering the 12th Annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome to theCUBE's coverage of MITCDOIQ here in Cambridge, Massachusetts on the MIT Campus. I'm your host, Rebecca Knight, along with my co-host, Peter Burris. Peter, it's a pleasure to be here with you. Thanks for joining me. >> Absolutely, good to see you again, Rebecca. >> These are my stomping grounds. >> Ha! >> So welcome to Massachusetts. >> It's an absolutely beautiful day in Cambridge. >> It is, it is, indeed. I'm so excited to be hosting this with you. What do you think? This is about chief data officers and information quality. We're really going to get inside the heads of these chief data officers, find out what's on their minds, what's keeping them up at night, how are they thinking about data, how are they pricing it, how are they keeping it clean, how are they optimizing it, exploiting it, how are they hiring for it? What do you think is the top issue of the day, in your mind? There's a lot to talk about here, what's number one? >> Well, I think the first thing, Rebecca, is that if you're going to have a chief in front of your name then, at least in my mind, that means the Board has directed you to generate some return on the assets that you've been entrusted with. I think the first thing that the CDO, the chief data officer, has to do is start to do a better job of pricing out the value of data, demonstrating how they're turning it into assets that can be utilized and exploited in a number of different ways to generate returns that are actually superior to some of the other assets in the business, because data is getting greater investment these days. So I think the first thing is, how are you turning your data into an asset, because if you're not, why are you a chief of anything? >> (laughs) No, that's a very good point. 
The other thing we were talking about before the cameras were rolling is the role of the CDO, chief data officer, and the role of the CIO, chief information officer, and how those roles differ. I mean, is that something that we're going to get into today? What do you think? >> I think it's something certainly to ask a lot of the chief data officers that are coming on; there's some confusion in the industry about what the relationship should be and how the roles are different. The chief data officer as a concept has been around for probably 10-12 years, something like that. I mean, the first time I heard it was probably 2007-2008. The CIO role has always been about information, but it ended up being more about the technology, and then the question was, what does a Chief Technology Officer do? Well, a Chief Technology Officer could have had a different role, but they also seem to be increasingly responsible for the technology. So if you look at a lot of organizations that have a CDO, the CIO looks more often to be the individual in charge of the IT assets, the technology officer tends to be in charge of the IT infrastructure, and the CDO tends to be more associated with, again, the role that the data plays, increasingly associated with analytics. But I think, over the next few years, that set of relationships is going to change, and new regimes will be put in place as businesses start to re-institutionalize their work around their data, and what it really means to have data as an asset. >> And the other role we've not mentioned is the CDO, Chief Digital Officer, which is the convergence of those two roles as well. How do you see... you started out by saying this is really about optimizing the data and finding a way to make money from it. >> Or generate a return. >> Generate a return, exactly! Find value in it, exactly. 
>> One of the things about data, and one of the things about IT, historically, is that it often doesn't generate money directly, but rather indirectly, and that's one of the reasons why it has been difficult to sustain investment in it. The costs are almost always direct, so if I invest in an IT project, for example, the costs show up immediately but the benefits come through whatever function I just invested in the application to support. And the same thing exists with data. So if we take a look at the Chief Digital Officer, often that's a job that has developed largely close or proximate to the COO, to better understand how operations are going to change as a consequence of an increasing use of data. So, the Chief Digital Officer is often an individual who's entrusted to think about, as we re-institutionalize work around data, what is that going to mean to our operations and our engagement models too? So, I think it's a combination of operations and engagement. So the Chief Digital Officer is often very proximate to the COO, thinking about how data is going to change the way the organization works, change the way the organization engages, from a strategic standpoint first, but we're starting to see that role move more directly into operations. I don't want to say compete with the COO, but work much more closely with them at an operational level. >> Right, and of course, it depends, organization to organization. 
If you're a manufacturing company, and you're building digital twins, like a GE or something along those lines, then you might be a little bit newer to the game, but still you have to catch up because data is going to mush a lot of industries together, and it's going to to be hard to parse some of these industries in five to ten years. >> Well, precisely, one of the things you said was that the CDO, as a role, is really only 11-12 years old. In fact that this conference is in its 12th year, so really it started at the very beginning of the CDO journey itself, and we're now amidst the CDO movement. I mean, what do you think, how is the CDO thinking about his or her role within the larger AI revolution? >> Well, that's a great question, and it's one of the primary reasons why it's picking up pace. We've had a number of different technology introductions over the past 15 - 20 years that have bought us here. The notion of virtualizing machines changed or broke that relationship between applications and hardware. The idea of very high speed, very flexible, very easy to manage data center networking broke the way that we thought about how resources could be bought together. Very importantly, in the last six or seven years, the historical norm for storage was disc, which was more emphasized how do I persist the data that results from a transaction, and now we're moving to flash, and flash-based systems, which is more about how can I deliver data to new types of applications. That combination of things makes it possible to utilize a lot of these AI algorithms and a lot of these approaches to AI, many of which the algorithms have been around for 40-50 years, so we're catalyzing a new era in which we can think about delivering data faster with higher fidelity, with lower administrative costs because we're not copying everything and putting it in a lot of different places. That is making it possible to do these AI things. 
That's precisely one of the factors that's really driving the need to look at data as an asset because we can do more with it than we ever have before. You know, it's interesting, I have a little bromide, when people ask me what's really going on in the industry, what I like to say is for the first 50 years of the industry, it was known process, unknown technology. We knew we were going to do accounting, we knew we were going to do HR, there was largely given to us legal or regulatory or other types of considerations, but the unknown was, do we put it on a mainframe? Do we put it on a (mumbles) Do we use a database manager? How distributed is this going to be? We're now moving into an era where it's unknown process because we're focused on engagement or the role that data can play in changing operations, but the technology is going to be relatively common. It's going to be Cloud or Cloud-like. So, we don't have quite as, it's not to say that technology questions go away entirely, they don't, but it's not as focused on the technology questions, we can focus more on the outcomes, but we have a hard time challenging those outcomes or deciding what those outcomes are going to be, and that's one of the interesting things here. We're not only using data to deliver the outcomes, we're also using data to choose what outcomes to pursue. So it's an interesting recursive set of activities where the CDO is responsible for helping the business decide what are we going to do and also, how are we going to do it? >> Well, exactly. That's an excellent point, because there are so many, one of the things that we've heard about on the main stage this morning is the difficulty a lot of CDOs get with just buy-in, and really understanding, this is important, and this is not as important or this is what we're going to do, this is what we're saying the data is telling us, and these are the actions we're going to take. How do you change a culture? How do you get people to embrace it? 
>> Well, this is an adoption challenge, and adoption challenges are always met by showing returns quickly and sustainably. So one of the first things that a CDO has to do, which is why I said it earlier, is show the organization how data can be thought of as an asset, because once you do that, now you can start to describe some concrete returns that you are able to help deliver as a consequence of your chief role. So that's probably the first thing. But I think one of the other things to do is to start demonstrating the role that information quality plays within an organization. Now, information quality is almost always measured in terms of the output or the outcomes that it supports, but there are questions of fidelity, there are questions of what data are we going to use, what data are we not going to use? How are we going to get rid of data? There are a lot of questions related to information quality that have process elements to them, and those processes are just now being introduced to the organization, and doing a good job of that, and acculturating people to understanding the role that quality plays, that information quality plays, is another part of it. So I think you have to demonstrate that you have conceived and can execute on a regime of value, and at the same time you have to demonstrate that you have particular insight into some of those ongoing processes that are capable of sustaining that value. It's a combination of those two things that, I think, the chief data officer's going to have to do to demonstrate that they belong at the table, ongoing. >> Well, today we're going to be talking to an array of people, some from MIT who study this stuff. >> I hear they're smart people. >> Yeah, maybe. A little bit. We'll see, we'll see. 
MIT, some people from the US Government, so CDOs from the US Army, the Air Force, we've got people from industry too, we've also got management consultants coming on to talk about some best practices, so it's going to be a great day. We're going to really dig in here. >> Looking forward to it. >> Yes. I'm Rebecca Knight, for Peter Burris, we will have more from MITCDOIQ in just a little bit. (techno music)
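Peter's information-quality points in this segment, fidelity, deciding what data to use, and retiring data through explicit processes, can be made concrete with a small sketch. The checks, field names, and thresholds below are invented for illustration; they stand in for the process elements he mentions, not for any specific product or methodology named in the conversation.

```python
# Illustrative information-quality profiling: flag required fields whose
# null rate exceeds a threshold, and records that duplicate a key.
# All rules and thresholds are example assumptions.

def profile(records, required_fields, max_null_rate=0.1):
    """Return a list of quality findings for a list of dict records."""
    findings = []
    n = len(records)
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) in (None, ""))
        if n and nulls / n > max_null_rate:
            findings.append(f"{field}: null rate {nulls}/{n} exceeds {max_null_rate}")
    seen, dupes = set(), 0
    for r in records:
        key = r.get("id")
        if key in seen:
            dupes += 1
        seen.add(key)
    if dupes:
        findings.append(f"{dupes} duplicate id(s)")
    return findings

records = [
    {"id": 1, "risk_tolerance": "low"},
    {"id": 2, "risk_tolerance": None},
    {"id": 2, "risk_tolerance": "high"},   # duplicate id
]
for finding in profile(records, ["risk_tolerance"]):
    print(finding)
```

Framing quality this way supports the "concrete returns" argument above: each finding points at a downstream outcome the bad data would distort, which is how a CDO builds the buy-in the hosts keep returning to.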