Katya Fisher, Greenspoon Marder | Acronis Global Cyber Summit 2019
>> Narrator: From Miami Beach, Florida, it's theCUBE, covering Acronis Global Cyber Summit 2019. Brought to you by Acronis. >> Okay, welcome back everyone. It's theCUBE's two-day coverage of Acronis' Global Cyber Summit 2019, here in Miami Beach, at the Fontainebleau Hotel. I'm John Furrier, host of theCUBE. We're with Katya Fisher, Partner and Chief Privacy Officer at Greenspoon Marder. Legal advice is right here on theCUBE, ask her anything. We're going to do a session here. Thanks for coming on, appreciate it. >> Thank you very much, I'm going to have to do the little disclaimer that all lawyers do, which is, nothing here is to be construed as advice. It's just opinions and information only. >> I didn't mean to set you up like that. All kidding aside, you closed the panel here for Acronis' conference. Obviously, cyber protection's their gig. Data protection, cyber protection. Makes sense, I think that category is evolving from a niche, typical enterprise niche, to a much more holistic view as data becomes, you know, critical in the security piece of it. What was on the, what were you guys talking about in the panel? >> Well, so, the first issue that you have to understand is that cyber protection is something that has now become critical for pretty much every individual on the planet, as well as governments. So something that we talked about on the panel today was how governments are actually dealing with incoming cyber threats. Because now, they have to take a look at it from the perspective of, first of all, how they themselves are going to become technologically savvy enough to protect themselves, and to protect their data, but also, in terms of regulation and how to protect citizens. So, that was what the panel discussion was about today. >> On the regulatory front, we've been covering on SiliconANGLE, our journalism site, the innovation balance: is regulatory action helpful or hurtful to innovation? Where is the balance? What is the education needed? What's your thoughts on this, where are we? I mean early stages, where's the progress? What needs to get done? What's your view on the current situation? >> So, I'm an attorney, so my views are perhaps a bit more conservative than some of the technologists you might speak with and some of my clients as well. I think that regulation, as a general matter, can be a good thing. And it can be quite necessary. The issues that we see right now, with regard to regulation, I think one of the hottest issues today is with respect to data laws and data privacy laws. And that's obviously something that I think everyone is familiar with. I mean, take a look at the United States alone. We've seen the city of Baltimore dealing with breaches. We've seen other parts of the government, from the Federal level all the way down to municipalities, dealing with breaches and cyber attacks. We've seen data breaches from banks, Capital One, right? I believe Dunkin' Donuts suffered a breach. Equifax, and then at the same time we've also seen individuals up in arms over companies like 23andMe and Facebook, and how data is used and processed. So data seems to be a very, very hot-button issue today across the board. So something that we're really thinking about now is, first of all, with respect to the regulatory climate, how to deal with it, not only in the United States, but on a global level, because, when we talk about technology and the internet, right, we're in an era of globalization.
We're in an era where a lot of these things go across borders and therefore we have to be mindful of the regulatory regimes in other places. So, I'll give you an example. You might be familiar with the GDPR. So the GDPR is in the European Union. It's been in effect now for the last year and a half, but it affects all my U.S. clients. We still have to take a look at the GDPR because at the end of the day my clients, my firm, might be dealing with foreign companies, foreign individuals, companies that have some sort of nexus in the European Union, et cetera. So because of that, even though the GDPR is a set of regulations specific to the European Union, it becomes extremely important in the context of the United States and globally. At the same time, the GDPR has certain issues that then end up conflicting often times with some of the regulations that we have here in the United States. So, for example, the right to be forgotten is perhaps the most famous clause or part of the GDPR, and the right to be forgotten is this concept in the GDPR that an individual can have information erased about him or her in order to protect his or her privacy. The problem is that, from a technical perspective, first of all, it's an issue because it becomes very, very difficult to figure out where data is stored, if you're using third-party processors, et cetera. But from a regulatory perspective, the conflict comes in when you take a look at certain U.S. laws. So take a look for example at banking regulations in the United States. Banks have to hold some types of data for seven years and other types of data they can never delete. Right? Lawyers. I am licensed by the New York State Bar Association. Lawyers have their own rules and regulations with regard to how they store data and how they store information. HIPAA, medical records. So, you see these conflicts and there are ways to deal with them appropriately, but it becomes some food for thought. >> So it's complicated. >> It's really complicated. >> There's a lot of conflicts. >> Yeah. >> First of all, I talked to a storage guy. He's like, data? I don't even know which drive that's on. Storage is not elevated up to the level of state-of-the-art, from a tracking standpoint. So, just on the business logic side, it's complicated. I can't imagine that. So, I guess my question to you is that, are you finding that the jurisdictional issue is the biggest problem, in terms of cross-border and the business side, or is it the technical underpinnings with GDPR that's the problem, or both? What's your-- >> I mean it's both, right? There are a lot of issues. You're right, it's very complicated. I mean, in the United States we don't have some sort of overarching federal law. There's no cyber protection law in the United States. There's no overarching data protection law. So, even in the U.S. alone, because of federalism, we have HIPAA and we have COPPA which protects children and we have other types of acts, but then we also have state regulations. So, in California you have the California Consumer Privacy Act. In New York you have certain regulations with regard to cyber security and you have to deal with this patchwork. So, that becomes something that adds a new layer of complexity and a new layer of issues, as we take a look, even within the U.S. alone, as to how to deal with all of this. And then we start looking at the GDPR and all of this. From a technical perspective. I'm not a technologist, but. >> Katya, let me ask you a question on the (mumbles) and business front.
(mumbles) I think one of the things. I'm saying it may or may not be an issue, but I want to get your legal weigh-in on this. >> Katya: Sure. >> It used to be when you started a company, you go to Delaware, very friendly, domicile in Delaware, do some formation there, whether you're a C corp or whatever, that's where we tend to go, raise some money, get some preferred stock, you're in business. >> Is there a shift in where companies will domicile their entity, or restructure their companies around this complexity? Because, there's two schools of thought. This brute force act, everything coming at you, or you restructure your corporate formation to handle some of the nuances, whether it's I have a Cayman or a Bermuda... whatever's going on in the regulatory regime, whether it's innovative or not. Are people thinking like that? Or, what's your take on it? What's some of the data you're seeing from the field around restructuring around the problem? >> So, with respect to restructuring, specifically around data laws and data protection laws, I'm not seeing too much of that, simply because of the fact that regulations like the GDPR are just so all-encompassing. With respect to companies setting up in Delaware as opposed to other jurisdictions, those are usually based on two issues, right, two core ones, if I can condense it. One has to do with the court system and how favorable a court system is to the corporation, and the second is taxes. So, a lot of times when you see companies that are doing all of this restructuring, where they're setting up in offshore zones, et cetera, it's usually because of some sort of a tax benefit. It might be because of the fact that, I don't know, for example, intellectual property. If you have a company that's been licensing IP to the United States, there's a 30% withholding tax when royalties are paid back overseas. So a lot of times when you're looking at an international structuring, you're trying to figure out a jurisdiction that might have a tax treaty with the United States, that will create some sort of an opportunity to get rid of that 30% withholding. So, that's where things usually come into play with regard to taxes and IP. I haven't seen it yet, on the side of looking for courts that are more favorable to companies with respect to data privacy and data protection. I just haven't seen that happen yet because I think that it's too soon. >> How do companies defend themselves against claims that come out of these new regulations? I mean GDPR, I've called it the shitstorm when it came out. I never was a big fan of it. It just didn't. I mean, I get the concept, but I kind of understood the technical issues, but let's just say that you're a small growing business and you don't have the army of lawyers, or if someone makes a claim on you, I have to defend it. How are companies defending themselves? Do they just shut down? Do they hire you guys? I mean, obviously lawyers need to be involved. But, at some point there's a line of where having a U.S. company and someone consumes my media in Germany and it says, hey, I'm a German citizen. You American company, delete my records. How does that work? Do I have to be responsible for that? I mean, what's? >> So, it's really a case-by-case basis. First of all, obviously, with regard to what I was talking about earlier, with respect to the fact that there are certain regulations in the U.S. that conflict with GDPR and the right to be forgotten.
If you can actually assert a defense and sort of a good reason for why you have to maintain that information, that's step one. Step two, if it's some complaint that you received, is to delete the person's information. There's an easier way to do it. >> Yeah, just do what they want. >> Just comply with what they want. If somebody wants to be off of a mailing list, take them off the mailing list. The third is putting in best practices. So, I'm sure a lot of things that people see online, it's always great to go ahead and obtain legal counsel, even if you're consulting with a lawyer just for an hour or two, just to really understand your particular situation. But, take a look at privacy policies online. Take a look at the fact that cookies now have a pop-up whenever you go to a website. I'm sure you've noticed this, right? >> John: Yeah. >> So, there are little things like this. Think about the fact that there are what is known as clickwrap agreements. So, usually you have to consent. You have to check a box or uncheck a box with respect to reading privacy policies, being approved for having your email address and contact information somewhere. So, use some common sense. >> So, basically don't ignore the prompt. >> Don't ignore the problem. >> Don't ignore it. Don't stick your head in the sand. It'll bite you. >> Correct. And the thing is, to be honest, for most people, for most small companies, it's not that difficult to comply. When we start talking about mid-size and large businesses, the next level, the next step, obviously beyond hiring attorneys and the like, is to try to comply with standards and certifications. For example, there's what is known as ISO standards. Your company can go through the ISO 27001 certification process. I think it costs approximately $20,000. But, it's an opportunity to go ahead, go through that process, understand how compliant you are, and because you have the certification, you're then able to go to your customers and say, hey, we've been through this, we're certified. >> Yeah. Well, I want to get, Katya, your thoughts, as we wrap up on this segment, around Crypto and Blockchain. Obviously, we're bullish on Blockchain. We think this is a supply chain. (mumbles) Blockchain can be a good force, although some think there's some work that needs to be done on the whole energy side of it, which is, we would agree. But, still. I'm not going to make that be a wet blanket of excitement. But cryptocurrency has been fraudulent. It's been. The SEC's been cracking down in the U.S., in the news. Libra's falling apart, although, I called that separately, but, (laughing) it had nothing to do with that Libra. It was more of Facebook, but. Telegram. We were talking about that, others. People are getting handcuffed on this stuff. They're really kind of clamping down. But, overseas in Asia, it's still an unregulated, seems to be (mumbles) kind of market. Your advice to clients was to shy away, be careful? >> My advice to clients is as follows. First of all, Blockchain and cryptocurrency are not the same thing. Right? Cryptocurrency is a use case coming out of Blockchain technology. I think that in the United States, the best way to think about it is to understand that the term cryptocurrency, from a regulatory perspective, is actually a misnomer. It's not a currency. It's property. Right? It's an asset. It's digital assets.
So, if you think about it the same way that we think of shares in a company, it's actually much easier to become compliant, because then you can understand that it's going to be subject to U.S. securities laws, just like other securities. It's going to be taxed, just like securities are taxed, which means that it's going to be subject to long- and short-term capital gains, and it's also going to be subject to the other regulatory restrictions that are inherent to securities, both on the federal and state level. >> It's interesting that you mentioned security. The word security. If you look back at the ICO craze, initial coin offerings, crypto offerings, whatever you call it, the people who got whacked the most were the ones that went out as utility tokens. Not to get nerdy on this, but utility and security are two types of tokens. The ones that went out and raised money as the utility token had no product, raised money using the utility that doesn't exist. That's essentially a security. And, so, no wonder why they're getting slapped. >> They're securities. Look, Bitcoin, different story, because Bitcoin is the closest to being, I guess, what we could consider to be truly decentralized, right? And the regulatory climate around Bitcoin is a little bit different from what I'm talking about, with respect to securities laws. Although, from a tax perspective, it's the same. It's taxed as property. It's not taxed the way that foreign currency is taxed. But ultimately, yeah. You had a lot of cowboys who went out, and made a lot of money, and were just breaking the law, and now everyone is shocked when they see what's going on with this cease-and-desist order from the SEC against Telegram, and these other issues. But, none of it is particularly surprising because at the end of the day we have regulations in place, we have a regulatory regime, and most people just chose to ignore it. >> It's interesting how fast the SEC modernized their thinking around this. They really. From a speed standpoint, all government agencies tend to be glacier speed kind of movement. They were pretty fast. I mean, they kind of huddled on this for a couple months and came out with direction. They've been proactive. I got to say, I'm usually skeptical of most government organizations. I don't think they're well informed. In this case, I think the SEC did a good job. >> So, I think that the issue is as follows. You know, Crypto is a very, very, very small portion of what the SEC deals with, so they actually paid an inordinate amount of attention to this, and I think that they did it for a couple of reasons. One is because, you asked me in the beginning of this interview about regulations versus innovation. And, I don't think anyone wants to stifle innovation in America. It's a very interesting technology. It's very interesting ideas, right? No one wants that to go away and no one wants people to stop experimenting and stop dreaming bigger. At the same time, the other issue that we've seen now, especially, not only with the SEC, but with the IRS now getting involved, is the fact that even though this is something very, very small, they are very concerned about where the technology could go in the future. The IRS is extremely concerned about erosion of the tax base. So, because of that, it makes a lot of sense for them to pay attention to this very, very early on, nip this in the bud, and help guide it back in the right direction. >> I think that's a good balance. Great point.
Innovation doesn't want to be stifled at all, absolutely. What's new and exciting for you? Share some personal or business updates in your world. What's going on? What's getting you excited these days, in the field? >> What's getting me excited these days? Well, I have to tell you that one thing that actually has gotten me excited these days is the fact that the Blockchain and cryptocurrency industries have grown up, substantially. And, now we're able to take a look at those industries in tandem with the tech industry at large, because they seem to sort of be going off in a different direction, and now we're taking a look at it, and now you can really see sort of where the areas are that things are going to get exciting. I look at my clients and I see the things that they're doing and I'm always excited for them, and I'm always interested to see what new things they'll innovate, because, again, I'm not a technologist. So, for me, that's a lot of fun. And, in addition to that, I think that other areas are extremely exciting as well. I'm a big fan of Acronis. I'm a big fan of cyber protection issues, data protection, data regulation. I think something that's really interesting in the world of data regulation, that actually has come out of the Blockchain community, in a way, is the notion of data as a personal right, as personal property. So, one of the big things is the idea that now that we've seen these massive data breaches with Facebook and 23andMe, and the way that big government, big companies, are using individuals' data, the idea that if data were to be personal property, it would be used very, very differently. And technologists who are using Blockchain technology say that Blockchain technology might actually be able to make that happen. Because if you could have a decentralized Facebook, let's say, people could own their own data and then use that data as they want to and be compensated for it. So, that's really interesting, right-- >> Yeah, but, if you're just going to use the product, they might as well own their data, right? >> Katya: Exactly. >> Katya, thanks for coming on theCUBE. Thanks for the insight. Great, compelling narrative. Thanks for sharing. >> Sure, thank you very much. >> Appreciate it. I'm John Furrier here on theCUBE, Miami Beach, at the Fontainebleau Hotel for Acronis' Global Cyber Summit 2019. We'll be back with more coverage after this short break.
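Katya's point about the right to be forgotten colliding with U.S. retention rules (banking records, HIPAA, bar record-keeping) boils down to a triage step before anything gets deleted. The Python sketch below illustrates one way a compliance team might encode that triage; the record categories, retention periods, and field names are illustrative assumptions only, not legal guidance and not anything discussed in the interview.

```python
from datetime import date

# Illustrative retention obligations; real rules depend on jurisdiction and counsel.
# keep_years=None means "retain indefinitely" for this sketch.
RETENTION_RULES = {
    "banking_transaction": {"keep_years": 7, "basis": "banking record-retention rules"},
    "medical_record": {"keep_years": None, "basis": "HIPAA retention requirements"},
    "legal_file": {"keep_years": None, "basis": "bar record-keeping rules"},
}

def handle_erasure_request(records, today=None):
    """Split a data subject's records into erasable vs. must-retain,
    recording the basis we'd cite back for any refusal or deferral."""
    today = today or date.today()
    erase, retain = [], []
    for rec in records:
        rule = RETENTION_RULES.get(rec["category"])
        if rule is None:
            erase.append(rec)  # no known conflicting obligation: honor the request
        elif rule["keep_years"] is None:
            retain.append((rec, rule["basis"]))
        else:
            age_years = (today - rec["created"]).days / 365.25
            if age_years >= rule["keep_years"]:
                erase.append(rec)          # retention window has lapsed
            else:
                retain.append((rec, rule["basis"]))
    return erase, retain

if __name__ == "__main__":
    sample = [
        {"id": 1, "category": "marketing_email", "created": date(2018, 3, 1)},
        {"id": 2, "category": "banking_transaction", "created": date(2016, 6, 15)},
    ]
    to_erase, to_retain = handle_erasure_request(sample, today=date(2019, 10, 15))
    print("erase:", [r["id"] for r in to_erase])
    print("retain:", [(r["id"], basis) for r, basis in to_retain])
```

In this toy run the marketing record is erasable, while the banking record is retained with the basis that would be cited back to the requester.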
Russell L. Jones, Deloitte | RSA 2019
>> Live from San Francisco, it's theCUBE! Covering the RSA Conference 2019. Brought to you by ForeScout. >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at RSA at Moscone at downtown San Francisco. We're in the ForeScout booth, our first time in the ForeScout booth, we're really excited to be here and we're talking about cyber security, I don't know what the official number is this year, probably 45 thousand professionals walkin' around, talkin' about security. And we've got our next guest on, he is Russell Jones, partner on cyber risk services for Deloitte. Russell, great to meet you! >> Same to meet you as well. >> So, I asked him before we turned on, what's getting you excited these days and he said, everything! So, this is a crazy busy space. What have you been working on lately, what's kind of your takeaway from the first couple days at the show? >> Yeah, it is a crazy, busy space and if you look at the cyber landscape, everything's moving at the speed of the internet, so it's this cat and mouse game in terms of attackers trying to find new ways to get into systems that is driving the industry. When you talk about health care though, the issue is these systems, like medical devices, often times are connected to people. >> Right. >> And so, the implications of a hack against, let's say, an MRI machine or an infusion pump, could be devastating to an actual person connected to it. And that's really what's driving a lot of innovation in terms of some of the technologies you see, like ForeScout, and also, a lot of what's going on from a regulatory perspective, and also the hospitals and the health care system themselves. >> Right. >> Trying to solve that problem, managing cyber risk as it relates to clinical technology. >> And a lot of that stuff wasn't connected before, right? There weren't IP addresses on every MRI machine or all these pump machines or, you know, you have a pacemaker, all these things. How are they looking at kind of the risk reward from a connected device that gives you all kinds of benefits-- >> Yeah. >> but it does open up this attack surface that previously had maybe an air gap there? >> That's a great point, bottom line is the life-saving, life-extending attributes of these medical technologies and medical devices far outweigh the risk of cyber, however, we got to be smart about managing that risk. So, we're going to see more connectivity, not less. Train's left the station, in terms of what's coming and in the future of healthcare, connecting more of, not only the medical devices, but the information in them and being able to share that and then bring it together and aggregate it in ways that, you know, with analytics on top of it allows doctors and researchers in the clinical community to connect dots in ways that solve cancer, solve some different maladies that have plagued us forever. >> Right. >> So I think, on the one hand, it's great, this connectivity is extending healthcare out to people in rural locations and it's also bringing together a lot of different data from everything from your Fitbit to your pacemaker to apps that you have on your phone in a way that's going to benefit us. >> Right, right, so, one of the things about healthcare is they're way out in front of, kind of, non-healthcare in terms of regulations. >> Yeah.
>> You know, and HIPAA's been around for a long time, GDPR just went into place in Europe last year, so when you look at it from a regulatory environment, which people have to consider, there's not only the complexity of the machines, there's not only the complexity of the security, but you also have the regulatory environment. >> Yeah. >> How is the cyber security in healthcare, with its very unique regulations, kind of impacting the way people should think about the problem, the way they should implement solutions? >> That's a good question, I think, that we've thought about in the cyber community forever. We talk about confidentiality, integrity, availability, right, the triangle. When you think about healthcare and clinical technology and medical devices, you need to flip that triangle upside down and the focus is integrity and availability, those things together equal patient safety. So, in other words, as we're connecting more of these devices to each other, to electronic health record systems, to the cloud, the integrity of the information in there, which is being used by doctors and other folks to make decisions about treatment, about surgical procedures, about medicines, it's crucial that that information and the integrity of it is maintained. And then the availability of the device is critical, right? If you're going in to get an MRI and it's down because it's been hacked, there's usually not a spare MRI and so there's a profound impact for patients that are scheduled back to back to back to back to go get that procedure, that MRI that's going to be used by a doctor to do some surgery or some other kind of a treatment plan. >> Right. >> So integrity and availability are huge in the cyber world. And, if you look at the regulations, depending on which one we're talking about and which part of the world, right? You mentioned HIPAA, we've got security and privacy, you've got GDPR, you've got the FDA that have guidance around what they want the manufacturers to do, building security into the devices. >> Right. >> They all have an impact on cyber and how it's going to be addressed, how we're going to manage cyber risk in the healthcare world. >> Right. >> In that environment. >> And then there's this whole new thing, I went to the Wall Street Journal Health Conference a couple weeks back, I don't know if you were there, but there were two people up where now you can take your genetic footprint, right? >> Yeah. >> You can take your 23andMe results and after you figure out where your family's from, you can actually sell it back into a research market-- >> Yeah. >> so that doctors and clinicians and people doing trials on new drugs can now take your data in kind of a marketplace, back into a whole nother application so it's kind of outside of the core healthcare system, if you will. >> That's right. >> But I mean, it's basically, it's me, right? (laughs) In the form of my DNA footprint. >> Yup. >> It's crazy, crazy amounts of strange data that now is potentially exposed to a hack. >> That's right, and so the implications there, obviously, privacy, right? That's a huge issue, I think, that we're going to have to address and that's why you see GDPR and that's why you see the California Consumer Privacy Act. >> Right. >> There's a recognition that, again, the train's left the station, there's a lot of good things that come out of sharing data and sharing information, there's a lot of benefits that can come out of it for the consumers, patients. There's a dark side as well and that has to be managed. 
That's why we have the privacy regulations that we have, we're probably going to see more, probably going to see more things like the California Consumer Privacy Act. >> Right. >> More states and eventually-- >> Right. >> probably a federal act for the US. >> Do you think that the healthcare industry is better equipped to deal with GDPR and the California Healthcare Act because of things like HIPAA and they kind of come from that world? Or is this just a whole new level of regulation that they now have to account for? >> I think it's probably a mixed bag. On the one hand, healthcare has been dealing with privacy for a long time, even before HIPAA, right. And then HIPAA has very specific requirements around how you have to manage that information and consent and notifying the patient of their rights. On the other hand, you look at some of the new things, like GDPR, it goes way beyond HIPAA, and I think-- >> It goes way beyond HIPAA? >> It goes way beyond HIPAA, like for example, this whole notion of the right to be forgotten. >> Right. >> Right, that's a requirement in the GDPR. That means, me as a patient, if I tell my doctor, I want you to get rid of all my medical records, everything in your system everywhere about me, I want it gone. Not that it makes sense-- >> Right, right. >> but, at least in Europe, if they ask to do that, you have to be able to comply. From a technology perspective and a medical device perspective, some of these devices are very complex, an ecosystem of devices and components that make up the product. >> Right. >> That's a very difficult thing to do. There's no one delete button-- >> Right. >> that you hit that can delete you from all different instances, downstream from where you came into the healthcare system. >> Right. >> And so, when you think about it from a cyber perspective, it gets to be very challenging. >> The other thing, right, is health care's always under tremendous kind of price pressure from the insurers and the consumers, and a bad medical event can wipe-- >> Yeah. >> people out, right? >> Yeah. >> Especially when they're later in life and they're not properly insured. When they're making kind of an ROI analysis on cyber investments versus all the other things they can spend their money on, and they can't spend it all on security, that's not possible, how are they factoring in kind of the cyber investment, it's kind of this new layer of investment that they have to make because all these things are connected, versus just investing in better beds and better machines and better people? >> That's the million dollar question. (laughs) I would say, some hospitals and health systems are doing it better than others, so maybe a little bit further along and mature about thinking about the total cost of ownership and also, the patient factor, right? What has to be balanced, obviously, is not just the costs, but at the end of the day, what's best for the patient. And you hear this term, patient centricity, a lot today. And there's a recognition from all the players in the ecosystem, it's all about the patient. >> I'm so glad you say that 'cause I think a lot of people probably think that the patient sometimes gets lost in this whole thing, but you're saying no. 
>> There is an acknowledgement over the last few years and it's called patient centricity, it's an acknowledgement that the way we're going into the future of healthcare and the kinds of medical devices and technology and cloud solutions that are becoming part of the healthcare fabric, they're all being built and geared towards the patient being the center of the equation, not the doctor, not the hospital, it's the patient. >> Right, right, right, that's good to hear. >> And so, to answer your original question, we're in early days and really trying to balance the patient and patient centricity versus we've got vulnerabilities in our environment that could impact the patient and we've only got limited people and costs. >> Right, right. >> Making decisions that kind of balance all of those things. >> Right, alright Russell, last question, we're sitting here in the ForeScout booth. >> Yes. >> Obviously you have a relationship with them, talk about kind of what their solution adds to some of the stuff that you're workin' on. >> So, ForeScout, one of the reasons that we're working closely with ForeScout, their solution, really, they've taken an approach that's holistic around these issues that we're talking about, right, managing cyber risk, complex environment, a lot of different devices that are connected to each other and to the cloud and to the internet. They have built a solution that focuses on the ability to have visibility into those devices that are on your network, some of which you may not even know exist, and then being able to kind of build an asset inventory around that visibility that allows you to do things like detect, based on policy, activity that suggests that you might be hacked or there might be some internal processes or players that are doing things that are going to put patients at risk or have you in non-compliance with GDPR, HIPAA and the rest. >> Right. >> And then their solution goes beyond the ability to kind of have visibility and detect, to actually do something actionable, right? Security controls and orchestration with other technologies, like SIEM solutions and SOAR solutions. Being able to orchestrate, hey, I know that I detected some activity on this infusion pump that suggests that we may be getting hacked, let me send an alert out, but then let me also, maybe, quarantine that part of the network. So, it's the ability to orchestrate between different security technologies that exist in a hospital environment, that's what we like about ForeScout. >> I'm just curious, when they run their first kind of crawl, if you will-- >> Yeah. >> are people surprised at the results of what's on there, that they had no clue? >> I mean, yes and no. >> Yes and no, okay. >> I think, most of the big hospitals that we work with, they know what they don't know, and especially when--
>> super important work, you know, it's one thing to steal a few bucks out of the bank account, like you said. >> Yeah. >> It's another thing to start taking down machines at the hospital, not a good thing. >> Not a good thing. >> Alright >> Thank you. >> He's Russell, I'm Jeff, you're watchin' theCUBE, we're at RSA in Moscone in the ForeScout booth, thanks for watching, we'll see you next time. (techno music)
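Russell's description of the approach ForeScout takes — build an asset inventory, detect policy violations, then orchestrate an alert and a network quarantine — maps to a simple control loop. The Python sketch below illustrates that loop under stated assumptions; the device model, the expected-port policy, and the alert and quarantine hooks are hypothetical stand-ins for illustration only, not ForeScout's actual API or any product's real behavior.

```python
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    device_type: str        # e.g. "infusion_pump", "mri"
    network_segment: str
    observed_ports: set     # ports the device has been seen communicating on

# Hypothetical policy: ports we expect each device type to use; anything else is suspicious.
EXPECTED_PORTS = {
    "infusion_pump": {443},
    "mri": {104, 443},      # DICOM + TLS, as an illustrative baseline
}

def check_device(device):
    """Return the set of unexpected ports observed on the device."""
    allowed = EXPECTED_PORTS.get(device.device_type, set())
    return device.observed_ports - allowed

def send_alert(device, findings):
    # Stand-in for forwarding the finding to a SIEM or the SOC.
    print(f"ALERT {device.device_id}: unexpected activity on ports {sorted(findings)}")

def quarantine(device):
    # Stand-in for a network-control action that isolates the segment.
    print(f"QUARANTINE: isolating segment {device.network_segment} for {device.device_id}")

def orchestrate(inventory):
    """Detect policy violations across the asset inventory and respond."""
    for device in inventory:
        findings = check_device(device)
        if findings:
            send_alert(device, findings)   # notify first
            quarantine(device)             # then contain while it's investigated

if __name__ == "__main__":
    orchestrate([
        Device("pump-017", "infusion_pump", "icu-vlan-12", {443, 23}),
        Device("mri-02", "mri", "radiology-vlan-3", {104, 443}),
    ])
```

Here the infusion pump talking on an unexpected port triggers both the alert and the quarantine, while the MRI that matches its baseline is left alone — the availability concern Russell raises is exactly why containment, not shutdown, is the modeled response.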
Cortnie Abercrombie & Carl Gerber | MIT CDOIQ 2018
>> Live from the MIT campus in Cambridge, Massachusetts, it's theCUBE, covering the 12th Annual MIT Chief Data Officer and Information Quality Symposium. Brought to you by SiliconANGLE Media. >> Welcome back to theCUBE's coverage of MIT CDOIQ here in Cambridge, Massachusetts. I'm your host Rebecca Knight along with my cohost Peter Burris. We have two guests on this segment. We have Cortnie Abercrombie, she is the founder of the nonprofit AI Truth, and Carl Gerber, who is the managing partner at Global Data Analytics Leaders. Thanks so much for coming on theCUBE, Cortnie and Carl. >> Thank you. >> Thank you. >> So I want to start by just having you introduce yourselves to our viewers, what you do. So tell us a little bit about AI Truth, Cortnie. >> So this was born out of a passion. As I, the last gig I had at IBM, everybody knows me for chief data officer and what I did with that, but the more recent role that I had was developing custom offerings for the Fortune 500 in the AI solutions area, so as I would go meet and see different clients, and talk with them and start to look at different processes for how you implement AI solutions, it became very clear that not everybody is attuned, just because they're the ones funding the project or even initiating the purpose of the project, the business leaders don't necessarily know how these things work or run or what can go wrong with them. And on the flip side of that, we have very ambitious up-and-comer-type data scientists who are just trying to fulfill the mission, you know, the talent at hand, and they get really swept up in it. To the point where you can even see that data's getting bartered back and forth without any real governance over it or policies in place to say, "Hey, is that right? Should we have gotten that kind of information?" Which leads us into things like the creepy factor. Like, you know, Target (laughs) and some of these cases that are well-known. And so, as I saw some of these mistakes happening that were costing brand reputation, or return on investment, or possibly even creating opportunities for risk for the companies and for the business leaders, I felt like someone's got to take one for the team here and go out and start educating people on how this stuff actually works, what the issues can be and how to prevent those issues, and then also what do you do when things do go wrong, how do you fix it? So that's the mission of AI Truth and I have a book. Yes, power to the people, but you know really my main concern was concerned individuals, because I think we've all been affected when we've sent an email and all of a sudden we get a weird ad, and we're like, "Hey, what, they should not, is somebody reading my email?" You know, and we feel this, just, offense-- >> And the answer is yes. >> Yes, and they are, they are. So I mean, we, but we need to know because the only way we can empower ourselves to do something is to actually know how it works. So, that's what my mission is to try and do. So, for the concerned individuals out there, I am writing a book to kind of encapsulate all the experiences that I had so people know where to look and what they can actually do, because you'll be less fearful if you know, "Hey, I can download DuckDuckGo for my browser, or my search engine I mean, and Epic for my browser, and some private, you know, private offerings instead of the typical free offerings." There's not an answer for Facebook yet though. >> So, (laughs) we'll get there. 
Carl, tell us a little bit about Global Data Analytics Leaders. >> So, I launched Analytics Leaders and CDO Coach after a long career in corporate America. I started building an executive information system when I was in the military for a four-star commander, and I've really done a lot in data analytics throughout my career. Most recently, starting a CDO function at two large multinational companies and leading global transformation programs. And, what I've experienced is even though the industries may vary a little bit, the challenges are the same and the patterns of behavior are the same, both the good and bad behavior, bad habits around the data. And, through the course of my career, I've developed these frameworks and playbooks and just ways to get a repeatable outcome and bring these new technologies like machine learning to bear to really overcome the challenges that I've seen. And what I've seen is a lot of the current thinking is we're solving these data management problems manually. You know, we all hear the complaints about the people who are analysts and data scientists spending 70, 80% of their time being a data gatherer and not really generating insight from the data itself and making it actionable. Well, that's why we have computer systems, right? But that large-scale technology and automation hasn't really served us well, because we think in silos, right? We fund these projects based on departments and divisions. We acquire companies through mergers and acquisitions. And the CDO role has emerged because we need to think about this, all the data that an enterprise uses, horizontally. And with that, I bring a high degree of automation, things like machine learning, to solve those problems. So, I'm now bottling that and advising my clients. And at the same time, the CDO role is where the CIO role was 20 years ago. We're really in its infancy, and so you see companies define it differently, have different expectations. People are filling the roles that may have not done this before, and so I provide the coaching services there. It's like a professional golfer who has a swing coach. So I come in and I help the data executives with upping their game. >> Well, it's interesting, I'd actually say the CIO role was 40 years ago. But, here's why. If we look back in the 1970s, hardcore financial systems were made possible by the technology which allowed us to run businesses like a portfolio: Jack Welch, the GE model. That was not possible if you didn't have a common asset management system, if you didn't have a common cash management system, etc. And so, when we started creating those common systems, we needed someone that could describe how that shared asset was going to be used within the organization. And we went from the DP manager in HR, the DP manager within finance, to the CIO. And in many respects, we're doing the same thing, right? We're talking about data in a lot of different places and now the business is saying, "We can bring this data together in new and interesting ways into more of a shared asset, and we need someone that can help administer that process, and you know, navigate between different groups and different needs and whatnot." Is that kind of what you guys are seeing? >> Oh yeah. >> Yeah. >> Well you know once I get to talking (laughs). For me, I keep going right back to the newer technologies like AI and IoT that are coming from externally into your organization, and then also the fact that we're seeing bartering of data at an unprecedented level. 
And yet, what the chief data officer role originally did was look at data internally, and structured data mostly. But now, we're asking them to step out of their comfort zone and start looking at all these unknown, niche data broker firms that may or may not be ethical in how they're... I mean, I... look, I tell people, "If you hear the word scrape, you run." No scraping, we don't want scraped data, no, no, no (laugh). But I mean, but that's what we're talking about-- >> Well, what do you mean by scraped data, 'cause that's important? >> Well, this is a well-known data science practice. And it's not that... nobody's being malicious here, nobody's trying to have malintent, but I think it's just data scientists are just scruffy, they roll up their sleeves and they get data however they can. And so, the practice emerged. Look, they're built off of open-source software and everything's free, right, for them, for the most part? So they just start reading in screens and things that are available that you could see, they can optical character read it in, or they can do it however without having to have a subscription to any of that data, without having to have permission to any of that data. It's, "I can see it, so it's mine." But you know, that doesn't work in candy stores. We can't just go, or jewelry stores in my case, I mean, you can't just say, "I like that diamond earring, or whatever, I'm just going to take it because I can see it." (laughs) So, I mean, yeah we got to... that's scraping though. >> And the implications of that are suddenly now you've got a great new business initiative and somebody finds out that you used their private data in that initiative, and now they've got a claim on that asset. >> Right. And this is where things start to get super hairy, and you just want to make sure that you're being on the up-and-up with your data practices and your data ethics, because, in my opinion, 90% of what's gone wrong in AI or the fear factor of AI is that your privacy's getting violated and then you're labeled with data that you may or may not know even exists half the time. I mean. >> So, what's the answer? I mean as you were talking about, these data scientists are scrappy, scruffy, roll-up-your-sleeves kind of people, and they are coming up with new ideas, new innovations that sometimes are good-- >> Oh yes, they are. >> So what, so what is the answer? Is this a code of ethics? Is it a... sort of similar to a Hippocratic Oath? I mean how would you, what do you think? >> So, it's a multidimensional problem. Cortnie and I were talking earlier that you have to have more transparency into the models you're creating, and that means a significant validation process. And that's where the chief data officer partners with folks in risk and other areas and the data science team around getting more transparency and visibility into what's the data that's feeding into it? Is it really the authoritative data of the company? And as Cortnie points out, do we even have the rights to that data that's feeding our models? And so, by bringing that transparency and a little more validation before you actually start making key, bet-the-business decisions on the outcomes of these models, you need to look at how you're vetting them. >> And the vetting process is part technology, part culture, part process, it goes back to that people-process-technology thing. >> Yeah, absolutely, know where your data came from. Why are you doing this model? 
What are you going to do with the outcomes? Are you actually going to do something with it or are you going to ignore it? Under what conditions will you empower a decision-maker to use the information that is the output of the model? A lot of these things, you have to think through when you want to operationalize it. It's not just, "I'm going to go get a bunch of data wherever I can, I put a model together. Here, don't you like the results?" >> But this is the Silicon Valley way, right? An MVP for everything and you just let it run until... you can't. >> That's a great point, Cortnie (laughs). I've always believed, and I want to test this with you, we talk about people, process, technology about information, we never talk about people, process, technology and information of information. In a manner of respects, what we're talking about is making explicit the information about... information, the metadata, and how we manage that and how we treat that, and how we diffuse that, and how we turn that, the metadata itself, into models to try to govern and guide utilization of this. That's especially important in the AI world, isn't it? >> I start with this. For me, it's simple, I mean, but everything he said was true. But, I try to keep it to this: it's about free will. If I said you can do that with my data, to me it's always my data. I don't care if it's on Facebook, I don't care where it is and I don't care if it's free or not, it's still my data. Even if it's 23andMe and they've taken the swab, or whether it's Facebook or I did a Google search, I don't care, it's still my data. So if you ask me if it's okay to do a certain type of thing, then maybe I will consent to that. But I should at least be given an option. And also, be given the transparency. So it's all about free will. So in my mind, as long as you're always providing some sort of free will (laughs), the ability for me to have a decision to say, "Yes, I want to participate in that," or, "Yes, you can label me as whatever label I'm getting, Trump or a pro-Hillary or Obama-whatever, name whatever the issue of the day is," then I'm okay with that as long as I get a choice. >> Let's go back to it, I want to build on that if I can, because, and then I want to ask you a question about it Carl, the issue of free will presupposes that both sides know exactly what's going into the data. So for example, if I have a medical procedure, I can sit down on that form and I can say, "Whatever happens is my responsibility." But if bad things happen because of malfeasance, guess what? That piece of paper's worthless and I can sue. Because the doctor and the medical provider are supposed to know more about what's going on than I do. >> Right. >> Does the same thing exist? You talked earlier about governance and some of the culture imperatives and transparency, doesn't that same thing exist? And I'm going to ask you a question: is that part of your nonprofit, to try to raise the bar for everybody? But doesn't that same notion exist, that at the end of the day, you don't... You do have information asymmetries, both sides don't know how the data's being used because of the nature of data? >> Right. That's why you're seeing the emergence of all these data privacy laws. And so what I'm advising executives and the board and my clients is we need to step back and think bigger about this. We need to think about it as not just GDPR, the European scope; it's global data privacy. And if we look at the motivation, why are we doing this? 
Are we doing it just because we have to be regulatory-compliant 'cause there's a law on the books, or should we reframe it and say, "This is really about the user experience, the customer experience." This is a touchpoint that my customers have with my company. How transparent should I be with what data I have about you, how I'm using it, how I'm sharing it, and is there a way that I can turn this into a positive instead of just, "I'm doing this because I have to for regulatory compliance." And so, I believe if you really examine the motivation and look at it from more of the carrot and less of the stick, you're going to find that you're more motivated to do it, you're going to be more transparent with your customers, and you're going to share, and you're ultimately going to protect that data more closely because you want to build that trust with your customers. And then lastly, let's face it, this is the data we want to analyze, right? This is the authenticated data we want to give to the data scientists, so I just flip that whole thing on its head. We do it for these reasons and we increase the transparency and trust. >> So Cortnie, let me bring it back to you. >> Okay. >> That presupposes, again, an up-leveling of knowledge about data privacy not just for the executive but also for the consumer. How are you going to do that? >> Personally, I'm going to come back to free will again, and I'm also going to add: harm impacts. We need to start thinking impact assessments instead of governance, quite frankly. We need to start looking at, if I, you know, start using a FICO score as a proxy for another piece of information, like a crime record in a certain district or whatever, as a way to understand how responsible you are and whether or not your car is going to get broken into, and now you have to pay more. Well, you're... if you always use a FICO score, for example, as a proxy for responsibility, which, let's face it, once a data scientist latches onto something, they share it with everybody 'cause that's how they are, right? They love that and I love that about them, quite frankly. But, what I don't like is it propagates, and then before you know it, the people who are of lesser financial means, it's getting propagated because now they're going to be... Every AI pricing model is going to use FICO score as a-- >> And they're priced out of the market. >> And they're priced out of the market and how is that fair? And there's a whole group, I think you know about the Fairness, Accountability, and Transparency group that, you know, kind of watchdogs this stuff. But I think business leaders as a whole don't really think through to that level, like, "If I do this, then this, this and this could occur--" >> So what would be the one thing you could say, if corporate America's listening? >> Let's do impact. Let's do impact assessments. If you're going to cost someone their livelihood, or you're going to cost them thousands of dollars, then let's put more scrutiny, let's put more government validation. To your point, let's put some... 'cause not everything needs the nth level. Like, if I present you with a blue sweater instead of a red sweater on Google or whatever, (laughs) you know, that's not going to harm you. But it will harm you if I give you a teacher assessment that's based on something that you have no control over, and now you're fired because you've been laid off 'cause your rating was bad. >> This is a great conversation. Let me... Let me add something different, 'cause... 
Or say it a different way, and tell me if you agree. In many respects, it's: Does this practice increase inclusion or does this practice decrease inclusion? This is not some goofy, social thing, this is: Are you making your market bigger or are you making your market smaller? Because the last thing you want is that the participation by people ends with: You can't play because of some algorithmic response we had. So maybe the question of inclusion becomes a key issue. Would you agree with that? >> I do agree with it, and I still think there's levels even to inclusion. >> Of course. >> Like, you know, being a part of the blue sweater club versus the (laughs) versus, "I don't want to be a convict," you know, suddenly because of some record you found, or association with someone else. And let's just face it, a lot of these algorithmic models do do these kinds of things where they... They use n+1, you know, a lot... you know what I'm saying. And so you're associated naturally with the next person closest to you, and that's not always the right thing to do, right? So, in some ways, and so I'm positing just a little bit of a new idea here, you're creating some policies, whether you're being, and we were just talking about this, but whether you're being implicit about them or explicit, more likely you're being implicit because you're just summarily deciding. Well, okay, I have just decided in the credit score example, that if you don't have a good credit threshold... But where in your policies and your corporate policy did it ever say that people of lesser financial means should be excluded from being able to have good car insurance for... 'cause now, the same goes with like Facebook. Some people feel like they're going to have to opt out of life, I mean, if they don't-- >> (laughs) Opt out of life. >> I mean like, seriously, when you think about grandparents who are excluded, you know, out in whatever Timbuktu place they live, and all their families are somewhere else, and the only way that they get to see them is, you know, on Facebook. >> Go back to the issue you raised earlier about "Somebody read my email," I can tell you, as a person with a couple of more elderly grandparents, they inadvertently shared some information with me on Facebook about a health condition that they had. You know how grotesque the response of Facebook was to that? And it affected me too because they had my name in it. They didn't know any better. >> Sometimes there's a stigma. Sometimes things become a stigma as well. There's an emotional response. When I put the article out about why I left IBM to start this new AI Truth nonprofit, the responses I got back that were so immediate were emotional responses about how this stuff affects people. That they're scared of what this means. Can people come after my kids or my grandkids? And if you think about how genetic information can get used, you're not just hosing yourself. I mean, breast cancer genes, I believe, aren't they, like... They run through families, so, I-- >> And they're pretty well-understood. >> If someone swabs my DNA, and uses it and combines it with other data, you know, all of a sudden, not just me is affected, but my whole entire lineage, I mean... It's hard to think of that, but... it's true (laughs). >> These are real life and death... these are-- >> Not just today, but for the future. And in many respects, it's that notion of inclusion...
Going back to it, now I'm making something up, but not entirely, but going back to some of the stuff that you were talking about, Carl, the decisions we make about data today, we want to ensure that we know that there's value in the options for how we use that data in the future. So, the issue of inclusion is not just about people, but it's also about other activities, or other things that we might be able to do with data because of the nature of data. I think we always have to have an options approach to thinking about... as we make data decisions. Would you agree with that? >> Yes, because you know, data's not absolute. So, you can measure something and you can look at the data quality, you can look at the inputs to a model, whatever, but you still have to have that human element of, "Are we doing the right thing?" You know, the data should guide us in our decisions, but I don't think it's ever an absolute. It's a range of options, and we chose this option for this reason. >> Right, so are we doing the right thing and do no harm too? Carl, Cortnie, we could talk all day, this has been a really fun conversation. >> Oh yeah, and we have. (laughter) >> But we're out of time. I'm Rebecca Knight for Peter Burris, we will have more from MIT CDOIQ in just a little bit. (upbeat music)
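As a rough illustration of the opt-in, "free will" idea Cortnie describes above, here is a minimal sketch of a consent-gated data-use check. It is only a sketch: the ConsentRegistry class, the purpose strings, and the subject IDs are hypothetical placeholders for illustration, not anything the panelists or their organizations actually use.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """What a person has explicitly agreed to; empty until they opt in."""
    subject_id: str
    allowed_purposes: set = field(default_factory=set)

class ConsentRegistry:
    """Hypothetical registry: every use of 'my data' is checked against my grants."""
    def __init__(self):
        self._records = {}

    def grant(self, subject_id, purpose):
        rec = self._records.setdefault(subject_id, ConsentRecord(subject_id))
        rec.allowed_purposes.add(purpose)

    def is_permitted(self, subject_id, purpose):
        rec = self._records.get(subject_id)
        return rec is not None and purpose in rec.allowed_purposes

def use_data(registry, subject_id, purpose):
    # Fail closed: no explicit opt-in means no use.
    if not registry.is_permitted(subject_id, purpose):
        return f"blocked: {subject_id} never opted in to '{purpose}'"
    return f"ok: processing {subject_id} for '{purpose}'"

registry = ConsentRegistry()
registry.grant("subject-001", "ancestry_report")
print(use_data(registry, "subject-001", "ancestry_report"))  # ok
print(use_data(registry, "subject-001", "ad_targeting"))     # blocked
```

The design point is simply that use is blocked by default and only an explicit grant opens it, which is the choice-first default the panel argues for.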
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Rebecca Knight | PERSON | 0.99+ |
Cortnie Abercrombie | PERSON | 0.99+ |
Carl | PERSON | 0.99+ |
Cortnie | PERSON | 0.99+ |
Peter Burris | PERSON | 0.99+ |
Trump | PERSON | 0.99+ |
Carl Gerber | PERSON | 0.99+ |
Jack Welch | PERSON | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
90% | QUANTITY | 0.99+ |
Hillary | PERSON | 0.99+ |
four-star | QUANTITY | 0.99+ |
GE | ORGANIZATION | 0.99+ |
two guests | QUANTITY | 0.99+ |
1970s | DATE | 0.99+ |
Cambridge, Massachusetts | LOCATION | 0.99+ |
Silicon Valley | LOCATION | 0.99+ |
both sides | QUANTITY | 0.99+ |
ORGANIZATION | 0.99+ | |
Obam | PERSON | 0.99+ |
both | QUANTITY | 0.98+ |
SiliconANGLE Media | ORGANIZATION | 0.98+ |
40 years ago | DATE | 0.98+ |
DuckDuckGo | TITLE | 0.98+ |
thousands of dollars | QUANTITY | 0.98+ |
Timbuktu | LOCATION | 0.98+ |
America | LOCATION | 0.98+ |
theCUBE | ORGANIZATION | 0.98+ |
today | DATE | 0.98+ |
FICO | ORGANIZATION | 0.98+ |
GDPR | TITLE | 0.98+ |
MIT CDOIQ | ORGANIZATION | 0.96+ |
20 years ago | DATE | 0.95+ |
ORGANIZATION | 0.95+ | |
12th Annual MIT Chief Data Officer and Information Quality Symposium | EVENT | 0.93+ |
one | QUANTITY | 0.93+ |
AI Truth | ORGANIZATION | 0.89+ |
70, 80% | QUANTITY | 0.87+ |
MIT | ORGANIZATION | 0.87+ |
Global Data Analytics Leaders | ORGANIZATION | 0.86+ |
2018 | DATE | 0.83+ |
CDO Coach | TITLE | 0.82+ |
Hippocratic Oath | TITLE | 0.82+ |
two large multinational companies | QUANTITY | 0.79+ |
half | QUANTITY | 0.75+ |
Fairness | ORGANIZATION | 0.68+ |
X23andMe | ORGANIZATION | 0.68+ |
23andMe | ORGANIZATION | 0.66+ |
Analytics | ORGANIZATION | 0.64+ |
couple | QUANTITY | 0.62+ |
European | OTHER | 0.59+ |
blue sweater | ORGANIZATION | 0.58+ |
Epic | ORGANIZATION | 0.5+ |
Fortune | ORGANIZATION | 0.48+ |
1 | QUANTITY | 0.46+ |
CDOIQ | EVENT | 0.36+ |
500 | QUANTITY | 0.35+ |
James Lowey, TGEN | Dell Technologies World 2018
>> Narrator: Live from Las Vegas, it's theCUBE, covering Dell Technologies World 2018. Brought to you by Dell EMC and its ecosystem partners. >> Welcome back to theCUBE. We are live in Las Vegas. Day two of Dell Technologies World. I am Lisa Martin with Stu Miniman, my cohost. And we're excited to welcome to theCUBE for the first time the CIO of TGen, Translational Genomics, James Lowey. James, welcome to theCUBE. >> Ah, thank you so much, it's great being here. >> So, genomics, really interesting topic that we want to get into and understand. How are you making IT and digital and workforce transformation real in it, but first give our viewers an overview of TGen. It started out about 16 years ago as a very collaborative effort within Arizona and really grew. Talk to us about that. >> Yeah, absolutely. So, TGen is a nonprofit biomedical research institute based in Phoenix, Arizona. As you mentioned, we've been around about 16 years. We were, the inception of the institute was really built around bringing biomedical technology into the state of Arizona. And we're fortunate enough to have a really visionary and gifted leader in Dr. Jeffrey Trent, who is one of the original guys to sequence the human genome completely for the first time. So I don't know if you get any better street cred than that when it comes to genomics. >> And you mentioned, before we went live, give our viewers an overview of what it took to sequence the human genome in terms of time and money and now, how 15 years later, how fast it can be done. >> Yeah, so, you know we've moved from a point where it cost billions of dollars and took many years to complete the first sequence to today where it takes a little bit over a day and about $3,000. So really, the democratization of the technology is driving clinical application, which, in turn, is going to benefit all of us. >> Yeah, James, genomics is one of those areas, when we talk about there is the opportunity of data, but there's also the challenge of data, because you've got, I have to imagine, orders of magnitude more data than your typical company does, so talk to us a little bit about the role of data inside your organization. >> Well, data is our lifeblood. I mean, we've been generating terascale then petascale for many years now. And the fact is, every time you sequence a patient you're generating about 4 terabytes of data for one patient. So if you're doing 100 patients, do the math, or you're doing a thousand patients. We're talking just an immense volume of data. And really, data is what drives us because that information that's encoded in our genome is nothing but data, right? It's turning our analog selves into a digital format that then we can interrogate to come up with better treatments to help patients. >> Can you bring us inside? When you talk about the infrastructure that enables that. You know, what I was teasing out with the last question, it's not just about storing data, you need to be able to access the data, you need to be able to share data. So as the CIO, what's your purview? Give us a little bit of a thumbnail sketch as to what your organization-- >> Oh yeah, yeah, no that's great. You know, so we've been a long-time Isilon customer. The scale-out storage is what really has enabled us to be successful. Our partnership with Dell EMC has spanned many years and we're fortunate enough to have enough visibility within the organization to get early access to technologies.
And really, that's really important because the science moves faster than the IT. So having things like scale-out, super fast flash, you know, having new Intel processors, all these things are what really enable us to do our job and to be successful. >> How have, you've been with TGen for a long time now, you've been the CIO for about three years. Talk to us about the transformation of the technology and how you've evolved it to not just facilitate digital transformation and IT transformation, but I imagine security transformation with human genetic data is of paramount importance. >> You know, that's a really good point. Security is always on my mind, for obvious reasons because I would say there's nothing more personally identifiable than your genome. The laws around these things still have not been totally codified. So we're sitting at a point today where we're still uncertain as to how exactly to best protect this very, very important data. But to that end, we tend to fail closed in how we do things; everything's encrypted. You know, we are big believers in identity management and making sure that the right people have access to the right data at the right time. We've utilized SecureWorks, for instance, for perimeter, logging, and to get their expertise. 'Cause one of the things I've learned in my tenure as CIO is that it's really all about the people and they're what drive your success. And so I'm fortunate enough to have a team that's amazing. These folks are some of the best people in their field and really do a great job at helping us protect the data, get access to the data, as well as thinking about what the next iteration is going to look like. >> When you look at, just as a whole, the security and data protection, you think about everybody, if they get those home kits, or things like that, how has that evolved over the last few years? I'm curious if that impacts your business. >> Well, I think it does impact our business insofar as it creates awareness. And you know, I think it's really fantastic when I attend a cocktail party or something and people come up and ask, say, "You know, should I get the 23andMe Ancestry?" And they're really engaged and interested and wanting to learn about these things. And I think that's going to spur questions to be asked when they go in to be treated by a physician. Which is really important. I think, I'm a believer that we should own our own data, especially our genomic data, because what's more personal than that? And so we have a lot of challenges ahead, I think, in IT in particular, in protecting, storing, and providing that data to patients. >> Just a quick follow-up, I'm sure you secure stuff. What's the cocktail answer for that? If, you know, should I get that? Can I trust this company? Is my insurance company and everybody else going to get that? What do you advise the average consumer? >> I would say read the terms of use agreement very carefully. >> So the theme of the event, James, make it real. You know, few things are more real than our own data, our own genomes, what does that theme mean to you from an application perspective? How are you making digital transformation real? And things like the alliance with City of Hope to impact disease study and cures? What is that reality component to you? >> Yeah, it has, you know, I really like the make it real theme, and I think it's something that we are doing every day.
I think it just speaks to, you know, taking technology, applying it for meaningful use, to actually make a difference, and to do something that has real impact. And I think that at TGen, I've been empowered to build systems that can do that, that can help our scientists and ultimately help patients. You mentioned City of Hope. We're, our alignment with them is amazing. They have just hired a Chief Digital Officer as they go through a digital transformation of their own. And you know, we're on board in striving to help them go through this process because, as you might be aware, everything's about the data. And that's where we have to focus. >> James, if you go back, you talked about your scale-out architecture with Isilon. How do you report back to the business as to the results you're delivering? What are the, do you have any hero metrics or things that you point out that say this is why we're successful? This is why we've made the right decision. This is why we should be doing this in the future. >> Well, I think we're especially fortunate that we can measure our success in people's lives. So, meeting a kid who's in full remission from brain cancer who was treated using drugs that were derived from being sequenced and run through our labs and then our computational infrastructure and having them say thank you, I think that's pretty much a metric; I don't know how you can beat that. >> Talk about making it real. That's where it's really impactful. I'd love to understand your thoughts as you continue to evolve your transformation as a company. We've heard a lot about emerging technologies and what Dell EMC, Dell Technologies, is doing to enable organizations and customers to be able to realize what's possible with artificial intelligence, machine learning, IoT. What are your thoughts about weaving in those emerging technologies to make what TGen delivers even more impactful? >> Well you just said three of my favorite things that I'm spending a lot of time thinking about. You know, artificial intelligence is going to be absolutely required to interrogate the vast amounts of data that are being created. I mean, this is all unstructured data, so you have to have systems that can store and present that data in such a way that you're going to be able to do something meaningful. IoT is another area where we're spending a lot of time and energy in what we believe is like quantitative medicine. So basically taking measurements all the time to see about changes and then using that to hopefully gain insight into treatment of diseases. You know, machine learning and some of these technologies are also absolutely going to be critical, especially when we start building out drug databases and being able to match the patient with the drug. >> Yeah, James, bring us inside your organization a little bit. What kind of skill sets do you have to have to architect, operate, a theme of this show, they've got Andy McAfee, who's from MIT, we've spoken to, it's about people and machines. You can't have one without the other. You need to be able to marry those two. How does an organization like yours get ready for that and move forward? >> Yeah, it's a really good point. I think the technology enables the people, and you have to have the right people to help make the decisions about what technologies you get and apply. And I think that the skill sets that we look for are generally people who have a broad view of the world.
You know, people who are particular experts, at least on the IT side, are of limited use, because we need people to be able to switch gears quickly and to think about problems holistically. So I'd say most of the IT folks are working across several different disciplines and are really good at that. On the scientific side it's a little different. We're looking for data scientists all the time. So if anybody's watching and wants to come work for a great place, TGen, look us up. Because that's really where we're headed. You know, we have a lot of biologists, we have a lot of molecular biologists, we have people who do statistics, but it's not quite the same as data science. So that's kind of the new area that we're really focused on. >> All right, so James, one of the things I always love to ask when I get a CIO here is, when you're talking to your peers in the industry, how do you all see the role of the CIO changing? What are some of the biggest challenges that you're facing? >> So, yeah, it's a great question. I think the role's changing towards being empowered in the business. And I think that has to be part of the transformation: you have to be aligned completely with what your objectives are. And we're fortunate, you know, we are. And I feel very lucky to have a boss and a boss's boss who both understand the importance and the value that we bring to the organization. I also see that in the industry, especially in healthcare, a need for folks who are focused beyond just the EMR and daily IT things, to really start looking beyond maybe where you're comfortable. I know that I stretch my boundaries, and I think that in order to be successful as a CIO that's what you're going to have to do. I think you're going to have to push the envelope. You're going to have to look for new technologies and new ways to make a difference. >> So last question, big impact that TGen has made to the state of Arizona. I read on LinkedIn that you like building high-performance teams. What are some of the impacts that this has made for Arizona, but also maybe as an example for other states to look to, to be inspired to set up something similar? >> That's really a great question. I think, you know, Arizona made an investment, and in a way, it's easy to measure. So if you come down to the TGen building and realize that that building was the first building, and it is now surrounded by buildings, including a full-on cancer center, that's all in downtown Phoenix. And it's almost the "if you build it, they will come," but it's not just the infrastructure, it really is about the people and identifying the right folks to come in and help build that, to invest in them and to provide basically the opportunity for success. You know, Arizona has really been fortunate, I think, in being able to build out this amazing infrastructure around biotechnology. And you know, but we're just getting going. I mean, we are, we've only been doing this for about 16 years and I look forward to the next 16. >> Well thanks so much, James, for stopping by and talking about how you're applying technologies, not just from Dell EMC but others as well, to make transformation real, to make it real across IT, digital, workforce, security, and doing something that really, literally, has the opportunity to save lives. Thanks so much. >> Well thank you very much, it's been a pleasure. >> We want to thank you for watching theCUBE. I'm Lisa Martin, with my cohost Stu Miniman. We are live, day two of Dell Technologies World.
We'll be back after a lunch break. We'll see you then.
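As a rough back-of-the-envelope check on the storage scale James describes, the snippet below multiplies his quoted figure of roughly 4 terabytes per sequenced patient across a few cohort sizes. The cohort sizes are hypothetical illustrations, not TGen's actual numbers.

```python
TB_PER_PATIENT = 4  # figure quoted in the interview, per sequenced patient

for patients in (100, 1_000, 10_000):  # hypothetical cohort sizes
    raw_tb = patients * TB_PER_PATIENT
    print(f"{patients:>6} patients -> ~{raw_tb:,} TB "
          f"(~{raw_tb / 1024:.1f} PB) before replicas or backups")
```

Even the smallest cohort lands in the hundreds of terabytes, which is why the conversation keeps coming back to scale-out storage rather than any single array.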
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
James | PERSON | 0.99+ |
Stu Miniman | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Andy McAfee | PERSON | 0.99+ |
James Lowey | PERSON | 0.99+ |
Arizona | LOCATION | 0.99+ |
Jeffrey Trent | PERSON | 0.99+ |
100 patients | QUANTITY | 0.99+ |
Las Vegas | LOCATION | 0.99+ |
TGen | ORGANIZATION | 0.99+ |
Dell Technologies | ORGANIZATION | 0.99+ |
Dell EMC | ORGANIZATION | 0.99+ |
first sequence | QUANTITY | 0.99+ |
ORGANIZATION | 0.99+ | |
two | QUANTITY | 0.99+ |
about $3 thousand | QUANTITY | 0.99+ |
Phoenix, Arizona | LOCATION | 0.99+ |
billions of dollars | QUANTITY | 0.99+ |
first time | QUANTITY | 0.99+ |
one patient | QUANTITY | 0.99+ |
today | DATE | 0.98+ |
about three years | QUANTITY | 0.98+ |
both | QUANTITY | 0.98+ |
one | QUANTITY | 0.98+ |
Dell EMC | ORGANIZATION | 0.98+ |
15 years later | DATE | 0.98+ |
Isilon | ORGANIZATION | 0.97+ |
City of Hope | ORGANIZATION | 0.97+ |
Day two | QUANTITY | 0.97+ |
Translational Genomics | ORGANIZATION | 0.96+ |
MIT | ORGANIZATION | 0.96+ |
about 16 years | QUANTITY | 0.96+ |
theCUBE | ORGANIZATION | 0.96+ |
Intel | ORGANIZATION | 0.95+ |
Dell Technologies World 2018 | EVENT | 0.94+ |
first building | QUANTITY | 0.93+ |
day two | QUANTITY | 0.91+ |
Dell Technologies World | EVENT | 0.9+ |
about 4 terabytes | QUANTITY | 0.89+ |
16 years ago | DATE | 0.87+ |
a thousand patients | QUANTITY | 0.86+ |
over a day | QUANTITY | 0.84+ |
TGen | LOCATION | 0.76+ |
TGen | PERSON | 0.76+ |
Dr. | PERSON | 0.72+ |
next 16 | DATE | 0.72+ |
Dell Technologies World | ORGANIZATION | 0.71+ |
three of my favorite | QUANTITY | 0.68+ |
last | DATE | 0.59+ |
Phoenix | LOCATION | 0.53+ |
TGEN | PERSON | 0.52+ |
23andMe | ORGANIZATION | 0.49+ |
SecureWorks | ORGANIZATION | 0.45+ |
Sam Greenblatt, Nano Global - Open Networking Summit 2017 - #ONS2017 - #theCUBE
(lively synth music) >> Announcer: Live, from Santa Clara, California, it's The Cube, covering Open Networking Summit 2017. Brought to you by The Linux Foundation. >> Hey welcome back everybody, Jeff Frick here with The Cube. We are at Open Networking Summit, joined here in this segment by Scott Raynovich, my guest host for the next couple days, great to see you again Scott. >> Good to see you. >> And real excited to have a long-time Cube alumni, a many-time Cube alumni always up to some interesting and innovative thing. (Scott laughs) Sam Greenblatt, he's now amongst other things the CTO of Nano Global, nano like very very small. Sam, great to see ya. >> Great to see you too Jim. >> So you said before we went offline, you thought you would retire, but there's just too many exciting things going on, and it drug you back into this crazy tech world. >> Just when you think you're out, they pull you back in. (all laugh) >> All right, so what is Nano Global, for people that aren't familiar with the company? >> Nano Global has Amosil-Q, which is the compound, which is a nano compound that basically kills viruses, pathogens, funguses, and it does it by attaching itself at the nano level to these microbial, micro-life forms, and it implodes them, and technically that term is called lysis. >> (Jeff) That sounds very scary. >> It's very scary, because we try to sell it as a hand protectant. >> You just told me it kills everything, I don't know if I want to put that on my hands, Sam. (all laugh) >> No it's good. It does kill some of the good bacteria, but it basically protects you for 24 hours. You don't have to reapply it, you can wash your hands. >> (Scott) It's like you become Superman or something. >> Absolutely, I literally use it to wash off the trays on the planes, and the armrests, while the guy next to me is sneezing like crazy, to try to kill any airborne pathogens. >> So what about the nanotechnology has got you traveling up to Santa Clara today? >> Well, what I'm doing is, one of the things we're working on, besides that, is we're working on genomics, and I worked with some other companies on genomics besides Nano, and genomics has me totally fascinated. When I was at Dell, I went to ASU, and for the first time, I saw pediatric genomics being processed quickly, and that was in a day. Today, a day is unheard of, it's terrible, you want to do it in less than an hour, and I was fascinated by how many people can be affected by the use of genomic medicine, and genomic pharmacology. And you see some of the ads on TV like Teva, that's genomic medicine, it attacks a genomic irregularity in your DNA, so it's amazing. And the other thing I'm very interested in is eradicating in my lifetime, which I don't know if it's going to happen, cancer, and how you do that is very simple. They found that chemotherapy is interesting, but not fascinating, it doesn't always work, but what they're finding is if they can find enough biometric information from genomes, from your proteomics, from your RNA, they can literally customize, it's called precision medicine, a specific medicine track for you, to actually fight the cancer successfully. >> I can't wait for the day, and hopefully it will be in your lifetime, when they look back at today's cancer treatments, and say "now what did you do again? (Sam laughs) You gave them as much poison as they could take, right up to the time they almost die, and hopefully the cancer dies first?"
>> I'll take the - >> It's like bloodletting, it will not be that long from now that we look back at this time and say that was just archaic, which is good. >> It's called reactive medicine. It's funny, there's a story, that the guy who actually did the sequencing of the DNA, the original DNA strand, tells, that when he was younger, he basically was able to see his chromosomes, and then he was able to get down to the DNA and to the proteins, and he could see that he had an irregularity that was known for basically cancer. And he went to the doctor, and he said "I think I have cancer of the pancreas." And the guy said "your blood tests don't show it." And by the way, you don't get that blood test until you're over 40 years old, PS-1, the PS scan. And what happened was they actually found out that he had cancer of the pancreas, so... >> Yeah, it's predictive, isn't it? So basically what you're doing is you're data mining the human and the human genome, and trying to do some sort of - >> We're not doing the 23andme, which tells you you have a propensity to be fat. >> Right, right, but walk us through what you're doing. You're obviously, you're here at an IT cloud conference so you're obviously using cloud technology to help accelerate the discovery of medicine, so walk us through how you're doing that. >> What happens is, when you get the swab, or the blood, and your DNA is then processed, it comes in and it gets cut into however many literal samples they need. 23andme uses the 30x, that's 30 pieces. That's 80, by the way, gigabytes of data. If you were to take a 50x, which is what you need for cancer, which is probably low, that takes you up to 150 gigabytes per person. Now think about the fact, you got to capture that, then you got to capture the RNA of the person, you got to capture his biometrics, and you got to capture his electronic medical record, and all the radiology that's done. And you got to bring it together, look at it, and determine what they should do. And the problem is the oncologic doctors today are scared to death of this, because they know how: if you have this, I'm going to take you in and basically do some radiation. I'm going to do chemotherapy on you and run the course. What's happening is, when you do all of this, you got to correlate all this data, it's probably the world's largest big data outside of YouTube. It's number two in number of bytes, and we haven't sequenced everybody on the planet. Everybody should get sequenced, it should be stored, and what you get when you're healthy, that's called a germline. Then you take the cancer and you look at the germline and compare it, and then you're able to see what the difference is. Now open source has great technology to deal with this flood of data. LinkedIn, as you know, open sourced Kafka, and one of the things that's great about that is it's a pull model, it's a producer, broker, subscriber model, and you can open up multiple channels, and by opening up multiple channels, since the subscribers are doing the pull instead of trying to send it all and overflow it, and we all know what it's like to overflow a pipe. It goes everywhere. But doing it through a Kafka model or a NiFi model, which was, by the way, donated by the NSA.
We're not going to unmask who donated it but, (laughs) no, I'm only kidding, but the NSA donated it, and data flows now become absolutely critical, because as you get these segments of DNA, you got to send it all down, then what you got to do is do, and you're going to love this, a hidden Markovian chain, and put it all back together, so you can match the pattern, and then once you match the pattern, then you got to do quality control to see whether or not you screwed it up. And then, beyond that, you then have to do something called Smith-Waterman, which is a QC step, and then you can give it to somebody to figure out where the variant is. The whole key is all three of us share 99.9% of the same DNA. That one tenth of a percent is where the variants are. The variant is what causes all the diseases. We're all born with cancer. You have cancer in you, I have it, Jeff has it, and the only difference between a healthy person and a sick person is that your killer cells went to sleep and don't attack the cancer. The only way to attack cancer is not chemotherapy, and I know every oncologic person who sees this is going to have a heart attack, it's basically let your immune system fight it. So what this tech does is it narrows all that massive data down to the variant. Once you get the variant, then you got to look at the RNA and see if there's variance there. Then you got to look at the radiology, the germline, and the biometric data, and once you get that, you can make a decision. I'll give you the guy who's my hero in this: it's a guy named Dr. Soon. He's the guy who came up with Abraxane. Abraxane is for pancreatic -- >> Jeff: Who is he with now? >> NantHealth. (both laugh) And why I, he discovered, he knew all about medicine, but he didn't know anything about technology. So then this becomes probably the best machine learning issue that you can have, because you have all this data, you're going to learn what works on patients. And you're going to get all the records back, so what I'm going to talk about, because they wanted to talk about using SDN, using NFV, opening up hundreds of channels from source to... from producer to the subscriber, or consumer, as they call it, with the broker in the middle. And moving that data, then getting it over there, and doing the processing fast enough that it can be done while the patient still hasn't had any other problems. So I have great charts of what the genome looks like. I sent it to you. >> So it's clear these two fields are going to continue to merge, the bioinformatics and the IT cloud. >> Sam: They're merging, as fast as possible. >> And we just plug our brains and our bodies into the health cloud, and it tells us what's up. >> Exactly, if Ginni was here, Ginni Rometty from IBM, she would tell you about quantum; she just announced the first commercially available quantum computer. Her first use for it is genomics, because genomics is a very repetitive process that is done in parallel. Remember you just cut this thing into 50 pieces, you put it back together, and now you're looking to see what's hidden, and it doesn't look like it's normal. If you looked at my genetics, one of the things you'll notice is that I will not consume a lot of caffeine. And how they know that is because there's a set of chromosomes, in my 23 chromosomes, that basically says I won't consume it. Turns out to be totally wrong, because of my behavior over the day.
(all laugh) But what's interesting at the Linux Foundation is everybody here wants to talk about, are we going to use this technology or that technology. What they want is an application using the technology, and NantHealth, that I talked about, can transport a terabyte of data virtually. In other words, it's not really doing it, but it's doing it through multiple sources and multiple consumers, and that's what people are fascinated by. >> All right, well like I said, Sammy gets into the wild and wooly ways and exciting new things. (Sam laughs) So sounds great, and a very bright future on the health care side. Thanks for stopping by. >> Thank you very much. I hope I didn't bore you with... (Jeff and Sam laugh) >> No, no, no, we don't want more chemotherapy, so it's definitely better to have less chemotherapy and more genetic fixing of sickness. So Sam, nice to see you again, thanks for stopping by. >> Thank you very much. >> Scott Raynovich, Jeff Frick, you're watching The Cube, from Open Networking Summit in Santa Clara, we'll be back after this short break. Thanks for watching. (synth music) >> Announcer: Robert Hershevech.
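As a rough illustration of the producer, broker, subscriber pull model Sam describes above for moving genome segments, here is a minimal sketch using the kafka-python client. The broker address, topic name, consumer group, and the stand-in DNA segments are hypothetical placeholders, not anyone's actual pipeline.

```python
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

BROKER = "localhost:9092"   # hypothetical broker address
TOPIC = "genome-segments"   # hypothetical topic name

# Producer side: a sequencer or pipeline stage publishes DNA segments to the broker.
producer = KafkaProducer(bootstrap_servers=BROKER)
for i, segment in enumerate([b"ACGT...segment-0", b"TTGA...segment-1"]):  # stand-in data
    producer.send(TOPIC, key=f"patient-001:{i}".encode(), value=segment)
producer.flush()

# Consumer side: downstream aligners or variant callers pull at their own pace,
# which is the point of the pull model -- the pipe never gets overflowed.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="variant-callers",     # hypothetical consumer group
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,       # stop iterating if no new messages arrive
)
for record in consumer:
    print(f"pulled {record.key.decode()} ({len(record.value)} bytes) "
          f"from partition {record.partition} offset {record.offset}")
```

Because the broker sits in the middle, you can open as many consumer groups, or channels, as there are downstream stages, and each one reads only as fast as it can process.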
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Sam Greenblat | PERSON | 0.99+ |
Jeff | PERSON | 0.99+ |
Jeff Frick | PERSON | 0.99+ |
Scott | PERSON | 0.99+ |
Sam Greenblatt | PERSON | 0.99+ |
Scott Raynovich | PERSON | 0.99+ |
Ginni Rometty | PERSON | 0.99+ |
Robert Hershevech | PERSON | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Sam | PERSON | 0.99+ |
80 | QUANTITY | 0.99+ |
30 pieces | QUANTITY | 0.99+ |
NSA | ORGANIZATION | 0.99+ |
50 pieces | QUANTITY | 0.99+ |
Jim | PERSON | 0.99+ |
Santa Clara | LOCATION | 0.99+ |
24 hours | QUANTITY | 0.99+ |
Ginni | PERSON | 0.99+ |
23 chromosomes | QUANTITY | 0.99+ |
99.9% | QUANTITY | 0.99+ |
one percent | QUANTITY | 0.99+ |
Linux Foundation | ORGANIZATION | 0.99+ |
50x | QUANTITY | 0.99+ |
Cube | ORGANIZATION | 0.99+ |
Santa Clara, California | LOCATION | 0.99+ |
ORGANIZATION | 0.99+ | |
Today | DATE | 0.99+ |
Sammy | PERSON | 0.99+ |
The Cube | TITLE | 0.99+ |
less than an hour | QUANTITY | 0.99+ |
30x | QUANTITY | 0.99+ |
Nano Global | ORGANIZATION | 0.98+ |
Youtube | ORGANIZATION | 0.98+ |
over 40 years old | QUANTITY | 0.98+ |
The Cube | ORGANIZATION | 0.98+ |
Dr. | PERSON | 0.98+ |
today | DATE | 0.98+ |
first time | QUANTITY | 0.97+ |
NantHealth | ORGANIZATION | 0.97+ |
#ONS2017 | EVENT | 0.97+ |
Soon | PERSON | 0.97+ |
first | QUANTITY | 0.96+ |
Open Networking Summit | EVENT | 0.96+ |
two fields | QUANTITY | 0.95+ |
Superman | PERSON | 0.95+ |
tenth of a percent | QUANTITY | 0.95+ |
Amosil-Q | OTHER | 0.94+ |
three | QUANTITY | 0.94+ |
Nano Global - Open Networking Summit 2017 | EVENT | 0.94+ |
Dell | ORGANIZATION | 0.94+ |
Open Networking Summit 2017 | EVENT | 0.93+ |
one | QUANTITY | 0.93+ |
hundreds of channels | QUANTITY | 0.92+ |
Teva | ORGANIZATION | 0.92+ |
first use | QUANTITY | 0.91+ |
a day | QUANTITY | 0.9+ |
ASU | ORGANIZATION | 0.9+ |
RNA | OTHER | 0.89+ |
CTO | PERSON | 0.88+ |
gigabytes | QUANTITY | 0.83+ |
The Linux Foundation | ORGANIZATION | 0.8+ |
Nano Global | OTHER | 0.73+ |
up to 150 gigabytes per person | QUANTITY | 0.72+ |
both laugh | QUANTITY | 0.7+ |
lysis | OTHER | 0.67+ |
terabyte of data | QUANTITY | 0.65+ |
couple days | DATE | 0.65+ |
-1 | OTHER | 0.62+ |
Smith-Waterman | PERSON | 0.58+ |
Abroxane | OTHER | 0.57+ |
Abroxane | COMMERCIAL_ITEM | 0.57+ |
variant | OTHER | 0.56+ |