
Search Results for Pharma:

Jesse Cugliotta & Nicholas Taylor | The Future of Cloud & Data in Healthcare


 

(upbeat music) >> Welcome back to Supercloud 2. This is Dave Vellante. We're here exploring the intersection of data and analytics in the future of cloud and data. In this segment, we're going to look deeper into the life sciences business with Jesse Cugliotta, who leads the Healthcare and Life Sciences industry practice at Snowflake. And Nicholas "Nick" Taylor, who's the executive director of Informatics at Ionis Pharmaceuticals. Gentlemen, thanks for coming on theCUBE and participating in the program. Really appreciate it. >> Thank you for having us- >> Thanks for having me. >> You're very welcome. Okay, we're really going to try to look at data sharing as a use case and try to understand what's happening in the healthcare industry generally and specifically, how Nick thinks about sharing data in a governed fashion, and whether tapping the capabilities of multiple clouds is advantageous long term or presents more challenges than the effort is worth. And to start, Jesse, you lead this industry practice for Snowflake and it's a challenging and vibrant area. It's one that's hyper-focused on data privacy. So the first question is, you know, there was a time when healthcare and other regulated industries wouldn't go near the cloud. What are you seeing today in the industry around cloud adoption and specifically multi-cloud adoption? >> Yeah, for years I've heard that healthcare and life sciences has been cloud averse, but in spite of all of that, if you look at a lot of aspects of this industry today, they've been running in the cloud for over 10 years now. Particularly when you look at CRM technologies or HR or HCM, even clinical technologies like EDC or eTMF. And it's interesting that you mentioned multi-cloud as well, because this has always been an underlying reality, especially within life sciences. This industry grows through acquisition, where companies are looking to boost their future development pipeline either by buying up smaller biotechs that may have a late- or mid-stage promising candidate. And what typically happens is the larger pharma can then use their commercial muscle and their regulatory experience to move it to approvals and into the market. And I think the last few decades of cheap capital certainly accelerated that trend over the last couple of years. But this typically means that these new combined institutions may have technologies that are running on multiple clouds or multiple cloud strategies in various different regions, to your point. And what we've often found is that they're not planning to standardize everything onto a single cloud provider. They're often looking for technologies that embrace this multi-cloud approach and work seamlessly across them. And I think this is a big reason why we, here at Snowflake, have seen such strong momentum and growth across this industry, because healthcare and life sciences has actually been one of our fastest growing sectors over the last couple of years. And a big part of that is in fact that we run on not only all three major cloud providers, but individual accounts within each and any one of them have the ability to communicate and interoperate with one another, like a globally interconnected database. >> Great, thank you for that setup. And so Nick, tell us more about your role and Ionis Pharma please. >> Sure. So I've been at Ionis for around five years now. You know, when I joined, the IT department was pretty small. There wasn't a lot of warehousing, there wasn't a lot of kind of big data there. 
We saw an opportunity with Snowflake pretty early on as a provider that would be a lot of benefit for us, you know, 'cause we're small, wanted something that was fairly hands off. You know, I remember the days where you had to get a lot of DBAs in to fine tune your databases, make sure everything was running really, really well. The notion that there's, you know, no indexes to tune, right? There's very few knobs and dials, you can turn on Snowflake. That was appealing that, you know, it just kind of worked. So we found a use case to bring the platform in. We basically used it as a logging replacement as a Splunk kind of replacement with a platform called Elysium Analytics as a way to just get it in the door and give us the opportunity to solve a real world use case, but also to help us start to experiment using Snowflake as a platform. It took us a while to A, get the funding to bring it in, but B, build the momentum behind it. But, you know, as we experimented we added more data in there, we ran a few more experiments, we piloted in few more applications, we really saw the power of the platform and now, we are becoming a commercial organization. And with that comes a lot of major datasets. And so, you know, we really see Snowflake as being a very important part of our ecology going forward to help us build out our infrastructure. >> Okay, and you are running, your group runs on Azure, it's kind of mono cloud, single cloud, but others within Ionis are using other clouds, but you're not currently, you know, collaborating in terms of data sharing. And I wonder if you could talk about how your data needs have evolved over the past decade. I know you came from another highly regulated industry in financial services. So what's changed? You sort of touched on this before, you had these, you know, very specialized individuals who were, you know, DBAs, and, you know, could tune databases and the like, so that's evolved, but how has generally your needs evolved? Just kind of make an observation over the last, you know, five or seven years. What have you seen? >> Well, we, I wasn't in a group that did a lot of warehousing. It was more like online trade capture, but, you know, it was very much on-prem. You know, being in the cloud is very much a dirty word back then. I know that's changed since I've left. But in, you know, we had major, major teams of everyone who could do everything, right. As I mentioned in the pharma organization, there's a lot fewer of us. So the data needs there are very different, right? It's, we have a lot of SaaS applications. One of the difficulties with bringing a lot of SaaS applications on board is obviously data integration. So making sure the data is the same between them. But one of the big problems is joining the data across those SaaS applications. So one of the benefits, one of the things that we use Snowflake for is to basically take data out of these SaaS applications and load them into a warehouse so we can do those joins. So we use technologies like Boomi, we use technologies like Fivetran, like DBT to bring this data all into one place and start to kind of join that basically, allow us to do, run experiments, do analysis, basically take better, find better use for our data that was siloed in the past. You mentioned- >> Yeah. And just to add on to Nick's point there. >> Go ahead. 
>> That's actually something very common that we're seeing across the industry is because a lot of these SaaS applications that you mentioned, Nick, they're with from vendors that are trying to build their own ecosystem in walled garden. And by definition, many of them do not want to integrate with one another. So from a, you know, from a data platform vendor's perspective, we see this as a huge opportunity to help organizations like Ionis and others kind of deal with the challenges that Nick is speaking about because if the individual platform vendors are never going to make that part of their strategy, we see it as a great way to add additional value to these customers. >> Well, this data sharing thing is interesting. There's a lot of walled gardens out there. Oracle is a walled garden, AWS in many ways is a walled garden. You know, Microsoft has its walled garden. You could argue Snowflake is a walled garden. But the, what we're seeing and the whole reason behind the notion of super-cloud is we're creating an abstraction layer where you actually, in this case for this use case, can share data in a governed manner. Let's forget about the cross-cloud for a moment. I'll come back to that, but I wonder, Nick, if you could talk about how you are sharing data, again, Snowflake sort of, it's, I look at Snowflake like the app store, Apple, we're going to control everything, we're going to guarantee with data clean rooms and governance and the standards that we've created within that platform, we're going to make sure that it's safe for you to share data in this highly regulated industry. Are you doing that today? And take us through, you know, the considerations that you have in that regard. >> So it's kind of early days for us in Snowflake in general, but certainly in data sharing, we have a couple of examples. So data marketplace, you know, that's a great invention. It's, I've been a small IT shop again, right? The fact that we are able to just bring down terabyte size datasets straight into our Snowflake and run analytics directly on that is huge, right? The fact that we don't have to FTP these massive files around run jobs that may break, being able to just have that on tap is huge for us. We've recently been talking to one of our CRO feeds- CRO organizations about getting their data feeds in. Historically, this clinical trial data that comes in on an FTP file, we have to process it, take it through the platforms, put it into the warehouse. But one of the CROs that we talked to recently when we were reinvestigate in what data opportunities they have, they were a Snowflake customer and we are, I think, the first production customer they have, have taken that feed. So they're basically exposing their tables of data that historically came in these FTP files directly into our Snowflake instance now. We haven't taken advantage of that. It only actually flipped the switch about three or four weeks ago. But that's pretty big for us again, right? We don't have to worry about maintaining those jobs that take those files in. We don't have to worry about the jobs that take those and shove them on the warehouse. We now have a feed that's directly there that we can use a tool like DBT to push through directly into our model. And then the third avenue that's came up, actually fairly recently as well was genetics data. So genetics data that's highly, highly regulated. We had to be very careful with that. 
>> And we had a conversation with Snowflake about the data clean rooms practice, and we see that as a pretty interesting opportunity. We have one organization running genetic analysis that's able to send us those genetic datasets, but then there's another organization that actually has the, in quotes, "metadata" around that, so age, ethnicity, location, et cetera. And being able to join those two datasets through some kind of mechanism would be really beneficial to the organization. Being able to build a data clean room so we can put that genetic data in a secure place, anonymize it, and then share the amalgamated data back out in a way that's able to be joined to the anonymized metadata, that could be pretty huge for us as well. >> Okay, so this is interesting. So you talk about FTP, which was the common way to share data. And so you basically, you've got it now, you take it and do whatever you want with it. Now we're talking, Jesse, about sharing the same copy of live data. How common is that use case in your industry? >> It's become very common over the last couple of years. And I think a big part of it is having the right technology to do it effectively. You know, as Nick mentioned, historically this was done by people sending files around. And the challenge with that approach, of course, while there are multiple challenges, one, every time you send a file around you're, by definition, creating a copy of the data, because you have to pull it out of your system of record, put it into a file, put it on some server where somebody else picks it up. And by definition, at that point you've lost governance. So this creates challenges and a general hesitation to doing so. It's not that it hasn't happened, but the other challenge with it is that the data's no longer real time. You know, you're working with a copy of data that was only as fresh as the time when it was actually extracted. And that creates limitations in terms of how effective this can be. What we're starting to see now with some of our customers is live sharing of information. And there's two aspects of that that are important. One is that you're not actually physically creating the copy and sending it to someone else, you're actually exposing it from where it exists and allowing another consumer to interact with it from their own account, which could be in another region or even running in another cloud. So this concept of super-cloud, or cross-cloud, is becoming realized here. But the other important aspect of it is that when that other entity is querying your data, they're seeing it in a real-time state. And this is particularly important when you think about use cases like supply chain planning, where you're leveraging data across various different enterprises. If I'm a manufacturer or if I'm a contract manufacturer and I can see the actual inventory positions of my clients, of my distributors, of the levels of consumption at the pharmacy or the hospital, that gives me a lot of indication as to how my demand profile is changing over time, versus working with a static picture that may have been from three weeks ago. And this has become incredibly important as supply chains are becoming more constrained and the ability to plan accurately has never been more important. >> Yeah. So the race is on to solve these problems. So we started with, hey, okay, cloud, Dave, we're going to simplify database, we're going to put it in the cloud, give virtually infinite resources, separate compute from storage. 
Okay, check, we got that. Now we've moved into sort of data clean rooms and governance and you've got an ecosystem that's forming around this to make it safer to share data. And then, you know, nirvana, at least near term nirvana is we're going to build data applications and we're going to be able to share live data and then you start to get into monetization. Do you see, Nick, in the near future where I know you've got relationships with, for instance, big pharma like AstraZeneca, do you see a situation where you start sharing data with them? Is that in the near term? Is that more long term? What are the considerations in that regard? >> I mean, it's something we've been thinking about. We haven't actually addressed that yet. Yeah, I could see situations where, you know, some of these big relationships where we do need to share a lot of data, it would be very nice to be able to just flick a switch and share our data assets across to those organizations. But, you know, that's a ways off for us now. We're mainly looking at bringing data in at the moment. >> One of the things that we've seen in financial services in particular, and Jesse, I'd love to get your thoughts on this, is companies like Goldman or Capital One or Nasdaq taking their stack, their software, their tooling actually putting it on the cloud and facing it to their customers and selling that as a new monetization vector as part of their digital or business transformation. Are you seeing that Jesse at all in healthcare or is it happening today or do you see a day when that happens or is healthier or just too scary to do that? >> No, we're seeing the early stages of this as well. And I think it's for some of the reasons we talked about earlier. You know, it's a much more secure way to work with a colleague if you don't have to copy your data and potentially expose it. And some of the reasons that people have historically copied that data is that they needed to leverage some sort of algorithm or application that a third party was providing. So maybe someone was predicting the ideal location and run a clinical trial for this particular rare disease category where there are only so many patients around the world that may actually be candidates for this disease. So you have to pick the ideal location. Well, sending the dataset to do so, you know, would involve a fairly complicated process similar to what Nick was mentioning earlier. If the company who was providing the logic or the algorithm to determine that location could bring that algorithm to you and you run it against your own data, that's a much more ideal and a much safer and more secure way for this industry to actually start to work with some of these partners and vendors. And that's one of the things that we're looking to enable going into this year is that, you know, the whole concept should be bring the logic to your data versus your data to the logic and the underlying sharing mechanisms that we've spoken about are actually what are powering that today. >> And so thank you for that, Jesse. >> Yes, Dave. >> And so Nick- Go ahead please. >> Yeah, if I could add, yeah, if I could add to that, that's something certainly we've been thinking about. In fact, we'd started talking to Snowflake about that a couple of years ago. We saw the power there again of the platform to be able to say, well, could we, we were thinking in more of a data share, but could we share our data out to say an AI/ML vendor, have them do the analytics and then share the data, the results back to us. 
Now, you know, there's more powerful mechanisms to do that within the Snowflake ecosystem now, but you know, we probably wouldn't need to have onsite AI/ML people, right? Some of that stuff's very sophisticated, expensive resources, hard to find, you know, it's much better for us to find a company that would be able to build those analytics, maintain those analytics for us. And you know, we saw an opportunity to do that a couple years ago and we're kind of excited about the opportunity there that we can just basically do it with a no op, right? We share the data route, we have the analytics done, we get the result back and it's just fairly seamless. >> I mean, I could have a whole another Cube session on this, guys, but I mean, I just did a a session with Andy Thurai, a Constellation research about how difficult it's been for organization to get ROI because they don't have the expertise in house so they want to either outsource it or rely on vendor R&D companies to inject that AI and machine intelligence directly into applications. My follow-up question to you Nick is, when you think about, 'cause Jesse was talking about, you know, let the data basically stay where it is and you know bring the compute to that data. If that data lives on different clouds, and maybe it's not your group, but maybe it's other parts of Ionis or maybe it's your partners like AstraZeneca, or you know, the AI/ML partners and they're potentially on other clouds or that data is on other clouds. Do you see that, again, coming back to super-cloud, do you see it as an advantage to be able to have a consistent experience across those clouds? Or is that just kind of get in the way and make things more complex? What's your take on that, Nick? >> Well, from the vendors, so from the client side, it's kind of seamless with Snowflake for us. So we know for a fact that one of the datasets we have at the moment, Compile, which is a, the large multi terabyte dataset I was talking about. They're on AWS on the East Coast and we are on Azure on the West Coast. And they had to do a few tweaks in the background to make sure the data was pushed over from, but from my point of view, the data just exists, right? So for me, I think it's hugely beneficial that Snowflake supports this kind of infrastructure, right? We don't have to jump through hoops to like, okay, well, we'll download it here and then re-upload it here. They already have the mechanism in the background to do these multi-cloud shares. So it's not important for us internally at the moment. I could see potentially at some point where we start linking across different groups in the organization that do have maybe Amazon or Google Cloud, but certainly within our providers. We know for a fact that they're on different services at the moment and it just works. >> Yeah, and we learned from Benoit Dageville, who came into the studio on August 9th with first Supercloud in 2022 that Snowflake uses a single global instance across regions and across clouds, yeah, whether or not you can query across you know, big regions, it just depends, right? It depends on latency. You might have to make a copy or maybe do some tweaks in the background. But guys, we got to jump, I really appreciate your time. Really thoughtful discussion on the future of data and cloud, specifically within healthcare and pharma. Thank you for your time. >> Thanks- >> Thanks for having us. >> All right, this is Dave Vellante for theCUBE team and my co-host, John Furrier. 
Keep it right there for more action at Supercloud 2. (upbeat music)
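
The live-sharing workflow Nick and Jesse describe, where a CRO exposes its tables directly instead of dropping FTP files, maps onto Snowflake's secure shares. The sketch below, written against the snowflake-connector-python package, is a hypothetical illustration only: the account locators, credentials, database, schema, and table names are placeholders, and sharing across cloud providers or regions involves additional listing or replication setup, the "tweaks in the background" Nick mentions.

```python
# Hypothetical sketch: replacing an FTP hand-off with a live Snowflake share.
# Account locators, credentials, and object names below are placeholders.
import snowflake.connector

# --- Provider side (e.g., a CRO exposing clinical-trial tables) ---
provider = snowflake.connector.connect(
    account="cro_account", user="provider_user", password="...", role="ACCOUNTADMIN"
)
cur = provider.cursor()
cur.execute("CREATE SHARE IF NOT EXISTS trial_feed_share")
cur.execute("GRANT USAGE ON DATABASE trial_db TO SHARE trial_feed_share")
cur.execute("GRANT USAGE ON SCHEMA trial_db.raw TO SHARE trial_feed_share")
cur.execute("GRANT SELECT ON TABLE trial_db.raw.visits TO SHARE trial_feed_share")
# Expose the share to the sponsor's Snowflake account; no copy of the data is made.
cur.execute("ALTER SHARE trial_feed_share ADD ACCOUNTS = sponsor_account")

# --- Consumer side (e.g., the sponsor, potentially in another region or cloud) ---
consumer = snowflake.connector.connect(
    account="sponsor_account", user="consumer_user", password="...",
    role="ACCOUNTADMIN", warehouse="REPORTING_WH",
)
cur = consumer.cursor()
# Mount the share as a read-only database; queries always see the provider's live data.
cur.execute("CREATE DATABASE IF NOT EXISTS cro_feed FROM SHARE cro_account.trial_feed_share")
cur.execute("SELECT subject_id, visit_date, site_id FROM cro_feed.raw.visits LIMIT 10")
for row in cur.fetchall():
    print(row)
```

On the consumer side the share behaves like any read-only database, so downstream tooling such as dbt can model it directly, which is why no file-ingestion jobs need to be maintained or monitored.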

Published Date : Jan 3 2023

SUMMARY :

Dave Vellante talks with Jesse Cugliotta, who leads the Healthcare and Life Sciences industry practice at Snowflake, and Nick Taylor, executive director of Informatics at Ionis Pharmaceuticals, about cloud and multi-cloud adoption in a highly regulated industry. They discuss consolidating siloed SaaS data in a warehouse, replacing FTP file transfers with governed, live data sharing, marketplace datasets, data clean rooms for sensitive genetics data, and bringing the logic to the data rather than copying data out to partners.


Loic Giraud, Novartis & Jesse Cugliotta, Snowflake | Snowflake Summit 2022


 

(upbeat music) >> Welcome back to Vegas, baby. Lisa Martin here with theCUBE. We are live at Caesars Forum covering Snowflake Summit 22. This is day two of our wall-to-wall coverage on theCUBE you won't want to miss. We've got an exciting customer story to talk to you about next with Novartis and Snowflake. Please welcome two guests to theCUBE. Loïc Giraud, global head of digital delivery, Novartis. I hope I got the name right. >> Yes. Hi, thank you. >> I did my best. >> Absolutely. >> Lisa: (laughs) Jesse Cugliotta also joins us. Global Industry Lead, Healthcare and Life Sciences at Snowflake. Welcome to theCUBE, gentlemen. >> Thank you for having us. Good morning. >> So it was great to hear Novartis is a household name now, especially with what's gone on in the last two years. I had a chance to see the Keynote yesterday, heard Novartis mentioned in terms of a massive outcome that Snowflake is delivering that we're going to get to. But Loïc, talk to us about Novartis, a Global 500 organization. You rank among the world's top companies investing in R&D, the massive portfolio, and you're reaching nearly 800 million patients worldwide. That's huge, but there's been a lot of change in the healthcare and life sciences industry, especially recently. Talk to us about the industry landscape. What are you seeing? >> As you described, Novartis is one of the top life science companies in the world. We are number three. We operate in 150 countries, and we have almost 120,000 employees. Our purpose is actually to reimagine medicine through the use of data science and technology and to extend people's lives. And we really mean it. I think, as you mentioned, we treat eight or nine million patients per year with our drugs. We expect to treat more than a billion patients in the near future. Over the last few years, especially during COVID, our digital transformation helped us to accelerate the drug discovery and then the commercialization of our drugs to market. As it was mentioned in the Keynote yesterday, we have actually been able to reduce our time to market. It used to take us up to 12 years and cost around 1.2 billion to discover and commercialize a drug. And now, with the use of technology like Snowflake, we have been able to reduce that by two to three years, which ultimately is a benefit for our patients. >> Absolutely. Well, we're talking about life and death situations. Talk about... You mentioned Novartis wants to reimagine medicine. What does that look like? Where is data in that and how is Snowflake an enabler of reimagining medicine? >> So data is a core asset for us, at the core of our enterprise processes. So if you look at our enterprise, we are using data for research, for drug development, in the manufacturing process, and in how we market and sell our products through HCPs and distribute them to reach our patients. Through our digital transformation we have created this integrated data ecosystem, where Snowflake is a core component. And through that ecosystem, we are able to identify compounds and cohorts, perform clinical trials, and engage HCPs and HCOs so that they can prescribe drugs to serve our patient needs. >> Jesse, let's bring you into the conversation. Snowflake recently launched its healthcare and life sciences data cloud. I believe that was back in March. >> It was. >> Just a couple of months ago. Talk to us about the vertical focus. Talk to us about what this healthcare and life sciences data cloud is aiming to help customers like Novartis achieve. 
>> Well, as you mentioned there, Snowflake has made a real pivot to kind of focus on the various different industries that we serve in a new way. I think historically, we've been engaged in really, all of the industries across the major sectors where we participate today. But historically we've been often engaging with the office of IT. And there was a recognition as a company that we really need to be able to better speak the language of our customers in with our respective industries. So the entire organization has really made a pivot to start to build that capability internally. That's part of the team that I support here at Snowflake. And with respect to healthcare and life sciences, that means being able to solve some of the challenges that Loic was just speaking about. In particular, we're seeing the industry evolve in a number of ways. You bring up clinical research in the time that it takes to actually bring a drug to market. This is a big one that's really changed a lot over the last couple of years. Some of the reasons are obvious and other ones are somewhat opportunistic. When we looked at what it takes to get a drug to market, there's several stages of clinical research that have to be participated in, and this can often take years. What we saw in the last couple of years, is that all of a sudden, patients didn't want to physically participate in those anymore, because there was fear of potential infection and being in a healthcare facility. So the entire industry realized that it needed to change in terms of way that it would engage with patients in that context. And we're now seeing this concept of decentralized clinical research. And with that, becomes the need to potentially involve many different types of organizations beyond the traditional pharma, their research partners, but we're starting to see organizations like retail pharmacies, like big box retailers, who have either healthcare delivery or pharmaceutical arms actually get involved in the process. And of course, one of the core things that happens here is that everyone needs a better way to collaborate and share data amongst one another. So bringing this back to your original question, this concept of being able to do exactly that is core to the healthcare and the life sciences data cloud. To be able to collaborate and share data amongst those different types of organizations. >> Collaboration and data sharing. It seems to me to be a differentiator for Snowflake, in terms of being able to deliver secure, governed powerful analytics and data sharing to customers, partners to the ecosystem. You mentioned an example of the ecosystem there and how impactful to patients' lives, that collaboration and data sharing can be. >> That's absolutely right. It's something that if you think about all of the major challenges that the industry has had historically, whether it is high costs, whether it are health inequities, whether it is physicians practicing defensive medicine or repeat testing, what's core to each one of these things is kind of the inability to adequate collaborate and share data amongst all of the different players. So the industry has been waiting for the capability or some sort of solution to be able to do this, I think for a long, long time. And this is probably one of the most exciting parts of the conversations that we have with our customers, is when they realize that this is possible. 
And not only that it's possible within our platform, but that most of the organizations that they work with today are also Snowflake customers. So they realize that everyone's already here. It's just a matter of who else can we work with and how do we get started? >> Join the party. >> Exactly. >> Loic talk to us about Novartis's data journey. I know you guys have been, I believe using Snowflake since 2017 pre pandemic. But you had a largely on-premises infrastructure. Talk to us about the decision of Novartis to go to the cloud, do it securely and why you chose to partner with Snowflake. >> So when we started our journey in 2018, I think the ambition that our CEO, was to transform all enterprise processes for the use of digital tech. And at the core of this digital tech is data foundation. So we started with a large program called Formula One, which aim to integrate all our internal and external data asset into an integrated platform. And for that, I think we've built this multicloud and best upgrade platform, where Snowflake is a core component. And we've been able to integrate almost 1,000 data asset, internal and external for the platform to be able to accelerate the use of data to create insight for our users. In that transformation, we've realized that Snowflake could be a core component because of the scalability and the performance with large dataset. And moreover, when Snowflake started to actually open collaboration for their marketplace, we've been able to integrate new data set that are publicly available at the place that we could not do on ourself, on our own. So that is a core component of what we are trying to do. >> Yeah, and I think that's a great example of really what we're talking about here is that, he's mentioning that they're going out to our marketplace to be able to integrate data more easily with some of the vendors there. And that is kind of this concept of the healthcare and life sciences data cloud realized, where all of a sudden, acquiring and bringing data in and making it ready for analysis becomes much faster, much easier. We continually see more and more vendors coming to us saying, I get it now, I want in. Who else can I work with in this space? So I think that's a perfect example of how this starts to become real for folks. >> Well, it sounds like the marketplace has been an enabler, Loic, of the expansion of use cases. You've grown this beyond drug development. I read that you're developing new products and services for healthcare providers to personalize treatments for patients, which we all are demanding patients. We want that personalized care. But talk about the marketplace as a facilitator of those expanding use cases that Snowflake is powering. >> Yes. That's right. I mean we have currently almost 65 use cases in production and we are in advanced progress for over 200 use cases and they go across all our business sector. So if you look at drug development, we are monitoring our clinical trials using Snowflake. If you look at our omnichannel marketing, we are looking at personalization of information with our HCPs and HGOs using snowflake. If you look at our manufacturing process, we are looking at yet management, freight optimization, inventory, insight. So almost across all the industry sectors that we have, I think we are using the platforms to be able to deliver faster information to our users. >> And that's what we all want. Faster information. 
I think in the pandemic we learned that access to real time data in every industry wasn't a nice to have. That was a- >> Necessity. >> Absolute necessity. >> Yeah. >> And made the difference for companies that survived and thrived and those that didn't. That's something that we learned. But we also learned that the volume of data just continues to proliferate. Loic, you've been in the industry a couple of decades. What do you see? And you've got, obviously this great foundation now with Snowflake. You've got 65 use cases you said in production. What's the future of the data culture in healthcare and life sciences from your perspective? >> So my perspective. It is time now we give the access to our business technologies to be able to be self-sufficient using digital product. We need to consumerize digital technology so they can be self-sufficient. The amount of problems that we have to solve, and we can now solve with new technology has never been there. And I think where in the past, where in the next few years that you will see an accelerated generation of insight and an accelerated process of medicine by empowering the business technologies to use a technology that like Snowflake and over progress. >> What are your thoughts Loic, of some of the, obviously a lot of news coming out yesterday from Snowflake, we mentioned standing room only in the Keynote. This I believe is north of 10,000 attendees. People are ready to engage in person with Snowflake, but some of the news coming out, what is your perspective? You've been a partner of theirs for a while. What do you see from Snowflake in terms of the news, the volume of customers it's adding, all that good stuff? >> I must say I was blown away yesterday when Frank was talking about the ramp up of customers using Snowflake. But also, and I think in Benoit and Christian, and they talk about the innovation. When you look at native application or you look at hybrid tables, we saw a thing there. And the expansion of the marketplace by monetization application, that is something that is going to accelerate the expansion, not only on the company, but the integration and the utilization of customers. And to Jesse's point, I think that it is key that people collaborate using the platform. I think we want to collaborate with suppliers and providers and they want to collaborate with us. But we want to have a neutral environment where we can do that. And Snowflake can be that environment. >> And do it securely, right? Security is absolutely- >> Of course. I mean that's really table stake for this industry. And I think the point that you just made Loic, is very important, is that, the biggest question that we're often asked by our customers is who else is a customer within this industry that I can collaborate with? I think as Loic here will attest to, one of the challenges within life sciences in particular is that it is a highly regulated industry. It is a highly competitive industry, and folks are very sensitive about referenceability. So about things like logo usage. So to give some ideas here, people often have no idea that we're working with 28 of the top 50 global pharma today, working with seven of the top 12 global medical device companies today. The largest CROs, the largest distributors. So when I say that the party is here, they really are. 
And that's why we're so excited to have events like these, 'cause people can physically introduce themselves to one another and meet, and actually start to engage in some of these more collaborative discussions that they've been waiting for. >> Jesse, what's been some of the feedback that you've heard the last couple of days on the healthcare and life sciences data cloud? You've obviously finally gotten back to engaging with customers in person. But what are some of the things, feed on this street have said that you've thought, we made the absolute right decision on this pivot? >> Yeah, well I think some of it speaks to the the point I was just speaking about, is that they had no idea that so many of their peers were actually working with Snowflake already and that how mature their implementations have actually been. The other thing that folks are realizing is that, a lot of the technologies that serve this ecosystem, whether they're in the health tech space, whether they're clinical management or commercial engagement or supply chain planning technologies, those companies are also now pivoting to Snowflake, where they're either building a part or the entirety of their platform on top of ours. So it offers this great way to start to collaborate with the ecosystem through some of those capabilities that we spoke about. And that's driving new use cases in commercial, in supply chain, in pharmacovigilance, in clinical operations. >> Well, I think you just sum up beautifully why the theme of this conference is the world of data collaboration. >> Yes, absolutely. >> The potential there, that Snowflake is unleashing to the world is I think is what's captivating to me. That you just scratch on the surface about connecting and facilitating this collaboration and this data sharing in a secure way across industries. Loic, last question for you. Take us home with what is next for Novartis. You've done a tremendous amount of digitalization. 65 use cases in production with Snowflake. What's next for the company? >> See, I think that in next year's to come, open collaboration with the ecosystem, but also personalization. If you look at digital medicine and access to patient's informations, I think this is probably the next revolution that we are entering into. >> Excellent. And of course those demanding patients aren't going to want anything slower or less information. Guys, thank you for joining me on the program talking about the Novartis-Snowflake collaboration. The partnership, the outcomes that you're achieving and how this is really dramatically impacting the lives of hundreds of millions of people. We appreciate your time and your insights. >> Thank you for having us. This was fun. >> My pleasure. >> Thank you. >> For my guests, I'm Lisa Martin. You're watching theCUBE. This is live from Las Vegas, day two of our coverage of Snowflake Summit 22. I'll be right back with my next guest, so stick around. (upbeat music)
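
Loïc's point about pulling marketplace datasets straight into the platform comes down to the fact that, once a Marketplace listing is mounted as a database in the account, it can be joined against internal tables like any other data. The example below is a hypothetical sketch using snowflake-connector-python; the account, warehouse, and all database, schema, table, and column names are invented for illustration, not Novartis's actual environment.

```python
# Hypothetical sketch: joining an internal dataset with one acquired from the
# Snowflake Marketplace. Once a listing is mounted as a database in the account,
# it is queried like any other database, with no download or copy pipeline.
import snowflake.connector

conn = snowflake.connector.connect(
    account="pharma_account", user="analyst", password="...",
    warehouse="ANALYTICS_WH", database="COMMERCIAL", schema="CURATED",
)
cur = conn.cursor()

# PRESCRIBER_ENGAGEMENT: internal, curated table of HCP interactions.
# RX_CLAIMS_SHARE.PUBLIC.CLAIMS: hypothetical marketplace-provided claims data.
cur.execute("""
    SELECT e.hcp_id,
           e.channel,
           COUNT(DISTINCT c.claim_id) AS claims_90d
    FROM COMMERCIAL.CURATED.PRESCRIBER_ENGAGEMENT e
    JOIN RX_CLAIMS_SHARE.PUBLIC.CLAIMS c
      ON c.prescriber_id = e.hcp_id
     AND c.fill_date >= DATEADD(day, -90, CURRENT_DATE())
    GROUP BY e.hcp_id, e.channel
""")
for hcp_id, channel, claims in cur.fetchmany(10):
    print(hcp_id, channel, claims)
```

The same pattern applies whether the shared data is public reference data from the marketplace or a governed feed from a partner such as a CRO or distributor.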

Published Date : Jun 15 2022

SUMMARY :

Lisa Martin talks with Loïc Giraud, global head of digital delivery at Novartis, and Jesse Cugliotta of Snowflake at Snowflake Summit 2022. They cover Novartis's digital transformation and integrated data platform, the Snowflake Healthcare and Life Sciences Data Cloud, marketplace-based collaboration across the ecosystem, and the 65 use cases Novartis already has in production across drug development, manufacturing, and commercial operations.


John Vitalie, Aizon | CUBE Conversation May 2021


 

>> Welcome to this CUBE Conversation that is a part of the AWS Startup Showcase. I'm Lisa Martin. I've got with me now the CEO of Aizon, John Vitalie. John, welcome to theCUBE. >> Lisa, it's a pleasure to be here. Nice to see you. >> Likewise. Give our audience an overview of Aizon and what it is that you guys do, specifically in pharma and life sciences. >> Well, you can find that in our name. The name of the company is Aizon, uh, we think of us as leading, uh, customers to the horizon of AI in pharmaceutical and biological manufacturing. And uh, we're all about helping our customers take the step into Pharma 4.0 and really realize the value of leveraging machine learning and artificial intelligence in the manufacturing process, so they can get higher yields and predictability and ultimately better outcomes for their patients. >> Is your technology built on AWS? >> Absolutely. From the ground up. We leverage, yeah, we leverage as much as we can from AWS innovation and, you know, a few years ago, when our founders envisioned the future of manufacturing in this industry and where it needs to go, the first thought was to go with a leader to build the solutions, and of course AWS is by far the largest provider of this type of technology. And we're happy to say that we're helping and partnering with AWS to advance the science of artificial intelligence in life sciences. And uh, it's just a natural fit for us to continue to leverage the platform on behalf of our customers. >> I like that. The AI horizon. Excellent. So talk to me a little bit about, you know, the last year has presented many challenges and also opportunities for people in every industry. I'm just wondering, what are some of the changes that we've seen? Pharma and life sciences companies have become household names, for example, but talk to me about some of the key initiatives in smart manufacturing and what pharma companies require. >> Well sure, you know, pharma companies and biotech companies look to the lessons from other industries where AI has been widely adopted. If you look at, uh, manufacturing and other industries, it has been widely adopted for a number of years. Tesla is a great example of how to use AI and robotics and data science, uh, to advance, uh, the efficiency of manufacturing globally. Uh, that's exactly what we're trying to achieve here in life sciences. So, um, you know, a lot of the leading innovators in this space have been working in their labs with data science teams to, you know, find new ways to collect data, uh, to cleanse that data, make it data that's useful across the enterprise. Um, but they haven't really tackled, you know, continuous processing in manufacturing yet. There are a number of leaders that are mapping out strategies and they've begun to go down this path. Um, but most are really looking at how first to bring the data together in a way that it can be democratized, and anonymized in some cases, and used across the enterprise. Uh, there's a model that we've adopted in terms of our product strategy and how we engage customers, and that's the, uh, the pharmaceutical maturity model which was developed by the BioPhorum. This maturity model is a great way for companies and vendors and innovators alike to look at how to help advance their capabilities from one level to the next. And so we help customers understand where they are in that journey and we look for the areas where they can get traction more quickly. 
They can see value sooner and therefore the adoption would be accelerating across their sites and in different ways of use. >> Is that maturity model, that pharma maturity model, is it built on or based on digital transformation? >> Absolutely. It's all about digital transformation. And so the model really begins with pre-digital, and you'd be amazed to find, I think, the amount of Excel spreadsheets that are still used in manufacturing today, and that would be what we would consider to be pretty much pre-digital, because that data is not accessible. It's only used by the operator or the user. So it's really about getting from that level to, uh, breaking down data silos and bringing that data together and harmonizing the data and making it useful. The next level would be about the connected plant, actually connecting machines and data lakes, um, to begin to get more value and find more ways to improve the processes. And then you move up to using advanced analytics and AI, and then ultimately having enterprise-wide adaptive manufacturing capabilities, which is really the ultimate vision, the ultimate goal every manufacturer has. >> One of the things that we've been talking about for the last 14-plus months or so is really the acceleration in cloud adoption, digital transformation as really a survival mechanism that many industries undertook. And we saw all of us go remote, or many of us, and be dependent on cloud-based collaboration tools, for example. I'm curious, in the pharmaceutical industry again, as I said, you know, we know the big three or four household names that many of us have been following for the last 14 months or so. What have you seen in terms of acceleration? In pharma, companies going, all right, we need to figure out where we are in this maturity model. We need to be able to accelerate, you know, drug discovery, be able to get access to data. Has that accelerated in the COVID era? >> COVID has been the great catalyst of all time for this industry. Ah, and I think it was a wake-up call for a lot of people in the industry to recognize that, uh, just because we have the highest quality standards and we have the highest level of compliance requirements and, um, we ultimately all think about efficacy and patient safety as our goal to achieve the highest levels of quality. Everyone agrees with that. What the realization was is that we do not have the capacity in any geography or with any company, um, to meet the demands that we're seeing today: demands to get product to market, demands to get the supply chain right and make it work for manufacturing. The, uh, the opportunity to partner to get there, you know, you can see that by the way companies came together to partner for COVID-19 vaccine manufacturing production. And so, um, it was a wake-up call that it's time to get over the kind of cultural barriers, risk aversion, and really come together to coalesce around a smart manufacturing strategy that has to be combined with GxP or good manufacturing compliance standards. And that has to be designed into the technology and manufacturing processes together. That's Pharma 4.0. >> Got it. Thank you. Let's dig in more to that GxP compliance. And you guys, we talk about that in different industries, the X being, you know, X for X type of industry. Talk to me about the compliance regulations and your GxP AI platform and how you guys, built on top of AWS, help customers evolve their maturity and facilitate compliance. >> Absolutely. 
So as I alluded to earlier, one of the biggest challenges is just getting the data together in a place that you can actually manage it. And because there's so many legacy systems and on predominantly on prem technologies and use today, cloud is starting to gain a lot more traction, but it's been limited to uh kind of tier two and tier three data. Uh so now we're seeing uh you know, the recognition that uh just having a data link isn't enough. And so uh we have to overcome, you know, the biggest barrier is really a version to change and change management is really a huge part of any customer being successful. And I think with a W S and us, we were working together to help customers customers understand the type of change management that's required. It's not enough to say, well, we're going to apply the old techniques and processes and use new technology. It just doesn't work that way. If you're adding people uh, and scaling up people just to do validation, worked on a brand new platform, like AWS offers, like we offer on top of AWS, you just won't get three return on investment, you won't get the outcomes and results you're targeting. Uh you have to really have a full strategy in place. Um but you can, and start in small ways, you can start to get traction with use cases that might not have the a huge impact that you're looking for, but it's a way to get started. And uh, the AWS platform is, you know, a great way to look at um, a strategy to scale manufacturing not just in one site but across multiple sites because it's really a data management strategy uh for us using US components uh to build our data collection technology was the starting point. So how do you bring this day together and make it easy and with low overhead and begin to use Ai at the point of collection? So we built our technology with AWS components to do that it's called we call them be data feeders and those are agents that go out and collect that data and bring it together. We also because of the way at AWS innovated around data management we can use a multitude of components to continue to build capabilities on top of what we have today. So we're excited to partner to follow the AWS Roadmap but also continue to add value to what A. W. S. Does today for customers. >>Right? Seems very symbiotic but also your gives you the platform gives you the agility and flexibility that you need to turn things on a dime. I like how you said Covid was a catalyst. I've been saying that for a year now there are things that it has catalyzed for the good and one of those that we've seen repeatedly is that the need for real time data access in many industries like life sciences and pharma is no longer a nice to have but it's incredibly challenging to get real time access to high quality data. Be able to run analytics on that you know, identify where the supply chain in the manufacturing process. For example things can be optimized. Give me an example or some examples of some of the use cases that you guys are working with customers on. I imagine things like that to process optimization, anomaly detection. But what are some of those key use cases in which you really excel? >>Well, it all starts with with what we can do around predictions. There's a lot of data science work being done today, understand variability and how to reduce deviations and how to get more um of predictions to know what is expected to happen. Uh But a lot of that doesn't get applied to the processes. 
It's not applied as a change the process because that requires revalidation of that entire process. Our platform brings huge value to customers and partners because we do the qualification and validation on the platform in real time. And so that eliminates the needs to go back out and deploy people and uh track and re document uh and re validate what's going on in the process. So that that just takes a huge uh responsibility in some cases liabilities off off of the operators and uh the folks analyzing the data. So that's that's really to get to real time. You have to think carefully about how to apply apply ai because a I was developed in a scientific way but you also have to apply it in a scientific way to to these critical processes in manufacturing. And so that's that's only done uh on a platform, you can't do it on a kind of a stand alone basis. You have to leverage a platform because you're analysing changes to the data and to the code being used to collect and analyze the data that all has to be documented. And that's that's done by our capabilities are using to audit or create audit trails uh to any changes that are happening in the process. And so that's a critical critical process monitoring capability. That is almost impossible to do manually. Uh Some some would say it's impossible to do manually. Uh so uh the the ability to to qualify algorithms to validate in real time enables real time manufacturing and there's a F. D A. Uh I would I would say mandate but guidance called continuous process verification cPV that they will be coming out with additional guidance on that this year. That's really there to uh tell tell manufacturers that they should be getting to real time capabilities. They should be driving their investments and and types of deployments to get to real time manufacturing. That's the only way you can predict deviations and predict anomalies and deal with them in the process and track it. >>So give me give me a snapshot of a customer or two that you've worked with in the last year as they were rapidly evolving and adjusting to the changes going on. How did you help some of these customers extract more value from their pharma manufacturing processes, understand what it is that they need to do to embrace A. I. And get to that real time. >>Absolutely. So, you know, most of our customers are facing the challenge and dilemma that just adding more people and more resources and even upgrading existing technologies or adding more data scientist has a limit. They've reached the limit of improvement that they can make to these processes in the output in manufacturing. So the next natural step would be to say, okay, what science can I apply here and what technology is available To really get to that next one or two improvement in the processes. And it's really critical to look at um you know, not just one use case, but how can I address multiple problems using the same technology? So bringing multi variant uh multi variable excuse me. Um analysis capabilities um is is something that's done in every other industry um but it has not been applied here in terms of changing how manufacturing works today. We can do that, we can we can do multi variable analysis in real time, we can predict what will happen. We can actually alert the operator to make changes to the process based on uh a number of predictions of what will happen in a batch or series of matches in manufacturing. 
We also bring unstructured data into those calculations that wasn't possible before cloud technology came along and before a I was deployed. Um So now we can look at environmental inputs, we can look at um upstream data that can be used for improving um you know, the yield on batches. So the you know, the main um focus today is you know, how do I get, reduce my risk around asset management? How can I improve visibility into the supply chain? How can I reduce deviations in these processes? How can I get more yield? How can I optimize the yield uh in any given batch uh to improve uh you know, the entire process but also reduce costs in each step of the way. Uh So uh the good news is that when you apply our technology and our know how uh there's an immediate positive impact. There's a customer, we're working with very large customer where we walked in and they said we have this problem, we've reached a certain level of optimization and yield. We can't seem to get it to go any higher. and within six weeks we had a solution in place and we are saving them tens of millions of dollars in material loss just in that once one step in the process that's worth hundreds of millions of dollars in terms of finished product. Uh and if you apply that across multiple lines and across multiple manufacturing sites for that customer, we're talking hundreds of millions of dollars of savings, um >>significant impact, significant business impact that your customers I saw on the website, you know, R. O. I. And was at six when I get this right. I had it here somewhere um quite quickly. But the key thing there is that these organizations actually are really moving their business forward. You just gave some great examples of how you can do that. And just kind of a phase one of the project. Let me ask you this in in a post Covid world, assuming we'll get there hopefully soon. Where is in your opinion? Um Ai and ml for pharma companies, is it going to be something that is is for those that adopt it and adopt all the change management needed to do that? Is it going to be kind of the factor in deciding the winners and the losers of tomorrow? Okay, >>well, I don't want to lay down predictions like that, but I would, what I would say is uh all of thought leaders out there have have openly shared and privately shared that this is exactly where the industry has to go to meet the demands. Not just of ramping up COVID-19 vaccine production on a global basis, which we have to do. It's also dealing with how do we how do we uh scale up for personalized medicine, which requires small, small batch manufacturing? How do we turn over lines of manufacturing more efficiently to get more drugs to market more different types of drugs to market, how to contract manufacturers deal with all these pressures, um, and still serve their customers and innovate. Uh, there's also the rise of generics there, you know, that's bringing on cost pressures for big pharma particularly. And so these are all moving the industry in the right direction to respond to these on an individual basis. Would would definitely require the use of Ai and Ml But when you bring it all together, there's a huge huge of push for finding and finding breakthroughs to increase capacity and quality at the same time. >>Yeah, tremendous opportunity. My last question for you, john is a bit more on the personal side. I know you're a serial entrepreneur. What drew you to a zon when you have the opportunity? I can only imagine based on some of the things that you've said. 
But what was it that made you say, this is my next great opportunity? >> That's a great question, because I asked myself that question. So having been in the industry for a long time, having been with very innovative companies my whole career, I knew that manufacturing had fallen behind even further in terms of innovating using the latest cloud technologies and AI in particular. I knew that from running another company that focused on the use of predictive analytics. And so given all the vectors coming together, the market pressure that's happening, the technology absolutely being at a maturity level where we could make these things a reality for customers, the size of the challenge and market opportunity was just overwhelming. It was enough to make me jump in with both feet. So I'm very happy to be leading such a great team and amazing talent at Aizon, and super excited about our partnership with AWS and where that's going, and solving very complex and very critical challenges that our customers are facing, together as partners. >> Absolutely. Well, John, thank you for joining me today and talking to us about who Aizon is, what you're doing, particularly in pharma and life sciences smart manufacturing, and what you're enabling in a COVID-catalyzed sort of way. We appreciate you joining us here today. >> This has been a pleasure. Thanks for having me. >> Likewise. For John Vitalie, I'm Lisa Martin. You're watching theCUBE.

Published Date : May 18 2021


Omer Asad, HPE ft Matt Cadieux, Red Bull Racing full v1 (UNLISTED)


 

(upbeat music) >> Edge computing is projected to be a multi-trillion dollar business. It's hard to really pinpoint the size of this market let alone fathom the potential of bringing software, compute, storage, AI and automation to the edge and connecting all that to clouds and on-prem systems. But what is the edge? Is it factories? Is it oil rigs, airplanes, windmills, shipping containers, buildings, homes, race cars. Well, yes and so much more. And what about the data? For decades we've talked about the data explosion. I mean, it's a mind-boggling but guess what we're going to look back in 10 years and laugh what we thought was a lot of data in 2020. Perhaps the best way to think about Edge is not as a place but when is the most logical opportunity to process the data and maybe it's the first opportunity to do so where it can be decrypted and analyzed at very low latencies. That defines the edge. And so by locating compute as close as possible to the sources of data to reduce latency and maximize your ability to get insights and return them to users quickly, maybe that's where the value lies. Hello everyone and welcome to this CUBE conversation. My name is Dave Vellante and with me to noodle on these topics is Omer Asad, VP and GM of Primary Storage and Data Management Services at HPE. Hello Omer, welcome to the program. >> Thanks Dave. Thank you so much. Pleasure to be here. >> Yeah. Great to see you again. So how do you see the edge in the broader market shaping up? >> Dave, I think that's a super important question. I think your ideas are quite aligned with how we think about it. I personally think enterprises are accelerating their sort of digitization and asset collection and data collection, they're typically especially in a distributed enterprise, they're trying to get to their customers. They're trying to minimize the latency to their customers. So especially if you look across industries manufacturing which has distributed factories all over the place they are going through a lot of factory transformations where they're digitizing their factories. That means a lot more data is now being generated within their factories. A lot of robot automation is going on, that requires a lot of compute power to go out to those particular factories which is going to generate their data out there. We've got insurance companies, banks, that are creating and interviewing and gathering more customers out at the edge for that. They need a lot more distributed processing out at the edge. What this is requiring is what we've seen is across analysts. A common consensus is this that more than 50% of an enterprises data especially if they operate globally around the world is going to be generated out at the edge. What does that mean? New data is generated at the edge what needs to be stored. It needs to be processed data. Data which is not required needs to be thrown away or classified as not important. And then it needs to be moved for DR purposes either to a central data center or just to another site. So overall in order to give the best possible experience for manufacturing, retail, especially in distributed enterprises, people are generating more and more data centric assets out at the edge. And that's what we see in the industry. >> Yeah. We're definitely aligned on that. There's some great points and so now, okay. You think about all this diversity what's the right architecture for these multi-site deployments, ROBO, edge? How do you look at that? >> Oh, excellent question, Dave. 
Every customer that we talk to wants simplicity, and no pun intended, because SimpliVity resonates with a simple, edge-centric architecture, right? Let's take a few examples. You've got large global retailers; they have hundreds of global retail stores around the world that are generating data, that are producing data. Then you've got insurance companies, then you've got banks. So when you look at a distributed enterprise, how do you deploy in a very simple and easy-to-deploy manner, easy to lifecycle, easy to mobilize, easy to lifecycle equipment out at the edge? What are some of the challenges that these customers deal with? These customers, you don't want to send a lot of IT staff out there, because that adds cost. You don't want to have islands of data and islands of storage in remote sites, because that adds a lot of state outside of the data center that needs to be protected. And then last but not the least, how do you push lifecycle-based applications, new applications, out at the edge in a very simple-to-deploy manner? And how do you protect all this data at the edge? So the right architecture, in my opinion, needs to be extremely simple to deploy: storage, compute and networking out towards the edge in a hyperconverged environment. So we agree upon that; it's a very simple-to-deploy model. But then comes, how do you deploy applications on top of that? How do you manage these applications on top of that? How do you back up these applications back towards the data center? All of this keeping in mind that it has to be as zero-touch as possible. We at HPE believe that it needs to be extremely simple: just give me two cables, a network cable and a power cable, fire it up, connect it to the network, push its state from the data center and back up its state from the edge back into the data center. Extremely simple. >> It's got to be simple 'cause you've got so many challenges. You've got physics that you have to deal with, you have latency to deal with. You've got RPO and RTO. What happens if something goes wrong? You've got to be able to recover quickly. So that's great. Thank you for that. Now you guys have some news. What is new from HPE in this space? >> Excellent question, great. So from a deployment perspective, HPE SimpliVity is just gaining, like, it's exploding like crazy, especially as distributed enterprises adopt it as their standardized edge architecture, right? It's an HCI box; it's got storage, compute and networking all in one. But now what we have done is, not only can you deploy applications all from your standard vCenter interface from a data center, what we have now added is the ability to back up to the cloud right from the edge. You can also back up all the way back to your core data center. All of the backup policies are fully automated and implemented in the distributed file system that is the heart and soul of the SimpliVity installation. In addition to that, the customers now do not have to buy any third-party software. Backup is fully integrated in the architecture, and it's efficient. In addition to that, now you can back up straight to the cloud. You can back up to a central high-end backup repository which is in your data center. And last but not least, we have a lot of customers that are pushing the limit in their application transformation.
So where previously we were only leveraging VMware deployments out at the edge site, we have now evolved and also added both stateful and stateless container orchestration, as well as data protection capabilities for containerized applications out at the edge. So we have a lot of customers that are now deploying containers, rapid manufacturing containers, to process data out at remote sites. And that allows us to not only protect those stateful applications but back them up back into the central data center. >> I saw in that chart there was a line, no egress fees. That's a pain point for a lot of CIOs that I talk to. They grit their teeth at those fees. So can you comment on that? >> Excellent question. I'm so glad you brought that up, and sort of picked up on that point. So along with SimpliVity, we have the whole GreenLake as-a-service offering as well, right? So what that means, Dave, is that we can literally provide our customers edge as a service. And when you complement that with Aruba wired and wireless infrastructure that goes at the edge, and the hyperconverged infrastructure as part of SimpliVity that goes at the edge, one of the things that was missing with cloud backups is that every time you back up to the cloud, which is a great thing by the way, anytime you restore from the cloud, there is that egress fee, right? So as a result of that, as part of the GreenLake offering, we have a cloud backup service natively offered as part of HPE, which is included in your HPE SimpliVity edge-as-a-service offering. So now not only can you back up into the cloud from your edge sites, but you can also restore back, without any egress fees, from HPE's data protection service. Either you can restore it back onto your data center, or you can restore it back towards the edge site, and because the infrastructure is so easy to deploy and centrally lifecycle managed, it's very mobile. So if you want to deploy and recover to a different site, you could also do that. >> Nice. Hey, Omer, can you double-click a little bit on some of the use cases that customers are choosing SimpliVity for, particularly at the edge, and maybe talk about why they're choosing HPE? >> Excellent question. So one of the major use cases that we see, Dave, is obviously easy to deploy and easy to manage in a standardized form factor, right? A lot of these customers, like for example, we have a large retailer across the US with hundreds of stores across the US, right? Now you cannot send service staff out to each of these stores, and their data center is essentially just a closet for these guys, right? So how do you have a standardized deployment? Standardized deployment from the data center, which you can literally push out, and you can connect a network cable and a power cable and you're up and running, and then automated backup, elimination of backup state, and DR from the edge sites into the data center. So that's one of the big use cases: to rapidly deploy new stores, bring them up in a standardized configuration, both from a hardware and a software perspective, and the ability to back up and recover that instantly. That's one large use case. The second use case that we see actually refers to a comment that you made in your opener, Dave: a lot of these customers are generating a lot of the data at the edge. This is robotics automation that is going on in manufacturing sites. These are racing teams that are out at the edge doing post-processing of their cars' data.
At the same time, there are disaster recovery use cases where you have campsites and local agencies that go out there for humanity's benefit, and they move from one site to the other. It's a very, very mobile architecture that they need. So those are just a few cases where we were deployed. There was a lot of data collection and there was a lot of mobility involved in these environments, so you need to be quick to set up, quick to back up, quick to recover, and essentially you're off to your next move. >> You seem pretty pumped up about this new innovation, and why not? >> It is, especially because it has been thought through with edge in mind, and edge has to be mobile. It has to be simple. And especially as we have lived through this pandemic, which I hope we see the tail end of in at least 2021, or at least 2022. One of the most common use cases that we saw, and this was an accidental discovery: a lot of the retail sites could not go out to service their stores, because mobility is limited in these strange times that we live in. So from a central data center you're able to deploy applications, you're able to recover applications. And a lot of our customers said, hey, I don't have enough space in my data center to back up. Do you have another option? So then we rolled out this update release to SimpliVity where, from the edge site, you can now directly back up to our backup service, which is offered on a consumption basis to the customers, and they can recover that anywhere they want. >> Fantastic. Omer, thanks so much for coming on the program today. >> It's a pleasure, Dave. Thank you. >> All right. Awesome to see you. Now, let's hear from Red Bull Racing, an HPE customer that's actually using SimpliVity at the edge. (engine revving) >> Narrator: Formula One is a constant race against time, chasing tenths of seconds. (upbeat music) >> Okay. We're back with Matt Cadieux, who is the CIO of Red Bull Racing. Matt, it's good to see you again. >> Great to see you, Dave. >> Hey, we're going to dig in to a real-world example of using data at the edge, in near real time, to gain insights that really lead to competitive advantage. But first, Matt, tell us a little bit about Red Bull Racing and your role there. >> Sure. So I'm the CIO at Red Bull Racing, and at Red Bull Racing we're based in Milton Keynes in the UK. And the main job for us is to design a race car, to manufacture the race car, and then to race it around the world. So as CIO, the IT group needs to develop the applications used in design, manufacturing and racing. We also need to supply all the underlying infrastructure and also manage security. So it's a really interesting environment that's all about speed. This season we have 23 races, and we need to tear the car apart and rebuild it to a unique configuration for every individual race. And we're also designing and making components targeted for races. So 23 immovable deadlines, this big evolving prototype to manage with our car, but we're also improving all of our tools and methods and software that we use to design, make and race the car. So we have a big can-do attitude at the company around continuous improvement. And the expectations are that we continue to, say, make the car faster, that we're winning races, that we improve our methods in the factory and our tools. And so for IT it's really unique in that we can be part of that journey and provide a better service. It's also a big challenge to provide that service and to give the business the agility it needs.
So my job is really to make sure we have the right staff, the right partners, the right technical platforms. So we can live up to expectations. >> And Matt that tear down and rebuild for 23 races, is that because each track has its own unique signature that you have to tune to or are there other factors involved? >> Yeah, exactly. Every track has a different shape. Some have lots of straight, some have lots of curves and lots are in between. The track surface is very different and the impact that has on tires, the temperature and the climate is very different. Some are hilly, some have big curbs that affect the dynamics of the car. So all that in order to win you need to micromanage everything and optimize it for any given race track. >> COVID has of course been brutal for sports. What's the status of your season? >> So this season we knew that COVID was here and we're doing 23 races knowing we have COVID to manage. And as a premium sporting team with Pharma Bubbles we've put health and safety and social distancing into our environment. And we're able to able to operate by doing things in a safe manner. We have some special exceptions in the UK. So for example, when people returned from overseas that they did not have to quarantine for two weeks, but they get tested multiple times a week. And we know they're safe. So we're racing, we're dealing with all the hassle that COVID gives us. And we are really hoping for a return to normality sooner instead of later where we can get fans back at the track and really go racing and have the spectacle where everyone enjoys it. >> Yeah. That's awesome. So important for the fans but also all the employees around that ecosystem. Talk about some of the key drivers in your business and some of the key apps that give you competitive advantage to help you win races. >> Yeah. So in our business, everything is all about speed. So the car obviously needs to be fast but also all of our business operations need to be fast. We need to be able to design a car and it's all done in the virtual world, but the virtual simulations and designs needed to correlate to what happens in the real world. So all of that requires a lot of expertise to develop the simulations, the algorithms and have all the underlying infrastructure that runs it quickly and reliably. In manufacturing we have cost caps and financial controls by regulation. We need to be super efficient and control material and resources. So ERP and MES systems are running and helping us do that. And at the race track itself. And in speed, we have hundreds of decisions to make on a Friday and Saturday as we're fine tuning the final configuration of the car. And here again, we rely on simulations and analytics to help do that. And then during the race we have split seconds literally seconds to alter our race strategy if an event happens. So if there's an accident and the safety car comes out or the weather changes, we revise our tactics and we're running Monte-Carlo for example. And use an experienced engineers with simulations to make a data-driven decision and hopefully a better one and faster than our competitors. All of that needs IT to work at a very high level. >> Yeah, it's interesting. I mean, as a lay person, historically when I think about technology in car racing, of course I think about the mechanical aspects of a self-propelled vehicle, the electronics and the light but not necessarily the data but the data's always been there. Hasn't it? 
I mean, maybe in the form of, like, tribal knowledge, if you are somebody who knows the track and where the hills are, and experience and gut feel. But today you're digitizing it and you're processing it in close to real time. It's amazing. >> I think that's exactly right. Yeah. The car's instrumented with sensors, we post-process, and we are doing video image analysis, and we're looking at our car and competitors' cars. So there's a huge amount of very complicated models that we're using to optimize our performance and to continuously improve our car. Yeah, the data and the applications that leverage it are really key, and that's a critical success factor for us. >> So let's talk about your data center at the track, if you will. I mean, if I can call it that. Paint a picture for us: what does that look like? >> So we have to send a lot of equipment to the track, at the edge. And even though we have a really great wide-area network link back to the factory, and there are cloud resources, a lot of the tracks are very old. You don't have hardened infrastructure, you don't have ducts that protect cabling, for example, and you can lose connectivity in remote locations. So the applications we need to operate the car and to make really critical decisions, all that needs to be at the edge where the car operates. So historically we had three racks of equipment, like I said, infrastructure, and it was really hard to manage, to make changes; it was too inflexible. There were multiple panes of glass, and it was too slow; it didn't run our applications quickly. It was also too heavy and took up too much space when you're cramped into a garage with lots of environmental constraints. So we'd introduced hyperconvergence into the factory and seen a lot of great benefits, and when the time came to refresh our infrastructure at the track, we stepped back and said, there's a lot smarter way of operating. We can get rid of all the slow, inflexible, expensive legacy and introduce hyperconvergence. And we saw really excellent benefits for doing that. We saw up to a 3x speedup for a lot of our applications. So here, where we're post-processing data and we have to make decisions about race strategy, time is of the essence, and the 3x reduction in processing time really matters. We also were able to go from three racks of equipment down to two racks of equipment, and the storage efficiency of the HPE SimpliVity platform, with 20-to-one ratios, allowed us to eliminate a rack. And that actually saved $100,000 a year in freight costs by shipping less equipment. Things like backup: mistakes happen, sometimes a user makes a mistake. So for example, a race engineer could load the wrong data map into one of our simulations, and we could restore that VDI through SimpliVity backup in 90 seconds. And this enables engineers to focus on the car, to make better decisions, without having downtime. And we send two IT guys to every race. They're managing 60 users in a really diverse environment, juggling a lot of balls, and having a simple management platform like HPE SimpliVity gives us allows them to be very effective and to work quickly. So all of those benefits were a huge step forward relative to the legacy infrastructure that we used to run at the edge. >> Yeah. So you had the nice Petri dish in the factory. So it sounds like your goals are obviously, the number one KPI is speed, to help shave seconds off time, but also cost, just the simplicity of setting up the infrastructure is-- >> That's exactly right. It's speed, speed, speed.
So we want applications to absolutely fly, to get to actionable results quicker, get answers from our simulations quicker. The other area where speed's really critical is that our applications are also evolving prototypes, and the models are getting bigger. The simulations are getting bigger and they need more and more resource, and being able to spin up resource and provision things without being a bottleneck is a big challenge, and SimpliVity gives us the means of doing that. >> So did you consider any other options, or, because you had the factory knowledge, was HCI very clearly the option? What did you look at? >> Yeah, so we have over five years of experience in the factory, and we eliminated all of our legacy infrastructure five years ago. And the benefits I've described at the track, we saw that in the factory. At the track we have a three-year operational life cycle for our equipment. 2017 was the last year we had legacy, and as we were building for 2018, it was obvious that hyperconverged was the right technology to introduce, and we'd had years of experience in the factory already. And the benefits that we see with hyperconverged actually mattered even more at the edge, because our operations are so much more pressurized. Time is even more of the essence. And so speeding everything up at the really pointy end of our business was really critical. It was an obvious choice. >> Why SimpliVity? Why'd you choose HPE SimpliVity? >> Yeah. So when we first heard about hyperconverged, way back in the factory, we had a legacy infrastructure: overly complicated, too slow, too inflexible, too expensive. And we stepped back and said there has to be a smarter way of operating. We went out and challenged our technology partners, we learned about hyperconvergence, and wondered whether the hype was real or not. So we underwent some POCs and benchmarking, and the POCs were really impressive, with all these speed and agility benefits we saw, and HPE, for our use cases, was the clear winner in the benchmarks. So based on that we made an initial investment in the factory. We moved about 150 VMs and 150 VDIs into it. And then, as we've seen all the benefits, we've successfully invested, and we now have an estate in the factory of about 800 VMs and about 400 VDIs. So it's been a great platform, and it's allowed us to really push boundaries and give the business the service it expects. >> Awesome fun stories. Just coming back to the metrics for a minute. So you're running Monte Carlo simulations in sort of near real time, and essentially, if I understand it, that's what-ifs, and it's the probability of the outcome, and then somebody's got to make the call, the human's got to say, okay, do this, right? Was the time in which you were able to go from data to insight to recommendation, or edict, compressed? You kind of indicated that. >> Yeah, that was accelerated. And so in that use case, what we're trying to do is predict the future, and you're saying, well, before any event happens, you're doing what-ifs, and if it were to happen, what would you probabilistically do? So that simulation we've been running for a while, but it gets better and better as we get more knowledge. And so we were able to accelerate that with SimpliVity, but there are other use cases too. So we also have telemetry from the car, and we post-process it, and that reprocessing time is very time consuming. And we went from eight or nine minutes for some of the simulations down to just two minutes.
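As an aside for readers unfamiliar with the what-if simulations described above, here is a minimal Monte Carlo sketch of the general idea: simulate a race many times under two candidate pit strategies with random variation, and compare how often each one comes out ahead. Every number in it (lap times, tyre degradation, pit loss, safety-car odds) is an invented assumption for illustration, not Red Bull Racing's actual strategy models.

# Illustrative sketch only: a toy Monte Carlo comparison of two pit strategies.
# All numbers are invented for illustration and bear no relation to any
# real team's models.
import random

LAPS = 50
PIT_LOSS = 21.0          # seconds lost in a pit stop (assumed)
SAFETY_CAR_PROB = 0.02   # per-lap chance of a safety car (assumed)

def race_time(pit_lap: int) -> float:
    """Simulate one race for a single car pitting once on pit_lap."""
    total, tyre_age = 0.0, 0
    for lap in range(1, LAPS + 1):
        lap_time = 90.0 + 0.08 * tyre_age + random.gauss(0, 0.3)
        if random.random() < SAFETY_CAR_PROB:
            lap_time += 15.0          # slow lap behind the safety car
        if lap == pit_lap:
            lap_time += PIT_LOSS
            tyre_age = 0
        total += lap_time
        tyre_age += 1
    return total

def win_probability(strategy_a: int, strategy_b: int, runs: int = 10000) -> float:
    """Fraction of simulated races in which strategy A beats strategy B."""
    wins = sum(race_time(strategy_a) < race_time(strategy_b) for _ in range(runs))
    return wins / runs

print(f"P(pit on lap 18 beats pit on lap 30) = {win_probability(18, 30):.2f}")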
So we saw big, big reductions in time, and ultimately that meant an engineer could understand what the car was doing in a practice session, recommend a tweak to the configuration or setup of it, and just get more actionable insight quicker. And it ultimately helps get a better car quicker. >> Such a great example. How are you guys feeling about the season, Matt? What's the team's sentiment? >> I think we're optimistic. We think from our simulations that we have a great car. We have a new driver lineup: we have Max Verstappen, who carries on with the team, and Sergio Perez joins the team. So we're really excited about this year and we want to go and win races. And I think with COVID, people are just itching to get back to a little degree of normality, and going racing again, even though there's no fans, gets us into a degree of normality. >> That's great, Matt. Good luck this season and going forward, and thanks so much for coming back on theCUBE. Really appreciate it. >> It's my pleasure. Great talking to you again. >> Okay. Now we're going to bring back Omer for a quick summary. So keep it right there. >> Narrator: That's where the data comes face to face with the real world. >> Narrator: Working with Hewlett Packard Enterprise is a hugely beneficial partnership for us. We're able to be at the cutting edge of technology in a highly technical, highly stressed environment. There is no bigger challenge than Formula One. (upbeat music) >> Being in the car and driving it on the limit, that is the best thing out there. >> Narrator: It's that innovation and creativity that ultimately achieves the win. >> Okay. We're back with Omer. Hey, what did you think about that interview with Matt? >> Great. I have to tell you, I'm a big Formula One fan, and they are one of my favorite customers. So obviously, one of the biggest use cases, as you saw for Red Bull Racing, is trackside deployments. There are now 22 races in a season. These guys are jumping from one city to the next; they've got to pack up, move to the next city, set up the infrastructure very, very quickly. An average Formula One car is running a thousand-plus sensors that are generating a ton of data trackside that needs to be collected very quickly. It needs to be processed very quickly, and then sometimes, believe it or not, snapshots of this data need to be sent to the Red Bull factory, back at the data center. What does this all need? It needs reliability. It needs compute power in a very small form factor. And it needs agility: quick to set up, quick to go, quick to recover. And then in post-processing, they need to have CPU density so they can pack more VMs out at the edge to be able to do that processing. And we accomplish that for the Red Bull Racing guys with basically two SimpliVity nodes that are running trackside and moving with them from one race to the next race to the next race. And every time those SimpliVity nodes connect up to the data center, or connect up to a satellite, they're backing up back to their data center. They're sending snapshots of data back to the data center, essentially making their job a whole lot easier, where they can focus on racing and not on troubleshooting virtual machines. >> Red Bull Racing and HPE SimpliVity. Great example. It's agile, it's cost efficient, and it shows a real impact. Thank you very much, Omer. I really appreciate those summary comments. >> Thank you, Dave. Really appreciate it. >> All right. And thank you for watching. This is Dave Vellante for theCUBE.
(upbeat music)

Published Date : Mar 5 2021


Faramarz Mahdavi, Cadence Design Systems | Nutanix .NEXT Conference 2019


 

>> Live from Anaheim, California, it's theCUBE, covering Nutanix .NEXT 2019. Brought to you by Nutanix. >> Welcome back, everyone, to theCUBE's live coverage of Nutanix .NEXT here in Anaheim, California. I'm your host, Rebecca Knight, along with my co-host, John Furrier. We're joined by Faramarz Mahdavi. He is the senior group director at Cadence Design Systems. Thank you so much for coming on theCUBE. So tell our viewers a little bit about Cadence, based in San Jose. Tell our viewers a little bit about your company, Cadence Design Systems. >> So Cadence has been an EDA company in the Valley for about thirty years. We make software to enable semiconductor companies to design, test and build chips. So most technology that you've bought, that you see in a Fry's Electronics, has some Cadence solution in it. >> So you guys had a lot of legacy, and you're talking about the Nutanix relationship. >> So our journey with Nutanix started about three years ago. I'd actually explored Nutanix at a previous company. I've been with Cadence three and a half years. So I liked it, but there was really no opportunity to do much at that time; the company was very new at the time. But at Cadence, we identified some opportunities to explore Nutanix, and it's been a great experience so far. We actually are running a lot of our critical business applications on Nutanix. So we're all in. >> What was the door opener for you guys there at Cadence? >> The overall architecture looked good at a presentation level, so it was worth exploring. But, you know, it's a new company, a new architecture, so you have to kind of go into it carefully. So it was a matter of identifying opportunities that were maybe not production, not super business critical, to start. But as time goes on, you build confidence and you do more and more. So today we're using Nutanix, as I said, for business applications. We're using it for VDI; a lot of our end-user desktop instances are running on Nutanix today. We use it as well for a lot of our shared services; you know, the DNS, Active Directory, those sorts of services are running on it. So, you know, we're looking for more and more opportunities to expand it. >> So I always like to know how this actually helps you and your company. Do people do their jobs better, more quickly, more efficiently, more productively? Can you sort of walk us through what life was like before Nutanix and what life is like now, in terms of the staffing and the overhead? >> So I would say there's a couple of different, you know, big benefits. One is, we're in a cloud era, right? So a lot of companies are looking for workloads to move to the public cloud, and we're no different. We're constantly looking at what makes sense in the public cloud and what makes sense on-prem. So from a support and skill sets standpoint, it's very important to be consistent. I basically have the same support model for on-prem as well as public cloud. So that's one big benefit that Nutanix offers, because the skill sets to support, let's say, an AWS environment are the same as, you know, the Nutanix support environment. The other critical thing is, just like any IT organization, we're challenged with limited resources, you know, doing more with less. So the ease of administration, ease of support, just the inherent reliability of the technology allows our staff to, you know, sleep more at night and, you know, work less often during the weekend. So the overall support overhead has reduced significantly. Those are the biggest things, I would say. >> Those are two very important things. >> Those are the two biggest things that we went into this engagement with. But, you know, we're pleasantly surprised that performance has exceeded our expectations. You know, I did expect reliability; I didn't quite expect this level of performance improvement, so that's been excellent. So again, we're looking for more and more opportunities to expand it, just given that experience. >> He said the staff sleeps well at night. How have they reacted? What are some other anecdotes from the staff, freed-up time, more free time? What was some of the feedback from your team? >> Well, I mean, I don't want to give the wrong impression, it's not like they're not working. >> Yeah, right. >> The scenario aside, you know, I would say it's gone from crazy environments to something a little more humane. So I think, not only with the staff, just across the company, you have those who kind of buy in and go into it positively, and others who are more reluctant. And that's no different with the support staff. So I think just their own confidence level and, you know, their desire to do more with Nutanix has increased as they've had more experience with it. >> It's interesting. I did a panel yesterday with some customers from Nutanix, and it was a mix: a big bank, a midsized company, and a good, big corporate kind of IT. And it's very interesting. Where there was more legacy, there were a lot of dependencies, and they were looking at time frames for pushing stuff out, like eight weeks to two months, down to two hours. So they went from eight weeks for pushing any kind of rule propagation or any kind of new stuff, down to two hours, and that was a huge number. Are you guys seeing anything in terms of performance and improvement on the time side with Nutanix? What are some of the things that you're getting, benefits-wise, operationally? >> Well, the more we do, the more cookie cutter it becomes. So, you know, each migration is easier and faster and so on. And that also assists with confidence, right? The very first critical business application that we moved to Nutanix, the level of testing we did was insane. Now it's less. So, for multiple reasons, that migration experience is much more efficient, much, much quicker today than it was early on. >> One of the things we hear too, Rebecca, was, you know, new channels, the new vendor, you mentioned, new company. They're ten years old, so still new relative to the bigger guys. Getting it pushed, getting it through, getting it approved, getting confidence from executive management around, wait, this is a new company, what are the benefits? All kinds of gyrations of approvals and sometimes politics and, you know, legacy kind of factors in. How does that work on your end? How did that go? Was getting Nutanix through a struggle? Take us through that. >> So as you mentioned, the fact that it's new technology and a new company has its own set of challenges. For some application owners and executives, you know, why take the risk? Why not do the same thing we've always done? So that's one big challenge. The other was, there is a tendency, especially early on when Nutanix was selling it as an appliance as opposed to license only, there is a tendency to view it as a hardware solution, and it's exactly not that; it's the exact opposite of that. It's purely a software solution; that's where the value is. So it's very easy to get pulled into that hardware discussion where people will kind of compare servers and storage versus Nutanix. So you have to kind of change that mindset and show the real value that hyperconvergence provides: the ease of administration, the high performance, reliability and so on. And then, as you make that argument and convince more people, again, you have to, you know, start small and expand. But those were some of the main challenges, I would say. >> When you're talking about the migration experience, and you said, when we moved the first business-critical application it took a long time, we tested it, we really worked at it, now we have a bit more faith that it's going to work out. But can you talk about some best practices that emerged in terms of how to migrate, and migrate well, that maybe other companies could learn from, from Cadence Design Systems? >> Yeah, well, I would say the best practices aren't unique to Nutanix. Any migration process has, you know, various phases in terms of planning, testing and so on. And I think just having that discipline, a well-documented, consistent process, so that you're not starting fresh every time there's a new migration initiative going on. But I think Nutanix makes it easier, especially given the Prism management tool. But I would say it's not particularly unique to Nutanix; the organization just needs to be well disciplined in the migration process. >> One of the things that you mentioned, software, which is a great point, that's a cultural shift: it's not a hardware box, and it probably changes all the best practices around evaluating hardware. Software is becoming more and more central to IT. How do you see it evolving? Because you've got cloud right on the horizon, you've got public cloud benefits, they are clear if you're greenfield, yet there's legacy. We have containers, containerization happening as a trend, lift-and-shift versus, you know, evolved lifecycle management of apps and workloads are now under a new kind of view with software. That is changing. And, you know, as a practitioner in the field now, how do you look at the evolution of how IT is going to change? >> So my side of the house is the infrastructure and operations side, and that tends to be, historically, kind of manual: you know, a different network administrator, storage administrator, system administrator. That is all changing and all becoming more developer skill sets: scripting, automation, things of that sort. So I think that's the biggest change going on in IT today, kind of changing the skill sets and kind of viewing it as a full stack, as opposed to just storage or just network. So having that holistic viewpoint, having the ability to develop automation that works across the stack, I think those are the changes that traditional infrastructure groups need to adapt to. >> I was talking to a customer yesterday, and he was a young guy, I think in his late twenties, and I'm seeing myself, you know, ten years ago he was in high school or college. So you see a new generation coming up where they gravitate towards DevOps, right? And so they get that, so they don't have that dogma: what, we went with this vendor. So they kind of have this new thinking. Any observations that you can share on this younger generation coming in, your new talent that's coming in that's developer-oriented, what they like, what's the work style, what they gravitate to, what are some of the tools they like, what's the mindset? >> So I think they can teach us, to be honest. We have, you know, the older folks like myself have a tendency to look at the way things have always been done, right? So having the fresh viewpoint is great, to kind of come into it with that DevOps mentality, you know, right off the jump. But I think we should kind of welcome that and take advantage of that. You know, Cadence in general is a pretty mature company in terms of our personnel; we don't have that rapid turnover of, you know, our team members. So we're trying to actually, you know, we welcome that new talent so that we can kind of get that DevOps mentality in house and kind of mature it ourselves. So we're in the beginning of that journey. >> How do you work together? Because, I mean, you're not that old, first of all. But this is the time when we have multiple generations working together in the workforce: these digital natives that we were talking about, the people who get technology so innately, who grew up with it, versus the Gen Xers, the boomers who are still there, the Gen Ys that are emerging and graduating now. How is it a challenge at Cadence to get all these people working collaboratively, productively together? >> Well, Cadence is an extremely technical company. Referring to our customers, you know, they're all double-E, you know, master's and doctorate engineers. So it's a very technical environment. We try not to really focus on the technology, actually, but to look at, you know, the business objectives: you know, what are we trying to achieve, what problems are we trying to solve, as opposed to, oh, here's a cool technology, how can we use it? You know, the mindset is a little bit different. We're looking at the business side first and then using technology to solve for those problems. So once you have that focus, regardless of your experience, your age, your background, you work together, you know, to achieve that end goal. >> What do you think about the show? We're here at Nutanix .NEXT Anaheim. What's your verdict so far, the content, the positioning? What's next for you guys? A very loyal customer base, based on what we found; people love the product. What's next? >> I'm very impressed. I wasn't expecting it to be this large. You know, I went to the local, smaller version that was in the area last year. That was pretty impressive too, but this is amazing. I like it because, you know, IT leaders get sales calls all the time, and we kind of get bombarded, so the tendency is to ignore those. This kind of gives us a chance, at our own pace, to kind of see who the key partners are, look for opportunities, and meet some of these other vendors. So it's been both educational as well as kind of entertaining. >> Excellent. Well, thank you so much, Faramarz, for coming on theCUBE. We really appreciated it. >> My pleasure to meet you. Thank you.

Published Date : May 8 2019

SUMMARY :

Brought to you by Nutanix Thank you so much for coming on the Cube. So most technique, you know, technology that you bought, So our journey with Nutanix started about three years ago. What was the door opener for? But, you know, it's a new company. So I always like to know how this actually helps you and your company. So I would say there's a couple of different, you know, um, this engagement with But, you know, What if some other anecdotes from the staff Well, I mean, I don't want to give the wrong impression. Yeah, I write the scenario, but, you know, I would say it's gone from, What are some of the things that you're getting So you One of the things we hear to Rebecca was, you know, new channels. So as you mentioned the fact that it's new technology new company that has its own set of But can you talk about some best practices that emerged in terms of how to Any migration process has, you know, various phases in terms One of the things that you mentioned software, which is great point that cultural shifts, So my side of the house is the infrastructure and operations side, and they tend to be So you So I think they can teach us to be honest way have you know, How do you work together? but to look at, you know, the business objectives, you know? What you think about the show. I like it because, you know, Well, thank you so much. my pleasure to meet you. We will have much more of nutanix next here in Anaheim,

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Rebecca NightPERSON

0.99+

eight weeksQUANTITY

0.99+

John FurrierPERSON

0.99+

San JoseLOCATION

0.99+

Cadence Design SystemsORGANIZATION

0.99+

KatiePERSON

0.99+

Rebecca KnightPERSON

0.99+

two hoursQUANTITY

0.99+

Faramarz MahdaviPERSON

0.99+

RebeccaPERSON

0.99+

AnaheimLOCATION

0.99+

ten yearsQUANTITY

0.99+

two monthsQUANTITY

0.99+

NUTANIXORGANIZATION

0.99+

NutanixORGANIZATION

0.99+

last yearDATE

0.99+

JoePERSON

0.99+

Anaheim, CaliforniaLOCATION

0.99+

CadenceORGANIZATION

0.99+

KatePERSON

0.99+

twoQUANTITY

0.99+

Anaheim, CaliforniaLOCATION

0.99+

yesterdayDATE

0.99+

oneQUANTITY

0.99+

eight lbsQUANTITY

0.99+

nutanixORGANIZATION

0.98+

bothQUANTITY

0.98+

todayDATE

0.98+

TennesseeLOCATION

0.98+

MahdaviPERSON

0.98+

ThioPERSON

0.97+

firstQUANTITY

0.96+

ten years agoDATE

0.96+

Two new tenantsQUANTITY

0.96+

TioPERSON

0.96+

TeoORGANIZATION

0.96+

ElectronicsORGANIZATION

0.96+

two biggestQUANTITY

0.95+

OneQUANTITY

0.94+

CajunsORGANIZATION

0.94+

each migrationQUANTITY

0.93+

ICTSIORGANIZATION

0.93+

EsoORGANIZATION

0.9+

FurrierPERSON

0.89+

three years agoDATE

0.87+

first businessQUANTITY

0.87+

thirty years agoDATE

0.85+

CubeORGANIZATION

0.83+

this late twentiesDATE

0.82+

PrimaORGANIZATION

0.79+

three and a half yearsQUANTITY

0.78+

NutanixEVENT

0.73+

LeeORGANIZATION

0.73+

first critical businessQUANTITY

0.73+

one big big challengeQUANTITY

0.7+

PharmaORGANIZATION

0.69+

twenty nineteenDATE

0.68+

cadenceORGANIZATION

0.67+

NUTANIX TortoORGANIZATION

0.66+

2019EVENT

0.64+

JohnORGANIZATION

0.63+

General WeiORGANIZATION

0.63+

zeroQUANTITY

0.57+

nutanixTITLE

0.57+

NutanixTITLE

0.55+

Jesse Rothstein, ExtraHop | VMworld 2018


 

(pulsing music) >> Live from Las Vegas, it's theCUBE, covering VMworld 2018. Brought to you by VMware and its ecosystem partners. >> Good morning from day three of theCUBE's coverage of VMworld 2018 from the Mandalay Bay, Las Vegas. I'm Lisa Martin, and I'm joined by my co-host, Justin Warren. Good morning, Justin. >> Good morning, Lisa. >> We're excited to welcome to the first time to theCUBE Jesse Rothstein, co-founder and CTO of ExtraHop. Jesse, it's nice to meet you. >> Nice to meet you, Lisa. Thank you for having me. >> Absolutely, so ExtraHop, you guys are up in Seattle. You are one of Seattle's-- >> Sunny Seattle (Jesse chuckles). >> Sunny Seattle. So, one of the best companies up there to work for. Tell us about ExtraHop. What to you guys do in the software space? >> Great. Well, ExtraHop does network traffic analysis, and that can be applied to both performance, performance optimization, as well as cybersecurity. Now, I'm not unbiased, but what I would tell you is that ExtraHop extracts value from the wired data better than anybody else in the world, and that's our fundamental belief. We believe that if you can extract value from that wired data and insights and apply in real-time analytics and machine-learning, then this can be applied to a variety of use cases, as I said. >> That's quite interesting. Some of the use cases we were talking about off camera, some of the things around micro-segmentation, particularly for security, as you mentioned, is really important, and also in software-defined networking, the fact that you are software, and software-defined networking we've had a few guests on theCUBE so far over the last couple of days, that's something which is really experiencing a lot of growth. We have VMware who's talking about their NSX software-defined networking. Maybe you could give us a bit of detail on how ExtraHop helps in those situations. >> Well, I'm paying a lot of attention to VMware's vision and kind of the journey of NSX and software, really software-defined everything, as well as, and within NSX, you see a lot of applications towards security, kind of a zero-trust, least-privileged model, which I think is very exciting, and there's some great trends around that, but as we've also seen, it's difficult to execute. It's difficult to execute to build the policies such that they maybe don't break. From my perspective, a product like ExtraHop, as solution like ExtraHop, we work great with software-defined environments. First, because they have enabled the type of visibility that we offer in that you can tap traffic from a variety of locations for the purposes of analysis. If left to its own devices, I think these increased layers of abstraction and increased kind of policy frameworks have the potential to introduce complexity and to limit visibility, and this is where solutions like ExtraHop can provide a great deal of value. We apply to both your traditional on-prem environment as well as these hybrid and even public cloud environments. The ability to get visibility across a wide range of environments, really pervasively, in the hybrid enterprise is I think a big value that we offer. >> We are at VMworld and on day one, on Monday, Pat Gelsinger talked about the average enterprise has eight or nine clouds. I heard somebody the other day say that they had four and a half clouds. I didn't know you could have a half a cloud, but you can. 
Multi-cloud, a big theme here, that's more the vision and direction that VMware's going to go into, but to your point, customers are living in this world. It's not about embracing it, they're in it. But I think that also, by default, can create silos that enterprises need to understand or to wrap their heads around. To your point, they have to have visibility, because the data is the power and the currency only if you can have visibility into it and actually extract insights and take action. >> Absolutely. ExtraHop customers are primarily large enterprises and carriers, and every single one of them is somewhere on their own cloud journey. You know, maybe they're just beginning it, maybe they're quite mature, maybe they're doing a lot of data center consolidation or some amount of workload migration to public cloud. No matter where they are in that journey, they require visibility into those environments, and I think it's extremely important that they have the same level of visibility that they're accustomed to in their on-prem environment, with their traditional workloads, as well as in these sort of born-in-the-cloud workloads. But I want to stress, visibility for its own sake isn't very useful. Organizations are drowning in data; you can drown in visibility. For us, the real trick is to extract insights and bring them to your attention, and that's where we've been investing in data science and machine-learning for about four and a half to five years. This is before it became as trendy as it is today. >> Superpower, like Pat called it. >> There's so much ML washing; when you walk the show floor, almost every vendor talks about their AI and machine-learning. A lot of it's exaggerated, but what I'll say for ExtraHop, of course, ours is real, and we've been investing in this for years. Our vision was that we had this unbelievable amount of data, and when you're looking at the wired data, you're not just drinking from the firehose, you're drinking from Niagara Falls. You have all of this data, and then with machine-learning, you need to perform feature extraction on the data, that's essentially what data science teams are very good at, and then build the ML models. Our vision was that we don't want to just give you a big pile of data or a bunch of charts and graphs; we actually want to bring things to your attention so that we can say, "Hey, Lisa, look over here, there's something unusual happening here," or in many cases there's a potential threat or there's suspicious behavior, an indicator of compromise. That's where that sort of machine-learning, I believe, is, well, certainly the current horizon or the state of the art for cybersecurity, and it's extremely important. >> Jesse, can you give us an example of one of your enterprise customers and how they've used ExtraHop to manage that complexity that Lisa was talking about, that visibility that they need to get through all the different layers of abstraction, and maybe, if there's one, an example of how they've done some cybersecurity thing, particularly around that machine-learning of detecting an anomaly that they need to deal with? >> Sure, I can think of a lot. One customer of mine, that unfortunately I can't actually name, is a very large retail customer, and what I love about them is they actually have ExtraHop deployed at thousands of retail sites, as well as their data centers and distribution centers.
Not only does ExtraHop give them visibility into the logistics operations, and they've used ExtraHop to detect performance degradation and things like that, that we're preventing them from, literally preventing the trucks from rolling out. But they're also starting to use ExtraHop more and more to monitor what's going on at the retail sites, in particular, looking for potential compromises in the point-of-sale systems. We've another customer that's a large, telco carrier, and they used ExtraHop at one point to actually monitor phone activations, because this is something that can be frustrating if you buy a new phone, and maybe it's an iPhone, and you go to activate it, it has to communicate to all these different servers, it has to perform some sort of activation, and if that process is somehow slow or could take a long time, that's very frustrating to your users and your customers. They needed the ability to see what was happening, and certainly, if it was taking longer than it usually does. That's a very important use case. And then we have a number of customers on the cybersecurity side who are looking for both the ability to detect potential breaches and maybe ransomware infections, but also the ability to investigate them rapidly. This is extremely important, because in cybersecurity, you have a lot of products that are essentially alert cannons, a product that just says, "Hey, hey, look at this, look at this, look at this. "I think we found something." That just creates noise. That just creates work for cybersecurity teams. The ability to actually surface high-quality anomaly and threats and streamline and even automate the workflows for investigation is super important. It's not just, "Hey, I think I found something", but let's take a click or two and investigate what it is so we can make a decision, does this require immediate action or not. Now, for certain sort of detections, we can actually take an automated response, but there are a variety of detections where you probably want to investigate a little more. >> Yeah. >> I also noticed the Purdue Pharma case study on your website, and looking at some of the bottom line impacts that your technology is making where they saved, reduced their data center footprint by 70% and increased app response times by 70%. We're talking about pharmaceutical data. You guys are also very big in the healthcare space, so we're talking about literally potentially life-saving situations that need to be acted on immediately. >> Certainly that can be true. Healthcare, there can be life-and-death situations, and timely access to medical records, to medical data, whether it's a workstation inside an exam room or an iPad or something like that can be absolutely critical. You often see a lot of desktop and application virtualization in the healthcare environment, primarily due to the protection of PHI, personal health information, and HIPPA constraints, so very common deployments in those environments. If the logins are slow or if there's an inability to access these records, it can be devastating. We have a large number of customers who are essentially care providers, hospital chains, and such that use ExtraHop to ensure that they have timely access to these records. That's more on the performance side. We also have healthcare customers that have used our ability to detect ransomware infections. Ransomware is just a bit of a plague within healthcare. Unfortunately, that industry vertical's been hit quite hard with those infections. 
The ability to detect a ransomware infection and perform some sort of immediate quarantining is extremely important. This is where I think micro-segmentation comes into play, because as these environments are more and more virtualized, natural micro-segmentation can help limit damage to ransomware, but, more often than not, these systems and workstations do have access to something like a network drive or a share. What I like about micro-segmentation is the flexibility to configure the policies, so when a ransomware infection is detected, we have the ability to quarantine it and shut it down. Keep in mind that there's defense in depth, it's kind of a security strategy that we've been employing for decades. You know, literally multiple layers of protection, so there are always protections at your gateway, and your firewall, at the perimeter, your NGFW, and there are protections at the endpoint, but if these were 100% effective, we wouldn't have ransomware infections. Unfortunately, they're not, and we always require that last, and maybe a last line of defense where we examine what's going on in the east-west corridor, and we look for those potential threats and that sort of suspicious activity or even known behaviors that are known to be bad. >> Well, Jesse, thanks so much for stopping by theCUBE and sharing with us what ExtraHop is doing, and what differentiates you in the market. We appreciate your time. >> My pleasure, Lisa, Justin. Thank you so much for having me. >> And we want to thank you for watching theCUBE. I'm Lisa Martin with Justin Warren. Stick around, we'll be back. Day three of the VMworld 2018 coverage in just a moment. (pulsing music)
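The split Jesse describes, automated response for high-confidence, known-bad detections such as ransomware and human investigation for everything else, boils down to a small routing decision. The sketch below is generic; the event fields, thresholds, and the quarantine_host stub are illustrative placeholders, not a real ExtraHop, NSX, or firewall API.

```python
# High-confidence, known-bad behaviours that are safe to respond to automatically.
AUTO_RESPOND = {"ransomware", "known_c2_beacon"}

def quarantine_host(host: str) -> None:
    # In a real deployment this would call the micro-segmentation layer
    # (an SDN or firewall policy API) to isolate the host from file shares
    # and the rest of the east-west corridor.
    print(f"[action] quarantining {host}")

def open_investigation(event: dict) -> None:
    # Queue the detection, with its supporting transaction records, so an
    # analyst can investigate in a click or two rather than from raw alerts.
    print(f"[triage] {event['type']} on {event['host']} needs a human look")

def handle_detection(event: dict) -> None:
    """Route a detection: automate the obvious, keep humans in the loop otherwise."""
    if event["type"] in AUTO_RESPOND and event["confidence"] >= 0.9:
        quarantine_host(event["host"])
    else:
        open_investigation(event)

handle_detection({"type": "ransomware", "host": "ws-0142", "confidence": 0.97})
handle_detection({"type": "unusual_login_pattern", "host": "db-07", "confidence": 0.62})
```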

Published Date : Aug 29 2018

ENTITIES

Entity | Category | Confidence
Justin Warren | PERSON | 0.99+
Jesse Rothstein | PERSON | 0.99+
eight | QUANTITY | 0.99+
Jesse | PERSON | 0.99+
Lisa Martin | PERSON | 0.99+
Pat Gelsinger | PERSON | 0.99+
Lisa | PERSON | 0.99+
100% | QUANTITY | 0.99+
Seattle | LOCATION | 0.99+
Justin | PERSON | 0.99+
Jessie | PERSON | 0.99+
70% | QUANTITY | 0.99+
Monday | DATE | 0.99+
Niagara Falls | LOCATION | 0.99+
iPad | COMMERCIAL_ITEM | 0.99+
Las Vegas | LOCATION | 0.99+
two | QUANTITY | 0.99+
iPhone | COMMERCIAL_ITEM | 0.99+
First | QUANTITY | 0.99+
VMware | ORGANIZATION | 0.99+
ExtraHop | ORGANIZATION | 0.99+
Pat | PERSON | 0.99+
One customer | QUANTITY | 0.99+
Mandalay Bay, Las Vegas | LOCATION | 0.99+
first time | QUANTITY | 0.98+
both | QUANTITY | 0.98+
four and a half clouds | QUANTITY | 0.98+
VMworld | ORGANIZATION | 0.98+
about four and a half | QUANTITY | 0.98+
VMworld 2018 | EVENT | 0.97+
theCUBE | ORGANIZATION | 0.97+
Day three | QUANTITY | 0.96+
today | DATE | 0.96+
nine clouds | QUANTITY | 0.96+
decades | QUANTITY | 0.96+
one point | QUANTITY | 0.95+
five years | QUANTITY | 0.94+
one | QUANTITY | 0.94+
a half a cloud | QUANTITY | 0.93+
day one | QUANTITY | 0.91+
ExtraHop | TITLE | 0.89+
single | QUANTITY | 0.88+
NSX | ORGANIZATION | 0.87+
day three | QUANTITY | 0.87+
Purdue Pharma | ORGANIZATION | 0.86+
thousands of retail sites | QUANTITY | 0.83+
zero | QUANTITY | 0.74+
Sunny | PERSON | 0.62+
HIPPA | TITLE | 0.57+
days | DATE | 0.53+
a click | QUANTITY | 0.53+
last | DATE | 0.51+

Shyam J Dadala & Sung Nam, Shire Pharmaceuticals | Informatica World 2018


 

why from Las Vegas it's the cube covering informatica world 2018 bacio by inform Attica hey welcome back it runs the cubes exclusive coverage of informatica world 2018 we're here at the Venetian in Las Vegas live I'm John for your co-host with Peterborough's coasting and head of analyst said we keep on insulating all the cube our next guest is jammed the dalla who's the enterprise analytics architecture engineer sire pharmaceutical and some named director of the enterprise analytic solutions lead at sire as well great to have you guys thanks for joining us thank you so love getting the practitioner view of kind of the reality right of what's going on off see dramatic has their show you guys are a customer you're looking at some of their products take a minute first to talk about what you guys do first see Pharma got some stuff going on Davies involved privacy's involves you're in Europe in the u.s. GDP ours here think I'm gonna talk about what you guys do sure so char Pharmaceuticals is a global leader in rare diseases so there's about 350 million patients who are effective remedies is today and so art group with NIT enterprise analytics so we're focused on making sure we bring the right technologies and capabilities around bi and analytics to the organization so we look at products tools figure out how they fit into our our ecosystem of bi stack of tools and make that available to our RIT colleagues as well as our business colleagues so rare disease can you just explain kind of categorically what that is cuz I'm assuming this fits rare is not a lot of data on it or there's data you got to figure out what is that how do you guys categorize that so rare disease you know majority the rare disease affected by affected children so that's a kind of a critical aspect of what we do you know rare disease could be in immunology it could be in oncology GI I mean there's very disease typically you know people who are affected affected probably less than a thousand or 2,000 I think one of our drugs the population is around 5,000 people and these are chronic diseases typically their chronic diseases so they're they're they're diseases that affect the quality of life of an individual so what you guys are doing is identifying what is it about the genealogy etc the genome associated with the disease but then providing treatments that will allow especially kids an opportunity to have live a better life over extensive time yeah and what do you guys do there in terms the data side can you explain what your roles are yeah so like I said we're you're in the enterprise analytics so we're focused on bringing technologies and capabilities around bi and analytics spaces so how do we bring data in and ingest it how do we curate the data how do we do if data visualizations how do we do data discovery advanced analytics so all of those kind of capabilities and we're responsible for so what's your architecture today you have some on premises their cloud involved you just kind of lay out kind of the environment as much as you can share I know maybe some confidential information but for the most part what's the current landscape internally for you guys what are you dealing with the data sure so we fill out a new a new next generation analytics we called it our marketplace or the analytics marketplace we're leveraging both on Prem as well as cloud technologies so we're leveraging Microsoft Azure hdinsight for Hadoop the Big Data technologies as well as informatica for data ingestion and bringing data and transform or 
transforming yet but there are many tools involved in that one so it's like the whole ecosystem we call does marketplace which is backbone for shared enterprise analytics strategy and future you guys put a policy around what tools people can bring to work so to speak and we're seeing a proliferation of tools there's a tool vendor everywhere we look around the big data it's right I got a tool for this I got a tool for wrangling I've seen everything how do you guys deal with that onslaught of tools coming in do you guys look at it more from a platform respective how are you guys handling that right so look at a platform perspective and we try to bring tools in and make that a standard within the organization we look at you know the security is it enterprise grade technology and yeah it's a challenge I mean they're basically certified you kick the tires give it a pace test through its paces and then we have our own operations team so we can support that that tool set the platform itself so and what are your customers do with the data they doing self service or they data scientists are they like just business analysts what's the profile of the users of your customers of your we have all set of users they have like a technical folks which they want to use the data like traditional ETL reality so there are folks from the business they want to do like self-serve and unless they want to do analysis on the data so we have all the capabilities in our marketplace so some tools enable those guys to get the data for the selves or like the tools we have and dalibor does their own stuff like the eld talk a little bit about the one of the key challenges associated with pharmaceuticals especially in the types of rare disease chronic young people types of things that you guys are mainly focused on a big challenge has always been that people when they start taking a drug that can significantly improve their lives they start to feel better and when they start to feel better they stop taking it so how are you using big data to or using analytics to identify people help describe potential treatments for them help keep them on the regimen how do you do are you first of all are you doing those things and as you do it how are you ensuring that you are compliant with basic ethical and privacy laws and what types of tools are you using to do that it's a big question yeah yeah so we are doing some of that you know we have looked at things around persistence and adherence and understanding kind of you know what what combination of drugs may work best for certain individuals or groups of people yeah and definitely you know some compliance is a big factor in that so when I'm working close with a compliance group understanding how we're allowed to use that data in between which parts of the organization do you anticipate that you'll have a direct relationship as some of these customers or is there an optimist in other words does analytics provide you an opportunity to start to alter the way that you engage the core users of your products and services like I believe so you know I think one thing that we're looking at which strategic standpoint is um how do we diagnose people sooner a lot of these chronic diseases you know they go through 2-3 years of undiagnosed so they'll jump around from you know doctor a doctor if I understand what you know what the issue is so I think one thing we're looking at is how do we use data and AI to to more quickly be able to diagnose patients has a 360 view helped you guys of data you guys 
have a 360 view how do you cuz we'll look at that in terms of a channel selling a product and serving because we have a different perspective what's the 360 view benefit that you guys are getting yeah so we have a kind of a customer care model which is kind of a 360 for our customer so understanding you know around just drug manufacturing to making sure they have the right you know they have the right supply to understand is it working for the patient's so we've always been talking about the role a big day you mentioned had to do that Hadoop supposed to be this whole industry now it's a feature of data right so there's a variety of you know infrastructure as a service platform as a service some say I pass and Big Data how are you guys looking at that as as as builders of IT next-generation IT the role of I pass and Big Data we see it as a role in a blur you know I think what cloud brings us in the past type solutions is agility you know we as the market is so evolving so quickly and there's new versions of new software coming out so quickly that you wanna be able to embrace that and leverage that give it benefit of like give it some sort of a comparison old way versus a cloud like is there been some immediate benefits that just pop out yeah that a lot more benefits with doing the world way and the cloud way because with the cloud that brings a lot more scalability in in all India's to get like 10 servers you need to work with the infrastructure team I get it like it takes three months or two months again it with the cloud based one you've worked out you can scale up or scale down so that's one thing because it's so you're talking about Big Data yeah you're getting the volume of data you're getting you need to scale up your storage or your any compute you either JMS and compute bring data to the table and then you gotta have the custom tooling for the visualization yeah how that kind of together right you talk about them from your perspective the balance that you have to have guys have to deal with every day like you got to deal with the current situation NIT you got cloud you got an electrical customers personas of people using the product but you got to stay in the cutting edge it's like what's next cuz we going down the cloud road you're looking at containers kubernetes service meshes you need a lot more stuff coming down the pike if you will coming down the road for you guys how are you guys looking at that and how are you managing it you have some greenfield projects do you do a little you know Rd you integrated in how are you dealing with this new cloud native set of technologies yeah definitely a balancing act you know I think we do a lot of pocs and we actually work with our business and IT counterparts to see hey if there's a new use case that is coming down you know how do we solve that use case with some of the newer technologies and we try a POC may bring in a product to just see if it works and then see how do we then do we then take that to the enterprise so I got one final question for you guys and maybe you do as well John but but in life and death businesses like pharmaceuticals is a life and death business the quality of the data is really really important getting it wrong has major implications the fidelity of the system is really crucial you say using informatica for for example ingest and other types of services how has that choice made the business feel more certain about the quality of their data that you're using in your analytic systems into standardization so you 
know if between MDM round mastering our data - ingesting data transforming our data just having that data lineage having that standard around how that data gets transformed is that fundamentally a feature of the services that you're providing is you not only were you you know the ability to do visualization on data but actually providing your scientists and your businesspeople and your legal staff explicit knowledge about where this data came from and how trustworthy it is and whether they should be making these kind of free complex very real hardcore human level decisions on is that is that all helping yes because it seems like it would be a really crucial determination of what tools you guys would use right it is yeah and absolutely I think also as we move more towards self-service and having these people having data scientists do their things on their own being able to have the tools that can do that kind of audit and data lineage is crucial great to have you guys on we had a wrap I want to ask one more question here you guys were an innovation award e informática congratulations any advice for your peers out there want to unleash the power data and be on the cutting edge and potentially be an honoree yeah I would say just definitely think outside the box seem to try new things try puce you know do POCs is there so much new technologies coming down so quickly that it's hard to keep up Jam cuz it's like a moving target you need to chase your movie target and based on B was it that gets you like what you want it to do you know siding yeah get out front don't keep your eye on the prize yeah focus on task at hand bring in the new technologies guys thanks so much for coming on great to hear the practitioners reality from the trenches certainly front lines you know life-or-death situations of quality of the data matter scaling is important cloud era of data I'm John for a Peterborough's more live coverage after the short break
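The standardization Shyam and Sung describe, many source systems landing data that is reshaped into one consistent form before self-service users touch it, is the core of their marketplace idea. A minimal sketch of that kind of curation step, assuming Spark on HDInsight, might look like the following; the storage paths, column names, and mapping are invented for illustration and are not Shire's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-supplier-extract").getOrCreate()

# Land the raw extract as delivered by one of many source systems.
raw = (
    spark.read
    .option("header", True)
    .csv("wasbs://raw@examplestorage.blob.core.windows.net/supplier_x/shipments/")
)

# One standard shape, regardless of which system or supplier produced the file.
STANDARD_COLUMNS = {
    "shp_id": "shipment_id",
    "prod_cd": "product_code",
    "qty": "quantity",
    "ship_dt": "shipped_date",
}

curated = raw
for source_name, standard_name in STANDARD_COLUMNS.items():
    curated = curated.withColumnRenamed(source_name, standard_name)

curated = (
    curated
    .select(*STANDARD_COLUMNS.values())
    .withColumn("shipped_date", F.to_date("shipped_date", "yyyy-MM-dd"))
)

# Write to the curated zone that BI and self-service users query.
curated.write.mode("overwrite").parquet(
    "wasbs://curated@examplestorage.blob.core.windows.net/shipments/"
)
```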

Published Date : May 22 2018

ENTITIES

Entity | Category | Confidence
three months | QUANTITY | 0.99+
two months | QUANTITY | 0.99+
less than a thousand | QUANTITY | 0.99+
Microsoft | ORGANIZATION | 0.99+
Europe | LOCATION | 0.99+
John | PERSON | 0.99+
Las Vegas | LOCATION | 0.99+
10 servers | QUANTITY | 0.99+
Shire Pharmaceuticals | ORGANIZATION | 0.98+
NIT | ORGANIZATION | 0.98+
360 view | QUANTITY | 0.98+
around 5,000 people | QUANTITY | 0.98+
one final question | QUANTITY | 0.98+
both | QUANTITY | 0.97+
about 350 million patients | QUANTITY | 0.96+
2-3 years | QUANTITY | 0.96+
one | QUANTITY | 0.96+
India | LOCATION | 0.96+
one more question | QUANTITY | 0.95+
one thing | QUANTITY | 0.95+
RIT | ORGANIZATION | 0.94+
Shyam J Dadala | PERSON | 0.92+
u.s. | LOCATION | 0.92+
2018 | DATE | 0.92+
one thing | QUANTITY | 0.92+
today | DATE | 0.9+
Davies | PERSON | 0.9+
Venetian | LOCATION | 0.89+
JMS | TITLE | 0.88+
first | QUANTITY | 0.87+
360 | QUANTITY | 0.85+
Attica | ORGANIZATION | 0.83+
2,000 | QUANTITY | 0.82+
Sung | PERSON | 0.77+
Azure | TITLE | 0.77+
dalibor | ORGANIZATION | 0.75+
lot of pocs | QUANTITY | 0.74+
Informatica World | EVENT | 0.74+
one of our drugs | QUANTITY | 0.73+
a minute | QUANTITY | 0.72+
Pharma | ORGANIZATION | 0.71+
a lot more | QUANTITY | 0.68+
inform | ORGANIZATION | 0.65+
Pharmaceuticals | ORGANIZATION | 0.64+
Nam | ORGANIZATION | 0.62+
bi | QUANTITY | 0.6+
Peterborough | ORGANIZATION | 0.56+
key challenges | QUANTITY | 0.53+
data | QUANTITY | 0.5+
Peterborough | PERSON | 0.49+
sire | PERSON | 0.42+

Dr. Mark Ramsey & Bruno Aziza | BigData NYC 2017


 

>> Live from Mid Town Manhattan. It's the Cube, covering BIGDATA New York City 2017. Brought to you by, SiliconANGLE Media and it's ecosystems sponsors. >> Hey welcome back everyone live here in New York City for the Cube special presentation of BIGDATA NYC. Here all week with the Cube in conjunction with Strata Data even happening around the corner. I'm John Furrier the host. James Kobielus, our next two guests Doctor Mark Ramsey, chief data officer and senior vice president of R&D at GSK, Glasgow Pharma company. And Bruno as he's the CMO at Fscale, both Cube alumni. Welcome back. >> Thank for having us. >> So Bruno I want to start with you because I think that Doctor Mark has some great use cases I want to dig into and go deep on with Jim. But Fscale, give us the update of the company. You guys doing well, what's happening? How's the, you have the vision of this data layer we talked a couple years ago. It's working so tell us, give us the update. >> A lot of things have happened since we talked last. I think you might have seen some of the news in terms of growth. Ten X growth since we started and mainly driven around the customer use cases. That's why I'm excited to hear from Mark and share his stories with the rest of the audience here. We have a presentation at Strata tomorrow with Vivens. It's a great IOT use case as well. So what we're seeing is the industry is changing in terms of how it's spying the idea platforms. In the past, people would buy idea platforms vertically. They'd buy the visualization, they'd buy the sementic and buy the best of great integration. We're now live in a world where there's a multitude of BI tools. And the data platforms are not standardized either. And so what we're kind of riding as a trend is this idea of the need for the universal semantic layer. This idea that you can have a universal set of semantics. In a dictionary or ontology. that can be shared across all types of business users and business use cases. Or across any data. That's really the trend that's driving our growth. And you'll see it today at this show with the used cases and the customers. And of course some of the announcements that we're doing. We're announcing a new offer with cloud there and tableau. And so we're really excited about again how they in space and the partner ecosystems embracing our solutions. >> And you guys really have a Switzerland kind of strategy. You're going to play neutral, play nicely with everybody. Because you're in a different, your abstraction layer is really more on the data. >> That's right. The whole value proposition is that you don't want to move your data. And you don't want to move your users away from the tools that they already know but you do want them to be able to take advantage of the data that you store. And this concept of virtualized layer and you're universal semantic layer that enables the use case to happen faster. Is a big value proposition to all of them. >> Doctor Mark Ramsey, I want to get your quick thoughts on this. I'm obviously your customer so. I mean you're not bias, you ponder pressure everyday. Competitive noise out there is high in this area and you're a chief data officer. You run R&D so you got that 20 miles stare into the future. You've got experience running data at a wide scale. I mean there's a lot of other potential solutions out there. What made it attractive for you? >> Well it feels a need that we have around really that virtualization. So we can leave the data in the format that it is on the platform. 
And then allow the users to use like Bruno was mentioning. Use a number of standardized tools to access that information. And it also gives us an ability to learn how folks are consuming the data. So they will use a variety of tools, they'll interact with the data. At scale gives us a great capability to really look under the cover, see how they're using the data. And if we need to physicalize some of that to make easier access in the long term. It gives us that... >> It's really an agility model kind to data. You're kind of agile. >> Yeah its kind of a way to make, you know so if you're using a dash boarding tool it allows you to interact with the data. And then as you see how folks are actually consuming the information. Then you can physicalize it and make that readily available. So it is, it gives you that agile cycles to go through. >> In your use of the solution, what have you seen in terms of usage patterns. What are your users using at scale for? Have you been surprised by how they're using it? And where do you plan to go in terms of the use cases you're addressing going forward with this technology? >> This technology allows us to give the users the ability to query the data. So for example we use standardized ontologies in several of the areas. And standardized ontologies are great because the data is in one format. However that's not necessarily how the business would like to look at the data and so it gives us an ability to make the data appear like the way the users would like to consume the information. And then we understand which parts of the model they're actually flexing and then we can make the decision to physicalize that. Cause again it's a great technology but virtualization there is a cost. Because the machines have to create the illusion of the data being a certain way. If you know it's something that's going to be used day in and day out then you can move it to a physicalized version. >> Is there a specific threshold when you were looking at the metrics of usage. When you know that particular data, particular views need to be physicalized. What is that threshold or what are those criteria? >> I think it's, normally is a combination of the number of connections that you have. So the joins of the data across the number of repositories of data. And that balanced with the volume of data so if you're dealing with thousands of rows verses billions of rows then that can lead you to make that decision faster. There isn't a defined metric that says, well we have this number of rows and this many columns and this size that it really will lead you down that path. But the nice thing is you can experiment and so it does give you that ability to sort of prototype and see, are folks consuming the data before you evoke the energy to make it physical. >> You know federated, I use the word federated but semantic virtualization layers clearly have been around for quite sometime. A lot of solution providers offer them. A lot of customers have used them for disparate use cases. One of the wraps traditionally again estimating virtualization is that it's simply sort of a stop gap between chaos on the one end. You know where you have dozens upon dozens of databases with no unified roll up. That's a stop gap on the way to full centralization or migration to a big data hub. Did you see semantic virtualization as being sort of your target architecture for your operational BI and so forth? 
Or do you on some level is it simply like I said a stop gap or transitional approach on the way to some more centralized environment? >> I think you're talking about kind of two different scenarios here. One is in federated I would agree, when folks attempted to use that to bring disparate data sources together to make it look like it was consolidated. And they happen to be on different platforms, that was definitely a atop gap on a journey to really addressing the problem. Thing that's a little different here is we're talking about this running on a standardized platform. So it's not platformed disparate it's on the platform the data is being accessed on the platform. It really gives us that flexibility to allow the consumer of the data to have a variety of views of the data without actually physicalizing each of them. So I don' know that it's on a journey cause we're never going to get to where we're going to make the data look as so many different ways. But it's very different than you know ten, 15 years ago. When folks were trying to solve disparate data sources using federation. >> Would it be fair to characterize what you do as agile visualization of the data on a data lake platform? Is that what it's essentially about? >> Yeah that, it certainly enables that. In our particular case we use the data lake as the foundation and then we actually curate the data into standardized ontologies and then really, the consumer access layer is where we're applying virtualization. In the creation of the environment that we have we've integrated about a dozen different technologies. So one of the things we're focused on is trying to create an ecosystem. And at scale is one of the components of that. It gives us flexibility so that we don't have to physicalize. >> Well you'd have to stand up any costs. So you have the flexibility with at scale. I get this right? You get the data and people can play with it without actually provisioning. It's like okay save some cash, but then also you double down on winners that come in. >> Things that are a winner you check the box, you physicalize it. You provide that access. >> You get crowd sourcing benefits like going on in your. >> You know exactly. >> The curation you mentioned. So the curation goes on inside of at scale. Are you using a different tool or something you hand wrote in house to do that? Essentially it's a data governance and data cleansing. >> That is, we use technology called Tamer. That is a machine learning based data curation tool, that's one of our fundamental tools for curation. So one of the things in the life sciences industry is you tend to have several data sources that are slightly aligned. But they're actually different and so machine learning is an excellent application. >> Lets get into the portfolio. Obviously as a CTO you've got to build a holistic view. You have a tool chest of tools and a platform. How do you look at the big picture? On that scale if it's been beautifully makes a lot of sense. So good for those guys. But you know big picture is, you got to have a variety of things in your arsenal. How do you architect that tool shed or your platform? Is everything a hammer, everything's a nail. You've got all of them though. All the things to build. >> You bring up a great point cause unfortunately a lot of times. We'll use your analogy, it's like a tool shed. So you don't want 12 lawnmowers right? In your tool shed right? So one of the challenges is that a lot of the folks in this ecosystem. 
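Ramsey's rule of thumb for when a virtual view should be physicalized, how heavily it is queried, how many joins it spans, and whether it touches thousands or billions of rows, can be captured in a simple heuristic. The thresholds and field names below are made up to illustrate the idea; they are not GSK's or AtScale's actual logic.

```python
def should_physicalize(view_stats: dict) -> bool:
    """Decide whether a virtual view has earned a physical (materialized) copy.

    Virtualization keeps things agile, but the machines pay a cost to create
    the illusion; once a view is hit constantly, spans many joins, or touches
    billions of rows, it is usually cheaper to physicalize it.
    """
    heavy_usage = view_stats["queries_per_day"] >= 50
    many_joins = view_stats["join_count"] >= 4
    big_data = view_stats["rows_scanned"] >= 1_000_000_000
    return heavy_usage and (many_joins or big_data)

views = [
    {"name": "adverse_events_by_site", "queries_per_day": 180,
     "join_count": 6, "rows_scanned": 2_400_000_000},
    {"name": "trial_enrollment_summary", "queries_per_day": 3,
     "join_count": 2, "rows_scanned": 40_000},
]

for v in views:
    verdict = "physicalize" if should_physicalize(v) else "leave virtual"
    print(f"{v['name']}: {verdict}")
```

The particular thresholds matter less than the principle: the decision is driven by observed consumption, which is exactly what the virtualization layer lets you measure before committing the effort to make anything physical.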
They start with one area of focus and then they try to grow into area of focuses. Which means that suddenly everybody's starts to be a lawnmower, cause they think that's... >> They start as a hammer and turn into a lawn mower. >> Right. >> How did that happen, that's called pivoting. >> You can mow your lawn with a hammer but. So it's really that portfolio of tools that all together get the job done. So certainly there's a data acquisition component, there's the curation component. There's visualization machines learning, there's the foundational layer of the environment. So all of those things, our approach has been to select. The kind of best in class tools around that and then work together and... Bruno and the team at scale have been part of this. We've actually had partner summits of how do we bring that ecosystem together. >> Is your stuff mostly on prime, obviously a lot of pharma IP there. So you guys have the game that poll patent thing which is well documented. You don't want to open up the kimono and start the cloth until it's releasing so. You obviously got to keep things confidential. Mix of cloud, on prime, is it 100 percent on prime? Is there some versing for the cloud? Is it a private cloud, how do you guys look at the cloud piece? >> Yeah majority of what we're doing is on prime. The profile for us is that we persist the data. So it's not. In some cases when we're doing some of the more advanced analytics we burst to the cloud for additional processors. But the model of persisting the data means that it's much more economical to have on prime instance of what we're doing. But it is a combination, but the majority of what we're doing is on prime. >> So will you hold on Jim, one more question. I mean obviously everyone's knocking on your door. You know how to get in that account. They spend a lot of money. But you're pretty disciplined it sounds like you've got to a good view of you don't want people to come in and turn into someone that you don't want them to be. But you also run R&D so you got to have to understand the head room. How do you look at the head room of what you need down the road in terms of how you interface with the suppliers that knock on your door. Whether it's at scale currently working with you now. And then people just trying to get in there and sell you a hammer or a lawn mower. Whatever they have they're going to try, you know you're dealing with the vendor pressure. >> Right well a lot of that is around what problem we're trying to solve. And we drive all of that based on the use cases and the value to the business. I mean and so if we identify gaps that we need to address. Some of those are more specific to life sciences types of challenges where they're very specific types of tools that the population of partners is quite small. And other things. We're building an actual production, operational environment. We're not building a proof of concept, so security is extremely important. We're coberosa enabled end to end to out rest inflight. Which means it breaks some of the tools and so there's criteria of things that need to be in place in order to... >> So you got anything about scale big time? So not just putting a beach head together. But foundationally building out platform. Having the tools that fit general purpose and also specialty but scales a big thing right? >> And it's also we're addressing what we see is three different cohorts of consumers of the data. One is more in the guided analytics, the more traditional dashboards, reports. 
One is in more of computational notebooks, more of the scientific using R, Python, other languages. The third is more kind of almost at the bare middle level machine learning, tenser flow a number of tools that people directly interact. People don't necessarily fit nicely into those three cohorts so we're also seeing that, there's a blend. And that's something that we're also... >> There's a fourth cohort. >> Yeah well you know someone's using a computational notebook but they want to draw upon a dashboard graphic. And then they want to run a predefined tenser flow and pull all that together so. >> And what you just said, tied up the question I was going to ask. So it's perfect so. One of my core focuses is as a Wikibon analyst is on deep learning. On AI so in semantic data virtualization in a life sciences pharma context. You have undoubtedly a lot of image data, visual data. So in terms of curating that and enabling you know virtualized access to what extent are you using deep learning, tenser flow, convolutional neural networks to be able to surface up the visual patterns that can conceivably be searched using a variety of techniques. Is that a part of your overall implementation of at scale for your particular use cases currently? Or do you plan to go there in terms of like tenser flow? >> No I mean we're active, very active. In deep learning, artificial intelligence, machine learning. Again it depends on which problem you're trying to solve and so we again, there's a number of components that come together when you're looking at the image analytics. Verses using data to drive out certain decisions. But we're acting in all of those areas. Our ultimate goal is to transform the way that R&D is done within a pharmaceutical company. To accelerate the, right now it takes somewhere between five and 15 years to develop a new medicine. The goal is to really to do a lot more analytics to shorten that time significantly. Helps the patients, gets the medicines to market faster. >> That's your end game you've got to create an architecture that enables the data to add value. >> Right. >> The business. Doctor Mark Ramsey thanks so much for sharing the insight from your environment. Bruno you got something there to show us. What do you got there? He always brings a prop on. >> A few years ago I think I had a tattoo on my neck or something like this. But I'm happy that I brought this because you could see how big Mark's vision is. the reason why he's getting recognized by club they're on the data awards and so forth. Is because he's got a huge vision and it's a great opportunity for a lot of CTOs out there. I think the average CEO spent a 100 million dollars to deploy big data solutions over the last five years. But they're not able to consumer all the data they produce. I think in your case you consume about a 100 percent of the instructor data. And the average in this space is we're able to consume about one percent of the data. And this is essentially the analogy today that you're dealing with if you're on the enterprise. We'd spent a lot of time putting data in large systems and so forth. But the tool set that we give, that you did officers in their team is a cocktail straw lik this in order to drink out of it. >> That's a data lake actually. >> It's an actual lake. It's a Slurpee cup. Multiple Slurpees with the same straw. >> Who has the Hudson river water here? >> I can't answer that question I think I'd have to break a few things if I did. But the idea here is that it's not very satisfying. 
Enough the frustration business users and business units. When at scale's done is we built this, this is the straw you want. So I would kind of help CTOs contemplate this idea of the Slurpee and the cocktail straw. How much money are you spending here and how much money are you spending there. Because the speed at which you can get the insights to the business user. >> You got to get that straw you got to break it down so it's available everywhere. So I think that's a great innovation and it makes me thirsty. >> You know what, you can have it. >> Bruno thanks for coming from at scale. Doctor Mark Ramsey good to see you again great to have you come back. Again anytime love to have chief data officers on. Really a pioneering position, is the critical position in all organizations. It will be in the future and will continue being. Thanks for sharing your insights. It's the Cube, more live coverage after this short break. (tech music)

Published Date : Sep 27 2017

ENTITIES

Entity | Category | Confidence
Jim | PERSON | 0.99+
James Kobielus | PERSON | 0.99+
Mark | PERSON | 0.99+
Bruno | PERSON | 0.99+
New York City | LOCATION | 0.99+
John Furrier | PERSON | 0.99+
20 miles | QUANTITY | 0.99+
Mark Ramsey | PERSON | 0.99+
100 percent | QUANTITY | 0.99+
12 lawnmowers | QUANTITY | 0.99+
GSK | ORGANIZATION | 0.99+
100 million dollars | QUANTITY | 0.99+
Fscale | ORGANIZATION | 0.99+
third | QUANTITY | 0.99+
dozens | QUANTITY | 0.99+
SiliconANGLE Media | ORGANIZATION | 0.99+
One | QUANTITY | 0.99+
15 years | QUANTITY | 0.99+
Python | TITLE | 0.99+
today | DATE | 0.99+
Bruno Aziza | PERSON | 0.99+
both | QUANTITY | 0.99+
one | QUANTITY | 0.98+
each | QUANTITY | 0.98+
fourth cohort | QUANTITY | 0.98+
NYC | LOCATION | 0.98+
Cube | ORGANIZATION | 0.98+
Hudson river | LOCATION | 0.98+
Vivens | ORGANIZATION | 0.98+
Switzerland | LOCATION | 0.98+
three cohorts | QUANTITY | 0.98+
Doctor | PERSON | 0.98+
billions of rows | QUANTITY | 0.97+
Ten X | QUANTITY | 0.97+
tomorrow | DATE | 0.97+
two guests | QUANTITY | 0.97+
one format | QUANTITY | 0.97+
thousands of rows | QUANTITY | 0.97+
BIGDATA | ORGANIZATION | 0.97+
prime | COMMERCIAL_ITEM | 0.96+
one more question | QUANTITY | 0.96+
couple years ago | DATE | 0.96+
Dr. | PERSON | 0.96+
agile | TITLE | 0.96+
R&D | ORGANIZATION | 0.95+
two different scenarios | QUANTITY | 0.95+
about one percent | QUANTITY | 0.95+
five | QUANTITY | 0.93+
Strata Data | ORGANIZATION | 0.93+
three different cohorts | QUANTITY | 0.92+
Mid Town Manhattan | LOCATION | 0.92+
dozens of databases | QUANTITY | 0.92+
Wikibon | ORGANIZATION | 0.92+
ten, | DATE | 0.89+
about a 100 percent | QUANTITY | 0.89+
BigData | ORGANIZATION | 0.88+
2017 | DATE | 0.86+
one area | QUANTITY | 0.81+
BIGDATA New York City 2017 | EVENT | 0.79+
last five years | DATE | 0.78+
15 years ago | DATE | 0.78+
about a dozen different technologies | QUANTITY | 0.76+
A few years ago | DATE | 0.76+
one end | QUANTITY | 0.74+
Glasgow Pharma | ORGANIZATION | 0.7+
things | QUANTITY | 0.69+
R | TITLE | 0.65+

Ross Rexer & Eli Lilly - ServiceNow Knowledge13 - theCUBE


 

okay we're back this is Dave vellante I'm with Wikibon organ this is silicon angles the cube the cube is a live mobile studio we come into events we're here at knowledge service now's big customer event we're here at the aria hotel in Las Vegas and we've got wall-to-wall coverage today tomorrow and part of thursday as many of you know we were at sapphire now the big SI p customer show were simulcasting that on SiliconANGLE too but we're here in Las Vegas the ServiceNow conference is all about transformation transforming from no to now we've kind of got a double whammy segment here virtually every industry is transforming and certainly Big Pharma is transforming quite dramatically as well as the IT components of many industries Ross rexer is here he's the managing director at kpmg the global consultancy and T Juan Lumpkin who's an IT practitioner for Eli Lilly gentlemen welcome to the cube okay thanks for you so Ross let's start with you at a high level what's happening in the pharmaceutical industry in general Big Pharma how is the industry itself transforming and then we'll get into the I TPS sure so many of the Big Pharma's find themselves today in a situation that is unique to their their business industry and market where a lot of blocked blockbuster drugs which have been significant sources of revenue over the years are starting to come on with that it brings competition and a loss of revenue so the big farmers are all in a very coordinated methodical process right now to resize their business and at the same time enable the R&D function to bring new drugs to market focusing on patient outcomes that will happen in different ways in them they probably ever done before so the business model itself has changed and along with it all the support functions like ITA of course too so in that so it's all about the pipeline right and and the challenge if I understand it is that historically you got the big pharma companies they would you know go do about go about their thing and develop these drugs and they get a blockbuster and it was a relative today a relatively slow paced environment that's that's changing if I understand it correctly what's driving that change so the the innovation around medicines today is much different than it has been over the last 10 20 years in that composition around in the use of different biotech components to create a to create medicines is now being sourced in different ways historically Pharma built itself and really invested and was really a research and development company almost entirely in-house right so all the support systems and everything the way that the business was run was around that nowadays these the farmers are collaborating with smaller providers many of them in ways that again they just historically have never done everything was done in house to build to bring drugs to market and now it's it's shifted absolute to the opposite side where big farmers are relying on these providers these third-party providers for all stages of R&D and ultimately FDA and the release of these so t1 I introduced you as an IT practitioner and Lily so talk about more specifically about your role there you focused on infrastructure I teach em a list or more about them yeah so my rules are about service integration think about those services that we deliver to our internal customers within lily and how do we do that across our complex ecosystem where you have multiple different IT departments you have multiple suppliers who have different rigs and complexities in that 
space and so our job is how do we minimize that complexity for our internal business partners and making sure that the way we build variety is seamless for our internal customers okay so we heard Ross talking about the the pressures in the in the industry from a from an IT practitioner standpoint what how does that change change your life what are the drivers and what's the business asking you to do but just like anyone we need more volume but we also have to do that under under constraint and so for us how do we get more fishing so you think about this basically gone under you can only do so much outsourcing you only do so much change and so you have to see how do I start running my business more efficiently and I think that's the big shift and I tias you're moving from a from an internal infrastructure towers are truly looking at how do we deliver IT services and part of living IT services and making sure that we're a value-added partner and also being assured that we're competitive with other sources of our businesses have to get services from an IT perspective yeah so 10 years ago we used to talk a lot about demand management and to me it's that's why i love this from now from know to now because demand management is actually ended up just being no we just can't handle the the volume so you mentioned constraints you've got constraints you've got to be more efficient so so talk a little bit about what you did to get more efficient for us it was all about standardization so how do we how do we build standardization across our IT infrastructure nikka system within our IT partners empower external partners what that does it gives us flexibility so that we can deliver our systems and be more agile they think about our internal space we had a lot of complexity we had multiple procedures multiple processes different business units operating or delivering IT services in a consistent manner what we've been able to do it being able to streamline that we've been able to be more consistent internally in a line on the comments that are processes and how we deliver those ikea services to our customers so Ross you're talking about the sort of changing dynamic of what I would call sort of the pharmaceutical ecosystem right so so that's that sounds like it's relatively new in pharma it used to be sort of a go-it-alone the big guys hey we're multi-billion dollar companies we don't need these little guys you see all these startups coming out there really innovative there faster so take us through sort of how that's evolving how companies are dealing with the ecosystem and what kind of pressures that puts on IT what are you seeing out there so as t1 was was mentioning as well this was pushing to IT service integration as a kind of one of the next frontiers of now right being able to have the single pane of glass single system of record of IT and our ability to bring standardized services up and down in a coordinating consistent way has allowed for the bigger more monolithic type companies in be able to interact with with these smaller more agile more tech-savvy appeal partners and be able to not overburden them so the little provider who has maybe less less overhead of IT infrastructure and their processes would find it hard to be able to collaborate electronically with a big pharma if we had to adopt the big pharma's old-style processes so service integration is all about allowing for the the easy plug-and-play of these providers and establishing the reference set of processes and the supporting data 
that's needed to govern those transactions or the length of the of the outsourcing arrangement with with that provider in a way that doesn't get overburden them but provides the company Big Pharma the ability to have transparency ability to see risks before they're happening and to enter manage the cause so talk about your practice a little bit how do you what's role do you play it's obviously you've got this increasingly complex ecosystem evolving they've definitely got different infrastructures um how do you sort of mediate all this so Kim G what are our go-to-market offering and our solution set is based around a set of leading practices that that we have established over the past 17 years for example that we've been in the IT service management consulting and advisory business so we have these accelerators that we can we bring to a project and engagement like like the one we're at Eli Lilly where we can quickly faster than ever establish a common ground for those processes the operational processes first and foremost that would don't require years and years of consultancy process engineering 20 years ago type of thing so our role in that is to provide the basis for the are the operating model that's going to go forward and allow the core customer as well as these other providers to get there fast to get operating faster so t1 we've been hearing a similar pattern from the customers that we've talked to a lot of stovepipes a lot of legacy you know tools a lot of uncoordinated sort of activities going going on is that what what Lily with you would you describe that as an accurate depiction of the pasture i think i think that i think you're being kind yeah I'm sure we kind on the cheer we don't like to feed our guests up what I think it not to over use the ERP for IT term but this is something I t we've done for our business partners over the years we haven't done for our so if you think about the essay peas of the world where you get your CI CFO a one-click look at the the financial assets of the company you think about from a CRM perspectively doing that for our sales force we've done that from an HR perspective but we haven't taken the time to look at from an IT perspective and how do i give the cio that same visibility across our portfolio services so that he can ask those same questions you can have that same visibility so i want to add a little color to this whole erp for IT though of course on the one hand you know the sort of single system of record that's a positive but when you think of erp i say we were at SI p sapphire there's a lot of complexity in erp and with that type of complexity you'd never succeed but so what's your experience been thus far with regard to you know the complexity in my senses it's not this big monolithic system it's a cloud-based SAS based system talk about that a little bit well for us it's getting to a set of standards it actually helped reduce the complexity where you have complexity when you have multiple business procedures across the organization delivering services and so to get to that single source that single record it is actually help to reduce a lot of complexity on our part help it make it easier for us to deliver customer service for customers the other piece of that to which is the the singularity of vision of how we deliver I team so right now within our business we're depending on what area in you may get IT servers that delivers slightly differently from each area we've been able to streamline that and say this is how you're going to 
receive IT services and make it a more predictable experience for our internal users I saw Rus I want to talk about this notion of a single system of record before I ask you why it's so important what are we talking about here because today you've got a single system of record for your transactions you might have a single system of record for your your data warehouse all these single systems are at a record so what do you mean by a single system of record so when we're talking with service now and specifically in the IT Service Management domain what we're talking about is having integrated the capability to see data across the different data domains if you like so operational data performance data service level data with that coupled with the IT finance data as well as a zesty one put 360-degree vision of your assets as well so linking all those sources of data together in a way that can be used for analytics maybe for the first time ever so we we we use the analogy of IT intelligence right so what we've given our business partners and business intelligence over the years mmm it's-- never had that so the ability to provide IT intelligence that allows for the leadership due to to have data have information that they can take decisions and then ultimately become predicted with that right so be able have the knowledge to know what we're doing to make the right choices and in the future be able to do some predictive analysis again back to the point about the demands really never got one hundred percent right over the years we've talked about a lot but having the ability to understand the consumption and have the levers to influence demand and see it grow I want to go back to this business process discussion you were sort of reference the 20 years ago the whole VPO of movement and you know business process reorganization it seems to me that what what occurred was you had let's say a database or some kind of system and maybe there was a module and then you build a business process around that and so you had relatively inflexible business process they were hard to change is are you seeing that change it we at the cusp of the dawn of a new era where I can actually create whatever business process I want to around that single system of record is that truly a vision that's coming to fruition we believe it is and our experience it is it is starting to happen and I think service now with their platform is one of the emerging leaders in this space that's allowing for that to happen percent of the day so you have you have a concise platform that allows you enough flexibility to build new processes but has the common data structure has the common user interface as the common workflow set in a and all wrapped in and easy to maintain type of platform is what I think 20 years ago we wished we had and we tried to build in many different ways and ended up mostly cobbling things together but we really believe that and again our starting to see success out there David the platform question is solved and that we're now able to get to the prosecutor historically we you know delivered value plenty of value the problem is so much of that value was sucked by the infrastructure and and and not enough went into the innovation around it do you want my question to you is so people don't like change naturally now maybe it's different and nit maybe they want change in IT but did you see initial resistance I'll know we have this way of doing it we don't want to change or are people enthusiastic about change talk about 
>> I want to go back to this business process discussion. You referenced 20 years ago, the whole BPR movement, business process re-engineering. It seems to me that what occurred was you had, let's say, a database or some kind of system, maybe there was a module, and then you built a business process around that, so you had relatively inflexible business processes that were hard to change. Are you seeing that change? Are we at the cusp of the dawn of a new era where I can actually create whatever business process I want around that single system of record? Is that truly a vision that's coming to fruition? >> We believe it is, and in our experience it is starting to happen. I think ServiceNow with their platform is one of the emerging leaders in this space that's allowing that to happen. You have a concise platform that gives you enough flexibility to build new processes, but has the common data structure, the common user interface, the common workflow set, all wrapped in an easy-to-maintain platform. It's what I think 20 years ago we wished we had and tried to build in many different ways, mostly ending up cobbling things together. But we really believe, and we're starting to see success out there, David, that the platform question is solved and we're now able to get to the process work. >> Historically, IT delivered value, plenty of value; the problem is so much of that value was sucked up by the infrastructure, and not enough went into the innovation around it. T Juan, my question to you is this: people don't like change, naturally. Now, maybe it's different in IT, maybe they want change in IT, but did you see initial resistance, "no, we have this way of doing it, we don't want to change," or are people enthusiastic about change? Talk about that a little. >> You hit it spot-on. Absolutely, the technology is the easy part; it's really the change part that's the most difficult piece of it. I would say we've done a lot of work just to align the organization, and we've had a lot of support from not only our internal IT people but also our senior leadership team. So we've gotten support, we've seen a lot of buy-in. I'm not saying it's going to be easy, it's not going to be easy, but I feel we've got the right momentum now to make this type of change. A big part of it has been being able to articulate the value that we're going to receive from this initiative. >> So it's early days for Lilly, and you guys have just gotten started on this journey, not yesterday, but you're in a position to give some advice to your fellow practitioners. So let me ask you both, starting with T Juan: what advice would you give to fellow practitioners that are looking to move in this direction? >> Great. I would say first of all you have to have the business alignment, so you need to make sure you can clearly articulate the value of the change to the company, so that you can talk not in terms of process but in terms of outcomes you're going to drive for your business partners. Once you're able to describe those outcomes, then you can have the conversation about what work it's going to take to get there. It's not an easy journey, so be able to paint that picture accurately for your teams, and also talk about how you're going to support them through the process. So we're going to talk about the value, we're going to paint the picture of the journey, and we're going to tell you how we want to support you throughout that process. >> Okay, Ross, you're talking to CIOs. What's your main point of advice for CIOs in this regard? >> It's to look at the transformation as transformational. It can be a set of tactical projects and tactical wins based on the outcomes you're looking for; however, in order to truly change the way your IT function runs as a business and do all these great things we're talking about today, you have to have the vision and understand that there is a series of building blocks that will get you incremental value along the way. This is not a quick product slam. Maybe 20 years ago it was about "let's swap this software for that software and we're going to be good"; it's not about that, and that's not going to get you the transformation. So it's about transformation, it's about the metrics to prove that you are transforming, and it's about continuous improvement. >> Ross, T Juan, thanks very much for coming on theCUBE and sharing your story. We could go on forever, but we're getting the hook. Really appreciate you guys coming on. >> Thanks for having us. >> All right, thanks for watching, everybody. We'll be right back with our next guest. Chris Pope is here, the director of product management for ServiceNow, so we're going to double-click on the platform and share some greater information about that. This is theCUBE, I'm Dave Vellante, we'll be right back.

Published Date: May 15, 2013

**Summary and sentiment analysis are not shown because of an improper transcript.**

ENTITIES

| Entity | Category | Confidence |
| --- | --- | --- |
| Chris Pope | PERSON | 0.99+ |
| Ross | PERSON | 0.99+ |
| Las Vegas | LOCATION | 0.99+ |
| Dave Vellante | PERSON | 0.99+ |
| Kim G | PERSON | 0.99+ |
| Lily | PERSON | 0.99+ |
| Eli Lilly | PERSON | 0.99+ |
| Las Vegas | LOCATION | 0.99+ |
| 360-degree | QUANTITY | 0.99+ |
| Dave Vellante | PERSON | 0.99+ |
| Big Pharma | ORGANIZATION | 0.99+ |
| Eli Lilly | PERSON | 0.99+ |
| T Juan Lumpkin | PERSON | 0.99+ |
| today | DATE | 0.99+ |
| Ross Rexer | PERSON | 0.99+ |
| 10 years ago | DATE | 0.99+ |
| thursday | DATE | 0.98+ |
| David | PERSON | 0.98+ |
| first time | QUANTITY | 0.98+ |
| 20 years ago | DATE | 0.98+ |
| Eli Lilly | ORGANIZATION | 0.98+ |
| 20 years ago | DATE | 0.98+ |
| 20 years ago | DATE | 0.98+ |
| yesterday | DATE | 0.97+ |
| 20 years ago | DATE | 0.97+ |
| one hundred percent | QUANTITY | 0.96+ |
| single system | QUANTITY | 0.96+ |
| both | QUANTITY | 0.94+ |
| ikea | ORGANIZATION | 0.94+ |
| Ross Rexer | PERSON | 0.94+ |
| one-click | QUANTITY | 0.94+ |
| Wikibon | ORGANIZATION | 0.93+ |
| single systems | QUANTITY | 0.93+ |
| one | QUANTITY | 0.92+ |
| multi-billion dollar | QUANTITY | 0.92+ |
| ServiceNow | EVENT | 0.92+ |
| single system | QUANTITY | 0.92+ |
| single source | QUANTITY | 0.91+ |
| kpmg | ORGANIZATION | 0.91+ |
| single | QUANTITY | 0.9+ |
| each area | QUANTITY | 0.9+ |
| first | QUANTITY | 0.83+ |
| aria hotel | ORGANIZATION | 0.83+ |
| last 10 20 years | DATE | 0.78+ |
| big pharma | ORGANIZATION | 0.73+ |
| ServiceNow | ORGANIZATION | 0.72+ |
| big pharma | ORGANIZATION | 0.72+ |
| single record | QUANTITY | 0.7+ |
| double-click | QUANTITY | 0.65+ |
| FDA | TITLE | 0.6+ |
| lot of work | QUANTITY | 0.6+ |
| tomorrow | DATE | 0.59+ |
| lot of | QUANTITY | 0.58+ |
| SiliconANGLE | LOCATION | 0.56+ |
| lot | QUANTITY | 0.56+ |
| past 17 years | DATE | 0.55+ |
| t1 | ORGANIZATION | 0.53+ |
| Pharma | ORGANIZATION | 0.5+ |
| Rus | PERSON | 0.47+ |
| Ross | ORGANIZATION | 0.47+ |