

Adam Worthington, Ethos Technology | IoTahoe | Data Automated


 

>> Announcer: From around the globe, it's theCUBE, with digital coverage of Data Automated, an event series brought to you by IO Tahoe. >> Okay, we're back with Adam Worthington, who's the CTO and co-founder of Ethos. Adam, good to see you. How are things across the pond? >> Thank you, not too bad. I'm sure it's much the same on your side. >> Okay, so let's set it up. Tell us about yourself, what your role is as CTO, and give us the lowdown on Ethos. >> Sure. So, as you said, I'm CTO and co-founder of Ethos. We're a pretty young company ourselves, we're in our sixth year, and we specialize in emerging, disruptive technologies within the infrastructure, data center and cloud space. My role is the technical lead, so it's kind of my job to be an expert in all of the technologies that we work with, which can be a bit of a challenge if you have a huge portfolio. That's one of the reasons we deliberately keep ours focused, and also why we spend so much time on the validation and evaluation of new technologies. >> So you guys are really technology experts, data experts, and probably also experts in process and delivering customer outcomes, right? >> That's a great word there, Dave: outcomes. That's a lot of what I like to speak to customers about, and sometimes that gets lost, particularly within highly technical fields. The virtualization guy or the networking guy will very quickly start talking about the nuts and bolts of the technology. I'm a techie, I'm absolutely a nerd and I love the best tech out there, but fundamentally we're applying technologies to meet outcomes, to solve business problems, and to enable a better way of working. >> Love it. We love tech too, but really it's all about the customer. So let's talk about smart data. You know, when you throw around terms like this it can feel buzzwordy, but let's get into the meat of it. What does that mean to you? What are the critical aspects of so-called smart data? >> It probably helps to step back a little bit and set the scene in terms of where I came from and the types of problems I try to solve. I'm really an infrastructure solution architect by trade, and organically, over time, I built a personal framework where I focus on three core design principles, whatever it is I'm designing. Obviously customers need different things depending on what technology area we're working with, but what we realized is that those principles can be used more broadly to identify the absolute best-of-breed technologies: the ones that genuinely disrupt, that significantly improve upon the status quo in one or more of those three areas, ideally all three, making things simpler, more flexible and more efficient. And if we look at the challenges that enterprises have around data, and at what smart data really means, maybe it's best to reflect on the opposite end of the story, on why data is often quite dumb in traditional approaches. We have limited visibility into the data that we're storing and using within our infrastructure. What we've ended up with over time, through no fault of the organizations themselves, is silos: silos of expertise, whether that's specialized storage teams, virtualization teams or networking teams, and silos of infrastructure, which create a state of fragmentation, with copies of data in different areas of the infrastructure and replication of that data across application environments.
I think that's what we tend to focus on, and it's resonating with more and more organizations. There's a survey that one of the vendors we work with, a vendor launched about five and a half years ago, commissioned. They worked with a research company called Vanson Bourne on a first-of-its-kind global market study: 900 respondents, all different verticals, lots of different countries, including the U.S. and Germany. And what they found was shocking. It was a recent survey focused on secondary data, but the lessons learned and the information taken out of the survey apply right across the gamut of infrastructure and data organizations. Just to pull out some of the stats: 85% of the organizations surveyed store data in between two and five clouds. 63% of organizations have between four and 16 copies of exactly the same data. Nearly nine out of 10 respondents believe that their organization's secondary data is fragmented across silos, as I touched on, and will become nearly impossible to manage over the long term. And 91% of organizations' leadership, the vast majority, were concerned about the level of visibility their teams have. So those are the kinds of areas that a smart approach to data will directly address: reducing silos, which comes from simplifying and moving away from complexity of infrastructure, reducing the number of copies of data that we hold across the infrastructure, and reducing the number of application environments we maintain. The smarter we get with data, in my eyes anyway, the further we move away from all of that. >> There was a lot in that answer, but I want to try to summarize it. You started with simplicity, flexibility, efficiency. Of course, that's what customers want. And then I was going to ask you what challenges customers are facing, and I think you laid it out here, but I want to pick up on a couple of the data points you talked about. The public cloud trend, which adds complexity and diversity in skill requirements. The copies of data point is so true; data is just like tribbles, if you know the Star Trek franchise, it just expands and replicates. So that's an expense, and it adds complexity. Siloed data means you spend a lot of time trying to figure out who's got the right data and what the real truth is, with a lot of manual processes involved, and the visibility is obviously critical. So those are the problems, and of course you talked about how you address those. But how does it work? What's involved in injecting smarts into your data lifecycle? >> The way I think about it is in terms of the infrastructure first. There are very good reasons why customers are in the situations they're in; they're in this situation because of the limits of traditional approaches. So look at something as fundamental as backup, which is a great example. Backups have typically required completely separate infrastructure to everything else, yet we're talking about the same data set. What would be perfect is if we could back up data and then use it for other things, and that's where a technology provider like Cohesity comes in: although the technology is incredibly simple, it's also incredibly powerful, and it allows identification and consolidation. And then, if you look at just getting insight out of that data, traditional approaches to infrastructure were built for a specific point requirement.
And therefore it wasn't really incumbent on them to expose any information about the data stored within them, which makes it really tricky to do anything else with that data outside of the application. That's where something like IO Tahoe comes in, in terms of abstracting away the complexity and more directly exposing the data. So those are the kinds of areas I focus on. One of my favorite quotes, from the French philosopher and mathematician Blaise Pascal, if I get this right, is: "I would have written a shorter letter, but I did not have the time." I love that quote for lots of reasons, because it encapsulates what we're talking about: it is actually really complicated to develop a technology capability that makes things simple and more directly meets the needs of the business. So you provide self-service capabilities that users can just pick up and drive, making the data and the infrastructure make sense to the business users who consume them. My belief is that the technology shouldn't mean that the users of the technology have to be technology experts. What we really want them to be is business experts, and the technology should enable that on demand. So the types of technologies that get me excited aren't necessarily the most complicated from a technology perspective; they're the ones that are really focused on improving capability. >> Yeah. Okay, so you talked about backup. We're going to hear from Cohesity a little bit later, and they go beyond backup into data protection and data management. That insight piece, the visibility you talked about earlier, that's what IO Tahoe is bringing to the table with its software, so that's another component of the tech stack, if you will. And then you talk about simplicity. We're going to hear from Pure Storage; they're all about simple storage, they call it the modern data experience. So those are some of the aspects, and your job, correct me if I'm wrong, is to kind of put that all together in a solution and then help the customer realize the business outcome we talked about earlier. >> Yeah, exactly. It's in understanding both sides that our ability to deliver on exactly what you just said rests: being experts in the capabilities and the new and better ways of doing things, but also having the business understanding to be able to ask the right questions and identify where a new and better approach is positioned. And you touched on it: the three vendors that we work with that you have on the panel are genuinely, I think, among the most exciting around. Storage, and Pure, is a great example. A lot of the way they've made their way in the market is through impressive simplicity and through reducing data redundancy. But another area that I really like is that, with that platform, you can do more with less, and that's not just about reducing data redundancy; it's about creating application environments that can share the infrastructure to service different requirements. It's able to do that for random I/O, without getting too low level, as well as sequential. What that means is that you don't necessarily have to move data from application environment A, to do one thing, then duplicate it and move it to application environment B, and then to environment C for your analytics, in a left-to-right workflow. You keep the data where it is, use it for different requirements within the infrastructure, and again do more with less. And what that does is not just about simplicity and efficiency.
It significantly reduces the time to value, and that again is a soundbite I want to pick up on, because it resonates with all of the vendors we have on the panel later: the way that they're able to deliver a better TCO, a better ROI, and significantly reduce the time to value of data. But to answer your question, yeah, you're exactly right. It's key for us to understand the customer's environment and position the right technology. >> Adam, I wonder if you could give us your insights, based on your experience with customers, in terms of what success looks like. I'm interested in what they're measuring. I'm big on end-to-end cycle times and taking a systems view, but of course, customers want to measure everything, whether it's the productivity of developers or time to insights, etcetera. What are the KPIs that are driving success and outcomes? >> Those measures have historically, in our space, always been a bit woolly. When you talk about total cost of ownership, about return on investment, about time to value, I've worked in many different companies with many different, often quite complicated, environments and infrastructures, and being able to put together anything realistic that gets proven out, where one solution gets turned into a clear ROI and TCO, is challenging. But now, with these new and better approaches that are more efficient, you're able to build a true story and replicate it wherever you want. The key thing is to start from the data and then get to time to value. So what we help with, in terms of the scoping and in terms of understanding what the requirements are, is to specifically call out the business outcomes the organization is looking to achieve, and then map the metrics back to those outcomes. What that does is a few different things, but above all it provides clear success criteria, whether that's success criteria within a proof of concept or for the overall solution, and being able to speak that language and more directly meet the needs of the business in a crystallized, defined way is something we're only really able to do now with the technologies we work with. >> Yeah, so when you think about the business case, it's the ROI, benefit over cost. Obviously with a lower TCO you lower the denominator, so you're going to increase the value. And I would really stress that the numerator, ultimately, especially in a world of data, is the most important part. The TCO piece is fundamental, it's really becoming table stakes: you've got to have simple, you've got to have efficient, you've got to be agile. But it enables that numerator, whether that's new customer revenue or maybe cost savings across the business, and again that comes from taking that systems view. Do you have examples that you can share with us, even if they're anonymized, of customers you work with that are maybe a little further down the journey, things you can share as proof points here? >> Sure. It's quite easy, and very gratifying, when you've spoken to a customer who has been doing things the same way for 20 years, to be able to say: if you think about it like this, if we implement this technology or this new approach, then we will enable you to simplify and reduce your footprint. I worked on a project where a customer had, I think it was nine,
just under ten, fully loaded racks just for the infrastructure providing the fundamental underlying storage architecture. They were able to consolidate that down, and provide additional capacity and great performance, in less than half of that. And looking at data protection, which you mentioned earlier: another organization, a project which is just nearing completion at the moment, a huge organization with literally petabytes of data servicing their backup and archive. And what they had was not just reams of data, they had a number of different backup applications, depending on which area of the infrastructure they were backing up. So virtualization was backed up one way, a particular database environment was backed up using something else, and another environment was backed up in the cloud. With the consolidated approach that we recommended and worked on with them, they were able to significantly reduce complexity and reduce the amount of time their teams spent managing it. And this is, again, one of the clients that had gone above the threshold of being able to back up: when they tried to do a DR test, they weren't able to bring everything back up within the timescales required for disaster recovery and business continuity. So with this, we were able to prove it with a proof of concept just before they went into production, and in the DR test using the new approach they were able to recover production workloads in minutes instead of hours, and those hours had been for just a handful of workloads. They were able to get up and running with the entire estate in, I think, something like an hour, and the core production systems were up and running practically instantaneously. So if you step back and look at what customers are really looking to achieve, they want to be able to recover from any issues and understand what they're dealing with. And on another note, we worked with a customer recently who had huge challenges around compliance and was understandably very concerned about GDPR. This was a little while ago, but the conversation hasn't gone away; everybody still speaks to issues and concerns around GDPR and understanding where they stand. So putting them in a position to be able to react effectively to a subject access request was a key metric and target for the infrastructure solution we worked on with them, and we were able to provide them with the insight into their data that enables them to react to compliance requests; the time to respond to a subject access request was reduced significantly. >> Awesome, thank you for that. I want to pick up on that a little bit. So in the first example, you get your infrastructure in order to bust down those silos, and when I talk to customers, and I've talked to a number of banks, insurance companies, other financial services firms and manufacturers, when they're able to streamline that data lifecycle and bring in automation and intelligence, if you will, what they tell me is that now they're able to obviously compress the time to value, but also they're loading up on way more initiatives and projects that they can deliver for the business. And you talked before about the line of business having self-service.
The businesses feel like they actually are really invested in the data, that it's their data, that it's not confusing with a lot of finger pointing. So that's huge. And I think your other example is right on as well, of really clear business value that organizations are seeing. So thanks for those. Now is the time, really, to get these houses in order, if you will, because it really drives competitive advantage, and especially, to take your second example, in this isolation economy, being able to respond to things like privacy requests is just increasingly critical. Adam, give us the final thoughts. Bring us home in this segment. >> One thing we didn't particularly touch on, and which I think stays fairly hidden, it isn't spoken about as much as I think it should be, is that traditional approaches to infrastructure, as we've already touched on, can be complicated, and that lack of efficiency impacts users' ability to be agile. What you find with traditional approaches, and you already touched on some of the benefits of the new approaches, is that they're often very prescriptive, designed for a particular use, and the way the infrastructure is served up to the users, in a packaged way, means that they have to use it in that way and in that place. So the self-service aspect that comes in from a flexibility standpoint, for me, in this platform approach, which is the right way to address technology in my eyes, enables the infrastructure to be used effectively, so that the business users, the data users, have this capability in their hands and can start innovating in the way that they use it and the way that they derive benefits from it, rather than the platform being too prescriptive to allow that. So what you get with these new approaches, beyond all of the metrics we touched on, which are fantastic from a cost standpoint and from a visibility standpoint, is that the innovators in the business, the people who really understand what they're looking to achieve, now have the tools to innovate with. And I've started to see that with projects we've completed: if you do it in the right way, if you articulate the capability and empower the business users in the right way, then they're in a significantly better position to take advantage of this and really move ahead of their competition. >> Super, Adam. It's a really exciting space. We've spent the last 10 years gathering all this data, trying to slog through it and figure it out, and now, with the tools that we have and the automation capabilities, it really is a new era of innovation and insights. So, Adam Worthington, thanks so much for coming on theCUBE and participating in this program. >> Exciting times. And thank you very much for today. >> Alright, stay safe, and thank you everybody. This is Dave Vellante for theCUBE.

Published Date : Jul 29 2020



Lester Waters, Patrick Smith & Ezat Dayeh | IoTahoe | Data Automated


 

>> Announcer: From around the globe, it's theCUBE, with digital coverage of Data Automated, an event series brought to you by IO Tahoe. >> Welcome back everybody to the power panel, driving business performance with smart data life cycles. Lester Waters is here. He's the chief technology officer from IO Tahoe, he's joined by Patrick Smith, who is field CTO from Pure Storage and Ezat Dayeh, who's a system engineering manager at Cohesity. Gentlemen, good to see you. Thanks so much for coming on this panel. >> Thank you, Dave. >> Let's start with Lester. I wonder if each of you could just give us a quick overview of your role and what's the number one problem that you're focused on solving for your customers? Let's start with Lester please. >> Yes, I'm Lester Waters, chief technology officer for IO Tahoe, and really the number one problem that we are trying to solve for our customers is to help them understand what they have. 'Cause if they don't understand what they have in terms of their data, they can't manage it, they can't control it, they can't monitor it. They can't ensure compliance. So really, finding out all you can about the data that you have and building a catalog that can be readily consumed by the entire business is what we do. >> Great. All right, Patrick, field CTO in your title. That says to me you're talking to customers all the time. So you've got a good perspective on it. Give us, you know, your take on things here. >> Yeah, absolutely. So my patch is EMEA and I talk to customers and prospects in lots of different verticals across the region. And as they look at their environments and their data landscape, they're faced with massive growth in the data that they're trying to analyze and demands to be able to get insight faster and to deliver business value faster than they've ever had to do in the past. So big challenges that we're seeing across the region. >> Got it. And Ezat, Cohesity? You're like the new kid on the block, you guys are really growing rapidly, created this whole notion of data management backup and beyond, but from a system engineering manager, what are you seeing from customers, your role and the number one problem that you're solving? >> Yeah, sure. So the number one problem I see time and again, speaking with customers, falls around data fragmentation. So due to things like organic growth, you know, even maybe budgetary limitations, infrastructure has grown over time, very piecemeal, and it's highly distributed internally. And just to be clear, you know, when I say internally, you know, that could be that it's on multiple platforms or silos within an on-prem infrastructure, but it also does extend to the cloud as well. So we've seen, you know, over the past few years, a big drive towards cloud consumption, almost at any cost in some examples. You know, there could be business reasons like moving from things like CapEx to more of an OPEX model. And what this has done is it's gone on to create further silos, you know, both on-prem and also in the cloud. And while short term needs may be met by doing that, what it's doing is it's causing longer term problems and it's reducing the agility for these customers to be able to change and transform. >> Right, hey cloud is cool. Everybody wants to be in the cloud, right? So you're right. It creates maybe unintended consequences. So let's start with the business outcome and kind of try to work backwards. I mean, people, you know, they want to get more insights from data.
They want to have a more efficient data life cycle, but so Lester, let me start with you, thinking about like the North star to creating data-driven cultures, you know, what is the North star for customers here? >> I think the North star in a nutshell is driving value from your data without question. I mean, we differentiate ourselves these days by even in nuances in our data. Now, underpinning that there's a lot of things that have to happen to make that work out well, you know, for example, making sure you adequately protect your data, you know, do you have a good, do you have a good storage subsystem? Do you have a good backup and recovery point objectives, recovery time objectives? Do you, are you fully compliant? Are you ensuring that you're ticking all the boxes? There's a lot of regulations these days in term, with respect to compliance, data retention, data privacy, and so forth. Are you ticking those boxes? Are you being efficient with your data? You know, in other words, I think there's a statistic that someone mentioned to me the other day, that 53% of all businesses have between three and 15 copies of the same data. So, you know, finding and eliminating those is part of the, part of the problem is you need to chase. >> Yeah, so Patrick and Ezat, I mean, you know, Lester touched on a lot of the areas that you guys are involved in. I like to think of, you know, you're right. Lester, no doubt, business value, and a lot of that comes from reducing the end to end cycle times, but anything that you guys would, would add to that, Patrick, maybe start with Patrick. >> Yeah, I think, I think getting value from data really hits on, it hits on what everyone wants to achieve, but I think there are a couple of key steps in doing that. First of all, is getting access to the data and that really hits three big problems. Firstly, working out what you've got. Secondly, after working out what you've got, how to get access to it, because it's all very well knowing you've got some data, but if you can't get access to it, either because of privacy reasons, security reasons, then that's a big challenge. And then finally, once you've got access to the data, making sure that you can process that data in a timely manner and at the scale that you need to, to deliver your business objectives. So I think those are really three key steps in successfully getting value from the data within our organization. >> Ezat, I'll ask you, anything else you'd fill in? >> Yeah, so the guys have touched on a lot of things already. For me, you know, it would be that an organization has got a really good global view of all of its data. It understands the data flow and dependencies within their infrastructure, understands the precise legal and compliance requirements and have the ability to action changes or initiatives within their environment, forgive the pun, but with a cloud-like agility. You know, and that's no easy feat, right? That is hard work. Another thing as well is that it's for companies to be mature enough, to truly like delete and get rid of unneeded data from their system. You know, I've seen so many times in the past, organizations paying more than they need to because they've acquired a lot of data baggage. Like it just gets carried over from refresh to refresh. And, you know, if you can afford it great, but chances are, you want to be as competitive as possible. 
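The scale of the copy problem Lester and Ezat describe is easy to check for yourself. The sketch below is a minimal, generic illustration in plain Python, not any of the panelists' products: it walks a few directories standing in for silos, groups files by content hash, and estimates the capacity tied up in redundant copies. The directory paths are hypothetical placeholders.

```python
# Minimal sketch: find redundant copies of the same data across "silos"
# (directories) by hashing file contents. Illustrative only -- real
# data-management platforms do this continuously, at catalog scale.
import hashlib
import os
from collections import defaultdict

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicate_copies(roots):
    """Group files from several locations by content hash."""
    by_hash = defaultdict(list)
    for root in roots:
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    by_hash[sha256_of(path)].append(path)
                except OSError:
                    continue  # unreadable file; skip rather than fail the scan
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # Hypothetical silos -- replace with real mount points.
    silos = ["/mnt/backup_copies", "/mnt/test_env", "/mnt/analytics_extracts"]
    dupes = find_duplicate_copies(silos)
    wasted = sum(
        os.path.getsize(paths[0]) * (len(paths) - 1) for paths in dupes.values()
    )
    print(f"{len(dupes)} distinct items have redundant copies")
    print(f"~{wasted / 1e9:.1f} GB could be reclaimed by consolidating them")
```

The principle is the one the panel keeps returning to: identify identical content before buying more capacity to store it yet again.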
And what happens is that this results in, you know, spend that is unnecessary, not just in terms of acquisition, but also in terms of maintaining the infrastructure, but then the other knock on effect as well is, you know, from a compliance and a security point of view, you're exposing yourself. So, you know, if you don't need it, delete it or at least archive it. >> Okay, So we've talked about the challenges in some of the objectives, but there's a lot of blockers out there, and I want to understand how you guys are helping remove them. So Lester, what are some of those blockers? I mean, I can mention a couple, there's their skillsets. There's obviously you talked about the problem of siloed data, but there's also data ownership. That's my data. There's budget issues. What do you see as some of the big blockers in terms of people really leaning in to this smart data life cycle? >> Yeah, silos is probably one of the biggest one I see in businesses. Yes, it's my data, not your data. Lots of compartmentalization and breaking that down is one of the, one of the challenges and having the right tools to help you do that is only part of the solution. There's obviously a lot of cultural things that need to take place to break down those silos and work together. If you can identify where you have redundant data across your enterprise, you might be able to consolidate those, you know, bring together applications. A lot of companies, you know, it's not uncommon for a large enterprise to have, you know, several thousand applications, many of which have their own instance of the very same data. So if there's a customer list, for example, it might be in five or six different sources of truth. And there's no reason to have that, and bringing that together by bringing those things together, you will start to tear down the business boundary silos that automatically exist. I think, I think one of the other challenges too, is self service. As Patrick mentioned, gaining access to your data and being able to work with it in a safe and secure fashion, is key here. You know, right now you typically raise a ticket, wait for access to the data, and then maybe, you know, maybe a week later out pops the bit you need and really, you know, with data being such a commodity and having timeliness to it, being able to have quick access to that data is key. >> Yeah, so I want to go to Patrick. So, you know, one of the blockers that I see is legacy infrastructure, technical debt, sucking all the budget. You've got, you know, too many people having to look after, you know, storage. It's just, it's just too complicated. And I wonder if you have, obviously that's my perspective, what's your perspective on that? >> Yeah, absolutely. We'd agree with that. As you look at the infrastructure that supports people's data landscapes today, for primarily legacy reasons, the infrastructure itself is siloed. So you have different technologies with different underlying hardware, different management methodologies that are there for good reason, because historically you had to have specific fitness for purpose, for different data requirements. That's one of the challenges that we tackled head on at Pure with the flash blade technology and the concept of the data hub, a platform that can deliver in different characteristics for the different workloads, but from a consistent data platform. And it means that we get rid of those silos. It means that from an operational perspective, it's far more efficient. 
And once your data set is consolidated into the data hub, you don't have to move that data around. You can bring your applications and your workloads to the data rather than the other way around. >> Now, Ezat, I want to go to you because you know, in the world, in your world, which to me goes beyond backup. I mean, one of the challenges is, you know, they say backup is one thing. Recovery is everything, But as well, the CFO doesn't want to pay for just protection. And one of the things that I like about what you guys have done is you've broadened the perspective to get more value out of your, what was once seen as an insurance policy. I wonder if you could talk about that as a blocker and how you're having success removing it. >> Yeah, absolutely. So, you know, as well as what the guys have already said, you know, I do see one of the biggest blockers as the fact that the task at hand can, you know, can be overwhelming for customers and it can overwhelm them very, very quickly. And that's because, you know, this stuff is complicated. It's got risk, you know, people are used to the status quo, but the key here is to remember that it's not an overnight change. It's not, you know, a flick of a switch. It's something that can be tackled in a very piecemeal manner, and absolutely like you you said, you know, reduction in TCO and being able to leverage the data for other purposes is a key driver for this. So like you said, you know, for us specifically, one of the areas that we help customers around with first of all, it's usually data protection. It can also be things like consolidation of unstructured file data. And, you know, the reason why customers are doing this is because legacy data protection is very costly. You know, you'd be surprised how costly it is. A lot of people don't actually know how expensive it can be. And it's very complicated involving multiple vendors. And it's there really to achieve one goal. And the thing is, it's very inflexible and it doesn't help towards being an agile data driven company. So, you know, this can be, this can be resolved. It can be very, you know, pretty straightforward. It can be quite painless as well. Same goes for unstructured data, which is very complex to manage. And, you know, we've all heard the stats from the analysts, you know, data obviously is growing at an extremely rapid rate. But actually when you look at that, you know, how is it actually growing? 80% of that growth is actually in unstructured data. And only 20% of that growth is in structured data. So, you know, these are quick win areas that the customers can realize. Immediate TCO improvement and increased agility as well, when it comes to managing and automating their infrastructure. So, yeah, it's all about making, you know, doing more with, with what you have. >> So let's paint a picture of this guys, if you could bring up the life cycle, I want to explore that a little bit and ask each of you to provide a perspective on this. And so, you know, what you can see here is you've got this, this cycle, the data life cycle, and what we're wanting to do is really inject intelligence or smarts into this life cycle, you can see, you start with ingestion or creation of data. You're storing it. You got to put it somewhere, right? You got to classify it, you got to protect it. And then of course you want to, you know, reduce the copies, make it efficient, and then you want to prepare it, so the businesses can actually consume it. And then you've got clients and governance and privacy issues. 
And at some point when it's legal to do so, you want to get rid of it. We never get rid of stuff in technology. We keep it forever. But I wonder if we could start with you Lester. This is, you know, the picture of the life cycle. What role does automation play in terms of injecting smarts into the life cycle? >> Automation is key here. You know, especially from the discover catalog and classified perspective. I've seen companies where we, where they go and will take and dump their, all of their database schemes into a spreadsheet so that they can sit down and manually figure out what attribute 37 needs for a column name. And that's only the tip of the iceberg. So being able to automatically detect what you have, automatically deduce what's consuming the data, you know, upstream and downstream, being able to understand all of the things related to the life cycle of your data, backup archive, deletion. It is key. So having good tools is very important. >> So Patrick, obviously you participated in the store piece of this picture. So I wonder if you could just talk more specifically about that, but I'm also interested in how you affect the whole system view, the end to end cycle time. >> Yeah, I think Lester kind of hit the nail on the head in terms of the importance of automation, because data volumes are just so massive now that you, you can't, you can't effectively manage or understand or catalog your data without automation. But once you, once you understand the data and the value of the data, then that's where you can work out where the data needs to be at any point in time. And that's where we come into play. You know, if data needs to be online, if it's hot data, if it's data that needs to be analyzed, and, you know, we're moving to a world of analytics where some of our customers say, there's no such thing as cold data anymore, then it needs to be on a performance platform, but you need to understand exactly what the data is that you have to work out where to place it and where it fits into that data life cycle. And then there's that whole challenge of protecting it through the life cycle, whether that's protecting the hot data or as the data moves off into, you know, into an archive or into a cold store, still making sure you know where it is, and easily retrievable, should you need to move it back into the working set. So I think automation is key, but also making sure that it ties into understanding where you place your data at any point in time. >> Right, so Pure and Cohesity, obviously, partner to do that. And of course, Ezat, you guys are part of the protect, you're certainly part of the retain, but also you provide data management capabilities and analytics. I wonder if you could add some color there. >> Yeah, absolutely. So like you said, you know, we focus pretty heavily on data protection as just one of our areas and that infrastructure, it is just sitting there really you know, the legacy infrastructure, it's just sitting there, you know, consuming power, space cooling and pretty inefficient. And, you know, one of our main purposes is like we said, to make that data useful and automating that process is a key part of that, right? So, you know, not only are we doing things like obviously making it easier to manage, improving RPOs and RTOs with policy-based SLAs, but we're making it useful and having a system that can be automated through APIs and being an API first based system. It's almost mandatory now when you're going through a digital, you know, digital transformation. 
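To make Lester's automation point concrete: instead of dumping schemas into a spreadsheet and deciding by hand what "attribute 37" is, a discovery tool samples values and tags columns automatically, feeding a catalog. The snippet below is a deliberately simplified sketch of that idea in plain Python; the regex rules, the threshold and the in-memory catalog are illustrative assumptions, not IO Tahoe's actual engine.

```python
# Simplified sketch of automated discovery and classification: sample column
# values, tag likely data classes, and emit catalog entries.
import re
from collections import Counter

CLASSIFIERS = {
    "email":       re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "phone":       re.compile(r"^\+?[\d\s().-]{7,15}$"),
    "date":        re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "card_number": re.compile(r"^(?:\d[ -]?){13,19}$"),
}

def classify_column(sample_values, threshold=0.8):
    """Tag a column if most sampled values match one pattern."""
    values = [v for v in sample_values if v]
    votes = Counter()
    for value in values:
        for label, pattern in CLASSIFIERS.items():
            if pattern.match(str(value).strip()):
                votes[label] += 1
    if values and votes:
        label, hits = votes.most_common(1)[0]
        if hits / len(values) >= threshold:
            return label
    return "unclassified"

def build_catalog(tables):
    """tables: {table: {column: [sampled values]}} -> list of catalog rows."""
    catalog = []
    for table, columns in tables.items():
        for column, samples in columns.items():
            label = classify_column(samples)
            catalog.append({
                "table": table,
                "column": column,
                "classification": label,
                "sensitive": label in {"email", "phone", "card_number"},
            })
    return catalog

if __name__ == "__main__":
    # Hypothetical sampled data standing in for a real database scan.
    sampled = {
        "customers": {
            "attr_37": ["alice@example.com", "bob@example.org"],
            "created": ["2020-07-29", "2020-06-01"],
        }
    }
    for row in build_catalog(sampled):
        print(row)
```

In practice the same classifications drive downstream policy, for instance which columns must be masked before a dataset is handed to self-service users.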
And one of the things that we can do is as part of that automation, is that we can make copies of data without consuming additional capacity available, pretty much instantaneously. You might want to do that for many different purposes. So examples of that could be, you know, for example, reproducing copies of production data for development purposes, or for testing new applications for example. And you know, how would you, how would you go about doing that in a legacy environment? The simple answer is it's painfully, right? So you just can't do those kinds of things. You know, I need more infrastructure to store the data. I need more compute to actually perform the things that I want to do on it, such as analytics, and to actually get a copy of that data, you know, I have to either manually copy it myself or I restore from a backup. And obviously all of that takes time, additional energy. And you end up with a big sprawling infrastructure, which isn't a manageable, like Patrick said, it's just the sheer amount of data, you know, it doesn't, it doesn't warrant doing that anymore. So, you know, if I have a modern day platform such as, you know, the Cohesity data platform, I can actually do a lot of analytics on that through applications. So we have a marketplace for apps. And the other great thing is that it's an open system, right? So anybody can develop an app. It's not just apps that are developed by us. It can be third parties, it could be customers. And with the data being consolidated in one place, you can then start to start to realize some of these benefits of deriving insights out of your data. >> Yeah, I'm glad you brought that up earlier in your little example there, because you're right. You know, how do you deal with that? You throw people at the problem and it becomes nights and weekends, and that sort of just fails. It doesn't scale. I wonder if we could talk about metadata. It's increasingly important. Metadata is data about the data, but Lester, maybe explain why it's so important and what role it plays in terms of creating smart data lifecycle. >> Well, yes, metadata, it does describe the data, but it's, a lot of people think it's just about the data itself, but there's a lot of extended characteristics about your data. So, imagine if for my data life cycle, I can communicate with the backup system from Cohesity and find out when the last time that data was backed up, or where it's backed up to. I can communicate exchange data with Pure Storage and find out what tier it's on. Is the data at the right tier commensurate with its use level that Patrick pointed out? And being able to share that metadata across systems. I think that's the direction that we're going in. Right now we're at the stage, we're just identifying the metadata and trying to bring it together and catalog it. The next stage will be, okay using the APIs that we have between our systems. Can we communicate and share that data and build good solutions for our customers to use? >> I think it's a huge point that you just made. I mean, you know, 10 years ago, automating classification was the big problem and it was machine intelligence. You know, we're obviously attacking that, but your point about as machines start communicating to each other and you start, you know, it's cloud to cloud, there's all kinds of metadata, kind of new metadata that's being created. I often joke that someday there's going to be more metadata than the data. So that brings us to cloud. 
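Lester's example of systems exchanging metadata, asking the backup platform when a dataset was last protected and the storage platform which tier it sits on, amounts to a join on a shared dataset identifier. The sketch below assumes two hypothetical exports rather than the Cohesity or Pure Storage APIs, and simply shows the shape of that consolidated view, flagging datasets with stale backups or a tier that no longer matches their usage.

```python
# Sketch of a consolidated metadata view: join records from a hypothetical
# backup-catalog export and a hypothetical storage-tier report, then flag
# datasets that look stale or mis-tiered. No vendor API is used here.
from datetime import datetime, timedelta

backup_catalog = {   # dataset -> last successful backup (hypothetical export)
    "crm_db":  {"last_backup": "2020-07-28T01:10:00"},
    "logs_q2": {"last_backup": "2020-06-30T04:00:00"},
}
storage_report = {   # dataset -> tier and last access (hypothetical export)
    "crm_db":  {"tier": "flash", "last_access": "2020-07-28T22:00:00"},
    "logs_q2": {"tier": "flash", "last_access": "2020-04-02T09:00:00"},
}

def consolidated_view(now, backup_sla_hours=24, cold_after_days=60):
    now = datetime.fromisoformat(now)
    view = []
    for dataset in sorted(set(backup_catalog) | set(storage_report)):
        backup = backup_catalog.get(dataset, {})
        storage = storage_report.get(dataset, {})
        last_backup = backup.get("last_backup")
        last_access = storage.get("last_access")
        stale_backup = (
            last_backup is None
            or now - datetime.fromisoformat(last_backup) > timedelta(hours=backup_sla_hours)
        )
        mis_tiered = (
            storage.get("tier") == "flash"
            and last_access is not None
            and now - datetime.fromisoformat(last_access) > timedelta(days=cold_after_days)
        )
        view.append({
            "dataset": dataset,
            "tier": storage.get("tier", "unknown"),
            "stale_backup": stale_backup,
            "candidate_for_cold_tier": mis_tiered,
        })
    return view

if __name__ == "__main__":
    for row in consolidated_view("2020-07-29T12:00:00"):
        print(row)
```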
And Ezat, I'd like to start with you, because you were talking about some cloud creep before. So what's your take on cloud? I mean, you've got private clouds, you got hybrid clouds, public clouds, inter clouds, IOT, and the edge is sort of another form of cloud. So how does cloud fit into the data life cycle? How does it affect the data life cycle? >> Yeah, sure. So, you know, I do think, you know, having the cloud is a great thing and it has got its role to play and you can have many different permutations and iterations of how you use it. And, you know, as I, as I may have sort of mentioned previously, you know, I've seen customers go into the cloud very, very quickly. And actually recently they're starting to remove web codes from the cloud. And the reason why this happens is that, you know, cloud has got its role to play, but it's not right for absolutely everything, especially in their current form as well. So, you know, a good analogy I like to use, and this may sound a little bit cliche, but you know, when you compare clouds versus on premises data centers, you can use the analogy of houses and hotels. So to give you an idea, so, you know, when we look at hotels, that's like the equivalent of a cloud, right? I can get everything I need from there. I can get my food, my water, my outdoor facilities. If I need to accommodate more people, I can rent some more rooms. I don't have to maintain the hotel. It's all done for me. When you look at houses, the equivalent to, you know, on premises infrastructure, I pretty much have to do everything myself, right? So I have to purchase the house. I have to maintain it. I have to buy my own food and water, eat it. I have to make improvements myself, but then why do we all live in houses, not in hotels? And the simple answer that I can, I can only think of is, is that it's cheaper, right? It's cheaper to do it myself, but that's not to say that hotels haven't got their role to play. You know, so for example, if I've got loads of visitors coming over for the weekend, I'm not going to go and build an extension to my house, just for them. I will burst into my hotel, into the cloud, and use it for, you know, for things like that. And you know, if I want to go somewhere on holiday, for example, then I'm not going to go buy a house there. I'm going to go in, I'm going to stay in a hotel, same thing. I need some temporary usage. You know, I'll use the cloud for that as well. Now, look, this is a loose analogy, right? But it kind of works. And it resonates with me at least anyway. So what I'm really saying is the cloud is great for many things, but it can work out costlier for certain applications while others are a perfect fit. So when customers do want to look at using the cloud, it really does need to be planned in an organized way, you know, so that you can avoid some of the pitfalls that we're talking about around, for example, creating additional silos, which are just going to make your life more complicated in the long run. So, you know, things like security planning, you know, adequate training for staff is absolutely a must. We've all seen the, you know, the horror stories in the press where certain data maybe has been left exposed in the cloud. Obviously nobody wants to see that. So as long as it's a well planned and considered approach, the cloud is great and it really does help customers out. >> Yeah, it's an interesting analogy. I hadn't thought of that before, but you're right. 
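Ezat's houses-and-hotels analogy is ultimately a break-even calculation: owned infrastructure has a fixed amortized cost whether you use it or not, while the cloud scales with consumption. The back-of-the-envelope sketch below uses entirely hypothetical prices, just to show where the crossover sits for one workload.

```python
# Illustrative break-even behind the "houses vs hotels" analogy: steady
# workloads tend to favour owned infrastructure, bursts favour on demand.
# All figures are hypothetical.
def monthly_cost_on_prem(capex, useful_life_months, monthly_opex):
    """Amortized cost of owned infrastructure, regardless of utilization."""
    return capex / useful_life_months + monthly_opex

def monthly_cost_cloud(hours_used, hourly_rate):
    """Pay-as-you-go cost: scales with the hours actually consumed."""
    return hours_used * hourly_rate

if __name__ == "__main__":
    on_prem = monthly_cost_on_prem(capex=60_000, useful_life_months=48, monthly_opex=400)
    for hours in (100, 300, 730):  # burst, part-time, and always-on usage
        cloud = monthly_cost_cloud(hours, hourly_rate=2.50)
        cheaper = "cloud" if cloud < on_prem else "on-prem"
        print(f"{hours:>4} h/month: cloud ${cloud:,.0f} vs on-prem ${on_prem:,.0f} -> {cheaper}")
```

With these made-up numbers the cloud wins for bursty usage and the owned kit wins once the workload runs most of the month, which is exactly the planning exercise Ezat is arguing for.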
'Cause I was going to say, well, part of it is you want the cloud experience everywhere, but you don't always want the cloud experience, especially, you know, when you're with your family, you want certain privacy. I've not heard that before Ezat, so that's a new perspective, so thank you. But so, but Patrick, I do want to come back to that cloud experience because in fact, that's what's happening in a lot of cases. Organizations are extending the cloud properties of automation on-prem and in hybrid. And certainly you guys have done that. You've created, you know, cloud-based capabilities. They can run in AWS or wherever, but what's your take on cloud? What's Pure's perspective? >> Yeah, I thought Ezat brought up a really interesting point and a great analogy for the use of the public cloud, and it really reinforces the importance of the hybrid and multicloud environment, because it gives you that flexibility to choose where is the optimal environment to run your business workloads. And that's what it's all about. And the flexibility to change which environment you're running in, either from one month to the next or from one year to the next, because workloads change and the characteristics that are available in the cloud change on a pretty frequent basis. It's a fast moving world. So one of the areas of focus for us with our cloud block store technology is to provide effectively a bridge between the on-prem cloud and the public cloud, to provide that consistent data management layer that allows customers to move their data where they need it, when they need it. And the hybrid cloud is something that we've lived with ourselves at Pure. So our Pure1 management technology actually sits in a hybrid cloud environment. We started off entirely cloud native, but now we use the public cloud for compute and we use our own technology, at the end of a high performance network link, to support our data platform. So we get the best of both worlds. And I think that's where a lot of our customers are trying to get to: cloud flexibility, but also efficiency and optimization. >> All right, I want to come back in a moment there, but before we do, Lester, I wonder if we could talk a little bit about compliance, governance and privacy. You know, a lot of that comes down to data. The EU right now, I think the Brits on this panel are still in the EU for now, but the EU are looking at new rules, new regulations going beyond GDPR, tightening things up, specifically kind of pointing at the cloud. Where does privacy, governance, compliance fit into the data life cycle? And then Ezat, I want your thoughts on this as well. >> Yeah, this is a very important point because the landscape for compliance around data privacy and data retention is changing very rapidly and being able to keep up with those changing regulations in an automated fashion is the only way you're going to be able to do it. Even, I think there's some sort of a ruling coming out today or tomorrow with a change to GDPR. So these are all very key points, and being able to codify those rules into some software, whether, you know, IO Tahoe or your storage system or Cohesity, that'll help you be compliant is crucial. >> Yeah, Ezat, anything you can add there? I mean, this really is your wheelhouse. >> Yeah, absolutely. So, you know, I think anybody who's watching this probably has gotten the message that, you know, less silos is better. And then absolutely it also applies to data in the cloud as well.
So, you know, by aiming to consolidate into fewer platforms, customers can realize a lot better control over their data. And then natural effect of this is that it makes meeting compliance and governance a lot easier. So when it's consolidated, you can start to confidently understand who is accessing your data, how frequently are they accessing the data? You can also do things like detecting anomalous file access activities, and quickly identify potential threats. You know, and this can be delivered by apps which are running on one platform that has consolidated the data as well. And you can also start getting into lots of things like, you know, rapidly searching for PII. So personally identifiable information across different file types. And you can report on all of this activity back to the business, by identifying, you know, where are you storing your copies of data? How many copies have you got and who has access to them? These are all becoming table stakes as far as I'm concerned. >> Right, right. >> The organizations continue that move into digital transformation and more regulation comes into law. So it's something that has to be taken very, very seriously. The easier you make your infrastructure, the easier it will be for you to comply with it. >> Okay, Patrick, we were talking, you talked earlier about storage optimization. We talked to Adam Worthington about the business case. You get the sort of numerator, which is the business value and then the denominator, which is the cost. And so storage efficiency is obviously a key part of it. It's part of your value proposition to pick up on your sort of earlier comments, and what's unique about Pure in this regard? >> Yeah, and I think there are, there are multiple dimensions to that. Firstly, if you look at the difference between legacy storage platforms, they used to take up racks or isles of space in a data center with flash technology that underpins flash blade, we effectively switch out racks for rack units. And it has a big play in terms of data center footprint, and the environmentals associated with the data center, but it doesn't stop at that. You know, we make sure that we efficiently store data on our platforms. We use advanced compression techniques to make sure that we make flash storage as cost competitive as we possibly can. And then if you look at extending out storage efficiencies and the benefits it brings, just the performance has a direct effect on staff, whether that's, you know, the staff and the simplicity of the platform, so that it's easy and efficient to manage, or whether it's the efficiency you get from your data scientists who are using the outcomes from the platform and making them more efficient. If you look at some of our customers in the financial space, their time to results are improved by 10 or 20 X by switching to our technology from legacy technologies for their analytics platforms. >> So guys we've been running, you know, CUBE interviews in our studios remotely for the last 120 days, it's probably the first interview I've done where I haven't started off talking about COVID, but digital transformation, you know, BC, before COVID. Yeah, it was real, but it was all of a buzzy wordy too. And now it's like a mandate. So Lester, I wonder if you could talk about smart data life cycle and how it fits into this isolation economy and hopefully what will soon be a post isolation economy? >> Yeah, COVID has dramatically accelerated the data economy. 
I think, you know, first and foremost, we've all learned to work at home. I, you know, we've all had that experience where, you know, there were people who would um and ah about being able to work at home just a couple of days a week. And here we are working five days a week. That's had a knock on impact to infrastructure to be able to support that. But going further than that, you know, the data economy is all about how a business can leverage their data to compete in this new world order that we are now in. So, you know, they've got to be able to drive that value from their data and if they're not prepared for it, they're going to falter. We've unfortunately seen a few companies that have faltered because they weren't prepared for this data economy. This is where all your value is driven from. So COVID has really been a forcing function to, you know, it's probably one of the few good things that have come out of COVID, is that we have been forced to adapt. And it's been an interesting journey and it continues to be so. >> Well, is that too, you know, everybody talks about business resiliency, ransomware comes into effect here, and Patrick, you, you may have some thoughts on this too, but Ezat, your thoughts on the whole work from home pivot and how it's impacting the data life cycle. >> Absolutely, like, like Lester said, you know, we've, we're seeing a huge impact here. You know, working from home has, has pretty much become the norm now. Companies have been forced into basically making it work. If you look at online retail, that's accelerated dramatically as well. Unified communications and video conferencing. So really, you know, the point here is that yes, absolutely. You know, we've compressed you know, in the past maybe four months, what probably would have taken maybe even five years, maybe 10 years or so. And so with all this digital capability, you know, when you talk about things like RPOs and RTOs, these things are, you know, very much, you know, front of mind basically and they're being taken very seriously. You know, with legacy infrastructure, you're pretty much limited with what you can do around that. But with next generation, it puts it front and center. And when it comes to, you know, to ransomware, of course, it's not a case of if it's going to happen, it's a case of when it's going to happen. Again, we've all seen lots of stuff in the press, different companies being impacted by this, you know, both private and public organizations. So it's a case of, you know, you have to think long and hard about how you're going to combat this, because actually malware also, it's becoming, it's becoming a lot more sophisticated. You know, what we're seeing now is that actually, when, when customers get impacted, the malware will sit in their environment and it will have a look around it, it won't actually do anything. And what it's actually trying to do is, it's trying to identify things like your backups, where are your backups? Because you know, what do, what do we all do? If we get hit by a situation like this, we go to our backups. But you know, the bad actors out there, they, you know, they're getting pretty smart as well. And if your legacy solution is sitting on a system that can be compromised quite easily, that's a really bad situation, you know, waiting to happen. And, you know, if you can't recover from your backups, essentially, unfortunately, you know, people are going to be making trips to the bank because you're going to have to pay to get your data back. 
>> So guys, we've been running CUBE interviews remotely from our studios for the last 120 days, and this is probably the first interview I've done where I haven't started off talking about COVID. Digital transformation before COVID was real, but it was also a bit buzzwordy; now it's a mandate. So Lester, I wonder if you could talk about the smart data life cycle and how it fits into this isolation economy, and hopefully what will soon be a post-isolation economy?

>> Yeah, COVID has dramatically accelerated the data economy. First and foremost, we've all learned to work at home. We've all had the experience where people would um and ah about working from home just a couple of days a week, and here we are working five days a week. That's had a knock-on impact on the infrastructure needed to support it. But going further than that, the data economy is all about how a business can leverage its data to compete in this new world order we're now in. They've got to be able to drive value from their data, and if they're not prepared for it, they're going to falter. We've unfortunately seen a few companies falter because they weren't prepared for this data economy; it's where all your value is driven from. So COVID has really been a forcing function, and it's probably one of the few good things to come out of it: we've been forced to adapt. It's been an interesting journey, and it continues to be so.

>> Well, there's that too. Everybody talks about business resiliency, and ransomware comes into effect here. Patrick, you may have some thoughts on this too, but Ezat, your thoughts on the whole work-from-home pivot and how it's impacting the data life cycle?

>> Absolutely. Like Lester said, we're seeing a huge impact here. Working from home has pretty much become the norm now, and companies have been forced into making it work. Online retail has accelerated dramatically as well, along with unified communications and video conferencing. The point is that in the past four months or so we've compressed what would probably have taken five, maybe even ten years. With all that digital capability, things like RPOs and RTOs are very much front of mind and being taken very seriously. With legacy infrastructure you're pretty limited in what you can do around that, but next-generation infrastructure puts it front and center. And when it comes to ransomware, of course, it's not a case of if it's going to happen, it's a case of when. We've all seen the press coverage of companies being impacted, both private and public organizations, so you have to think long and hard about how you're going to combat it, because malware is also becoming a lot more sophisticated. What we're seeing now is that when customers get hit, the malware will sit in their environment and have a look around without actually doing anything; what it's trying to do is identify things like your backups. Because what do we all do if we get hit by a situation like this? We go to our backups. But the bad actors are getting pretty smart as well, and if your backup solution is sitting on a legacy system that can be compromised quite easily, that's a really bad situation waiting to happen. If you can't recover from your backups, unfortunately people end up making trips to the bank, because you're going to have to pay to get your data back.

And of course, nobody wants to see that happening. So one of the ways we look to help customers defend against this is a three-pronged approach: protect, detect, and respond. On protect, let me say first of all that this isn't a silver bullet; security is an industry all of itself, it's very complicated, and the approach has to be layered. What Cohesity helps customers with is protecting that insurance policy, the backups, by ensuring the data is immutable and cannot be edited in any way, which is inherent to our file system. And it's not just external actors you have to think about, it's potentially internal bad actors as well, so being able to data-lock your information so that even administrators can't change, edit, or delete it is another way we help customers protect, along with things like multifactor authentication.

Okay, so we've protected the data; now it comes to detection. Being ingrained in data protection, we have a good view of what's happening with all the data flowing around the organization, and if we start to see, for example, that backup times or backup data quantities are suddenly spiking, we use AI and machine learning to highlight it. Once we detect an anomaly like that, we alert our users, and not only do we say, look, we think something might be going on with your systems, we also point them to a known good recovery point, so they don't have to sit searching for when the attack hit and which recovery point to use. We use metadata to do all of this with our global management platform, Helios, which runs in the cloud.

And when we find this kind of thing, we can recover very, very quickly, which comes back to the RPOs and the RTOs. Your recovery point objective we can shrink, which essentially means you lose less data. More importantly, your recovery time objective, should something happen and we need to recover that data, we can shrink dramatically as well. With legacy technology, when something like this happens you might be waiting hours, most likely days, possibly even weeks or months depending on the severity, whereas we're talking about bringing back a few hundred virtual machines in seconds and minutes. When you think about the value that gives an organization, it becomes a no-brainer as far as I'm concerned. So that really covers how we respond to these situations: protect, detect, and respond.

>> Great summary. My takeaway is that the adversaries are very, very capable. You've got to put security practices in place, the backup corpus becomes increasingly important, you've got to have analytics to detect anomalous behavior, and you've got to have fast recovery. Thank you for that.
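Ezat describes detection as watching backup metadata for sudden spikes in changed-data quantities and then pointing operators at a known good recovery point. The sketch below is a bare-bones illustration of that idea using a simple rolling-statistics check over hypothetical nightly backup sizes; Cohesity's actual detection uses machine-learning models over far richer metadata, so treat the threshold, data, and snapshot fields here as assumptions.

```python
from statistics import mean, stdev

def backup_size_anomalous(history_gb, latest_gb, z_threshold=3.0, min_history=7):
    """Flag a backup whose changed-data size deviates sharply from recent history."""
    if len(history_gb) < min_history:
        return False  # not enough history to judge
    mu, sigma = mean(history_gb), stdev(history_gb)
    if sigma == 0:
        return latest_gb != mu
    return abs(latest_gb - mu) / sigma > z_threshold

def last_known_good(snapshots, anomalous_ids):
    """Return the newest snapshot that has not been flagged as anomalous."""
    for snap in sorted(snapshots, key=lambda s: s["taken_at"], reverse=True):
        if snap["id"] not in anomalous_ids:
            return snap
    return None

if __name__ == "__main__":
    # Hypothetical nightly changed-data sizes in GB; a sudden spike often means mass encryption.
    history = [41, 39, 44, 40, 42, 38, 43]
    print(backup_size_anomalous(history, latest_gb=310))  # True -> alert and suggest last good snapshot
```

Shrinking the recovery point objective in this picture simply means backing up, and checking, more frequently, so the last known good snapshot is never far behind the attack.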
>> We've got to wrap, but Lester, let me ask you to paint a picture of the journey, the maturity model people have to take. If they want to get into it, where do they start and where are they going? Give us that view.

>> I think first it's knowing what you have. If you don't know what you have, you can't manage it, you can't control it, you can't secure it, and you can't ensure it's compliant. So that's first and foremost. The second is really ensuring that you're compliant: once you know what you have, are you securing it, are you following the applicable regulations, and can you evidence that? How are you storing your data? Are you archiving it, and are you storing it effectively and efficiently? Nirvana, from my perspective, is getting to the point where you've consolidated your data, you've broken down the silos, and you have a virtually self-service environment through which the business can consume and build upon its data. At the end of the day, as we said at the beginning, it's all about driving value out of your data, and automation is key to this journey.
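Lester's first step, knowing what you have, usually begins with a basic inventory of what is stored where and how big it is. The sketch below is an assumed, deliberately simple starting point rather than a real data catalog: it walks one file tree and totals capacity by file extension and owner, the kind of raw picture you would later feed into cataloging and automation. The root path is a placeholder.

```python
from collections import defaultdict
from pathlib import Path

def inventory(root):
    """Summarize capacity by file extension and owning uid under one root path."""
    by_ext = defaultdict(lambda: {"files": 0, "bytes": 0})
    by_owner = defaultdict(int)
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            st = path.stat()
        except OSError:
            continue  # skip files that vanish or deny access mid-scan
        ext = path.suffix.lower() or "<none>"
        by_ext[ext]["files"] += 1
        by_ext[ext]["bytes"] += st.st_size
        by_owner[st.st_uid] += st.st_size
    return by_ext, by_owner

if __name__ == "__main__":
    # Placeholder path; in practice you would run this per share or per mount.
    exts, owners = inventory("/data")
    for ext, agg in sorted(exts.items(), key=lambda kv: -kv[1]["bytes"])[:10]:
        print(f"{ext:10} {agg['files']:8} files {agg['bytes'] / 1e9:8.1f} GB")
```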

>> That's awesome, and you've just described a winning data culture. Lester, Patrick, Ezat, thanks so much for participating in this power panel.

>> Thank you, David.

>> Thank you.

>> Thank you for watching, everybody. This is Dave Vellante for theCUBE. (bright music)

Published Date : Jul 16 2020
