Image Title

Search Results for Lester:

Ajay Vohora and Lester Waters, Io-Tahoe | Io-Tahoe Adaptive Data Governance


 

>> Narrator: From around the globe its "theCUBE" presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> And we're back with the Data Automation series. In this episode we're going to learn more about what Io-Tahoe is doing in the field of adaptive data governance, how can help achieve business outcomes and mitigate data security risks. I'm Lisa Martin and I'm joined by Ajay Vohora the CEO of Io-Tahoe, and Lester Waters the CTO of Io-Tahoe. Gentlemen it's great to have you on the program. >> Thank you Lisa is good to be back. >> Great to see you Lisa. >> Likewise, very seriously this isn't cautious as we are. Lester were going to start with you, what's going on at Io-Tahoe, what's new? >> Well, I've been with Io-Tahoe for a little over the year, and one thing I've learned is every customer needs are just a bit different. So we've been working on our next major release of the Io-Tahoe product and to really try to address these customer concerns because we want to be flexible enough in order to come in and not just profile the data and not just understand data quality and lineage, but also to address the unique needs of each and every customer that we have. And so that required a platform rewrite of our product so that we could extend the product without building a new version of the product, we wanted to be able to have pluggable modules. We are also focused a lot on performance, that's very important with the bulk of data that we deal with and we're able to pass through that data in a single pass and do the analytics that are needed whether it's a lineage data quality or just identifying the underlying data. And we're incorporating all that we've learned, we're tuning up our machine learning, we're analyzing on more dimensions than we've ever done before, we're able to do data quality without doing an initial reggie expert for example, just out of the box. So I think it's all of these things are coming together to form our next version of our product and We're really excited about. >> Sounds exciting, Ajay from the CEOs level what's going on? >> Wow, I think just building on that, what Lester just mentioned now it's we're growing pretty quickly with our partners, and today here with Oracle we're excited to explain how that's shaping up lots of collaboration already with Oracle, and government in insurance and in banking. And we're excited because we get to have an impact, it's really satisfying to see how we're able to help businesses transform and redefine what's possible with their data. And having Oracle there as a partner to lean in with is definitely helping. >> Excellent, we're going to dig into that a little bit later. Lester let's go back over to you, explain adaptive data governance, help us understand that. >> Really adaptive data governance is about achieving business outcomes through automation. It's really also about establishing a data-driven culture and pushing what's traditionally managed in IT out to the business. And to do that, you've got to enable an environment where people can actually access and look at the information about the data, not necessarily access the underlying data because we've got privacy concern system, but they need to understand what kind of data they have, what shape it's in, what's dependent on it upstream and downstream, and so that they can make their educated decisions on what they need to do to achieve those business outcomes. A lot of frameworks these days are hardwired, so you can set up a set of business rules, and that set of business rules works for a very specific database and a specific schema. But imagine a world where you could just say, you know, (tapping) the start date of a loan must always be before the end date of a loan, and having that generic rule regardless of the underlying database, and applying it even when a new database comes online and having those rules applied, that's what adaptive data governance about. I like to think of it as the intersection of three circles, really it's the technical metadata coming together with policies and rules, and coming together with the business ontologies that are unique to that particular business. And bringing this all together allows you to enable rapid change in your environment, so, it's a mouthful adaptive data governance, but that's what it kind of comes down to. >> So Ajay help me understand this, is this what enterprise companies are doing now or are they not quite there yet? >> Well, you know Lisa I think every organization is going at his pace, but markets are changing economy and the speed at which some of the changes in the economy happening is compelling more businesses to look at being more digital in how they serve their own customers. So what we're saying is a number of trends here from heads of data, chief data officers, CIO stepping back from a one size fits all approach because they've tried that before and it just hasn't worked. They've spent millions of dollars on IT programs trying to drive value from that data, and they've ended up with large teams of manual processing around data to try and hard-wire these policies to fit with the context and each line of business, and that hasn't worked. So, the trends that we're seeing emerge really relate to how do I as a chief data officer, as a CIO, inject more automation and to allow these common tasks. And we've been able to see that impact, I think the news here is if you're trying to create a knowledge graph, a data catalog, or a business glossary, and you're trying to do that manually, well stop, you don't have to do that manual anymore. I think best example I can give is Lester and I we like Chinese food and Japanese food, and if you were sitting there with your chopsticks you wouldn't eat a bowl of rice with the chopsticks one grain at a time, what you'd want to do is to find a more productive way to enjoy that meal before it gets cold. And that's similar to how we're able to help organizations to digest their data is to get through it faster, enjoy the benefits of putting that data to work. >> And if it was me eating that food with you guys I would be not using chopsticks I would be using a fork and probably a spoon. So Lester how then does Io-Tahoe go about doing this and enabling customers to achieve this? >> Let me show you a little story here. So if you take a look at the challenges that most customers have they're very similar, but every customer is on a different data journey, so, but it all starts with what data do I have, what shape is that data in, how is it structured, what's dependent on it upstream and downstream, what insights can I derive from that data, and how can I answer all of those questions automatically? So if you look at the challenges for these data professionals, you know, they're either on a journey to the cloud, maybe they're doing a migration to Oracle, maybe they're doing some data governance changes, and it's about enabling this. So if you look at these challenges, I'm going to take you through a story here, and I want to introduce Amanda. Amanda is not Latin like anyone in any large organizations, she is looking around and she just sees stacks of data, I mean, different databases the one she knows about, the ones she doesn't know about but should know about, various different kinds of databases, and Amanda is this tasking with understanding all of this so that they can embark on her data journey program. So Amanda goes through and she's great, (snaps finger) "I've got some handy tools, I can start looking at these databases and getting an idea of what we've got." But when she digs into the databases she starts to see that not everything is as clear as she might've hoped it would be. Property names or column names have ambiguous names like Attribute one and Attribute two, or maybe Date one and Date two, so Amanda is starting to struggle even though she's got tools to visualize and look at these databases, she's still knows she's got a long road ahead, and with 2000 databases in her large enterprise, yes it's going to be a long journey. But Amanda is smart, so she pulls out her trusty spreadsheet to track all of her findings, and what she doesn't know about she raises a ticket or maybe tries to track down in order to find what that data means, and she's tracking all this information, but clearly this doesn't scale that well for Amanda. So maybe the organization will get 10 Amanda's to sort of divide and conquer that work. But even that doesn't work that well 'cause there's still ambiguities in the data. With Io-Tahoe what we do is we actually profile the underlying data. By looking at the underlying data, we can quickly see that Attribute one looks very much like a US social security number, and Attribute two looks like a ICD 10 medical code. And we do this by using ontologies, and dictionaries, and algorithms to help identify the underlying data and then tag it. Key to doing this automation is really being able to normalize things across different databases so that where there's differences in column names, I know that in fact they contain the same data. And by going through this exercise with Io-Tahoe, not only can we identify the data, but we also can gain insights about the data. So for example, we can see that 97% of that time, that column named Attribute one that's got US social security numbers, has something that looks like a social security number. But 3% of the time it doesn't quite look right, maybe there's a dash missing, maybe there's a digit dropped, or maybe there's even characters embedded in it, that may be indicative of a data quality issues, so we try to find those kinds of things. Going a step further, we also try to identify data quality relationships. So for example we have two columns, one date one date two, through observation we can see the date one 99% of the time is less than date two, 1% of the time it's not, probably indicative of the data quality issue, but going a step further we can also build a business rule that says date one is actually than date two, and so then when it pops up again we can quickly identify and remediate that problem. So these are the kinds of things that we can do with Io-Tahoe. Going even a step further, we can take your favorite data science solution, productionize it, and incorporate it into our next version as what we call a worker process to do your own bespoke analytics. >> Bespoke analytics, excellent, Lester thank you. So Ajay, talk us through some examples of where you're putting this to use, and also what is some of the feedback from some customers. >> Yeah, what I'm thinking how do you bring into life a little bit Lisa lets just talk through a case study. We put something together, I know it's available for download, but in a well-known telecommunications media company, they have a lot of the issues that lasted just spoke about lots of teams of Amanda's, super bright data practitioners, and are maybe looking to get more productivity out of their day, and deliver a good result for their own customers, for cell phone subscribers and broadband users. So, there are so many examples that we can see here is how we went about auto generating a lot of that old understanding of that data within hours. So, Amanda had her data catalog populated automatically, a business glossary built up, and maybe I would start to say, "Okay, where do I want to apply some policies to the data to set in place some controls, whether I want to adapt how different lines of business maybe tasks versus customer operations have different access or permissions to that data." And what we've been able to do that is to build up that picture to see how does data move across the entire organization, across the state, and monitor that over time for improvement. So we've taken it from being like reactive, let's do something to fix something to now more proactive. We can see what's happening with our data, who's using it, who's accessing it, how it's being used, how it's being combined, and from there taking a proactive approach is a real smart use of the tanons in that telco organization and the folks that work there with data. >> Okay Ajay, so digging into that a little bit deeper, and one of the things I was thinking when you were talking through some of those outcomes that you're helping customers achieve is ROI. How do customers measure ROI, What are they seeing with Io-Tahoe solution? >> Yeah, right now the big ticket item is time to value. And I think in data a lot of the upfront investment costs are quite expensive, they happen today with a lot of the larger vendors and technologies. Well, a CIO, an economic buyer really needs to be certain about this, how quickly can I get that ROI? And I think we've got something that we can show just pull up a before and after, and it really comes down to hours, days, and weeks where we've been able to have that impact. And in this playbook that we put together the before and after picture really shows those savings that committed a bit through providing data into some actionable form within hours and days to drive agility. But at the same time being able to enforce the controls to protect the use of that data and who has access to it, so atleast the number one thing I'd have to say is time, and we can see that on the graphic that we've just pulled up here. >> Excellent, so ostensible measurable outcomes that time to value. We talk about achieving adaptive data governance. Lester, you guys talk about automation, you talk about machine learning, how are you seeing those technologies being a facilitator of organizations adopting adaptive data governance? >> Well, as we see the manual date, the days of manual effort are out, so I think this is a multi-step process, but the very first step is understanding what you have in normalizing that across your data estate. So, you couple this with the ontologies that are unique to your business and algorithms, and you basically go across it and you identify and tag that data, that allows for the next steps to happen. So now I can write business rules not in terms of named columns, but I can write them in terms of the tags. Using that automated pattern recognition where we observed the loan starts should be before the loan (indistinct), being able to automate that is a huge time saver, and the fact that we can suggest that as a rule rather than waiting for a person to come along and say, "Oh wow, okay, I need this rule, I need this rule." These are steps that increase, or I should say decrease that time to value that Ajay talked about. And then lastly, a couple of machine learning, because even with great automation and being able to profile all your data and getting a good understanding, that brings you to a certain point, but there's still ambiguity in the data. So for example I might have two columns date one and date two, I may have even observed that date one should be less than date two, but I don't really know what date one and date two are other than a date. So, this is where it comes in and I'm like, "As the user said, can you help me identify what date one and day two are in this table?" It turns out they're a start date and an end date for a loan, that gets remembered, cycled into machine learning step by step to see this pattern of date one date two. Elsewhere I'm going to say, "Is it start date and end date?" Bringing all these things together with all this automation is really what's key to enable this data database, your data governance program. >> Great, thanks Lester. And Ajay I do want to wrap things up with something that you mentioned in the beginning about what you guys are doing with Oracle, take us out by telling us what you're doing there, how are you guys working together? >> Yeah, I think those of us who worked in IT for many years we've learned to trust Oracle's technology that they're shifting now to a hybrid on-prem cloud generation 2 platform which is exciting, and their existing customers and new customers moving to Oracle are on a journey. So Oracle came to us and said, "Now, we can see how quickly you're able to help us change mindsets," and as mindsets are locked in a way of thinking around operating models of IT that are maybe not agile or more siloed, and they're wanting to break free of that and adopt a more agile API driven approach with their data. So, a lot of the work that we're doing with Oracle is around accelerating what customers can do with understanding their data and to build digital apps by identifying the underlying data that has value. And the time we're able to do that in hours, days, and weeks, rather than many months is opening up the eyes to chief data officers, CIO is to say, "Well, maybe we can do this whole digital transformation this year, maybe we can bring that forward and transform who we are as a company." And that's driving innovation which we're excited about, and I know Oracle keen to drive through. >> And helping businesses transform digitally is so incredibly important in this time as we look to things changing in 2021. Ajay and Lester thank you so much for joining me on this segment, explaining adaptive data governance, how organizations can use it, benefit from it, and achieve ROI, thanks so much guys. >> Thanks you. >> Thanks again Lisa. (bright music)

Published Date : Dec 11 2020

SUMMARY :

brought to you by Io-Tahoe. going to learn more about this isn't cautious as we are. and do the analytics that are needed to lean in with is definitely helping. Lester let's go back over to you, and so that they can make and to allow these common tasks. and enabling customers to achieve this? that we can do with Io-Tahoe. and also what is some of the in that telco organization and the folks and one of the things I was thinking and we can see that that time to value. that allows for the next steps to happen. that you mentioned in the beginning and I know Oracle keen to drive through. Ajay and Lester thank you Thanks again Lisa.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Ajay VohoraPERSON

0.99+

AmandaPERSON

0.99+

Lisa MartinPERSON

0.99+

AjayPERSON

0.99+

OracleORGANIZATION

0.99+

2021DATE

0.99+

LisaPERSON

0.99+

LesterPERSON

0.99+

oneQUANTITY

0.99+

two columnsQUANTITY

0.99+

97%QUANTITY

0.99+

Io-TahoeORGANIZATION

0.99+

Lester WatersPERSON

0.99+

3%QUANTITY

0.99+

LesterORGANIZATION

0.99+

each lineQUANTITY

0.99+

first stepQUANTITY

0.98+

two columnsQUANTITY

0.98+

todayDATE

0.98+

twoQUANTITY

0.98+

millions of dollarsQUANTITY

0.98+

1%QUANTITY

0.98+

telcoORGANIZATION

0.98+

one grainQUANTITY

0.97+

2000 databasesQUANTITY

0.96+

USLOCATION

0.95+

ICD 10OTHER

0.95+

one thingQUANTITY

0.94+

three circlesQUANTITY

0.92+

Attribute twoOTHER

0.92+

this yearDATE

0.91+

singleQUANTITY

0.91+

isaPERSON

0.9+

LatinOTHER

0.9+

eachQUANTITY

0.87+

10QUANTITY

0.85+

dayQUANTITY

0.84+

Attribute oneOTHER

0.84+

99%QUANTITY

0.83+

Io-TahoePERSON

0.79+

Data AutomationTITLE

0.77+

Io-ORGANIZATION

0.74+

Lester WatersPERSON

0.72+

one sizeQUANTITY

0.71+

IoORGANIZATION

0.69+

Lester Waters, Patrick Smith & Ezat Dayeh | IoTahoe | Data Automated


 

>> Announcer: From around the globe, it's theCUBE, with digital coverage of data automated and event series brought to you by IO Tahoe. >> Welcome back everybody to the power panel, driving business performance with smart data life cycles. Lester Waters is here. He's the chief technology officer from IO Tahoe, he's joined by Patrick Smith, who is field CTO from Pure Storage and Ezat Dayeh, who's a system engineering manager at Cohesity. Gentlemen, good to see you. Thanks so much for coming on this panel. >> Thank you, Dave. >> Let's start with Lester. I wonder if each of you could just give us a quick overview of your role and what's the number one problem that you're focused on solving for your customers? Let's start with Lester please. >> Yes, I'm Lester waters, chief technology officer for IO Tahoe, and really the number one problem that we are trying to solve for our customers is to help them understand what they have. 'Cause if they don't understand what they have in terms of their data, they can't manage it, they can't control it, they can't monitor it. They can't ensure compliance. So really that's finding all you can about your data that you have and building a catalog that can be readily consumed by the entire business is what we do. >> Great. All right, Patrick, field CTO in your title. That says to me you're talking to customers all the time. So you've got a good perspective on it. Give us you know, your take on things here. >> Yeah, absolutely. So my patch is EMEA and talk to customers and prospects in lots of different verticals across the region. And as they look at their environments and their data landscape, they're faced with massive growth in the data that they're trying to analyze and demands to be able to get in site faster and to deliver business value faster than they've ever had to do in the past. So big challenges that we're seeing across the region. >> Got it. And is that, Cohesity? You're like the new kid on the block, you guys are really growing rapidly, created this whole notion of data management backup and beyond, but from a system engineering manager, what are you seeing from customers, your role and the number one problem that you're solving? >> Yeah, sure. So the number one problem, I see time and again, speaking with customers, fall around data fragmentation. So due to things like organic growth, you know, even maybe budgetary limitations, infrastructure has grown over time, very piecemeal and it's highly distributed internally. And just to be clear, you know, when I say internally, you know, that could be that it's on multiple platforms or silos within an on-prem infrastructure, but that it also does extend to the cloud as well. So we've seen, you know, over the past few years, a big drive towards cloud consumption, almost at any cost in some examples. You know, there could be business reasons like moving from things like CapEx to a more of an OPEX model. And what this has done is it's gone to, to create further silos, you know, both on-prem and also in the cloud. And while short term needs may be met by doing that, what it's doing is it's causing longer term problems and it's reducing the agility for these customers to be able to change and transform. >> Right, hey cloud is cool. Everybody wants to be in the cloud, right? So you're right. It creates maybe unintended consequences. So let's start with the business outcome and kind of try to work backwards. I mean, people, you know, they want to get more insights from data. They want to have a more efficient data life cycle, but so Lester, let me start with you, thinking about like the North star to creating data-driven cultures, you know, what is the North star for customers here? >> I think the North star in a nutshell is driving value from your data without question. I mean, we differentiate ourselves these days by even in nuances in our data. Now, underpinning that there's a lot of things that have to happen to make that work out well, you know, for example, making sure you adequately protect your data, you know, do you have a good, do you have a good storage subsystem? Do you have a good backup and recovery point objectives, recovery time objectives? Do you, are you fully compliant? Are you ensuring that you're ticking all the boxes? There's a lot of regulations these days in term, with respect to compliance, data retention, data privacy, and so forth. Are you ticking those boxes? Are you being efficient with your data? You know, in other words, I think there's a statistic that someone mentioned to me the other day, that 53% of all businesses have between three and 15 copies of the same data. So, you know, finding and eliminating those is part of the, part of the problem is you need to chase. >> Yeah, so Patrick and Ezat, I mean, you know, Lester touched on a lot of the areas that you guys are involved in. I like to think of, you know, you're right. Lester, no doubt, business value, and a lot of that comes from reducing the end to end cycle times, but anything that you guys would, would add to that, Patrick, maybe start with Patrick. >> Yeah, I think, I think getting value from data really hits on, it hits on what everyone wants to achieve, but I think there are a couple of key steps in doing that. First of all, is getting access to the data and that really hits three big problems. Firstly, working out what you've got. Secondly, after working out what you've got, how to get access to it, because it's all very well knowing you've got some data, but if you can't get access to it, either because of privacy reasons, security reasons, then that's a big challenge. And then finally, once you've got access to the data, making sure that you can process that data in a timely manner and at the scale that you need to, to deliver your business objectives. So I think those are really three key steps in successfully getting value from the data within our organization. >> Ezat, I'll ask you, anything else you'd fill in? >> Yeah, so the guys have touched on a lot of things already. For me, you know, it would be that an organization has got a really good global view of all of its data. It understands the data flow and dependencies within their infrastructure, understands the precise legal and compliance requirements and have the ability to action changes or initiatives within their environment, forgive the pun, but with a cloud-like agility. You know, and that's no easy feat, right? That is hard work. Another thing as well is that it's for companies to be mature enough, to truly like delete and get rid of unneeded data from their system. You know, I've seen so many times in the past, organizations paying more than they need to because they've acquired a lot of data baggage. Like it just gets carried over from refresh to refresh. And, you know, if you can afford it great, but chances are, you want to be as competitive as possible. And what happens is that this results in, you know, spend that is unnecessary, not just in terms of acquisition, but also in terms of maintaining the infrastructure, but then the other knock on effect as well is, you know, from a compliance and a security point of view, you're exposing yourself. So, you know, if you don't need it, delete it or at least archive it. >> Okay, So we've talked about the challenges in some of the objectives, but there's a lot of blockers out there, and I want to understand how you guys are helping remove them. So Lester, what are some of those blockers? I mean, I can mention a couple, there's their skillsets. There's obviously you talked about the problem of siloed data, but there's also data ownership. That's my data. There's budget issues. What do you see as some of the big blockers in terms of people really leaning in to this smart data life cycle? >> Yeah, silos is probably one of the biggest one I see in businesses. Yes, it's my data, not your data. Lots of compartmentalization and breaking that down is one of the, one of the challenges and having the right tools to help you do that is only part of the solution. There's obviously a lot of cultural things that need to take place to break down those silos and work together. If you can identify where you have redundant data across your enterprise, you might be able to consolidate those, you know, bring together applications. A lot of companies, you know, it's not uncommon for a large enterprise to have, you know, several thousand applications, many of which have their own instance of the very same data. So if there's a customer list, for example, it might be in five or six different sources of truth. And there's no reason to have that, and bringing that together by bringing those things together, you will start to tear down the business boundary silos that automatically exist. I think, I think one of the other challenges too, is self service. As Patrick mentioned, gaining access to your data and being able to work with it in a safe and secure fashion, is key here. You know, right now you typically raise a ticket, wait for access to the data, and then maybe, you know, maybe a week later out pops the bit you need and really, you know, with data being such a commodity and having timeliness to it, being able to have quick access to that data is key. >> Yeah, so I want to go to Patrick. So, you know, one of the blockers that I see is legacy infrastructure, technical debt, sucking all the budget. You've got, you know, too many people having to look after, you know, storage. It's just, it's just too complicated. And I wonder if you have, obviously that's my perspective, what's your perspective on that? >> Yeah, absolutely. We'd agree with that. As you look at the infrastructure that supports people's data landscapes today, for primarily legacy reasons, the infrastructure itself is siloed. So you have different technologies with different underlying hardware, different management methodologies that are there for good reason, because historically you had to have specific fitness for purpose, for different data requirements. That's one of the challenges that we tackled head on at Pure with the flash blade technology and the concept of the data hub, a platform that can deliver in different characteristics for the different workloads, but from a consistent data platform. And it means that we get rid of those silos. It means that from an operational perspective, it's far more efficient. And once your data set is consolidated into the data hub, you don't have to move that data around. You can bring your applications and your workloads to the data rather than the other way around. >> Now, Ezat, I want to go to you because you know, in the world, in your world, which to me goes beyond backup. I mean, one of the challenges is, you know, they say backup is one thing. Recovery is everything, But as well, the CFO doesn't want to pay for just protection. And one of the things that I like about what you guys have done is you've broadened the perspective to get more value out of your, what was once seen as an insurance policy. I wonder if you could talk about that as a blocker and how you're having success removing it. >> Yeah, absolutely. So, you know, as well as what the guys have already said, you know, I do see one of the biggest blockers as the fact that the task at hand can, you know, can be overwhelming for customers and it can overwhelm them very, very quickly. And that's because, you know, this stuff is complicated. It's got risk, you know, people are used to the status quo, but the key here is to remember that it's not an overnight change. It's not, you know, a flick of a switch. It's something that can be tackled in a very piecemeal manner, and absolutely like you you said, you know, reduction in TCO and being able to leverage the data for other purposes is a key driver for this. So like you said, you know, for us specifically, one of the areas that we help customers around with first of all, it's usually data protection. It can also be things like consolidation of unstructured file data. And, you know, the reason why customers are doing this is because legacy data protection is very costly. You know, you'd be surprised how costly it is. A lot of people don't actually know how expensive it can be. And it's very complicated involving multiple vendors. And it's there really to achieve one goal. And the thing is, it's very inflexible and it doesn't help towards being an agile data driven company. So, you know, this can be, this can be resolved. It can be very, you know, pretty straightforward. It can be quite painless as well. Same goes for unstructured data, which is very complex to manage. And, you know, we've all heard the stats from the analysts, you know, data obviously is growing at an extremely rapid rate. But actually when you look at that, you know, how is it actually growing? 80% of that growth is actually in unstructured data. And only 20% of that growth is in structured data. So, you know, these are quick win areas that the customers can realize. Immediate TCO improvement and increased agility as well, when it comes to managing and automating their infrastructure. So, yeah, it's all about making, you know, doing more with, with what you have. >> So let's paint a picture of this guys, if you could bring up the life cycle, I want to explore that a little bit and ask each of you to provide a perspective on this. And so, you know, what you can see here is you've got this, this cycle, the data life cycle, and what we're wanting to do is really inject intelligence or smarts into this life cycle, you can see, you start with ingestion or creation of data. You're storing it. You got to put it somewhere, right? You got to classify it, you got to protect it. And then of course you want to, you know, reduce the copies, make it efficient, and then you want to prepare it, so the businesses can actually consume it. And then you've got clients and governance and privacy issues. And at some point when it's legal to do so, you want to get rid of it. We never get rid of stuff in technology. We keep it forever. But I wonder if we could start with you Lester. This is, you know, the picture of the life cycle. What role does automation play in terms of injecting smarts into the life cycle? >> Automation is key here. You know, especially from the discover catalog and classified perspective. I've seen companies where we, where they go and will take and dump their, all of their database schemes into a spreadsheet so that they can sit down and manually figure out what attribute 37 needs for a column name. And that's only the tip of the iceberg. So being able to automatically detect what you have, automatically deduce what's consuming the data, you know, upstream and downstream, being able to understand all of the things related to the life cycle of your data, backup archive, deletion. It is key. So having good tools is very important. >> So Patrick, obviously you participated in the store piece of this picture. So I wonder if you could just talk more specifically about that, but I'm also interested in how you affect the whole system view, the end to end cycle time. >> Yeah, I think Lester kind of hit the nail on the head in terms of the importance of automation, because data volumes are just so massive now that you, you can't, you can't effectively manage or understand or catalog your data without automation. But once you, once you understand the data and the value of the data, then that's where you can work out where the data needs to be at any point in time. And that's where we come into play. You know, if data needs to be online, if it's hot data, if it's data that needs to be analyzed, and, you know, we're moving to a world of analytics where some of our customers say, there's no such thing as cold data anymore, then it needs to be on a performance platform, but you need to understand exactly what the data is that you have to work out where to place it and where it fits into that data life cycle. And then there's that whole challenge of protecting it through the life cycle, whether that's protecting the hot data or as the data moves off into, you know, into an archive or into a cold store, still making sure you know where it is, and easily retrievable, should you need to move it back into the working set. So I think automation is key, but also making sure that it ties into understanding where you place your data at any point in time. >> Right, so Pure and Cohesity, obviously, partner to do that. And of course, Ezat, you guys are part of the protect, you're certainly part of the retain, but also you provide data management capabilities and analytics. I wonder if you could add some color there. >> Yeah, absolutely. So like you said, you know, we focus pretty heavily on data protection as just one of our areas and that infrastructure, it is just sitting there really you know, the legacy infrastructure, it's just sitting there, you know, consuming power, space cooling and pretty inefficient. And, you know, one of our main purposes is like we said, to make that data useful and automating that process is a key part of that, right? So, you know, not only are we doing things like obviously making it easier to manage, improving RPOs and RTOs with policy-based SLAs, but we're making it useful and having a system that can be automated through APIs and being an API first based system. It's almost mandatory now when you're going through a digital, you know, digital transformation. And one of the things that we can do is as part of that automation, is that we can make copies of data without consuming additional capacity available, pretty much instantaneously. You might want to do that for many different purposes. So examples of that could be, you know, for example, reproducing copies of production data for development purposes, or for testing new applications for example. And you know, how would you, how would you go about doing that in a legacy environment? The simple answer is it's painfully, right? So you just can't do those kinds of things. You know, I need more infrastructure to store the data. I need more compute to actually perform the things that I want to do on it, such as analytics, and to actually get a copy of that data, you know, I have to either manually copy it myself or I restore from a backup. And obviously all of that takes time, additional energy. And you end up with a big sprawling infrastructure, which isn't a manageable, like Patrick said, it's just the sheer amount of data, you know, it doesn't, it doesn't warrant doing that anymore. So, you know, if I have a modern day platform such as, you know, the Cohesity data platform, I can actually do a lot of analytics on that through applications. So we have a marketplace for apps. And the other great thing is that it's an open system, right? So anybody can develop an app. It's not just apps that are developed by us. It can be third parties, it could be customers. And with the data being consolidated in one place, you can then start to start to realize some of these benefits of deriving insights out of your data. >> Yeah, I'm glad you brought that up earlier in your little example there, because you're right. You know, how do you deal with that? You throw people at the problem and it becomes nights and weekends, and that sort of just fails. It doesn't scale. I wonder if we could talk about metadata. It's increasingly important. Metadata is data about the data, but Lester, maybe explain why it's so important and what role it plays in terms of creating smart data lifecycle. >> Well, yes, metadata, it does describe the data, but it's, a lot of people think it's just about the data itself, but there's a lot of extended characteristics about your data. So, imagine if for my data life cycle, I can communicate with the backup system from Cohesity and find out when the last time that data was backed up, or where it's backed up to. I can communicate exchange data with Pure Storage and find out what tier it's on. Is the data at the right tier commensurate with its use level that Patrick pointed out? And being able to share that metadata across systems. I think that's the direction that we're going in. Right now we're at the stage, we're just identifying the metadata and trying to bring it together and catalog it. The next stage will be, okay using the APIs that we have between our systems. Can we communicate and share that data and build good solutions for our customers to use? >> I think it's a huge point that you just made. I mean, you know, 10 years ago, automating classification was the big problem and it was machine intelligence. You know, we're obviously attacking that, but your point about as machines start communicating to each other and you start, you know, it's cloud to cloud, there's all kinds of metadata, kind of new metadata that's being created. I often joke that someday there's going to be more metadata than the data. So that brings us to cloud. And Ezat, I'd like to start with you, because you were talking about some cloud creep before. So what's your take on cloud? I mean, you've got private clouds, you got hybrid clouds, public clouds, inter clouds, IOT, and the edge is sort of another form of cloud. So how does cloud fit into the data life cycle? How does it affect the data life cycle? >> Yeah, sure. So, you know, I do think, you know, having the cloud is a great thing and it has got its role to play and you can have many different permutations and iterations of how you use it. And, you know, as I, as I may have sort of mentioned previously, you know, I've seen customers go into the cloud very, very quickly. And actually recently they're starting to remove web codes from the cloud. And the reason why this happens is that, you know, cloud has got its role to play, but it's not right for absolutely everything, especially in their current form as well. So, you know, a good analogy I like to use, and this may sound a little bit cliche, but you know, when you compare clouds versus on premises data centers, you can use the analogy of houses and hotels. So to give you an idea, so, you know, when we look at hotels, that's like the equivalent of a cloud, right? I can get everything I need from there. I can get my food, my water, my outdoor facilities. If I need to accommodate more people, I can rent some more rooms. I don't have to maintain the hotel. It's all done for me. When you look at houses, the equivalent to, you know, on premises infrastructure, I pretty much have to do everything myself, right? So I have to purchase the house. I have to maintain it. I have to buy my own food and water, eat it. I have to make improvements myself, but then why do we all live in houses, not in hotels? And the simple answer that I can, I can only think of is, is that it's cheaper, right? It's cheaper to do it myself, but that's not to say that hotels haven't got their role to play. You know, so for example, if I've got loads of visitors coming over for the weekend, I'm not going to go and build an extension to my house, just for them. I will burst into my hotel, into the cloud, and use it for, you know, for things like that. And you know, if I want to go somewhere on holiday, for example, then I'm not going to go buy a house there. I'm going to go in, I'm going to stay in a hotel, same thing. I need some temporary usage. You know, I'll use the cloud for that as well. Now, look, this is a loose analogy, right? But it kind of works. And it resonates with me at least anyway. So what I'm really saying is the cloud is great for many things, but it can work out costlier for certain applications while others are a perfect fit. So when customers do want to look at using the cloud, it really does need to be planned in an organized way, you know, so that you can avoid some of the pitfalls that we're talking about around, for example, creating additional silos, which are just going to make your life more complicated in the long run. So, you know, things like security planning, you know, adequate training for staff is absolutely a must. We've all seen the, you know, the horror stories in the press where certain data maybe has been left exposed in the cloud. Obviously nobody wants to see that. So as long as it's a well planned and considered approach, the cloud is great and it really does help customers out. >> Yeah, it's an interesting analogy. I hadn't thought of that before, but you're right. 'Cause I was going to say, well, part of it is you want the cloud experience everywhere, but you don't always want the cloud experience, especially, you know, when you're with your family, you want certain privacy. I've not heard that before Ezat, so that's a new perspective, so thank you. But so, but Patrick, I do want to come back to that cloud experience because in fact, that's what's happening in a lot of cases. Organizations are extending the cloud properties of automation on-prem and in hybrid. And certainly you guys have done that. You've created, you know, cloud-based capabilities. They can run in AWS or wherever, but what's your take on cloud? What's Pure's perspective? >> Yeah, I thought Ezat brought up a really interesting point and a great analogy for the use of the public cloud, and it really reinforces the importance of the hybrid and multicloud environment, because it gives you that flexibility to choose where is the optimal environment to run your business workloads. And that's what it's all about. And the flexibility to change which environment you're running in, either from one month to the next or from one year to the next, because workloads change and the characteristics that are available in the cloud change on a pretty frequent basis. It's a fast moving world. So one of the areas of focus for us with our cloud block store technology is to provide effectively a bridge between the on-prem cloud and the public cloud, to provide that consistent data management layer that allows customers to move their data where they need it when they need it. And the hybrid cloud is something that we've lived with ourselves at Pure. So our Pure1 management technology actually sits in a hybrid cloud environment. We started off entirely cloud native, but now we use the public cloud for compute and we use our own technology, the end of a high performance network link to support our data platform. So we get the best of both worlds. And I think that's where a lot of our customers are trying to get to is cloud flexibility, but also efficiency and optimization. >> All right, I want to come back in a moment there, but before we do, Lester, I wonder if we could talk a little bit about compliance governance and privacy. You know, that, a lot of that comes down to data, the EU right now, I think the Brits on this panel are still in the EU for now, but the EU are looking at new rules, new regulations going beyond GDPR, tightening things up in a, specifically kind of pointing at the cloud. Where does sort of privacy, governance, compliance fit in to the, to the data life cycle, then Ezat, I want your thoughts on this as well. >> Yeah, this is a very important point because the landscape for compliance around data privacy and data retention is changing very rapidly and being able to keep up with those changing regulations in an automated fashion is the only way you're going to be able to do it. Even, I think there's a, some sort of a, maybe a ruling coming out today or tomorrow with the change to GDPR. So this is, these are all very key points, and being able to codify those rules into some software, whether you know, IO Tahoe or your storage system or Cohesity that'll help you be compliant is crucial. >> Yeah, Esat, anything you can add there? I mean, this really is your wheelhouse. >> Yeah, absolutely. So, you know, I think anybody who's watching this probably has gotten the message that, you know, less silos is better. And then absolutely it also applies to data in the cloud as well. So, you know, by aiming to consolidate into fewer platforms, customers can realize a lot better control over their data. And then natural effect of this is that it makes meeting compliance and governance a lot easier. So when it's consolidated, you can start to confidently understand who is accessing your data, how frequently are they accessing the data? You can also do things like detecting anomalous file access activities, and quickly identify potential threats. You know, and this can be delivered by apps which are running on one platform that has consolidated the data as well. And you can also start getting into lots of things like, you know, rapidly searching for PII. So personally identifiable information across different file types. And you can report on all of this activity back to the business, by identifying, you know, where are you storing your copies of data? How many copies have you got and who has access to them? These are all becoming table stakes as far as I'm concerned. >> Right, right. >> The organizations continue that move into digital transformation and more regulation comes into law. So it's something that has to be taken very, very seriously. The easier you make your infrastructure, the easier it will be for you to comply with it. >> Okay, Patrick, we were talking, you talked earlier about storage optimization. We talked to Adam Worthington about the business case. You get the sort of numerator, which is the business value and then the denominator, which is the cost. And so storage efficiency is obviously a key part of it. It's part of your value proposition to pick up on your sort of earlier comments, and what's unique about Pure in this regard? >> Yeah, and I think there are, there are multiple dimensions to that. Firstly, if you look at the difference between legacy storage platforms, they used to take up racks or isles of space in a data center with flash technology that underpins flash blade, we effectively switch out racks for rack units. And it has a big play in terms of data center footprint, and the environmentals associated with the data center, but it doesn't stop at that. You know, we make sure that we efficiently store data on our platforms. We use advanced compression techniques to make sure that we make flash storage as cost competitive as we possibly can. And then if you look at extending out storage efficiencies and the benefits it brings, just the performance has a direct effect on staff, whether that's, you know, the staff and the simplicity of the platform, so that it's easy and efficient to manage, or whether it's the efficiency you get from your data scientists who are using the outcomes from the platform and making them more efficient. If you look at some of our customers in the financial space, their time to results are improved by 10 or 20 X by switching to our technology from legacy technologies for their analytics platforms. >> So guys we've been running, you know, CUBE interviews in our studios remotely for the last 120 days, it's probably the first interview I've done where I haven't started off talking about COVID, but digital transformation, you know, BC, before COVID. Yeah, it was real, but it was all of a buzzy wordy too. And now it's like a mandate. So Lester, I wonder if you could talk about smart data life cycle and how it fits into this isolation economy and hopefully what will soon be a post isolation economy? >> Yeah, COVID has dramatically accelerated the data economy. I think, you know, first and foremost, we've all learned to work at home. I, you know, we've all had that experience where, you know, there were people who would um and ah about being able to work at home just a couple of days a week. And here we are working five days a week. That's had a knock on impact to infrastructure to be able to support that. But going further than that, you know, the data economy is all about how a business can leverage their data to compete in this new world order that we are now in. So, you know, they've got to be able to drive that value from their data and if they're not prepared for it, they're going to falter. We've unfortunately seen a few companies that have faltered because they weren't prepared for this data economy. This is where all your value is driven from. So COVID has really been a forcing function to, you know, it's probably one of the few good things that have come out of COVID, is that we have been forced to adapt. And it's been an interesting journey and it continues to be so. >> Well, is that too, you know, everybody talks about business resiliency, ransomware comes into effect here, and Patrick, you, you may have some thoughts on this too, but Ezat, your thoughts on the whole work from home pivot and how it's impacting the data life cycle. >> Absolutely, like, like Lester said, you know, we've, we're seeing a huge impact here. You know, working from home has, has pretty much become the norm now. Companies have been forced into basically making it work. If you look at online retail, that's accelerated dramatically as well. Unified communications and video conferencing. So really, you know, the point here is that yes, absolutely. You know, we've compressed you know, in the past maybe four months, what probably would have taken maybe even five years, maybe 10 years or so. And so with all this digital capability, you know, when you talk about things like RPOs and RTOs, these things are, you know, very much, you know, front of mind basically and they're being taken very seriously. You know, with legacy infrastructure, you're pretty much limited with what you can do around that. But with next generation, it puts it front and center. And when it comes to, you know, to ransomware, of course, it's not a case of if it's going to happen, it's a case of when it's going to happen. Again, we've all seen lots of stuff in the press, different companies being impacted by this, you know, both private and public organizations. So it's a case of, you know, you have to think long and hard about how you're going to combat this, because actually malware also, it's becoming, it's becoming a lot more sophisticated. You know, what we're seeing now is that actually, when, when customers get impacted, the malware will sit in their environment and it will have a look around it, it won't actually do anything. And what it's actually trying to do is, it's trying to identify things like your backups, where are your backups? Because you know, what do, what do we all do? If we get hit by a situation like this, we go to our backups. But you know, the bad actors out there, they, you know, they're getting pretty smart as well. And if your legacy solution is sitting on a system that can be compromised quite easily, that's a really bad situation, you know, waiting to happen. And, you know, if you can't recover from your backups, essentially, unfortunately, you know, people are going to be making trips to the bank because you're going to have to pay to get your data back. And of course, nobody wants to see that happening. So one of the ways, for example, that we look to help customers defend against this is actually we have, we have a three pronged approach. So protect, detect, and respond. So what we mean by protect, and let me say, you know, first of all, this isn't a silver bullet, right? Security is an industry all of itself. It's very complicated. And the approach here is that you have to layer it. What Cohesity, for example, helps customers with, is around protecting that insurance policy, right? The backups. So by ensuring that that data is immutable, cannot be edited in any way, which is inherent to our file system. We make sure that nothing can affect that, but it's not just external actors you have to think about, it's also potentially internal bad actors as well. So things like being able to data lock your information so that even administrators can't change, edit or delete data, is just another way in which we help customers to protect. And then also you have things like multifactor authentication as well, but once we've okay, so we've protected the data. Now, when it comes, now it comes to detection. So again, being, you know, ingrained into data protection, we have a good view of what's happening with all of this data that's flowing around the organization. And if we start to see, for example, that backup times, or, you know, backup quantities, data quantities are suddenly spiking all of a sudden, we use things like, you know, AI machine learning to highlight these, and once we detect an anomaly such as this, we can then alert our users to this fact. And not only do we alert them and just say, look, we think something might be going on with your systems, but we'll also point them to a known good recovery point as well, so that they don't have to sit searching, well, when did this thing hit and you know, which recovery point do I have to use? And so, you know, and we use metadata to do all of these kinds of things with our global management platform called Helios. And that actually runs in the cloud as well. And so when we find this kind of stuff, we can basically recover it very, very quickly. And this comes back now to the RPOs and the RTOs. So your recovery point objective, we can shrink that, right? And essentially what that means is that you will lose less data. But more importantly, the RTO, your recovery time objective, it means that actually, should something happen and we need to recover that data, we can also shrink that dramatically. So again, when you think about other, you know, legacy technology out there, when something like this happens, you might be waiting hours, most likely days, possibly even weeks and months, depending on the severity. Whereas we're talking about being able to bring data back, you know, we're talking maybe, you know, a few hundred virtual machines in seconds and minutes. And so, you know, when you think about the value that that can give an organization, it becomes, it becomes a no brainer really, as far as, as far as I'm concerned. So, you know, that really covers how we respond to these situations. So protect, detect, and respond. >> Great, great summary. I mean, my summary is adverse, right? The adversaries are very, very capable. You got to put security practices in place. The backup Corpus becomes increasingly important. You got to have analytics to detect anomalous behavior and you got to have, you know, fast recovery. And thank you for that. We got to wrap, but so Lester, let me, let me ask you to sort of paint picture of the sort of journey or the maturity model that people have to take. You know, if they want to get into it, where do they start and where are they going? Give us that view. >> I think first it's knowing what you have. If you don't know what you have, you can't manage it, you can't control it, you can't secure it, you can't ensure it's compliant. So that's first and foremost. The second is really, you know, ensuring that you're compliant. Once you know what you have, are you securing it? Are you following the regulatory, the applicable regulations? Are you able to evidence that? How are you storing your data? Are you archiving it? Are you storing it effectively and efficiently? You know, have you, Nirvana from my perspective is really getting to a point where you've consolidated your data, you've broken down the silos and you have a virtually self service environment by which the business can consume and build upon their data. And really at the end of the day, as we said at the beginning, it's all about driving value out of your data. And the automation is key to this journey. >> That's awesome. And you just described sort of a winning data culture. Lester, Patrick, Ezat, thanks so much for participating in this power panel. >> Thank you, David. >> Thank you. >> Thank you for watching everybody. This is Dave Vellante for theCUBE. (bright music)

Published Date : Jul 16 2020

SUMMARY :

brought to you by IO Tahoe. to the power panel, I wonder if each of you could that you have and building a catalog Give us you know, your and demands to be able what are you seeing from customers, to create further silos, you know, I mean, people, you know, So, you know, finding Ezat, I mean, you know, manner and at the scale that you need to, So, you know, if you don't need it, and I want to understand how you guys enterprise to have, you know, So, you know, one of the So you have different technologies to you because you know, from the analysts, you know, And so, you know, what you can you know, upstream and downstream, So I wonder if you could or as the data moves off into, you know, And of course, Ezat, you And you know, how would you, You know, how do you deal with that? And being able to share that I mean, you know, 10 years ago, the equivalent to, you know, you know, when you're with your family, And the flexibility to that comes down to data, whether you know, IO Tahoe Yeah, Esat, anything you can add there? the message that, you know, So it's something that has to you talked earlier about whether that's, you know, So guys we've been running, you know, I think, you know, first and foremost, Well, is that too, you know, So it's a case of, you know, you know, fast recovery. And the automation is key to this journey. And you just described sort Thank you for watching everybody.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
PatrickPERSON

0.99+

DavidPERSON

0.99+

Patrick SmithPERSON

0.99+

Dave VellantePERSON

0.99+

DavePERSON

0.99+

fiveQUANTITY

0.99+

EzatPERSON

0.99+

Adam WorthingtonPERSON

0.99+

80%QUANTITY

0.99+

53%QUANTITY

0.99+

IO TahoeORGANIZATION

0.99+

Ezat DayehPERSON

0.99+

10QUANTITY

0.99+

10 yearsQUANTITY

0.99+

EUORGANIZATION

0.99+

five yearsQUANTITY

0.99+

tomorrowDATE

0.99+

GDPRTITLE

0.99+

AWSORGANIZATION

0.99+

CohesityORGANIZATION

0.99+

EMEAORGANIZATION

0.99+

one monthQUANTITY

0.99+

LesterPERSON

0.99+

one platformQUANTITY

0.99+

firstQUANTITY

0.99+

15 copiesQUANTITY

0.99+

Pure StorageORGANIZATION

0.99+

secondQUANTITY

0.98+

one yearQUANTITY

0.98+

both worldsQUANTITY

0.98+

six different sourcesQUANTITY

0.98+

oneQUANTITY

0.98+

LesterORGANIZATION

0.98+

one goalQUANTITY

0.98+

20 XQUANTITY

0.98+

todayDATE

0.98+

eachQUANTITY

0.98+

10 years agoDATE

0.98+

SecondlyQUANTITY

0.98+

first interviewQUANTITY

0.98+

FirstlyQUANTITY

0.98+

five days a weekQUANTITY

0.98+

a week laterDATE

0.97+

bothQUANTITY

0.97+

three key stepsQUANTITY

0.97+

threeQUANTITY

0.97+

Lester WatersPERSON

0.97+

20%QUANTITY

0.97+

FirstQUANTITY

0.94+

IoTahoeORGANIZATION

0.93+

one placeQUANTITY

0.92+

PureORGANIZATION

0.9+

NirvanaPERSON

0.89+

Lester Waters, Io Tahoe | Enterprise Data Automation


 

(upbeat music) >> Reporter: From around the globe, it's The Cube with digital coverage of enterprise data automation and event series brought to you by Io-Tahoe. >> Okay, we're back. Focusing on enterprise data automation, we're going to talk about the journey to the cloud. Remember, the hashtag is data automated. We're here with Lester Waters who's the CTO of Io-Tahoe, Lester, good to see you from across the pond on video, wish we were face to face, but it's great to have you on The Cube. >> Also I do, thank you for having me. >> Oh, you're very welcome. Hey, give us a little background on CTO, you got a deep expertise in a lot of different areas, but what do we need to know? >> Well, David, I started my career basically at Microsoft, where I started the Information Security Cryptography Group. They're the very first one that the company had and that led to a career in information security and of course, as you go along with the information security, data is the key element to be protected. So I always had my hands in data and that naturally progressed into a role with Io-Tahoe as their CTO. >> Guys, I have to invite you back, we'll talk crypto all day we'd love to do that but we're here talking about yeah, awesome, right? But we're here talking about the cloud and here we'll talk about the journey to the cloud and accelerate. Everybody's really interested obviously in cloud, even more interested now with the pandemic, but what's that all about? >> Well, moving to the cloud is quite an undertaking for most organizations. First of all, we've got as probably if you're a large enterprise, you probably have thousands of applications, you have hundreds and hundreds of database instances, and trying to shed some light on that, just to plan your move to the cloud is a real challenge. And some organizations try to tackle that manually. Really what Io-Tahoe is bringing is trying to tackle that in an automated version to help you with your journey to the cloud. >> Well, look at migrations are sometimes just an evil word to a lot of organizations, but at the same time, building up technical debt veneer after veneer and year, and year, and year is something that many companies are saying, "Okay, it's got to stop." So what's the prescription for that automation journey and simplifying that migration to the cloud? >> Well, I think the very first thing that's all about is data hygiene. You don't want to pick up your bad habits and take them to the cloud. You've got an opportunity here, so I see the journey to the cloud is an opportunity to really clean house, reorganize things, like moving out. You might move all your boxes, but you're kind of probably cherry pick what you're going to take with you and then you're going to organize it as you end up at your new destination. So from that, I get there's seven key principles that I like to operate by when I advise on the cloud migration. >> Okay. So, where do you start? >> Well, I think the first thing is understanding what you got, so discover and cataloging your data and your applications. If I don't know what I have, I can't move it, I can't improve it, I can't build up on it. And I have to understand there is dependency, so building that data catalog is the very first step. What do I got? >> Now, is that a metadata exercise? Sometimes there's more metadata than there is data. Is metadata part of that first step or? >> In deed, metadata is the first step so the metadata really describes the data you have. So, the metadata is going to tell me I have 2000 tables and maybe of those tables, there's an average of 25 columns each, and so that gives me a sketch if you will, of what I need to move. How big are the boxes I need to pack for my move to the cloud? >> Okay, and you're saying you can automate that data classification, categorization, discovery, correct using math machine intelligence, is that correct? >> Yeah, that's correct. So basically we go, and we will discover all of the schema, if you will, that's the metadata description of your tables and columns in your database in the data types. So we take, we will ingest that in, and we will build some insights around that. And we do that across a variety of platforms because everybody's organization has you've got a one yeah, an Oracle Database here, and you've got a Microsoft SQL Database here, you might have something else there that you need to bring site onto. And part of this journey is going to be about breaking down your data silos and understanding what you've got. >> Okay. So, we've done the audit, we know what we've got, what's next? Where do we go next? >> So the next thing is remediating that data. Where do I have duplicate data? Often times in an organization, data will get duplicated. So, somebody will take a snapshot of a data, and then ended up building a new application, which suddenly becomes dependent on that data. So it's not uncommon for an organization of 20 master instances of a customer. And you can see where that will go when trying to keep all that stuff in sync becomes a nightmare all by itself. So you want to understand where all your redundant data is. So when you go to the cloud, maybe you have an opportunity here to consolidate that data. >> Yeah, because you like to borrow in an Einstein or apply an Einstein Bromide right. Keep as much data as you can, but no more. >> Correct. >> Okay. So you get to the point to the second step you're kind of a one to reduce costs, then what? You figure out what to get rid of, or actually get rid of it, what's next? >> Yes, that would be the next step. So figuring out what you need and what you don't need often times I've found that there's obsolete columns of data in your databases that you just don't need, or maybe it's been superseded by another, you've got tables that have been superseded by other tables in your database. So you got to understand what's being used and what's not and then from that, you can decide, "I'm going to leave this stuff behind, "or I'm going to archive this stuff "cause I might need it for data retention "or I'm just going to delete it, "I don't need it at all." >> Well, Lester, most organizations, if they've been around a while, and the so-called incumbents, they've got data all over the place, their data marts, data warehouses, there are all kinds of different systems and the data lives in silos. So, how do you kind of deal with that problem? Is that part of the journey? >> That's a great point Dave, because you're right that the data silos happen because this business unit is chartered with this task another business unit has this task and that's how you get those instantiations of the same data occurring in multiple places. So as part of your cloud migration journey, you really want to plan where there's an opportunity to consolidate your data, because that means there'll be less to manage, there'll be less data to secure, and it'll have a smaller footprint, which means reduced costs. >> So, people always talk about a single version of the truth, data quality is a huge issue. I've talked to data practitioners and they've indicated that the quality metrics are in the single digits and they're trying to get to 90% plus, but maybe you could address data quality. Where does that fit in on the journey? >> That's, a very important point. First of all, you don't want to bring your legacy issues with you. As the point I made earlier, if you've got data quality issues, this is a good time to find those and identify and remediate them. But that can be a laborious task. We've had customers that have tried to do this by hand and it's very, very time consuming, cause you imagine if you've got 200 tables, 50,000 columns, imagine, the manual labor involved in doing that. And you could probably accomplish it, but it'll take a lot of work. So the opportunity to use tools here and automate that process is really will help you find those outliers there's that bad data and correct it before you move to the cloud. >> And you're just talking about that automation it's the same thing with data catalog and that one of the earlier steps. Organizations would do this manually or they try to do it manually and that's a lot of reason for the failure. They just, it's like cleaning out your data like you just don't want to do it (laughs). Okay, so then what's next? I think we're plowing through your steps here. What what's next on the journey? >> The next one is, in a nutshell, preserve your data format. Don't boil the ocean here to use a cliche. You want to do a certain degree of lift and shift because you've got application dependencies on that data and the data format, the tables on which they sit, the columns and the way they're named. So, some degree you are going to be doing a lift and shift, but it's an intelligent lift and shift using all the insights you've gathered by cataloging the data, looking for data quality issues, looking for duplicate columns, doing planning consolidation. You don't want to also rewrite your application. So, in that aspect, I think it's important to do a bit of lift and shift and preserve those data formats as they sit. >> Okay, so let me follow up on that. That sounds really important to me, because if you're doing a conversion and you're rewriting applications, that means that you're going to have to freeze the existing application, and then you going to be refueling the plane as you're in midair and a lot of times, especially with mission critical systems, you're never going to bring those together and that's a recipe for disaster, isn't it? >> Great analogy unless you're with the air force, you'll (mumbles) (laughs). Now, that's correct. It's you want to have bite-sized steps and that's why it's important to plan your journey, take these steps. You're using automation where you can to make that journey to the cloud much easier and more straightforward. >> All right, I like that. So we're taking a kind of a systems view and end to end view of the data pipeline, if you will. What's next? I think we're through. I think I've counted six. What's the lucky seven? >> Lucky seven, involve your business users. Really, when you think about it, your data is in silos. Part of this migration to the cloud is an opportunity to break down these silos, these silos that naturally occur as part of the business unit. You've got to break these cultural barriers that sometimes exist between business and say, so for example, I always advise, there's an opportunity here to consolidate your sensitive data, your PII, your personally identifiable information, and if three different business units have the same source of truth for that, there's was an opportunity to consolidate that into one as you migrate. That might be a little bit of tweaking to some of the apps that you have that are dependent on it, but in the long run, that's what you really want to do. You want to have a single source of truth, you want to ring fence that sensitive data, and you want all your business users talking together so that you're not reinventing the wheel. >> Well, the reason I think too that's so important is that you're now I would say you're creating a data driven culture. I know that's sort of a buzz word, but what it's true and what that means to me is that your users, your lines of business feel like they actually own the data rather than pointing fingers at the data group, the IT group, the data quality people, data engineers, saying, "Oh, I don't believe it." If the lines of business own the data, they're going to lean in, they're going to maybe bring their own data science resources to the table, and it's going to be a much more collaborative effort as opposed to a non-productive argument. >> Yeah. And that's where we want to get to. DataOps is key, and maybe that's a term that's still evolving. But really, you want the data to drive the business because that's where your insights are, that's where your value is. You want to break down the silos between not only the business units, as I mentioned, but also as you pointed out, the roles of the people that are working with it. A self service data culture is the right way to go with the right security controls, putting on my security hat of course in place so that if I'm a developer and I'm building a new application, I'd love to be able to go to the data catalog, "Oh, there's already a database that has the customer "what the customers have clicked on when shopping." I could use that. I don't have to rebuild that, I'll just use that as for my application. That's the kind of problems you want to be able to solve and that's where your cost reductions come in across the board. >> Yeah. I want to talk a little bit about the business context here. We always talk about data, it's the new source of competitive advantage, I think there's not a lot of debate about that, but it's hard. A lot of companies are struggling to get value out of their data because it's so difficult. All the things we've talked about, the silos, the data quality, et cetera. So, you mentioned the term data apps, data apps is all about streamlining, that data, pipelining, infusing automation and machine intelligence into that pipeline and then ultimately taking a systems view and compressing that time to insights so that you can drive monetization, whether it's cut costs, maybe it's new revenue, drive productivity, but it's that end to end cycle time reduction that successful practitioners talk about as having the biggest business impact. Are you seeing that? >> Absolutely, but it is a journey and it's a huge cultural change for some companies that are. I've worked in many companies that are ticket based IT-driven and just do even the marginalist of change or get insight, raise a ticket, wait a week and then out the other end will pop maybe a change that I needed and it'll take a while for us to get to a culture that truly has a self service data-driven nature where I'm the business owner, and I want to bring in a data scientist because we're losing. For example, a business might be losing to a competitor and they want to find what insights, why is the customer churn, for example, happening every Tuesday? What is it about Tuesday? This is where your data scientist comes in. The last thing you want is to raise a ticket, wait for the snapshot of the data, you want to enable that data scientist to come in, securely connect into the data, and do his analysis, and come back and give you those insights, which will give you that competitive advantage. >> Well, I love your point about churn, maybe it talks about the Andreessen quote that "Software's eating the world," and all companies are our software companies, and SaaS companies, and churn is the killer of SaaS companies. So very, very important point you're making. My last question for you before we summarize is the tech behind all of these. What makes Io-Tahoe unique in its ability to help automate that data pipeline? >> Well, we've done a lot of research, we have I think now maybe 11 pending patent applications, I think one has been approved to be issued (mumbles), but really, it's really about sitting down and doing the right kind of analysis and figuring out how we can optimize this journey. Some of these stuff isn't rocket science. You can read a schema and into an open source solution, but you can't necessarily find the hidden insights. So if I want to find my foreign key dependencies, which aren't always declared in the database, or I want to identify columns by their content, which because the columns might be labeled attribute one, attribute two, attribute three, or I want to find out how my data flows between the various tables in my database. That's the point at which you need to bring in automation, you need to bring in data science solutions, and there's even a degree of machine learning because for example, we might deduce that data is flowing from this table to this table and upon when you present that to the user with a 87% confidence, for example, and the user can go, or the administrator can go. Now, it really goes the other way, it was an invalid collusion and that's the machine learning cycle. So the next time we see that pattern again, in that environment we will be able to make a better recommendation because some things aren't black and white, they need that human intervention loop. >> All right, I just want to summarize with Lester Waters' playbook to moving to the cloud and I'll go through them. Hopefully, I took some notes, hopefully, I got them right. So step one, you want to do that data discovery audit, you want to be fact-based. Two is you want to remediate that data redundancy, and then three identify what you can get rid of. Oftentimes you don't get rid of stuff in IT, or maybe archive it to cheaper media. Four is consolidate those data silos, which is critical, breaking down those data barriers. And then, five is attack the quality issues before you do the migration. Six, which I thought was really intriguing was preserve that data format, you don't want to do the rewrite applications and do that conversion. It's okay to do a little bit of lifting and shifting >> This comes in after the task. >> Yeah, and then finally, and probably the most important is you got to have that relationship with the lines of business, your users, get them involved, begin that cultural shift. So I think great recipe Lester for safe cloud migration. I really appreciate your time. I'll give you the final word if you will bring us home. >> All right. Well, I think the journey to the cloud it's a tough one. You will save money, I have heard people say, you got to the cloud, it's too expensive, it's too this, too that, but really, there is an opportunity for savings. I'll tell you when I run data services as a PaaS service in the cloud, it's wonderful because I can scale up and scale down almost by virtually turning a knob. And so I'll have complete control and visibility of my costs. And so for me, that's very important. Io also, it gives me the opportunity to really ring fence my sensitive data, because let's face it, most organizations like being in a cheese grater when you talk about security, because there's so many ways in and out. So I find that by consolidating and bringing together the crown jewels, if you will. As a security practitioner, it's much more easy to control. But it's very important. You can't get there without some automation and automating this discovery and analysis process. >> Well, great advice. Lester, thanks so much. It's clear that the capex investments on data centers are generally not a good investment for most companies. Lester, really appreciate, Lester waters CTO of Io-Tahoe. Let's watch this short video and we'll come right back. You're watching The Cube, thank you. (upbeat music)

Published Date : Jun 23 2020

SUMMARY :

to you by Io-Tahoe. but it's great to have you on The Cube. you got a deep expertise in and that led to a career Guys, I have to invite you back, to help you with your and simplifying that so I see the journey to is the very first step. Now, is that a metadata exercise? and so that gives me a sketch if you will, that you need to bring site onto. we know what we've got, what's next? So you want to understand where Yeah, because you like point to the second step and then from that, you can decide, and the data lives in silos. and that's how you get Where does that fit in on the journey? So the opportunity to use tools here and that one of the earlier steps. and the data format, the and then you going to to plan your journey, and end to end view of the and you want all your business and it's going to be a much database that has the customer and compressing that time to insights and just do even the marginalist of change and churn is the killer That's the point at which you and do that conversion. after the task. and probably the most important is the journey to the cloud It's clear that the capex

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavidPERSON

0.99+

DavePERSON

0.99+

200 tablesQUANTITY

0.99+

hundredsQUANTITY

0.99+

90%QUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

Lester WatersPERSON

0.99+

sixQUANTITY

0.99+

first stepQUANTITY

0.99+

87%QUANTITY

0.99+

Information Security Cryptography GroupORGANIZATION

0.99+

25 columnsQUANTITY

0.99+

Io-TahoeORGANIZATION

0.99+

seven key principlesQUANTITY

0.99+

2000 tablesQUANTITY

0.99+

AndreessenPERSON

0.99+

SixQUANTITY

0.99+

second stepQUANTITY

0.99+

Io TahoePERSON

0.99+

TuesdayDATE

0.99+

50,000 columnsQUANTITY

0.99+

LesterPERSON

0.98+

11 pending patent applicationsQUANTITY

0.98+

fiveQUANTITY

0.97+

a weekQUANTITY

0.97+

20 master instancesQUANTITY

0.97+

EinsteinPERSON

0.97+

FourQUANTITY

0.97+

first oneQUANTITY

0.96+

oneQUANTITY

0.96+

first thingQUANTITY

0.95+

LesterORGANIZATION

0.95+

TwoQUANTITY

0.93+

FirstQUANTITY

0.93+

Enterprise Data AutomationORGANIZATION

0.93+

threeQUANTITY

0.93+

sevenQUANTITY

0.92+

step oneQUANTITY

0.92+

single versionQUANTITY

0.92+

pandemicEVENT

0.91+

SQL DatabaseTITLE

0.91+

single sourceQUANTITY

0.86+

three different business unitsQUANTITY

0.82+

The CubeORGANIZATION

0.8+

Oracle DatabaseTITLE

0.79+

thousands of applicationsQUANTITY

0.76+

single digitsQUANTITY

0.76+

capexORGANIZATION

0.74+

CTOPERSON

0.73+

Waters'PERSON

0.69+

eachQUANTITY

0.68+

attribute twoOTHER

0.65+

attribute threeOTHER

0.59+

The CubeTITLE

0.57+

attribute oneOTHER

0.44+

Lester Waters, Io-Tahoe


 

(upbeat music) >> Reporter: From around the globe, it's The Cube with digital coverage of enterprise data automation and event series brought to you by Io-Tahoe. >> Okay, we're back. Focusing on enterprise data automation, we're going to talk about the journey to the cloud. Remember, the hashtag is data automated. We're here with Lester Waters who's the CTO of Io-Tahoe, Lester, good to see you from across the pond on video, wish we were face to face, but it's great to have you on The Cube. >> Also I do, thank you for having me. >> Oh, you're very welcome. Hey, give us a little background on CTO, you got a deep expertise in a lot of different areas, but what do we need to know? >> Well, David, I started my career basically at Microsoft, where I started the Information Security Cryptography Group. They're the very first one that the company had and that led to a career in information security and of course, as you go along with the information security, data is the key element to be protected. So I always had my hands in data and that naturally progressed into a role with Io-Tahoe as their CTO. >> Guys, I have to invite you back, we'll talk crypto all day we'd love to do that but we're here talking about yeah, awesome, right? But we're here talking about the cloud and here we'll talk about the journey to the cloud and accelerate. Everybody's really interested obviously in cloud, even more interested now with the pandemic, but what's that all about? >> Well, moving to the cloud is quite an undertaking for most organizations. First of all, we've got as probably if you're a large enterprise, you probably have thousands of applications, you have hundreds and hundreds of database instances, and trying to shed some light on that, just to plan your move to the cloud is a real challenge. And some organizations try to tackle that manually. Really what Io-Tahoe is bringing is trying to tackle that in an automated version to help you with your journey to the cloud. >> Well, look at migrations are sometimes just an evil word to a lot of organizations, but at the same time, building up technical debt veneer after veneer and year, and year, and year is something that many companies are saying, "Okay, it's got to stop." So what's the prescription for that automation journey and simplifying that migration to the cloud? >> Well, I think the very first thing that's all about is data hygiene. You don't want to pick up your bad habits and take them to the cloud. You've got an opportunity here, so I see the journey to the cloud is an opportunity to really clean house, reorganize things, like moving out. You might move all your boxes, but you're kind of probably cherry pick what you're going to take with you and then you're going to organize it as you end up at your new destination. So from that, I get there's seven key principles that I like to operate by when I advise on the cloud migration. >> Okay. So, where do you start? >> Well, I think the first thing is understanding what you got, so discover and cataloging your data and your applications. If I don't know what I have, I can't move it, I can't improve it, I can't build up on it. And I have to understand there is dependency, so building that data catalog is the very first step. What do I got? >> Now, is that a metadata exercise? Sometimes there's more metadata than there is data. Is metadata part of that first step or? >> In deed, metadata is the first step so the metadata really describes the data you have. So, the metadata is going to tell me I have 2000 tables and maybe of those tables, there's an average of 25 columns each, and so that gives me a sketch if you will, of what I need to move. How big are the boxes I need to pack for my move to the cloud? >> Okay, and you're saying you can automate that data classification, categorization, discovery, correct using math machine intelligence, is that correct? >> Yeah, that's correct. So basically we go, and we will discover all of the schema, if you will, that's the metadata description of your tables and columns in your database in the data types. So we take, we will ingest that in, and we will build some insights around that. And we do that across a variety of platforms because everybody's organization has you've got a one yeah, an Oracle Database here, and you've got a Microsoft SQL Database here, you might have something else there that you need to bring site onto. And part of this journey is going to be about breaking down your data silos and understanding what you've got. >> Okay. So, we've done the audit, we know what we've got, what's next? Where do we go next? >> So the next thing is remediating that data. Where do I have duplicate data? Often times in an organization, data will get duplicated. So, somebody will take a snapshot of a data, and then ended up building a new application, which suddenly becomes dependent on that data. So it's not uncommon for an organization of 20 master instances of a customer. And you can see where that will go when trying to keep all that stuff in sync becomes a nightmare all by itself. So you want to understand where all your redundant data is. So when you go to the cloud, maybe you have an opportunity here to consolidate that data. >> Yeah, because you like to borrow in an Einstein or apply an Einstein Bromide right. Keep as much data as you can, but no more. >> Correct. >> Okay. So you get to the point to the second step you're kind of a one to reduce costs, then what? You figure out what to get rid of, or actually get rid of it, what's next? >> Yes, that would be the next step. So figuring out what you need and what you don't need often times I've found that there's obsolete columns of data in your databases that you just don't need, or maybe it's been superseded by another, you've got tables that have been superseded by other tables in your database. So you got to understand what's being used and what's not and then from that, you can decide, "I'm going to leave this stuff behind, "or I'm going to archive this stuff "cause I might need it for data retention "or I'm just going to delete it, "I don't need it at all." >> Well, Lester, most organizations, if they've been around a while, and the so-called incumbents, they've got data all over the place, their data marts, data warehouses, there are all kinds of different systems and the data lives in silos. So, how do you kind of deal with that problem? Is that part of the journey? >> That's a great point Dave, because you're right that the data silos happen because this business unit is chartered with this task another business unit has this task and that's how you get those instantiations of the same data occurring in multiple places. So as part of your cloud migration journey, you really want to plan where there's an opportunity to consolidate your data, because that means there'll be less to manage, there'll be less data to secure, and it'll have a smaller footprint, which means reduced costs. >> So, people always talk about a single version of the truth, data quality is a huge issue. I've talked to data practitioners and they've indicated that the quality metrics are in the single digits and they're trying to get to 90% plus, but maybe you could address data quality. Where does that fit in on the journey? >> That's, a very important point. First of all, you don't want to bring your legacy issues with you. As the point I made earlier, if you've got data quality issues, this is a good time to find those and identify and remediate them. But that can be a laborious task. We've had customers that have tried to do this by hand and it's very, very time consuming, cause you imagine if you've got 200 tables, 50,000 columns, imagine, the manual labor involved in doing that. And you could probably accomplish it, but it'll take a lot of work. So the opportunity to use tools here and automate that process is really will help you find those outliers there's that bad data and correct it before you move to the cloud. >> And you're just talking about that automation it's the same thing with data catalog and that one of the earlier steps. Organizations would do this manually or they try to do it manually and that's a lot of reason for the failure. They just, it's like cleaning out your data like you just don't want to do it (laughs). Okay, so then what's next? I think we're plowing through your steps here. What what's next on the journey? >> The next one is, in a nutshell, preserve your data format. Don't boil the ocean here to use a cliche. You want to do a certain degree of lift and shift because you've got application dependencies on that data and the data format, the tables on which they sit, the columns and the way they're named. So, some degree you are going to be doing a lift and shift, but it's an intelligent lift and shift using all the insights you've gathered by cataloging the data, looking for data quality issues, looking for duplicate columns, doing planning consolidation. You don't want to also rewrite your application. So, in that aspect, I think it's important to do a bit of lift and shift and preserve those data formats as they sit. >> Okay, so let me follow up on that. That sounds really important to me, because if you're doing a conversion and you're rewriting applications, that means that you're going to have to freeze the existing application, and then you going to be refueling the plane as you're in midair and a lot of times, especially with mission critical systems, you're never going to bring those together and that's a recipe for disaster, isn't it? >> Great analogy unless you're with the air force, you'll (mumbles) (laughs). Now, that's correct. It's you want to have bite-sized steps and that's why it's important to plan your journey, take these steps. You're using automation where you can to make that journey to the cloud much easier and more straightforward. >> All right, I like that. So we're taking a kind of a systems view and end to end view of the data pipeline, if you will. What's next? I think we're through. I think I've counted six. What's the lucky seven? >> Lucky seven, involve your business users. Really, when you think about it, your data is in silos. Part of this migration to the cloud is an opportunity to break down these silos, these silos that naturally occur as part of the business unit. You've got to break these cultural barriers that sometimes exist between business and say, so for example, I always advise, there's an opportunity here to consolidate your sensitive data, your PII, your personally identifiable information, and if three different business units have the same source of truth for that, there's was an opportunity to consolidate that into one as you migrate. That might be a little bit of tweaking to some of the apps that you have that are dependent on it, but in the long run, that's what you really want to do. You want to have a single source of truth, you want to ring fence that sensitive data, and you want all your business users talking together so that you're not reinventing the wheel. >> Well, the reason I think too that's so important is that you're now I would say you're creating a data driven culture. I know that's sort of a buzz word, but what it's true and what that means to me is that your users, your lines of business feel like they actually own the data rather than pointing fingers at the data group, the IT group, the data quality people, data engineers, saying, "Oh, I don't believe it." If the lines of business own the data, they're going to lean in, they're going to maybe bring their own data science resources to the table, and it's going to be a much more collaborative effort as opposed to a non-productive argument. >> Yeah. And that's where we want to get to. Data apps is key, and maybe that's a term that's still evolving. But really, you want the data to drive the business because that's where your insights are, that's where your value is. You want to break down the silos between not only the business units, as I mentioned, but also as you pointed out, the roles of the people that are working with it. A self service data culture is the right way to go with the right security controls, putting on my security hat of course in place so that if I'm a developer and I'm building a new application, I'd love to be able to go to the data catalog, "Oh, there's already a database that has the customer "what the customers have clicked on when shopping." I could use that. I don't have to rebuild that, I'll just use that as for my application. That's the kind of problems you want to be able to solve and that's where your cost reductions come in across the board. >> Yeah. I want to talk a little bit about the business context here. We always talk about data, it's the new source of competitive advantage, I think there's not a lot of debate about that, but it's hard. A lot of companies are struggling to get value out of their data because it's so difficult. All the things we've talked about, the silos, the data quality, et cetera. So, you mentioned the term data apps, data apps is all about streamlining, that data, pipelining, infusing automation and machine intelligence into that pipeline and then ultimately taking a systems view and compressing that time to insights so that you can drive monetization, whether it's cut costs, maybe it's new revenue, drive productivity, but it's that end to end cycle time reduction that successful practitioners talk about as having the biggest business impact. Are you seeing that? >> Absolutely, but it is a journey and it's a huge cultural change for some companies that are. I've worked in many companies that are ticket based IT-driven and just do even the marginalist of change or get insight, raise a ticket, wait a week and then out the other end will pop maybe a change that I needed and it'll take a while for us to get to a culture that truly has a self service data-driven nature where I'm the business owner, and I want to bring in a data scientist because we're losing. For example, a business might be losing to a competitor and they want to find what insights, why is the customer churn, for example, happening every Tuesday? What is it about Tuesday? This is where your data scientist comes in. The last thing you want is to raise a ticket, wait for the snapshot of the data, you want to enable that data scientist to come in, securely connect into the data, and do his analysis, and come back and give you those insights, which will give you that competitive advantage. >> Well, I love your point about churn, maybe it talks about the Andreessen quote that "Software's eating the world," and all companies are our software companies, and SaaS companies, and churn is the killer of SaaS companies. So very, very important point you're making. My last question for you before we summarize is the tech behind all of these. What makes Io-Tahoe unique in its ability to help automate that data pipeline? >> Well, we've done a lot of research, we have I think now maybe 11 pending patent applications, I think one has been approved to be issued (mumbles), but really, it's really about sitting down and doing the right kind of analysis and figuring out how we can optimize this journey. Some of these stuff isn't rocket science. You can read a schema and into an open source solution, but you can't necessarily find the hidden insights. So if I want to find my foreign key dependencies, which aren't always declared in the database, or I want to identify columns by their content, which because the columns might be labeled attribute one, attribute two, attribute three, or I want to find out how my data flows between the various tables in my database. That's the point at which you need to bring in automation, you need to bring in data science solutions, and there's even a degree of machine learning because for example, we might deduce that data is flowing from this table to this table and upon when you present that to the user with a 87% confidence, for example, and the user can go, or the administrator can go. Now, it really goes the other way, it was an invalid collusion and that's the machine learning cycle. So the next time we see that pattern again, in that environment we will be able to make a better recommendation because some things aren't black and white, they need that human intervention loop. >> All right, I just want to summarize with Lester Waters' playbook to moving to the cloud and I'll go through them. Hopefully, I took some notes, hopefully, I got them right. So step one, you want to do that data discovery audit, you want to be fact-based. Two is you want to remediate that data redundancy, and then three identify what you can get rid of. Oftentimes you don't get rid of stuff in IT, or maybe archive it to cheaper media. Four is consolidate those data silos, which is critical, breaking down those data barriers. And then, five is attack the quality issues before you do the migration. Six, which I thought was really intriguing was preserve that data format, you don't want to do the rewrite applications and do that conversion. It's okay to do a little bit of lifting and shifting >> This comes in after the task. >> Yeah, and then finally, and probably the most important is you got to have that relationship with the lines of business, your users, get them involved, begin that cultural shift. So I think great recipe Lester for safe cloud migration. I really appreciate your time. I'll give you the final word if you will bring us home. >> All right. Well, I think the journey to the cloud it's a tough one. You will save money, I have heard people say, you got to the cloud, it's too expensive, it's too this, too that, but really, there is an opportunity for savings. I'll tell you when I run data services as a PaaS service in the cloud, it's wonderful because I can scale up and scale down almost by virtually turning a knob. And so I'll have complete control and visibility of my costs. And so for me, that's very important. Io also, it gives me the opportunity to really ring fence my sensitive data, because let's face it, most organizations like being in a cheese grater when you talk about security, because there's so many ways in and out. So I find that by consolidating and bringing together the crown jewels, if you will. As a security practitioner, it's much more easy to control. But it's very important. You can't get there without some automation and automating this discovery and analysis process. >> Well, great advice. Lester, thanks so much. It's clear that the capex investments on data centers are generally not a good investment for most companies. Lester, really appreciate, Lester waters CTO of Io-Tahoe. Let's watch this short video and we'll come right back. You're watching The Cube, thank you. (upbeat music)

Published Date : Jun 4 2020

SUMMARY :

to you by Io-Tahoe. but it's great to have you on The Cube. you got a deep expertise in and that led to a career Guys, I have to invite you back, to help you with your and simplifying that so I see the journey to is the very first step. Now, is that a metadata exercise? and so that gives me a sketch if you will, that you need to bring site onto. we know what we've got, what's next? So you want to understand where Yeah, because you like point to the second step and then from that, you can decide, and the data lives in silos. and that's how you get Where does that fit in on the journey? So the opportunity to use tools here and that one of the earlier steps. and the data format, the and then you going to to plan your journey, and end to end view of the and you want all your business and it's going to be a much database that has the customer and compressing that time to insights and just do even the marginalist of change and churn is the killer That's the point at which you and do that conversion. after the task. and probably the most important is the journey to the cloud It's clear that the capex

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavidPERSON

0.99+

DavePERSON

0.99+

200 tablesQUANTITY

0.99+

hundredsQUANTITY

0.99+

90%QUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

Lester WatersPERSON

0.99+

sixQUANTITY

0.99+

first stepQUANTITY

0.99+

87%QUANTITY

0.99+

Information Security Cryptography GroupORGANIZATION

0.99+

25 columnsQUANTITY

0.99+

Io-TahoeORGANIZATION

0.99+

seven key principlesQUANTITY

0.99+

2000 tablesQUANTITY

0.99+

AndreessenPERSON

0.99+

SixQUANTITY

0.99+

second stepQUANTITY

0.99+

Io-TahoePERSON

0.99+

TuesdayDATE

0.99+

50,000 columnsQUANTITY

0.99+

LesterPERSON

0.98+

11 pending patent applicationsQUANTITY

0.98+

fiveQUANTITY

0.97+

a weekQUANTITY

0.97+

20 master instancesQUANTITY

0.97+

EinsteinPERSON

0.97+

FourQUANTITY

0.97+

first oneQUANTITY

0.96+

oneQUANTITY

0.96+

first thingQUANTITY

0.95+

LesterORGANIZATION

0.95+

TwoQUANTITY

0.93+

FirstQUANTITY

0.93+

threeQUANTITY

0.93+

sevenQUANTITY

0.92+

step oneQUANTITY

0.92+

single versionQUANTITY

0.92+

pandemicEVENT

0.91+

SQL DatabaseTITLE

0.91+

single sourceQUANTITY

0.86+

three different business unitsQUANTITY

0.82+

The CubeORGANIZATION

0.8+

Oracle DatabaseTITLE

0.79+

thousands of applicationsQUANTITY

0.76+

single digitsQUANTITY

0.76+

CTOPERSON

0.74+

Waters'PERSON

0.69+

eachQUANTITY

0.68+

capexORGANIZATION

0.67+

attribute twoOTHER

0.65+

The CubeTITLE

0.6+

attribute threeOTHER

0.59+

attribute oneOTHER

0.44+

Ajay Vohora & Lester Waters, Io-Tahoe | AWS re:Invent 2019


 

>>LA Las Vegas. It's the cube covering AWS reinvent 2019, brought to you by Amazon web services and they don't care along with its ecosystem partners. >>Fine. Oh, welcome back here to Las Vegas. We are alive at AWS. Reinvent a lot with Justin Warren. I'm John Walls day one of a jam pack show. We had great keynotes this morning from Andy Jassy, uh, also representatives from Goldman Sachs and number of other enterprises on this stage right now we're gonna talk about data. It's all about data with IO Tahoe, a couple of the companies, representatives, CEO H J for horror. Jorge J. Thanks for being with us. Thank you Joan. And uh, Lester waters is the CSO at IO Tahoe. Leicester. Good afternoon to you. Thanks for being with us. Thank you for having us. CJ, you brought a football with you there. I see. So you've come prepared for a sport sport. I love it. All right. But if this is that your booth and your, you're showing here I assume and exhibiting and I know you've got a big offering we're going to talk about a little bit later on. First tell us about IO Tahoe a little bit to inform our viewers right now who might not be too familiar with the company. >>Sure. Well, our background was dealing with enterprise scale data issues that were really about the complexity, the amount of data and different types of data. So 2014 around when we're in stealth, kind of working on our technology, uh, the, a lot of the common technologies around them were Apache base. So Hadoop, um, large enterprises that were working with like a GE, Comcast had a cow help us come out of stealth in 2017. Uh, and grave, it's gave us a great story of solving petabyte scale data challenges, uh, using machine learning. So, uh, that manual overhead, that more and more as we look at, uh, AWS services, how do we drive the automation and get the value from data, uh, automation. >>It's gotta be the way forwards. All right, so let's, let's jump onto that then. Uh, on, on that notion, you've got this exponential growth in data, obviously working off the edge internet of things. Um, all these inputs, right? And we have so much more information at our disposal. Some of it's great, some of it's not. How do we know the difference, especially in this world where this exponential increase has happened. Lester, I mean, just tackle that for, from a, uh, from a company perspective and identifying, you know, first off, how do we ever figure out what do we have that's that valuable? Where do we get the value out of that, right? And then, um, how do we make sense of it? How do we put it into practice? >>Yeah. So I think not most enterprises have a problem with data sprawl. There's project startup, we get a block of data and then all of a sudden the new, a new project comes along, they take a copy of that data. There's another instance of it. Then there's another instance for another project. >>And suddenly these different data sources become authoritative and become production. So now I have three, four, or five different instances. Oh, and then there's the three or four that got canceled and they're still sitting around. And as an information security professional, my challenge is to know where all of those pieces of data are so that, so that I can govern it and make sure that the stuff I don't need is gotten rid of it deleted. Uh, so you know, using the IO Tahoe software, I'm able to catalog all of that. I'm able to garner insights into that data using the, the nine patent pending algorithms that we have, uh, to, to find that, uh, to do intelligent tagging, if you will. So, uh, from my perspective, I'm very interested in making sure that I'm adhering to compliance rules. So the really cool thing about the stuff is that we go and tag data, we look at it and we actually tie it to lines of regulations. So you could go CC CCPA. This bit of text here applies to this. And that's really helpful for me as an information security professional because I'm not necessarily versed on every line of regulation, but when I can go and look at it handily like that, it makes it easier for me to go, Oh, okay, that's great. I know how to treat that in terms of control. So that for, that's the important bit for me. So if you don't know where your data is, you can't control it. You can't monitor it. >>Governance. Yeah. The, the knowing where stuff is, I'm familiar with a framework that was developed at Telstra back in Australia called the five no's, which is about exactly that. Knowing where your data is, what is it, who has access to it? Cause I actually being able to cattle on the data then like knowing what it is that you have. This is a mammoth task. I mean that's, that's hard enough 12 years ago. But like today with the amount of data that's actually actively being created every single day, so how, how does your system help CSOs tackle this, this kind of issue and maybe less listed. You can, you can start off and then, then you can tell us a bit more of yourself. >>Yeah, I mean I'll start off on that. It's a, a place to kind of see the feedback from our enterprise customers is as that veracity and volume of data increases. The, the challenge is definitely there to keep on top of governing that. So continually discovering that new data created, how is it different? How's it adding to the existing data? Uh, using machine learning and the models that we create, whether it's anomaly detection or classifying the data based on certain features in the data that allows us to tag it, load that in our catalog. So I've discovered it now we've made it accessible. Now any BI developer data engineer can search for that data in a catalog and make something from it. So if there were 10 steps in that data mile, we definitely sold the first four or five to of bring that momentum to getting value from that data. So discovering it, catalog it, tagging the data to make it searchable, and then it's free to pick up for whatever use case is out there, whether it's migration, security, compliance, um, security is a big one for you. >>And I would also add too, for the data scientists, you know, knowing all the assets they have available to them in order to, to drive those business value insights that they're so important these days. For companies because you know, a lot of companies compete on very thin margins and, and, and having insights into their data and to the way customers can use their data really can make, make or break a company these days. So that's, that's critical. And as Aja pointed out, being able to automate that through, through data ops if you will, uh, and drive those insights automatically is great. Like for example, from an information security standpoint, I want to fingerprint my data and I want to feed it into a DLP system. And so that, you know, I can really sort of keep an eye out if this data is actually going out. And it really is my data versus a standard reject kind of matching, which isn't the best, uh, techniques. So >>yeah. So walk us through that in a bit more detail. So you mentioned tagging is essentially that a couple of times. So let's go into the details a little bit about what that, what that actually means for customers. My understanding is that you're looking for things like a social security number that could be sitting somewhere in this data. So finding out where are all these social security numbers that I may not be aware of and it could be being shared with someone who shouldn't have access to that, but it is there, is that what it is or are they, are there other kinds of data that you're able to tag that traditional purchase? >>Yeah. Was wait straight out of the box. You've got your um, PII or personally, um, identifiable information, that kind of day that is covered under the CCPA GDPR. So there are those standards, regulatory driven definitions that is social security number name, address would fall under. Um, beyond that. Then in a large enterprise, you've got a clever data scientists, data engineers you through the nature of their work can combine sets of data that could include work patterns, IDs, um, lots of activity. You bring that together and that suddenly becomes, uh, under that umbrella of sensitive. Um, so being able to tag and classify data under those regulatory policies, but then is what and what could be an operational risk to an organization, whether it's a bank, insurance, utility, health care in particular, if you work in all those verticals or yeah, across the way, agnostic to any vertical. >>Okay. All right. And the nature of being able to do that is having that machine learning set up a baseline, um, around what is sensitive and then honing that to what is particular to that organization. So, you know, lots of people will use ever sort of seen here at AWS S three, uh, Aurora, Postgres or, or my sequel Redshift. Um, and also different ways the underlying sources of that data, whether it's a CRM system, a IOT, all of those sources have got nuances that makes every enterprise data landscape just slightly different. So China make a rules based, one size fits all approach is, is going to be limiting, um, that the increase your manual overhead. So customers like GE, Comcast, um, that move way beyond throwing people at the problem, that's no longer possible. Uh, so being smart about how to approach this, classifying the data, using features in the data crane, that metadata as an asset just as an eight data warehouse would be, allows you to, to enable the rest of the organization. >>So, I mean, you've talked about, um, you know, deriving value and identifying value. Um, how does ultimately, once you catalog your tag, what does this mean to the bottom line of terms of ROI? How does AWS play into that? Um, you know, why am I as, as a, as a company, you know, what value am I getting out of, of your abilities with AWS and then having that kind of capability. >>Yeah. We, we did a great study with Forester. Um, they calculated the ROI and it's a mixture of things. It's that manual personnel overhead who are locked into that. Um, pretty unpleasant low productivity role of wrangling with data for want of a better words to make something of it. They'd much rather be creating the dashboards that the BI or the insights. Um, so moving, you know, dozens of people from the back office manual wrangling into what's going to make difference to the chief marketing officer and your CFO bring down the cost of served your customer by getting those operational insights is how they want to get to working with that data. So that automation to take out the manual overhead of the upfront task is an allowing that, that resource to be better deployed onto the more interesting productive work. So that's one part of the ROI. >>The other is with AWS. What we've found here engaging with the AWS ecosystem is just that speed of migration to AWS. We can take months out of that by cataloging what's on premise and saying, huh, I date aside. So our data engineering team want to create products on for their own customers using Sage maker using Redshift, Athena. Um, but what is the exact data that we need to push into the cloud to use those services? Is it the 20 petabytes that we've accumulated over the 20 last 20 years? That's probably not going to be the case. So tiering the on prem and cloud, um, base of that data is, is really helpful to a data officer and an information architect to set themselves up to accelerate that migration to AWS. So for people who've used this kind of system and they've run through the tagging and seen the power of the platform that you've got there. So what are some of the things that they're now able to do once they've got these highly qual, high quality tagged data set? >>So it's not just tagging too. We also do, uh, we do, we do, we do fuzzy, fuzzy magic so we can find relationships in the data or even relationships within the data in terms of duplicate. So, so for example, somebody, somebody got married and they're really the same, you know, so now there's their surname has changed. We can help companies find that, those bits of a matching. And I think we had one customer where we saved about, saved him about a hundred thousand a year in mailing costs because they were sending, you know, to, you know, misses, you know, right there anymore. Her name was. And having the, you know, being able to deduplicate that kind of data really helps with that helps people save money. >>Yep. And that's kind of the next phase in our journey is moving beyond the tag in the classification is uh, our roadmap working with AWS is very much machine learning driven. So our engineering team, uh, what they're excited about is what's the next model, what's the next problem we can solve with AI machine learning to throw at the large scale data problem. So we'll continually be curating and creating that metadata catalog asset. So allow that to be used as a resource to enable the rest of the, the data landscape. >>And I think what's interesting about our product is we really have multiple audiences for it. We've got the chief data officer who wants to make sure that we're completely compliant because it doesn't want that 4% potential fine. You know, so being able to evidence that they're having due diligence and their data management will go a long way towards if there is a breach because zero days do happen. But if you can evidence that you've really been, been, had a good discipline, then you won't get that fine or hopefully you won't get a big fine. And that the second audience is going to be information security professionals who want to secure that perimeter. The third is going to be the data architects who are trying to, to uh, to, you know, manage and, and create new solutions with that data. And the fourth of course is the data scientists trying to drive >>new business value. >>Alright, well before we, we, we, we um, let y'all take off, I want to know about, uh, an offering that you've launched this week, uh, apparently to great success and you're pretty excited about just your space alone here, your presence here. But tell us a little bit about that before you take off. >>Yeah. So we're here also sponsoring the jam lounge and everybody's welcome to sign up. It's, um, a number of our friends there to competitively take some challenges, come into the jam lounge, use our products, and kind of understand what it means to accelerate that journey onto AWS. What can I do if I show what what? Yeah, give me, give me an idea about the blog. You can take some chances to discover data and understand what data is there. Isn't there fighting relationships and intuitively through our UI, start exploring that and, and joining the dots. Um, uh, what, what is my day that knowing your data and then creating policies to drive that data into use. Cool. Good. And maybe pick up a football along the way so I know. Yeah. Thanks for being with us. Thank you for half the time. And, uh, again, the jam lounge, right? Right, right here at the SAS Bora AWS reinvent. We are alive. And you're watching this right here on the queue.

Published Date : Dec 4 2019

SUMMARY :

AWS reinvent 2019, brought to you by Amazon web services So you've come prepared for So Hadoop, um, large enterprises that were working with like and identifying, you know, first off, how do we ever figure out what do we have that's that There's project startup, we get a block of data and then all of a sudden the new, a new project comes along, So that for, that's the important bit for me. it is that you have. tagging the data to make it searchable, and then it's free to pick up for And I would also add too, for the data scientists, you know, knowing all the assets they So let's go into the details a little bit about what that, what that actually means for customers. Um, so being able to tag and classify And the nature of being able to do that is having Um, you know, why am I as, as a, as a company, you know, what value am I Um, so moving, you know, dozens of people from the back office base of that data is, is really helpful to a data officer and And having the, you know, being able to deduplicate that kind of data really So allow that to be used as a resource And that the second audience is going you take off. start exploring that and, and joining the dots.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
ComcastORGANIZATION

0.99+

GEORGANIZATION

0.99+

Justin WarrenPERSON

0.99+

Andy JassyPERSON

0.99+

Goldman SachsORGANIZATION

0.99+

AustraliaLOCATION

0.99+

2017DATE

0.99+

JoanPERSON

0.99+

AWSORGANIZATION

0.99+

10 stepsQUANTITY

0.99+

threeQUANTITY

0.99+

Las VegasLOCATION

0.99+

2014DATE

0.99+

TelstraORGANIZATION

0.99+

Jorge J.PERSON

0.99+

fiveQUANTITY

0.99+

Ajay VohoraPERSON

0.99+

AmazonORGANIZATION

0.99+

20 petabytesQUANTITY

0.99+

fourQUANTITY

0.99+

John WallsPERSON

0.99+

IO TahoeORGANIZATION

0.99+

4%QUANTITY

0.99+

Io-TahoePERSON

0.99+

one customerQUANTITY

0.99+

FirstQUANTITY

0.99+

CJPERSON

0.99+

RedshiftTITLE

0.99+

thirdQUANTITY

0.99+

12 years agoDATE

0.98+

fourthQUANTITY

0.98+

todayDATE

0.98+

Lester WatersPERSON

0.98+

H JPERSON

0.97+

AjaPERSON

0.97+

ForesterORGANIZATION

0.97+

CCPATITLE

0.97+

this weekDATE

0.97+

zero daysQUANTITY

0.96+

about a hundred thousand a yearQUANTITY

0.96+

firstQUANTITY

0.95+

second audienceQUANTITY

0.94+

nineQUANTITY

0.94+

LA Las VegasLOCATION

0.94+

SageORGANIZATION

0.92+

LeicesterLOCATION

0.91+

ApacheORGANIZATION

0.9+

LesterPERSON

0.9+

SAS BoraORGANIZATION

0.88+

first fourQUANTITY

0.87+

one partQUANTITY

0.87+

oneQUANTITY

0.87+

2019DATE

0.85+

HadoopORGANIZATION

0.84+

AuroraTITLE

0.82+

dozens of peopleQUANTITY

0.79+

RedshiftORGANIZATION

0.78+

PostgresORGANIZATION

0.76+

20DATE

0.75+

eight data warehouseQUANTITY

0.74+

five differentQUANTITY

0.73+

CEOPERSON

0.7+

single dayQUANTITY

0.69+

ChinaLOCATION

0.68+

20 lastQUANTITY

0.65+

AthenaLOCATION

0.63+

morningDATE

0.55+

InventEVENT

0.54+

GDPRTITLE

0.53+

S threeTITLE

0.52+

yearsQUANTITY

0.51+

noOTHER

0.4+

watersORGANIZATION

0.39+

Tori Bedford, Caroline Lester & Hilary Burns, GroundTruth Project, Grace Hopper Celebration 2017


 

>> Announcer: Live, from Orlando, Florida, it's theCube, covering Grace Hopper Celebration of Women in Computing. Brought to you by SiliconAngle Media. >> Welcome back to theCube's coverage of Grace Hopper Conference here in Orlando, Florida, I'm your host, Rebecca Knight. We have a great panel here today, we have three guests. We have Hilary Burns and Caroline Lester, both Reporting Fellows for the GroundTruth Project, and Tori Bedford, who is a Field Producer for the GroundTruth Project. It's great to have you guys on here. >> Thank you for having us. >> Thank you. >> So, I'll start with you, Tori, since you were a reporting fellow last year at the Grace Hopper Conference, tell our viewers what the GroundTruth Project is, and what your mission is. >> So, the GroundTruth Project is a non-profit based in Boston and it hopes to encourage young journalists and earlier-career journalists all around the world. So there is a series of fellowships going on, pretty much at all times. Different projects, there's one going across America right now that's looking at, it's called Crossing the Divide, it's looking at divides in America. It's a very divisive time for American politics so they're doing stories about that. And, obviously, we are re-upping our women in tech, women in leadership fellowship this year which we're really excited about. >> And so, each of you are working on your own, individual stories and then you will get back to Boston and produce. So, Hilary, let's hear from you, what are you working on here, what's your topic? >> Sure, so most of my time spend at the Grace Hopper Celebration so far has been spent talking with student about their career aspirations, any barriers they foresee, coming across any concerns they have about entering a male-dominated industry. And it's really been fascinating hearing their stories, some of them are international students, others are from universities all over the world and including Canada and the U.S. So, it's been very inspirational to hear. >> So, but here the ones that are aspiring to careers in technology and they're here at Grace Hopper, but there must be other ones who are too discouraged so they're not here. Are you also getting that angle, too? >> Well, I think it's important for that group of women to see these women who do feel empowered and are, a lot of them use phrases like, "We are making a difference in the gender gap "and if I don't do it, who else will do it?" So, I think it's important for all aspiring technologists to hear these women's stories. >> Are they discouraged, though? Because the headlines are bleak, I mean, we know that it's the numbers, but it's also the Google manifesto, it's the shenanigans of Travis Kalanick and people like him in Silicon Valley. What do they make of that? >> It's interesting, all of them are very intelligent, very aware of what's going on in the world. I've heard a mixed bag of responses from, "I try not to "read too much because I don't want to go in expecting "and having my own biases, I want to see for myself." Others are saying, "Yeah, I am nervous and I want to see "more women creating a path that I can then follow." So, I think there are a lot of people that are optimistically optimistic about their future. >> Cautiously optimistic. >> Thank you. Thank you for correcting me. (laughs) But, it's been interesting to hear all the different perspectives. >> Great, Caroline, how about you, what are you working on? >> Yeah, so, I am personally interested in the more personal stories of some of these women speaking at the conference. I've talked to the four really, wonderful, inspirational women. So, one of my favorites, I've just published a story on her, Chieko Asakawa, who is an IBM Fellow which is the highest honor you can receive at IBM. And she went blind at the age of 11, and has spent her life programming and creating programs and tools to help the blind access the world that is pretty hard to navigate if you don't have eyesight. So, she is super inspirational, super smart, super funny. So, it was a pleasure talking with her. And then I'm talking with three other women, Yasmine Mustafa, who started something called Roar for Good. >> Rebecca: We've had her on the show. >> Oh, you did? >> Yeah. >> Wonderful, great. So, she's fantastic, I'm really glad you covered her. And then another woman named Sarah Echohawk, who, sort of, is an advocate, an activist and is getting more and young, native women involved in STEM. And then, finally, I'll be talking with Stephanie Lampkin of Blendoor, who started this wonderful app to try and overcome the implicit bias, and unconscious bias that happens when people are hiring women or people of color in recruiting for them. >> So she's starting this app that she will then sell to companies, or sell to other recruiters? >> So, she's already started it and she has a lot of major tech companies involved. I think Airbnb uses it, I want to say SalesForce uses it, you're going to have to check me on that one. But she's got about 5,000 people on it right now. >> Wow, so the goal of these stories is to inspire other women by their success. >> Exactly, so these are four radically different women coming into tech in radically different ways and it's just really incredible to see how they've managed to overcome all sorts of obstacles in their way. And not only overcome them, but, sort of, utilize them to their advantage and stake out a place for themselves in this industry. >> Great, Tori, what are the projects that you are working on here? >> So, we've been hearing a lot about diversity, diversity is so important, and we've been hearing about how increasing diversity in a company makes your company better. It just brings in more perspectives, and it also, what's really interesting is that, in tech, it can catch people who have a diverse range of perspectives, can catch problems with products, or with a code, or with something, and how it would be implemented out into the world. I caught this really interesting panel yesterday about disability and looking at how people with disabilities can make companies, specifically tech companies, can help to improve them. This woman, Jennifer Jong, who is an Accessibility Program Manager at Microsoft, she was really interesting. She was talking about how, I wrote a piece on this yesterday, she's talking about how, when you bring people in with a disability, how they can catch things that other people just don't see or wouldn't normally notice. And also how, when we create things for those with disabilities, you know, a lot of things that have been implemented by the Americans with Disabilities Act. She talked about the button that you press to go through the door, how it can also be used by people who don't have disabilities and how it's important to create things that can be used by everybody, but that have inclusion in mind. >> So, why is that true? What is her perspective on why people with disabilities have this special ways to detect blind spots? >> So, if you're creating something, there's no way that you can know how many users are going to be interacting with it, there's no way that you can predict that a person with a disability won't be using it, and so it's diversity, it's really important to bring in different perspectives. So, they had talked about a video, a really beautiful, promotional video that showed a range of visuals, it was very effective but it had no sound and a blind person wouldn't get anything out of it. And so, it's like looking at a product, you need somebody to be in the room, just like you want women and people of color and a range of ethnicities, you want diversity, you want someone to be able to say, "This isn't going to work for me, this isn't going "to work for my child, this isn't going to work "for a range of people." And that's a really effective and important thing that ultimately saves your company's bottom line, because then, you won't have to go back and change your product in the future. >> And fix it, fix it as a problem. >> Right, you'll spend more money fixing your product than you would if you had just talked about, had inclusion and diversity, if you'd just considered that from the get-go, you'll ultimately save your company more money. >> So, the question for the three of you, really, is that as you said, we hear so much about the importance of diversity and of getting a variety of perspectives, and having people of different genders, and races, and cultures feel included and having a voice at the table, I just want to know, I mean, do companies really feel this way or is that what they say at Grace Hopper because this is what makes sense to say to their target audience? >> It's totally possible that it's just a marketing ploy, it's totally possible that they're realizing that half the population makes money and can do things, and that makes more money. I mean, a lot of tech is driven by the bottom line, it's driven by financials, but in the case of the disability thing, it's like, it almost doesn't matter. It is not only the right thing to do, if you need a financial incentive, that's not good. Obviously, it's the right thing to do so you should be doing it for that reason, but if you do also have a financial incentive, that's not bad. And if we're, sort of, driving more towards empowering women and giving women a voice and allowing women to do things and taking them seriously, ultimately that's not a bad thing. >> And just to add to that, I think there is a lot of research out there today, for example, having more women on corporate boards, that that does impact the bottom line and, obviously, that's what companies are most concerned about. So, I think that companies are starting to realize that having that diversity and inclusion is good for business as well as a marketing ploy. >> And I think, I mean, just to add, I also think that, you know, whether or not this is a good thing, I think companies do realize that that is important. And they're realizing that it's necessary, I don't know, it's necessary to impact the bottom line and that is something that, whether or not we like it, it is the most convincing factor for many of these companies. >> I think it's also, when you have women moving up to positions of power, to the C-Suite, to positions of leadership, they understand that women are people with skills and they are the ones who are, you know, hiring more women, and that ultimately helps the bottom line. So, as you have more and more women moving higher and higher to the top, that's when, like when we talk about the companies changing, that's because women are changing. And they're changing the perspectives of men and everybody else in between that works at the company. >> Are women changing? I mean, I think that's a question, too, is that we're all as collectively as a society, becoming more aware that these biases exist in hiring and recruitment practices. But, I think that's the question, are women starting to change, too, the way they behave in the workplace, the way they go about managing their careers? >> I know it's changing minds, like other peoples' minds. >> That's a really interesting question, though. One student I talked to who was from India, talked about the gender discrimination she has faced. And she said she did change how she acted, she shut down all emotions, she took any emotion out of her responses because her colleagues would say, "Oh, you're a woman, "you're so emotional," and she was tired of that. So, it's an interesting question to look at. I don't know, I don't have the data in front of me but it would be interesting to look into that. >> Yeah, great, that's the next GroundTruth project. Excellent, well Hilary, Caroline, Tori, thanks so much for being on theCube, we've had great fun talking to you. >> Yes, thanks for having us. >> Thank you. >> We will have more from the Orange County Convention Center, the Grace Hopper Celebration of Women in Computing, just after this. (upbeat music)

Published Date : Oct 6 2017

SUMMARY :

Brought to you by SiliconAngle Media. It's great to have you guys on here. So, I'll start with you, Tori, since you were a So, the GroundTruth Project is a non-profit based And so, each of you are working on your own, individual and including Canada and the U.S. So, but here the ones that are aspiring to careers to see these women who do feel empowered and are, it's the numbers, but it's also the Google manifesto, So, I think there are a lot of people that are But, it's been interesting to hear pretty hard to navigate if you don't have eyesight. So, she's fantastic, I'm really glad you covered her. I think Airbnb uses it, I want to say SalesForce uses it, Wow, so the goal of these stories is to inspire and it's just really incredible to see how they've managed She talked about the button that you press to go through to be in the room, just like you want women that from the get-go, you'll ultimately save your Obviously, it's the right thing to do so you should So, I think that companies are starting to realize that And I think, I mean, just to add, I think it's also, when you have women moving up the way they go about managing their careers? So, it's an interesting question to look at. Yeah, great, that's the next GroundTruth project. Center, the Grace Hopper Celebration of Women

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Sarah EchohawkPERSON

0.99+

ToriPERSON

0.99+

Jennifer JongPERSON

0.99+

Yasmine MustafaPERSON

0.99+

CarolinePERSON

0.99+

HilaryPERSON

0.99+

Rebecca KnightPERSON

0.99+

Tori BedfordPERSON

0.99+

Caroline LesterPERSON

0.99+

Stephanie LampkinPERSON

0.99+

Hilary BurnsPERSON

0.99+

Chieko AsakawaPERSON

0.99+

BostonLOCATION

0.99+

MicrosoftORGANIZATION

0.99+

RebeccaPERSON

0.99+

CanadaLOCATION

0.99+

Travis KalanickPERSON

0.99+

AmericaLOCATION

0.99+

Americans with Disabilities ActTITLE

0.99+

IBMORGANIZATION

0.99+

Orlando, FloridaLOCATION

0.99+

SiliconAngle MediaORGANIZATION

0.99+

threeQUANTITY

0.99+

Silicon ValleyLOCATION

0.99+

last yearDATE

0.99+

yesterdayDATE

0.99+

three guestsQUANTITY

0.99+

U.S.LOCATION

0.99+

oneQUANTITY

0.99+

fourQUANTITY

0.98+

bothQUANTITY

0.98+

Orange County Convention CenterLOCATION

0.98+

IndiaLOCATION

0.98+

One studentQUANTITY

0.98+

AirbnbORGANIZATION

0.98+

Grace HopperEVENT

0.97+

this yearDATE

0.97+

11QUANTITY

0.97+

todayDATE

0.97+

eachQUANTITY

0.96+

BlendoorORGANIZATION

0.96+

Grace HopperORGANIZATION

0.95+

three other womenQUANTITY

0.95+

GoogleORGANIZATION

0.94+

theCubeORGANIZATION

0.94+

GroundTruthORGANIZATION

0.91+

Grace Hopper Celebration of Women in ComputingEVENT

0.91+

Grace Hopper ConferenceEVENT

0.9+

about 5,000 peopleQUANTITY

0.89+

Grace Hopper ConferenceEVENT

0.89+

Grace Hopper Celebration of Women in ComputingEVENT

0.89+

half the populationQUANTITY

0.89+

GroundTruth ProjectORGANIZATION

0.87+

AmericanOTHER

0.86+

SalesForceORGANIZATION

0.84+

Roar for GoodTITLE

0.78+

inspirational womenQUANTITY

0.75+

Grace Hopper CelebrationEVENT

0.74+

four radically different womenQUANTITY

0.7+

Crossing the DivideEVENT

0.58+

2017DATE

0.47+

GroundTruthEVENT

0.42+

ProjectORGANIZATION

0.41+

Jason Chaffee & Eileen Haggerty | CUBE Conversation


 

(bright music) >> Hey, welcome to this "CUBE Conversation." I'm your host, Lisa Martin. I've got two guests from NETSCOUT here with me today. Eileen Haggerty joins us, the AVP of Product and Solutions Marketing and Jason Chaffee, Senior Product Manager. We're going to be talking about the importance of quality end user experience with UC&C, Unified Communications and Collaborations services for something that will be near and dear to all of our hearts, employee productivity. Eileen, let's go ahead and start with you and the impact of COVID on UC&C, what has it been? >> Oh, Lisa, great question, because we really have seen an evolution in the importance and reliance on UCC. COVID would not have allowed us to go to work, do business continuity, any of those things had it not been for strong communications platforms to help us do that. And in fact, really the hero of all of this has been what's called Unified Communications as a Service or UCaaS. Enterprise businesses really depended entirely on the communications between the home office and the employees remotely. This is also known to be the way we all went to work. It was no longer a car. We picked up the phone basically or the computer. So Zoom, WebEx, Teams, Google Meets, they've all become household names really over the last two years. That's kind of exciting for them. And businesses during that period of time expanded their tools to keep business running and employees in communication using these very platforms, and we'll refer to this a couple of times during this conversation too, Lisa. We did a survey at the end of 2021, IT leaders, about their use of UCaaS and UCC during this period of time. We found that almost of them had used collaboration tools, and in fact, added to their arsenal of tools during this period of time to such an extent that they're now ranging, the majority of them, between three and nine different platforms that their corporate employees use. This became unwieldy, of course, during that time, and so their strategy going forward is going to be to reduce some of that number, but pretty interesting details. >> Yeah, between three and nine is a lot, and certainly, UC&C became a lifeline for all of us, professionally and personally. Even my mom learned how to use Zoom during this time. I was pretty proud of helping her with that. But, Jason, talk to us about all these new communication services. We're completely dependent on them, but overall what have you found out in terms of how they worked out? >> Well, to be honest with you, I think from the IT organization, it's been a challenge. It's been difficult. I think every IT organization is really motivated to ensure the quality of the services throughout the whole company, but as you can imagine, the increase of these communications Eileen just talked about during the pandemic is just significantly increasing the number of IT help desk tickets that have come through. And in that survey that Eileen just talked about, in fact, a third of those that responded said that 50 to 75% of their help desk tickets right now are related to UC&C or UCaaS services, and in fact, they say that over half of them have said that they get those tickets at least once a day, if not multiple times a day. And I think another big aspect of this that's been a challenge is everybody working from home now and the whole hybrid environment, and IT teams are really trying to understand and make sure that they get the same delivery of services if they were in the corporate headquarters, and I think they felt a loss of control and visibility in the services that are being delivered. I think the other thing that came out of this survey was about 25% of those said that they could get these issues if and when they happen, resolved in just a matter of minutes, but most said that it can take hours or even days to get through those, and that's obviously a really bad look for the company and really hinders productivity. So overall, I'd say it's been a challenge. I think as this onslaught of services that have come through and hampered and that everyone's trying to manage and get through, along with the lack of visibility when everybody's working from home. Of course, it's been fantastic for those of us that are working from home and made everything easier, but I think it's just made it that much more difficult for the IT teams that are trying to manage this new environment. >> Right, definitely difficulty behind the scenes there. You talked about the 25% of IT organizations being able to resolve quickly, but that leaves 75% of organizations where it takes more than a few minutes, and I can imagine individually, that might not be a big impact, but, Eileen, overall if it's taking more than a few minutes to resolve UC&C IT help desk issues, what's the overall impact to a business? >> It can be significant, and we hear a lot of little stories sadly sometimes on Lester Holt's evening news, but really what you are looking at here are longer periods of time where employees can't talk to each other. We've got email. We can probably compensate a different manner, but when it happens to be your customers not being able to talk to customer service reps in the contact center, couple of hours, that can be a big issue. Partners and suppliers who might be trying to get you important information very quickly. Maybe it's a supply chain issue item that they want to alert you to that you need to act on. That's a long period of time. And I think it's kind of important here to call out one special group, and that would be corporate executives. I think we've all heard about these big town hall meetings that corporate executives may be holding with employees or investors and all of a sudden their UCaaS support freezes, or it doesn't connect the voice in the video, and all of a sudden, you've got a very embarrassing situation. It really gets the attention of the public. Losing communication for a couple of hours, bottom line, it is going to impact productivity, customer service, and it could impact reputation, especially with those social media influencers that we all both favor and fear. So, when we were talking about our survey results, that is actually a top concern of IT executives, that productivity will get hit if communications problems do exist. So I think really ultimately for all of us in the business, disruptions and communication, it's going to be bad for business, any length. >> It is bad for business at any length, and that's a huge risk for businesses in any industry. I've been on those executive town halls where video wouldn't connect, and you just think, as much as we wanted that human connection during this time, and you couldn't get it, it made the the interaction not as ideal and obviously a risk for the organization. So, Jason, how can IT then jump in and resolve these disruptions faster, because time is of the essence here? >> Well, yeah, exactly. As we've discussed and Eileen just talked about, I think resolving issues quickly is really the key. I think we all know issues are going to happen, they just will, but it's really the IT team that can solve those the fastest is the team that's going to win, and so I think that's really the key to all of that. And one of the things that comes out of that is, again, from this survey is that only about 54% of the respondents said that they felt confident that they could understand root cause and be able to get to those issues quickly, which leaves about 43%, almost as many, that said they were less than confident or somewhat confident in finding that root cause, and so I think that's really the key there is really having the confidence to be able to find that, and to get that confidence, you need to be able to understand root cause quickly. And in order to get that, I think you need a combination of two things, which is passive, packet-based monitoring as well as continuous active testing or monitoring of those solutions. So, what I mean by that is being able to automatically and continuously test these services, even if nobody's on the system and nobody's on trying to make a phone call. So you have somebody who's trying to host, an active agent that's trying to host a meeting and others that are trying to join the meeting and sending an audio and sending and receiving video and looking at the measurements and trying to take all of that data in to really proactively understand what's going on and doing this every 15 minutes or every once an hour to really, again, get ahead of things before they become a problem. But I think beyond that, it's really about being able to take that data and the packets from those transactions that you were just testing and be able to trend that data and define problems and diagnose issues proactively. Again, as Eileen just said, before the CEO gets on there and tries to make his town hall call, so that that's really important to be able to solve those things more quickly. I think it's really a combination of a passive, scalable monitoring solution along with scheduled automatic testing of those, and along with the packets that go with that, that's really a combination of both. It's kind of a best of both worlds in order to get those things solved quickly. >> To get them solved quickly, I want to go back to something that Eileen said. You mentioned the word 'confidence,' and that I think it's important to point out that you're not saying that trivially, that IT needs to have the confidence that it has the right solutions in place to discover these faster. Eileen, from your perspective, talk to me about what that confidence means to IT and how it can shift up the stack to the C-suite. >> You know, honestly, processes and policies in these organizations are critical. They need to be able to notice when the trouble ticket comes in, and there's a lot of 'em, let's face it, and they're coming from all kinds of locations. Now, it's some of the remote offices. Some of 'em are still people at home. You've got to be able to know where to turn, what screen to use, what tool to adjust, what workflow to process, and that does come with practice, but it also comes with a solid set of tools and visibility strategies, and then you follow that process through, you work together. Maybe the voice, people in the network, people have to work together, maybe the cloud people, 'cause it's a contract with UCaaS, work together, gather the evidence and pinpoint the solution that's going to fix the problem with those locations. And it is, it becomes then a confidence builder, proof points. >> Right, proof points are critical. So, the solution that you both talked about, Jason, you elaborated on this, I'd love to get some real world examples. Tell me how you've seen this in practice. Jason, we'll start with you and then, Eileen, we'll go to you. >> Okay, yeah, great, I was just thinking of one that we had that really was one of the largest insurance companies in the country, if not the world, and when the pandemic hit, they suddenly had to send everybody home, and this is the lifeblood of their company, the contact centers that are answering these calls and the ones that were processing these claims. And as everybody went home, their strategy really was to actually go buy new laptops for everyone and implement VPNs that had a little bit of, but not fully and then implement SD-WAN, and so they had all of this traffic going over VPNs and through SD-WANs and new UCaaS solutions and all of this and what they quickly learned and found out was they just didn't have the visibility to be able to fuel, again, that word confidence that they were serving their their customers very well. So, they actually implemented one of our solutions and put these agents out at all their different desktops and started watching and doing these proactive calls and making going through the meeting life cycle and actually testing the bandwidths of their SD-WAN and ensuring all of those services. And what they found was they were able to solve some of the solutions that are even harder to solve normally, because it was affecting some users, but not all of them, and that's often harder to try and get their arms around. And so as they continued to do this, and just got their arms more around it and got more visibility, they really feel like everything's under control. And as of now, they're actually planning on leaving all those users working from home now, because they can actually ensure the same type of experience for both the users and for their customers as if those people were working from their corporate headquarters. >> Jason, that sort of sounds like a bit of a COVID silver lining. >> (laughs) Yeah, I think so. I think a lot of us actually started working from home and so there was kind of the silver lining of flexibility for the employee, but for the customer and the company itself, they learned this new visibility and this new way to ensure that across everywhere, wherever they may be, and I don't know that that would've come out without the COVID silver lining, as you just said. So I think it was something that really came out of it that might've been a good thing. >> And there are a few of those, which is nice. Eileen, talk to me about some of the experiences that you've had. What have you seen out in the field? >> Yeah, we have one really terrific energy company that was talking with us the other day, and their employees use Microsoft Office 365 which has the teams collaboration and communications system with it. And, what they've been doing for those at-home employees was configuring tests on their works stations, much like Jason explained, but it mimics exactly how an employee might be making their call and joining the sessions from video to audio, to going through login and log out. What's interesting is, and this is a compelling differentiator, a lot of tools may just watch traffic as it's happening, and certainly that's a value, but these tests even run when our agents are asleep. And what that does is these are all 24 hour a day businesses, and so maybe they have followed-the-sun contact centers or whatnot and something's happening in one part of the world, but then it's rolling to others, and we have all heard those disaster stories online when we wake up and we're hearing it on the morning news. So, if an organization can find the problem and detect it early enough and then get it when it's a few people that are involved, they can actually resolve it with our tools, find the root cause, implement a corrective action before the majority of their agents are even logging in in the morning. Nobody even knew that there was a problem overnight, because they were able to get to it and resolve it faster, and when you can do that, you're being proactive. And this, again, builds on the confidence that you get doing this kind of activity over and over and over again. But at the same time, it's also enormously beneficial from a business productivity perspective for the employees and certainly reputationally in revenue-based customer service, making sure that things are available whenever they're necessary. So, making sure they can perform their jobs, I know it sounds trite, but it's really the most critical thing we can help 'em with. >> Absolutely, 'cause I think, Eileen, one of the things that I've always thought for years is that employee productivity and employee satisfaction is directly tied to customer satisfaction, customer delight, and as you talked about, there's plenty of social media influencers who are happy to share news, good or bad, so that employee productivity is a direct relation on the customer satisfaction, the brand reputation. Jason, what are your thoughts there? >> Well, I think that's exactly right. I think it's, again, being able to continuously have your arms around that and make sure, because if you can't make phone calls or customers can't call in or things aren't working then it is, it's really a revenue impact, but it's also reputation impact, and you're going to remember that company that just didn't have their act together if you will, so I think it's important to, again, invest in this and make sure that no matter what, wherever your end users are or wherever your employees are, you're providing that experience just as if they were in the corporate office, and even when they're in the corporate office, being able to, as Eileen talked about, know ahead of time and proactively when issues happen in these very complex UCaaS and UC&C solutions that are out there now. >> And last question, Eileen for you, I imagine that these solutions are horizontal across every industry, every type of business, every size of business? >> Yeah, it's one of those phenomenon that's really critical is the ability to be ubiquitous in any environment, not being vendor-specific or dependent because now look at it, we shot that stat, three to nine different platforms in one company. If you had to buy three or nine different platforms to resolve problems, that reduces your ability to build workflows, consistent ones and know what you're doing every single time. You'd have to learn nine different platforms. That's not productive and that's certainly not realistic. So yeah, I think that this is really key. You have to be able to look at all of the traffic and be able to resolve the problems, regardless of what they happen to be running on. >> And the great thing is hearing the tools and the capabilities and solutions that NETSCOUT has to help businesses in any industry, at any size be able to identify these issues, resolve them faster and then create some silver linings. Guys, thank you so much for joining me today. Always a pleasure talking to you. This was really interesting to talk about the importance of quality end user experience with communication services for the employee productivity and of course, ultimately consumer customer satisfaction. We appreciate your insights. >> Thank you so much. >> Thank you. >> For Eileen Haggerty and Jason Chaffee, I'm Lisa Martin, you're watching a "CUBE Conversation." (bright music)

Published Date : Apr 8 2022

SUMMARY :

and the impact of COVID and in fact, added to and certainly, UC&C became and make sure that they get and I can imagine individually, and that would be corporate executives. and obviously a risk for the organization. and be able to get to and that I think it's and then you follow that process and then, Eileen, we'll go to you. and the ones that were Jason, that sort of sounds like and the company itself, some of the experiences and joining the sessions and as you talked about, and make sure that no matter what, and be able to resolve the problems, and the capabilities and solutions For Eileen Haggerty and

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
JasonPERSON

0.99+

Eileen HaggertyPERSON

0.99+

EileenPERSON

0.99+

Lisa MartinPERSON

0.99+

Jason ChaffeePERSON

0.99+

UCCORGANIZATION

0.99+

threeQUANTITY

0.99+

50QUANTITY

0.99+

LisaPERSON

0.99+

75%QUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

25%QUANTITY

0.99+

UCaaSORGANIZATION

0.99+

UC&CORGANIZATION

0.99+

two thingsQUANTITY

0.99+

two guestsQUANTITY

0.99+

nineQUANTITY

0.99+

bothQUANTITY

0.99+

one companyQUANTITY

0.99+

todayDATE

0.99+

oneQUANTITY

0.99+

both worldsQUANTITY

0.98+

about 43%QUANTITY

0.97+

nine different platformsQUANTITY

0.97+

about 25%QUANTITY

0.97+

Unified CommunicationsORGANIZATION

0.96+

end of 2021DATE

0.96+

Office 365TITLE

0.96+

COVIDOTHER

0.96+

more than a few minutesQUANTITY

0.96+

one partQUANTITY

0.95+

NETSCOUTORGANIZATION

0.95+

UCORGANIZATION

0.94+

ZoomORGANIZATION

0.92+

WebExORGANIZATION

0.91+

nine different platformsQUANTITY

0.89+

24 hour a dayQUANTITY

0.87+

pandemicEVENT

0.87+

TeamsORGANIZATION

0.86+

COVIDTITLE

0.86+

last two yearsDATE

0.84+

thirdQUANTITY

0.83+

15 minutesQUANTITY

0.82+

about 54% of the respondentsQUANTITY

0.8+

Lester HoltPERSON

0.8+

over half of themQUANTITY

0.76+

at least once a dayQUANTITY

0.76+

once an hourQUANTITY

0.75+

James Slessor, Accenture, and Loren Atherley, Seattle PD | AWS PS Partner Awards 2021


 

>>Mhm. >>What? >>Hello and welcome to today's session of the 2021 AWS Global public sector partner awards for today's award for the award of best partner transformation, best global expansion. I'm your host Natalie ehrlich and now I'm very pleased to introduce you to our next guest. They are James Lester Global Managing Director, public Safety attic censure. And Lauren a thoroughly director of performance analytics and research of the Seattle Police Department. Welcome gentlemen, it's wonderful to have you on the program. >>Thanks for having us >>terrific. Well, we're going to talk a lot about data and a lot about public safety and how, you know, data analytics analytics is making a big impact um in the public safety world. So do tell us I'd like to start with you James. Uh tell us how X enters intelligent public safety platform turns data into a strategic asset. >>Thanks Natalie. Well, the intelligent public safety platform is all about combining different data sets together and taking a platform approach to using data within public safety. What it does is it allows us to bring a whole host of different types of data together in one place, put that through a series of different analytical transactions and then visualize that information back to where however within the public safety environment needs it and really does four key things. One is, it helps with situational awareness, helps the officer understand the situation that they're in and gives them insight to help support and guide them. Secondly it helps enhance investigations. So how do you join those dots? How do you help navigate and speed up complex investigations by better understanding a range of data sets. And thirdly it really helps with force management and understanding the behavior and the activities within the force and how best to use those critical assets of police officers and police staff themselves. And then finally what it does is it really looks at digital evidence management. How do you actually manage data effectively as an asset within the force? So those are the four key things. And certainly with our work at Seattle we've really focused on that force management area. >>Yeah. Thanks for mentioning that. Now let's shift to Lauren tell us how has I PSP you know, really helped your staff make some key contributions towards public safety in the city of Seattle. >>Yeah. Thanks. Uh so you know I think our business intelligence journey started maybe a little in advance of the I. P. S. P. With our partnership With accenture on the data analytics platform. And we've been taking that, say my PSP approach since 2015 as part of our efforts to comply with a federal consent decree. So, you know, I think what what we probably don't understand necessarily is that most police departments build sort of purpose built source systems to onboard data and make good use of them. But that doesn't necessarily mean that that data is readily available. So, um, you know, we've been able to demonstrate compliance with the elements of a settlement agreement for our consent decree, but we've also been able to do a whole host of research projects designed to better understand how police operate in the criminal logical environment, how they perform and um, and really make the best use of those assets as we have them deployed around the city doing law enforcement work. >>Terrific. Now, James, let's shift to you one of the kind of key dilemmas here in the sectors. You know, how can you utilize these um, new technologies in policing, um, and law enforcement while still building trust with the public? >>Absolutely. I mean, I do think that it is critical that public safety agencies are able to use the benefits of new technology, criminals are using technology in all sorts of different ways. Uh and it's important that policing and public safety organizations are able to exploit the advantages that we now see through technology and the ability to understand and analyze data. But equally, it's critical that these implemented in ways that engage and involve the public, that the way in which the analytics and analysis is conducted is open and transparent, so people understand how the data is being used, uh and also that officers themselves are part of the process when these tools are built and developed, so they gain a thorough understanding of how to use them and how to implement them. So, being open and transparent in the way that these platforms are built is absolutely critical. >>Yeah, that's an excellent point because clearly bad actors are already using data. Um, so we might as well use it to help, you know, the good actors out there and help the public. So in your opinion, Lauren, um, you know, what is the next phase of this kind of model? Um, what are you hoping to do next with this kind of technology? >>So as we use this technology as we understand more about it, we're really building data curiosity within the management group at SPD. So really sort of, I would say the first phase of a business intelligence platform in policing is about orienting people to the problem, how many of these things happen at what time and where do they happen around the city? And then beginning to build better questions from the people who are actually doing the business of delivering police service in the city and the future of that, I think is taking that critical feedback and understanding how to respond with really more intelligence services, predictive services that help to kind of cut through that just general descriptive noise and provide insights to the operation in a city that has About 900,000 dispatches in a year. It's difficult to pinpoint which dispatches are of interest to police managers, which crimes which calls may be of interest to the city at large as they manage public safety and risk management. And so are, you know, sort of future development agenda. Our road map, if you will for the next several years is really focused on developing intelligent processes that make use of all of that data, boil it down to what's critically important and help direct people who are most familiar with the operation. To those uh those events, those critical pieces of insight that might be helpful in allowing them to make better management decisions. >>Yeah. And what what are some of the key areas that you find this platform can be effective in terms of uh you know public safety, certain criminal activities James. >>Um I think the PSP has a wide range of applications so certainly looking at how we can bring a whole range of data together that previously has maybe been locked away in individual silos or separate systems. So public safety agencies are really able to understand what they know and the information that they have and make it much easier to access and understand that information. Um I also think it's allowing us to perform levels of analytics and therefore insight on those data sets, which previously public safety agencies have have struggled to do. Um And in the case of Seattle focusing on the uh force management aspect, I think it's helped them understand the activities and behavior of their workforce um in context and in relation to other events and other activities to a much greater depth than they've been able to do previously. >>Terrific. Well, Lauren obviously, you know, this was a really tough year with Covid. What impact did the pandemic have on your operations and some of your more modern policing efforts? >>Oh, I mean, obviously it radically changed the way that we deploy forces in the organization beginning early in March. Uh you know, like most of the world, we all moved home trying to keep up the pace of development and continue to manage the operation. But as that was happening, you know, people are still living their lives out in the world and out in the city. So we pretty quickly found ourselves trying to adapt to that new use of public spaces, trying to identify problems in an environment that really doesn't look anything like the previous couple of years that we were working in, uh and uh, you know, data and and really sort of the availability of technology that helps too identify what's new and what's interesting and rapidly develop those insights and get them available for police managers was critical and helping us identify things like trends in potential exposure events. So being able to identify uh, you know, just exactly how many calls involve the use of personal protective equipment, use that to forecast potential exposure for our workforce. Be able to track exposure reports in the field to be able to determine whether there are staffing concerns that need to be considered. Uh and all of that. Uh you know, we're able to pretty rapidly prototype and deploy dashboards and tools that help folks, especially the command staff, have kind of a global sense for how the operation is functioning as the environment is literally shifting underneath them as uh, you know, uh the use of public spaces is changing and as dispatch procedures are changing as public policy is changing related to, you know, things like jail booking availability and public health and safety policies. The department was able to stay on top of those key metrics and really make sort of the best minute by minute decisions based in the data. And that's really not something that's been available, uh You know, without sort of the ready availability of data at your fingertips and the ability to rapidly prototype things that direct people to what's important. >>Yeah, thank you for that. Now, James, I'd love to hear your comments on that. I mean, has the pandemic altered or, you know, given you any kind of fresh perspective on uh you know, modern policing efforts using these kinds of platforms? >>Well, I think that the pandemic has shown the importance of using data in new and different ways. I mean, one thing the pandemic certainly did was see a shift in in crime types. You know, traditional street based volume crime declined, where we saw increases in cyber and online crime. And therefore the flexibility that police services have had to have in order to shift how they combat changing crime types has meant that they've had to be able to use data as they say, in new and different ways. And think about how can they be more disruptive in their tactics? How can they get new types of insight and really platforms like the intelligent public safety platform help them become much more flexible and much more nimble and that's certainly something that's been required as a result of the pandemic. >>Yeah, that's really great to hear. Um you know, Lauren going to you, I'd love to hear how specifically I PSP was able to help you uh you know, the Seattle Police department as well as statewide inquiries and end investigations. What kind of enhancements were you able to receive from that? >>Uh Well, you know, I mean in terms of investigations, uh the way that Seattle deploys the intelligent public safety platform, our focus is really primarily on deployment of resources that force management, the accountability, piece of things. And so from our perspective, the ability to onboard new data sources quickly uh and make use of that information in a kind of a rapid sort of responsive function was really critical for us but um you know, certainly and I think as as most communities are exploring new ways of approaching community safety, uh the intelligent public safety platform uh for us was really effective in being able to answer those, those questions that are coming up as as people are reforming the way that policing is deployed in their communities, were able to reach out and see just exactly how many hours are spent on one particular function over another, something that perhaps could be available for a co responder model, or take a look at, you know, this sort of natural experiment that we have out in our criminal logical environment as people are using spaces differently. And as we are approaching enforcement policy differently, being able to take a look at what are the effects of perhaps not arresting people for certain types of crime? Do we see some displacement of those effects across different crime types? Do we see an increase in harm in other areas of the operation? Have we seen you know increases in one particular crime type while another one declines? How is the environment responding these rapid changes and what really is a natural experiment occurring out in the world? >>Yeah I mean it's really incredible um Having all that data at our fingertips and really being able to utilize it to have a fuller perspective of what's really happening right? What what do you think James? >>Yeah. I mean I think being able to really utilize different data sets is something that police forces are seeing to become more and more important. Um They're recognizing that becoming increasingly data lead can really help improve their performance. Um And the challenge to date has really been how do we bring those data sets together but not then require police officers to way through reams and reams of data. I mean the volumes of data now that organizations are having to manage is huge. And so really the power of the I. P. S. P. Is being able to filter through all of that data and really deliver actionable insight. So something that the police officer can go and do something with and really make a difference around. Um And that's something that that's absolutely critical. And modern day policing is increasingly having this data driven evidence based approach to help make it far more effective and really focused on the needs of its citizens. >>Yeah and as you mentioned, I mean the algorithms are really driving this you know, um giving us these actionable insights but how can we ensure that they're acting fairly to all the stakeholders James. I'd like you to answer this please. >>Um Absolutely. I mean, trust and confidence within policing is absolutely paramount. Uh and whilst the use of these sorts of tools, I think is critical to helping keep communities in the public safe. It's very important that these tools are deployed in an open transparent way. And part of that is understanding the algorithms, making sure that algorithmic fairness is built in so that these are tested and any sort of bias or unintended consequences are understood and known and factored in to the way in which the tools are both built and used. Um, and then on top of that, I think it's open, it's important that these are open and transparent, that it's clear how and why departments are using these technologies. And it's also critical that the officers using them are trained and understood how to use them and how to use the insights that they're starting to deliver. >>Yeah, and thanks for mentioning that Lauren, what kind of training are you providing your staff at the Seattle police department And you know, how do you see this evolving in the next few years >>with regard to algorithmic fairness, what kind of training along those lines or training >>with the I. P. S. P. And all these other kinds of technologies that you're embracing now to help with your public safety initiatives? >>Well, you know, I think one of the one of the real benefits to becoming an evidence based organization, a truly evidence led organization is that you don't have to train folks uh to use data. What you have to do is leverage data to make it work and be really infused with their everyday operations. So we, you know, we have police officers and we have managers and we have commanders and they've got a very complex set of tasks that they've been trained to work with. It's really sort of our mission to be trained in, how to identify uh you know, the correct UX UI design, how to make sure that the insights that are being directed to those folks are really tailored to the business they're operating. And so to that extent, the analytical staff that we have is really focused on sort of continuous improvement and constant learning about how we can be mindful of things like bias and the algorithms and the various systems that we're deploying uh and also be up to date on the latest and how police operations really are sort of deployed around the city and ways that we can infuse those various management functions or those police service functions with data and analytics that are just naturally working with people's business sense and they're uh really sort of primary function, which is the delivery of police service >>terrific. Well, James lastly with you um just real quick you know, what are your thoughts in terms of being able to extend the power of I. P. S. P. Beyonce Seattle uh in the broader United States? >>Well I mean I think my PSP has huge applicability to any public safety agency in in the US and beyond and we're already seeing other agencies around the world interested in using it and deploying it um Where they basically want to get uh and be able to utilize a wider range of data where they want to be able to drive greater insight into that that data set um Where they want to be confident in deploying open and fair algorithms um to really make a difference. Um And if we to take the the specific example of the U. S. And the work that we've done with Seattle then I think tools like the intention public safety platform have a huge part to play in the wider reimagining of policing within the US in understanding officer and departmental behavior and actually opening up and sharing information with citizens that increased levels of trust and transparency between public safety agencies and the communities and citizens that they serve. >>And you know, on that note, do you think that I PSP is useful in terms of collaboration efforts, you know, with other police departments, perhaps in other states? Um you know or just just as a global national effort. Lauren, do you see that kind of potential in the future? >>Yeah and actually we do that now. So one of the really sort of powerful things about having all of this data at your fingertips and I would say having this kind of awesome responsibility of being the steward of this type of asset for the community. Um and and really sort of for the industry at large is that we're able to take the data and rapidly develop new research projects with researchers around the world. So the Seattle Police Department maintains a network of about I think we're up to about 55 current researchers and institutions. I think we've got about 33, institutions around the world. People really working on real time problems related to the things that matter to our community right now. And having this data available at our fingertips allows us to rapidly develop data sources. We can actually get on a call with one of our researchers uh and build out a table for them to use or start exploring the data in an ad hoc querying layer layer and, you know, making visualizations and helping the researchers form better questions so that when we develop their data, when we deploy it to them, uh they can pretty quickly get in there. It's in the format that they're looking for, They understand it. They can run some tests and determine whether the data that we provided for them actually meets their needs. And if it doesn't, we can develop a new set pretty quickly. I I think that also that research function, that discovery function that were enabled through the use of these data is actually helping to bring together uh the community of law enforcement around this this idea of Collaborative understanding of how policing works around the city, you know, sorry, around the world. So of 18,000 or so law enforcement agencies in the United States, there is broad variability in people's competency in their use of data, but we're finding that agencies that have access to these types of tools or who are starting to develop access to these tools and the competencies to use them are coming together. Uh and beginning to talk about how we can understand sort of cross cultural and cross regional correlations and patterns that we see across our multiple operations. And although, you know, those are varied uh and and range around the country or even around the world, I think that that collaboration on understanding how policing works, what's normal, what's abnormal, what we can do about it is really going to be powerful in the future. >>Yeah, Well, this is really exciting. Yeah. Well, what are your thoughts? >>I was just going to build on the point that Lauren was making there because I think I think that is a really important one. Um you know, when when you look around the world, the challenges that different public safety and policing agencies face are actually dramatically similar um and the ability for policing organizations to come together and think about how they use data, think about how they use data in a fair and transparent way is something we're really starting to see and that ability to share insight to experiment um and really make sure that you're bringing lots of different insight together to further the way in which police forces all over the world can actually help keep their citizens safe and combat what is an increasingly rapidly and evolving threat. Landscape is something that we see tools like the intelligent public safety platform really helping to do and if one police force starts to use it in a certain way in one jurisdiction and has success there, there is definitely the ability to share that insight with others and get this global pool of understanding and knowledge all furthering the level of safety and security that can be delivered to communities in the public. >>Terrific. Well, thank you both so much for your insights has been really fantastic to hear. You know, how these new technologies are really coming to the aid of public safety officials and helping secure the public. That was Lauren a thoroughly director of performance analytics and research at the Seattle police Department and James Schlesser. Global Managing Director, Public Safety at its center. And I'm Natalie early, your host for the cube and that was our session for the AWS Global Public uh, partner Awards. Thank you very much for watching. >>Mm

Published Date : Jun 30 2021

SUMMARY :

and now I'm very pleased to introduce you to our next guest. So do tell us I'd like to start with you James. that they're in and gives them insight to help support and guide them. you know, really helped your staff make some key contributions towards public safety and really make the best use of those assets as we have them deployed You know, how can you utilize these um, new technologies in policing, and the ability to understand and analyze data. Um, so we might as well use it to help, you know, the good actors out there and help the And so are, you know, sort of future development agenda. platform can be effective in terms of uh you know public safety, Um And in the case of Seattle focusing on the uh force management aspect, What impact did the pandemic have on your operations and some of your more modern So being able to identify uh, you know, just exactly how many calls involve the use altered or, you know, given you any kind of fresh perspective on uh you flexibility that police services have had to have in order to shift how they combat changing Um you know, Lauren going to you, I'd love to hear how specifically the ability to onboard new data sources quickly uh and make use of that information in a of the I. P. S. P. Is being able to filter through all of that data and really deliver Yeah and as you mentioned, I mean the algorithms are really driving this you know, um giving And it's also critical that the officers using them are with your public safety initiatives? to be trained in, how to identify uh you know, the correct UX UI Well, James lastly with you um just real quick you know, what are your thoughts in terms agency in in the US and beyond and we're already seeing other agencies And you know, on that note, do you think that I PSP is useful in terms Um and and really sort of for the industry at large is Well, what are your thoughts? and the ability for policing organizations to come together and think about and research at the Seattle police Department and James Schlesser.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
LaurenPERSON

0.99+

JamesPERSON

0.99+

NataliePERSON

0.99+

James SchlesserPERSON

0.99+

United StatesLOCATION

0.99+

USLOCATION

0.99+

SeattleLOCATION

0.99+

Loren AtherleyPERSON

0.99+

18,000QUANTITY

0.99+

Natalie ehrlichPERSON

0.99+

James SlessorPERSON

0.99+

Seattle Police DepartmentORGANIZATION

0.99+

2015DATE

0.99+

one placeQUANTITY

0.99+

United StatesLOCATION

0.99+

one jurisdictionQUANTITY

0.98+

SPDORGANIZATION

0.98+

I. P. S. P.ORGANIZATION

0.98+

bothQUANTITY

0.98+

pandemicEVENT

0.98+

OneQUANTITY

0.98+

About 900,000 dispatchesQUANTITY

0.98+

James LesterPERSON

0.97+

Seattle police DepartmentORGANIZATION

0.96+

first phaseQUANTITY

0.96+

Seattle Police departmentORGANIZATION

0.96+

AccentureORGANIZATION

0.95+

oneQUANTITY

0.95+

SecondlyQUANTITY

0.94+

about 55 current researchersQUANTITY

0.94+

todayDATE

0.94+

four key thingsQUANTITY

0.92+

CovidPERSON

0.88+

2021EVENT

0.88+

thirdlyQUANTITY

0.88+

one particular crime typeQUANTITY

0.87+

AWS Global Public uh, partner AwardsEVENT

0.86+

early in MarchDATE

0.85+

U. S.LOCATION

0.83+

AWSEVENT

0.81+

33QUANTITY

0.81+

PS Partner Awards 2021EVENT

0.77+

a yearQUANTITY

0.77+

PSPCOMMERCIAL_ITEM

0.77+

Global public sector partner awardsEVENT

0.73+

yearsDATE

0.72+

one policeQUANTITY

0.69+

nextDATE

0.65+

aboutQUANTITY

0.64+

upQUANTITY

0.62+

AWSORGANIZATION

0.6+

I. P.ORGANIZATION

0.59+

GlobalORGANIZATION

0.56+

BeyoncePERSON

0.54+

next several yearsDATE

0.54+

departmentORGANIZATION

0.47+

GlobalPERSON

0.38+

IO TAHOE EPISODE 4 DATA GOVERNANCE V2


 

>>from around the globe. It's the Cube presenting adaptive data governance brought to you by Iota Ho. >>And we're back with the data automation. Siri's. In this episode, we're gonna learn more about what I owe Tahoe is doing in the field of adaptive data governance how it can help achieve business outcomes and mitigate data security risks. I'm Lisa Martin, and I'm joined by a J. Bihar on the CEO of Iot Tahoe and Lester Waters, the CEO of Bio Tahoe. Gentlemen, it's great to have you on the program. >>Thank you. Lisa is good to be back. >>Great. Staley's >>likewise very socially distant. Of course as we are. Listen, we're gonna start with you. What's going on? And I am Tahoe. What's name? Well, >>I've been with Iot Tahoe for a little over the year, and one thing I've learned is every customer needs air just a bit different. So we've been working on our next major release of the I O. Tahoe product. But to really try to address these customer concerns because, you know, we wanna we wanna be flexible enough in order to come in and not just profile the date and not just understand data quality and lineage, but also to address the unique needs of each and every customer that we have. And so that required a platform rewrite of our product so that we could, uh, extend the product without building a new version of the product. We wanted to be able to have plausible modules. We also focused a lot on performance. That's very important with the bulk of data that we deal with that we're able to pass through that data in a single pass and do the analytics that are needed, whether it's, uh, lineage, data quality or just identifying the underlying data. And we're incorporating all that we've learned. We're tuning up our machine learning we're analyzing on MAWR dimensions than we've ever done before. We're able to do data quality without doing a Nen initial rejects for, for example, just out of the box. So I think it's all of these things were coming together to form our next version of our product. We're really excited by it, >>So it's exciting a J from the CEO's level. What's going on? >>Wow, I think just building on that. But let's still just mentioned there. It's were growing pretty quickly with our partners. And today, here with Oracle are excited. Thio explain how that shaping up lots of collaboration already with Oracle in government, in insurance, on in banking and we're excited because we get to have an impact. It's real satisfying to see how we're able. Thio. Help businesses transform, Redefine what's possible with their data on bond. Having I recall there is a partner, uh, to lean in with is definitely helping. >>Excellent. We're gonna dig into that a little bit later. Let's let's go back over to you. Explain adaptive data governance. Help us understand that >>really adaptive data governance is about achieving business outcomes through automation. It's really also about establishing a data driven culture and pushing what's traditionally managed in I t out to the business. And to do that, you've got to you've got Thio. You've got to enable an environment where people can actually access and look at the information about the data, not necessarily access the underlying data because we've got privacy concerns itself. But they need to understand what kind of data they have, what shape it's in what's dependent on it upstream and downstream, and so that they could make their educated decisions on on what they need to do to achieve those business outcomes. >>Ah, >>lot of a lot of frameworks these days are hardwired, so you can set up a set of business rules, and that set of business rules works for a very specific database and a specific schema. But imagine a world where you could just >>say, you >>know, the start date of alone must always be before the end date of alone and having that generic rule, regardless of the underlying database and applying it even when a new database comes online and having those rules applied. That's what adaptive data governance about I like to think of. It is the intersection of three circles, Really. It's the technical metadata coming together with policies and rules and coming together with the business ontology ease that are that are unique to that particular business. And this all of this. Bringing this all together allows you to enable rapid change in your environment. So it's a mouthful, adaptive data governance. But that's what it kind of comes down to. >>So, Angie, help me understand this. Is this book enterprise companies are doing now? Are they not quite there yet. >>Well, you know, Lisa, I think every organization is is going at its pace. But, you know, markets are changing the economy and the speed at which, um, some of the changes in the economy happening is is compelling more businesses to look at being more digital in how they serve their own customers. Eh? So what we're seeing is a number of trends here from heads of data Chief Data Officers, CEO, stepping back from, ah, one size fits all approach because they've tried that before, and it it just hasn't worked. They've spent millions of dollars on I T programs China Dr Value from that data on Bennett. And they've ended up with large teams of manual processing around data to try and hardwire these policies to fit with the context and each line of business and on that hasn't worked. So the trends that we're seeing emerge really relate. Thio, How do I There's a chief data officer as a CEO. Inject more automation into a lot of these common tax. Andi, you know, we've been able toc that impact. I think the news here is you know, if you're trying to create a knowledge graph a data catalog or Ah, business glossary. And you're trying to do that manually will stop you. You don't have to do that manually anymore. I think best example I can give is Lester and I We we like Chinese food and Japanese food on. If you were sitting there with your chopsticks, you wouldn't eat the bowl of rice with the chopsticks, one grain at a time. What you'd want to do is to find a more productive way to to enjoy that meal before it gets cold. Andi, that's similar to how we're able to help the organizations to digest their data is to get through it faster, enjoy the benefits of putting that data to work. >>And if it was me eating that food with you guys, I would be not using chopsticks. I would be using a fork and probably a spoon. So eso Lester, how then does iota who go about doing this and enabling customers to achieve this? >>Let me, uh, let me show you a little story have here. So if you take a look at the challenges the most customers have, they're very similar, but every customers on a different data journey, so but it all starts with what data do I have? What questions or what shape is that data in? Uh, how is it structured? What's dependent on it? Upstream and downstream. Um, what insights can I derive from that data? And how can I answer all of those questions automatically? So if you look at the challenges for these data professionals, you know, they're either on a journey to the cloud. Maybe they're doing a migration oracle. Maybe they're doing some data governance changes on bits about enabling this. So if you look at these challenges and I'm gonna take you through a >>story here, E, >>I want to introduce Amanda. Man does not live like, uh, anyone in any large organization. She's looking around and she just sees stacks of data. I mean, different databases, the one she knows about, the one she doesn't know about what should know about various different kinds of databases. And a man is just tasking with understanding all of this so that they can embark on her data journey program. So So a man who goes through and she's great. I've got some handy tools. I can start looking at these databases and getting an idea of what we've got. Well, as she digs into the databases, she starts to see that not everything is as clear as she might have hoped it would be. You know, property names or column names, or have ambiguous names like Attribute one and attribute to or maybe date one and date to s Oh, man is starting to struggle, even though she's get tools to visualize. And look what look at these databases. She still No, she's got a long road ahead. And with 2000 databases in her large enterprise, yes, it's gonna be a long turkey but Amanda Smart. So she pulls out her trusty spreadsheet to track all of her findings on what she doesn't know about. She raises a ticket or maybe tries to track down the owner to find what the data means. And she's tracking all this information. Clearly, this doesn't scale that well for Amanda, you know? So maybe organization will get 10 Amanda's to sort of divide and conquer that work. But even that doesn't work that well because they're still ambiguities in the data with Iota ho. What we do is we actually profile the underlying data. By looking at the underlying data, we can quickly see that attribute. One looks very much like a U. S. Social Security number and attribute to looks like a I c D 10 medical code. And we do this by using anthologies and dictionaries and algorithms to help identify the underlying data and then tag it. Key Thio Doing, uh, this automation is really being able to normalize things across different databases, so that where there's differences in column names, I know that in fact, they contain contain the same data. And by going through this exercise with a Tahoe, not only can we identify the data, but we also could gain insights about the data. So, for example, we can see that 97% of that time that column named Attribute one that's got us Social Security numbers has something that looks like a Social Security number. But 3% of the time, it doesn't quite look right. Maybe there's a dash missing. Maybe there's a digit dropped. Or maybe there's even characters embedded in it. So there may be that may be indicative of a data quality issues, so we try to find those kind of things going a step further. We also try to identify data quality relationships. So, for example, we have two columns, one date, one date to through Ah, observation. We can see that date 1 99% of the time is less than date, too. 1% of the time. It's not probably indicative of a data quality issue, but going a step further, we can also build a business rule that says Day one is less than date to. And so then when it pops up again, we can quickly identify and re mediate that problem. So these are the kinds of things that we could do with with iota going even a step further. You could take your your favorite data science solution production ISAT and incorporated into our next version a zey what we call a worker process to do your own bespoke analytics. >>We spoke analytics. Excellent, Lester. Thank you. So a J talk us through some examples of where you're putting this to use. And also what is some of the feedback from >>some customers? But I think it helped do this Bring it to life a little bit. Lisa is just to talk through a case study way. Pull something together. I know it's available for download, but in ah, well known telecommunications media company, they had a lot of the issues that lasted. You spoke about lots of teams of Amanda's, um, super bright data practitioners, um, on baby looking to to get more productivity out of their day on, deliver a good result for their own customers for cell phone subscribers, Um, on broadband users. So you know that some of the examples that we can see here is how we went about auto generating a lot of that understanding off that data within hours. So Amanda had her data catalog populated automatically. A business class three built up on it. Really? Then start to see. Okay, where do I want Thio? Apply some policies to the data to to set in place some controls where they want to adapt, how different lines of business, maybe tax versus customer operations have different access or permissions to that data on What we've been able to do there is, is to build up that picture to see how does data move across the entire organization across the state. Andi on monitor that overtime for improvement, so have taken it from being a reactive. Let's do something Thio. Fix something. Thio, Now more proactive. We can see what's happening with our data. Who's using it? Who's accessing it, how it's being used, how it's being combined. Um, on from there. Taking a proactive approach is a real smart use of of the talents in in that telco organization Onda folks that worked there with data. >>Okay, Jason, dig into that a little bit deeper. And one of the things I was thinking when you were talking through some of those outcomes that you're helping customers achieve is our ally. How do customers measure are? Why? What are they seeing with iota host >>solution? Yeah, right now that the big ticket item is time to value on. And I think in data, a lot of the upfront investment cause quite expensive. They have been today with a lot of the larger vendors and technologies. So what a CEO and economic bio really needs to be certain of is how quickly can I get that are away. I think we've got something we can show. Just pull up a before and after, and it really comes down to hours, days and weeks. Um, where we've been able Thio have that impact on in this playbook that we pulled together before and after picture really shows. You know, those savings that committed a bit through providing data into some actionable form within hours and days to to drive agility, but at the same time being out and forced the controls to protect the use of that data who has access to it. So these are the number one thing I'd have to say. It's time on. We can see that on the the graphic that we've just pulled up here. >>We talk about achieving adaptive data governance. Lester, you guys talk about automation. You talk about machine learning. How are you seeing those technologies being a facilitator of organizations adopting adaptive data governance? Well, >>Azaz, we see Mitt Emmanuel day. The days of manual effort are so I think you know this >>is a >>multi step process. But the very first step is understanding what you have in normalizing that across your data estate. So you couple this with the ontology, that air unique to your business. There is no algorithms, and you basically go across and you identify and tag tag that data that allows for the next steps toe happen. So now I can write business rules not in terms of columns named columns, but I could write him in terms of the tags being able to automate. That is a huge time saver and the fact that we can suggest that as a rule, rather than waiting for a person to come along and say, Oh, wow. Okay, I need this rule. I need this will thes air steps that increased that are, I should say, decrease that time to value that A. J talked about and then, lastly, a couple of machine learning because even with even with great automation and being able to profile all of your data and getting a good understanding, that brings you to a certain point. But there's still ambiguities in the data. So, for example, I might have to columns date one and date to. I may have even observed the date. One should be less than day two, but I don't really know what date one and date to our other than a date. So this is where it comes in, and I might ask the user said, >>Can >>you help me identify what date? One and date You are in this in this table. Turns out they're a start date and an end date for alone That gets remembered, cycled into the machine learning. So if I start to see this pattern of date one day to elsewhere, I'm going to say, Is it start dating and date? And these Bringing all these things together with this all this automation is really what's key to enabling this This'll data governance. Yeah, >>great. Thanks. Lester and a j wanna wrap things up with something that you mentioned in the beginning about what you guys were doing with Oracle. Take us out by telling us what you're doing there. How are you guys working together? >>Yeah, I think those of us who worked in i t for many years we've We've learned Thio trust articles technology that they're shifting now to ah, hybrid on Prohm Cloud Generation to platform, which is exciting. Andi on their existing customers and new customers moving to article on a journey. So? So Oracle came to us and said, you know, we can see how quickly you're able to help us change mindsets Ondas mindsets are locked in a way of thinking around operating models of I t. That there may be no agile and what siloed on day wanting to break free of that and adopt a more agile A p I at driven approach. A lot of the work that we're doing with our recall no is around, uh, accelerating what customers conduce with understanding their data and to build digital APS by identifying the the underlying data that has value. Onda at the time were able to do that in in in hours, days and weeks. Rather many months. Is opening up the eyes to Chief Data Officers CEO to say, Well, maybe we can do this whole digital transformation this year. Maybe we can bring that forward and and transform who we are as a company on that's driving innovation, which we're excited about it. I know Oracle, a keen Thio to drive through and >>helping businesses transformed digitally is so incredibly important in this time as we look Thio things changing in 2021 a. J. Lester thank you so much for joining me on this segment explaining adaptive data governance, how organizations can use it benefit from it and achieve our Oi. Thanks so much, guys. >>Thank you. Thanks again, Lisa. >>In a moment, we'll look a adaptive data governance in banking. This is the Cube, your global leader in high tech coverage. >>Innovation, impact influence. Welcome to the Cube. Disruptors. Developers and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on the Cube, your global leader in high tech digital coverage. >>Our next segment here is an interesting panel you're gonna hear from three gentlemen about adaptive data. Governments want to talk a lot about that. Please welcome Yusuf Khan, the global director of data services for Iot Tahoe. We also have Santiago Castor, the chief data officer at the First Bank of Nigeria, and good John Vander Wal, Oracle's senior manager of digital transformation and industries. Gentlemen, it's great to have you joining us in this in this panel. Great >>to be >>tried for me. >>Alright, Santiago, we're going to start with you. Can you talk to the audience a little bit about the first Bank of Nigeria and its scale? This is beyond Nigeria. Talk to us about that. >>Yes, eso First Bank of Nigeria was created 125 years ago. One of the oldest ignored the old in Africa because of the history he grew everywhere in the region on beyond the region. I am calling based in London, where it's kind of the headquarters and it really promotes trade, finance, institutional banking, corporate banking, private banking around the world in particular, in relationship to Africa. We are also in Asia in in the Middle East. >>So, Sanjay, go talk to me about what adaptive data governance means to you. And how does it help the first Bank of Nigeria to be able to innovate faster with the data that you have? >>Yes, I like that concept off adaptive data governor, because it's kind of Ah, I would say an approach that can really happen today with the new technologies before it was much more difficult to implement. So just to give you a little bit of context, I I used to work in consulting for 16, 17 years before joining the president of Nigeria, and I saw many organizations trying to apply different type of approaches in the governance on by the beginning early days was really kind of a year. A Chicago A. A top down approach where data governance was seeing as implement a set of rules, policies and procedures. But really, from the top down on is important. It's important to have the battle off your sea level of your of your director. Whatever I saw, just the way it fails, you really need to have a complimentary approach. You can say bottom are actually as a CEO are really trying to decentralize the governor's. Really, Instead of imposing a framework that some people in the business don't understand or don't care about it, it really needs to come from them. So what I'm trying to say is that data basically support business objectives on what you need to do is every business area needs information on the detector decisions toe actually be able to be more efficient or create value etcetera. Now, depending on the business questions they have to solve, they will need certain data set. So they need actually to be ableto have data quality for their own. For us now, when they understand that they become the stores naturally on their own data sets. And that is where my bottom line is meeting my top down. You can guide them from the top, but they need themselves to be also empower and be actually, in a way flexible to adapt the different questions that they have in orderto be able to respond to the business needs. Now I cannot impose at the finish for everyone. I need them to adapt and to bring their answers toe their own business questions. That is adaptive data governor and all That is possible because we have. And I was saying at the very beginning just to finalize the point, we have new technologies that allow you to do this method data classifications, uh, in a very sophisticated way that you can actually create analitico of your metadata. You can understand your different data sources in order to be able to create those classifications like nationalities, a way of classifying your customers, your products, etcetera. >>So one of the things that you just said Santa kind of struck me to enable the users to be adaptive. They probably don't want to be logging in support ticket. So how do you support that sort of self service to meet the demand of the users so that they can be adaptive. >>More and more business users wants autonomy, and they want to basically be ableto grab the data and answer their own question. Now when you have, that is great, because then you have demand of businesses asking for data. They're asking for the insight. Eso How do you actually support that? I would say there is a changing culture that is happening more and more. I would say even the current pandemic has helped a lot into that because you have had, in a way, off course, technology is one of the biggest winners without technology. We couldn't have been working remotely without these technologies where people can actually looking from their homes and still have a market data marketplaces where they self serve their their information. But even beyond that data is a big winner. Data because the pandemic has shown us that crisis happened, that we cannot predict everything and that we are actually facing a new kind of situation out of our comfort zone, where we need to explore that we need to adapt and we need to be flexible. How do we do that with data. Every single company either saw the revenue going down or the revenue going very up For those companies that are very digital already. Now it changed the reality, so they needed to adapt. But for that they needed information. In order to think on innovate, try toe, create responses So that type of, uh, self service off data Haider for data in order to be able to understand what's happening when the prospect is changing is something that is becoming more, uh, the topic today because off the condemning because of the new abilities, the technologies that allow that and then you then are allowed to basically help your data. Citizens that call them in the organization people that no other business and can actually start playing and an answer their own questions. Eso so these technologies that gives more accessibility to the data that is some cataloging so they can understand where to go or what to find lineage and relationships. All this is is basically the new type of platforms and tools that allow you to create what are called a data marketplace. I think these new tools are really strong because they are now allowing for people that are not technology or I t people to be able to play with data because it comes in the digital world There. Used to a given example without your who You have a very interesting search functionality. Where if you want to find your data you want to sell, Sir, you go there in that search and you actually go on book for your data. Everybody knows how to search in Google, everybody's searching Internet. So this is part of the data culture, the digital culture. They know how to use those schools. Now, similarly, that data marketplace is, uh, in you can, for example, see which data sources they're mostly used >>and enabling that speed that we're all demanding today during these unprecedented times. Goodwin, I wanted to go to you as we talk about in the spirit of evolution, technology is changing. Talk to us a little bit about Oracle Digital. What are you guys doing there? >>Yeah, Thank you. Um, well, Oracle Digital is a business unit that Oracle EMEA on. We focus on emerging countries as well as low and enterprises in the mid market, in more developed countries and four years ago. This started with the idea to engage digital with our customers. Fear Central helps across EMEA. That means engaging with video, having conference calls, having a wall, a green wall where we stand in front and engage with our customers. No one at that time could have foreseen how this is the situation today, and this helps us to engage with our customers in the way we were already doing and then about my team. The focus of my team is to have early stage conversations with our with our customers on digital transformation and innovation. And we also have a team off industry experts who engaged with our customers and share expertise across EMEA, and we inspire our customers. The outcome of these conversations for Oracle is a deep understanding of our customer needs, which is very important so we can help the customer and for the customer means that we will help them with our technology and our resource is to achieve their goals. >>It's all about outcomes, right? Good Ron. So in terms of automation, what are some of the things Oracle's doing there to help your clients leverage automation to improve agility? So that they can innovate faster, which in these interesting times it's demanded. >>Yeah, thank you. Well, traditionally, Oracle is known for their databases, which have bean innovated year over year. So here's the first lunch on the latest innovation is the autonomous database and autonomous data warehouse. For our customers, this means a reduction in operational costs by 90% with a multi medal converts, database and machine learning based automation for full life cycle management. Our databases self driving. This means we automate database provisioning, tuning and scaling. The database is self securing. This means ultimate data protection and security, and it's self repairing the automates failure, detection fail over and repair. And then the question is for our customers, What does it mean? It means they can focus on their on their business instead off maintaining their infrastructure and their operations. >>That's absolutely critical use if I want to go over to you now. Some of the things that we've talked about, just the massive progression and technology, the evolution of that. But we know that whether we're talking about beta management or digital transformation, a one size fits all approach doesn't work to address the challenges that the business has, um that the i t folks have, as you're looking through the industry with what Santiago told us about first Bank of Nigeria. What are some of the changes that you're seeing that I owe Tahoe seeing throughout the industry? >>Uh, well, Lisa, I think the first way I'd characterize it is to say, the traditional kind of top down approach to data where you have almost a data Policeman who tells you what you can and can't do, just doesn't work anymore. It's too slow. It's too resource intensive. Uh, data management data, governments, digital transformation itself. It has to be collaborative on. There has to be in a personalization to data users. Um, in the environment we find ourselves in. Now, it has to be about enabling self service as well. Um, a one size fits all model when it comes to those things around. Data doesn't work. As Santiago was saying, it needs to be adapted toe how the data is used. Andi, who is using it on in order to do this cos enterprises organizations really need to know their data. They need to understand what data they hold, where it is on what the sensitivity of it is they can then any more agile way apply appropriate controls on access so that people themselves are and groups within businesses are our job and could innovate. Otherwise, everything grinds to a halt, and you risk falling behind your competitors. >>Yeah, that one size fits all term just doesn't apply when you're talking about adaptive and agility. So we heard from Santiago about some of the impact that they're making with First Bank of Nigeria. Used to talk to us about some of the business outcomes that you're seeing other customers make leveraging automation that they could not do >>before it's it's automatically being able to classify terabytes, terabytes of data or even petabytes of data across different sources to find duplicates, which you can then re mediate on. Deletes now, with the capabilities that iota offers on the Oracle offers, you can do things not just where the five times or 10 times improvement, but it actually enables you to do projects for Stop that otherwise would fail or you would just not be able to dio I mean, uh, classifying multi terrible and multi petabytes states across different sources, formats very large volumes of data in many scenarios. You just can't do that manually. I mean, we've worked with government departments on the issues there is expect are the result of fragmented data. There's a lot of different sources. There's lot of different formats and without these newer technologies to address it with automation on machine learning, the project isn't durable. But now it is on that that could lead to a revolution in some of these businesses organizations >>to enable that revolution that there's got to be the right cultural mindset. And one of the when Santiago was talking about folks really kind of adapted that. The thing I always call that getting comfortably uncomfortable. But that's hard for organizations to. The technology is here to enable that. But well, you're talking with customers use. How do you help them build the trust in the confidence that the new technologies and a new approaches can deliver what they need? How do you help drive the kind of a tech in the culture? >>It's really good question is because it can be quite scary. I think the first thing we'd start with is to say, Look, the technology is here with businesses like I Tahoe. Unlike Oracle, it's already arrived. What you need to be comfortable doing is experimenting being agile around it, Andi trying new ways of doing things. Uh, if you don't wanna get less behind that Santiago on the team that fbn are a great example off embracing it, testing it on a small scale on, then scaling up a Toyota, we offer what we call a data health check, which can actually be done very quickly in a matter of a few weeks. So we'll work with a customer. Picky use case, install the application, uh, analyzed data. Drive out Cem Cem quick winds. So we worked in the last few weeks of a large entity energy supplier, and in about 20 days, we were able to give them an accurate understanding of their critical data. Elements apply. Helping apply data protection policies. Minimize copies of the data on work out what data they needed to delete to reduce their infrastructure. Spend eso. It's about experimenting on that small scale, being agile on, then scaling up in a kind of very modern way. >>Great advice. Uh, Santiago, I'd like to go back to Is we kind of look at again that that topic of culture and the need to get that mindset there to facilitate these rapid changes, I want to understand kind of last question for you about how you're doing that from a digital transformation perspective. We know everything is accelerating in 2020. So how are you building resilience into your data architecture and also driving that cultural change that can help everyone in this shift to remote working and a lot of the the digital challenges and changes that we're all going through? >>The new technologies allowed us to discover the dating anyway. Toe flawed and see very quickly Information toe. Have new models off over in the data on giving autonomy to our different data units. Now, from that autonomy, they can then compose an innovator own ways. So for me now, we're talking about resilience because in a way, autonomy and flexibility in a organization in a data structure with platform gives you resilience. The organizations and the business units that I have experienced in the pandemic are working well. Are those that actually because they're not physically present during more in the office, you need to give them their autonomy and let them actually engaged on their own side that do their own job and trust them in a way on as you give them, that they start innovating and they start having a really interesting ideas. So autonomy and flexibility. I think this is a key component off the new infrastructure. But even the new reality that on then it show us that, yes, we used to be very kind off structure, policies, procedures as very important. But now we learn flexibility and adaptability of the same side. Now, when you have that a key, other components of resiliency speed, because people want, you know, to access the data and access it fast and on the site fast, especially changes are changing so quickly nowadays that you need to be ableto do you know, interact. Reiterate with your information to answer your questions. Pretty, um, so technology that allows you toe be flexible iterating on in a very fast job way continue will allow you toe actually be resilient in that way, because you are flexible, you adapt your job and you continue answering questions as they come without having everything, setting a structure that is too hard. We also are a partner off Oracle and Oracle. Embodies is great. They have embedded within the transactional system many algorithms that are allowing us to calculate as the transactions happened. What happened there is that when our customers engaged with algorithms and again without your powers, well, the machine learning that is there for for speeding the automation of how you find your data allows you to create a new alliance with the machine. The machine is their toe, actually, in a way to your best friend to actually have more volume of data calculated faster. In a way, it's cover more variety. I mean, we couldn't hope without being connected to this algorithm on >>that engagement is absolutely critical. Santiago. Thank you for sharing that. I do wanna rap really quickly. Good On one last question for you, Santiago talked about Oracle. You've talked about a little bit. As we look at digital resilience, talk to us a little bit in the last minute about the evolution of Oracle. What you guys were doing there to help your customers get the resilience that they have toe have to be not just survive but thrive. >>Yeah. Oracle has a cloud offering for infrastructure, database, platform service and a complete solutions offered a South on Daz. As Santiago also mentioned, We are using AI across our entire portfolio and by this will help our customers to focus on their business innovation and capitalize on data by enabling new business models. Um, and Oracle has a global conference with our cloud regions. It's massively investing and innovating and expanding their clouds. And by offering clouds as public cloud in our data centers and also as private cloud with clouded customer, we can meet every sovereignty and security requirements. And in this way we help people to see data in new ways. We discover insights and unlock endless possibilities. And and maybe 11 of my takeaways is if I If I speak with customers, I always tell them you better start collecting your data. Now we enable this partners like Iota help us as well. If you collect your data now, you are ready for tomorrow. You can never collect your data backwards, So that is my take away for today. >>You can't collect your data backwards. Excellently, John. Gentlemen, thank you for sharing all of your insights. Very informative conversation in a moment, we'll address the question. Do you know your data? >>Are you interested in test driving the iota Ho platform kick Start the benefits of data automation for your business through the Iota Ho Data Health check program. Ah, flexible, scalable sandbox environment on the cloud of your choice with set up service and support provided by Iota ho. Look time with a data engineer to learn more and see Io Tahoe in action from around the globe. It's the Cube presenting adaptive data governance brought to you by Iota Ho. >>In this next segment, we're gonna be talking to you about getting to know your data. And specifically you're gonna hear from two folks at Io Tahoe. We've got enterprise account execs to be to Davis here, as well as Enterprise Data engineer Patrick Simon. They're gonna be sharing insights and tips and tricks for how you could get to know your data and quickly on. We also want to encourage you to engage with the media and Patrick, use the chat feature to the right, send comments, questions or feedback so you can participate. All right, Patrick Savita, take it away. Alright. >>Thankfully saw great to be here as Lisa mentioned guys, I'm the enterprise account executive here in Ohio. Tahoe you Pat? >>Yeah. Hey, everyone so great to be here. I said my name is Patrick Samit. I'm the enterprise data engineer here in Ohio Tahoe. And we're so excited to be here and talk about this topic as one thing we're really trying to perpetuate is that data is everyone's business. >>So, guys, what patent I got? I've actually had multiple discussions with clients from different organizations with different roles. So we spoke with both your technical and your non technical audience. So while they were interested in different aspects of our platform, we found that what they had in common was they wanted to make data easy to understand and usable. So that comes back. The pats point off to being everybody's business because no matter your role, we're all dependent on data. So what Pan I wanted to do today was wanted to walk you guys through some of those client questions, slash pain points that we're hearing from different industries and different rules and demo how our platform here, like Tahoe, is used for automating Dozier related tasks. So with that said are you ready for the first one, Pat? >>Yeah, Let's do it. >>Great. So I'm gonna put my technical hat on for this one. So I'm a data practitioner. I just started my job. ABC Bank. I have, like, over 100 different data sources. So I have data kept in Data Lakes, legacy data, sources, even the cloud. So my issue is I don't know what those data sources hold. I don't know what data sensitive, and I don't even understand how that data is connected. So how can I saw who help? >>Yeah, I think that's a very common experience many are facing and definitely something I've encountered in my past. Typically, the first step is to catalog the data and then start mapping the relationships between your various data stores. Now, more often than not, this has tackled through numerous meetings and a combination of excel and something similar to video which are too great tools in their own part. But they're very difficult to maintain. Just due to the rate that we are creating data in the modern world. It starts to beg for an idea that can scale with your business needs. And this is where a platform like Io Tahoe becomes so appealing, you can see here visualization of the data relationships created by the I. O. Tahoe service. Now, what is fantastic about this is it's not only laid out in a very human and digestible format in the same action of creating this view, the data catalog was constructed. >>Um so is the data catalog automatically populated? Correct. Okay, so So what I'm using Iota hope at what I'm getting is this complete, unified automated platform without the added cost? Of course. >>Exactly. And that's at the heart of Iota Ho. A great feature with that data catalog is that Iota Ho will also profile your data as it creates the catalog, assigning some meaning to those pesky column underscore ones and custom variable underscore tents. They're always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exists. Uh, personally, I'm a big fan of this view, as it really just helps the i b naturally John to these focal points that coincide with these key columns following that train of thought, Let's examine the customer I D column that seems to be at the center of a lot of these relationships. We can see that it's a fairly important column as it's maintaining the relationship between at least three other tables. >>Now you >>notice all the connectors are in this blue color. This means that their system defined relationships. But I hope Tahoe goes that extra mile and actually creates thes orange colored connectors as well. These air ones that are machine learning algorithms have predicted to be relationships on. You can leverage to try and make new and powerful relationships within your data. >>Eso So this is really cool, and I can see how this could be leverage quickly now. What if I added new data sources or your multiple data sources and need toe identify what data sensitive can iota who detect that? >>Yeah, definitely. Within the hotel platform. There, already over 300 pre defined policies such as hip for C, C, P. A and the like one can choose which of these policies to run against their data along for flexibility and efficiency and running the policies that affect organization. >>Okay, so so 300 is an exceptional number. I'll give you that. But what about internal policies that apply to my organization? Is there any ability for me to write custom policies? >>Yeah, that's no issue. And it's something that clients leverage fairly often to utilize this function when simply has to write a rejects that our team has helped many deploy. After that, the custom policy is stored for future use to profile sensitive data. One then selects the data sources they're interested in and select the policies that meet your particular needs. The interface will automatically take your data according to the policies of detects, after which you can review the discoveries confirming or rejecting the tagging. All of these insights are easily exported through the interface. Someone can work these into the action items within your project management systems, and I think this lends to the collaboration as a team can work through the discovery simultaneously, and as each item is confirmed or rejected, they can see it ni instantaneously. All this translates to a confidence that with iota hope, you can be sure you're in compliance. >>So I'm glad you mentioned compliance because that's extremely important to my organization. So what you're saying when I use the eye a Tahoe automated platform, we'd be 90% more compliant that before were other than if you were going to be using a human. >>Yeah, definitely the collaboration and documentation that the Iot Tahoe interface lends itself to really help you build that confidence that your compliance is sound. >>So we're planning a migration. Andi, I have a set of reports I need to migrate. But what I need to know is, uh well, what what data sources? Those report those reports are dependent on. And what's feeding those tables? >>Yeah, it's a fantastic questions to be toe identifying critical data elements, and the interdependencies within the various databases could be a time consuming but vital process and the migration initiative. Luckily, Iota Ho does have an answer, and again, it's presented in a very visual format. >>Eso So what I'm looking at here is my entire day landscape. >>Yes, exactly. >>Let's say I add another data source. I can still see that unified 3 60 view. >>Yeah, One future that is particularly helpful is the ability to add data sources after the data lineage. Discovery has finished alone for the flexibility and scope necessary for any data migration project. If you only need need to select a few databases or your entirety, this service will provide the answers. You're looking for things. Visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections air driven by both system defined flows as well as those predicted by our algorithms, the confidence of which, uh, can actually be customized to make sure that they're meeting the needs of the initiative that you have in place. This also provides tabular output in case you needed for your own internal documentation or for your action items, which we can see right here. Uh, in this interface, you can actually also confirm or deny the pair rejection the pair directions, allowing to make sure that the data is as accurate as possible. Does that help with your data lineage needs? >>Definitely. So So, Pat, My next big question here is So now I know a little bit about my data. How do I know I can trust >>it? So >>what I'm interested in knowing, really is is it in a fit state for me to use it? Is it accurate? Does it conform to the right format? >>Yeah, that's a great question. And I think that is a pain point felt across the board, be it by data practitioners or data consumers alike. Another service that I owe Tahoe provides is the ability to write custom data quality rules and understand how well the data pertains to these rules. This dashboard gives a unified view of the strength of these rules, and your dad is overall quality. >>Okay, so Pat s o on on the accuracy scores there. So if my marketing team needs to run, a campaign can read dependent those accuracy scores to know what what tables have quality data to use for our marketing campaign. >>Yeah, this view would allow you to understand your overall accuracy as well as dive into the minutia to see which data elements are of the highest quality. So for that marketing campaign, if you need everything in a strong form, you'll be able to see very quickly with these high level numbers. But if you're only dependent on a few columns to get that information out the door, you can find that within this view, eso >>you >>no longer have to rely on reports about reports, but instead just come to this one platform to help drive conversations between stakeholders and data practitioners. >>So I get now the value of IATA who brings by automatically capturing all those technical metadata from sources. But how do we match that with the business glossary? >>Yeah, within the same data quality service that we just reviewed, one can actually add business rules detailing the definitions and the business domains that these fall into. What's more is that the data quality rules were just looking at can then be tied into these definitions. Allowing insight into the strength of these business rules is this service that empowers stakeholders across the business to be involved with the data life cycle and take ownership over the rules that fall within their domain. >>Okay, >>so those custom rules can I apply that across data sources? >>Yeah, you could bring in as many data sources as you need, so long as you could tie them to that unified definition. >>Okay, great. Thanks so much bad. And we just want to quickly say to everyone working in data, we understand your pain, so please feel free to reach out to us. we are Website the chapel. Oh, Arlington. And let's get a conversation started on how iota Who can help you guys automate all those manual task to help save you time and money. Thank you. Thank >>you. Your Honor, >>if I could ask you one quick question, how do you advise customers? You just walk in this great example this banking example that you instantly to talk through. How do you advise customers get started? >>Yeah, I think the number one thing that customers could do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as thes quality rules. A swell is identifying those kind of tricky columns that might exist in your data. Those custom variable underscore tens I mentioned before >>last questions to be to anything to add to what Pat just described as a starting place. >>I'm no, I think actually passed something that pretty well, I mean, just just by automating all those manual task. I mean, it definitely can save your company a lot of time and money, so we we encourage you just reach out to us. Let's get that conversation >>started. Excellent. So, Pete and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data. Make sure that it's quality, something you can maximize the value of it. Thanks >>for watching. Thanks again, Lisa, for that very insightful and useful deep dive into the world of adaptive data governance with Iota Ho Oracle First Bank of Nigeria This is Dave a lot You won't wanna mess Iota, whose fifth episode in the data automation Siri's in that we'll talk to experts from Red Hat and Happiest Minds about their best practices for managing data across hybrid cloud Inter Cloud multi Cloud I T environment So market calendar for Wednesday, January 27th That's Episode five. You're watching the Cube Global Leader digital event technique

Published Date : Dec 10 2020

SUMMARY :

adaptive data governance brought to you by Iota Ho. Gentlemen, it's great to have you on the program. Lisa is good to be back. Great. Listen, we're gonna start with you. But to really try to address these customer concerns because, you know, we wanna we So it's exciting a J from the CEO's level. It's real satisfying to see how we're able. Let's let's go back over to you. But they need to understand what kind of data they have, what shape it's in what's dependent lot of a lot of frameworks these days are hardwired, so you can set up a set It's the technical metadata coming together with policies Is this book enterprise companies are doing now? help the organizations to digest their data is to And if it was me eating that food with you guys, I would be not using chopsticks. So if you look at the challenges for these data professionals, you know, they're either on a journey to the cloud. Well, as she digs into the databases, she starts to see that So a J talk us through some examples of where But I think it helped do this Bring it to life a little bit. And one of the things I was thinking when you were talking through some We can see that on the the graphic that we've just How are you seeing those technologies being think you know this But the very first step is understanding what you have in normalizing that So if I start to see this pattern of date one day to elsewhere, I'm going to say, in the beginning about what you guys were doing with Oracle. So Oracle came to us and said, you know, we can see things changing in 2021 a. J. Lester thank you so much for joining me on this segment Thank you. is the Cube, your global leader in high tech coverage. Enjoy the best this community has to offer on the Cube, Gentlemen, it's great to have you joining us in this in this panel. Can you talk to the audience a little bit about the first Bank of One of the oldest ignored the old in Africa because of the history And how does it help the first Bank of Nigeria to be able to innovate faster with the point, we have new technologies that allow you to do this method data So one of the things that you just said Santa kind of struck me to enable the users to be adaptive. Now it changed the reality, so they needed to adapt. I wanted to go to you as we talk about in the spirit of evolution, technology is changing. customer and for the customer means that we will help them with our technology and our resource is to achieve doing there to help your clients leverage automation to improve agility? So here's the first lunch on the latest innovation Some of the things that we've talked about, Otherwise, everything grinds to a halt, and you risk falling behind your competitors. Used to talk to us about some of the business outcomes that you're seeing other customers make leveraging automation different sources to find duplicates, which you can then re And one of the when Santiago was talking about folks really kind of adapted that. Minimize copies of the data can help everyone in this shift to remote working and a lot of the the and on the site fast, especially changes are changing so quickly nowadays that you need to be What you guys were doing there to help your customers I always tell them you better start collecting your data. Gentlemen, thank you for sharing all of your insights. adaptive data governance brought to you by Iota Ho. In this next segment, we're gonna be talking to you about getting to know your data. Thankfully saw great to be here as Lisa mentioned guys, I'm the enterprise account executive here in Ohio. I'm the enterprise data engineer here in Ohio Tahoe. So with that said are you ready for the first one, Pat? So I have data kept in Data Lakes, legacy data, sources, even the cloud. Typically, the first step is to catalog the data and then start mapping the relationships Um so is the data catalog automatically populated? i b naturally John to these focal points that coincide with these key columns following These air ones that are machine learning algorithms have predicted to be relationships Eso So this is really cool, and I can see how this could be leverage quickly now. such as hip for C, C, P. A and the like one can choose which of these policies policies that apply to my organization? And it's something that clients leverage fairly often to utilize this So I'm glad you mentioned compliance because that's extremely important to my organization. interface lends itself to really help you build that confidence that your compliance is Andi, I have a set of reports I need to migrate. Yeah, it's a fantastic questions to be toe identifying critical data elements, I can still see that unified 3 60 view. Yeah, One future that is particularly helpful is the ability to add data sources after So now I know a little bit about my data. the data pertains to these rules. So if my marketing team needs to run, a campaign can read dependent those accuracy scores to know what the minutia to see which data elements are of the highest quality. no longer have to rely on reports about reports, but instead just come to this one So I get now the value of IATA who brings by automatically capturing all those technical to be involved with the data life cycle and take ownership over the rules that fall within their domain. Yeah, you could bring in as many data sources as you need, so long as you could manual task to help save you time and money. you. this banking example that you instantly to talk through. Yeah, I think the number one thing that customers could do to get started with our so we we encourage you just reach out to us. folks about how to get to know your data. into the world of adaptive data governance with Iota Ho Oracle First Bank of Nigeria

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
AmandaPERSON

0.99+

JasonPERSON

0.99+

LisaPERSON

0.99+

Patrick SimonPERSON

0.99+

Lisa MartinPERSON

0.99+

SantiagoPERSON

0.99+

OracleORGANIZATION

0.99+

Yusuf KhanPERSON

0.99+

AsiaLOCATION

0.99+

16QUANTITY

0.99+

Santiago CastorPERSON

0.99+

OhioLOCATION

0.99+

LondonLOCATION

0.99+

ABC BankORGANIZATION

0.99+

Patrick SavitaPERSON

0.99+

10 timesQUANTITY

0.99+

SanjayPERSON

0.99+

AngiePERSON

0.99+

Wednesday, January 27thDATE

0.99+

AfricaLOCATION

0.99+

ThioPERSON

0.99+

John Vander WalPERSON

0.99+

2020DATE

0.99+

PatrickPERSON

0.99+

two columnsQUANTITY

0.99+

90%QUANTITY

0.99+

SiriTITLE

0.99+

ToyotaORGANIZATION

0.99+

Bio TahoeORGANIZATION

0.99+

AzazPERSON

0.99+

PatPERSON

0.99+

11QUANTITY

0.99+

five timesQUANTITY

0.99+

Oracle DigitalORGANIZATION

0.99+

J. BiharPERSON

0.99+

1%QUANTITY

0.99+

StaleyPERSON

0.99+

Iot TahoeORGANIZATION

0.99+

Iota hoORGANIZATION

0.99+

todayDATE

0.99+

RonPERSON

0.99+

firstQUANTITY

0.99+

10QUANTITY

0.99+

Iota HoORGANIZATION

0.99+

AndiPERSON

0.99+

Io TahoeORGANIZATION

0.99+

one dateQUANTITY

0.99+

OneQUANTITY

0.99+

excelTITLE

0.99+

tomorrowDATE

0.99+

3%QUANTITY

0.99+

JohnPERSON

0.99+

First Bank of NigeriaORGANIZATION

0.99+

Middle EastLOCATION

0.99+

Patrick SamitPERSON

0.99+

I. O. TahoeORGANIZATION

0.99+

first stepQUANTITY

0.99+

97%QUANTITY

0.99+

LesterPERSON

0.99+

two folksQUANTITY

0.99+

DavePERSON

0.99+

2021DATE

0.99+

fifth episodeQUANTITY

0.99+

one grainQUANTITY

0.99+

Leicester Clinical Data Science Initiative


 

>>Hello. I'm Professor Toru Suzuki Cherif cardiovascular medicine on associate dean of the College of Life Sciences at the University of Leicester in the United Kingdom, where I'm also director of the Lester Life Sciences accelerator. I'm also honorary consultant cardiologist within our university hospitals. It's part of the national health system NHS Trust. Today, I'd like to talk to you about our Lester Clinical Data Science Initiative. Now brief background on Lester. It's university in hospitals. Lester is in the center of England. The national health system is divided depending on the countries. The United Kingdom, which is comprised of, uh, England, Scotland to the north, whales to the west and Northern Ireland is another part in a different island. But national health system of England is what will be predominantly be discussed. Today has a history of about 70 years now, owing to the fact that we're basically in the center of England. Although this is only about one hour north of London, we have a catchment of about 100 miles, which takes us from the eastern coast of England, bordering with Birmingham to the west north just south of Liverpool, Manchester and just south to the tip of London. We have one of the busiest national health system trust in the United Kingdom, with a catchment about 100 miles and one million patients a year. Our main hospital, the General Hospital, which is actually called the Royal Infirmary, which can has an accident and emergency, which means Emergency Department is that has one of the busiest emergency departments in the nation. I work at Glen Field Hospital, which is one of the main cardiovascular hospitals of the United Kingdom and Europe. Academically, the Medical School of the University of Leicester is ranked 20th in the world on Lee, behind Cambridge, Oxford Imperial College and University College London. For the UK, this is very research. Waited, uh, ranking is Therefore we are very research focused universities as well for the cardiovascular research groups, with it mainly within Glenn Field Hospital, we are ranked as the 29th Independent research institution in the world which places us. A Suffield waited within our group. As you can see those their top ranked this is regardless of cardiology, include institutes like the Broad Institute and Whitehead Institute. Mitt Welcome Trust Sanger, Howard Hughes Medical Institute, Kemble, Cold Spring Harbor and as a hospital we rank within ah in this field in a relatively competitive manner as well. Therefore, we're very research focused. Hospital is well now to give you the unique selling points of Leicester. We're we're the largest and busiest national health system trust in the United Kingdom, but we also have a very large and stable as well as ethnically diverse population. The population ranges often into three generations, which allows us to do a lot of cohort based studies which allows us for the primary and secondary care cohorts, lot of which are well characterized and focused on genomics. In the past. We also have a biomedical research center focusing on chronic diseases, which is funded by the National Institutes of Health Research, which funds clinical research the hospitals of United Kingdom on we also have a very rich regional life science cluster, including med techs and small and medium sized enterprises. Now for this, the bottom line is that I am the director of the letter site left Sciences accelerator, >>which is tasked with industrial engagement in the local national sectors but not excluding the international sectors as well. Broadly, we have academics and clinicians with interest in health care, which includes science and engineering as well as non clinical researchers. And prior to the cove it outbreak, the government announced the £450 million investment into our university hospitals, which I hope will be going forward now to give you a brief background on where the scientific strategy the United Kingdom lies. Three industrial strategy was brought out a za part of the process which involved exiting the European Union, and part of that was the life science sector deal. And among this, as you will see, there were four grand challenges that were put in place a I and data economy, future of mobility, clean growth and aging society and as a medical research institute. A lot of the focus that we have been transitioning with within my group are projects are focused on using data and analytics using artificial intelligence, but also understanding how chronic diseases evolved as part of the aging society, and therefore we will be able to address these grand challenges for the country. Additionally, the national health system also has its long term plans, which we align to. One of those is digitally enabled care and that this hope you're going mainstream over the next 10 years. And to do this, what is envision will be The clinicians will be able to access and interact with patient records and care plants wherever they are with ready access to decision support and artificial intelligence, and that this will enable predictive techniques, which include linking with clinical genomic as well as other data supports, such as image ing a new medical breakthroughs. There has been what's called the Topol Review that discusses the future of health care in the United Kingdom and preparing the health care workforce for the delivery of the digital future, which clearly discusses in the end that we would be using automated image interpretation. Is using artificial intelligence predictive analytics using artificial intelligence as mentioned in the long term plans. That is part of that. We will also be engaging natural language processing speech recognition. I'm reading the genome amusing. Genomic announced this as well. We are in what is called the Midland's. As I mentioned previously, the Midland's comprised the East Midlands, where we are as Lester, other places such as Nottingham. We're here. The West Midland involves Birmingham, and here is ah collective. We are the Midlands. Here we comprise what is called the Midlands engine on the Midland's engine focuses on transport, accelerating innovation, trading with the world as well as the ultra connected region. And therefore our work will also involve connectivity moving forward. And it's part of that. It's part of our health care plans. We hope to also enable total digital connectivity moving forward and that will allow us to embrace digital data as well as collectivity. These three key words will ah Linkous our health care systems for the future. Now, to give you a vision for the future of medicine vision that there will be a very complex data set that we will need to work on, which will involve genomics Phanom ICS image ing which will called, uh oh mix analysis. But this is just meaning that is, uh complex data sets that we need to work on. This will integrate with our clinical data Platforms are bioinformatics, and we'll also get real time information of physiology through interfaces and wearables. Important for this is that we have computing, uh, processes that will now allow this kind of complex data analysis in real time using artificial intelligence and machine learning based applications to allow visualization Analytics, which could be out, put it through various user interfaces to the clinician and others. One of the characteristics of the United Kingdom is that the NHS is that we embrace data and captured data from when most citizens have been born from the cradle toe when they die to the grave. And it's important that we were able to link this data up to understand the journey of that patient. Over time. When they come to hospital, which is secondary care data, we will get disease data when they go to their primary care general practitioner, we will be able to get early check up data is Paula's follow monitoring monitoring, but also social care data. If this could be linked, allow us to understand how aging and deterioration as well as frailty, uh, encompasses thes patients. And to do this, we have many, many numerous data sets available, including clinical letters, blood tests, more advanced tests, which is genetics and imaging, which we can possibly, um, integrate into a patient journey which will allow us to understand the digital journey of that patient. I have called this the digital twin patient cohort to do a digital simulation of patient health journeys using data integration and analytics. This is a technique that has often been used in industrial manufacturing to understand the maintenance and service points for hardware and instruments. But we would be using this to stratify predict diseases. This'll would also be monitored and refined, using wearables and other types of complex data analysis to allow for, in the end, preemptive intervention to allow paradigm shifting. How we undertake medicine at this time, which is more reactive rather than proactive as infrastructure we are presently working on putting together what's it called the Data Safe haven or trusted research environment? One which with in the clinical environment, the university hospitals and curated and data manner, which allows us to enable data mining off the databases or, I should say, the trusted research environment within the clinical environment. Hopefully, we will then be able to anonymous that to allow ah used by academics and possibly also, uh, partnering industry to do further data mining and tool development, which we could then further field test again using our real world data base of patients that will be continually, uh, updating in our system. In the cardiovascular group, we have what's called the bricks cohort, which means biomedical research. Informatics Center for Cardiovascular Science, which was done, started long time even before I joined, uh, in 2010 which has today almost captured about 10,000 patients arm or who come through to Glenn Field Hospital for various treatments or and even those who have not on. We asked for their consent to their blood for genetics, but also for blood tests, uh, genomics testing, but also image ing as well as other consent. Hable medical information s so far there about 10,000 patients and we've been trying to extract and curate their data accordingly. Again, a za reminder of what the strengths of Leicester are. We have one of the largest and busiest trust with the very large, uh, patient cohort Ah, focused dr at the university, which allows for chronic diseases such as heart disease. I just mentioned our efforts on heart disease, uh which are about 10,000 patients ongoing right now. But we would wish thio include further chronic diseases such as diabetes, respiratory diseases, renal disease and further to understand the multi modality between these diseases so that we can understand how they >>interact as well. Finally, I like to talk about the lesser life science accelerator as well. This is a new project that was funded by >>the U started this January for three years. I'm the director for this and all the groups within the College of Life Sciences that are involved with healthcare but also clinical work are involved. And through this we hope to support innovative industrial partnerships and collaborations in the region, a swells nationally and further on into internationally as well. I realized that today is a talked to um, or business and commercial oriented audience. And we would welcome interest from your companies and partners to come to Leicester toe work with us on, uh, clinical health care data and to drive our agenda forward for this so that we can enable innovative research but also product development in partnership with you moving forward. Thank you for your time.

Published Date : Sep 21 2020

SUMMARY :

We have one of the busiest national health system trust in the United Kingdom, with a catchment as part of the aging society, and therefore we will be able to address these grand challenges for Finally, I like to talk about the lesser the U started this January for three years.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
National Institutes of Health ResearchORGANIZATION

0.99+

Howard Hughes Medical InstituteORGANIZATION

0.99+

BirminghamLOCATION

0.99+

2010DATE

0.99+

Broad InstituteORGANIZATION

0.99+

EnglandLOCATION

0.99+

College of Life SciencesORGANIZATION

0.99+

Whitehead InstituteORGANIZATION

0.99+

United KingdomLOCATION

0.99+

Toru Suzuki CherifPERSON

0.99+

EuropeLOCATION

0.99+

LondonLOCATION

0.99+

£450 millionQUANTITY

0.99+

LesterORGANIZATION

0.99+

three yearsQUANTITY

0.99+

Oxford Imperial CollegeORGANIZATION

0.99+

LeicesterLOCATION

0.99+

European UnionORGANIZATION

0.99+

Informatics Center for Cardiovascular ScienceORGANIZATION

0.99+

ScotlandLOCATION

0.99+

Glenn Field HospitalORGANIZATION

0.99+

ManchesterLOCATION

0.99+

TodayDATE

0.99+

NottinghamLOCATION

0.99+

Cold Spring HarborORGANIZATION

0.99+

todayDATE

0.99+

General HospitalORGANIZATION

0.99+

oneQUANTITY

0.99+

Glen Field HospitalORGANIZATION

0.99+

KembleORGANIZATION

0.99+

Royal InfirmaryORGANIZATION

0.99+

about 100 milesQUANTITY

0.99+

Northern IrelandLOCATION

0.99+

Lester Life SciencesORGANIZATION

0.99+

LiverpoolLOCATION

0.99+

UKLOCATION

0.98+

about 70 yearsQUANTITY

0.98+

MidlandLOCATION

0.98+

about 10,000 patientsQUANTITY

0.98+

University of LeicesterORGANIZATION

0.98+

NHS TrustORGANIZATION

0.98+

Mitt Welcome Trust SangerORGANIZATION

0.98+

PaulaPERSON

0.98+

West MidlandLOCATION

0.98+

about 10,000 patientsQUANTITY

0.97+

East MidlandsLOCATION

0.97+

about one hourQUANTITY

0.97+

NHSORGANIZATION

0.97+

20thQUANTITY

0.97+

United KingdomLOCATION

0.96+

University College LondonORGANIZATION

0.96+

OneQUANTITY

0.95+

one million patients a yearQUANTITY

0.93+

SuffieldORGANIZATION

0.92+

Three industrial strategyQUANTITY

0.92+

three generationsQUANTITY

0.92+

Lester Clinical Data Science InitiativeORGANIZATION

0.89+

LeeLOCATION

0.88+

JanuaryDATE

0.88+

Medical School of theORGANIZATION

0.87+

University of LeicesterORGANIZATION

0.87+

MidlandsLOCATION

0.87+

LesterLOCATION

0.87+

three key wordsQUANTITY

0.86+

Topol ReviewTITLE

0.85+

LeicesterORGANIZATION

0.83+

Leicester Clinical Data Science InitiativeORGANIZATION

0.82+

four grand challengesQUANTITY

0.82+

Emergency DepartmentORGANIZATION

0.8+

twin patientQUANTITY

0.73+

29th Independent researchQUANTITY

0.69+

next 10 yearsDATE

0.66+

Io-Tahoe Smart Data Lifecycle CrowdChat | Digital


 

>>from around the globe. It's the Cube with digital coverage of data automated and event. Siri's Brought to You by Iot Tahoe Welcome, everyone to the second episode in our data automated Siri's made possible with support from Iot Tahoe. Today we're gonna drill into the data lifecycle, meaning the sequence of stages that data travels through from creation to consumption to archive. The problem, as we discussed in our last episode, is that data pipelines, they're complicated, They're cumbersome, that disjointed, and they involve highly manual processes. Ah, smart data lifecycle uses automation and metadata to approve agility, performance, data quality and governance and ultimately reduce costs and time to outcomes. Now, in today's session will define the data lifecycle in detail and provide perspectives on what makes a data lifecycle smart and importantly, how to build smarts into your processes. In a moment, we'll be back with Adam Worthington from ethos to kick things off, and then we'll go into an export power panel to dig into the tech behind smart data life cycles, and it will hop into the crowdchat and give you a chance to ask questions. So stay right there. You're watching the cube innovation impact influence. Welcome >>to the Cube disruptors. Developers and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on the Cube, your global leader. >>High tech digital coverage. Okay, we're back with Adam Worthington. Adam, good to see you. How are things across the pond? >>Thank you, I'm sure. >>Okay, so let's let's set it up. Tell us about yourself. What? Your role is a CTO and >>automatically. As you said, we found a way to have a pretty in company ourselves that we're in our third year on. Do we specialize in emerging disruptive technologies within the infrastructure? That's the kind of cloud space on my phone is the technical lead. So I kind of my job to be an expert in all of the technologies that we work with, which can be a bit of a challenge if you have a huge for phone is one of the reasons, like deliberately focusing on on also kind of pieces a successful validation and evaluation of new technologies. >>So you guys really technology experts, data experts and probably also expert in process and delivering customer outcomes. Right? >>That's a great word there, Dave Outcomes. That's a lot of what I like to speak to customers about. >>Let's talk about smart data, you know, when you when you throw in terms like this is it kind of can feel buzz, wordy. But what are the critical aspects of so called smart data? >>Help to step back a little bit, seen a little bit more in terms of kind of where I can see the types of problems I saw. I'm really an infrastructure solution architect trace on and what I kind of benefit we organically. But over time my personal framework, I focused on three core design principal simplicity, flexibility, inefficient, whatever it was designing. And obviously they need different things, depending on what the technology area is working with. But that's a pretty good. So they're the kind of areas that a smart approach to data will directly address. Reducing silos that comes from simplifying, so moving away from conflict of infrastructure, reducing the amount of copies of data that we have across the infrastructure and reducing the amount of application environments that need different areas so smarter get with data in my eyes anyway, the further we moved away from this. >>But how does it work? I mean, how do you know what's what's involved in injecting smarts into your data lifecycle? >>I think one of my I actually did not ready, but generally one of my favorite quotes from the French lost a mathematician, Blaise Pascal. He said, If I get this right, I have written a short letter, but I didn't have time. But Israel, I love that quite for lots of reasons >>why >>direct application in terms of what we're talking about, it is actually really complicated. These developers technology capabilities to make things simple, more directly meet the needs of the business. So you provide self service capabilities that they just need to stop driving. I mean, making data on infrastructure makes the business users using >>your job. Correct me. If I'm wrong is to kind of put that all together in a solution and then help the customer realize that we talked about earlier that business out. >>Yeah, enough if they said in understanding both sides so that it keeps us on our ability to deliver on exactly what you just said is big experts in the capabilities and new a better way to do things but also having the kind of the business understanding to be able to ask the right questions. That's how new a better price is. Positions another area that I really like his stuff with their platforms. You can do more with less. And that's not just about using data redundancy. That's about creating application environments, that conservative and then the infrastructure to service different requirements that are able to use the random Io thing without getting too kind of low level as well as the sequential. So what that means is you don't necessarily have to move data from application environment a do one thing related, and then move it to the application environment. Be that environment free terms of an analytics on the left Right works. Both keep the data where it is, use it or different different requirements within the infrastructure and again do more with less. And what that does is not just about simplicity and efficiency. It significantly reduces the time to value of that as well. >>Do you have examples that you can share with us even if they're anonymous customers that you work with that are maybe a little further down on the journey. Or maybe not >>looking at the you mentioned data protection earlier. So another organization This is a project which is just kind of hearing confessions moment, huge organization. They're literally petabytes of data that was servicing their back up in archive. And what they have is not just this realization they have combined. I think I different that they have dependent on the what area of infrastructure they were backing up, whether it was virtualization, that was different because they were backing up PC's June 6th. They're backing up another database environment, using something else in the cloud knowledge bases approach that we recommended to work with them on. They were able to significantly reduce complexity and reduce the amount of time that it systems of what they were able to achieve and what this is again. One of the clients have They've gone above the threshold of being able to back up for that. >>Adam, give us the final thoughts, bring us home. In this segment, >>the family built something we didn't particularly such on, that I think it is really barely hidden. It is spoken about as much as I think it is, that agile approaches to infrastructure we're going to be touched on there could be complicated on the lack of it efficient, the impact, a user's ability to be agile. But what you find with traditional approaches and you already touched on some of the kind of benefits new approaches there. It's often very prescriptive, designed for a particular as the infrastructure environment, the way that it served up the users in kind of a packaged. Either way, it means that they need to use it in that whatever wave in data bases, that kind of service of as it comes in from a flexibility standpoint. But for this platform approach, which is the right way to address technology in my eyes enables, it's the infrastructure to be used. Flexible piece of it, the business users of the data users what we find this capability into their innovating in the way they use that on the White House. I bring benefits. This is a platform to prescriptive, and they are able to do that. What you're doing with these new approaches is all of the metrics that we touched on and pass it from a cost standpoint from a visibility standpoint, but what it means is that the innovators in the business want really, is to really understand what they're looking to achieve and now have to to innovate with us. Now, I think I've started to see that with projects season places. If you do it in the right way, you articulate the capability and empower the business users in the right ways. Very significantly. Better position. The advantages on really matching significantly bigger than their competition. Yeah, >>Super Adam in a really exciting space. And we spent the last 10 years gathering all this data, you know, trying to slog through it and figure it out. And now, with the tools that we have and the automation capabilities, it really is a new era of innovation and insights. So, Adam or they didn't thanks so much for coming on the Cube and participating in this program. >>Exciting times with that. Thank you very much Today. >>Now we're going to go into the power panel and go deeper into the technologies that enable smart data life cycles. Stay right there. You're watching the cube. Are >>you interested in test driving? The i o ta ho platform Kickstart the benefits of data automation for your business through the Iot Labs program. Ah, flexible, scalable sandbox environment on the cloud of your choice with set up a service and support provided by Iot. Top. Click on the Link and connect with the data engineer to learn more and see Iot Tahoe in action. >>Welcome back, everybody to the power panel driving business performance with smart data life cycles. Leicester Waters is here. He's the chief technology officer from Iot Tahoe. He's joined by Patrick Smith, who was field CTO from pure storage. And is that data? Who's a system engineering manager at KohI City? Gentlemen, good to see you. Thanks so much for coming on this panel. >>Thank you. >>Let's start with Lester. I wonder if each of you could just give us a quick overview of your role. And what's the number one problem that you're focused on solving for your customers? Let's start with Lester Fleet. >>Yes, I'm Lost Waters, chief technology officer for Iot Tahoe and really the number one problem that we're trying to solve for our customers is to understand, help them understand what they have, because if they don't understand what they have in terms of their data. They can't manage it. They can't control it. The cap monitor. They can't ensure compliance. So really, that's finding all you can about your data that you have. And building a catalog that could be readily consumed by the entire business is what we do. >>Patrick Field, CTO in your title That says to me, You're talking to customers all the time, so you got a good perspective on it. Give us your take on things here. >>Yeah, absolutely. So my patches in here on day talkto customers and prospects in lots of different verticals across the region. And as they look at their environments and their data landscape, they're faced with massive growth in the data that they're trying to analyze and demands to be able to get insight our stuff and to deliver better business value faster than they've ever had to do in the past. So >>got it. And is that of course, Kohi City. You're like the new kid on the block. You guys were really growing rapidly created this whole notion of data management, backup and and beyond. But I'm assistant system engineering manager. What are you seeing from from from customers your role and the number one problem that you're solving. >>Yeah, sure. So the number one problem I see time and again speaking with customers. It's around data fragmentation. So do two things like organic growth, even maybe budgetary limitations. Infrastructure has grown over time very piecemeal, and it's highly distributed internally. And just to be clear, you know, when I say internally, that >>could be >>that it's on multiple platforms or silos within an on Prem infrastructure that it also does extend to the cloud as well. >>Right Cloud is cool. Everybody wants to be in the cloud, right? So you're right, It creates, Ah, maybe unintended consequences. So let's start with the business outcome and kind of try to work backwards to people you know. They want to get more insights from data they want to have. Ah, Mawr efficient data lifecycle. But so let's let me start with you were thinking about like the North Star for creating data driven cultures. You know, what is the North Star or customers >>here? I think the North Star, in a nutshell, is driving value from your data. Without question, I mean way, differentiate ourselves these days by even nuances in our data now, underpinning that, there's a lot of things that have to happen to make that work out. Well, you know, for example, making sure you adequately protect your data, you know? Do you have a good You have a good storage sub system? Do you have a good backup and recovery point objectives? Recovery time objective. How do you Ah, are you fully compliant? Are you ensuring that you're taking all the boxes? There's a lot of regulations these days in terms with respect to compliance, data retention, data, privacy and so forth. Are you taking those boxes? Are you being efficient with your, uh, your your your data? You know, In other words, I think there's a statistic that someone mentioned me the other day that 53% of all businesses have between three and 15 copies of the same data. So you know, finding and eliminating does is it is part of the part of the problem is when you do a chase, >>um, I I like to think of you're right, no doubt, business value and and a lot of that comes from reducing the end in cycle times. But anything that you guys would would add to that. Patrick, Maybe start with Patrick. >>Yeah, I think I think in value from your data really hits on tips on what everyone wants to achieve. But I think there are a couple of key steps in doing that. First of all, is getting access to the data and asked that, Really, it's three big problems, firstly, working out what you've got. Secondly, looking at what? After working on what you've got, how to get access to it? Because it's all very well knowing that you've got some data. But if you can't get access to it either because of privacy reasons, security reasons, then that's a big challenge. And then finally, once you've got access to the data making sure that you can process that data in a timely manner >>for me, you know it would be that an organization has got a really good global view of all of its data. It understands the data flow and dependencies within their infrastructure, understands that precise legal and compliance requirements, and you had the ability to action changes or initiatives within their environment to give the fun. But with a cloud like agility. Um, you know, and that's no easy feat, right? That is hard work. >>Okay, so we've we've talked about. The challenge is in some of the objectives, but there's a lot of blockers out there, and I want to understand how you guys are helping remove them. So So, Lester. But what do you see as some of the big blockers in terms of people really leaning in? So this smart data lifecycle >>yeah, Silos is is probably one of the biggest one I see in business is yes, it's it's my data, not your data. Lots of lots of compartmentalization. Breaking that down is one of the one of the challenges. And having the right tools to help you do that is only part of the solution. There's obviously a lot of cultural things that need to take place Teoh to break down those silos and work together. If you can identify where you have redundant data across your enterprise, you might be able to consolidate those. >>So, Patrick, so one of the blockers that I see is legacy infrastructure, technical debt, sucking all the budget you got. You know, too many people have having to look after, >>as you look at the infrastructure that supports people's data landscapes today for primarily legacy reasons. The infrastructure itself is siloed. So you have different technologies with different underlying hardware and different management methodologies that they're there for good reason, because historically you have to have specific fitness, the purpose for different data requirements. And that's one of the challenges that we tackled head on a pure with with the flash blade technology and the concept of the data, a platform that can deliver in different characteristics for the different workloads. But from a consistent data platform >>now is that I want to go to you because, you know, in the world in your world, which to me goes beyond backup. And one of the challenges is, you know, they say backup is one thing. Recovery is everything, but as well. The the CFO doesn't want to pay for just protection, and one of things that I like about what you guys have done is you. You broadened the perspective to get more value out of your what was once seen as an insurance policy. >>I do see one of the one of the biggest blockers as the fact that the task at hand can, you know, can be overwhelming for customers. But the key here is to remember that it's not an overnight change. It's not, you know, a flick of a switch. It's something that can be tackled in a very piecemeal manner on. Absolutely. Like you said, You know, reduction in TCO and being able to leverage the data for other purposes is a key driver for this. So, you know, this can be this can be resolved. It would be very, you know, pretty straightforward. It can be quite painless as well. Same goes for unstructured data, which is very complex to manage. And, you know, we've all heard the stats from the the analysts. You know, data obviously is growing at an extremely rapid rate, but actually, when you look at that, you know how is actually growing. 80% of that growth is actually in unstructured data, and only 20% of that growth is in unstructured data. S o. You know, these are quick win areas that customers can realize immediate tco improvement and increased agility as well >>paint a picture of this guy that you could bring up the life cycle. You know what you can see here is you've got this this cycle, the data lifecycle and what we're wanting to do is inject intelligence or smarts into this, like like life cycles. You see, you start with ingestion or creation of data. You're you're storing it. You got to put it somewhere, right? You gotta classify it. You got to protect it. And then, of course, you want to reduce the copies, make it, you know, efficient on. And then you want to prepare it so that businesses can actually sumit. And then you've got clients and governance and privacy issues, and I wonder if we could start with you. Lester, this is, you know, the picture of the life cycle. What role does automation play in terms of injecting smarts into the lifecycle? >>Automation is key here, especially from the discover it catalog and classify perspective. I've seen companies where they geo and will take and dump their all of their database scheme is into a spreadsheet so that they can sit down and manually figure out what attributes 37 means for a column names, Uh, and that's that's only the tip of the iceberg. So being able to do automatically detect what you have automatically deduced where what's consuming the data, you know, upstream and downstream. Being able to understand all of the things related to the lifecycle of your data. Back up archive deletion. It is key. And so we're having having good tool. IShares is very >>important. So, Patrick, obviously you participate in the store piece of this picture s I wonder if you could talk more specifically about that. But I'm also interested in how you effect the whole system view the the end end cycle time. >>Yeah, I think Leicester kind of hit the nail on the head in terms of the importance of automation because the data volumes are just just so massive. Now that you can, you can you can effectively manage or understand or catalog your data without automation. Once you understand the data and the value of the data, then that's where you can work out where the data needs to be at any point in >>time, right? So pure and kohi city obviously partner to do that and of course, is that you guys were part of the protect you certainly part of the retain. But Also, you provide data management capabilities and analytics. I wonder if you could add some color there. >>Yeah, absolutely. So, like you said, you know, we focused pretty heavily on data protection. Is just one of our one of our areas on that infrastructure. It is just sitting there, really? Can, you know, with the legacy infrastructure, It's just sitting there, you know, consuming power, space cooling and pretty inefficient. And what, if anything, that protest is a key part of that. If I If I have a modern data platform such as, you know, the cohesive data platform, I can actually do a lot of analytics on that through application. So we have a marketplace for APS. >>I wonder if we could talk about metadata. It's It's increasingly important. Metadata is data about the data, but Leicester maybe explain why it's so important and what role it plays in terms of creating smart data lifecycle. A >>lot of people think it's just about the data itself, but there's a lot of extended characteristics about your data. So so imagine if or my data life cycle I can communicate with the backup system from Kohi City and find out when the last time that data was backed up or where is backed up to. I can communicate exchange data with pure storage and find out what two years? And is the data at the right tier commensurate with its use level pointed out and being able to share that metadata across systems? I think that's the direction that we're going in right now. We're at the stage where just identifying the metadata and trying to bring it together and catalog the next stage will be OK using the AP eyes it that that we have between our systems can't communicate and share that data and build good solutions for customers to use. >>It's a huge point that you just made. I mean, you know, 10 years ago, automating classification was the big problem, and it was machine intelligence, you know, obviously attacking that, But your point about as machines start communicating to each other and you start, it's cloud to cloud. There's all kinds of metadata, uh, kind of new meta data that's being created. I often joke that someday there's gonna be more metadata than data, so that brings us to cloud and that I'd like to start with you. >>You know, I do think, you know, having the cloud is a great thing. And it has got its role to play, and you can have many different permutations and iterations of how you use it on. Um, you know, I may have sort of mentioned previously. You know, I've seen customers go into the cloud very, very quickly, and actually recently, they're starting to remove workloads from the cloud. And the reason why this happens is that, you know, Cloud has got its role to play, but it's not right for absolutely everything, especially in their current form as well. A good analogy I like to use on this may sound a little bit cliche, but you know, when you compare clouds versus on premises data centers, you can use the analogy of houses and hotels. So to give you an idea so you know, when we look at hotels, that's like the equivalent of a cloud, right? I can get everything I need from there. I can get my food, my water, my outdoor facilities. If I need to accommodate more people, I can rent some more rooms. I don't have to maintain the hotel. It's all done for me. When you look at houses the equivalent to on premises infrastructure, I pretty much have to do everything myself, right. So I have to purchase the house. I have to maintain it. I have to buy my own food and water. Eat it. You have to make improvements myself. But then why do we all live in houses? No, in hotels. And the simple answer that I can I can only think of is, is that it's cheaper, right. It's cheaper to do it myself. But that's not to say that hotels haven't got their role to play. Um, you know? So, for example, if I've got loads of visitors coming over for the weekend, I'm not going to go build an extension to my house just for them. I will burst into my hotel into the cloud, um, and use it for, you know, for for things like that. So what I'm really saying is the cloud is great for many things, but it can work out costlier for certain applications, while others are a perfect >>It's an interesting analogy. I hadn't thought of that before, but you're right because I was going to say Well, part of it is you want the cloud experience everywhere, but you don't always want the cloud experience especially, you know, when you're with your family, you want certain privacy that I've not heard that before. He's out. So that's the new perspective s Oh, thank you, but but But Patrick, I do want to come back to that cloud experience because, in fact, that's what's happening. In a lot of cases, organizations are extending the cloud properties of automation on Prem. >>Yeah, I thought, as I thought, a really interesting point and a great analogy for the use of the public cloud. And it really reinforces the importance of the hybrid and multi cloud environment because it gives you the flexibility to choose where is the optimal environment to run your business workloads? And that's what it's all about and the flexibility to change which environment you're running in, either for more months to the next or from one year to the next. Because workloads change and the characteristics that are available in the cloud change, the hybrid cloud is something that we've we've lived with ourselves of pure, So our pure one management technology actually sits in hybrid cloud and what we we started off entirely cloud native. But now we use public cloud for compute. We use our own technology at the end of a high performance network link to support our data platform. So we get the best of both worlds and I think that's where a lot of our customers are trying to get to. >>Alright, I want to come back in a moment there. But before we do, let's see, I wonder if we could talk a little bit about compliance, governance and privacy. I think the Brits hung on. This panel is still in the EU for now, but the you are looking at new rules. New regulations going beyond GDP are where does sort of privacy governance, compliance fit in the data lifecycle, then, is that I want your thoughts on this as well. >>Yeah, this is this is a very important point because the landscape for for compliance, around data privacy and data retention is changing very rapidly. And being able to keep up with those changing regulations in an automated fashion is the only way you're gonna be able to do it. Even I think there's a some sort of Ah, maybe ruling coming out today or tomorrow with the changed in the r. So this is things are all very key points and being able to codify those rules into some software. Whether you know, Iot Tahoe or or your storage system or kohi city, it will help you be compliant is crucial. >>Yeah. Is that anything you can add there? I mean, it's really is your wheelhouse. >>Yeah, absolutely. So, you know, I think anybody who's watching this probably has gotten the message that, you know, less silos is better. And it absolutely it also applies to data in the cloud is where as well. So you know, my aiming Teoh consolidate into fewer platforms, customers can realize a lot better control over their data. And the natural effect of this is that it makes meeting compliance and governance a lot easier. So when it's consolidated, you can start to confidently understand who's accessing your data. How frequently are they accessing the data? You can also do things like, you know, detecting anomalous file access activities and quickly identify potential threats. >>Okay, Patrick, we were talking. You talked earlier about storage optimization. We talked to Adam Worthington about the business case, the numerator, which is the business value, and then the denominator, which is the cost and what's unique about pure in this regard. >>Yeah, and I think there are. There are multiple time dimensions to that. Firstly, if you look at the difference between legacy storage platforms that used to take up racks or aisles of space in the data center, the flash technology that underpins flash blade way effectively switch out racks rack units on. It has a big play in terms of data center footprint, and the environmental is associated with the data center. If you look at extending out storage efficiencies and the benefits it brings, just the performance has a direct effect on start we whether that's, you know, the start from the simplicity that platform so that it's easy and efficient to manage, whether it's the efficiency you get from your data. Scientists who are using the outcomes from the platform, making them more efficient to new. If you look at some of our customers in the financial space there, their time to results are improved by 10 or 20 x by switching to our technology from legacy technologies for their analytics, platforms. >>The guys we've been running, you know, Cube interviews in our studios remotely for the last 120 days is probably the first interview I've done where haven't started off talking about Cove it, Lester. I wonder if you could talk about smart data lifecycle and how it fits into this isolation economy. And hopefully, what will soon be a post isolation economy? >>Yeah, Come. It has dramatically accelerated the data economy. I think. You know, first and foremost, we've all learned to work at home. You know, we've all had that experience where, you know, people would have been all about being able to work at home just a couple days a week. And here we are working five days. That's how to knock on impact to infrastructure, to be able to support that. But going further than that, you know, the data economy is all about how a business can leverage their data to compete in this New World order that we are now in code has really been a forcing function to, you know, it's probably one of the few good things that have come out of government is that we've been forced to adapt and It's a zoo. Been an interesting journey and it continues to be so >>like Lester said, you know, we've We're seeing huge impact here. Working from home has pretty much become the norm. Now, you know, companies have been forced into basically making it work. If you look online retail, that's accelerated dramatically as well. Unified communications and videoconferencing. So really, you know the point here, is that Yes, absolutely. We're you know, we've compressed, you know, in the past, maybe four months. What already would have taken maybe even five years, maybe 10 years or so >>We got to wrap. But Celester Louis, let me ask you to sort of get paint. A picture of the sort of journey the maturity model that people have to take. You know, if they want to get into it, where did they start? And where are they going to give us that view, >>I think, versus knowing what you have. You don't know what you have. You can't manage it. You can't control that. You can't secure what you can't ensure. It's a compliant s so that that's first and foremost. Uh, the second is really, you know, ensuring that your compliance once, once you know what you have. Are you securing it? Are you following the regulatory? The applicable regulations? Are you able to evidence that, uh, how are you storing your data? Are you archiving it? Are you storing it effectively and efficiently? Um, you know, have you Nirvana from my perspective, is really getting to a point where you you've consolidated your data, you've broken down the silos and you have a virtually self service environment by which the business can consume and build upon their data. And really, at the end of the day, as we said at the beginning, it's all about driving value out of your data. And ah, the automation is is key to this, sir. This journey >>that's awesome and you just described is sort of a winning data culture. Lester, Patrick, thanks so much for participating in this power panel. >>Thank you, David. >>Alright, So great overview of the steps in the data lifecycle and how to inject smarts into the process is really to drive business outcomes. Now it's your turn. Hop into the crowd chat, please log in with Twitter or linked in or Facebook. Ask questions, answer questions and engage with the community. Let's crowdchat, right. Yeah, yeah, yeah.

Published Date : Jul 31 2020

SUMMARY :

behind smart data life cycles, and it will hop into the crowdchat and give you a chance to ask questions. Enjoy the best this community has to offer Adam, good to see you. and So I kind of my job to be an expert in all of the technologies that we work with, So you guys really technology experts, data experts and probably also expert in That's a lot of what I like to speak to customers Let's talk about smart data, you know, when you when you throw in terms like this is it kind of can feel buzz, reducing the amount of copies of data that we have across the infrastructure and reducing I love that quite for lots of reasons So you provide self service capabilities help the customer realize that we talked about earlier that business out. that it keeps us on our ability to deliver on exactly what you just said is big experts Do you have examples that you can share with us even if they're anonymous customers that you work looking at the you mentioned data protection earlier. In this segment, But what you find with traditional approaches and you already touched on some of you know, trying to slog through it and figure it out. Thank you very much Today. Now we're going to go into the power panel and go deeper into the technologies that enable Click on the Link and connect with the data Welcome back, everybody to the power panel driving business performance with smart data life I wonder if each of you could just give us a quick overview of your role. So really, that's finding all you can about your data that you so you got a good perspective on it. to deliver better business value faster than they've ever had to do in the past. What are you seeing from from from And just to be clear, you know, when I say internally, that it also does extend to the cloud as well. So let's start with the business outcome and kind of try to work backwards to people you and eliminating does is it is part of the part of the problem is when you do a chase, But anything that you guys would would add to that. But if you can't get access to it either because of privacy reasons, and you had the ability to action changes or initiatives within their environment to give But what do you see as some of the big blockers in terms of people really If you can identify where you have redundant data across your enterprise, technical debt, sucking all the budget you got. So you have different And one of the challenges is, you know, they say backup is one thing. But the key here is to remember that it's not an overnight the copies, make it, you know, efficient on. what you have automatically deduced where what's consuming the data, this picture s I wonder if you could talk more specifically about that. you can you can effectively manage or understand or catalog your data without automation. is that you guys were part of the protect you certainly part of the retain. Can, you know, with the legacy infrastructure, It's just sitting there, you know, consuming power, the data, but Leicester maybe explain why it's so important and what role it And is the data at the right tier commensurate with its use level pointed out I mean, you know, 10 years ago, automating classification And it has got its role to play, and you can have many different permutations and iterations of how you you know, when you're with your family, you want certain privacy that I've not heard that before. at the end of a high performance network link to support our data platform. This panel is still in the EU for now, but the you are looking at new Whether you know, Iot Tahoe or or your storage system I mean, it's really is your wheelhouse. So you know, my aiming Teoh consolidate into Worthington about the business case, the numerator, which is the business value, to manage, whether it's the efficiency you get from your data. The guys we've been running, you know, Cube interviews in our studios remotely for the last 120 days But going further than that, you know, the data economy is all about how a business can leverage we've compressed, you know, in the past, maybe four months. A picture of the sort of journey the maturity model that people have to take. from my perspective, is really getting to a point where you you've consolidated your that's awesome and you just described is sort of a winning data culture. Alright, So great overview of the steps in the data lifecycle and how to inject smarts into the process

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
PatrickPERSON

0.99+

DavidPERSON

0.99+

Adam WorthingtonPERSON

0.99+

Adam WorthingtonPERSON

0.99+

Patrick FieldPERSON

0.99+

Patrick SmithPERSON

0.99+

AdamPERSON

0.99+

five daysQUANTITY

0.99+

June 6thDATE

0.99+

10QUANTITY

0.99+

tomorrowDATE

0.99+

five yearsQUANTITY

0.99+

third yearQUANTITY

0.99+

North StarORGANIZATION

0.99+

LesterPERSON

0.99+

SiriTITLE

0.99+

10 yearsQUANTITY

0.99+

80%QUANTITY

0.99+

second episodeQUANTITY

0.99+

Blaise PascalPERSON

0.99+

Leicester WatersORGANIZATION

0.99+

15 copiesQUANTITY

0.99+

53%QUANTITY

0.99+

LesterORGANIZATION

0.99+

TodayDATE

0.99+

both sidesQUANTITY

0.99+

four monthsQUANTITY

0.99+

eachQUANTITY

0.99+

todayDATE

0.99+

two yearsQUANTITY

0.99+

20 xQUANTITY

0.99+

Iot TahoeORGANIZATION

0.99+

oneQUANTITY

0.99+

first interviewQUANTITY

0.99+

secondQUANTITY

0.98+

Celester LouisPERSON

0.98+

TwitterORGANIZATION

0.98+

Lester FleetORGANIZATION

0.98+

FacebookORGANIZATION

0.98+

BothQUANTITY

0.98+

FirstlyQUANTITY

0.98+

firstQUANTITY

0.98+

one yearQUANTITY

0.98+

10 years agoDATE

0.98+

White HouseORGANIZATION

0.98+

OneQUANTITY

0.98+

two thingsQUANTITY

0.97+

both worldsQUANTITY

0.97+

SecondlyQUANTITY

0.97+

IotORGANIZATION

0.97+

Iot LabsORGANIZATION

0.97+

20%QUANTITY

0.96+

CoveORGANIZATION

0.96+

FirstQUANTITY

0.96+

Dave OutcomesPERSON

0.95+

firstlyQUANTITY

0.95+

three big problemsQUANTITY

0.94+

three coreQUANTITY

0.94+

IsraelLOCATION

0.94+

threeQUANTITY

0.94+

KohI CityORGANIZATION

0.91+

Kohi CityLOCATION

0.9+

one thingQUANTITY

0.89+

LeicesterORGANIZATION

0.89+

Io-Tahoe Smart Data Lifecycle CrowdChat | Digital


 

(upbeat music) >> Voiceover: From around the globe, it's theCUBE with digital coverage of Data Automated. An event series brought to you by Io-Tahoe. >> Welcome everyone to the second episode in our Data Automated series made possible with support from Io-Tahoe. Today, we're going to drill into the data lifecycle. Meaning the sequence of stages that data travels through from creation to consumption to archive. The problem as we discussed in our last episode is that data pipelines are complicated, they're cumbersome, they're disjointed and they involve highly manual processes. A smart data lifecycle uses automation and metadata to improve agility, performance, data quality and governance. And ultimately, reduce costs and time to outcomes. Now, in today's session we'll define the data lifecycle in detail and provide perspectives on what makes a data lifecycle smart? And importantly, how to build smarts into your processes. In a moment we'll be back with Adam Worthington from Ethos to kick things off. And then, we'll go into an expert power panel to dig into the tech behind smart data lifecyles. And, then we'll hop into the crowd chat and give you a chance to ask questions. So, stay right there, you're watching theCUBE. (upbeat music) >> Voiceover: Innovation. Impact. Influence. Welcome to theCUBE. Disruptors. Developers. And, practitioners. Learn from the voices of leaders, who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on theCUBE. Your global leader in high tech digital coverage. >> Okay, we're back with Adam Worthington. Adam, good to see you, how are things across the pond? >> Good thank you, I'm sure our weather's a little bit worse than yours is over the other side, but good. >> Hey, so let's set it up, tell us about yourself, what your role is as CTO and--- >> Yeah, Adam Worthington as you said, CTO and co-founder of Ethos. But, we're a pretty young company ourselves, so we're in our sixth year. And, we specialize in emerging disruptive technology. So, within the infrastructure data center kind of cloud space. And, my role is a technical lead, so I, it's kind of my job to be an expert in all of the technologies that we work with. Which can be a bit of a challenge if you have a huge portfolio. One of the reasons we got to deliberately focus on. And also, kind of pieces of technical validation and evaluation of new technologies. >> So, you guys are really technology experts, data experts, and probably also expert in process and delivering customer outcomes, right? >> That's a great word there Dave, outcomes. I mean, that's a lot of what I like to speak to customers about. >> Let's talk about smart data you know, when you throw out terms like this it kind of can feel buzz wordy but what are the critical aspects of so-called smart data? >> Cool, well typically I had to step back a little bit and set the scene a little bit more in terms of kind of where I came from. So, and the types of problems I've sorted out. So, I'm really an infrastructure or solution architect by trade. And, what I kind of, relatively organically, but over time my personal framework and approach. I focused on three core design principles. So, simplicity, flexibility and efficiency. So, whatever it was I was designing and obviously they need different things depending on what the technology area is that we're working with. So, that's for me a pretty good step. So, they're the kind of areas that a smart approach in data will directly address both reducing silos. So, that comes from simplifying. So, moving away from complexity of infrastructure. Reducing the amount of copies of data that we have across the infrastructure. And, reducing the amount of application environment for the need for different areas. So, the smarter we get with data it's in my eyes anyway, the further we move away from those traditional legacy. >> But, how does it work? I mean, how, in other words, what's involved in injecting smarts into your data lifecycle? >> I think one of my, well actually I didn't have this quote ready, but genuinely one of my favorite quotes is from the French philosopher and mathematician, Blaise Pascal and he says, if I get this right, "I'd have written you a shorter letter, but I didn't have the time." So, there's real, I love that quote for lots of reasons. >> Dave: Alright. >> That's direct applications in terms of what we're talking about. In terms of, it's actually really complicated to develop a technology capability to make things simple. Be more directly meeting the needs of the business through tech. So, you provide self-service capability. And, I don't just mean self-driving, I mean making data and infrastructure make sense to the business users that are using it. >> Your job, correct me if I'm wrong, is to kind of put that all together in a solution. And then, help the customer you know, realize what we talked about earlier that business out. >> Yeah, and that's, it's sitting at both sides and understanding both sides. So, kind of key to us in our abilities to be able to deliver on exactly what you've just said, is being experts in the capabilities and new and better ways of doing things. But also, having the kind of, better business understanding to be able to ask the right questions to identify how can you better approach this 'cause it helps solve these issues. But, another area that I really like is the, with the platforms you can do more with less. And, that's not just about reducing data redundancy, that's about creating application environments that can service, an infrastructure to service different requirements that are able to do the random IO thing without getting too kind of low level tech. As well as the sequential. So, what that means is, that you don't necessarily have to move data from application environment A, do one thing with it, collate it and then move it to the application environment B, to application environment C, in terms of an analytics kind of left to right workload, you keep your data where it is, use it for different requirements within the infrastructure and again, do more with less. And, what that does, it's not just about simplicity and efficiency, it significantly reduces the times of value that that faces, as well. >> Do you have examples that you can share with us, even if they're anonymized of customers that you've worked with, that are maybe a little further down on the journey. Or, maybe not and--- >> Looking at the, you mentioned data protection earlier. So, another organization this is a project which is just coming nearing completion at the moment. Huge organization, that literally petabytes of data that was servicing their backup and archive. And, what they had is not just this reams of data. They had, I think I'm right in saying, five different backup applications that they had depending on the, what area of infrastructure they were backing up. So, whether it was virtualization, that was different to if they were backing up, different if they were backing up another data base environment they were using something else in the cloud. So, a consolidated approach that we recommended to work with them on. They were able to significantly reduce complexity and reduce the amount of time that it took them. So, what they were able to achieve and this was again, one of the key departments they had. They'd gone above the threshold of being able to backup all of them. >> Adam, give us the final thoughts, bring us home in this segment. >> Well, the final thoughts, so this is something, yeah we didn't particularly touch on. But, I think it's kind of slightly hidden, it isn't spoken about as much as I think it could be. Is the traditional approaches to infrastructure. We've already touched on that they can be complicated and there's a lack of efficiency. It impacts a user's ability to be agile. But, what you find with traditional approaches and we've already touched on some of the kind of benefits to new approaches there, is that they're often very prescriptive. They're designed for a particular firm. The infrastructure environment, the way that it's served up to the users in a kind of a packaged kind of way, means that they need to use it in that, whatever way it's been dictated. So, that kind of self-service aspect, as it comes in from a flexibility standpoint. But, these platforms and these platform approaches is the right way to address technology in my eyes. Enables the infrastructure to be used flexibly. So, the business users and the data users, what we find is that if we put in this capability into their hands. They start innovating the way that they use that data. And, the way that they bring benefits. And, if a platform is too prescriptive and they aren't able to do that, then what you're doing with these new approaches is get all of the metrics that we've touched on. It's fantastic from a cost standpoint, from an agility standpoint. But, what it means is that the innovators in the business, the ones that really understand what they're looking to achieve, they now have the tools to innovate with that. And, I think, and I've started to see that with projects that we've completed, if you do it in the right way, if you articulate the capability and you empower the business users in the right way. Then, they're in a significantly better position, these businesses to take advantages and really sort of match and significantly beat off their competition environment spaces. >> Super Adam, I mean a really exciting space. I mean we spent the last 10 years gathering all this data. You know, trying to slog through it and figure it out and now, with the tools that we have and the automation capabilities, it really is a new era of innovation and insight. So, Adam Worthington, thanks so much for coming in theCUBE and participating in this program. >> Yeah, exciting times and thank you very much Dave for inviting me, and yeah big pleasure. >> Now, we're going to go into the power panel and go deeper into the technologies that enable smart data lifecyles. And, stay right there, you're watching theCUBE. (light music) >> Voiceover: Are you interested in test-driving the Io-Tahoe platform? Kickstart the benefits of Data Automation for your business through the IoLabs program. A flexible, scalable, sandbox environment on the cloud of your choice. With setup, service and support provided by Io-Tahoe. Click on the link and connect with a data engineer to learn more and see Io-Tahoe in action. >> Welcome back everybody to the power panel, driving business performance with smart data lifecyles. Lester Waters is here, he's the Chief Technology Officer from Io-Tahoe. He's joined by Patrick Smith, who is field CTO from Pure Storage. And, Ezat Dayeh who is Assistant Engineering Manager at Cohesity. Gentlemen, good to see you, thanks so much for coming on this panel. >> Thank you, Dave. >> Yes. >> Thank you, Dave. >> Let's start with Lester, I wonder if each of you could just give us a quick overview of your role and what's the number one problem that you're focused on solving for your customers? Let's start with Lester, please. >> Ah yes, I'm Lester Waters, Chief Technology Officer for Io-Tahoe. And really, the number one problem that we are trying to solve for our customers is to help them understand what they have. 'Cause if they don't understand what they have in terms of their data, they can't manage it, they can't control it, they can't monitor it, they can't ensure compliance. So, really that's finding all that you can about your data that you have and building a catalog that can be readily consumed by the entire business is what we do. >> Patrick, field CTO in your title, that says to me you're talking to customers all the time so you've got a good perspective on it. Give us you know, your take on things here. >> Yeah absolutely, so my patch is in the air and talk to customers and prospects in lots of different verticals across the region. And, as they look at their environments and their data landscape, they're faced with massive growth in the data that they're trying to analyze. And, demands to be able to get inside are faster. And, to deliver business value faster than they've ever had to do in the past, so. >> Got it and then Ezat at Cohesity, you're like the new kid on the block. You guys are really growing rapidly. You created this whole notion of data management, backup and beyond, but from Assistant Engineering Manager what are you seeing from customers, your role and the number one problem that you're solving? >> Yeah sure, so the number one problem I see you know, time and again speaking with customers it's all around data fragmentation. So, due to things like organic growth you know, even maybe budgetary limitations, infrastructure has grown you know, over time, very piecemeal. And, it's highly distributed internally. And, just to be clear you know, when I say internally you know, that could be that it's on multiple platforms or silos within an on-prem infrastructure. But, that it also does extend to the cloud, as well. >> Right hey, cloud is cool, everybody wants to be in the cloud, right? So, you're right it creates maybe unattended consequences. So, let's start with the business outcome and kind of try to work backwards. I mean people you know, they want to get more insights from data, they want to have a more efficient data lifecyle. But, so Lester let me start with you, in thinking about like, the North Star, creating data driven cultures you know, what is the North Star for customers here? >> I think the North Star in a nutshell is driving value from your data. Without question, I mean we differentiate ourselves these days by even the nuances in our data. Now, underpinning that there's a lot of things that have to happen to make that work out well. You know for example, making sure you adequately protect your data. You know, do you have a good storage system? Do you have a good backup and recovery point objectives, recovering time objectives? Do you, are you fully compliant? Are you ensuring that you're ticking all the boxes? There's a lot of regulations these days in terms, with respect to compliance, data retention, data privacy and so fourth. Are you ticking those boxes? Are you being efficient with your data? You know, in other words I think there's a statistic that someone mentioned to me the other day that 53% of all businesses have between three and 15 copies of the same data. So you know, finding and eliminating those is part of the problems you need to chase. >> I like to think of you know, you're right. Lester, no doubt, business value and a lot of that comes from reducing the end to end cycle times. But, anything that you guys would add to that, Patrick and Ezat, maybe start with Patrick. >> Yeah, I think getting value from data really hits on, it hits on what everyone wants to achieve. But, I think there are a couple of key steps in doing that. First of all is getting access to the data. And that's, that really hits three big problems. Firstly, working out what you've got. Secondly, after working out what you've got, how to get access to it. Because, it's all very well knowing that you've got some data but if you can't get access to it. Either, because of privacy reasons, security reasons. Then, that's a big challenge. And then finally, once you've got access to the data, making sure that you can process that data in a timely manner. >> For me you know, it would be that an organization has got a really good global view of all of its data. It understands the data flow and dependencies within their infrastructure. Understands the precise legal and compliance requirements. And, has the ability to action changes or initiatives within their environment. Forgive the pun, but with a cloud like agility. You know, and that's no easy feat, right? That is hard work. >> Okay, so we've talked about the challenges and some of the objectives, but there's a lot of blockers out there and I want to understand how you guys are helping remove them? So, Lester what do you see as some of the big blockers in terms of people really leaning in to this smart data lifecycle. >> Yeah silos, is probably one of the biggest one I see in businesses. Yes, it's my data not your data. Lots of compartmentalization. And, breaking that down is one of the challenges. And, having the right tools to help you do that is only part of the solution. There's obviously a lot of cultural things that need to take place to break down those silos and work together. If you can identify where you have redundant data across your enterprise, you might be able to consolidate those. >> Yeah so, over to Patrick, so you know, one of the blockers that I see is legacy infrastructure, technical debt sucking all the budget. You got you know, too many people having to look after. >> As you look at the infrastructure that supports peoples data landscapes today. For primarily legacy reasons, the infrastructure itself is siloed. So, you have different technologies with different underlying hardware, different management methodologies that are there for good reason. Because, historically you had to have specific fitness for purpose for different data requirements. >> Dave: Ah-hm. >> And, that's one of the challenges that we tackled head on at Pure. With the flash plate technology and the concept of the data hub. A platform that can deliver in different characteristics for the different workloads. But, from a consistent data platform. >> Now, Ezat I want to go to you because you know, in the world, in your world which to me goes beyond backup and one of the challenges is you know, they say backup is one thing, recovery is everything. But as well, the CFO doesn't want to pay for just protection. Now, one of the things that I like about what you guys have done is you've broadened the perspective to get more value out of your what was once seen as an insurance policy. >> I do see one of the biggest blockers as the fact that the task at hand can you know, be overwhelming for customers. But, the key here is to remember that it's not an overnight change, it's not you know, the flick of the switch. It's something that can be tackled in a very piecemeal manner. And, absolutely like you've said you know, reduction in TCO and being able to leverage the data for other purposes is a key driver for this. So you know, this can be resolved. It can be very you know, pretty straightforward. It can be quite painless, as well. Same goes for unstructured data, which is very complex to manage. And you know, we've all heard the stats from the analysts, you know data obviously is growing at an extremely rapid rate. But, actually when you look at that you know, how is it actually growing? 80% of that growth is actually in unstructured data and only 20% of that growth is in structured data. So you know, these are quick win areas that the customers can realize immediate TCO improvement and increased agility, as well. >> Let's paint a picture of this guys, if I can bring up the lifecyle. You know what you can see here is you've got this cycle, the data lifecycle and what we're wanting to do is inject intelligence or smarts into this lifecyle. So, you can see you start with ingestion or creation of data. You're storing it, you've got to put it somewhere, right? You've got to classify it, you've got to protect it. And then, of course you want to you know, reduce the copies, make it you know, efficient. And then, you want to prepare it so that businesses can actually consume it and then you've got compliance and governance and privacy issues. And, I wonder if we could start with you Lester, this is you know, the picture of the lifecycle. What role does automation play in terms of injecting smarts into the lifecycle? >> Automation is key here, you know. Especially from the discover, catalog and classify perspective. I've seen companies where they go and we'll take and dump all of their data base schemes into a spreadsheet. So, that they can sit down and manually figure out what attribute 37 means for a column name. And, that's only the tip of the iceberg. So, being able to automatically detect what you have, automatically deduce where, what's consuming the data, you know upstream and downstream, being able to understand all of the things related to the lifecycle of your data backup, archive, deletion, it is key. And so, having good toolage areas is very important. >> So Patrick, obviously you participate in the store piece of this picture. So, I wondered if you could just talk more specifically about that, but I'm also interested in how you affect the whole system view, the end-to-end cycle time. >> Yeah, I think Lester kind of hit the nail on the head in terms of the importance of automation. Because, the data volumes are just so massive now that you can't effectively manage or understand or catalog your data without automation. Once you understand the data and the value of the data, then that's where you can work out where the data needs to be at any point in time. >> Right, so Pure and Cohesity obviously partnered to do that and of course, Ezat you guys are part of the protect, you're certainly part of the retain. But also, you provide data management capabilities and analytics, I wonder if you could add some color there? >> Yeah absolutely, so like you said you know, we focus pretty heavily on data protection as just one of our areas. And, that infrastructure it is just sitting there really can you know, the legacy infrastructure it's just sitting there you know, consuming power, space, cooling and pretty inefficient. And, automating that process is a key part of that. If I have a modern day platform such as you know, the Cohesity data platform I can actually do a lot of analytics on that through applications. So, we have a marketplace for apps. >> I wonder if we could talk about metadata. It's increasingly important you know, metadata is data about the data. But, Lester maybe explain why it's so important and what role it plays in terms of creating smart data lifecycle. >> A lot of people think it's just about the data itself. But, there's a lot of extended characteristics about your data. So, imagine if for my data lifecycle I can communicate with the backup system from Cohesity. And, find out when the last time that data was backed up or where it's backed up to. I can communicate, exchange data with Pure Storage and find out what tier it's on. Is the data at the right tier commencer with it's use level? If I could point it out. And, being able to share that metadata across systems. I think that's the direction that we're going in. Right now, we're at the stage we're just identifying the metadata and trying to bring it together and catalog it. The next stage will be okay, using the APIs and that we have between our systems. Can we communicate and share that data and build good solutions for customers to use? >> I think it's a huge point that you just made, I mean you know 10 years ago, automating classification was the big problem. And you know, with machine intelligence you know, we're obviously attacking that. But, your point about as machines start communicating to each other and you start you know, it's cloud to cloud. There's all kinds of metadata, kind of new metadata that's being created. I often joke that some day there's going to be more metadata than data. So, that brings us to cloud and Ezat, I'd like to start with you. >> You know, I do think that you know, having the cloud is a great thing. And, it has got its role to play and you can have many different you know, permutations and iterations of how you use it. And, you know, as I've may have sort of mentioned previously you know, I've seen customers go into the cloud very, very quickly and actually recently they're starting to remove workloads from the cloud. And, the reason why this happens is that you know, cloud has got its role to play but it's not right for absolutely everything. Especially in their current form, as well. A good analogy I like to use and this may sound a little bit clique but you know, when you compare clouds versus on premises data centers. You can use the analogies of houses and hotels. So, to give you an idea, so you know, when we look at hotels that's like the equivalent of a cloud, right? I can get everything I need from there. I can get my food, my water, my outdoor facilities, if I need to accommodate more people, I can rent some more rooms. I don't have to maintain the hotel, it's all done for me. When you look at houses the equivalent to you know, on premises infrastructure. I pretty much have to do everything myself, right? So, I have to purchase the house, I have to maintain it, I have buy my own food and water, eat it, I have to make improvements myself. But, then why do we all live in houses, not in hotels? And, the simple answer that I can only think of is, is that it's cheaper, right? It's cheaper to do it myself, but that's not to say that hotels haven't got their role to play. You know, so for example if I've got loads of visitors coming over for the weekend, I'm not going to go and build an extension to my house, just for them. I will burst into my hotel, into the cloud. And, you use it for you know, for things like that. So, what I'm really saying is the cloud is great for many things, but it can work out costlier for certain applications, while others are a perfect fit. >> That's an interesting analogy, I hadn't thought of that before. But, you're right, 'cause I was going to say well part of it is you want the cloud experience everywhere. But, you don't always want the cloud experience, especially you know, when you're with your family, you want certain privacy. I've not heard that before, Ezat. So, that's a new perspective, so thank you. But, Patrick I do want to come back to that cloud experience because in fact that's what's happening in a lot of cases. Organizations are extending the cloud properties of automation on-prem. >> Yeah, I thought Ezat brought up a really interesting point and a great analogy for the use of the public cloud. And, it really reinforces the importance of the Hybrid and the multicloud environment. Because, it gives you that flexibility to choose where is the optimal environment to run your business workloads. And, that's what it's all about. And, the flexibility to change which environment you're running in, either from one month to the next or from one year to the next. Because, workloads change and the characteristics that are available in the cloud change. The Hybrid cloud is something that we've lived with ourselves at Pure. So, our Pure management technology actually sits in a Hybrid cloud environment. We started off entirely cloud native but now, we use the public cloud for compute and we use our own technology at the end of a high performance network link to support our data platform. So, we're getting the best of both worlds. I think that's where a lot of our customers are trying to get to. >> All right, I want to come back in a moment there. But before we do, Lester I wonder if we could talk a little bit about compliance and governance and privacy. I think the Brits on this panel, we're still in the EU for now but the EU are looking at new rules, new regulations going beyond GDPR. Where does sort of privacy, governance, compliance fit in for the data lifecycle. And Ezat, I want your thought on this as well? >> Ah yeah, this is a very important point because the landscape for compliance around data privacy and data retention is changing very rapidly. And, being able to keep up with those changing regulations in an automated fashion is the only way you're going to be able to do it. Even, I think there's a some sort of a maybe ruling coming out today or tomorrow with a change to GDPR. So, this is, these are all very key points and being able to codify those rules into some software whether you know, Io-Tahoe or your storage system or Cohesity, it'll help you be compliant is crucial. >> Yeah, Ezat anything you can add there, I mean this really is your wheel house? >> Yeah, absolutely, so you know, I think anybody who's watching this probably has gotten the message that you know, less silos is better. And, it absolutely it also applies to data in the cloud, as well. So you know, by aiming to consolidate into you know, fewer platforms customers can realize a lot better control over their data. And, the natural affect of this is that it makes meeting compliance and governance a lot easier. So, when it's consolidated you can start to confidently understand who's accessing your data, how frequently are they accessing the data. You can also do things like you know, detecting an ominous file access activities and quickly identify potential threats. >> Okay Patrick, we were talking, you talked earlier about storage optimization. We talked to Adam Worthington about the business case, you've got the sort numerator which is the business value and then a denominator which is the cost. And, what's unique about Pure in this regard? >> Yeah, and I think there are multiple dimensions to that. Firstly, if you look at the difference between legacy storage platforms, they used to take up racks or aisles of space in a data center. With flash technology that underpins flash played we effectively switch out racks for rack units. And, it has a big play in terms of data center footprint and the environmentals associated with a data center. If you look at extending out storage efficiencies and the benefits it brings. Just the performance has a direct effect on staff. Whether that's you know, the staff and the simplicity of the platform so that it's easy and efficient to manage. Or, whether it's the efficiency you get from your data scientists who are using the outcomes from the platform and making them more efficient. If you look at some of our customers in the financial space their time to results are improved by 10 or 20 x by switching to our technology. From legacy technologies for their analytics platforms. >> So guys, we've been running you know, CUBE interviews in our studios remotely for the last 120 days. This is probably the first interview I've done where I haven't started off talking about COVID. Lester, I wondered if you could talk about smart data lifecycle and how it fits into this isolation economy and hopefully what will soon be a post-isolation economy? >> Yeah, COVID has dramatically accelerated the data economy. I think you know, first and foremost we've all learned to work at home. I you know, we've all had that experience where you know, people would hum and har about being able to work at home just a couple of days a week. And, here we are working five days a week. That's had a knock on impact to infrastructure to be able to support that. But, going further than that you know, the data economy is all about how a business can leverage their data to compete in this new world order that we are now in. COVID has really been a forcing function to you know, it's probably one of the few good things that have come out of COVID is that we've been forced to adapt. And, it's been an interesting journey and it continues to be so. >> Like Lester said you know, we're seeing huge impact here. You know, working from home has pretty much become the norm now. You know, companies have been forced into making it work. If you look at online retail, that's accelerated dramatically, as well. Unified communications and video conferencing. So, really you know, that the point here is that, yes absolutely we've compressed you know, in the past maybe four months what probably would have taken maybe even five years, maybe 10 years or so. >> We've got to wrap, but so Lester let me ask you, sort of paint a picture of the sort of journey the maturity model that people have to take. You know, if they want to get into it, where do they start and where are they going? Give us that view. >> Yeah, I think first is knowing what you have. If you don't know what you have you can't manage it, you can't control it, you can't secure it, you can't ensure it's compliant. So, that's first and foremost. The second is really you know, ensuring that you're compliant once you know what you have, are you securing it? Are you following the regulatory, the regulations? Are you able to evidence that? How are you storing your data? Are you archiving it? Are you storing it effectively and efficiently? You know, have you, nirvana from my perspective is really getting to a point where you've consolidated your data, you've broken down the silos and you have a virtually self-service environment by which the business can consume and build upon their data. And, really at the end of the day as we said at the beginning, it's all about driving value out of your data. And, automation is key to this journey. >> That's awesome and you've just described like sort of a winning data culture. Lester, Patrick, Ezat, thanks so much for participating in this power panel. >> Thank you, David. >> Thank you. >> All right, so great overview of the steps in the data lifecyle and how to inject smarts into the processes, really to drive business outcomes. Now, it's your turn, hop into the crowd chat. Please log in with Twitter or LinkedIn or Facebook, ask questions, answer questions and engage with the community. Let's crowd chat! (bright music)

Published Date : Jul 29 2020

SUMMARY :

to you by Io-Tahoe. and give you a chance to ask questions. Enjoy the best this community Adam, good to see you, how Good thank you, I'm sure our of the technologies that we work with. I like to speak to customers about. So, and the types of is from the French of the business through tech. And then, help the customer you know, to identify how can you that you can share with us, and reduce the amount of Adam, give us the final thoughts, the kind of benefits to and the automation capabilities, thank you very much Dave and go deeper into the technologies on the cloud of your choice. he's the Chief Technology I wonder if each of you So, really that's finding all that you can Give us you know, your in the data that they're and the number one problem And, just to be clear you know, I mean people you know, they is part of the problems you need to chase. from reducing the end to end cycle times. making sure that you can process And, has the ability to action changes So, Lester what do you see as some of And, having the right tools to help you Yeah so, over to Patrick, so you know, So, you have different technologies and the concept of the data hub. the challenges is you know, the analysts, you know to you know, reduce the copies, And, that's only the tip of the iceberg. in the store piece of this picture. the data needs to be at any point in time. and analytics, I wonder if you it's just sitting there you know, It's increasingly important you know, And, being able to share to each other and you start So, to give you an idea, so you know, especially you know, when And, the flexibility to change compliance fit in for the data lifecycle. in an automated fashion is the only way You can also do things like you know, about the business case, Whether that's you know, you know, CUBE interviews forcing function to you know, So, really you know, that of the sort of journey And, really at the end of the day for participating in this power panel. the processes, really to

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
PatrickPERSON

0.99+

DavidPERSON

0.99+

Ezat DayehPERSON

0.99+

DavePERSON

0.99+

Adam WorthingtonPERSON

0.99+

Patrick SmithPERSON

0.99+

AdamPERSON

0.99+

EzatPERSON

0.99+

80%QUANTITY

0.99+

10QUANTITY

0.99+

second episodeQUANTITY

0.99+

Blaise PascalPERSON

0.99+

53%QUANTITY

0.99+

five yearsQUANTITY

0.99+

tomorrowDATE

0.99+

10 yearsQUANTITY

0.99+

EUORGANIZATION

0.99+

sixth yearQUANTITY

0.99+

Io-TahoeORGANIZATION

0.99+

EthosORGANIZATION

0.99+

North StarORGANIZATION

0.99+

LesterPERSON

0.99+

CohesityORGANIZATION

0.99+

secondQUANTITY

0.99+

both sidesQUANTITY

0.99+

first interviewQUANTITY

0.99+

eachQUANTITY

0.99+

firstQUANTITY

0.99+

one monthQUANTITY

0.99+

Lester WatersPERSON

0.99+

GDPRTITLE

0.98+

todayDATE

0.98+

FirstlyQUANTITY

0.98+

one yearQUANTITY

0.98+

15 copiesQUANTITY

0.98+

LinkedInORGANIZATION

0.98+

FirstQUANTITY

0.98+

TodayDATE

0.98+

20 xQUANTITY

0.98+

oneQUANTITY

0.98+

bothQUANTITY

0.97+

10 years agoDATE

0.97+

four monthsQUANTITY

0.97+

five days a weekQUANTITY

0.97+

SecondlyQUANTITY

0.97+

FacebookORGANIZATION

0.97+

both worldsQUANTITY

0.97+

TwitterORGANIZATION

0.97+

threeQUANTITY

0.96+

OneQUANTITY

0.96+

Pure StorageORGANIZATION

0.95+

LesterORGANIZATION

0.94+

20%QUANTITY

0.94+

PureORGANIZATION

0.93+

fourthQUANTITY

0.93+

Enterprise Data Automation | Crowdchat


 

>>from around the globe. It's the Cube with digital coverage of enterprise data automation, an event Siri's brought to you by Iot. Tahoe Welcome everybody to Enterprise Data Automation. Ah co created digital program on the Cube with support from my hotel. So my name is Dave Volante. And today we're using the hashtag data automated. You know, organizations. They really struggle to get more value out of their data, time to data driven insights that drive cost savings or new revenue opportunities. They simply take too long. So today we're gonna talk about how organizations can streamline their data operations through automation, machine intelligence and really simplifying data migrations to the cloud. We'll be talking to technologists, visionaries, hands on practitioners and experts that are not just talking about streamlining their data pipelines. They're actually doing it. So keep it right there. We'll be back shortly with a J ahora who's the CEO of Iot Tahoe to kick off the program. You're watching the Cube, the leader in digital global coverage. We're right back right after this short break. Innovation impact influence. Welcome to the Cube disruptors. Developers and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on the Cube, your global leader. High tech digital coverage from around the globe. It's the Cube with digital coverage of enterprise, data, automation and event. Siri's brought to you by Iot. Tahoe. Okay, we're back. Welcome back to Data Automated. A J ahora is CEO of I O ta ho, JJ. Good to see how things in London >>Thanks doing well. Things in, well, customers that I speak to on day in, day out that we partner with, um, they're busy adapting their businesses to serve their customers. It's very much a game of ensuring the week and serve our customers to help their customers. Um, you know, the adaptation that's happening here is, um, trying to be more agile. Got to be more flexible. Um, a lot of pressure on data, a lot of demand on data and to deliver more value to the business, too. So that customers, >>as I said, we've been talking about data ops a lot. The idea being Dev Ops applied to the data pipeline, But talk about enterprise data automation. What is it to you. And how is it different from data off >>Dev Ops, you know, has been great for breaking down those silos between different roles functions and bring people together to collaborate. Andi, you know, we definitely see that those tools, those methodologies, those processes, that kind of thinking, um, lending itself to data with data is exciting. We look to do is build on top of that when data automation, it's the it's the nuts and bolts of the the algorithms, the models behind machine learning that the functions. That's where we investors, our r and d on bringing that in to build on top of the the methods, the ways of thinking that break down those silos on injecting that automation into the business processes that are going to drive a business to serve its customers. It's, um, a layer beyond Dev ops data ops. They can get to that point where well, I think about it is is the automation behind new dimension. We've come a long way in the last few years. Boy is, we started out with automating some of those simple, um, to codify, um, I have a high impact on organization across the data a cost effective way house. There's data related tasks that classify data on and a lot of our original pattern certain people value that were built up is is very much around that >>love to get into the tech a little bit in terms of how it works. And I think we have a graphic here that gets into that a little bit. So, guys, if you bring that up, >>sure. I mean right there in the middle that the heart of what we do it is, you know, the intellectual property now that we've built up over time that takes from Hacha genius data sources. Your Oracle Relational database. Short your mainframe. It's a lay and increasingly AP eyes and devices that produce data and that creates the ability to automatically discover that data. Classify that data after it's classified. Them have the ability to form relationships across those different source systems, silos, different lines of business. And once we've automated that that we can start to do some cool things that just puts of contact and meaning around that data. So it's moving it now from bringing data driven on increasingly where we have really smile, right people in our customer organizations you want I do some of those advanced knowledge tasks data scientists and ah, yeah, quants in some of the banks that we work with, the the onus is on, then, putting everything we've done there with automation, pacifying it, relationship, understanding that equality, the policies that you can apply to that data. I'm putting it in context once you've got the ability to power. Okay, a professional is using data, um, to be able to put that data and contacts and search across the entire enterprise estate. Then then they can start to do some exciting things and piece together the the tapestry that fabric across that different system could be crm air P system such as s AP and some of the newer brown databases that we work with. Snowflake is a great well, if I look back maybe five years ago, we had prevalence of daily technologies at the cutting edge. Those are converging to some of the cloud platforms that we work with Google and AWS and I think very much is, as you said it, those manual attempts to try and grasp. But it is such a complex challenges scale quickly runs out of steam because once, once you've got your hat, once you've got your fingers on the details Oh, um, what's what's in your data state? It's changed, You know, you've onboard a new customer. You signed up a new partner. Um, customer has, you know, adopted a new product that you just Lawrence and there that that slew of data keeps coming. So it's keeping pace with that. The only answer really is is some form of automation >>you're working with AWS. You're working with Google, You got red hat. IBM is as partners. What is attracting those folks to your ecosystem and give us your thoughts on the importance of ecosystem? >>That's fundamental. So, I mean, when I caimans where you tell here is the CEO of one of the, um, trends that I wanted us CIO to be part of was being open, having an open architecture allowed one thing that was close to my heart, which is as a CEO, um, a c i o where you go, a budget vision on and you've already made investments into your organization, and some of those are pretty long term bets. They should be going out 5 10 years, sometimes with the CRM system training up your people, getting everybody working together around a common business platform. What I wanted to ensure is that we could openly like it using AP eyes that were available, the love that some investment on the cost that has already gone into managing in organizations I t. But business users to before. So part of the reason why we've been able to be successful with, um, the partners like Google AWS and increasingly, a number of technology players. That red hat mongo DB is another one where we're doing a lot of good work with, um and snowflake here is, um Is those investments have been made by the organizations that are our customers, and we want to make sure we're adding to that. And they're leveraging the value that they've already committed to. >>Yeah, and maybe you could give us some examples of the r A y and the business impact. >>Yeah, I mean, the r a y David is is built upon on three things that I mentioned is a combination off. You're leveraging the existing investment with the existing estate, whether that's on Microsoft Azure or AWS or Google, IBM, and I'm putting that to work because, yeah, the customers that we work with have had made those choices. On top of that, it's, um, is ensuring that we have got the automation that is working right down to the level off data, a column level or the file level we don't do with meta data. It is being very specific to be at the most granular level. So as we've grown our processes and on the automation, gasification tagging, applying policies from across different compliance and regulatory needs that an organization has to the data, everything that then happens downstream from that is ready to serve a business outcome now without hoping out which run those processes within hours of getting started And, um, Bill that picture, visualize that picture and bring it to life. You know, the PR Oh, I that's off the bat with finding data that should have been deleted data that was copies off on and being able to allow the architect whether it's we're working on GCB or a migration to any other clouds such as AWS or a multi cloud landscape right off the map. >>A. J. Thanks so much for coming on the Cube and sharing your insights and your experience is great to have you. >>Thank you, David. Look who is smoking in >>now. We want to bring in the customer perspective. We have a great conversation with Paul Damico, senior vice president data architecture, Webster Bank. So keep it right there. >>Utah Data automated Improve efficiency, Drive down costs and make your enterprise data work for you. Yeah, we're on a mission to enable our customers to automate the management of data to realise maximum strategic and operational benefits. We envisage a world where data users consume accurate, up to date unified data distilled from many silos to deliver transformational outcomes, activate your data and avoid manual processing. Accelerate data projects by enabling non I t resources and data experts to consolidate categorize and master data. Automate your data operations Power digital transformations by automating a significant portion of data management through human guided machine learning. Yeah, get value from the start. Increase the velocity of business outcomes with complete accurate data curated automatically for data, visualization tours and analytic insights. Improve the security and quality of your data. Data automation improves security by reducing the number of individuals who have access to sensitive data, and it can improve quality. Many companies report double digit era reduction in data entry and other repetitive tasks. Trust the way data works for you. Data automation by our Tahoe learns as it works and can ornament business user behavior. It learns from exception handling and scales up or down is needed to prevent system or application overloads or crashes. It also allows for innate knowledge to be socialized rather than individualized. No longer will your companies struggle when the employee who knows how this report is done, retires or takes another job, the work continues on without the need for detailed information transfer. Continue supporting the digital shift. Perhaps most importantly, data automation allows companies to begin making moves towards a broader, more aspirational transformation, but on a small scale but is easy to implement and manage and delivers quick wins. Digital is the buzzword of the day, but many companies recognized that it is a complex strategy requires time and investment. Once you get started with data automation, the digital transformation initiated and leaders and employees alike become more eager to invest time and effort in a broader digital transformational agenda. Yeah, >>everybody, we're back. And this is Dave Volante, and we're covering the whole notion of automating data in the Enterprise. And I'm really excited to have Paul Damico here. She's a senior vice president of enterprise Data Architecture at Webster Bank. Good to see you. Thanks for coming on. >>Nice to see you too. Yes. >>So let's let's start with Let's start with Webster Bank. You guys are kind of a regional. I think New York, New England, uh, leave headquartered out of Connecticut, but tell us a little bit about the >>bank. Yeah, Webster Bank is regional, Boston. And that again in New York, Um, very focused on in Westchester and Fairfield County. Um, they're a really highly rated bank regional bank for this area. They, um, hold, um, quite a few awards for the area for being supportive for the community. And, um, are really moving forward. Technology lives. Currently, today we have, ah, a small group that is just working toward moving into a more futuristic, more data driven data warehouse. That's our first item. And then the other item is to drive new revenue by anticipating what customers do when they go to the bank or when they log into there to be able to give them the best offer. The only way to do that is you have timely, accurate, complete data on the customer and what's really a great value on off something to offer that >>at the top level, what were some of what are some of the key business drivers there catalyzing your desire for change >>the ability to give the customer what they need at the time when they need it? And what I mean by that is that we have, um, customer interactions and multiple weights, right? And I want to be able for the customer, too. Walk into a bank, um, or online and see the same the same format and being able to have the same feel, the same look and also to be able to offer them the next best offer for them. >>Part of it is really the cycle time, the end end cycle, time that you're pressing. And then there's if I understand it, residual benefits that are pretty substantial from a revenue opportunity >>exactly. It's drive new customers, Teoh new opportunities. It's enhanced the risk, and it's to optimize the banking process and then obviously, to create new business. Um, and the only way we're going to be able to do that is that we have the ability to look at the data right when the customer walks in the door or right when they open up their app. >>Do you see the potential to increase the data sources and hence the quality of the data? Or is that sort of premature? >>Oh, no. Um, exactly. Right. So right now we ingest a lot of flat files and from our mainframe type of runnin system that we've had for quite a few years. But now that we're moving to the cloud and off Prem and on France, you know, moving off Prem into, like, an s three bucket Where that data king, we can process that data and get that data faster by using real time tools to move that data into a place where, like, snowflake Good, um, utilize that data or we can give it out to our market. The data scientists are out in the lines of business right now, which is great, cause I think that's where data science belongs. We should give them on, and that's what we're working towards now is giving them more self service, giving them the ability to access the data in a more robust way. And it's a single source of truth. So they're not pulling the data down into their own like tableau dashboards and then pushing the data back out. I have eight engineers, data architects, they database administrators, right, um, and then data traditional data forwarding people, Um, and because some customers that I have that our business customers lines of business, they want to just subscribe to a report. They don't want to go out and do any data science work. Um, and we still have to provide that. So we still want to provide them some kind of read regiment that they wake up in the morning and they open up their email. And there's the report that they just drive, um, which is great. And it works out really well. And one of the things. This is why we purchase I o waas. I would have the ability to give the lines of business the ability to do search within the data, and we read the data flows and data redundancy and things like that and help me cleanup the data and also, um, to give it to the data. Analysts who say All right, they just asked me. They want this certain report and it used to take Okay, well, we're gonna four weeks, we're going to go. We're gonna look at the data, and then we'll come back and tell you what we dio. But now with Iot Tahoe, they're able to look at the data and then, in one or two days of being able to go back and say, Yes, we have data. This is where it is. This is where we found that this is the data flows that we've found also, which is what I call it is the birth of a column. It's where the calm was created and where it went live as a teenager. And then it went to, you know, die very archive. >>In researching Iot Tahoe, it seems like one of the strengths of their platform is the ability to visualize data the data structure, and actually dig into it. But also see it, um, and that speeds things up and gives everybody additional confidence. And then the other pieces essentially infusing ai or machine intelligence into the data pipeline is really how you're attacking automation, right? >>Exactly. So you're able to let's say that I have I have seven cause lines of business that are asking me questions. And one of the questions I'll ask me is, um, we want to know if this customer is okay to contact, right? And you know, there's different avenues so you can go online to go. Do not contact me. You can go to the bank And you could say, I don't want, um, email, but I'll take tests and I want, you know, phone calls. Um, all that information. So seven different lines of business asked me that question in different ways once said Okay to contact the other one says, You know, just for one to pray all these, you know, um, and each project before I got there used to be siloed. So one customer would be 100 hours for them to do that and analytical work, and then another cut. Another of analysts would do another 100 hours on the other project. Well, now I can do that all at once, and I can do those type of searches and say yes we already have that documentation. Here it is. And this is where you can find where the customer has said, You know, you don't want I don't want to get access from you by email, or I've subscribed to get emails from you. I'm using Iot typos eight automation right now to bring in the data and to start analyzing the data close to make sure that I'm not missing anything and that I'm not bringing over redundant data. Um, the data warehouse that I'm working off is not, um a It's an on prem. It's an oracle database. Um, and it's 15 years old, so it has extra data in it. It has, um, things that we don't need anymore. And Iot. Tahoe's helping me shake out that, um, extra data that does not need to be moved into my S three. So it's saving me money when I'm moving from offering on Prem. >>What's your vision or your your data driven organization? >>Um, I want for the bankers to be able to walk around with on iPad in their hands and be able to access data for that customer really fast and be able to give them the best deal that they can get. I want Webster to be right there on top, with being able to add new customers and to be able to serve our existing customers who had bank accounts. Since you were 12 years old there and now our, you know, multi. Whatever. Um, I want them to be able to have the best experience with our our bankers. >>That's really what I want is a banking customer. I want my bank to know who I am, anticipate my needs and create a great experience for me. And then let me go on with my life. And so that's a great story. Love your experience, your background and your knowledge. Can't thank you enough for coming on the Cube. >>No, thank you very much. And you guys have a great day. >>Next, we'll talk with Lester Waters, who's the CTO of Iot Toe cluster takes us through the key considerations of moving to the cloud. >>Yeah, right. The entire platform Automated data Discovery data Discovery is the first step to knowing your data auto discover data across any application on any infrastructure and identify all unknown data relationships across the entire siloed data landscape. smart data catalog. Know how everything is connected? Understand everything in context, regained ownership and trust in your data and maintain a single source of truth across cloud platforms, SAS applications, reference data and legacy systems and power business users to quickly discover and understand the data that matters to them with a smart data catalog continuously updated ensuring business teams always have access to the most trusted data available. Automated data mapping and linking automate the identification of unknown relationships within and across data silos throughout the organization. Build your business glossary automatically using in house common business terms, vocabulary and definitions. Discovered relationships appears connections or dependencies between data entities such as customer account, address invoice and these data entities have many discovery properties. At a granular level, data signals dashboards. Get up to date feeds on the health of your data for faster improved data management. See trends, view for history. Compare versions and get accurate and timely visual insights from across the organization. Automated data flows automatically captured every data flow to locate all the dependencies across systems. Visualize how they work together collectively and know who within your organization has access to data. Understand the source and destination for all your business data with comprehensive data lineage constructed automatically during with data discovery phase and continuously load results into the smart Data catalog. Active, geeky automated data quality assessments Powered by active geek You ensure data is fit for consumption that meets the needs of enterprise data users. Keep information about the current data quality state readily available faster Improved decision making Data policy. Governor Automate data governance End to end over the entire data lifecycle with automation, instant transparency and control Automate data policy assessments with glossaries, metadata and policies for sensitive data discovery that automatically tag link and annotate with metadata to provide enterprise wide search for all lines of business self service knowledge graph Digitize and search your enterprise knowledge. Turn multiple siloed data sources into machine Understandable knowledge from a single data canvas searching Explore data content across systems including GRP CRM billing systems, social media to fuel data pipelines >>Yeah, yeah, focusing on enterprise data automation. We're gonna talk about the journey to the cloud Remember, the hashtag is data automate and we're here with Leicester Waters. Who's the CTO of Iot Tahoe? Give us a little background CTO, You've got a deep, deep expertise in a lot of different areas. But what do we need to know? >>Well, David, I started my career basically at Microsoft, uh, where I started the information Security Cryptography group. They're the very 1st 1 that the company had, and that led to a career in information, security. And and, of course, as easy as you go along with information security data is the key element to be protected. Eso I always had my hands and data not naturally progressed into a roll out Iot talk was their CTO. >>What's the prescription for that automation journey and simplifying that migration to the cloud? >>Well, I think the first thing is understanding what you've got. So discover and cataloging your data and your applications. You know, I don't know what I have. I can't move it. I can't. I can't improve it. I can't build upon it. And I have to understand there's dependence. And so building that data catalog is the very first step What I got. Okay, >>so So we've done the audit. We know we've got what's what's next? Where do we go >>next? So the next thing is remediating that data you know, where do I have duplicate data? I may have often times in an organization. Uh, data will get duplicated. So somebody will take a snapshot of the data, you know, and then end up building a new application, which suddenly becomes dependent on that data. So it's not uncommon for an organization of 20 master instances of a customer, and you can see where that will go. And trying to keep all that stuff in sync becomes a nightmare all by itself. So you want to sort of understand where all your redundant data is? So when you go to the cloud, maybe you have an opportunity here to do you consolidate that that data, >>then what? You figure out what to get rid of our actually get rid of it. What's what's next? >>Yes, yes, that would be the next step. So figure out what you need. What, you don't need you Often times I've found that there's obsolete columns of data in your databases that you just don't need. Or maybe it's been superseded by another. You've got tables have been superseded by other tables in your database, so you got to kind of understand what's being used and what's not. And then from that, you can decide. I'm gonna leave this stuff behind or I'm gonna I'm gonna archive this stuff because I might need it for data retention where I'm just gonna delete it. You don't need it. All were >>plowing through your steps here. What's next on the >>journey? The next one is is in a nutshell. Preserve your data format. Don't. Don't, Don't. Don't boil the ocean here at music Cliche. You know, you you want to do a certain degree of lift and shift because you've got application dependencies on that data and the data format, the tables in which they sent the columns and the way they're named. So some degree, you are gonna be doing a lift and ship, but it's an intelligent lift and ship. The >>data lives in silos. So how do you kind of deal with that? Problem? Is that is that part of the journey? >>That's that's great pointed because you're right that the data silos happen because, you know, this business unit is start chartered with this task. Another business unit has this task and that's how you get those in stance creations of the same data occurring in multiple places. So you really want to is part of your cloud migration. You really want a plan where there's an opportunity to consolidate your data because that means it will be less to manage. Would be less data to secure, and it will be. It will have a smaller footprint, which means reduce costs. >>But maybe you could address data quality. Where does that fit in on the >>journey? That's that's a very important point, you know. First of all, you don't want to bring your legacy issues with U. S. As the point I made earlier. If you've got data quality issues, this is a good time to find those and and identify and remediate them. But that could be a laborious task, and you could probably accomplish. It will take a lot of work. So the opportunity used tools you and automate that process is really will help you find those outliers that >>what's next? I think we're through. I think I've counted six. What's the What's the lucky seven >>Lucky seven involved your business users. Really, When you think about it, you're your data is in silos, part of part of this migration to cloud as an opportunity to break down the silos. These silence that naturally occurs are the business. You, uh, you've got to break these cultural barriers that sometimes exists between business and say so. For example, I always advise there's an opportunity year to consolidate your sensitive data. Your P I. I personally identifiable information and and three different business units have the same source of truth From that, there's an opportunity to consolidate that into one. >>Well, great advice, Lester. Thanks so much. I mean, it's clear that the Cap Ex investments on data centers they're generally not a good investment for most companies. Lester really appreciate Lester Water CTO of Iot Tahoe. Let's watch this short video and we'll come right back. >>Use cases. Data migration. Accelerate digitization of business by providing automated data migration work flows that save time in achieving project milestones. Eradicate operational risk and minimize labor intensive manual processes that demand costly overhead data quality. You know the data swamp and re establish trust in the data to enable data signs and Data analytics data governance. Ensure that business and technology understand critical data elements and have control over the enterprise data landscape Data Analytics ENABLEMENT Data Discovery to enable data scientists and Data Analytics teams to identify the right data set through self service for business demands or analytical reporting that advanced too complex regulatory compliance. Government mandated data privacy requirements. GDP Our CCP, A, e, p, R HIPPA and Data Lake Management. Identify late contents cleanup manage ongoing activity. Data mapping and knowledge graph Creates BKG models on business enterprise data with automated mapping to a specific ontology enabling semantic search across all sources in the data estate data ops scale as a foundation to automate data management presences. >>Are you interested in test driving the i o ta ho platform Kickstart the benefits of data automation for your business through the Iot Labs program? Ah, flexible, scalable sandbox environment on the cloud of your choice with set up service and support provided by Iot. Top Click on the link and connect with the data engineer to learn more and see Iot Tahoe in action. Everybody, we're back. We're talking about enterprise data automation. The hashtag is data automated and we're going to really dig into data migrations, data migrations. They're risky, they're time consuming and they're expensive. Yousef con is here. He's the head of partnerships and alliances at I o ta ho coming again from London. Hey, good to see you, Seth. Thanks very much. >>Thank you. >>So let's set up the problem a little bit. And then I want to get into some of the data said that migration is a risky, time consuming, expensive. They're they're often times a blocker for organizations to really get value out of data. Why is that? >>I think I mean, all migrations have to start with knowing the facts about your data. Uh, and you can try and do this manually. But when you have an organization that may have been going for decades or longer, they will probably have a pretty large legacy data estate so that I have everything from on premise mainframes. They may have stuff which is probably in the cloud, but they probably have hundreds, if not thousands of applications and potentially hundreds of different data stores. >>So I want to dig into this migration and let's let's pull up graphic. It will talk about We'll talk about what a typical migration project looks like. So what you see, here it is. It's very detailed. I know it's a bit of an eye test, but let me call your attention to some of the key aspects of this, uh and then use if I want you to chime in. So at the top here, you see that area graph that's operational risk for a typical migration project, and you can see the timeline and the the milestones That Blue Bar is the time to test so you can see the second step. Data analysis. It's 24 weeks so very time consuming, and then let's not get dig into the stuff in the middle of the fine print. But there's some real good detail there, but go down the bottom. That's labor intensity in the in the bottom, and you can see hi is that sort of brown and and you could see a number of data analysis data staging data prep, the trial, the implementation post implementation fixtures, the transition to be a Blu, which I think is business as usual. >>The key thing is, when you don't understand your data upfront, it's very difficult to scope to set up a project because you go to business stakeholders and decision makers, and you say Okay, we want to migrate these data stores. We want to put them in the cloud most often, but actually, you probably don't know how much data is there. You don't necessarily know how many applications that relates to, you know, the relationships between the data. You don't know the flow of the basis of the direction in which the data is going between different data stores and tables. So you start from a position where you have pretty high risk and probably the area that risk you could be. Stack your project team of lots and lots of people to do the next phase, which is analysis. And so you set up a project which has got a pretty high cost. The big projects, more people, the heavy of governance, obviously on then there, then in the phase where they're trying to do lots and lots of manual analysis, um, manual processes, as we all know, on the layer of trying to relate data that's in different grocery stores relating individual tables and columns, very time consuming, expensive. If you're hiring in resource from consultants or systems integrators externally, you might need to buy or to use party tools. Aziz said earlier the people who understand some of those systems may have left a while ago. CEO even higher risks quite cost situation from the off on the same things that have developed through the project. Um, what are you doing with Ayatollah? Who is that? We're able to automate a lot of this process from the very beginning because we can do the initial data. Discovery run, for example, automatically you very quickly have an automated validator. A data met on the data flow has been generated automatically, much less time and effort and much less cars stopped. >>Yeah. And now let's bring up the the the same chart. But with a set of an automation injection in here and now. So you now see the sort of Cisco said accelerated by Iot, Tom. Okay, great. And we're gonna talk about this, but look, what happens to the operational risk. A dramatic reduction in that, That that graph and then look at the bars, the bars, those blue bars. You know, data analysis went from 24 weeks down to four weeks and then look at the labor intensity. The it was all these were high data analysis, data staging data prep trialling post implementation fixtures in transition to be a you all those went from high labor intensity. So we've now attacked that and gone to low labor intensity. Explain how that magic happened. >>I think that the example off a data catalog. So every large enterprise wants to have some kind of repository where they put all their understanding about their data in its price States catalog. If you like, imagine trying to do that manually, you need to go into every individual data store. You need a DB, a business analyst, reach data store. They need to do an extract of the data. But it on the table was individually they need to cross reference that with other data school, it stores and schemers and tables you probably with the mother of all Lock Excel spreadsheets. It would be a very, very difficult exercise to do. I mean, in fact, one of our reflections as we automate lots of data lots of these things is, um it accelerates the ability to water may, But in some cases, it also makes it possible for enterprise customers with legacy systems take banks, for example. There quite often end up staying on mainframe systems that they've had in place for decades. I'm not migrating away from them because they're not able to actually do the work of understanding the data, duplicating the data, deleting data isn't relevant and then confidently going forward to migrate. So they stay where they are with all the attendant problems assistance systems that are out of support. You know, you know, the biggest frustration for lots of them and the thing that they spend far too much time doing is trying to work out what the right data is on cleaning data, which really you don't want a highly paid thanks to scientists doing with their time. But if you sort out your data in the first place, get rid of duplication that sounds migrate to cloud store where things are really accessible. It's easy to build connections and to use native machine learning tools. You well, on the way up to the maturity card, you can start to use some of the more advanced applications >>massive opportunities not only for technology companies, but for those organizations that can apply technology for business. Advantage yourself, count. Thanks so much for coming on the Cube. Much appreciated. Yeah, yeah, yeah, yeah

Published Date : Jun 23 2020

SUMMARY :

of enterprise data automation, an event Siri's brought to you by Iot. a lot of pressure on data, a lot of demand on data and to deliver more value What is it to you. into the business processes that are going to drive a business to love to get into the tech a little bit in terms of how it works. the ability to automatically discover that data. What is attracting those folks to your ecosystem and give us your thoughts on the So part of the reason why we've IBM, and I'm putting that to work because, yeah, the A. J. Thanks so much for coming on the Cube and sharing your insights and your experience is great to have Look who is smoking in We have a great conversation with Paul Increase the velocity of business outcomes with complete accurate data curated automatically And I'm really excited to have Paul Damico here. Nice to see you too. So let's let's start with Let's start with Webster Bank. complete data on the customer and what's really a great value the ability to give the customer what they need at the Part of it is really the cycle time, the end end cycle, time that you're pressing. It's enhanced the risk, and it's to optimize the banking process and to the cloud and off Prem and on France, you know, moving off Prem into, In researching Iot Tahoe, it seems like one of the strengths of their platform is the ability to visualize data the You know, just for one to pray all these, you know, um, and each project before data for that customer really fast and be able to give them the best deal that they Can't thank you enough for coming on the Cube. And you guys have a great day. Next, we'll talk with Lester Waters, who's the CTO of Iot Toe cluster takes Automated data Discovery data Discovery is the first step to knowing your We're gonna talk about the journey to the cloud Remember, the hashtag is data automate and we're here with Leicester Waters. data is the key element to be protected. And so building that data catalog is the very first step What I got. Where do we go So the next thing is remediating that data you know, You figure out what to get rid of our actually get rid of it. And then from that, you can decide. What's next on the You know, you you want to do a certain degree of lift and shift Is that is that part of the journey? So you really want to is part of your cloud migration. Where does that fit in on the So the opportunity used tools you and automate that process What's the What's the lucky seven there's an opportunity to consolidate that into one. I mean, it's clear that the Cap Ex investments You know the data swamp and re establish trust in the data to enable Top Click on the link and connect with the data for organizations to really get value out of data. Uh, and you can try and milestones That Blue Bar is the time to test so you can see the second step. have pretty high risk and probably the area that risk you could be. to be a you all those went from high labor intensity. But it on the table was individually they need to cross reference that with other data school, Thanks so much for coming on the Cube.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavidPERSON

0.99+

Dave VolantePERSON

0.99+

Paul DamicoPERSON

0.99+

Paul DamicoPERSON

0.99+

IBMORGANIZATION

0.99+

AzizPERSON

0.99+

Webster BankORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

WestchesterLOCATION

0.99+

AWSORGANIZATION

0.99+

24 weeksQUANTITY

0.99+

SethPERSON

0.99+

LondonLOCATION

0.99+

oneQUANTITY

0.99+

hundredsQUANTITY

0.99+

ConnecticutLOCATION

0.99+

New YorkLOCATION

0.99+

100 hoursQUANTITY

0.99+

iPadCOMMERCIAL_ITEM

0.99+

CiscoORGANIZATION

0.99+

four weeksQUANTITY

0.99+

SiriTITLE

0.99+

thousandsQUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

sixQUANTITY

0.99+

first itemQUANTITY

0.99+

20 master instancesQUANTITY

0.99+

todayDATE

0.99+

second stepQUANTITY

0.99+

S threeCOMMERCIAL_ITEM

0.99+

I o ta hoORGANIZATION

0.99+

first stepQUANTITY

0.99+

Fairfield CountyLOCATION

0.99+

five years agoDATE

0.99+

firstQUANTITY

0.99+

each projectQUANTITY

0.99+

FranceLOCATION

0.98+

two daysQUANTITY

0.98+

Leicester WatersORGANIZATION

0.98+

Iot TahoeORGANIZATION

0.98+

Cap ExORGANIZATION

0.98+

seven causeQUANTITY

0.98+

Lester WatersPERSON

0.98+

5 10 yearsQUANTITY

0.98+

BostonLOCATION

0.97+

IotORGANIZATION

0.97+

TahoeORGANIZATION

0.97+

TomPERSON

0.97+

FirstQUANTITY

0.97+

15 years oldQUANTITY

0.96+

seven different linesQUANTITY

0.96+

single sourceQUANTITY

0.96+

UtahLOCATION

0.96+

New EnglandLOCATION

0.96+

WebsterORGANIZATION

0.95+

12 years oldQUANTITY

0.95+

Iot LabsORGANIZATION

0.95+

Iot. TahoeORGANIZATION

0.95+

1st 1QUANTITY

0.95+

U. S.LOCATION

0.95+

J ahoraORGANIZATION

0.95+

CubeCOMMERCIAL_ITEM

0.94+

PremORGANIZATION

0.94+

one customerQUANTITY

0.93+

OracleORGANIZATION

0.93+

I O ta hoORGANIZATION

0.92+

SnowflakeTITLE

0.92+

sevenQUANTITY

0.92+

singleQUANTITY

0.92+

LesterORGANIZATION

0.91+

Chris Degnan, Snowflake & Anthony Brooks Williams, HVR | AWS re:Invent 2019


 

>>LA Las Vegas. It's the cube hovering AWS reinvent 2019 brought to you by Amazon web services and along with its ecosystem partners. >>Hey, welcome back to the cube. Our day one coverage of AWS reinvent 19 continues. Lisa Martin with Dave Volante. Dave and I have a couple of guests we'd like you to walk up. We've got Anthony Brooks billions, the CEO of HBR back on the cube. You're alumni. We should get you a pin and snowflake alumni. But Chris, your new Chris Dagon, chief revenue officer from snowflake. Chris, welcome to the program. Excited to be here. All right guys. So even though both companies have been on before, Anthony, let's start with you. Give our audience a refresher about HVR, who you guys are at, what you do. >>Sure. So we're in the data integration space, particularly a real time data integration. So we move data to the cloud in the in the most efficient way and we make sure it's secure and it's accurate and you're moving into environments such as snowflake. Um, and that's where we've got some really good customers that we happy to talk about joint custody that we're doing together. But Chris can tell us a little bit about snowflake. >>Sure. And snowflake is a cloud data warehousing company. We are cloud native, we are on AWS or on GCP and we're on Azure. And if you look at the competitive landscape, we compete with our friends at Amazon. We compete with our friends at Microsoft and our friends at Google. So it's super interesting place to be, but it very exciting at the same time and super excited to partner with Anthony and some others who aren't really a friends. That's correct. So I wonder if we could start by just talking about the data warehouse sort of trends that you guys see. When I talk to practitioners in the old days, they used to say to me things like, Oh, infrastructure management, it's such a nightmare. It's like a snake swallowing a basketball every time until it comes out with a new chips. We chase it because we just need more performance and we can't get our jobs done fast enough. And there's only three. There's three guys that we got to go through to get any answers and it was just never really lived up to the promise of 360 degree view of your business and realtime analytics. How has that changed? >>Well, there's that too. I mean obviously the cloud has had a big difference on that illustrious city. Um, what you would find is in, in, in yesterday, customers have these, a retail customer has these big events twice a year. And so to do an analysis on what's being sold and Casper's transactions, they bought this big data warehouse environment for two events a year typically. And so what's happening that's highly cost, highly costly as we know to maintain and then cause the advances in technology and trips and stuff. And then you move into this cloud world which gives you that Lester city of scale up, scale down as you need to. And then particular where we've got Tonies snowflake that is built for that environment and that elicited city. And so you get someone like us that can move this data at today's scale and volume through these techniques we have into an environment that then bleeds into helping them solve the challenge that you talk about of Yesi of >>these big clunky environments. That side, I think you, I think you kind of nailed it. I think like early days. So our founders are from Oracle and they were building Oracle AI nine nine, 10 G. and when I interviewed them I was the first sales rep showing up and day one I'm like, what the heck am I selling? And when I met them I said, tell me what the benefit of snowflake is. And they're like, well at Oracle, and we'd go talk to customers and they'd say, Oracles, you know, I have this problem with Oracle. They'd say, Hey, that's, you know, seven generations ago were Oracle. Do you have an upgraded to the latest code? So one of the things they talked about as being a service, Hey, we want to make it really easy. You never have to upgrade the service. And then to your point around, you have a fixed amount of resources on premise, so you can't all of a sudden if you have a new project, do you want to bring on the first question I asked when I started snowflake to customers was how long does it take you to kick off a net new workload onto your data, onto your Vertica and it take them nine to 12 months because they'd have to go procure the new hardware, install it, and guess what? >>With snowflake, you can make an instantaneous decision and because of our last test city, because the benefits of our partner from Amazon, you can really grow with your demand of your business. >>Many don't have the luxury of nine to 12 months anymore, Chris, because we all know if, if an enterprise legacy business isn't thinking, there's somebody not far behind me who has the elasticity, who has the appetite, who's who understands the opportunity that cloud provides. If you're not thinking that, as auntie Jessie will say, you're going to be on the wrong end of that equation. But for large enterprises, that's hard. The whole change culture is very hard to do. I'd love to get your perspective, Chris, what you're seeing in terms of industries shifting their mindsets to understand the value that they could unlock with this data, but how are big industries legacy industries changing? >>I'd say that, look, we were chasing Amad, we were chasing the cloud providers early days, so five years ago, we're selling to ad tech and online gaming companies today. What's happened in the industry is, and I'll give you a perfect example, is Ben wa and I, one of our founders went out to one of the largest investment banks on wall street five years ago, and they said, and they have more money than God, and they say, Hey, we love what you've built. We love, when are you going to run on premise? And Ben, Ben wa uttered this phrase of, Hey, you will run on the public cloud before we ever run in the private cloud. And guess what? He was a truth teller because five years later, they are one of our largest customers today. And they made the decision to move to the cloud and we're seeing financial services at a blistering face moved to the cloud. >>And that's where, you know, partnering with folks from HR is super important for us because we don't have the ability to just magically have this data appear in the cloud. And that's where we rely quite heavily on on instance. So Anthony, in the financial services world in particular, it used to be a cloud. Never that was an evil word. Automation. No, we have to have full control and in migration, never digital transformation to start to change those things. It's really become an imperative, but it's by in particular is really challenging. So I wonder if we could dig into that a little bit and help us understand how you solve that problem. >>Yes. A customer say they want to adopt some of these technologies. So there's the migration route. They may want to go adopt some of these, these cloud databases, the cloud data warehouses. And so we have some areas where we, you know, we can do that and keep the business up and running at the same time. So the techniques we use are we reading the transactional logs, other databases or something called CDC. And so there'll be an initial transfer of the bulk of the data initiative stantiating or refresh. At that same time we capturing data out of the transaction logs, wildlife systems live and doing a migration to the new environment or into snowflakes world, capturing data where it's happening, where the data is generated and moving that real time securely, accurately into this environment for somewhere like 1-800-FLOWERS where they can do this, make better decisions to say the cost is better at point of sale. >>So have all their business divisions pulling it in. So there's the migration aspects and then there's the, the use case around the realtime reporting as well. So you're essentially refueling the plane. Well while you're in mid air. Um, yeah, that's a good one. So what does the customer see? How disruptive is it? How do you minimize that disruption? Well, the good thing is, well we've all got these experienced teams like Chris said that have been around the block and a lot of us have done this. What we do, what ed days fail for the last 15 years, that companies like golden gate that we sold to Oracle and those things. And so there's a whole consultative approach to them versus just here's some software, good luck with it. So there's that aspect where there's a lot of planning that goes into that and then through that using our technologies that are well suited to this Appleton shows some good success and that's a key focus for us. And in our world, in this subscription by SAS top world, customer success is key. And so we have to build a lot of that into how we make this successful as well. >>I think it's a barrier to entry, like going, going from on premise to the cloud. That's the number one pushback that we get when we go out and say, Hey, we have a cloud native data warehouse. Like how the heck are we going to get the data to the cloud? And that's where, you know, a partnership with HR. Super important. Yeah. >>What are some of the things that you guys encountered? Because we many businesses live in the multi-cloud world most of the time, not by strategy, right? A lot of the CIO say, well we sort of inherited this, or it's M and a or it's developers that have preference. How do you help customers move data appropriately based on the value that the perceived value that it can give in what is really a multi world today? Chris, we'll start with you. >>Yeah, I think so. So as we go into customers, I think the biggest hurdle for them to move to the cloud is security because they think the cloud is not secure. So if we, if you look at our engagement with customers, we go in and we actually have to sell the value snowflake and then they say, well, okay great, go talk to the security team. And then we talked to security team and say, Hey, let me show you how we secure data. And then then they have to get comfortable around how they're going to actually move, get the data from on premise to the cloud. And that's again, when we engage with partners like her. So yeah, >>and then we go through a whole process with a customer. There's a taking some of that data in a, in a POC type environment and proving that after, as before it gets rolled out. And a lot of, you know, references and case studies around it as well. >>Depends on the customer that you have some customers who are bold and it doesn't matter the size. We have a fortune 100 customer who literally had an on premise Teradata system that they moved from on prem, from on premise 30 to choose snowflake in 111 days because they were all in. You have other customers that say, Hey, I'm going to take it easy. I'm going to workload by workload. And it just depends. And the mileage may vary is what can it give us an example of maybe a customer example or in what workloads they moved? Was it reporting? What other kinds? Yeah. >>Oh yeah. We got a couple of, you mean we could talk a little bit about 1-800-FLOWERS. We can talk about someone like Pitney Bowes where they were moving from Oracle to secret server. It's a bunch of SAP data sitting in SAP ECC. So there's some complexity around how you acquire, how you decode that data, which we ever built a unique ability to do where we can decode the cluster and pool tables coupled with our CDC technique and they had some stringent performance loads, um, that a bunch of the vendors couldn't meet the needs between both our companies. And so we were able to solve their challenge for them jointly and move this data at scale in the performance that they needed out with these articles, secret server enrollments into, into snowflake. >>I almost feel like when you have an SAP environment, it's almost stuck in SAP. So to get it out is like, it's scary, right? And this is where it's super awesome for us to do work like this. >>On that front, I wanted to understand your thoughts on transformation. It's a word, it's a theme of reinvent 2019. It's a word that we hear at every event, whether we're talking about digital transformation, workforce, it, et cetera. But one of the things that Andy Jassy said this morning was that got us start. It's this is more than technology, right? This, the next gen cloud is more than technology. It's about getting those senior leaders on board. Chris, your perspective, looking at financial services first, we were really surprised at how quickly they've been able to move. Understanding presumably that if they don't, there's going to be other businesses. But are you seeing that as the chief revenue officer or your conversations starting at that CEO level? >>It kinda has to like in the reason why if you do in bottoms up approach and say, Hey, I've got a great technology and you sell this great technology to, you know, a tech person. The reality is unless the C E O CIO or CTO has an initiative to do digital transformation and move to the cloud, you'll die. You'll die in security, you'll die in legal lawyers love to kill deals. And so those are the two areas that I see D deals, you know, slow down significantly. And that's where, you know, we, it's, it's getting through those processes and finding the champion at the CEO level, CIO level, CTO level. If you're, if you're a modern day CIO and you do not have a a cloud strategy, you're probably going to get replaced >>in 18 months. So you know, you better get on board and you'd better take, you know, taking advantage of what's happening in the industry. >>And I think that coupled with the fact that in today's world, you mean, you said there's a, it gets thrown around as a, as a theme and particularly the last couple of years, I think it's, it's now it is actually a strategy and, and reality because what Josephine is that there's as many it tech savvy people sit in the business side of organizations today that used to sit in legacy it. And I think it's that coupled with the leadership driving it that's, that's demanding it, that demanding to be able to access that certain type of data in a geo to make decisions that affect the business. Right now. >>I wonder if we could talk a little bit more about some of the innovations that are coming up. I mean I've been really hard on data. The data warehouse industry, you can tell I'm jaded. I've been around a long time. I mean I've always said that that Sarbanes Oxley saved the old school BI and data warehousing and because all the reporting requirements, and again that business never lived up to its promises, but it seems like there's this whole new set of workloads emerging in the cloud where you take a data warehouse like a snowflake, you may be bringing in some ML tools, maybe it's Databricks or whatever. You HVR helping you sort of virtualize the data and people are driving new workloads that are, that are bringing insights that they couldn't get before in near real time. What are you seeing in terms of some of those gestalt trends and how are companies taking advantage of these innovations? >>I think one is just the general proliferation of data. There's just more data and like you're saying from many different sources, so they're capturing data from CNC machines in factories, you know like like we do for someone like GE, that type of data is to data financial data that's sitting in a BU taking all of that and going there's just as boss some of data, how can we get a total view of our business and at a board level make better decisions and that's where they got put it in I snowflake in this an elastic environment that allows them to do this consolidated view of that whole organization, but I think it's largely been driven by things that digitize their sensors on everything and there's just a sheer volume of data. I think all of that coming together is what's, what's driven it >>is is data access. We talked about security a little bit, but who has rights to access the data? Is that a challenge? How are you guys solving that or is it, I mean I think it's like anything like once people start to understand how a date where we're an acid compliant date sequel database, so we whatever your security you use on your on premise, you can use the same on snowflake. It's just a misperception that the industry has that being on, on in a data center is more secure than being in the cloud and it's actually wrong. I guess my question is not so much security in the cloud, it's more what you were saying about the disparate data sources that coming in hard and fast now. And how do you keep track of who has access to the data? I mean is it another security tool or is it a partnership within owes? >>Yeah, absolutely man. So there's also, there's in financial data, there's certain geos, data leaves, certain geos, whether it be in the EU or certain companies, particularly this end, there's big banks now California, there's stuff that we can do from a security perspective in the data that we move that's secure, it's encrypted. If we capturing data from multiple different sources, items we have that we have the ability to take it all through one, one proxy in the firewall, which does, it helps him a lot in that aspect. Something unique in our technology. But then there's other tools that they have and largely you sit down with them and it's their sort of governance that they have in the, in the organization to go, how do they tackle that and the rules they set around it, you know? >>Well, last question I have is, so we're seeing, you know, I look at the spending data and my breaking analysis, go on my LinkedIn, you'll see it snowflakes off the charts. It's up there with, with robotic process automation and obviously Redshift. Very strong. Do you see those two? I think you addressed it before, but I'd love to get you on record sort of coexisting and thriving. Really, that's not the enemy, right? It's the, it's the Terra data's and the IBM's and the Oracles. The, >>I think, look, uh, you know, Amazon, our relationship with Amazon is like a, you know, a 20 year marriage, right? Sometimes there's good days, sometimes there's bad days. And I think, uh, you know, every year about this time, you know, we get a bat phone call from someone at Amazon saying, Hey, you know, the Redshift team's coming out with a snowflake killer. And I've heard that literally for six years now. Um, it turns out that there's an opportunity for us to coexist. Turns out there's an opportunity for us to compete. Um, and it's all about how they handle themselves as a business. Amazon has been tremendous in separation of that, of, okay, are going to partner here, we're going to compete here, and we're okay if you guys beat us. And, and so that's how they operate. But yes, it is complex and it's, it's, there are challenges. >>Well, the marketplace guys must love you though because you're selling a lot of computers. >>Well, yeah, yeah. This is three guys. They, when they left, we have a summer thing. You mean NWS have a technological DMS, their data migration service, they work with us. They refer opportunities to us when it's these big enterprises that are use cases, scale complexity, volume of data. That's what we do. We're not necessary into the the smaller mom and pop type shops that just want to adopt it, and I think that's where we all both able to go coexist together. There's more than enough. >>All right. You're right. It's like, it's like, Hey, we have champions in the Esri group, the EEC tuna group, that private link group, you know, across all the Amazon products. So there's a lot of friends of ours. Yeah, the red shift team doesn't like us, but that's okay. I can live in >>healthy coopertition, but it just goes to show that not only do customers and partners have toys, but they're exercising it. Gentlemen, thank you for joining David knee on the key of this afternoon. We appreciate your time. Thank you for having us. Pleasure our pleasure for Dave Volante. I'm Lisa Martin. You're watching the queue from day one of our coverage of AWS reinvent 19 thanks for watching.

Published Date : Dec 3 2019

SUMMARY :

AWS reinvent 2019 brought to you by Amazon web services Dave and I have a couple of guests we'd like you to walk up. So we move data to the cloud in the in the most efficient way and we make sure it's secure and And if you look at the competitive landscape, And then you move into this cloud world which gives you that Lester city of scale to customers was how long does it take you to kick off a net new workload onto your data, from Amazon, you can really grow with your demand of your business. Many don't have the luxury of nine to 12 months anymore, Chris, And they made the decision to move to the cloud and we're seeing financial services And that's where, you know, partnering with folks from HR is super important for us because And so we have some areas where we, And so we have to build a lot of that into how we make this successful And that's where, you know, a partnership with HR. What are some of the things that you guys encountered? And then we talked to security team and say, Hey, let me show you how we secure data. And a lot of, you know, references and case studies around it as well. Depends on the customer that you have some customers who are bold and it doesn't matter the size. So there's some complexity around how you acquire, how you decode that data, I almost feel like when you have an SAP environment, it's almost stuck in SAP. But are you seeing that And that's where, you know, So you know, you better get on board and you'd better take, you know, taking advantage of what's happening And I think that coupled with the fact that in today's world, you mean, you said there's a, it gets thrown around as a, like there's this whole new set of workloads emerging in the cloud where you take a factories, you know like like we do for someone like GE, that type of is not so much security in the cloud, it's more what you were saying about the disparate in the organization to go, how do they tackle that and the rules they set around it, Well, last question I have is, so we're seeing, you know, I look at the spending data and my breaking analysis, separation of that, of, okay, are going to partner here, we're going to compete here, and we're okay if you guys to us when it's these big enterprises that are use cases, scale complexity, that private link group, you know, across all the Amazon products. Gentlemen, thank you for joining David knee on the key of this afternoon.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavePERSON

0.99+

Lisa MartinPERSON

0.99+

ChrisPERSON

0.99+

AmazonORGANIZATION

0.99+

Dave VolantePERSON

0.99+

OracleORGANIZATION

0.99+

IBMORGANIZATION

0.99+

AnthonyPERSON

0.99+

BenPERSON

0.99+

Andy JassyPERSON

0.99+

Chris DagonPERSON

0.99+

DavidPERSON

0.99+

JessiePERSON

0.99+

six yearsQUANTITY

0.99+

three guysQUANTITY

0.99+

GoogleORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

HBRORGANIZATION

0.99+

NWSORGANIZATION

0.99+

Ben waPERSON

0.99+

Chris DegnanPERSON

0.99+

first questionQUANTITY

0.99+

Anthony BrooksPERSON

0.99+

OraclesORGANIZATION

0.99+

360 degreeQUANTITY

0.99+

111 daysQUANTITY

0.99+

CasperORGANIZATION

0.99+

nineQUANTITY

0.99+

five years laterDATE

0.99+

12 monthsQUANTITY

0.99+

oneQUANTITY

0.99+

five years agoDATE

0.99+

yesterdayDATE

0.99+

LinkedInORGANIZATION

0.99+

twoQUANTITY

0.99+

EsriORGANIZATION

0.99+

AWSORGANIZATION

0.99+

CaliforniaLOCATION

0.99+

both companiesQUANTITY

0.99+

GEORGANIZATION

0.98+

two areasQUANTITY

0.98+

bothQUANTITY

0.98+

18 monthsQUANTITY

0.98+

todayDATE

0.98+

SASORGANIZATION

0.98+

twice a yearQUANTITY

0.98+

golden gateORGANIZATION

0.97+

JosephinePERSON

0.96+

EULOCATION

0.96+

TerraORGANIZATION

0.95+

threeQUANTITY

0.93+

day oneQUANTITY

0.93+

this morningDATE

0.93+

one proxyQUANTITY

0.93+

CTOORGANIZATION

0.93+

LA Las VegasLOCATION

0.92+

C E O CIOORGANIZATION

0.91+

Anthony Brooks WilliamsPERSON

0.91+

Carey Stanton, Veeam & Vaughn Stewart, Pure Storage | Pure Accelerate 2019


 

>> from Austin, Texas. It's Theo Cube, covering your storage. Accelerate 2019. Brought to you by pure storage. >> Welcome back to the Q B. All the leader in live tech coverage. I'm Lisa Martin with David Dante. Couple of gents back on the Cube we have on Stuart the VP of technology for pure von. Welcome back. >> It's great to be here. Thanks for being accelerate. >> Were accepted severe. And we've got Carrie Stanton, VP of Global Biz Dev and corporate development from Theme Carrie, Welcome back. Thank you very much. I'm in the rain. I love the love it planned. Of course. Thank you. Very good branding here. Lots going on with theme and pure. Let's secure. Let's go ahead and start with you. Talk to us about the nature of the V Impure partnership. I'm assuming better together, but give us the breakdown. Sure, >> we've had a relationship for many years, but over the past three years we've seen it. You know, this year, counting this year, like the scale out is just unbelievable. We're growing at triple digits on our Cosell winds in the field, all of its writing, all of the predominantly being driven from the flash blade success that we've had in the marketplace, Our customers are buying into the performance that they have. Our our relationship is growing through joint innovation and joint development. And so what we've seen is raising them to a global partner, on having dedicated resources on it, as only amplified our success. We have. So yeah, it's fantastic. >> And then one from your perspective, what are some of the things that you are hearing? Are you guys being brought in? Maur from team customers is being being brought in more from pure side. What's that mixed like >> we've had? We've had a strong set of channel partners that I think promoting our joint solution on our products kind of a top of their line card. Of course, there's always the customer requested to get pulled in, and I think customers who have experienced either one of our products look at their satisfaction. They look extremely it, like NPS scores right and say, you know, if I'm a pure customer, there's a data protection company. That's gotta nps very similar years, you know, tell us more about what you're doing with with theme. If you look at kind of our common ethos. Right simplicity in the model right co innovation Help Dr Scale. Whether it's been through joint A P I integration with the universal adaptor or tryingto lean into next generation architectures like Flash to flash the cloud. It's just been a very easy progressive partnership to drive and bring in a market. >> Talk more about that joint development. Um, there's a start in the field. No engineering resource is I'd love to Have you had some color to that? >> I think I think I think it's >> a combination of. So we'll start with a universal adapter that was beams initiative to help add scale to the back of process to as you're putting virtue machines into backup mode along, you know, leverage these the storage controller snapshots so that you could come in and out of that back about very quick. V, invisible to production operations, offload a bunch of data processing and in time, out of the equation that just helps scale right back up, more virtual machines faster. That's a program that they initiated that we were one of the founding partners on one of the first partners to publish ah Universal adaptor, or R A p i for it. The >> results have been The results are pure is by far the number one partner for downloads for a customer downloads that we have across our partner Rico system. So we have a vote 15 partner Rico Systems that have written to the universal FBI on. So just last week, you know, over 3000 downloads surpassed over 3000 downloads. Here is 6500 customers. I'll let you do the math. All right, so it's it's great that we see such strong adoption from their customer base. Almost 50% of their customers are team customers on. Then that >> contusion. That's hi, >> It's very high. >> Wow. So give me your favorite customer example that really articulates the value that pure brings the value that being brings. >> We've got a lot going on in the financial space in the healthcare space. >> Butler Health is a joint customer that we have a customer reference win that they've published in that we've published on dhe obviously many, many more, but especially in the people, customers in the financial health care that are looking for performance on Dhe. Looking to that flash blade, a za landing zone that's going to give them more than just a backup target. It's going to give them the ability to leverage it for a I and ML and many other factors, which is again, one of the reasons why we've seen such strong adoption. >> You talk about health care, we're talking about patient data, lives at stake. Give me some of the meat about what this customer, for example, is achieving at the business. Subtle and the human lives level >> Well, I think what they're seeing is of what they were used. It's not so much the exact stats that I could give you down to how money they're getting per second, but it's what they were using before, which is one of the legacy competitors that we have, which we call. You know, some of these donors that they give to market share that we take away day in and day out with without saying names. But there was a reform replace that we came in and taking a second generation solution from a legacy hardware appliance that was being used previously in a secondary storage. >> Yeah, allow me to elaborate a bit, right? So you asked about the technology we kind of talked about the universal adapter for the off load where we've really seen growth has been in this notion of flash to flash the cloud and peers introduced this notion of rapid restore. So again, how do we grow our businesses together? Growing amore mission critical or patient? Critical deployments has been this notion of not just backing up the data faster. That's kind >> of the the >> daily repetitive task that no organization wants to to deal with. Where the rubber meets the road is Can you put the data back? And we've seen this explosion in the increase of of the capacity of data, set sizes and the pressure they put on restoring that data. When you happen to have, ah, harbor failure, a data center go off line or a power issue and this goes so you go back to patient records gotta be online when everything fails and there's an issue with a chair, whatever. Maybe how quickly can we get the data? And we're orders of magnitude faster, then the legacy >> platform. So having an integrated appliance is part of that key and co engineering. Is that right? I mean, you guys pure software no pun intended, right? You don't want to be >> No, no, it sze taking the they wrote to our a p I right So the work that they did on the FBI and then continue to innovate and iterated against it right and coming out with the next version that they just come out with it is, is just differentiating themselves in the marketplace. And that's really what we're seeing. And we're seeing that success that the enterprise today, from what we have without even looking forward to our upcoming V 10 which is gonna have some high end enterprise feature sets. >> And we want to get into that. But something that mom that you were just saying It's almost as if data protection is no longer just an insurance policy. It's an asset. We have to be able to get it back. >> Absolutely fuel, We believe if you look at the legacy backup appliances, they were designed and optimized for short backup windows and are proving to be a challenge at restoring the data, which is actually where the value in the architecture is. We've talked about rapid restore in bringing, flashing that space. We worked with team engineering on V 10 actually double that performance so that customers, as they upgrade their code line, can again bring those mission critical workloads back online even faster than in the past. In addition to that, we've worked through some of the VM integrations for customs who want to mind that data who want to clone those workloads and bring them up on online and ADM or analytics or searching the metadata of that data. So there's a lot going on besides just your backup and recovery. >> So you guys are saying, Chuck, the appliance don't need the appliance. You've got a better model. Is that what I'm hearing? Or >> we win against appliances day in and day out? So absolutely software. Best of breed software. Best of breed storage hardware. >> What should we expect for V 10 adoption there? You guys announced in the spring? >> Yes, and it will shift in Q four. Dave, honestly, this is gonna be Anton is gonna shit >> a good track record. They're gonna go out there. >> No, but we have some key features that will differentiate us in the marketplace, especially as we go to the enterprise with pier storage, such as immune ability right, So that's a feature that we've talked about. You know, we've been hyping because we believe in it that what it's gonna bring for the protection of ransom, where malware and it's it's gonna be a game changer. We believe in the marketplace and our famous now, as they were finally gonna support now support for their enterprise customer base. So, I mean, those two keep features in and of itself. So again, I talked about the scale that we're having today in the marketplace without these key enterprise features and then having those chip, you know, in the next 90 days are again we believe just gonna continue to elevate our business. >> We're talking to Charlie earlier today about just a CZ. Part of his job is tam expansion and data protection is an obvious area for that. You could have chosen to go buy a small software company, certainly have the cash on your balance sheet and compete. We have chosen to partner talk about the opportunity that you guys jointly see in terms of the market you can penetrate. >> I think it is such a Our ecosystem is so comprised today of partnerships that are based on. On one hand, you're partnering, and on the other hand, you're competing that it is. It is really refreshing to find a partnership like Veen, where we've got very clear lines of what our product offerings are, where they come together and no competitive obstacles. It makes partying in the field the easiest, right? We've got great partnerships across the board somewhere. Appliance vendors. Sometimes those partnerships work fast. Sometimes they running hurdles. We never run into a hurdle together, so it's worked very well. I think our partners, our channel partners, have preferences around the server side that they like to go to market with. We give them the freedom together to pick and choose. So they put invested class software with best class storage to to meet the needs. They put the rest together based on what fits their business model or their current agreements go forward. So >> clear, clear swim lanes, Big market. You guys showed some data at V Mon. I want to say Danny's data, maybe $15 billion Tim man larger. You guys get a piece of that, you get a piece of that >> on a savant said. It's just there's no there's no friction in the marketplace is going out and doing the work we need to do to win. But we never get it that Oh, we can introduce this because it's gonna compete with, even if it's only 2% of what they have, there's there's looting. No, they do not have data protection. And we don't do as, you know. We don't do hardware in storage. So again invested breeds. And I >> think those numbers maybe even conservative because, you know, as you were pointing out, the traditional backup products were designed to deal with the biggest problem, which was back up window, which, by the way, 60% of times the backup didn't work anyway. But you have to get inside of, you know, Yeah, we backed it up check. But backup is One thing is my friend Fred Morris. Recovery is everything. So things are shifting in a digital business recovery. You know, it is tantamount. You know, ever you can't ever not be without your data. So it's an imperative. Yeah, >> it's, um, when you're and the flashlight business unit first came up with the construct of a rapid restore. I mean, admittedly, I was sitting in the corner. I'm just saying there's no way. There's no way that a customer would look to pay a premium for Flash for their backup. And then you meet the customers and it's just one after the other. And there's these stories around. We had to stop production. We couldn't get the AARP back online. Right Way couldn't take transactions because the processing database of the purchasing database was off line and you're just sitting there going. These are really world right issues that impact revenue for organizations. And so we are going through an evolution about rethinking around data protection and what it means into in today's day and age. >> It's security. Such top of mind carry today on the CEO's mind and data protection is part of that. Backup is a key part of that. You think about Ransomware, right? You guys get solutions there. I mean, it all fits together. It's not these sort of bespoke, you know, ideas anymore. It's really one big mosaic so that people can drive their digital transformations. I mean, that's really what they care about. >> I think the themes, old slogan, it just works right. It continues to evolve and that you talked about backup not working in the first place, right? So we have our core fundamental foundations. That theme has right is that it will trust that the customer will know that it will be online. We have the shortest r p o r t o is right in the marketplace, and then you take that and the's enterprise class features again. That's why marrying it with Piers route to market and there go to market strategy is having the success we're having in the marketplace. >> You're hearing a lot from customers. Flash Flash MacLeod. This is There is a very strong need for this. Some of the things that were announced today terms up some more firsts that piers delivering to the market. What are some of the things that you guys were? You maybe Carrie. We'll start with you from themes partnership perspective like a flash Teresi, for example, or starting to be able to deliver. I saw Blake smiles, uh, be ableto bring the cost down so that customers could look at putting a spectrum of workloads, even backups on flash. What is themes? Reaction? Well, smiles. I tend to >> do with Lisa, but I mean, to be honest with you. We sit back and love everything that piers doing from innovation. And so if they're going to come out with a broader set of target solutions for secondary storage, then we're going to be there partner there as we are with flashlights. So we're sitting back and loving the innovation that they're bringing to the market place and to their customers. >> I saw that Cheshire cat grin von >> s o for the audience who may be missed. We had a number of product announcements this morning taking the flash ray from a single product line into a portfolio going to that two year zero workload with the direct memory cache acceleration powered by Intel's often products as we go into a chair to economic space but still keeping all the Tier one features and availability we not flash or a C, which is leveraging QSC is a storage medium. Uh, while we have a design, do expand our tam and find new workloads. We have not looked at backup for the flash rate. See, at this point the flash, the flash, the cloud powered by the data hub in the rapid restore is going strong, so you want to kind of keep the team focused on that? And we've got other markets that we have yet to penetrate that have been more price sensitive where we think the flash racy is a better alignment. Now again, maybe over time I'll be found wrong and we'll change our tune. But you know, I'll give an example. Go back to Ransomware. Ransomware is a top three question in terms of any storage conversation. When you deal with a financial institution today to the point where not only are they asking about, what are you doing in your products? What are you doing across your partner ecosystem? Some of the modern proof of concepts required it to go through a ransomware recovery procedure because you know these financial institutions, they're worried about getting not just locked out, but locked out on your H a sight because you just replicated the ransomware over. So this this ability have immutable, immutable image to bill to bring it back online fast a rapid restored somewhere. You could see what these technologies start to line up in a comprehensive solution for the customers, and so flash racy is great. It has nowhere. The band with a flash blade. So we're gonna try to keep those a separate products in different markets at the time. But at least for time being, >> thanks for clarifying >> that cloud. I gotta ask the quad cloud question. It's interesting you guys have both embraced. Cloud is you're seeing it. In the old days, I was saying, I think I'm saying Charlie again. Executives were like, No, don't do that. It's gonna kill us. But now it's okay. It's not a zero sum game. That trend is your friend. You gotta embrace it. How are you making cloud each of you a tailwind versus the You know what all the analysts expect ahead, What else gets going? Zero sum game is going to steal from a to B. >> Well, I mean, Dave, you can imagine from my vantage point, it's easy to say that we're looking at Cloud is just, you know, expanding the TAM, expanding the ecosystem features we have today at the archive here. The success we're having with both Microsoft Azure and eight of us are phenomenal. Growing 40% month over month, right, the adoption with all the new innovations that Danny and Antonio have talked on the show that were coming out with envy. 10 are only gonna amplify that. But it all starts back with our partners ships today that we have one private clouds and as customers are looking to evolve to the cloud So we work with our partners like peer to ensure that we're working with them today. And as customers want to embrace the cloud they can. But predominantly, those primary workloads are still remaining on Prem and they're looking on how they're going to support the cloud. And we're doing that today and we'll be doing that. Maura's we go forward >> block storage announcement you guys made today was quite interesting way now spinning up East End shoes and s threes And what >> So this morning we announced general availability for pure Claude Block store on AWS and plans, as we are currently in beta and development for other clouds. But the folks today is this AWS and you pair Claude Block store, which is basically the software of a flash ray architect for the hardware inside of a W s so that you have the same functionality and service that you have on Prem and you pair that with pure is a service, which is our op X moderate could pay as you consume and the flexibility of sign a 12 month contracts. You want 90% on Prem today in 10% of cloud two months from now, you want it 50 50 like used the utility model to consume wherever you want, so you can meet the requirements of your infrastructure, whether it's on Prem in the cloud or some hybrid combination. >> But the interesting thing to me was your doing a lot of the heavy lifting for the customers with regard to the architecture. What you architect in the club that I wonder. Is there an opportunity to do something like that with backup? Or is that just, you know, not economical, deep, deep archive, things like that? I mean, >> I'm pretty sure we're told not to make any news right now because >> stay tuned. I've already said >> too much, so I'm probably a >> good thing. We're live >> in big trouble. >> Wow, guys. So the 1st 10 years of pure, tremendous amount of innovation is, Charlie said, an overnight success in 10 years, so much more coming down. We've already heard about a tremendous amount of innovation and evolution today. So we can't wait to have you guys back on to the next event in here. Get our neck braces on for the whiplash of news that's gonna be coming at us. All right. We are like your day Volante. I'm Lester Martin. Go pats. >> You're sorry. And Bruce. Carrie and I were crazy >> sports fans. Let's just be very PC. Go, everybody. Everybody gets participation. Trophies just coming anyway. You're watching the Cube. Lisa Martin for day, Volante. Thanks for watching.

Published Date : Sep 18 2019

SUMMARY :

Brought to you by Couple of gents back on the Cube we have on Stuart the VP of technology for pure It's great to be here. I love the love it planned. buying into the performance that they have. Are you guys being brought in? That's gotta nps very similar years, you know, tell us more about what you're doing with No engineering resource is I'd love to Have you had some color to that? partners on one of the first partners to publish ah Universal adaptor, So just last week, you know, over 3000 That's hi, the value that being brings. Butler Health is a joint customer that we have a customer reference win that they've published in that we've published Give me some of the meat about what this customer, for example, is achieving at the business. It's not so much the exact stats that I could give you down So you asked about the technology we kind of talked about the universal adapter for the road is Can you put the data back? I mean, you guys pure software no pun intended, right? they did on the FBI and then continue to innovate and iterated against it right and coming out with the next version that But something that mom that you were just saying It's almost as if data protection is no Absolutely fuel, We believe if you look at the legacy backup appliances, So you guys are saying, Chuck, the appliance don't need the appliance. we win against appliances day in and day out? is gonna shit a good track record. in the marketplace without these key enterprise features and then having those chip, you know, opportunity that you guys jointly see in terms of the market you can penetrate. our channel partners, have preferences around the server side that they like to go to market with. You guys get a piece of that, you get a piece of that And we don't do as, you know. the traditional backup products were designed to deal with the biggest problem, And then you meet the customers and it's just you know, ideas anymore. the marketplace, and then you take that and the's enterprise class features again. What are some of the things that you guys were? And so if they're going to come out with a broader set of target to the point where not only are they asking about, what are you doing in your products? It's interesting you guys have both embraced. and Antonio have talked on the show that were coming out with envy. But the folks today is this AWS and you pair Claude Block store, But the interesting thing to me was your doing a lot of the heavy lifting for the customers with regard to the architecture. I've already said good thing. So we can't wait to have you guys back on to the next event in here. Carrie and I were crazy Let's just be very PC.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
CharliePERSON

0.99+

DavePERSON

0.99+

BrucePERSON

0.99+

David DantePERSON

0.99+

60%QUANTITY

0.99+

Lisa MartinPERSON

0.99+

12 monthQUANTITY

0.99+

DannyPERSON

0.99+

Lester MartinPERSON

0.99+

90%QUANTITY

0.99+

AWSORGANIZATION

0.99+

Carrie StantonPERSON

0.99+

CarriePERSON

0.99+

StuartPERSON

0.99+

Carey StantonPERSON

0.99+

FBIORGANIZATION

0.99+

AntonioPERSON

0.99+

6500 customersQUANTITY

0.99+

RicoORGANIZATION

0.99+

$15 billionQUANTITY

0.99+

ChuckPERSON

0.99+

LisaPERSON

0.99+

Fred MorrisPERSON

0.99+

Austin, TexasLOCATION

0.99+

BlakePERSON

0.99+

eightQUANTITY

0.99+

10%QUANTITY

0.99+

last weekDATE

0.99+

2019DATE

0.99+

Rico SystemsORGANIZATION

0.99+

10 yearsQUANTITY

0.99+

40%QUANTITY

0.99+

two yearQUANTITY

0.99+

this yearDATE

0.99+

MicrosoftORGANIZATION

0.99+

Butler HealthORGANIZATION

0.99+

VeenORGANIZATION

0.99+

second generationQUANTITY

0.99+

todayDATE

0.99+

bothQUANTITY

0.99+

over 3000 downloadsQUANTITY

0.98+

twoQUANTITY

0.98+

first partnersQUANTITY

0.98+

oneQUANTITY

0.98+

over 3000 downloadsQUANTITY

0.98+

PremORGANIZATION

0.98+

2%QUANTITY

0.98+

TimPERSON

0.98+

1st 10 yearsQUANTITY

0.97+

AntonPERSON

0.97+

TAMORGANIZATION

0.96+

IntelORGANIZATION

0.95+

VolantePERSON

0.94+

50QUANTITY

0.94+

firstQUANTITY

0.94+

V 10TITLE

0.93+

Vaughn StewartPERSON

0.92+

earlier todayDATE

0.92+

15 partnerQUANTITY

0.92+

Theo CubePERSON

0.91+

TeresiPERSON

0.91+

single productQUANTITY

0.91+

MauraPERSON

0.91+

eachQUANTITY

0.91+

Global Biz DevORGANIZATION

0.9+

this morningDATE

0.88+

Stormy Peters, Red Hat | Red Hat Summit 2019


 

>> live from Boston, Massachusetts. It's the queue covering your red. Have some twenty nineteen. You buy bread hat. >> Welcome back here to the >> B C E >> C. We're in Boston, Massachusetts, right. Had summit the six time around for us here. The Cube, proud to be a part of this event. Once again, along with student Eamon. I'm John Walls. And thank you for joining us here on the Cube. Continue our coverage. We're joined by Stormy Peters, who was the senior manager community lead at Red Hat and stormy. Good afternoon to you. Are you going after him? All right. So you think about you know I love that, you know, community lead. You know, in an open source based company. You like Red Hat. Your job is very simply expanded. Evolved ecosystem, Right? So So I mean, how are you? I guess using that that company culture that embedded culture to grow, I think it's already pretty well established. What are your reputation is for how open you guys are right to the community. And what have you What are you doing in terms of leveraging that and trying to expand on that reputation? >> Yeah. Our goal is to make sure we're supporting those upstream communities. So all of all of Red Hat software is open source, and we worked with a whole community of individuals and companies, and the upstream opens our software. And we want to make sure that we're not just contributing features that we want but that were a good player, that we're helping to make sure those communities air healthy. And so, for a number of the projects that were involved in, we actually assigned a fulltime community manager a community lead to help make sure that project is healthy. So we have someone on everything from stuff and Lester Toe fedora Toe Cooper Netease. I'm just making sure the community does, well, >> stormy. You actually did a session for analysts about about a month or so ago, and I've been involved with open source for about twenty years, and you said something that made me do a double take and had to rethink the way I look at this commune unity. And it was we used to think of open source as well. May be I worked on a project, or maybe I spent a little bit time on nights and weekends and it was just kind of giving of time. You said that a majority of people working in this, they've got day jobs that is contribution to this. It's, you know, we understand that companies like IBM and Red Hat and Google often will have that. But the majority of people that are contributing open source now that is their job or a major part of their job. Could you stand a little bit about You know, how we saw that shift? And it's just me that snuck up on. >> So I think it's stuck up on us, all of us. But I really do think it's a fundamental shift that we need to consider so that we can make sure that we're helping the ecosystem the best way possible. So when open Source first started, it was people in their free time. You know, Linus Torvalds had a project he wanted to work on. You had it, it's described. You wanted your desktop to run free software, and so you put your free time into it evenings and weekends. And if you've got a paid job, working on it like that was something to celebrate like that was everybody's dream. And these days, with software becoming more complicated, more complex, and the solutions are even bigger and greater with the cloud there more than a one person project. They're like multi people, multi company projects. And so more and more people are getting paid to work on them and they're getting forty hours a week paid time to work on these projects. They might give more, but they're getting a full time salary. And so how we include not just the individuals but the companies that are paying them to work on it, I think changes how our project's work. I think it's a huge opportunity, >> and I mean talk about that shift a little bit, if you would then and how that has, I would say mature the marketplace. But certainly it's altered the the flows of jobs and innovation and development and all that, because I kind of passed time before now full time and what comes with that? I mean, what challenges come with that we're all of a sudden It is Ah, little, it's it's a little more parent, if you will, right and that you're a little more evident in wherever you're working because it is a full time commitment outs no longer just a of casual or less than full time pursuit. >> Yeah, I think it's a good thing, but I do think adds challenges. So, for example, on boarding process, you used to know when you had an open source software project you've got. Someone was giving up on our two of their evening TTO. Learn your project so you had to make sure that getting started docks worked for them within twenty thirty minutes. Maybe these days, you know, it's really hard to install a lot of this software in twenty or thirty minutes, but someone's doing it is their day job. They're going to have a day or a week. So the on boarding process is different, which I think makes it harder for volunteers and easier for paid volunteers and paid so little a little hard to distinguish. But for people that have all day to do it, they have a little more time to get on board into the on boarding processes are take longer. I think the problem is that we can solve our more complex because someone can spend an entire week. They're not breaking their thought process up like evenings. They have, like, all day. They can work with teams across cos he can pull in lots more expertise. We have a special interest groups and projects like Santos on where we're pulling in different companies to work together. I'm so like we're working on an NFI save with Intel and others. I'm so you get you get more diversity of people that could work on it. That can dedicate more brainpower to in one one setting. >> You can't. Can you talk a little bit about? You've worked on foundations and you support foundations. Talk about special interest groups. It's pay broad and very diverse ecosystem. Sometimes the outside rose like, Oh, it's thie, open source community and like No, no, no. There is not the open source community, their communities and lots of overlap. And they work in Iraq, maybe give us a little bit of context and love to hear some examples of some of the things you're working on. >> Yes, I think the first point is like projects aren't there. How they worked. Their governance isn't isn't static like it's always changing, like you might start a project on your own in your free time and it grew and you convinced all of us to join you. And now there's twenty people working on it and you want to be able to go on vacation and then you want to leave somebody in charge. So do you give him maintainer status? Do you create a board and let people vote? So you create a foundation like someone offers you money? How do you take it like I do? You put it in your bank account. Er, do you have to start it like a nonprofit to take this money? So I think they're constantly evolving. So an example that I have is the foundation we created the set foundation thiss year last year. Recently, um, and stuff has been open. Source It was it was open source created by thinktank acquired by Red. How we created a board of advisors around it to keep all those companies involved. And it had evolved to the point where people wanted to give it money. And so it needed to be something. You know, these companies wanted to collaborate on marketing together, So we created this a foundation as ah directed funded clinics foundation and had, like thirty companies joined in the very beginning, so I think I don't know what the next stage herself will be, but they're always evolving like that. >> But so what does it do if you will? Self? How do you pick projects out if you have thirty? Voice is a lot of voice. Is a lot of people raising their hands and let's look at this. Look at that. You know how to govern that. How do you, ah, assign work? How does all that work in that kind of? That's a really open environment that you're trying to corral a little bit. >> So So we're not trying to corral. We're tryingto like we're >> organized. A better >> word. Better word. >> So that the foundation was was enable people to collaborate on the marketing side, mostly a money side they wanted to give money for, like Suffolk on event. That's happening in a couple of weeks. There's a big annual event they wanted to be able to do Seth days. I'm things that you want to give money to enable it was getting really complicated. Will you pay for the beer and I'LL pay for the food and you know we'LL do it that way. The stuff project technically is led by a group of volunteers will have paid jobs and there's a project lead person for each sub project. And then they have a monthly meeting of all of the the whole project. And then each of those sub projects has a weekly meeting and something that stuff doesn't I think, is really interesting as they record all of their team meetings like it's a video meeting and they record it and they put it on YouTube and people watch the like. I think that's awesome. But it helps them with the time zone problem to record the meeting and put it on YouTube. >> One of the other things that I find really fascinating is many enterprise Cos now you know, we know they're using open source, but they're contributing open source. I remember back the future of open source survey that was done is think it was like half of companies that you know we're using it are also contributing. What do you see? You know, we've talked to users of the show for many years, is toe You know why they see value and why'd you do it, but it would love to hear your take. >> So I do think cos air. They're using open source software, but they're contributing on DH. People talk about what you contribute, the features that you want to see, but I think you contribute to the things you find exciting and that you want to participate in contributions. Starts at like very beginning level of just filing a bug report when you see it are coming to an events and going to the happy hour for that for that project of seven Cluster have won this afternoon. This afternoon. You know there's different get togethers and you participate by meeting the people, telling them how you're using it. I'm telling them what you'd like to see What's cool. I think a lot of people in the open source world, there's an opportunity for the developers to be very close to the users in the way that's harder and proprietary software. And it's really exciting, like if you're working on something and someone comes up and says, Hey, I'm using it and here's what I like Peanuts. It's fun. >> It's working All right. >> How about career advancement? You know, everybody I know in the developer world. It's like, Well, get really is your resume these days? So I gotta imagine that just the skill set and the education is such a huge part for so many companies. >> Yeah, with more people getting paid to work on open source and they can show what they worked on on that it's more, not more coming. It's very easy to move to another job, taking your skill set with you and it's very valued and you even get to keep your community of people that you're working with as you move around and help different companies with that project. How do you >> divvy it up in a community where you know the workload is kind of equally shared? Or there's a fair share of work being done and you and you want to. Maybe some people have a different level of expertise on DH, so there's some policing that kind of has to be done or argus. Some responsibilities assign whatever >> that could be >> a little delicate. Sometimes candidates you won't get the right people doing the right things, and you'LL love, willingness and enthusiasm. But sometimes you do have to kind of decide are you gonna work on this? We're gonna work on that. >> So some projects have done a really excellent job of defining the rolls and assigning them and having like, a mentoring process to get new people there. So, for example, Cooper Netease on the release team, there's like people that worked on the release team. And then, if you're interested, you raise your hand and you like work with the the person that's in that role for, like, an entire release. And so you get like a whole released to be mentored and taught, and then the next year you're the person doing the release and you can mentor somebody else. So I think the process is help with that. And it's, I think there's some really great work. >> You're building the farm team basically right. You're bringing them along on training wheels to a certain degree and then let him ride the bike by themselves, yet makes sense. >> So speaking of getting people ready, there was something new announced this week that I'm hoping you could explain. The Red Hat Universal Base image was explained to me that this is really a subset, Terrell or being roll ready. What Does that mean? How is that going to impact developers? >> Yeah, the idea is to help developers developed in containers on Lenox and in a way that they can so the FBI is based on, Well, it's a subset of rela packages. It's a it's a container, so in the clouds face and that you can develop your app on it. And then you can share that container with anybody. Whether or not there are real user, I'm so you can share it with anybody in the world had them developed on it. But then, when you're done, it is supported on grell and open shift so you can have full enterprise support for that's >> a show like this. Inject new blood new perspective into what you do. But I would assume this is a pretty good recruiting opportunity to in a lot of respects. And you stay pretty busy over the course of these three days, meeting with a lot of new people, meeting a lot of new faces, getting a lot of new ideas, a house to show kind of fit into what you're going to do the other three hundred sixty two days in a year. >> Well, We look forward to this show for three hundred sixty four days a year, so we're always planning foreign prepping for it. It adds energy. It adds excitement. We get to connect with people. They're using the software. Hopefully, they do come to the happy hours or down to the booth and talk to us and say, Here's how we're using it. We hope to get more people involved. People that are using software that want to learn about it, get him more involved. >> Well, you've got a great job of pulling the community together. We wish you continued success in doing that. Thanks for the time today. Here in the Cube. Nice tohave you. >> Thank you very much for having >> that story. Peter is joining us from red hat back with more in just a little bit. You're watching the Red Hat Summit and you're watching exclusive coverage right here on the Q

Published Date : May 7 2019

SUMMARY :

It's the queue covering And what have you What are you doing in terms of leveraging that and trying to expand And so, for a number of the projects that were with open source for about twenty years, and you said something that made me do but the companies that are paying them to work on it, I think changes how our project's work. and I mean talk about that shift a little bit, if you would then and how that has, I'm so you get you get more diversity of people that could work on it. You've worked on foundations and you support foundations. And now there's twenty people working on it and you want to be able to go on vacation and then you want to leave somebody in charge. But so what does it do if you will? So So we're not trying to corral. word. So that the foundation was was enable people to collaborate on the marketing side, One of the other things that I find really fascinating is many enterprise Cos now you and that you want to participate in contributions. So I gotta imagine that just the skill set set with you and it's very valued and you even get to keep your community of people that you're working with as you move Or there's a fair share of work being done and you and you want to. But sometimes you do have to kind of decide are you gonna work on this? And so you get like a whole released to be mentored and taught, and then the next year you're the person doing the release You're building the farm team basically right. How is that going to impact developers? It's a it's a container, so in the clouds face and that you can develop And you stay pretty busy over the course of these three days, meeting with a lot of new people, We get to connect with people. We wish you continued success in doing that. Peter is joining us from red hat back with more in just a little bit.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
IBMORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

John WallsPERSON

0.99+

twentyQUANTITY

0.99+

Red HatORGANIZATION

0.99+

PeterPERSON

0.99+

IraqLOCATION

0.99+

twenty peopleQUANTITY

0.99+

thirty companiesQUANTITY

0.99+

Boston, MassachusettsLOCATION

0.99+

eachQUANTITY

0.99+

Linus TorvaldsPERSON

0.99+

Cooper NeteasePERSON

0.99+

FBIORGANIZATION

0.99+

Stormy PetersPERSON

0.99+

first pointQUANTITY

0.99+

twoQUANTITY

0.99+

EamonPERSON

0.99+

thirtyQUANTITY

0.99+

todayDATE

0.99+

three daysQUANTITY

0.99+

twenty thirty minutesQUANTITY

0.99+

IntelORGANIZATION

0.99+

three hundred sixty two daysQUANTITY

0.99+

YouTubeORGANIZATION

0.98+

next yearDATE

0.98+

This afternoonDATE

0.98+

LenoxORGANIZATION

0.97+

thirty minutesQUANTITY

0.97+

last yearDATE

0.97+

Red Hat SummitEVENT

0.97+

a dayQUANTITY

0.97+

a weekQUANTITY

0.97+

about twenty yearsQUANTITY

0.97+

this weekDATE

0.96+

forty hours a weekQUANTITY

0.95+

OneQUANTITY

0.94+

more than a one personQUANTITY

0.93+

ClusterORGANIZATION

0.91+

RedORGANIZATION

0.9+

each sub projectQUANTITY

0.9+

three hundred sixty four days a yearQUANTITY

0.89+

this afternoonDATE

0.87+

Red Hat Summit 2019EVENT

0.85+

twenty nineteenQUANTITY

0.85+

SuffolkLOCATION

0.82+

firstQUANTITY

0.79+

doubleQUANTITY

0.78+

about a month or so agoDATE

0.75+

CubeORGANIZATION

0.74+

six timeQUANTITY

0.74+

one one settingQUANTITY

0.73+

Lester ToeORGANIZATION

0.72+

red hatTITLE

0.72+

Seth daysEVENT

0.72+

Red HatTITLE

0.66+

redORGANIZATION

0.66+

sevenQUANTITY

0.62+

Toe Cooper NeteaseORGANIZATION

0.59+

a yearQUANTITY

0.56+

thinktankORGANIZATION

0.53+

SantosPERSON

0.51+

thissDATE

0.5+

Universal BaseORGANIZATION

0.39+

TerrellPERSON

0.33+

Bob Swanson, dcVAST | Veritas Vision 2017


 

>> Announcer: Live from Las Vegas, it's theCUBE, covering Veritas Vision 2017. Brought to you by Veritas. (rippling music) >> Welcome back to The Aria in Las Vegas, everybody. This is theCUBE, the leader in live tech coverage. We go out to the events and extract the signal from the noise. My name is Dave Vellante, I'm here with Stuart Miniman, who's my cohost for the week. Bob Swanson is here, he's the head of sales for dcVAST out of Chicago. Bob, thanks for coming on theCUBE! >> Thanks for having me, guys. >> So, well first of all, the show, how's it going for you? We've now got enough data, it's been a couple of days, a few days perhaps for you. What's the vibe like, what are the conversations like? >> Yeah, it's been a great week. This is the very tail end of the event, so a little exhausted. But it's been exciting, there's been a good buzz at the event and we get a lot of our customers here, and just kind of seeing the buzz and the pace of innovation that's goin on here with Veritas, you know, it has been exciting. >> Tell us more about dcVAST. You're focused on IT infrastructure services, but dig a little deeper. Right, yep, so we're headquartered in Chicago, Illinois, and you're right, we do infrastructure and cloud services. So we do support-type services with a seven by 24 call center, have different managed service offerings, different cloud offerings, and certainly do consulting and project work as well. >> Yeah, and Bob, so what does multi-cloud mean to your customers? (chuckles) >> It's only natural that if they're not there today, then they're going to be multi-cloud at some point. So, Veritas here is pretty uniquely positioned. to be able to get customers there. It's all about flexibility and data portability. So, I think where infrastructure and storage and data protection is sometimes not that exciting of a conversation, now kind of changing the conversation, the data management, 'cause everybody needs their data to become more productive for them. It changes the conversation, has a little more sizzle. >> Okay, but you know, your primary area of focus is infrastructure services, so that means first and foremost, every year you got to help me lower my costs, right, you've been hearing that, I'm sure, for years, and help me improve my operational efficiency. And you do that, and really attack my labor problem, IT labor problem so I can focus on my business, right? Are those still the big overriding themes? Oh, yeah, there's no question. I mean, I think the public cloud has been probably the most disruptive thing in our space since the internet. And it's making customers re-evaluate all cost and really how they're doing things, and different consumption and financial models. So, the technology is cool, and we like that conversation, but it naturally brings a big financial and cost savings, and do-more-with-less element to all the conversations. >> So what are the big trends that you're seeing in marketplace, what are the conversations like with your customers? >> Yeah, and I'll give you an example. I think customers have different approaches to cloud, right, some cloud-first, everything's got to go. Others maybe want to keep more of their workloads on premise. And in one customer example, where they said, hey, we want to move all non-production out to the cloud and it was a single cloud provider. And they got about 40% of what they were looking to move out there and they reached what they thought their estimated budget was going to be. So at that point, having that portability and having the tool sets to be able to move those workloads around becomes very important from a financial standpoint. >> So, I wonder if we can unpack those. Cloud first, and then these other guys on-prem. The motivation for cloud first, and the type of company. Do they tend to be a smaller companies, or do you see larger companies saying hey, we're going all in? I mean, you've seen some stories in the press, you know, large company, GE's going all to the cloud, okay I'm sure there's still a lot of on-prem going on there. What do you see? >> Yeah, you're right. A lot of small business is certainly, it makes sense for them, any startups too are pretty much born in the cloud now. You're not going to have too many financial backers that are going to want a startup to be spending too much money on data center, or buying hardware. But the established large enterprises, too, are kind of all over the map, but there are already some of them that are taking this cloud first approach. But, the large enterprises and companies that have been around, where it's not kind of a clean slate, naturally it's going to be hybrid and ultimately there's probably a lot of predictable static workloads that are, at the end of the day, going to be cheaper to run on-prem than they are out in the public cloud. Public cloud's great for the stuff that's not predictable, or is very dynamic, so we're seeing, and I am from Chicago and so we say the coasts move faster, maybe, than the Midwest does as well, but we're seeing varying degrees of adoption and strategy. >> But the business in the data center's good right now, I mean, the market's sort of booming, but if you roll back a few years, you guys must have thought, and maybe you're still thinking it, okay, see this cloud that's coming. Like you said, it's one of the most disruptive, if not the most disruptive in a while, and it's aiming right at the heart of your business, infrastructure services. So how have you responded to that, you must be riding the wave now of data center growth and investment, but strategically, what are you thinking about in your firm? >> Yeah, I mean, there's no question. We've had to pivot. But it does create opportunity. And we do need to help our customers be able to be most cost-effectively managing their workload, right, helping them with that. So where there's challenge and change, there's certainly inopportunity. And we've seen it. >> So, but my understanding, your firm also offers managed cloud offerings. That's been one of the things we've looked at is the channel, can they get on board, can they offer that, how is it working with the big cloud providers, and yeah, let's start there. >> Yeah, that's a good question, and a lot of people have a misperception that the cloud is kind of the easy button. (laughter) But at the end of the day-- >> Stu: Maybe 10 years ago we thought that-- >> Dave: You have your hoodie. >> Right, but I mean, people need to realize the same architecture and security considerations are there as they are for on-premise, so it's not the easy button, and you can just kind of set it and forget it. So some people that are underestimating that still need help from a third party like ourselves to be able to help them manage it. >> Could you speak about the maturation of your support services? >> Yeah, we started doing a lot of hardware support years ago when the business was founded in 1989. And at that time, it was a lot of Unix-based engineering workstations and kind of morphed into servers and storage and other data center equipment, and then started doing a lot more software support, which all can be delivered remotely, for the most part. From time to time, you may need to be onsite for something, so that kind of changes the logistical model, and now with the cloud as well, we've just kind of evolved in that direction. >> And how about the Veritas relationship? What's that been like, you know, the Symantec sale, any comments on how that's evolved, and where do you see that going? >> Yeah, we've been a long-time Veritas partner, and really the reason why we first got started with them was because they were relatively platform-agnostic, and supported and endorsed heterogeneity. And in the old Foundation Suite days, which now their InfoScale product, it's obviously had some name changes, it didn't matter what operating system, didn't matter which array vendor you used. And it's good to have friends in the industry and alliances, but there's also some benefit of staying relatively agnostic like Veritas has, and that message resonates now more than ever with all the different cloud providers out there, and just being able to be interoperable with a lot of different technologies. >> What's your customer's reaction been to all the announcements that Veritas has been making here? >> Yeah, yeah, everyone's excited. Now it's getting the word out. And I mentioned pace of innovation earlier, and it seems to have gone from zero to 100, really, really fast. So, that's exciting. It shows commitment, I think, from the new executive leadership team at Veritas, and their backers at Carlyle as well. So, you know, I think it's an exciting time for Veritas, and for us as a partner as well, and our customers. >> And anything you want to see out of those guys? From your perspective, in the partner standpoint, in the voice of the customer, what's on their to-do list? >> Yeah, and I mean, the concept of data management, looking at it holistically is important. After people and intellectual property, data's the most valuable asset a company has, and a lot of the intellectual property resides in the form of data as well. So, it's an exciting place to be as we kind of see the industry shift. >> Dave: Cubs or White Sox? >> Bob: Cubbies! >> Hey, well, congratulations on that! >> Yeah, it's been a-- >> Really, really Cubbies, not just White Sox, oh, the Cubbies won it? >> No, Cubbies all the way. >> Hardcore Cubbies fan. >> Diehard, absolutely, yep. >> Well, you're welcome for Theo Epstein. We gave Theo, and Lester, you know. And Lackey. (laughs) >> You know, Theo seems to have the Midas touch, you know, and it's interesting too, you can use sports analogies for a lot of things, and Theo's a guy who was a little disruptive by using data and analytics in his approach to managing a baseball team. >> Right, right, well, good. That's great. It was an exciting World Series last year. Hope it can be as exciting again. Must have been insane in Chicago. >> Absolutely, yep, getting ready for another run this year, hopefully. >> Excellent, well, Bob, thanks very much for coming on theCUBE. Really appreciate it. >> Thanks again, gentlemen. >> You're welcome, all right, keep it right there, buddy, we'll be back to wrap up Vision 2017. This is theCUBE. (rippling music)

Published Date : Sep 21 2017

SUMMARY :

Brought to you by Veritas. and extract the signal from the noise. What's the vibe like, what and just kind of seeing the buzz and you're right, we do now kind of changing the in our space since the internet. and having the tool sets to be first, and the type of company. are kind of all over the and it's aiming right at the heart our customers be able to the channel, can they get on board, that the cloud is kind of the easy button. and you can just kind From time to time, you may need and really the reason why we and it seems to have and a lot of the intellectual property We gave Theo, and Lester, you know. and Theo's a guy who Right, right, well, good. for another run this year, hopefully. Excellent, well, Bob, This is theCUBE.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Dave VellantePERSON

0.99+

Stuart MinimanPERSON

0.99+

VeritasORGANIZATION

0.99+

TheoPERSON

0.99+

SymantecORGANIZATION

0.99+

ChicagoLOCATION

0.99+

Bob SwansonPERSON

0.99+

1989DATE

0.99+

DavePERSON

0.99+

BobPERSON

0.99+

Theo EpsteinPERSON

0.99+

LesterPERSON

0.99+

White SoxORGANIZATION

0.99+

CarlyleORGANIZATION

0.99+

zeroQUANTITY

0.99+

LackeyPERSON

0.99+

10 years agoDATE

0.99+

100QUANTITY

0.99+

World SeriesEVENT

0.98+

GEORGANIZATION

0.98+

Las VegasLOCATION

0.98+

StuPERSON

0.98+

one customerQUANTITY

0.97+

todayDATE

0.97+

last yearDATE

0.97+

oneQUANTITY

0.97+

24 call centerQUANTITY

0.97+

this yearDATE

0.96+

about 40%QUANTITY

0.96+

dcVASTORGANIZATION

0.95+

first approachQUANTITY

0.95+

Chicago, IllinoisLOCATION

0.94+

sevenQUANTITY

0.93+

firstQUANTITY

0.92+

CubsORGANIZATION

0.92+

single cloud providerQUANTITY

0.86+

The AriaORGANIZATION

0.82+

Veritas VisionORGANIZATION

0.76+

CubbiesORGANIZATION

0.76+

yearsDATE

0.72+

yearsQUANTITY

0.67+

theCUBEORGANIZATION

0.67+

MidwestLOCATION

0.66+

muchQUANTITY

0.65+

UnixTITLE

0.58+

2017DATE

0.58+

DiehardPERSON

0.57+

InfoScaleORGANIZATION

0.53+

Vision 2017EVENT

0.53+

VisionEVENT

0.47+

MidasPERSON

0.39+