Glenn Grossman and Yusef Khan | Io-Tahoe ActiveDQ Intelligent Automation
>>From around the globe, it's theCUBE, presenting ActiveDQ Intelligent Automation for data quality, brought to you by Io-Tahoe. >>Welcome to the sixth episode of the Io-Tahoe data automation series on theCUBE. We're going to start off with a segment on how to accelerate the adoption of Snowflake, with Glenn Grossman, enterprise account executive from Snowflake, and Yusef Khan, head of data services from Io-Tahoe. Gentlemen, welcome. >>Good afternoon, good morning, good evening, Dave. >>Good to see you, Dave. Good to see you. >>Okay, Glenn, let's start with you. theCUBE hosted the Snowflake Data Cloud Summit in November, and we heard from customers; I love the tagline, going from zero to Snowflake in 90 minutes, very quickly. And of course you want to make it simple and attractive for enterprises to move data and analytics into the Snowflake platform, but help us understand: once the data is there, how is Snowflake helping to achieve savings compared to the data lake? >>Absolutely, Dave, it's a great question. It starts off first with the notion of what we've coined in the industry as t-shirt-size pricing. You don't necessarily always need the performance of a high-end sports car when you're just trying to go get some groceries, driving down the street at 20 mph. T-shirt-size pricing aligns the warehouse to whatever your operational workload is, to support the business and the value that you need from it. You don't need all of your data every second of the moment; it might be once a day, once a week. Through that t-shirt-size pricing we can align performance to the needs of the business, to the key performance indicators that drive the insight to make better decisions, and that allows us to control cost. So to my point, you don't always need the performance of a Ferrari.
Maybe you need the performance and gas mileage of a Honda Civic to deliver the value to the business, while knowing that you have that entire performance range available at a moment's notice. That's really what allows us to control cost and get away from the "how much is it going to cost me?" of a data-lake type of environment. >>Got it. Thank you for that. Yusef, where does Io-Tahoe fit into this equation? What's unique about the approach you're taking towards this notion of mobilizing data on Snowflake? >>Well, Dave, in the first instance we profile the data itself at the data level, not just at the level of metadata, and we do that wherever the data lives. It could be structured data, semi-structured data, or unstructured data, and that data could be on premise, in the cloud, or on some kind of SaaS platform. We profile that data at the source systems that feed Snowflake, within Snowflake itself, and within the end applications and reports that the Snowflake environment is serving. What we've done here is take our machine-learning discovery technology and make Snowflake itself the repository for knowledge and insights on data, and that's pretty unique. Automation in the form of RPA is applied to the data before, within, and after Snowflake, so the ultimate outcome is that business users can have a much greater degree of confidence that the data they're using can be trusted. The other thing we do that is unique is employ data RPA to proactively detect and recommend fixes for data quality, which removes the manual time, effort, and cost it takes to fix those data quality issues if they're left unchecked. >>So that's key, the trust: nobody's going to use the data if it's not trusted. But also context. If you think about it, we've contextualized our operational systems but not our analytics systems.
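The profiling Yusef describes, computing quality metrics at the data level and recommending fixes, can be illustrated with a small sketch. This is a hypothetical toy in Python, not Io-Tahoe's actual ActiveDQ implementation; the thresholds and rule names here are assumptions for illustration.

```python
# Hypothetical sketch of rule-based data-quality profiling, in the spirit of
# what Yusef describes. Not Io-Tahoe's real implementation.

def profile_column(name, values):
    """Compute simple quality metrics for one column of records."""
    total = len(values)
    nulls = sum(1 for v in values if v in (None, "", "NULL"))
    distinct = len(set(v for v in values if v is not None))
    return {
        "column": name,
        "null_rate": nulls / total if total else 0.0,
        "distinct_ratio": distinct / total if total else 0.0,
    }

def recommend_fixes(profile, null_threshold=0.1):
    """Flag columns whose null rate exceeds an assumed policy threshold."""
    fixes = []
    if profile["null_rate"] > null_threshold:
        fixes.append(f"{profile['column']}: impute or backfill missing values")
    return fixes

rows = {"customer_id": [1, 2, 3, 4], "email": ["a@x.com", None, "", "c@x.com"]}
profiles = [profile_column(col, vals) for col, vals in rows.items()]
issues = [fix for p in profiles for fix in recommend_fixes(p)]
```

Running rules like these continuously against sources, the warehouse, and downstream reports is what turns one-off data cleanup into the automated, repeatable process the guests are describing.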
So there's a big step forward there. Glenn, I wonder if you can tell us how customers are managing data quality when they migrate to Snowflake, because there's a lot of baggage in traditional data warehouses, data lakes, and data hubs. Maybe you can talk about why this is a challenge for customers, and, for instance, can you proactively address some of the challenges customers face? >>We certainly can. Legacy data sources are always inherent with DQ issues. Even with master data management and data stewardship programs over the last two decades, you still have systemic data issues: siloed data, operational data stores, data marts. It became a hodgepodge. When organizations start their journey to migrate to the cloud, one of the first things we do is an inspection of the data, first and foremost looking to retire legacy data sources that aren't even used across the enterprise but stayed around because they were part of long-running, on-premise operational technology. When we look at data pipelines as we onboard a customer, we want to do that QA, that quality assurance, so that we can, as our ultimate goal, eliminate the garbage-in, garbage-out scenarios that we've been plagued with over the last 40 or 50 years of data in general. So we have to take an inspection approach. Traditionally it was ETL; in the world of Snowflake, it's really ELT: we're extracting, we're loading, we're inspecting, and then we're transforming for the business, so that these routines can be done once and give great business value back to making decisions around the data, instead of spending all this time re-architecting the data pipeline to serve the business. >>Got it, thank you, Glenn. Yusef, of course Snowflake is renowned for ease of use; customers tell me all the time it's so easy.
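Glenn's ETL-versus-ELT distinction can be sketched in miniature: land the raw data first, then inspect and transform it once, inside the warehouse, for all consumers. The sketch below is an illustrative toy in plain Python, with assumed field names; a real pipeline would use Snowflake stages and SQL.

```python
# Toy illustration of the ELT pattern Glenn describes: extract and load the
# raw data as-is, then inspect and transform inside the warehouse.

raw_source = ['101,WIDGET,9.99', '102,gadget, 19.5', 'bad row', '103,Sprocket,5']

def load_raw(lines):
    """Load: land the data untouched in a raw table; no transformation yet."""
    return [{"raw": line} for line in lines]

def inspect(raw_table):
    """Inspect: separate well-formed rows from garbage-in candidates."""
    good, quarantined = [], []
    for rec in raw_table:
        parts = rec["raw"].split(",")
        (good if len(parts) == 3 else quarantined).append(rec)
    return good, quarantined

def transform(good_rows):
    """Transform: apply typing and normalization once, for all consumers."""
    out = []
    for rec in good_rows:
        item_id, name, price = rec["raw"].split(",")
        out.append({"id": int(item_id), "name": name.strip().upper(),
                    "price": float(price)})
    return out

good, quarantined = inspect(load_raw(raw_source))
clean = transform(good)
```

Because the malformed row is quarantined at the inspect step rather than silently transformed, the "garbage in, garbage out" problem Glenn mentions is caught once, at load time, instead of rippling into every downstream report.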
It's so easy to spin up a data warehouse, it helps with security, it simplifies everything. But getting started is one thing; adoption is also key. So I'm interested in the role that Io-Tahoe plays in accelerating adoption for new customers. >>Absolutely, Dave. As Glenn said, every migration to Snowflake is going to have a business case, and that is going to be partly about reducing spend on legacy IT: servers, storage, licenses, support, all those things that CIOs ultimately want to be able to turn off entirely. What Io-Tahoe does is help discover all the legacy, undocumented silos that have been built up on the data estate across a period of time, as Glenn says, build intelligence around those silos, and help reduce those legacy costs sooner by accelerating that whole process. Obviously, the quicker that IT and CDOs can turn off legacy data sources, the more funding and resources are available to them to manage the new Snowflake-based data estate on the cloud. So turning off the old and building the new go hand in hand, to make sure the numbers stack up, the program is delivered, and the benefits are realized. What we're doing here at Io-Tahoe is improving the customer's ROI by accelerating their ability to adopt Snowflake. >>Great. We're talking a lot about data quality here, but in a lot of ways that's table stakes; like I said, if you don't trust the data, nobody's going to use it. And Glenn, I look at Snowflake and I see the ease of use, the simplicity, and you guys are nailing that. The data-sharing capabilities I think are really exciting, because everybody talks about sharing data, but then we talk about data as an asset and everyone wants to hold on to it. So sharing is something I see as a paradigm shift, and you guys are enabling that.
So what are some of the things beyond data quality that are notable, that customers are excited about, that maybe you're excited about? >>Dave, I think you just called it out: it's this massive data-sharing play as part of the Data Cloud platform. Just last year we had a little over 100 vendors in our data marketplace; that number today is well over 450. It is all about democratizing and sharing data in a world that is no longer held back by FTP and CSVs and the organization having to take that data and ingest it into their systems. If you're a Snowflake customer and want to subscribe to an S&P data source, as an example, go subscribe to it; it's in your account. There is no data engineering, there is no physical lift of data, and that becomes the most important thing when we talk about getting broader insights and data quality: the data has already been inspected by your vendor and is simply available in your account. It's obviously a very simplistic thing to describe; behind the scenes is what our founders created to make it very, very easy for us to democratize data, not only internally with private sharing, but through this notion of the marketplace, sharing across your customers. The marketplace is certainly top of mind for all of my customers. Another area you might have heard about at our recent cloud summit is the introduction of Snowpark, and where all this data is going: AI and ML, along with our partners at Io-Tahoe and their RPA automation. What do we do with all this data? How do we apply algorithms and models to it? We'll be able to run R and Python scripts and Java libraries directly inside Snowflake, which lets you accelerate even faster than what people found traditionally when we started off eight years ago as a data-warehousing platform. >>Yeah, I think we're on the cusp of a whole new way of thinking about data.
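The marketplace mechanics Glenn describes, no FTP, no CSVs, no physical lift, amount to consumers holding a reference to the provider's single dataset instead of ingesting their own copies. The following is a toy conceptual model of that no-copy sharing idea, not Snowflake's actual API; all class and listing names are invented for illustration.

```python
# Toy model of no-copy data sharing: consumers reference the provider's one
# physical dataset rather than loading copies. Conceptual only, not Snowflake.

class SharedDataset:
    def __init__(self, rows):
        self._rows = rows          # the single physical copy, owned by the provider

    def update(self, rows):
        self._rows = rows          # provider updates are visible to all consumers

    def read(self):
        return list(self._rows)    # consumers read; they never copy-and-load

class ConsumerAccount:
    def __init__(self, name):
        self.name = name
        self.subscriptions = {}

    def subscribe(self, listing_name, dataset):
        # "Go subscribe to it; it's in your account": just a reference, no ETL.
        self.subscriptions[listing_name] = dataset

    def query(self, listing_name):
        return self.subscriptions[listing_name].read()

sp_ratings = SharedDataset([{"ticker": "ACME", "rating": "AA"}])
bank = ConsumerAccount("bank")
bank.subscribe("sp_ratings", sp_ratings)
before = bank.query("sp_ratings")
sp_ratings.update([{"ticker": "ACME", "rating": "AA+"}])
after = bank.query("sp_ratings")
```

The point of the toy is that the consumer sees the provider's update immediately, with no re-ingestion step, which is why the data-quality inspection done once by the provider carries through to every subscriber.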
Obviously simplicity is a starting point, but data by its very nature is decentralized. You talk about democratizing data; I like this idea of the global mesh. It's a very powerful concept, and it's early days, but a key part of this is automation and trust. Yusef, you've worked with Snowflake and you're bringing ActiveDQ to the market. What are customers telling you so far? >>Well, Dave, the feedback so far has been great, which is brilliant. Firstly, there's a point about speed and acceleration, the speed to insight really. Where customers have inherent data quality issues, whether with data that was on premise and is being brought into Snowflake or with data on Snowflake itself, we're able to show them results and help them understand their data quality better within day one, which is a fantastic acceleration. Related to that, there's the cost and effort to get that insight: it's a massive productivity gain versus what we see from customers who've been struggling to remediate legacy data and legacy decisions made over the past couple of decades, so that cost and effort is much lower than it would otherwise have been. Thirdly, there's confidence and trust. CDOs and CIOs get demonstrable results showing they've been able to improve data quality across a whole range of use cases, for business users in marketing and customer services, for commercial teams, for financial teams, so there's a very quick growth in confidence and credibility as the projects get moving. And finally, all the use cases for Snowflake depend on data quality, whether it's data science or the kind of Snowpark applications Glenn has talked about. All those use cases work better when we accelerate the ROI for our joint customers by very quickly pushing out these data quality insights.
And I think one of the things Snowflake has recognized is that for CIOs to really adopt it enterprise-wide, as well as the great technology Snowflake offers, it's about cleaning up that legacy data estate, freeing up the budget for the CIO to spend on the new, modern data estate that lets them mobilize their data with Snowflake. >>So you're seeing a sensible progression: we're simplifying the analytics from a tech perspective, you bring in federated governance, which brings more trust, then you bring in the automation of the data quality piece, which is fundamental, and now you can really start to, as you guys are saying, democratize, scale, and share data. Very powerful. Guys, thanks so much for coming on the program. Really appreciate your time. >>Thank you. I appreciate it as well.
Io-Tahoe Episode 5: Enterprise Digital Resilience on Hybrid and Multicloud
>>From around the globe, it's theCUBE, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io-Tahoe. Hello, everyone, and welcome to our continuing series covering data automation, brought to you by Io-Tahoe. Today we're going to look at how to ensure enterprise resilience for hybrid and multicloud. Let's welcome in Ajay Vohora, who is the CEO of Io-Tahoe. AJ, always good to see you again. Thanks for coming on. >>Great to be back, Dave. Pleasure. >>And he's joined by Fozzy Coons, who is a global principal architect for the financial services vertical at Red Hat. He's got deep experience in that sector. Welcome, Fozzy. Good to see you. >>Thank you very much. Happy to be here. >>Fozzy, let's start with you. Look, there are a lot of views on cloud and what it is. I wonder if you could explain to us how you think about what a hybrid cloud is and how it works. >>Sure, yes. A hybrid cloud is an IT architecture that incorporates some degree of workload portability, orchestration, and management across multiple clouds. Those clouds could be private clouds, public clouds, or even your own data centers. And how does it all work? It's all about secure interconnectivity and on-demand allocation of resources across clouds; separate clouds become hybrid when they're interconnected in that way. It is that interconnectivity that allows workloads to be moved and management to be unified, and how well you build these interconnections has a direct impact on how well your hybrid cloud will work. >>Okay, Fozzy, staying with you for a minute. In the early days of cloud, the term private cloud was thrown around a lot, but it often just meant virtualization of an on-prem system and a network connection to the public cloud. Let's bring it forward. What, in your view, does a modern hybrid cloud architecture look like? >>Sure.
For modern hybrid clouds, we see that organizations need to focus on the portability of applications across clouds. That's very important. When organizations build applications, they need to build and deploy them as small collections of independent, loosely coupled services, and then have those run on the same operating system, which means, in other words, running Linux everywhere, building cloud-native applications, and being able to manage and orchestrate those applications with platforms like Kubernetes or Red Hat OpenShift, for example.
Let's bring a J into the conversation. A J. You know, you and I have talked about this in the past. One of the big problems that virtually every companies face is data fragmentation. Um, talk a little bit about how I owe Tahoe unifies data across both traditional systems legacy systems. And it connects to these modern I t environments. >>Yeah, sure, Dave. I mean, fancy just nailed it. There used to be about data of the volume of data on the different types of data. But as applications become or connected and interconnected at the location of that data really matters how we serve that data up to those those app. So working with red hat in our partnership with Red Hat being able Thio, inject our data Discovery machine learning into these multiple different locations. Would it be in AWS on IBM Cloud or A D. C p R. On Prem being able thio Automate that discovery? I'm pulling that. That single view of where is all my data then allows the CEO to manage cast that can do things like one. I keep the data where it is on premise or in my Oracle Cloud or in my IBM cloud on Connect. The application that needs to feed off that data on the way in which you do that is machine learning. That learns over time is it recognizes different types of data, applies policies to declassify that data. Andi and brings it all together with automation. >>Right? And that's one of the big themes and we've talked about this on earlier episodes. Is really simplification really abstracting a lot of that heavy lifting away so we can focus on things A. J A. Z. You just mentioned e nifaz e. One of the big challenges that, of course, we all talk about his governance across thes disparity data sets. I'm curious as your thoughts. How does Red Hat really think about helping customers adhere to corporate edicts and compliance regulations, which, of course, are are particularly acute within financial services. >>Oh, yeah, Yes. 
So for banks and the payment providers, like you've just mentioned their insurers and many other financial services firms, Um, you know, they have to adhere Thio standards such as a PC. I. D. S s in Europe. You've got the G g d p g d p r, which requires strange and tracking, reporting documentation. And you know, for them to to remain in compliance and the way we recommend our customers to address these challenges is by having an automation strategy. Right. And that type of strategy can help you to improve the security on compliance off the organization and reduce the risk after the business. Right. And we help organizations build security and compliance from the start without consulting services residencies. We also offer courses that help customers to understand how to address some of these challenges. And that's also we help organizations build security into their applications without open sources. Mueller, where, um, middle offerings and even using a platform like open shift because it allows you to run legacy applications and also continue rights applications in a unified platform right And also that provides you with, you know, with the automation and the truly that you need to continuously monitor, manage and automate the systems for security and compliance >>purposes. Hey, >>Jay, anything. Any color you could add to this conversation? >>Yeah, I'm pleased. Badly brought up Open shift. I mean, we're using open shift to be able. Thio, take that security application of controls to to the data level. It's all about context. So, understanding what data is there being able to assess it to say who should have access to it. Which application permission should be applied to it. Um, that za great combination of Red Hat tonight. Tahoe. >>But what about multi Cloud? Doesn't that complicate the situation even even further? Maybe you could talk about some of the best practices to apply automation across not only hybrid cloud, but multi >>cloud a swell. Yeah, sure. >>Yeah. 
So the right automation solution, you know, can be the difference between, you know, cultivating an automated enterprise or automation caress. And some of the recommendations we give our clients is to look for an automation platform that can offer the first thing is complete support. So that means have an automation solution that provides that provides, um, you know, promotes I t availability and reliability with your platform so that you can provide, you know, enterprise great support, including security and testing, integration and clear roadmaps. The second thing is vendor interoperability interoperability in that you are going to be integrating multiple clouds. So you're going to need a solution that can connect to multiple clouds. Simples lee, right? And with that comes the challenge off maintain ability. So you you you're going to need to look into a automation Ah, solution that that is easy to learn or has an easy learning curve. And then the fourth idea that we tell our customers is scalability in the in the hybrid cloud space scale is >>is >>a big, big deal here, and you need a to deploy an automation solution that can span across the whole enterprise in a constituent, consistent manner, right? And then also, that allows you finally to, uh, integrate the multiple data centers that you have, >>So A J I mean, this is a complicated situation, for if a customer has toe, make sure things work on AWS or azure or Google. Uh, they're gonna spend all their time doing that, huh? What can you add really? To simplify that that multi cloud and hybrid cloud equation? >>Yeah. I could give a few customer examples here Warming a manufacturer that we've worked with to drive that simplification Onda riel bonuses for them is has been a reduction cost. We worked with them late last year to bring the cost bend down by $10 million in 2021 so they could hit that reduced budget. 
Andre, What we brought to that was the ability thio deploy using open shift templates into their different environments. Where there is on premise on bond or in as you mentioned, a W s. They had G cps well, for their marketing team on a cross, those different platforms being out Thio use a template, use pre built scripts to get up and running in catalog and discover that data within minutes. It takes away the legacy of having teams of people having Thio to jump on workshop cause and I know we're all on a lot of teens. The zoom cause, um, in these current times, they just sent me is in in of hours in the day Thio manually perform all of this. So yeah, working with red hat applying machine learning into those templates those little recipes that we can put that automation toe work, regardless of which location the data is in allows us thio pull that unified view together. Right? >>Thank you, Fozzie. I wanna come back to you. So the early days of cloud, you're in the big apple, you know, financial services. Really well. Cloud was like an evil word within financial services, and obviously that's changed. It's evolved. We talked about the pandemic, has even accelerated that, Um And when you really, you know, dug into it when you talk to customers about their experiences with security in the cloud it was it was not that it wasn't good. It was great, whatever. But it was different. And there's always this issue of skill, lack of skills and multiple tools suck up teams, they're really overburdened. But in the cloud requires new thinking. You've got the shared responsibility model you've got obviously have specific corporate requirements and compliance. So this is even more complicated when you introduce multiple clouds. So what are the differences that you can share from your experience is running on a sort of either on Prem or on a mono cloud, um, or, you know, and versus across clouds. What? What? What do you suggest there? 
>>Yeah, you know, because of these complexities that you have explained here, Miss Configurations and the inadequate change control the top security threats. So human error is what we want to avoid because is, you know, as your clouds grow with complexity and you put humans in the mix, then the rate off eras is going to increase, and that is going to exposure to security threat. So this is where automation comes in because automation will streamline and increase the consistency off your infrastructure management. Also application development and even security operations to improve in your protection, compliance and change control. So you want to consistently configure resources according to a pre approved um, you know, pre approved policies and you want to proactively maintain a to them in a repeatable fashion over the whole life cycle. And then you also want to rapid the identified system that require patches and and reconfiguration and automate that process off patching and reconfiguring so that you don't have humans doing this type of thing, right? And you want to be able to easily apply patches and change assistant settings. According Thio, Pre defined, based on like explained before, you know, with the pre approved policies and also you want is off auditing and troubleshooting, right? And from a rate of perspective, we provide tools that enable you to do this. We have, for example, a tool called danceable that enables you to automate data center operations and security and also deployment of applications and also obvious shit yourself, you know, automates most of these things and obstruct the human beings from putting their fingers on, causing, uh, potentially introducing errors right now in looking into the new world off multiple clouds and so forth. The difference is that we're seeing here between running a single cloud or on prem is three main areas which is control security and compliance. 
Right control here it means if your on premise or you have one cloud, um, you know, in most cases you have control over your data and your applications, especially if you're on Prem. However, if you're in the public cloud, there is a difference there. The ownership, it is still yours. But your resources are running on somebody else's or the public clouds. You know, e w s and so forth infrastructure. So people that are going to do this need to really especially banks and governments need to be aware off the regulatory constraints off running, uh, those applications in the public cloud. And we also help customers regionalize some of these choices and also on security. You will see that if you're running on premises or in a single cloud, you have more control, especially if you're on Prem. You can control this sensitive information that you have, however, in the cloud. That's a different situation, especially from personal information of employees and things like that. You need to be really careful off that. And also again, we help you rationalize some of those choices. And then the last one is compliant. Aziz. Well, you see that if you're running on Prem or a single cloud, um, regulations come into play again, right? And if you're running a problem, you have control over that. You can document everything you have access to everything that you need. But if you're gonna go to the public cloud again, you need to think about that. We have automation, and we have standards that can help you, uh, you know, address some of these challenges for security and compliance. >>So that's really strong insights, Potsie. I mean, first of all, answerable has a lot of market momentum. Red hats in a really good job with that acquisition, your point about repeatability is critical because you can't scale otherwise. And then that idea you're you're putting forth about control, security compliance It's so true is I called it the shared responsibility model. 
And there was a lot of misunderstanding in the early days of cloud. I mean, yeah, maybe a W s is gonna physically secure the, you know, s three, but in the bucket. But we saw so many Miss configurations early on. And so it's key to have partners that really understand this stuff and can share the experiences of other clients. So this all sounds great. A j. You're sharp, you know, financial background. What about the economics? >>You >>know, our survey data shows that security it's at the top of the spending priority list, but budgets are stretched thin. E especially when you think about the work from home pivot and and all the areas that they had toe the holes that they had to fill their, whether it was laptops, you know, new security models, etcetera. So how do organizations pay for this? What's the business case look like in terms of maybe reducing infrastructure costs so I could, you know, pay it forward or there's a There's a risk reduction angle. What can you share >>their? Yeah. I mean, the perspective I'd like to give here is, um, not being multi cloud is multi copies of an application or data. When I think about 20 years, a lot of the work in financial services I was looking at with managing copies of data that we're feeding different pipelines, different applications. Now what we're saying I talk a lot of the work that we're doing is reducing the number of copies of that data so that if I've got a product lifecycle management set of data, if I'm a manufacturer, I'm just gonna keep that in one location. But across my different clouds, I'm gonna have best of breed applications developed in house third parties in collaboration with my supply chain connecting securely to that. That single version of the truth. What I'm not going to do is to copy that data. So ah, lot of what we're seeing now is that interconnectivity using applications built on kubernetes. 
Decoupled from the data source, that allows us to reduce those copies of data, and with that you gain security capability and resilience, because you're not leaving yourself open to those multiple copies of data and the costs that come with them: the cost of storage and the cost of compute. So what we're seeing is using multi cloud to leverage the best of what each cloud platform has to offer, and that goes all the way to Snowflake and Heroku on cloud-managed databases, too. >>Well, and there's the people cost as well, when you think about the copy creep; when something goes wrong, a human has to come in and figure it out. You brought up Snowflake and its vision of the data cloud. I think we're going to be rethinking data architectures in the coming decade, AJ, where data stays where it belongs, it's distributed, and you're providing access. Like you said, you're separating the data from the applications, and applications, as we talked about with Fozzie, become much more portable. So the last 10 years will really be different from the next 10 years, AJ. >>Definitely. I think the people cost equation has changed, too. Gone are the days when you needed a dozen people governing and managing access policies to data; a lot of that repetitive work, those tasks, can now be automated. We've seen examples in insurance where we reduced teams of 15 people working in the back office, trying to apply security controls and compliance, down to just a couple of people who look at the exceptions that don't fit. And that's really important, because maybe two years ago the emphasis was on regulatory compliance of data, with policies such as GDPR and CCPA; last year it was very much the economic effect of reduced headcounts, with enterprises running lean and looking to reduce cost.
This year, we can see that already some of the more proactive companies are looking at initiatives such as net-zero emissions, and at how they can use data to understand how to have a better social impact, across all of their operations and supply chain. So alongside the regulatory compliance issues that may be external, we see similar patterns emerging for internal initiatives that benefit the environment and social impact. >>Great perspectives. Jeff Hammerbacher once famously said that the best minds of his generation were trying to get people to click on ads, and AJ, those examples you just gave of social good and moving things forward are really critical. I think that's where data is going to have its biggest societal impact. Okay, guys, great conversation. Thanks so much for coming on the program; really appreciate your time. Keep it right there for more insight and conversation around creating a resilient digital business model. You're watching theCUBE. >>Digital resilience: automated compliance, privacy, and security for your multi cloud. Congratulations, you're on the journey. You have successfully transformed your organization by moving to a cloud-based platform to ensure business continuity in these challenging times. But as you scale your digital activities, there is an inevitable influx of users that outpaces traditional methods of cybersecurity, exposing your data to underlying threats and making your company susceptible to ever-greater risk. To become digitally resilient, have you applied controls to your data continuously throughout the data lifecycle? What are you doing to keep your customer and supplier data private and secure? Io-Tahoe's automated sensitive-data discovery is pre-programmed with over 300 existing policies that meet government-mandated risk and compliance standards.
These automate the process of applying policies and controls to your data. Our algorithm-driven recommendation engine alerts you to risk exposure at the data level and suggests the appropriate next steps to remain compliant and ensure sensitive data is secure. Unsure about where your organization stands in terms of digital resilience? Sign up for our minimal-cost, minimal-commitment free data health check. Let us run our sensitive-data discovery on key unmapped data silos and sources to give you a clear understanding of what's in your environment. Book time with an Io-Tahoe engineer now. >>Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience within an as-a-service consumption model. We're now joined by Yusef Khan, who heads data services for Io-Tahoe, and Suresh, the vice president and head of U.S. sales at Happiest Minds. Gents, welcome to the program. Great to have you in theCUBE. >>Thank you, David. >>Suresh, you guys talk about Happiest Minds and this notion of born digital, born agile. I like that. Talk about your mission at the company. >>Sure. Founded in 2011, Happiest Minds is a born-digital, born-agile company. The reason is that we are focused on customers: our customer-centric approach and delivering digital, seamless solutions have helped us be in the race along with the tier-one providers. Our mission, happiest people, happiest customers, is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the Great Places to Work survey, with a Glassdoor rating of 4.1 against a maximum rating of 5, which is among the top of the Indian IT services companies. That shows the mission and the culture. What we have built on is our values: sharing, mindfulness, integrity, learning, and social responsibility are the core values of our company, and that's what the entire culture of the company has been built on.
>>That's great. That sounds like a happy place to be. Now, Yusef, you head up data services for Io-Tahoe; we've talked in the past, and of course you're out of London. What's your day-to-day focus with customers and partners? What are you focused on? >>Well, David, my team works daily with customers and partners to help them better understand their data, improve their data quality and their data governance, and make that data more accessible, in a self-service kind of way, to the stakeholders within those businesses. And this is all a key part of digital resilience, which we'll come on to talk about later. >>Right. That self-service theme is something we're going to really accelerate this decade, Yusef. But I wonder, before we get into that, maybe you could talk about the nature of the partnership with Happiest Minds. Why do you guys choose to work closely together? >>Very good question. We see Io-Tahoe and Happiest Minds as a great mutual fit. As Suresh has said, Happiest Minds is a very agile organization, and I think that's one of the key things that attracts their customers. Io-Tahoe is all about automation: we're using machine-learning algorithms to make data discovery, data cataloging, and understanding data lineage much easier, and we're enabling customers and partners to do it much more quickly. So when you combine our emphasis on automation with the emphasis on agility that Happiest Minds has, that's a really nice combination that works very well together; it's very powerful. I think the other thing that's key is that both businesses, as Suresh has said, are really innovative, digital-native companies, very focused on newer technologies, the cloud, et cetera.
Then finally, I think they're both challenger brands, and Happiest Minds has a really positive, fresh, ethical approach to people and customers that really resonates with us at Io-Tahoe. >>Great, thank you for that. So, Suresh, let's get into this whole notion of digital resilience. I want to set it up with what I see, and maybe you can comment. Prior to the pandemic, a lot of customers kind of equated disaster recovery with their business continuance or business resilience strategy, and that changed almost overnight. How have you seen your clients respond to what I sometimes call the forced march to become a digital business? And maybe you could talk about some of the challenges they faced along the way. >>Absolutely. Especially during these pandemic times, Dave, customers have had a tough time managing their business. Happiest Minds, being a digitally resilient company, was able to react much faster than others in the industry, apart from the other services companies. One of the key things is that organizations have been trying to adopt digital technologies: there has been a lot of data to be managed by these customers, and a lot of threats and risks to be managed by the CIOs and CISOs. So with Happiest Minds' digitally resilient technology, where we bring in data compliance as a service, we were able to manage resilience well ahead of other competitors in the market. We were able to bring in our business continuity processes from day one, and we were able to deliver our services without any interruption to what we deliver to our customers. That is where digital resilience, with business continuity processes enabled, was very helpful for us in enabling our customers to continue their business without any interruptions during the pandemic.
>>Some of the challenges customers tell me about: obviously they had to figure out how to get laptops to remote workers and make that whole work-from-home pivot, and figure out how to secure the endpoints. Looking back, those were kind of table stakes. But it sounds like, for you, a digital business means a data business, putting data at the core, as I like to say. So I wonder if you could talk a little bit more about the philosophy you have toward digital resilience, and the specific approach you take with clients? >>Absolutely. For any organization, data becomes the key, and so the first step is to identify the critical data. This is a six-step process we follow at Happiest Minds. First of all, we take stock of the current state: though customers think they have clear visibility of their data, we do an assessment from an external point of view and see how critical their data is. Then we help the customers strategize. The most important thing is to identify the most critical assets; data being the most critical asset for any organization, identification of the data is key for the customers. Then we help in building a viable operating model to ensure these identified critical assets are secured and monitored daily, so that they are consumed well as well as protected from external threats. Then, as a fourth step, we bring in awareness: we train people at all levels in the organization to understand the importance of these digital assets. As a fifth step, we put a backup plan in place, bringing in a very comprehensive and holistic testing approach across people, process, and technology, to see how the organization can withstand a crisis. And finally, we do continuous governance of this data, which is key; it is not just a one-step process.
We set up the environment, we do the initial analysis and set up the strategy, and then we continuously govern this data to ensure that it is not only managed well and secure, but also meets the compliance requirements of the organization. That is where we help organizations secure their data and meet their regulations, as per the privacy laws. So this is a constant process, not a one-time effort, because every organization goes on its digital journey and has to face all of this as part of the evolving environment. And that's where they should be kept ready, in terms of recovering, rebounding, and moving forward if things go wrong. >>So let's stick on that for a minute, and then I want to bring Yusef into the conversation. You mentioned compliance and governance. When you're a digital business, you're, as you say, a data business, and that brings up issues: data sovereignty, governance, compliance, things like the right to be forgotten, data privacy, so many things. These were often afterthoughts for businesses, bolted on, if you will; I know a lot of executives are very concerned that these be built in, and it's not a one-shot deal. So do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics there. >>So we have offered multiple services to our customers around digital resilience, and one of the key services is data compliance as a service. Here we help organizations map their key data against the data compliance requirements. Some of the features include continuous discovery of data, because organizations keep adding data as they become more digital, and helping them understand the actual data in terms of where it resides; it could be across heterogeneous data sources.
It could be in databases, or in data lakes, or even anywhere across the cloud environment. Identifying the data across these various heterogeneous environments is a very key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevailing laws have to be mapped based on the business rules, so we define those rules and help map the data, so that organizations know how critical their digital assets are. Then we work on continuous monitoring of data for anomalies, because that is one of the key features of the solution and it needs to run on a day-to-day operational basis; we help monitor those anomalies in the data for data quality management on an ongoing basis. And finally, we also bring in automated data governance, where we can manage the sensitive data policies and their data relationships, map and manage the business rules, and drive remediations, suggesting appropriate actions for the customers to take on those specific data sets. >>Great. Thank you, Yusef, thanks for being patient. I want to bring Io-Tahoe into the discussion and understand where your customers and Happiest Minds can leverage your data automation capability that you and I have talked about in the past. It would be great if you had an example as well, but maybe you could pick it up from there. >>Sure, Dave. At a high level, Suresh has clearly articulated it: Io-Tahoe delivers business agility by accelerating the time to operationalize data, automating the putting in place of controls, and helping put digital resilience in place. If we step back a little bit in time, traditional resilience in relation to data often meant manually making multiple copies of the same data. So you would have a DBA.
They would copy the data to various different places, and business users would access it in those functional silos. And of course, what happened was you ended up with lots of different copies of the same data around the enterprise: very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach; it's very hard to know where everything is. And I like that expression you used, David, the idea of the forced march to digital. Enterprises that are going on this forced march are finding that they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and containers enable a big leap forward: you can break applications down into microservices, updates are available via APIs, and so you don't have the same need to build and manage multiple copies of the data; you have an opportunity to just have a single version of the truth. Then your challenge is how you deal with the large legacy data estates that Suresh has been referring to, where you have to consolidate, and that's really where Io-Tahoe comes in. We massively accelerate the process of putting a single version of the truth into place: by automatically discovering the data, discovering what's duplicate and what's redundant, you can consolidate it down to a single trusted version much more quickly. We've seen many customers who have tried to do this manually, and it has literally taken years, using manual methods, to cover even a small percentage of their IT estates. With Io-Tahoe you can do it very quickly, with tangible results within weeks and months. And then you can apply controls to the data based on context: who's the user, what's the content, what's the use case, things like data quality validations or access permissions.
Once you've done that, your applications and your enterprise are much more secure and much more resilient as a result. You've got to do these things whilst retaining agility, though. So, coming full circle, this is where the partnership with Happiest Minds really comes in as well: you've got to be agile, you've got to have controls, and you've got to drive toward the business outcomes, and it's doing those three things together that really delivers for the customer. >>Thank you, Yusef. You and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved; we know that can't scale, but there's also that compression of time to get to the next step and, ultimately, to the outcome. We've talked to a number of customers in theCUBE, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver. Reducing complexity, automating, and getting to insights faster is where you see telephone numbers in terms of business impact. So my question is, where should customers start? How can they take advantage of some of these opportunities that we've discussed today? >>Well, we've tried to make that easy for customers. With Io-Tahoe and Happiest Minds you can very quickly do what we call a data health check. This is a two-to-three-week process to really quickly start to understand and deliver value from your data. Io-Tahoe deploys into the customer environment, and the data doesn't go anywhere. We look at a few data sources and a sample of data, and we can very rapidly demonstrate how the discovery, the cataloging, and the understanding of duplicate and redundant data can be done using machine learning, and how those problems can be solved.
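The machine-learning-driven discovery of duplicate and redundant data that Yusef describes can be illustrated with a deliberately tiny sketch: fingerprint each column by its values, then flag pairs of columns that overlap heavily. This is a hypothetical simplification for illustration only, not Io-Tahoe's actual algorithm:

```python
def profile(column):
    """Fingerprint a column by its distinct values. A real profiler would
    also use statistics, value patterns, and ML-derived features."""
    return frozenset(column)

def overlap(a, b):
    """Jaccard similarity between two column fingerprints."""
    return len(a & b) / len(a | b)

def find_duplicates(datasets, threshold=0.9):
    """Flag pairs of columns, across data sets, whose value overlap
    suggests they are copies of the same data."""
    names = list(datasets)
    pairs = []
    for i, left in enumerate(names):
        for right in names[i + 1:]:
            if overlap(profile(datasets[left]), profile(datasets[right])) >= threshold:
                pairs.append((left, right))
    return pairs

# Two sources holding the same customer emails, plus an unrelated column.
sources = {
    "crm.email": ["a@x.com", "b@x.com", "c@x.com"],
    "mkt.email": ["a@x.com", "b@x.com", "c@x.com"],
    "orders.id": ["o-1", "o-2"],
}
print(find_duplicates(sources))  # → [('crm.email', 'mkt.email')]
```

Consolidating down to a single trusted version then becomes a matter of reviewing the flagged pairs rather than manually comparing every source, which is where the weeks-versus-years difference comes from.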
And so what we tend to find is that we can very quickly, as I say, in a matter of a few weeks, show a customer how they can get to a more resilient outcome, and then how they can scale that up, take it into production, really understand their data estate better, and build resilience into the enterprise. >>Excellent. There you have it. We'll leave it right there. Guys, great conversation. Thanks so much for coming on the program. Best of luck to you and the partnership. Be well. >>Thank you, David. >>Thank you. >>And thank you for watching, everybody. This is Dave Vellante for theCUBE and our ongoing series on data automation with Io-Tahoe.
>>Okay, now we're going to go into the demo. We want to get a better understanding of how you can leverage OpenShift and Io-Tahoe to facilitate faster application deployment. Let me pass the mic to Sabina. Take it away. >>Thanks, Dave. Happy to be here again, guys. As Dave mentioned, I'm Sabina, the enterprise account executive here at Io-Tahoe. Today we just wanted to give you guys a general overview of how we're using OpenShift. >>Hey, I'm Noah, Io-Tahoe's data operations engineer working with OpenShift. I've been learning the internals of OpenShift for the past few months, and I'm here to share what I've learned. >>Okay, so before we begin, I'm sure everybody wants to know: Noah, what are the benefits of using OpenShift? >>Well, there are five that I can think of: faster time to operation, simplicity, automation, control, and digital resilience. >>Okay, that's really interesting, because those are the exact same benefits that we at Io-Tahoe deliver to our customers. But let's start with faster time to operation. By running Io-Tahoe on OpenShift, is it faster than, let's say, using Kubernetes on other platforms? >>Our objective at Io-Tahoe is to be accessible across multiple cloud platforms, and by hosting our application in containers we're able to achieve this. So to answer your question, it's faster to create and use your application images using container tools like Kubernetes with OpenShift, as compared to Kubernetes with Docker, CRI-O, or containerd. >>Okay, so we got a bit technical there. Can you explain that in a bit more detail? >>Yeah, there's a bit of vocabulary involved. Basically, containers are used in developing things like databases, web servers, or applications such as Io-Tahoe.
What's great about containers is that they split the workload, so developers can select the libraries they need without breaking anything, and sysadmins can update the host without interrupting the programmers. Now, OpenShift works hand in hand with Kubernetes to provide a way to build those containers for applications. >>Okay, got it. So basically containers make life easier for developers and system admins. How does OpenShift differ from other platforms? >>Well, this kind of leads into the second benefit I want to talk about, which is simplicity. Basically, there are a lot of steps involved when using Kubernetes with Docker, but OpenShift simplifies this with its source-to-image process, which takes the source code and turns it into a container image. But that's not all: OpenShift has a lot of automation and features that simplify working with containers, an important one being its web console. Here I've set up a light version of OpenShift called CodeReady Containers, and I was able to set up our application right from the web console. And I was able to set up this entire thing on Windows, Mac, and Linux, so it's environment-agnostic in that sense. >>Okay, so I see on the top left that this is a developer's view. What would a systems admin view look like? >>That's a good question. Here's the administrator view, and this kind of ties into the benefit of control. This view gives insights into each one of the applications and containers that are running, and you can make changes without affecting deployment. You can also, within this view, set up each layer of security, and there are multiple layers you can put up, but I haven't fully messed around with it, because with my luck I'd probably lock myself out. >>That seems pretty secure. Is there a single point of security, such as a user login, or are there multiple layers of security? >>Yeah, there are multiple layers of security.
There's your user login, security groups, and general role-based access controls, but there's also a ton of layers of security surrounding the containers themselves; for the sake of time, I won't get too far into it. >>Okay, so you mentioned simplicity and time to operation as being two of the benefits. You also briefly mentioned automation, and as you know, automation is the backbone of our platform here at Io-Tahoe, so that's certainly grabbed my attention. Can you go a bit more in depth in terms of automation? >>OpenShift provides extensive automation that speeds up the time to operation. The latest versions of OpenShift come with a built-in CRI-O container engine, which basically means that you get to skip the container-engine installation step, and you don't have to log in to each individual container host and configure networking, registry servers, storage, et cetera. So I'd say it automates the more boring, tedious processes. >>Okay, so I see the Io-Tahoe template there. What does it allow me to do, in terms of automation in application development? >>So we've created an OpenShift template that contains our application. This allows developers to instantly set up our product within that template. >>So, Noah, last question. Speaking of vocabulary, you mentioned earlier digital resilience, a term we're hearing especially in the banking and finance world. It seems, from what you've described, that industries like banking and finance would be more resilient using OpenShift, correct? >>Yeah. In terms of digital resilience, OpenShift gives you better control over the resources each container is consuming. In addition, a benefit of containers is that, as I mentioned earlier, sysadmins can troubleshoot servers without bringing down the application, and if the application does go down, it's easy to bring it back up using templates and the other automation features that OpenShift provides.
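Noah's point about controlling each container's resource consumption comes down to the requests and limits declared on each container in a manifest. As a purely hypothetical sketch (plain Python over a parsed manifest dictionary, not an OpenShift API call), a check for containers that could starve their neighbors might look like this:

```python
def missing_limits(deployment):
    """Return the names of containers in a parsed Deployment manifest
    that lack CPU or memory limits; without limits, one workload can
    starve the others on the same node."""
    containers = deployment["spec"]["template"]["spec"]["containers"]
    flagged = []
    for container in containers:
        limits = container.get("resources", {}).get("limits", {})
        if "cpu" not in limits or "memory" not in limits:
            flagged.append(container["name"])
    return flagged

# A manifest as it would look after parsing the deployment YAML.
manifest = {
    "spec": {"template": {"spec": {"containers": [
        {"name": "api",
         "resources": {"limits": {"cpu": "500m", "memory": "256Mi"}}},
        {"name": "worker"},  # no limits declared
    ]}}}
}
print(missing_limits(manifest))  # → ['worker']
```

In a real cluster one would enforce this with LimitRange and ResourceQuota objects rather than an ad hoc script; the sketch just illustrates what "control over consumption" means at the container level.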
>>Okay, so thanks so much, Noah. Any final thoughts you want to share? >>Yeah, I just want to give a quick recap of the five benefits that you gain by using OpenShift. The five are time to operation, automation, control, security, and simplicity: you can deploy applications faster, you can simplify the workload, you can automate a lot of the otherwise tedious processes, you can maintain full control over your workflow, and you can assert digital resilience within your environment. >>Guys, thanks for that; appreciate the demo. I wonder, you've been talking about the combination of Io-Tahoe and Red Hat. Can you tie that in, Sabina, to digital resilience specifically? >>Yeah, sure, Dave. When we speak to the benefits of security controls in terms of digital resilience at Io-Tahoe, we automate detection and apply controls at the data level, so this provides for more enhanced security. >>Okay. But if you were trying to do all these things manually, what would that mean? How much time can I compress? What's the time to value? >>With our latest versions of Io-Tahoe, we're taking advantage of the faster deployment time associated with containerization and Kubernetes. This speeds up the time it takes for customers to start using our software, as they're able to quickly spin up Io-Tahoe in their own on-premise environment or in their own cloud environment, including AWS, Azure, Google Cloud Platform, or IBM Cloud. Our quick-start templates allow the flexibility to deploy into multi-cloud environments with just a few clicks. >>Okay, and to quickly add: what we've done here at Io-Tahoe is really move our customers away from the whole idea of needing a team of engineers to apply controls to data, as compared to other manually driven workflows. With templates, automation, pre-set policies, and data controls,
one person can be fully operational within a few hours and achieve results straight out of the box on any cloud. >>Yeah, we've been talking about this theme of abstracting the complexity; that's really what we're seeing as a major trend in this coming decade. Okay, great. Thanks, Sabina and Noah. How can people get more information, or where should they go if they have follow-up questions? >>Yeah, sure, Dave. If you're interested in learning more, reach out to us at info@iotahoe.com to speak with one of our sales engineers. We'd love to hear from you, so book a meeting as soon as you can. >>All right. Thanks, guys. Keep it right there for more Cube content.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
David | PERSON | 0.99+ |
Jeff Hammer | PERSON | 0.99+ |
John | PERSON | 0.99+ |
Eva Hora | PERSON | 0.99+ |
David Suresh | PERSON | 0.99+ |
Sabina | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Yusuf Khan | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
London | LOCATION | 0.99+ |
2021 | DATE | 0.99+ |
two | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Dave Volonte | PERSON | 0.99+ |
Siri | TITLE | 0.99+ |
ORGANIZATION | 0.99+ | |
Fozzie | PERSON | 0.99+ |
2 | QUANTITY | 0.99+ |
five | QUANTITY | 0.99+ |
David Pleasure | PERSON | 0.99+ |
iata ho dot com | ORGANIZATION | 0.99+ |
Jay | PERSON | 0.99+ |
Five | QUANTITY | 0.99+ |
six step | QUANTITY | 0.99+ |
five benefits | QUANTITY | 0.99+ |
15 people | QUANTITY | 0.99+ |
Yousef | PERSON | 0.99+ |
$10 million | QUANTITY | 0.99+ |
This year | DATE | 0.99+ |
first step | QUANTITY | 0.99+ |
Ideo Tahoe | ORGANIZATION | 0.99+ |
last year | DATE | 0.99+ |
Andre | PERSON | 0.99+ |
hundreds | QUANTITY | 0.99+ |
One | QUANTITY | 0.99+ |
one cloud | QUANTITY | 0.99+ |
2011 | DATE | 0.99+ |
Tahoe | ORGANIZATION | 0.99+ |
Today | DATE | 0.99+ |
Noel | PERSON | 0.99+ |
Red Hat | ORGANIZATION | 0.99+ |
Prem | ORGANIZATION | 0.99+ |
today | DATE | 0.99+ |
tonight | DATE | 0.99+ |
Io Tahoe | ORGANIZATION | 0.99+ |
second benefit | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
Iota A J. | ORGANIZATION | 0.99+ |
one step | QUANTITY | 0.99+ |
both | QUANTITY | 0.98+ |
third one | QUANTITY | 0.98+ |
Siris | TITLE | 0.98+ |
Aziz | PERSON | 0.98+ |
red hat | ORGANIZATION | 0.98+ |
each layer | QUANTITY | 0.98+ |
both businesses | QUANTITY | 0.98+ |
fourth idea | QUANTITY | 0.98+ |
apple | ORGANIZATION | 0.98+ |
1/5 step | QUANTITY | 0.98+ |
Toyota Ho | ORGANIZATION | 0.98+ |
first challenge | QUANTITY | 0.98+ |
41 | QUANTITY | 0.98+ |
azure | ORGANIZATION | 0.98+ |
Io Tahoe | PERSON | 0.98+ |
One person | QUANTITY | 0.98+ |
one location | QUANTITY | 0.98+ |
single | QUANTITY | 0.98+ |
Noah | PERSON | 0.98+ |
over 300 existing policies | QUANTITY | 0.98+ |
Iot Tahoe | ORGANIZATION | 0.98+ |
Thio | PERSON | 0.98+ |
Lenox | ORGANIZATION | 0.98+ |
two years ago | DATE | 0.98+ |
A. J A. Z. | PERSON | 0.98+ |
single point | QUANTITY | 0.98+ |
first thing | QUANTITY | 0.97+ |
Yussef | PERSON | 0.97+ |
Jupiter | LOCATION | 0.97+ |
second thing | QUANTITY | 0.97+ |
three things | QUANTITY | 0.97+ |
about 20 years | QUANTITY | 0.97+ |
single cloud | QUANTITY | 0.97+ |
First | QUANTITY | 0.97+ |
Suresh | PERSON | 0.97+ |
3 week | QUANTITY | 0.97+ |
each container | QUANTITY | 0.97+ |
each cloud platform | QUANTITY | 0.97+ |
Yusef Khan & Suresh Kanniappan | Io Tahoe Enterprise Digital Resilience on Hybrid & Multicloud
>>From around the globe, it's the Cube, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io-Tahoe. Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience within an as-a-service consumption model. We're now joined by Yusef Khan, who heads data services for Io-Tahoe, and Suresh Kanniappan, who's the vice president and head of U.S. sales at Happiest Minds. Gents, welcome to the program. Great to have you in the Cube. >>Thank you, David. >>Suresh, you guys talk about Happiest Minds, this notion of born digital, born agile. I like that. But talk about your mission at the company. >>Sure. Founded in 2011, Happiest Minds is a born-digital, born-agile company. The reason is that we are focused on customers: our customer-centric approach to delivering digital and seamless solutions has helped us stay in the race alongside the Tier 1 providers. Our mission, "happiest people, happiest customers," is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the Great Place to Work survey, with a Glassdoor rating of 4.1 out of five, which is among the best for Indian IT services companies. That shows the mission and the culture. We have built it on our values: sharing, mindfulness, integrity, learning and social responsibility are the core values of our company, and that's what the entire culture of the company has been built on. >>That's great. That sounds like a happy place to be. Now, Yusef, you head up data services for Io-Tahoe. We've talked in the past; of course you're out of London. What's your day-to-day focus with customers and partners? What are you focused on?
>>Well, David, my team works daily with customers and partners to help them better understand their data, improve their data quality and their data governance, and help them make that data more accessible, in a self-service kind of way, to the stakeholders within those businesses. And this is all a key part of digital resilience, which we'll come on to talk about later. >>Right. I mean, that self-service theme is something we're really going to accelerate this decade, Yusef. But I wonder, before we get into that, maybe you could talk about the nature of the partnership with Happiest Minds. You know, why do you guys choose to work closely together? >>Very good question. We see Io-Tahoe and Happiest Minds as a great mutual fit. As Suresh has said, Happiest Minds is a very agile organization, and I think that's one of the key things that attracts their customers. Io-Tahoe is all about automation: we're using machine learning algorithms to make data discovery, data cataloging and understanding data redundancy much easier, and we're enabling customers and partners to do it much more quickly. So when you combine our emphasis on automation with the emphasis on agility that Happiest Minds have, that's a really nice combination that works very well together, very powerful. I think the other thing that's key is that both businesses, as Suresh has said, are really innovative, digital-native type companies, very focused on newer technologies, the cloud, etcetera. And then finally, I think both are challenger brands, and Happiest Minds have a really positive, fresh, ethical approach to people and customers that really resonates with us at Io-Tahoe. >>Great, thank you for that. So, Suresh, let's get into the whole notion of digital resilience. I want to sort of set it up with what I see, and maybe you can comment. Prior to the pandemic,
a lot of customers kind of equated disaster recovery with their business continuity or business resilience strategy, and that's changed almost overnight. How have you seen your clients respond to that, what I sometimes call the forced march to become a digital business? And maybe you could talk about some of the challenges they faced along the way. >>Absolutely. So especially during these pandemic times, Dave, customers have been having a tough time managing their business. Happiest Minds, being a digitally resilient company, was able to react much faster than other services companies in the industry. One of the key things is that organizations are trying to adopt digital technologies: there has been a lot of data which has to be managed by these customers, and there have been a lot of threats and risks which have to be managed by the CIOs and CISOs. So with Happiest Minds' digital resilience framework, where we bring in data compliance as a service, we were able to manage resilience well ahead of other competitors in the market. We were able to bring in our business continuity processes from day one, and we were able to deliver our services without any interruption to what we were delivering to our customers. >>So that is where digital resilience, with business continuity processes enabled, was very helpful for us in enabling our customers to continue their business without any interruptions during the pandemic. >>So some of the challenges customers tell me about: they obviously had to figure out how to get laptops to remote workers, make that whole remote work-from-home pivot, and figure out how to secure the endpoints.
And, you know, looking back, those were kind of table stakes. But a digital business means a data business, putting data at the core, I like to say. So I wonder if you could talk a little bit more about the philosophy you have toward digital resilience and the specific approach you take with clients? >>Absolutely. You see, in any organization data becomes the key, and so the first step is to identify the critical data. This is a six-step process we follow at Happiest Minds. First of all, we take stock of the current state: though customers think they have clear visibility of their data, we do an assessment from an external point of view to see how critical their data is. Then we help the customers strategize. The most important thing is to identify the most critical assets, data being the most critical asset for any organization, so identification of the data is key for customers. Third, we help build a viable operating model to ensure these identified critical assets are secured and monitored daily, so that they are consumed well as well as protected from external threats. Then, as a fourth step, we bring in awareness: we train people at all levels in the organization so that they understand the importance of these digital assets. As a fifth step, we work on a backup plan, bringing in a very comprehensive and holistic testing approach across people, process and technology, to see how the organization can withstand during a crisis. And finally, we do continuous governance of this data, which is key. It is not just a one-step process. We set up the environment.
We do the initial analysis, set up the strategy, and continuously govern the data to ensure that it is not only managed well and secured, but also meets the compliance requirements of the organization. That is where we help organizations secure their data and meet their regulatory obligations, as per the privacy laws. >>So this is a constant process; it's not a one-time effort. It has to be constant because every organization is on a digital journey, and they have to face all of this as part of that evolving environment. That's where they should be kept ready in terms of recovering, rebounding and moving forward if things go wrong. >>So let's stick on that for a minute, and then I want to bring Yusef into the conversation. You mentioned compliance and governance. When you're a digital business, as you say, you're a data business, and that brings up issues: data sovereignty, governance, compliance, things like the right to be forgotten, data privacy, so many things. These were often kind of afterthoughts for businesses, bolted on, if you will. I know a lot of executives are very much concerned that these are built in, and it's not a one-shot deal. So do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics there. >>So we have offered multiple services to our customers around digital resilience, and one of the key services is data compliance as a service. Here we help organizations map their key data against data compliance requirements. Some of the features include continuous discovery of data, because organizations keep adding data as they become more digital, and helping understand the actual data in terms of its residency. It could be heterogeneous data sources:
it could be in databases or in data lakes, and it could be on premise or in the cloud. So identifying the data across these various heterogeneous environments is a very key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevailing laws have to be mapped based on the business rules. So we define those rules and help map the data, so that organizations know how critical their digital assets are. Then we do continuous monitoring of data for anomalies, because that's one of the key features of the solution, and it needs to run on a day-to-day operational basis. So we help monitor those data anomalies for data quality management on an ongoing basis. And finally, we also bring in automated data governance, where we can manage the sensitive-data policies and data relationships, map and manage the business rules, and drive recommendations that suggest appropriate actions for customers to take on those specific data sets. >>Great, thank you. Yusef, thanks for being patient. I want to bring Io-Tahoe into the discussion and understand where your customers and Happiest Minds can leverage your data automation capability that you and I have talked about in the past. It would be great if you had an example as well, but maybe you could pick it up from there. >>Sure. I mean, at a high level, as Suresh clearly articulated, Io-Tahoe delivers business agility. That's by accelerating the time to operationalize data, automating, putting in place controls, and ultimately helping put in place digital resilience. If we step back a little bit in time, traditional resilience in relation to data was often achieved manually, by making multiple copies of the same data. So you'd have a DBA; they would copy the data to various different places, and business
users would access it in those functional silos. And of course, what happened was you ended up with lots of different copies of the same data around the enterprise: very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach. It's very hard to know where everything is. And I liked that expression you used, David, the idea of the forced march to digital. With enterprises that are going on this forced march, what they're finding is they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and with containers, that enables a big leap forward: you can break applications down into microservices, updates are available via APIs, and so you don't have the same need to build and manage multiple copies of the data. You have an opportunity to just have a single version of the truth. Then your challenge is how you deal with these large legacy data estates that Suresh has been referring to, where you have to consolidate, and that's really where Io-Tahoe comes in. We massively accelerate the process of putting a single version of the truth in place. By automatically discovering the data, discovering what's duplicate and what's redundant, you can consolidate it down to a single trusted version much more quickly. We've seen many customers who have tried to do this manually, and it's literally taken years, using manual methods, to cover even a small percentage of their IT estates. With Io-Tahoe, you can do it very quickly, and you can have tangible results within weeks and months. And then you can apply controls to the data based on context: who's the user, what's the content, what's the use case? Things like data quality validations or access permissions. And once you've done that, your applications and your enterprise are much more secure, much more resilient.
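To make the duplicate-discovery idea concrete, here is a minimal illustrative sketch of one way candidate duplicate columns can be surfaced, by fingerprinting each column's distinct values. This is a toy stand-in, not Io-Tahoe's actual method or API; the discovery described in this conversation is ML-driven, and all names below are hypothetical.

```python
# Toy illustration only: surface candidate duplicate columns across tables
# by fingerprinting each column's distinct values.
import hashlib

def column_fingerprint(values):
    """Order-insensitive fingerprint of a column's distinct values."""
    h = hashlib.sha256()
    for v in sorted({str(v) for v in values}):
        h.update(v.encode())
        h.update(b"\x00")  # separator so adjacent values can't run together
    return h.hexdigest()

def find_duplicate_columns(tables):
    """tables: {table: {column: [values]}} -> groups of matching columns."""
    by_print = {}
    for tname, cols in tables.items():
        for cname, values in cols.items():
            key = column_fingerprint(values)
            by_print.setdefault(key, []).append((tname, cname))
    return [group for group in by_print.values() if len(group) > 1]

tables = {
    "crm_customers": {"cust_id": [1, 2, 3], "country": ["UK", "US", "UK"]},
    "billing": {"customer_ref": [3, 1, 2], "amount": [10, 20, 30]},
}
dupes = find_duplicate_columns(tables)
# dupes -> [[('crm_customers', 'cust_id'), ('billing', 'customer_ref')]]
```

A real system would refine candidate matches like these with value distributions, name similarity and sampled comparisons before flagging a column as redundant.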
You've got to do these things whilst retaining agility, though. So, coming full circle, this is where the partnership with Happiest Minds really comes in as well. You've got to be agile, you've got to have controls, and you've got to drive towards the business outcomes, and it's doing those three things together that really delivers for the customer. >>Thank you, Yusef. I mean, you and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved; we know that doesn't scale, but there's also that compression of time to get to the next step and ultimately to the outcome. We've talked to a number of customers on the Cube, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver, by reducing complexity, automating, and getting to insights faster. That's where you see telephone numbers in terms of business impact. So my question is, where should customers start? How can they take advantage of some of these opportunities we've discussed today? >>Well, we've tried to make that easy for customers. With Io-Tahoe and Happiest Minds, you can very quickly do what we call a data health check. This is a two-to-three-week process to really quickly start to understand and deliver value from your data. Io-Tahoe deploys into the customer environment; the data doesn't go anywhere. We would look at a few data sources and a sample of data, and we can very rapidly demonstrate how data discovery, data cataloging, and understanding duplicate and redundant data can be done using machine learning, and how those problems can be solved. And so what we tend to find is that we can very quickly, as I say, in a matter of a few weeks, show a customer how they can get to a more resilient outcome.
Then they can scale that up, take it into production, and really understand their data estate better and build resilience into the enterprise. >>Excellent. There you have it. We'll leave it right there, guys. Great conversation. Thanks so much for coming on the program, and best of luck in the partnership. Be well. >>Thank you, David. >>Thank you. >>Thank you for watching, everybody. This is Dave Vellante for the Cube and our ongoing series on data automation with Io-Tahoe.
Noah Fields and Sabita Davis | Io-Tahoe
>>From around the globe, it's the Cube, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io-Tahoe. Okay, now we're going to go into the demo, and we want to get a better understanding of how you can leverage OpenShift and Io-Tahoe to facilitate faster application deployment. Let me pass the mic to Sabita. Take it away. >>Thanks, Dave. Happy to be here again. Guys, as Dave mentioned, my name is Sabita Davis. I'm the enterprise account executive here at Io-Tahoe. So today we just wanted to give you guys a general overview of how we're using OpenShift. >>Yeah, hey, I'm Noah, Io-Tahoe's data operations engineer working with OpenShift. I've been learning the ins and outs of OpenShift for the past few months, and I'm here to share what I've learned. >>Okay. So before we begin, I'm sure everybody wants to know: Noah, what are the benefits of using OpenShift? >>Well, there are five that I can think of: faster time to operations, simplicity, automation, control, and digital resilience. >>Okay. So that's really interesting, because those are the exact same benefits that we at Io-Tahoe deliver to our customers. But let's start with faster time to operation. By running Io-Tahoe on OpenShift, is it faster than, let's say, using Kubernetes and other platforms? >>Well, our objective at Io-Tahoe is to be accessible across multiple cloud platforms, right? And by hosting our application in containers, we're able to achieve this. So to answer your question, it's faster to create end-user application images using container tooling like Kubernetes with OpenShift, as compared to Kubernetes with Docker, CRI-O, or containerd. >>Okay. So we got a bit technical there. Can you explain that in a bit more detail? >>Yeah, there's a bit of vocabulary involved. So basically, containers are used in developing things like databases, web servers, or applications such as Io-Tahoe.
What's great about containers is that they split the workload: developers can select their libraries without breaking anything, and sysadmins can update the host without interrupting the programmers. Now, OpenShift works hand in hand with Kubernetes to provide a way to build those containers for applications. >>Okay, got it. So basically, containers make life easier for developers and system admins. So how does OpenShift differ from other platforms? >>Well, this kind of leads into the second benefit I want to talk about, which is simplicity. Basically, there are a lot of steps involved when you're using Kubernetes with Docker, but OpenShift simplifies this with its source-to-image process, which takes source code and turns it into a container image. But that's not all: OpenShift has a lot of automation and features that simplify working with containers, an important one being its web console. So here I've set up a light version of OpenShift, CodeReady Containers, and I was able to set up our application right from the web console. And I was able to set this entire thing up on Windows, Mac, and Linux, so it's environment-agnostic in that sense. >>Okay. So I see in the top left that this is a developer's view. What would a systems admin view look like? >>That's a good question. So here's the administrator view, and this kind of ties into the benefit of control. This view gives insights into each one of the applications and containers that are running, and you can make changes without affecting deployment. You can also, within this view, set up each layer of security, and there are multiple layers you can put up, but I haven't fully messed around with it, because with my luck I'd probably lock myself out. >>Okay, so that seems pretty secure. Is there a single point of security, such as a user login, or are there multiple layers of security? >>Yeah, there are multiple layers of security.
There's your user login, security groups, and general role-based access controls, but there's also a whole set of security layers surrounding the containers themselves. For the sake of time, I won't get too far into it. >>Okay. So you mentioned simplicity and time to operation as being two of the benefits. You also briefly mentioned automation, and as you know, automation is the backbone of our platform here at Io-Tahoe, so that certainly grabbed my attention. Can you go a bit more in depth in terms of automation? >>Yeah, sure. I'd say automation is an important benefit. OpenShift provides extensive automation that speeds up that time to operation. The latest versions of OpenShift come with a built-in CRI-O container engine, which basically means you get to skip the container engine installation step, and you don't have to log into each individual container host and configure networking, registry servers, storage, et cetera. So I'd say it automates the more boring, tedious processes. >>Okay. So I see the Io-Tahoe template there. What does it allow me to do? >>In terms of automation in application development, we've created an OpenShift template which contains our application. This allows developers to instantly set up our product from that template. >>Okay. So, Noah, last question. Speaking of vocabulary, you mentioned digital resilience earlier, a term we're hearing especially in the banking and finance world. It seems from what you've described that industries like banking and finance would be more resilient using OpenShift, correct? >>Yeah. In terms of digital resilience, OpenShift gives you better control over the consumption of resources each container is using. In addition, the benefit of containers is that, like I mentioned earlier, sysadmins can troubleshoot the servers without bringing down the application.
And if the application does go down, it's easy to bring it back up using the templates and the other automation features that OpenShift provides. >>Okay, thanks so much. Any final thoughts you want to share? >>Yeah, just a quick recap of the five benefits you gain by using OpenShift: the five are faster time to operations, simplicity, automation, control, and digital resilience. You can deploy applications faster, you can simplify the workload, you can automate a lot of the otherwise tedious processes, you can maintain full control over your workflow, and you can assert digital resilience within your environment. >>So guys, thanks for that. Appreciate the demo. I wonder, you guys have been talking about the combination of Io-Tahoe and Red Hat. Can you tie that in, Sabita, to digital resilience specifically? >>Yeah, sure, Dave. So let me speak to the benefits of security controls in terms of digital resilience at Io-Tahoe. We've automated detection and applied controls at the data level, so this provides for more enhanced security. >>Okay. But if you were to try to do all these things manually, what does that do? How much time can I compress? What's the time to value? >>So with the latest versions of Io-Tahoe, we're taking advantage of the faster deployment times associated with containerization and Kubernetes. This speeds up the time it takes for customers to start using our software, as they're able to quickly spin up Io-Tahoe in their own on-premise environment or in their own cloud environment, including AWS, Azure, Oracle, GCP, and IBM Cloud. Our quick-start templates allow the flexibility to deploy into multicloud environments with just a few clicks. >>Okay.
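As an aside, the OpenShift template and per-container resource controls described above might look roughly like the following sketch. All names, images, parameters, and resource values here are hypothetical, not Io-Tahoe's actual quick-start template.

```yaml
# Illustrative sketch of an OpenShift template of the kind discussed above.
apiVersion: template.openshift.io/v1
kind: Template
metadata:
  name: example-app-quickstart
parameters:
  - name: APP_NAME
    value: example-app
objects:
  - apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: ${APP_NAME}
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: ${APP_NAME}
      template:
        metadata:
          labels:
            app: ${APP_NAME}
        spec:
          containers:
            - name: ${APP_NAME}
              image: quay.io/example/example-app:latest
              resources:
                requests:       # per-container resource consumption controls,
                  cpu: 250m     # part of the resilience story above
                  memory: 512Mi
                limits:
                  cpu: "1"
                  memory: 1Gi
```

A developer could instantiate a template like this with something along the lines of `oc process -f template.yaml -p APP_NAME=my-app | oc apply -f -`, or select it from the web console. The source-to-image flow mentioned earlier is similarly driven by commands such as `oc new-app python~https://example.com/org/repo.git`, which builds an image straight from source.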
>>So now I'll just quickly add: what we've done at Io-Tahoe is really move our customers away from the whole idea of needing a team of engineers to apply controls to data, as compared to other manually driven workflows. With templates, automation, pre-built policies, and data controls, one person can be fully operational within a few hours and achieve results straight out of the box, on any cloud. >>Yeah, we've been talking about this theme of abstracting the complexity. That's really what we're seeing as a major trend in this coming decade. Okay, great. Thanks, Sabita, Noah. How can people get more information, or if they have any follow-up questions, where should they go? >>Yeah, sure, Dave. If you guys are interested in learning more, reach out to us at info@iotahoe.com to speak with one of our sales engineers. We'd love to hear from you, so book a meeting as soon as you can. >>All right, thanks guys. Keep it right there for more Cube content with Io-Tahoe.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Aja Tahoe | ORGANIZATION | 0.99+ |
Dave | PERSON | 0.99+ |
IO Tahoe | ORGANIZATION | 0.99+ |
Sabita Davis | PERSON | 0.99+ |
two | QUANTITY | 0.99+ |
five | QUANTITY | 0.99+ |
five benefits | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
OpenShift | TITLE | 0.99+ |
each layer | QUANTITY | 0.99+ |
Kubernetes | TITLE | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
each container | QUANTITY | 0.99+ |
today | DATE | 0.99+ |
second benefit | QUANTITY | 0.98+ |
Noah | PERSON | 0.98+ |
one | QUANTITY | 0.98+ |
Savita Noah | PERSON | 0.98+ |
Linux | TITLE | 0.98+ |
Noah Fields | PERSON | 0.98+ |
windows | TITLE | 0.97+ |
Oracle | ORGANIZATION | 0.97+ |
single point | QUANTITY | 0.97+ |
one person | QUANTITY | 0.96+ |
Iowa | LOCATION | 0.96+ |
Tahoe | ORGANIZATION | 0.96+ |
dot com | ORGANIZATION | 0.96+ |
Io-Tahoe | PERSON | 0.96+ |
CIS | ORGANIZATION | 0.95+ |
Savita | PERSON | 0.94+ |
iota | TITLE | 0.94+ |
IO | ORGANIZATION | 0.93+ |
each one | QUANTITY | 0.91+ |
Davis | PERSON | 0.89+ |
each individual container | QUANTITY | 0.8+ |
Docker cryo | TITLE | 0.79+ |
this coming decade | DATE | 0.78+ |
open | TITLE | 0.76+ |
OpenShift | ORGANIZATION | 0.73+ |
Tahoe | TITLE | 0.72+ |
GCP | TITLE | 0.67+ |
past few months | DATE | 0.64+ |
Mac | COMMERCIAL_ITEM | 0.59+ |
Tahoe | PERSON | 0.58+ |
container | TITLE | 0.58+ |
ton | QUANTITY | 0.56+ |
red | ORGANIZATION | 0.51+ |
benefits | QUANTITY | 0.43+ |
Sabita | COMMERCIAL_ITEM | 0.36+ |
Sabita Davis and Patrick Zeimet | Io-Tahoe Adaptive Data Governance
>>From around the globe, it's theCUBE, presenting adaptive data governance, brought to you by Io-Tahoe. >>In this next segment, we're gonna be talking to you about getting to know your data. And specifically, you're gonna hear from two folks at Io-Tahoe. We've got enterprise account exec Sabita Davis here, as well as enterprise data engineer Patrick Zeimet. They're gonna be sharing insights and tips and tricks for how you can get to know your data, and quickly. We also want to encourage you to engage with Sabita and Patrick: use the chat feature to the right, send comments, questions or feedback so you can participate. All right, Patrick, Sabita, take it away. >>Thanks, Lisa. Great to be here. As Lisa mentioned, guys, I'm the enterprise account executive here at Io-Tahoe. And you, Pat? >>Yeah. Hey, everyone, so great to be here. As said, my name's Patrick Zeimet. I'm the enterprise data engineer here at Io-Tahoe. And we're so excited to be here and talk about this topic, as one thing we're really trying to perpetuate is that data is everyone's business. >>I couldn't agree more, Pat. So, guys, Pat and I have actually had multiple discussions with clients from different organizations with different roles. So we spoke with both your technical and your non-technical audiences. And while they were interested in different aspects of our platform, we found that what they had in common was they wanted to make data easy to understand and usable. So that comes back to Pat's point of data being everybody's business, because no matter your role, we're all dependent on data. So what Pat and I wanted to do today was walk you guys through some of those client questions slash pain points that we're hearing from different industries and different roles, and demo how our platform here at Io-Tahoe is used for automating those data-related tasks. So with that said, are you ready for the first one, Pat? >>Yeah, let's do it. >>Great.
So I'm gonna put my technical hat on for this one. So: I'm a data practitioner. I just started my job at ABC Bank. I have over 100 different data sources, with data kept in data lakes, legacy data sources, even the cloud. My issue is I don't know what those data sources hold, I don't know what data is sensitive, and I don't even understand how that data is connected. So how can Io-Tahoe help? >>Yeah, I think that's a very common experience many are facing, and definitely something I've encountered in my past. Typically, the first step is to catalog the data and then start mapping the relationships between your various data stores. Now, more often than not, this is tackled through numerous meetings and a combination of Excel and something similar to Visio, which are two great tools in their own right, but they're very difficult to maintain, just due to the rate at which we are creating data in the modern world. It starts to beg for a solution that can scale with your business needs. And this is where a platform like Io-Tahoe becomes so appealing. You can see here a visualization of the data relationships created by the Io-Tahoe service. Now, what is fantastic about this is that it's not only laid out in a very human and digestible format; in the same action of creating this view, the data catalog was constructed. >>So is the data catalog automatically populated? >>Correct. >>Okay, so what I'm getting with Io-Tahoe is this complete, unified, automated platform, without the added cost, of course. >>Exactly. And that's at the heart of Io-Tahoe. A great feature with that data catalog is that Io-Tahoe will also profile your data as it creates the catalog, assigning some meaning to those pesky column_1s and custom_variable_10s that are always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exist.
Personally, I'm a big fan of this view, as it really just helps the eye naturally jump to the focal points that coincide with the key columns. Following that train of thought, let's examine the customer ID column that seems to be at the center of a lot of these relationships. We can see that it's a fairly important column, as it's maintaining the relationship between at least three other tables. Now, you'll notice all the connectors are in this blue color. This means that they're system-defined relationships. But Io-Tahoe goes that extra mile and actually creates these orange-colored connectors as well. These are ones that our machine learning algorithms have predicted to be relationships, and you can leverage them to try and make new and powerful relationships within your data. So I hope that answers the first part of your question. >>So this is really cool, and I can see how this could be leveraged quickly. Now, what if I added new data sources, or have multiple data sources, and needed to identify what data is sensitive? Can Io-Tahoe detect that? >>Yeah, definitely. Within the Io-Tahoe platform there are already over 300 predefined policies, such as HIPAA, FERPA, CCPA and the like. One can choose which of these policies to run against their data, allowing for flexibility and efficiency in running the policies that affect your organization. >>Okay, so 300 is an exceptional number, I'll give you that. But what about internal policies that apply to my organization? Is there any ability for me to write custom policies? >>Yeah, that's no issue, and it's something that clients leverage fairly often. To utilize this function, one simply has to write a regex, something our team has helped many deploy. After that, the custom policy is stored for future use. To profile sensitive data, one then selects the data sources they're interested in and selects the policies that meet your particular needs.
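As a rough illustration of the regex-driven policy idea Pat describes, a custom policy can be just a named pattern applied to sampled column values. This is a sketch only; the two patterns, their names and the 0.9 threshold are assumptions for illustration, not Io-Tahoe's actual implementation.

```python
import re

# Hypothetical policy definitions: the platform ships hundreds of predefined
# policies; these two patterns and the threshold are illustrative only.
POLICIES = {
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "us_zip": re.compile(r"^\d{5}(-\d{4})?$"),
}

def profile_column(values, policies=POLICIES, threshold=0.9):
    """Tag a column with every policy whose pattern matches at least
    `threshold` of its non-empty values; return (tag, match_rate) pairs."""
    sample = [v for v in values if v]
    tags = []
    for name, pattern in policies.items():
        rate = sum(1 for v in sample if pattern.match(v)) / len(sample)
        if rate >= threshold:
            tags.append((name, rate))
    return tags

# One malformed entry: the column is still tagged as SSN, and the sub-100%
# match rate flags a likely data quality issue worth reviewing.
col = ["123-45-6789", "987-65-4321", "123456789", "555-12-3456",
       "222-33-4444", "111-22-3333", "444-55-6666", "777-88-9999",
       "888-99-0000", "000-11-2222"]
print(profile_column(col))  # [('us_ssn', 0.9)]
```

Adding an internal policy is then just another entry in the dictionary, which is roughly the "write a regex once, reuse it" workflow described above.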
The interface will automatically tag your data according to the policies it detects, after which you can review the discoveries, confirming or rejecting the tagging. All of these insights are easily exported through the interface, so one can work these into the action items within your project management systems. And I think this lends itself to collaboration, as a team can work through the discoveries simultaneously, and as each item is confirmed or rejected, they can see it near instantaneously. All this translates to a confidence that, with Io-Tahoe, you can be sure you're in compliance. >>I'm glad you mentioned compliance, because that's extremely important to my organization. So what you're saying is, when I use the Io-Tahoe automated platform, we'd be 90% more compliant than if we were doing this manually? >>Yeah, definitely. The collaboration and documentation that the Io-Tahoe interface lends itself to can really help you build that confidence that your compliance is sound. Does that answer your question about sensitive data? >>Definitely. So, Pat, I have the next question for you. We're planning a migration, and I have a set of reports I need to migrate. But what I need to know is, well, what data sources those reports are dependent on, and what's feeding those tables? >>Yeah, that's a fantastic question. Sabita, identifying critical data elements and the interdependencies within the various databases can be a time-consuming but vital process in a migration initiative. Luckily, Io-Tahoe does have an answer, and again, it's presented in a very visual format. >>So what I'm looking at here is my entire data landscape? >>Yes, exactly. >>So let's say I add another data source. Can I still see that unified 360 view? >>Yeah. One feature that is particularly helpful is the ability to add data sources after the data lineage
discovery has finished, allowing for the flexibility and scope necessary for any data migration project. Whether you only need to select a few databases or your entire estate, this service will provide the answers you're looking for. This visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections are driven both by system-defined flows and by those predicted by our algorithms, the confidence of which can actually be customized to make sure they're meeting the needs of the initiative you have in place. Now, this also provides tabular output in case you need it for your own internal documentation or for your action items, which we can see right here. In this interface, you can also confirm or deny the predicted pairings to make sure that the data is as accurate as possible. Does that help with your data lineage needs? >>Definitely. So, Pat, my next big question here is: now I know a little bit about my data, but how do I know I can trust it? What I'm really interested in knowing is, is it in a fit state for me to use? Is it accurate? Does it conform to the right format? >>Yeah, that's a great question, and I think that is a pain point felt across the board, be it by data practitioners or data consumers alike. Another service that Io-Tahoe provides is the ability to write custom data quality rules and understand how well the data conforms to those rules. This dashboard gives a unified view of the strength of these rules and your data's overall quality. >>Okay, so Pat, on the accuracy scores there: if my marketing team needs to run a campaign, can they depend on those accuracy scores to know which tables have quality data to use for our marketing campaign? >>Yeah, this view would allow you to understand your overall accuracy as well as dive into the minutiae to see which data elements are of the highest quality.
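The accuracy scores Pat mentions can be thought of as per-rule pass rates over a table. Here is a minimal sketch, assuming rules are simple row predicates; the rule names, the loan data and the predicates are invented for illustration and are not Io-Tahoe's actual rule engine.

```python
def accuracy_scores(rows, rules):
    """Per-rule pass rates across a table: `rows` are dicts and `rules`
    maps a rule name to a predicate over one row."""
    return {name: sum(1 for r in rows if rule(r)) / len(rows)
            for name, rule in rules.items()}

loans = [
    {"start_date": "2020-01-01", "end_date": "2020-06-01"},
    {"start_date": "2020-03-15", "end_date": "2020-02-01"},  # dates reversed
    {"start_date": "2020-05-01", "end_date": "2021-05-01"},
    {"start_date": "2020-07-01", "end_date": "2020-08-01"},
]
rules = {
    # ISO-8601 date strings compare correctly as plain strings.
    "loan_dates_ordered": lambda r: r["start_date"] < r["end_date"],
    "start_date_present": lambda r: bool(r["start_date"]),
}
print(accuracy_scores(loans, rules))
# {'loan_dates_ordered': 0.75, 'start_date_present': 1.0}
```

A dashboard like the one described would aggregate these per-rule rates, so a marketing team can check the handful of columns its campaign depends on rather than the whole table.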
So for that marketing campaign, if you need everything in a strong form, you'll be able to see that very quickly with these high-level numbers. But if you're only dependent on a few columns to get that information out the door, you can find that within this view. So you no longer have to rely on reports about reports, but instead just come to this one platform to help drive conversations between stakeholders and data practitioners. I hope that helps answer your questions about data quality. >>Oh, definitely. So I have another one for you here, Pat. I get now the value that Io-Tahoe brings by automatically capturing all that technical metadata from sources. But how do we match that with the business glossary? >>Yeah, within the same data quality service that we just reviewed, one can actually add business rules detailing the definitions and the business domains that these fall into. What's more is that the data quality rules we were just looking at can then be tied into these definitions, allowing insight into the strength of these business rules. It is this service that empowers stakeholders across the business to be involved with the data lifecycle and take ownership over the rules that fall within their domain. >>Okay, so those custom rules, can I apply them across data sources? >>Yeah, you can bring in as many data sources as you need, so long as you can tie them to that unified definition. >>Okay, great. Thanks so much, Pat. And we just want to quickly say to everyone working in data: we understand your pain, so please feel free to reach out to us. We're on our website and in the chat, so let's get a conversation started on how Io-Tahoe can help you guys automate all those manual tasks to help save you time and money. Thank you. >>Thank you, Erin. >>If I could ask you one quick question: you just walked us through this great banking example.
How do you advise customers to get started? >>Yeah, I think the number one thing customers can do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as these quality rules, as well as identifying those kinds of tricky columns that might exist in your data, those custom_variable_10s I mentioned before. >>Last question, Sabita: anything to add to what Pat just described as a starting place? >>Um, no, I think Pat actually summed that up pretty well. I mean, just by automating all those manual tasks, it definitely can save your company a lot of time and money, so we encourage you to just reach out to us and let's get that conversation started. >>Excellent. Sabita and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data and make sure that it's quality, so that you can maximize the value of it. Thanks for watching.
IO TAHOE EPISODE 4 DATA GOVERNANCE V2
>>From around the globe, it's theCUBE, presenting adaptive data governance, brought to you by Io-Tahoe. >>And we're back with the data automation series. In this episode, we're gonna learn more about what Io-Tahoe is doing in the field of adaptive data governance, how it can help achieve business outcomes and mitigate data security risks. I'm Lisa Martin, and I'm joined by Ajay Vohora, the CEO of Io-Tahoe, and Lester Waters, the CTO of Io-Tahoe. Gentlemen, it's great to have you on the program. >>Thank you, Lisa. It's good to be back. >>Great to see you. >>Likewise, very socially distant, of course, as we are. Lester, we're gonna start with you. What's going on at Io-Tahoe? What's new? >>Well, I've been with Io-Tahoe for a little over a year, and one thing I've learned is every customer's needs are just a bit different. So we've been working on our next major release of the Io-Tahoe product to really try to address these customer concerns, because, you know, we wanna be flexible enough to come in and not just profile the data, and not just understand data quality and lineage, but also to address the unique needs of each and every customer that we have. And so that required a platform rewrite of our product, so that we could extend the product without building a new version of it. We wanted to be able to have pluggable modules. We also focused a lot on performance. That's very important with the bulk of data that we deal with: that we're able to pass through that data in a single pass and do the analytics that are needed, whether it's lineage, data quality or just identifying the underlying data. And we're incorporating all that we've learned. We're tuning up our machine learning, we're analyzing on more dimensions than we've ever done before, and we're able to do data quality without needing an initial regex, for example, just out of the box.
So I think it's all of these things coming together to form the next version of our product. We're really excited by it. >>That's exciting. Ajay, from the CEO's level, what's going on? >>Wow, just building on that, as Lester mentioned, we're growing pretty quickly with our partners. And today, here with Oracle, we're excited to explain how that's shaping up: lots of collaboration already with Oracle in government, in insurance and in banking. And we're excited because we get to have an impact. It's really satisfying to see how we're able to help businesses transform and redefine what's possible with their data. And having Oracle there as a partner to lean in with is definitely helping. >>Excellent. We're gonna dig into that a little bit later. Lester, let's go back over to you. Explain adaptive data governance. Help us understand that. >>Really, adaptive data governance is about achieving business outcomes through automation. It's also about establishing a data-driven culture and pushing what's traditionally managed in IT out to the business. And to do that, you've got to enable an environment where people can actually access and look at the information about the data, not necessarily access the underlying data itself, because we've got privacy concerns, but understand what kind of data they have, what shape it's in and what's dependent on it upstream and downstream, so that they can make educated decisions on what they need to do to achieve those business outcomes. A lot of frameworks these days are hardwired, so you can set up a set of business rules, and that set of business rules works for a very specific database and a specific schema.
But imagine a world where you could just say, you know, the start date of a loan must always be before the end date of a loan, and have that generic rule, regardless of the underlying database, apply even when a new database comes online. That's what adaptive data governance is about. I like to think of it as the intersection of three circles, really: the technical metadata coming together with policies and rules, coming together with the business ontologies that are unique to that particular business. Bringing all of this together allows you to enable rapid change in your environment. So it's a mouthful, adaptive data governance, but that's what it kind of comes down to. >>So, Ajay, help me understand this. Is this what enterprise companies are doing now, or are they not quite there yet? >>Well, you know, Lisa, I think every organization is going at its own pace. But markets are changing, and the speed at which some of the changes in the economy are happening is compelling more businesses to look at being more digital in how they serve their own customers. So what we're seeing is a number of trends here, from heads of data, chief data officers, CDOs, stepping back from a one-size-fits-all approach, because they've tried that before and it just hasn't worked. They've spent millions of dollars on IT programs trying to drive value from that data, and they've ended up with large teams manually processing data to try and hardwire these policies to fit the context of each line of business, and that hasn't worked. So the trends that we're seeing emerge really relate to: how do I, as a chief data officer or CDO, inject more automation into a lot of these common tasks? And, you know, we've been able to see that impact. I think the news here is, you know, if you're trying to create a knowledge graph, a data catalog or a business glossary,
and you're trying to do that manually, well, stop. You don't have to do that manually anymore. I think the best example I can give is this: Lester and I, we like Chinese food and Japanese food, and if you were sitting there with your chopsticks, you wouldn't eat the bowl of rice with the chopsticks one grain at a time. What you'd want to do is find a more productive way to enjoy that meal before it gets cold. And that's similar to how we're able to help organizations digest their data: to get through it faster and enjoy the benefits of putting that data to work. >>And if it was me eating that food with you guys, I would not be using chopsticks; I would be using a fork and probably a spoon. So, Lester, how then does Io-Tahoe go about doing this and enabling customers to achieve it? >>Let me show you a little story I have here. If you take a look at the challenges most customers have, they're very similar, but every customer is on a different data journey. It all starts with: what data do I have? What shape is that data in? How is it structured? What's dependent on it upstream and downstream? What insights can I derive from that data? And how can I answer all of those questions automatically? So if you look at the challenges for these data professionals, you know, they're either on a journey to the cloud, maybe they're doing a migration to Oracle, maybe they're doing some data governance changes, and it's about enabling this. So with these challenges in mind, I'm gonna take you through a story here. I want to introduce Amanda. Amanda is not unlike anyone in any large organization. She's looking around and she just sees stacks of data: different databases, the ones she knows about, the ones she doesn't know about, the ones she should know about, various different kinds of databases. And Amanda is just tasked with understanding all of this so that she can embark on her data journey program.
So Amanda goes through and she says, great, I've got some handy tools. I can start looking at these databases and get an idea of what we've got. Well, as she digs into the databases, she starts to see that not everything is as clear as she might have hoped. You know, table names or column names have ambiguous names like attribute_1 and attribute_2, or maybe date_1 and date_2. So Amanda is starting to struggle, even though she's got tools to visualize and look at these databases. She knows she's got a long road ahead, and with 2,000 databases in her large enterprise, yes, it's gonna be a long journey. But Amanda's smart. So she pulls out her trusty spreadsheet to track all of her findings, and for what she doesn't know about, she raises a ticket or maybe tries to track down the owner to find out what the data means, and she tracks all this information. Clearly, this doesn't scale that well for Amanda, you know? So maybe the organization will get 10 Amandas to sort of divide and conquer that work. But even that doesn't work that well, because there are still ambiguities in the data. With Io-Tahoe, what we do is we actually profile the underlying data. By looking at the underlying data, we can quickly see that attribute_1 looks very much like a U.S. Social Security number and attribute_2 looks like an ICD-10 medical code. And we do this by using ontologies and dictionaries and algorithms to help identify the underlying data and then tag it. Key to doing this automation is being able to normalize things across different databases, so that where there are differences in column names, I know that, in fact, they contain the same data. And by going through this exercise with Io-Tahoe, not only can we identify the data, but we can also gain insights about the data.
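The dictionary-and-ontology side of that tagging could be sketched like this. It is a toy illustration under stated assumptions: the vocabulary is a six-code stand-in for the roughly 70,000 real ICD-10 codes, the 0.9 threshold is invented, and the column names are hypothetical.

```python
# Tiny stand-in vocabulary; a production dictionary would be far larger.
ICD10_CODES = {"A00", "B20", "C34", "E11", "I10", "J45"}

def dictionary_tag(values, dictionaries, threshold=0.9):
    """Tag a column when enough of its values appear in a known vocabulary.
    Returns {tag: hit_rate} for every vocabulary that clears the threshold."""
    sample = [v for v in values if v]
    tags = {}
    for tag, vocab in dictionaries.items():
        rate = sum(1 for v in sample if v in vocab) / len(sample)
        if rate >= threshold:
            tags[tag] = rate
    return tags

# Two databases name the column differently, but the content matches the
# same vocabulary, so both columns normalize to the same semantic tag.
db1_attribute_2 = ["E11", "I10", "J45", "A00", "C34"]
db2_diag_cd = ["B20", "E11", "I10", "J45", "I10"]
dicts = {"icd10_code": ICD10_CODES}
print(dictionary_tag(db1_attribute_2, dicts))  # {'icd10_code': 1.0}
print(dictionary_tag(db2_diag_cd, dicts))      # {'icd10_code': 1.0}
```

The normalization payoff is exactly the point made above: once both columns carry the same tag, downstream tooling no longer cares that one is called attribute_2 and the other diag_cd.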
So, for example, we can see that 97% of the time, that column named attribute_1 that's got U.S. Social Security numbers has something that looks like a Social Security number, but 3% of the time it doesn't quite look right. Maybe there's a dash missing, maybe there's a digit dropped, or maybe there are even characters embedded in it. So that may be indicative of a data quality issue, and we try to find those kinds of things. Going a step further, we also try to identify data quality relationships. So, for example, we have two columns, date_1 and date_2. Through observation, we can see that date_1 is less than date_2 99% of the time; the 1% of the time it's not is probably indicative of a data quality issue. But going a step further, we can also build a business rule that says date_1 must be less than date_2, and so then when the issue pops up again, we can quickly identify and remediate that problem. So these are the kinds of things that we can do with Io-Tahoe. Going even a step further, you can take your favorite data science solution, productionize it, and incorporate it into our next version as what we call a worker process, to do your own bespoke analytics. >>Bespoke analytics, excellent. Lester, thank you. So, Ajay, talk us through some examples of where you're putting this to use, and also, what is some of the feedback from customers? >>I think it would help to bring this to life a little bit, Lisa, just to talk through a case study we pulled together. I know it's available for download, but in a well-known telecommunications and media company, they had a lot of the issues that Lester just spoke about: lots of teams of Amandas, super bright data practitioners, who were looking to get more productivity out of their day and deliver a good result for their own customers, for cell phone subscribers and broadband users.
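Lester's pattern of promoting an observed ordering (date_1 < date_2 holding for 99% of rows) into a suggested business rule can be sketched as follows. This is an illustrative toy, not the product's actual algorithm; the 0.95 threshold and the sample rows are assumptions.

```python
def suggest_order_rules(rows, column_pairs, threshold=0.95):
    """For each (a, b) pair, suggest an 'a < b' quality rule when the
    ordering holds for at least `threshold` of rows; the small residue is
    then treated as suspect data rather than as disproof of the rule."""
    suggestions = []
    for a, b in column_pairs:
        pairs = [(r[a], r[b]) for r in rows
                 if r.get(a) is not None and r.get(b) is not None]
        rate = sum(1 for x, y in pairs if x < y) / len(pairs)
        if rate >= threshold:
            suggestions.append((f"{a} < {b}", rate))
    return suggestions

# 19 of 20 rows satisfy date_1 < date_2, so the rule is suggested and the
# one violating row becomes a remediation candidate.
rows = [{"date_1": i, "date_2": i + 1} for i in range(19)]
rows.append({"date_1": 5, "date_2": 3})
print(suggest_order_rules(rows, [("date_1", "date_2")]))
# [('date_1 < date_2', 0.95)]
```

Once a suggestion like this is confirmed by a person, it behaves like any other hardwired rule: future violations are flagged immediately rather than rediscovered by profiling.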
So, you know, some of the examples that we can see here are how we went about auto-generating a lot of that understanding of that data within hours. So Amanda had her data catalog populated automatically, a business glossary built up on it, and could really then start to see: okay, where do I want to apply some policies to the data, to set in place some controls? Where do they want to adapt how different lines of business, maybe tax versus customer operations, have different access or permissions to that data? And what we've been able to do there is to build up that picture, to see how data moves across the entire organization, across the estate, and monitor that over time for improvement. So they have taken it from being reactive, let's do something to fix something, to now being more proactive: we can see what's happening with our data, who's using it, who's accessing it, how it's being used, how it's being combined. And from there, taking a proactive approach is a real smart use of the talents in that telco organization and of the folks that work there with data. >>Okay, Ajay, dig into that a little bit deeper. One of the things I was thinking when you were talking through some of those outcomes that you're helping customers achieve is ROI. How do customers measure ROI? What are they seeing with Io-Tahoe's solution? >>Yeah, right now the big-ticket item is time to value. And I think in data, a lot of the upfront investments can be quite expensive; they have been to date with a lot of the larger vendors and technologies. So what a CDO and an economic buyer really need to be certain of is: how quickly can I get that ROI? I think we've got something we can show, just pull up the before and after, and it really comes down to hours, days and weeks where we've been able to have that impact. And this playbook that we pulled together, the before-and-after picture, really shows
you know, those savings that come through getting data into some actionable form within hours and days, to drive agility, but at the same time being able to enforce the controls to protect the use of that data and who has access to it. So that's the number one thing I'd have to say: it's time. And we can see that in the graphic that we've just pulled up here. >>We talk about achieving adaptive data governance. Lester, you guys talk about automation, you talk about machine learning. How are you seeing those technologies being a facilitator of organizations adopting adaptive data governance? >>Well, as we see it, the days of manual effort are over. I think, you know, this is a multi-step process, but the very first step is understanding what you have and normalizing that across your data estate. So you couple this with the ontologies that are unique to your business, and with our algorithms you basically go across and identify and tag that data, and that allows the next steps to happen. So now I can write business rules not in terms of named columns, but in terms of the tags. Being able to automate that is a huge time saver. And the fact that we can suggest that as a rule, rather than waiting for a person to come along and say, oh, wow, okay, I need this rule, I need this: these are steps that increase that ROI, or, I should say, decrease that time to value that Ajay talked about. And then, lastly, you couple in machine learning, because even with great automation, and being able to profile all of your data and get a good understanding, that brings you to a certain point. But there are still ambiguities in the data. So, for example, I might have two columns, date_1 and date_2. I may have even observed that date_1 should be less than date_2, but I don't really know what date_1 and date_2 are, other than a date.
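Writing rules against tags rather than physical column names, as Lester describes, might look like this in a minimal sketch. The column names, tag names and the rule itself are hypothetical, chosen only to show the indirection.

```python
def apply_tag_rule(table, tag_map, rule):
    """Evaluate a tag-level rule against a table. `tag_map` maps physical
    column names to semantic tags; `rule` is (tag_a, tag_b, predicate)."""
    tag_a, tag_b, pred = rule
    by_tag = {tag: col for col, tag in tag_map.items()}
    col_a, col_b = by_tag[tag_a], by_tag[tag_b]
    return [pred(row[col_a], row[col_b]) for row in table]

# One rule, written once against tags...
rule = ("loan_start_date", "loan_end_date", lambda a, b: a < b)

# ...applies to two tables whose physical column names differ.
t1 = [{"date_1": "2019-01-01", "date_2": "2020-01-01"}]
t2 = [{"ln_open": "2021-04-01", "ln_close": "2021-03-01"}]
print(apply_tag_rule(t1, {"date_1": "loan_start_date",
                          "date_2": "loan_end_date"}, rule))   # [True]
print(apply_tag_rule(t2, {"ln_open": "loan_start_date",
                          "ln_close": "loan_end_date"}, rule))  # [False]
```

The point of the indirection is that a new database coming online only needs its columns tagged; every existing tag-level rule then applies to it with no rewriting, which is the "generic rule" idea from earlier in the conversation.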
So this is where the machine learning comes in, and I might ask the user, "Can you help me identify what date one and date two are in this table?" Turns out they're a start date and an end date for a loan. That gets remembered, cycled into the machine learning, so if I start to see this pattern of date one and date two elsewhere, I'm going to ask, "Is it a start date and an end date?" Bringing all these things together, with all this automation, is really what's key to enabling this adaptive data governance. >>Great, thanks, Lester. And A.J., I want to wrap things up with something that you mentioned in the beginning about what you guys are doing with Oracle. Take us out by telling us what you're doing there. How are you guys working together? >>Yeah, I think those of us who've worked in IT for many years have learned to trust Oracle's technology, and they're shifting now to a hybrid on-prem/cloud Generation 2 platform, which is exciting, and their existing customers and new customers are moving to Oracle on a journey. So Oracle came to us and said, you know, we can see how quickly you're able to help us change mindsets. And those mindsets are often locked into a way of thinking around operating models of IT that may be non-agile and siloed, and they want to break free of that and adopt a more agile, API-driven approach. A lot of the work that we're doing with Oracle now is around accelerating what customers can do with understanding their data, and building digital apps by identifying the underlying data that has value. Being able to do that in hours, days and weeks, rather than many months, is opening up the eyes of chief data officers and CIOs to say, well, maybe we can do this whole digital transformation this year; maybe we can bring that forward and transform who we are as a company. That's driving innovation, which we're excited about.
I know Oracle are keen to drive that through. >>Helping businesses transform digitally is so incredibly important in this time, as we look to things changing in 2021. A.J., Lester, thank you so much for joining me on this segment, explaining adaptive data governance, how organizations can use it and benefit from it, and achieve ROI. Thanks so much, guys. >>Thank you. Thanks again, Lisa. >>In a moment, we'll look at adaptive data governance in banking. This is the Cube, your global leader in high-tech coverage. >>Innovation, impact, influence. Welcome to the Cube. Disruptors, developers and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on the Cube, your global leader in high-tech digital coverage. >>In our next segment you're going to hear from three gentlemen on a panel about adaptive data governance — we're going to talk a lot about that. Please welcome Yusef Khan, the global director of data services for Io-Tahoe. We also have Santiago Castor, the chief data officer at the First Bank of Nigeria, and John Vander Wal, Oracle's senior manager of digital transformation and industries. Gentlemen, it's great to have you joining us on this panel. >>Great to be here. >>Thanks for having me. >>Alright, Santiago, we're going to start with you. Can you talk to the audience a little bit about the First Bank of Nigeria and its scale? This is beyond Nigeria — talk to us about that. >>Yes, so First Bank of Nigeria was created 125 years ago — one of the oldest, indeed the oldest, in Africa. Over its history it has grown everywhere in the region and beyond. I myself am based in London, which is, in a way, a headquarters, and the bank really promotes trade finance, institutional banking, corporate banking and private banking around the world, in particular in relation to Africa. We are also in Asia and in the Middle East.
>>So, Santiago, talk to me about what adaptive data governance means to you, and how it helps the First Bank of Nigeria to innovate faster with the data that you have. >>Yes, I like that concept of adaptive data governance, because it's, I would say, an approach that can really happen today with the new technologies — before, it was much more difficult to implement. So just to give you a little bit of context: I worked in consulting for 16, 17 years before joining First Bank of Nigeria, and I saw many organizations trying to apply different types of approaches to governance. In the early days it was really a hierarchical, top-down approach, where data governance was seen as implementing a set of rules, policies and procedures from the top down. Now, it's important to have the backing of your C-level, of your directors — but I saw the ways that fails, and you really need a complementary, you could say bottom-up, approach. Actually, as a CDO I'm really trying to decentralize the governance. Instead of imposing a framework that some people in the business don't understand or don't care about, it really needs to come from them. What I'm trying to say is that data basically supports business objectives, and every business area needs information to take decisions — to actually be more efficient, or create value, etcetera. Now, depending on the business questions they have to solve, they will need certain data sets, and they need to be able to have data quality for their own purposes. When they understand that, they become the stewards, naturally, of their own data sets. And that is where my bottom-up is meeting my top-down.
You can guide them from the top, but they need to be empowered themselves, and be flexible enough to adapt to the different questions they have, in order to respond to the business needs. I cannot impose the same thing on everyone; I need them to adapt and bring their own answers to their own business questions. That is adaptive data governance. And all of that is possible because — as I was saying at the very beginning, just to finish the point — we have new technologies that allow you to do this: metadata classification in a very sophisticated way, so that you can actually run analytics on your metadata. You can understand your different data sources in order to create classifications — like nationalities, a way of classifying your customers, your products, etcetera. >>One of the things you just said, Santiago, kind of struck me: to enable the users to be adaptive, they probably don't want to be logging a support ticket. So how do you support that sort of self-service to meet the demand of the users, so that they can be adaptive? >>More and more, business users want autonomy, and they basically want to be able to grab the data and answer their own questions. When you have that, it's great, because then you have demand — businesses asking for data, asking for insight. So how do you actually support that? I would say there is a changing culture happening more and more, and even the current pandemic has helped a lot with that. Technology, of course, is one of the biggest winners: without technology we couldn't have been working remotely, with people logging in from their homes and still having data marketplaces where they self-serve their information. But even beyond that, data is a big winner.
Data, because the pandemic has shown us that crises happen, that we cannot predict everything, and that we are facing new kinds of situations, out of our comfort zone, where we need to explore, adapt and be flexible. How do we do that? With data. Every single company either saw revenue going down, or revenue going very much up for those companies that are already very digital. It changed the reality, so they needed to adapt, and for that they needed information, in order to think and innovate and try to create responses. So that type of self-service of data — that hunger for data in order to understand what's happening when the perspective is changing — is something that is becoming more of the topic today, because of the pandemic and because of the new capabilities, the technologies that allow it. And then you are able to help your data citizens, as I call them — the people in the organization who know the business and can actually start playing with data and answering their own questions. So these technologies give more accessibility to the data: cataloging, so they can understand where to go or what to find; lineage and relationships — all of this is basically the new type of platform and tooling that allows you to create what's called a data marketplace. I think these new tools are really strong because they now allow people who are not technology or IT people to play with data, because it comes in the digital idiom they're used to. To give an example: with Io-Tahoe you have a very interesting search functionality, where if you want to find the data you want to self-serve, you go to that search and you actually look for your data. Everybody knows how to search in Google, everybody searches the internet — so this is part of the data culture, the digital culture. They know how to use those tools.
Now, similarly, in that data marketplace you can, for example, see which data sources are mostly used. >>And that enables the speed that we're all demanding today, during these unprecedented times. John, I wanted to go to you. In the spirit of evolution, technology is changing — talk to us a little bit about Oracle Digital. What are you guys doing there? >>Yeah, thank you. Well, Oracle Digital is a business unit of Oracle EMEA. We focus on emerging countries, as well as smaller and mid-market enterprises in more developed countries. Four years ago this started with the idea of engaging digitally with our customers, from central hubs across EMEA. That means engaging with video, having conference calls, having a wall — a green wall — that we stand in front of to engage with our customers. No one at that time could have foreseen the situation today, and it helps us engage with our customers in the way we were already doing. Then, about my team: the focus of my team is to have early-stage conversations with our customers on digital transformation and innovation, and we also have a team of industry experts who engage with our customers and share expertise across EMEA, and we inspire our customers. The outcome of these conversations, for Oracle, is a deep understanding of our customers' needs, which is very important so we can help the customer; and for the customer it means that we will help them, with our technology and our resources, to achieve their goals. >>It's all about outcomes, right? So, John, in terms of automation, what are some of the things Oracle's doing to help your clients leverage automation to improve agility, so that they can innovate faster — which, in these interesting times, is demanded? >>Yeah, thank you. Well, traditionally Oracle is known for its databases, which have been innovated year over year.
The latest innovation is the Autonomous Database and Autonomous Data Warehouse. For our customers this means a reduction in operational costs by 90%, with a multi-model converged database and machine-learning-based automation for full lifecycle management. Our database is self-driving: we automate database provisioning, tuning and scaling. The database is self-securing: that means ultimate data protection and security. And it's self-repairing: it automates failure detection, failover and repair. And then the question is, for our customers, what does it mean? It means they can focus on their business instead of maintaining their infrastructure and their operations. >>That's absolutely critical. Yusef, I want to go over to you now. We've talked about the massive progression in technology, the evolution of it, but we know that whether we're talking about data management or digital transformation, a one-size-fits-all approach doesn't work to address the challenges that the business has and that the IT folks have. As you look across the industry, with what Santiago told us about First Bank of Nigeria — what are some of the changes that Io-Tahoe is seeing throughout the industry? >>Well, Lisa, the first way I'd characterize it is to say that the traditional top-down approach to data — where you have almost a data policeman who tells you what you can and can't do — just doesn't work anymore. It's too slow; it's too resource-intensive. Data management, data governance, digital transformation itself — it has to be collaborative, and there has to be a personalization to data users. In the environment we find ourselves in now, it has to be about enabling self-service as well. A one-size-fits-all model, when it comes to those things around data, doesn't work. As Santiago was saying, it needs to be adapted to how the data is used
and who is using it. In order to do this, companies, enterprises, organizations really need to know their data. They need to understand what data they hold, where it is, and what its sensitivity is. They can then, in a more agile way, apply appropriate controls and access, so that people themselves, and groups within businesses, can do their jobs and innovate. Otherwise everything grinds to a halt, and you risk falling behind your competitors. >>Yeah, that one-size-fits-all term just doesn't apply when you're talking about adaptability and agility. So we heard from Santiago about some of the impact they're making with First Bank of Nigeria. Yusef, talk to us about some of the business outcomes you're seeing other customers achieve, leveraging automation, that they could not achieve before. >>It's automatically being able to classify terabytes and terabytes of data, or even petabytes of data, across different sources, to find duplicates which you can then remediate and delete. Now, with the capabilities that Io-Tahoe offers and that Oracle offers, you can do things that are not just a five-times or ten-times improvement — it actually enables you to do projects, full stop, that otherwise would fail or that you would just not be able to do. I mean, classifying multi-terabyte and multi-petabyte estates across different sources and formats, very large volumes of data — in many scenarios you just can't do that manually. We've worked with government departments where the issues are the result of fragmented data: there are a lot of different sources, a lot of different formats, and without these newer technologies to address that with automation and machine learning, the project just isn't doable. But now it is, and that can lead to a revolution in some of these businesses and organizations. >>To enable that revolution, there's got to be the right cultural mindset. And that's one of the things Santiago was talking about — folks really adopting that.
The thing I always call it is getting comfortably uncomfortable — but that's hard for organizations to do. The technology is here to enable it. Yusef, when you're talking with customers, how do you help them build the trust and the confidence that the new technologies and new approaches can deliver what they need? How do you help drive that change in the culture? >>It's a really good question, because it can be quite scary. I think the first thing we'd start with is to say: look, the technology is here, with businesses like Io-Tahoe and like Oracle — it's already arrived. What you need to be comfortable doing is experimenting, being agile around it, and trying new ways of doing things, if you don't want to get left behind. Santiago and the team at FBN are a great example of embracing it, testing it on a small scale and then scaling up. At Io-Tahoe we offer what we call a data health check, which can actually be done very quickly, in a matter of a few weeks. We'll work with a customer, pick a use case, install the application, analyze the data, and drive out some quick wins. For example, we worked in the last few weeks with a large energy supplier, and in about 20 days we were able to give them an accurate understanding of their critical data elements, help apply data protection policies, minimize copies of the data, and work out what data they needed to delete to reduce their infrastructure spend. So it's about experimenting on that small scale, being agile, and then scaling up in a very modern way. >>Great advice. Santiago, I'd like to go back to you. Looking again at that topic of culture, and the need to get that mindset in place to facilitate these rapid changes, I want to understand — kind of a last question for you — how you're doing that from a digital transformation perspective. We know everything is accelerating in 2020.
So how are you building resilience into your data architecture, and also driving the cultural change that can help everyone through this shift to remote working and all of the digital challenges and changes we're going through? >>The new technologies allow us to discover the data anywhere, to profile it and see information very quickly, to have new models of governing the data, and to give autonomy to our different data units. From that autonomy, they can then compose and innovate in their own ways. So for me — we're talking about resilience because, in a way, autonomy and flexibility in an organization, in a data architecture and platform, give you resilience. The organizations and business units that, in my experience, have been working well during the pandemic are those that — precisely because people are no longer physically present in the office — give people their autonomy, let them actually engage on their own, do their own jobs, and trust them. And as you give them that, they start innovating and having really interesting ideas. So autonomy and flexibility, I think, are a key component of the new infrastructure. The new reality has shown us that, yes, we used to be very structured — policies and procedures were very important — but now we've learned flexibility and adaptability at the same time. When you have that, another key component of resilience is speed, because people want to access the data, access it fast, and analyze it fast. Things are changing so quickly nowadays that you need to be able to iterate with your information to answer your questions.
So technology that allows you to be flexible, iterating in a very fast, continuous way, will allow you to be resilient — because you are flexible, you adapt, and you keep answering questions as they come, without having everything set in a structure that is too rigid. We are also a partner of Oracle, and Oracle's databases are great: they have embedded within the transactional system many algorithms that allow us to calculate as the transactions happen. And when our customers engage with those algorithms — and likewise with Io-Tahoe's machine learning, which speeds up and automates how you find your data — it allows you to create a new alliance with the machine. The machine is there to be, in a way, your best friend: to handle more volume of data, calculated faster, and to cover more variety. I mean, we couldn't cope without being connected to these algorithms. >>That engagement is absolutely critical, Santiago. Thank you for sharing that. I do want to wrap really quickly. John, one last question for you: Santiago talked about Oracle, and you've talked about it a little bit. As we look at digital resilience, talk to us in the last minute about the evolution of Oracle — what you're doing to help your customers get the resilience they have to have, to not just survive but thrive. >>Yeah. Oracle has a cloud offering for infrastructure, database and platform services, and complete solutions offered as SaaS. As Santiago also mentioned, we are using AI across our entire portfolio, and this helps our customers focus on their business innovation and capitalize on data by enabling new business models. Oracle has a global footprint with our cloud regions, and is massively investing in innovating and expanding its cloud.
And by offering cloud as public cloud in our data centers, and also as private cloud with Cloud@Customer, we can meet every sovereignty and security requirement. In this way we help people to see data in new ways, discover insights and unlock endless possibilities. And maybe one of my takeaways, when I speak with customers, is that I always tell them: you'd better start collecting your data now. We enable this, and partners like Io-Tahoe help us as well. If you collect your data now, you are ready for tomorrow — you can never collect your data backwards. So that is my takeaway for today. >>You can't collect your data backwards — excellent, John. Gentlemen, thank you for sharing all of your insights; a very informative conversation. In a moment, we'll address the question: do you know your data? >>Are you interested in test-driving the Io-Tahoe platform? Kick-start the benefits of data automation for your business through the Io-Tahoe data health check program: a flexible, scalable sandbox environment on the cloud of your choice, with set-up, service and support provided by Io-Tahoe. Book time with a data engineer to learn more and see Io-Tahoe in action. >>From around the globe, it's the Cube, presenting adaptive data governance, brought to you by Io-Tahoe. >>In this next segment we're going to be talking to you about getting to know your data, and specifically you're going to hear from two folks at Io-Tahoe: enterprise account executive Savita Davis, as well as enterprise data engineer Patrick Simon. They're going to be sharing insights, tips and tricks for how you can get to know your data — and quickly. We also want to encourage you to engage with Savita and Patrick: use the chat feature to the right to send comments, questions or feedback, so you can participate. Alright, Patrick, Savita, take it away. >>Thanks, Lisa — great to be here. As Lisa mentioned, guys, I'm the enterprise account executive here at Io-Tahoe. Pat? >>Yeah.
Hey, everyone — so great to be here. As she said, my name is Patrick Simon, and I'm the enterprise data engineer here at Io-Tahoe. We're so excited to be here and talk about this topic, as one thing we're really trying to perpetuate is that data is everyone's business. >>So, guys, Pat and I have actually had multiple discussions with clients from different organizations, in different roles — so we've spoken with both technical and non-technical audiences. While they were interested in different aspects of our platform, what they had in common was that they wanted to make data easy to understand and usable. That comes back to Pat's point of data being everybody's business, because no matter your role, we're all dependent on data. So what Pat and I wanted to do today is walk you through some of those client questions — slash pain points — that we're hearing from different industries and different roles, and demo how our platform here at Io-Tahoe is used for automating data-related tasks. With that said, are you ready for the first one, Pat? >>Yeah, let's do it. >>Great. So I'm going to put my technical hat on for this one. I'm a data practitioner, and I just started my job at ABC Bank. I have over 100 different data sources — data kept in data lakes, legacy data sources, even the cloud. My issue is: I don't know what those data sources hold, I don't know what data is sensitive, and I don't even understand how that data is connected. So how can Io-Tahoe help? >>Yeah, I think that's a very common experience many are facing, and definitely something I've encountered in my past. Typically the first step is to catalog the data, and then start mapping the relationships between your various data stores. Now, more often than not, this is tackled through numerous meetings and a combination of Excel and something similar to Visio — two great tools in their own right, but very difficult to maintain
just due to the rate at which we're creating data in the modern world. It starts to beg for an approach that can scale with your business needs, and this is where a platform like Io-Tahoe becomes so appealing. You can see here a visualization of the data relationships created by the Io-Tahoe service. What's fantastic about this is that not only is it laid out in a very human, digestible format — in the same action of creating this view, the data catalog was constructed. >>So is the data catalog automatically populated? >>Correct. >>Okay, so what I'm getting by using Io-Tahoe is this complete, unified, automated platform — without the added cost, of course. >>Exactly, and that's at the heart of Io-Tahoe. A great feature of that data catalog is that Io-Tahoe will also profile your data as it creates the catalog, assigning some meaning to those pesky column_1s and custom_variable_10s — they're always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exist. Personally, I'm a big fan of this view, as it really helps the eye jump naturally to the focal points that coincide with the key columns. Following that train of thought, let's examine the customer ID column, which seems to be at the center of a lot of these relationships. We can see it's a fairly important column, as it maintains relationships with at least three other tables. You'll notice all the connectors are in this blue color: those are system-defined relationships. But Io-Tahoe goes the extra mile and creates these orange-colored connectors as well — relationships that our machine-learning algorithms have predicted, which you can leverage to try to build new and powerful relationships within your data. >>So this is really cool, and I can see how this could be leveraged quickly.
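As a rough illustration of the orange, machine-predicted connectors Pat describes, one simple way a tool can guess that two columns are related — even when their names don't match — is to profile the values in each column and flag pairs whose value sets overlap strongly. This sketch is an assumption about the general technique, not Io-Tahoe's actual algorithm; the table names, sample values and threshold are all invented for illustration.

```python
# Predict candidate join relationships by value-set overlap (Jaccard similarity).
# Columns whose sampled values overlap strongly are likely join keys, even if
# one side has an unhelpful name like "cust_ref".

tables = {
    "customers.customer_id": {101, 102, 103, 104},
    "orders.cust_ref": {101, 102, 103},          # unnamed FK back to customers
    "orders.order_id": {9001, 9002, 9003},
    "tickets.customer_id": {102, 103, 104},
}

def jaccard(a, b):
    """Similarity of two value sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def predict_relationships(columns, threshold=0.5):
    """Return column pairs whose value overlap suggests a relationship."""
    names = sorted(columns)
    pairs = []
    for i, left in enumerate(names):
        for right in names[i + 1:]:
            score = jaccard(columns[left], columns[right])
            if score >= threshold:
                pairs.append((left, right, round(score, 2)))
    return pairs

predicted = predict_relationships(tables)
```

With this toy data, the three customer-ID-like columns pair up while the unrelated `order_id` column does not — mirroring how predicted connectors can surface relationships a human would otherwise have to find in meetings.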
Now, what if I add new data sources, or have multiple data sources, and need to identify what data is sensitive — can Io-Tahoe detect that? >>Yeah, definitely. Within the Io-Tahoe platform there are already over 300 pre-defined policies, such as HIPAA, CCPA and the like. One can choose which of these policies to run against their data, allowing for flexibility and efficiency in running the policies that affect your organization. >>Okay, 300 is an exceptional number, I'll give you that. But what about internal policies that apply to my organization — is there any ability for me to write custom policies? >>Yeah, that's no issue, and it's something clients leverage fairly often. To utilize this function, one simply has to write a regex, which our team has helped many deploy. After that, the custom policy is stored for future use. To profile sensitive data, one then selects the data sources they're interested in and the policies that meet their particular needs. The interface will automatically tag your data according to the policies it detects, after which you can review the discoveries, confirming or rejecting the tagging. All of these insights are easily exported through the interface, so you can work them into the action items within your project management systems. I think this lends itself to collaboration, as a team can work through the discoveries simultaneously, and as each item is confirmed or rejected, they can see it near-instantaneously. All of this translates to a confidence that, with Io-Tahoe, you can be sure you're in compliance. >>I'm glad you mentioned compliance, because that's extremely important to my organization. So what you're saying is, using the Io-Tahoe automated platform, we'd be 90% more compliant than if we were relying on a human? >>Yeah, definitely — the collaboration and documentation that the Io-Tahoe interface lends itself to really help you build that confidence that your compliance is sound.
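To make the custom-policy idea Pat describes concrete: at its core, a policy of this kind is a named regular expression run against sampled column values, with columns tagged when most of their values match, ready for a human to confirm or reject. This is a minimal sketch under that assumption — the policy names, sample data and hit-rate threshold are illustrative, and Io-Tahoe's real pre-defined policies (HIPAA, CCPA, etc.) are of course far richer.

```python
# Tag columns as sensitive when sampled values match a named policy regex.
import re

policies = {
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

samples = {
    "customers.contact": ["ada@example.com", "bob@example.org"],
    "customers.tax_ref": ["123-45-6789", "987-65-4321"],
    "customers.notes": ["called on Tuesday", "prefers email"],
}

def tag_sensitive(columns, policies, min_hit_rate=0.8):
    """Tag a column with a policy when most sampled values match its regex."""
    tags = {}
    for column, values in columns.items():
        for name, pattern in policies.items():
            hits = sum(1 for v in values if pattern.match(v))
            if values and hits / len(values) >= min_hit_rate:
                tags.setdefault(column, []).append(name)
    return tags  # each tag would then be confirmed or rejected by a reviewer

found = tag_sensitive(samples, policies)
```

The hit-rate threshold matters: requiring most values to match keeps a stray email address inside a free-text notes column from tagging the whole column as sensitive.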
>>So we're planning a migration, and I have a set of reports I need to migrate. What I need to know is: what data sources are those reports dependent on, and what's feeding those tables? >>Yeah, that's a fantastic question. Identifying the critical data elements, and the interdependencies within the various databases, can be a time-consuming but vital process in a migration initiative. Luckily, Io-Tahoe has an answer, and again it's presented in a very visual format. >>So what I'm looking at here is my entire data landscape? >>Yes, exactly. >>Let's say I add another data source — can I still see that unified 360 view? >>Yeah. One feature that's particularly helpful is the ability to add data sources after the data lineage discovery has finished, allowing for the flexibility and scope necessary for any data migration project. Whether you only need to select a few databases or your entire estate, this service will provide the answers you're looking for. The visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections are driven both by system-defined flows and by those predicted by our algorithms, the confidence of which can be customized to make sure they're meeting the needs of the initiative you have in place. It also provides tabular output, in case you need it for your own internal documentation or for your action items, which we can see right here. In this interface you can also confirm or deny the predicted pairs, to make sure the lineage is as accurate as possible. Does that help with your data lineage needs? >>Definitely. So, Pat, my next big question is this: now I know a little bit about my data — how do I know I can trust it? What I'm really interested in knowing is: is it in a fit state for me to use? Is it accurate? Does it conform to the right format?
>>Yeah, that's a great question, and I think it's a pain point felt across the board, by data practitioners and data consumers alike. Another service that Io-Tahoe provides is the ability to write custom data quality rules and understand how well the data conforms to those rules. This dashboard gives a unified view of the strength of these rules and your data's overall quality. >>Okay, so Pat, on the accuracy scores there: if my marketing team needs to run a campaign, can they depend on those accuracy scores to know which tables have quality data to use for the campaign? >>Yeah. This view allows you to understand your overall accuracy, as well as dive into the minutiae to see which data elements are of the highest quality. So for that marketing campaign, if you need everything in strong form, you'll see that very quickly in these high-level numbers; but if you're only dependent on a few columns to get that information out the door, you can find that within this view. So you no longer have to rely on reports about reports — instead, you just come to this one platform, which helps drive conversations between stakeholders and data practitioners. >>So I see now the value Io-Tahoe brings by automatically capturing all that technical metadata from sources. But how do we match that with the business glossary? >>Yeah — within the same data quality service we just reviewed, one can actually add business rules detailing the definitions and the business domains they fall into. What's more, the data quality rules we were just looking at can then be tied to these definitions. It's this service that empowers stakeholders across the business to be involved in the data lifecycle and take ownership of the rules that fall within their domain. >>Okay, so those custom rules — can I apply them across data sources?
>>Yeah, you can bring in as many data sources as you need, so long as you can tie them to that unified definition. >>Okay, great. Thanks so much, Pat. And we just want to quickly say to everyone working in data: we understand your pain, so please feel free to reach out to us via our website, and let's get a conversation started on how Io-Tahoe can help you automate all those manual tasks to save you time and money. Thank you. >>Thank you. >>If I could ask you one quick question. You just walked us through this great banking example; how do you advise customers to get started? >>Yeah, I think the number one thing customers can do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as these quality rules, as well as identifying those kinds of tricky columns that might exist in your data, like the custom variable columns I mentioned before. >>Last question: anything to add to what Pat just described as a starting place? >>No, I think Pat summed that up pretty well. Just by automating all those manual tasks, we can definitely save your company a lot of time and money, so we encourage you to reach out to us and get that conversation started. >>Excellent. So, Pete and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data and make sure it's quality data, so you can maximize the value of it. Thanks for watching.
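The custom data quality rules and per-column accuracy scores that Pat walks through in the demo can be sketched in a few lines of Python. This is a minimal illustration only; the rule set, column names, and sample rows here are hypothetical and are not Io-Tahoe's actual rule engine or API:

```python
import re

# Hypothetical rule set: each rule maps a column name to a pass/fail predicate.
# Column names and formats are invented for illustration.
RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "signup_date": lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v or "")),
    "customer_id": lambda v: v is not None and str(v).strip() != "",
}

def accuracy_scores(rows):
    """Return per-column accuracy: the fraction of rows passing each rule."""
    scores = {}
    for col, rule in RULES.items():
        passed = sum(1 for r in rows if rule(r.get(col)))
        scores[col] = passed / len(rows) if rows else 0.0
    return scores

rows = [
    {"customer_id": "C1", "email": "a@example.com", "signup_date": "2020-11-05"},
    {"customer_id": "C2", "email": "not-an-email", "signup_date": "05/11/2020"},
    {"customer_id": "",   "email": "b@example.com", "signup_date": "2021-01-02"},
]
print(accuracy_scores(rows))
```

In practice, a platform like the one demoed layers rule authoring, business-glossary definitions, and dashboards on top of scoring logic of roughly this shape, so a marketing team can check a column's score before depending on it.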
Thanks again, Lisa, for that very insightful and useful deep dive into the world of adaptive data governance with Io-Tahoe, Oracle, and First Bank of Nigeria. This is Dave Vellante. You won't want to miss Io-Tahoe's fifth episode in the data automation series, in which we'll talk to experts from Red Hat and Happiest Minds about their best practices for managing data across hybrid cloud, inter-cloud, and multi-cloud IT environments. So mark your calendar for Wednesday, January 27th; that's episode five. You're watching theCUBE, the global leader in digital event tech coverage.
Ajay Vohora | Io-Tahoe Smart Data Marketplaces
>>From around the globe, it's theCUBE, with digital coverage of smart data marketplaces, brought to you by Io-Tahoe. >>Digital transformation has really gone from buzzword to mandate. A digital business is a data business, and for the last several months we've been working with Io-Tahoe on an ongoing content series focused on smart data and automation to drive better insights and outcomes, essentially putting data to work. And today we're going to do a deeper dive on automating data discovery, and one of the thought leaders in this space is Ajay Vohora, the CEO of Io-Tahoe, once again joining me. Ajay, good to see you. Thanks for coming on. >>Great to be here, David. Thank you. >>So let's start by talking about some of the business realities. What are the economics that are driving automated data discovery? Why is that so important? >>Yeah, on this one, David, it's a number of competing factors. There's the reality of data which may be sensitive, so there's control. Then there are the other elements: wanting to drive value from that data through innovation. You can't really drive a lot of value without exchanging data, so there's the ability to exchange data and to manage the costs and overheads. Data discovery is at the root of managing all of that in an automated way: classifying the data and putting sets of policies and automation in place. >>Yeah. Okay, let's bring up a picture of this, guys, because I want, Ajay, to help the audience understand where automated data discovery fits in here. As we've talked about, this is a complicated situation for a lot of customers. They've got a variety of different tools, and you've really laid it out nicely here in this diagram. So take us through sort of where that fits. >>Yeah. At the right-hand side there is exchange. We're really now in a data-driven economy where everything's connected through APIs that we consume via mobile apps.
And what's not apparent is the chain of activities and tasks that have to go into serving that data to an API. At the outset there may be many legacy systems, technologies, and platforms, on-premise, cloud, hybrids, you name it, and across those silos, getting to a unified view is the heavy lifting. I think we've seen some great impact that BI tools such as Power BI, Tableau, and Looker in our ecosystem have had on visualizing data, and CEOs, managers, and people working in companies day to day get a lot of value from seeing what the real-time activity was, or what the trend was this month versus last month. As for the tools to enable that, we hear a lot of good things about the work we're doing with Snowflake and MongoDB on the public cloud platforms, GCP, AWS, and Azure, around enabling and building those pipelines to feed into those analytics. But what often gets hidden is how you source that data, which could be locked into a mainframe, a data warehouse, or IoT data. Pulling all of that together is the reality, and it's a lot of heavy lifting: it's hands-on work that can be time-consuming. The issue there is that the data may have value. It might have the potential to impact the top line for a business and outcomes for consumers, but you never know for sure unless you've done the investigation, discovered it, unified it, and are able to serve it through to other technologies. >>Guys, would you bring that picture back up again? Because Ajay, you made a point, and I want to land on that for a second. There's a lot of manual curating. An example would be the data catalog; data scientists complain all the time that they're manually wrangling data. So you're trying to inject automation into the cycle. And then the other piece that I want you to address is the importance of APIs.
You really can't do this without an architecture that allows you to connect things together; that sort of enables some of the automation. >>Yeah, let me take that in two parts. First, the APIs: virtual machines connected by APIs, business rules and business logic driven by APIs, applications. Everything across the stack, from infrastructure down to the network and hardware, is all connected through APIs, and the work of serving data through to an API, building these pipelines, is often miscalculated in terms of just how much manual effort it takes. We've got a nice list here of what we automate, down at the bottom: those tasks of indexing, labeling, and mapping across different legacy systems. All of that takes away from the job of a data scientist or data engineer looking to produce value, monetize data, and help their business's data consumers. >>Yes. So it's that top layer that the business sees; of course, a lot of work has to go into achieving that. I want to talk about some of the key tech trends that you're seeing, and one of the things we've talked about a lot is metadata. The importance of metadata can't be understated. What are some of the big trends that you're seeing around metadata and others? >>Yeah, I'll summarize it as five. There's a trend now to look at metadata more holistically across the enterprise, and that really makes sense when trying to look across different data silos and apply a policy to manage that data. So that's the control piece; that's that lever. On the other side, sometimes competing with that control around sensitive data and managing the costs of data, is innovation: being able to speculate and experiment and try things out where you don't really know what the outcome is. If you're a data scientist or engineer, you've got a hypothesis.
And now you've got that tension between control over data and innovation, driving value from it. So enterprise-wide metadata management is really helping to know where that latent value might be across those sets of data. The other piece is adaptive data governance. The controls that stem from the data policy owners and data stewards, where they're trying to protect the organization, protect the brand, and protect consumers' data, are necessary. But in different use cases you might want to nuance and apply a different policy to govern that data, relevant to the context; where you have data that is less sensitive, it can be used for innovation, and adapting the style of governance to fit the context is another trend we're seeing come up here. A few others: we're investing quite extensively in automating data discovery. We're now breaking that down into what we can direct. Where a business outcome is a known, up-front objective, we can direct the data discovery towards it, and that means applying our technology and our tools towards solving a known problem. The other one is autonomous data discovery, which means allowing background processes to pin down what changes are happening with data over time and flagging the anomalies. The reason that's important is that when you look over a length of time and see different spikes, different trends in activity, that's really giving a DataOps team the ability to manage and calibrate how they're applying policies and controls to data. Then the last one, David, that we're seeing is this huge drive towards self-service: reimagining how to put policy and data governance into the hands of a data consumer inside a business, or indeed the consumer themselves.
That self-service, whether it's a banking customer or a healthcare customer, means making sure that the policies, the controls, and the rules are all in place to adaptively serve the data marketplaces they're involved in creating. >>I want to ask you about autonomous data discovery and adaptive data governance. Is the problem you're addressing one of quality, in other words, that machines are better than humans at doing this? Is it one of scale, that humans just don't scale that well? Is it both? Can you add some color to that? >>Yeah. Honestly, it's the same equation that existed 10 years ago, 20 years ago. It's being exacerbated, but that equation is: how do I control the things I need to protect? How do we enable innovation where it's going to deliver business value? How do I exchange data with a customer or somebody in my supply chain safely? And all of that while managing the fourth leg, which is cost overheads. There's no open checkbook here; as a CIO or CDO, I've got to figure out how to do all of this within a fixed budget. Those aspects have always been there. Now, with more choices, infrastructure in the cloud, API-driven applications, on-premise, the choices a business has are expanding, but that's also creating a layer of management and data governance that really has to manage those four aspects: control, innovation, exchange of data, and cost overheads. >>That top layer of the first slide that we showed was all about business value, so I wonder if we could drill into the business impact a little bit. What are your customers seeing, specifically, in terms of the impact of all this automation on their business? >>Yeah, we've had some great results. I think the biggest have been helping customers move away from manually curating their data and their metadata.
There used to be a time when data quality or data governance initiatives meant teams of people manually feeding a data catalog. It's great to have an inventory of classified data, to be able to understand a single version of the truth, but having 10 to 15 people manually process that and keep it up to date is a challenge when it's a moving feast. What's true about data today? Add another few sources to your business in a few months' time, start collaborating with new partners, and suddenly the landscape has changed and the amount of work has grown. What we're finding is that by automating the data discovery that feeds the data catalog, we're releasing a lot more time for our customers to spend on innovating with and managing their data. A couple of others: around self-service data analytics, we're moving the choices of which data might have business value into the hands of business users and data consumers, giving them faster cycle times around generating insights, and we're really helping there by automating the creation of the data sets that are needed. And the last piece where we're seeing impact, more recently, is in the exchange of data. There are a number of marketplaces out there who are now being compelled to become more digital, to rewire their business processes, and everything from an RPA initiative to automation to digital transformation is having CIOs, chief data officers, and enterprise architects rethink how they rewire the pipelines they need to feed that digital transformation.
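The automated tag discovery that Ajay describes feeding a data catalog can be sketched as simple value profiling. A real discovery engine combines machine learning with metadata signals; this regex-and-threshold version is only an illustration, and the patterns, tags, and threshold are hypothetical:

```python
import re

# Hypothetical classifiers: each tag has a pattern that column values must match.
# A production tool would use ML plus metadata; this only sketches the idea.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"\+\d[\d\-\s]{6,14}$"),   # requires a leading "+"
    "date":  re.compile(r"\d{4}-\d{2}-\d{2}$"),
}

def discover_tags(column_values, threshold=0.8):
    """Tag a column when at least `threshold` of its values match a pattern."""
    tags = []
    for tag, pat in PATTERNS.items():
        hits = sum(1 for v in column_values if pat.match(str(v)))
        if column_values and hits / len(column_values) >= threshold:
            tags.append(tag)
    return tags

print(discover_tags(["a@x.com", "b@y.org", "c@z.net"]))   # ['email']
print(discover_tags(["2021-01-02", "2020-12-31"]))        # ['date']
```

The threshold is what makes this "autonomous" in spirit: a column is tagged from the shape of its contents rather than from a hand-maintained catalog entry, and a DataOps team can then confirm or reject the suggested tags.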
But you know, the job of a chief data officer has gone from data quality and governance and compliance to really figuring out how data can be monetized, not necessarily selling the data, but how it contributes to the monetization of the company, and then really understanding, specifically for that organization, how to apply it. And that is a big challenge. We chatted about it 10 years ago, in the early days of Hadoop, and back then maybe 1% of companies had enough engineers to figure it out. But now the tooling is available, the technology is there, and the practices are there, and that really, to me, is the bottom line, Ajay. As they say, show me the money. >>Absolutely. It's definitely about linking it to the customer, and where we're helping there is to bring together those disparate, siloed sources of data to understand the needs of the patient, or of the broker if it's insurance, or of the supply chain manager if it's manufacturing. Providing that 360 view of data is helping that individual unlock the value for the business. So data is providing the lens, provided you know which data it is that can assist in doing that. >>And you know, you mentioned RPA before. An RPA customer told me she was a Six Sigma expert, and she said we would never have tried to apply Six Sigma to a business process, but with RPA we can do so very cheaply. What that means is lower cost, which means better employee satisfaction and, really importantly, better customer satisfaction and better customer outcomes. Let's talk about healthcare for a minute, because it's a really important industry, one that is ripe for disruption and has really been, up until recently, pretty slow to adopt a lot of the major technologies that have been made available. What are you seeing in terms of this theme of putting data to work in healthcare
specifically? >>Yeah, healthcare's had a lot thrown at it. There's been a lot of change in terms of legislation recently, particularly in the U.S. market and in other economies. Healthcare is on a path to becoming more digital, and part of that is around price transparency. To operate effectively as a healthcare marketplace, being able to have price transparency around what an elective procedure is going to cost, before taking it forward, is super important to an informed decision. If we look at the U.S., for example, healthcare costs have risen to $4 trillion annually, but even with all of that cost, we have healthcare consumers who are sometimes reluctant to take up healthcare even when they have symptoms, and a lot of that is driven by not knowing what they're opening themselves up to. David, if you or I want to book travel, a holiday maybe, or a trip, we want to know what we're in for and what we're paying for up front. But sometimes in healthcare, the choice, the option, might be there on the plan, but the cost that comes with it isn't. So recent legislation in the U.S. is certainly helpful in bringing forward that price transparency. The underlying issue there is the disparity of formats and types of data being used by payers, patients, employers, and different healthcare departments when trying to make that work. Where we're helping, on the aspect related to price transparency in particular, is in making that data machine-readable. Sometimes with data the beneficiary might be a person, but in a lot of cases now we're seeing the ability to have different systems interact and exchange data in order to process a workflow. Generating online lists of pricing from a provider, as negotiated with a payer, is really an enabling factor.
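Making price data machine-readable, as Ajay describes, means publishing it in a structured format that systems can exchange and query rather than a PDF a person has to read. A minimal sketch, assuming an invented JSON shape (the actual U.S. price transparency schemas are considerably more involved, and the provider, codes, and rates below are made up):

```python
import json

# Illustrative machine-readable price file: each item carries a procedure
# code and the rates negotiated with payers. The schema is hypothetical.
raw = json.dumps({
    "provider": "Example Hospital",
    "items": [
        {"code": "47562", "description": "Laparoscopic gallbladder removal",
         "negotiated_rates": [{"payer": "PayerA", "rate": 6200.0},
                              {"payer": "PayerB", "rate": 5800.0}]},
        {"code": "29881", "description": "Knee arthroscopy",
         "negotiated_rates": [{"payer": "PayerA", "rate": 4100.0}]},
    ],
})

def rate_for(document, code, payer):
    """Look up the negotiated rate for one procedure code and one payer."""
    doc = json.loads(document)
    for item in doc["items"]:
        if item["code"] == code:
            for nr in item["negotiated_rates"]:
                if nr["payer"] == payer:
                    return nr["rate"]
    return None  # not negotiated or not published

print(rate_for(raw, "47562", "PayerB"))  # 5800.0
```

Once pricing is in a shape like this, downstream systems, an RPA workflow, a comparison site, or a payer portal, can answer "what will this cost me?" automatically, which is the consumer outcome the legislation is after.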
>>So, guys, I wonder if you'd bring up the next slide, which is kind of the nirvana. If you saw the previous slide, the middle there was all different shapes and presumably disparate data; this is the outcome you want to get to, where everything fits together nicely and you've got this open exchange. It's not opaque as it is today; it's not bubble gum, band-aids, and duct tape. Describe this sort of outcome you're trying to achieve, and maybe a little bit about what it's going to take to get there. >>Yeah, that's a combination of a number of things. It's making sure the data is machine-readable and made available to APIs; those could be RPA tools. We're working with technology companies that employ RPA for healthcare, specifically to manage patient and payer data and bring it together. In our data discovery, what we're able to do is classify that data and have it made available to a downstream tool, technology, or person to apply that workflow to the data. So this looks like nirvana, it looks like utopia, but it's the end objective of a journey, and we can see different economies at different stages of maturity in turning healthcare into a digital service, so that you can consume it from where you live, from home, with telemedicine and telehealth. >>Yes, and this is not just healthcare; you want to achieve that self-service data marketplace in virtually any industry. You're working with TCS, Tata Consultancy Services, to achieve this. If you're a company like Io-Tahoe, you have to have partnerships with organizations that have deep industry expertise. Talk about your relationship with TCS and what you guys are doing specifically in this regard. >>Yeah, we've been working with TCS now for a long while.
We'll be announcing some of those initiatives here, where we're now working together to reach their customers. They've got a brilliant framework, Business 4.0, where they're reimagining with their clients how their business processes can operate with AI and automation and become more agile and digital. Our technology and the patents we have in our portfolio, applied at scale, on a global scale, across industries such as banking, insurance, and healthcare, are really allowing us to see a bigger impact on consumer outcomes and patient outcomes. And the feedback from TCS is that we're really helping those initiatives remove friction. They talk a lot about data friction; I think that's a polite term for the image we just saw, with the disparate technologies and the legacy that has built up. So if we want to create a transformation, having a partnership with TCS across industries is giving us that reach and an impact on many different people's day-to-day jobs and lives. >>Let's talk a little bit about the cloud. It's a topic we've hit on quite a bit here in this content series. The big hyperscalers say you should put everything into the cloud, but customers are more circumspect than that. At the same time, for machine intelligence and ML, the cloud is a place to do a lot of that; that's where a lot of the innovation occurs. So what are your thoughts on getting to the cloud, putting data to work, if you will, with the machine learning work you're doing with AWS? What's your fit there? >>Yeah, David, we work with all of the cloud platforms: Microsoft Azure, GCP, IBM.
But we're expanding our partnership now with AWS, and we're really opening up the ability to work with their greenfield accounts, where a lot of that data and technology sits in the customer's own data centers, and that's across banking, healthcare, manufacturing, and insurance. And for good reason: a lot of companies have taken the time to see what works well for them with the technologies the cloud providers are offering. In a lot of cases, testing services or analytics using the cloud, and moving workloads to the cloud to drive data analytics, is a real game changer. So there's good reason to maintain a lot of systems on-premise if that makes sense from a cost and liability point of view, and the number of clients we work with that have and will keep their mainframe systems in COBOL is no surprise to us. But equally, they want to tap into technologies that AWS has, such as SageMaker. The issue is, as a chief data officer, I don't have the budget to move everything to the cloud on day one, and I might want to show some results up front to my business users first, working closely with my chief marketing officer to look at what's happening in terms of customer trends and customer behavior. What are the customer outcomes, patient outcomes, and partner outcomes I can achieve through analytics and data science? So we're working with AWS and with clients to manage that hybrid topology: some of that data in the cloud being put to work with AWS SageMaker, and Io-Tahoe being used to identify the data that needs to be amalgamated and curated to provide the data set for machine learning and advanced analytics to have an impact for the business. >>So what are the critical attributes of what you're looking at to help customers decide what to move and what to keep, if you will? >>Well, one of the quickest outcomes we help customers achieve is to build that business glossary.
You know, the items of data that mean something to them across those different silos, pulling all of that together into a unified view. Once they've got that, a data engineer working with a business manager can think through how they want to create an application: the churn model, the loyalty or propensity model they want to put in place, or how to use predictive analytics to understand what a patient's needs are. That sort of innovation is where we look to apply tools such as SageMaker on AWS, to do the computation and build those models to deliver the outcome across that value chain. And it goes back to the first picture we put up, David. The outcome is that API; on the back of it, you've got the machine learning model that's been developed in tools such as Databricks with a Jupyter notebook. But that data has to be sourced from somewhere, and somebody has to say that, yes, you've got permission to do what you're trying to do without falling foul of any compliance around data. It all goes back to discovering that data, classifying it, and indexing it in an automated way to cut those timelines down to hours and days. >>Yeah, it's the innovation part of your data portfolio, if you will, that you're going to put into the cloud and apply tools like SageMaker and others to, your tool du jour, whatever your favorite tool is. The customer's going to choose that, and here the cloud vendors may want you to use their tools, but they're making their marketplaces available to everybody. It's that innovation piece, the one you want to apply that self-service data marketplace to and really drive, as I said before, monetization. All right, give us your final thoughts. Ajay, bring us home.
>>So, final thoughts on this, David, is that at the moment we're seeing, um, a lot of value in helping customers discover their data using automation, automatically curating a data catalog, and that unified view is then being put to work through our APIs, having an open architecture to plug in whatever tool or technology our clients have decided to use. And that open architecture is really feeding into the reality of what CIOs and chief data officers are managing, which is a hybrid on-premise and cloud approach, best of breed, and business users wanting to use a particular technology to get their business outcome, having the flexibility to do that no matter where your data is sitting. On premise or cloud is where self-service comes in; that self-service view of what data I can plug together, to exchange, to monetize. Monetizing that data is where we're starting to see some real traction, um, with customers now accelerating becoming more digital, uh, to serve their own customers. >>We really have seen a cultural mind shift going from sort of complacency, and obviously COVID has accelerated this, but the combination of that cultural shift, the cloud, and machine intelligence tools gives me a lot of hope that the promises of big data will ultimately be lived up to in this next 10 years. So, Ajay Vohora, thanks so much for coming back on theCUBE. You're a great guest, and I appreciate your insights. >>Appreciate it, David. See you next time. >>All right, and keep it right there. We'll be right back, right after this short break.
David Piester, Io-Tahoe & Eddie Edwards, Direct Energy | AWS re:Invent 2019
>>Live from Las Vegas, it's theCUBE, covering AWS re:Invent 2019. Brought to you by Amazon Web Services and Intel, along with its ecosystem partners. >>Hey, welcome back to theCUBE's coverage of AWS re:Invent 2019 from Las Vegas. This is day two of our coverage of three days, two sets, lots of CUBE content. Lisa Martin here with Justin Warren, founder and chief analyst at PivotNine. Justin and I are joined by a couple of guests new to theCUBE. We've got David Piester, seated next to me, global head of sales for Io-Tahoe. Welcome. And Eddie Edwards, with a cool name, global data services director from Direct Energy. Welcome, Eddie. Thank you. Okay, so, David, I know we had somebody from Io-Tahoe on yesterday, but I'd love for you to give our audience an overview of Io-Tahoe, and then you've got to tell us what the name means. >>Okay. Well, Dave Piester. Io-Tahoe. It's a wonderful event here at AWS and we're excited to be here. Uh, Io-Tahoe, we're located in downtown New York, on Wall Street. And Io-Tahoe, well, there's a lot of different meanings, but mainly Tahoe for the data lake; input/output into the lake is how it was originally meant. But a little background on Io-Tahoe: we started in 2014 in stealth, came out of stealth in 2017 with two signature clients. One you're going to hear from in a moment, Direct Energy; the other one, GE, and we'll speak to those in just a moment. Io-Tahoe takes a unique approach. We have nine machine learning algorithms, 14 feature sets, that interrogate the data at the data level. We go past metadata, so solving that really difficult data challenge, and I'm going to let Eddie describe some of the use cases that we're around: data migration, PII discovery, and so on. Over to you. >>Eddie, tell us a little bit about Direct Energy: where you're located, what you guys do, and how data is absolutely critical to your business. >>Yeah, sure. So, Direct Energy.
Well, it's the largest residential energy supplier in the US, around 5,000 employees. A lot of this has come from acquisitions, so as you can imagine, we have a vast amount of data that we need to manage. Currently I've got just under 1,700 applications in my portfolio, and a lot of the challenges we face are around cost, driving down the cost to serve so we can pass that back on to our consumers. And the challenge we had is how best to gain that understanding. Where Io-Tahoe came into play was mainly around the ability to use the product quickly, being able to connect to our existing sources to discover the data, and then to catalog that information and start applying the rules, whether it be legislation like GDPR, or the many cases where there are differences between the states on the understandings and definitions. The product gives us the ability to bring a common approach to that information. A good success story would be, about three months ago we took the 30-odd applications for our North America home business. We were able to run them through the product within a week, and that gave us the information to then consolidate the estate downwards, working with our business colleagues to identify all the data, put the archival retention rules on, bring more meaning to the data, and actually improve our sales opportunities by highlighting that rich information that was not known previously. >>Yes, you mentioned that you're growing through acquisition. One thing that people tend to underestimate around IT is that it's not homogeneous, it's heterogeneous. As soon as you buy another company, you've got another silo, you've got another data set, you've got something else. So walk us through how Io-Tahoe actually deals with that very disparate set of data that you've now inherited from acquiring all of these different companies?
>>Yeah, so exactly right. You know, every time we acquire an organization, they would have various different applications running in their estate, whether that be an old Oracle or SQL-type environment. What we're able to do is use the product to plug in and profile, to understand what's inside the knowledge they have around their customer base, and how we can then bring that in to build up a single view and offer additional products, value-adding products or rewards for customers, whether that be on our HVAC side, our heating, ventilation and air-con units, where again we have 4,600 engineers in that space. So it's opening up new opportunities and territories to us. >>Go ahead. >>I'd say, additionally to that, we're across multiple sectors, but the problem of death by Excel was in financial services; we're located on Wall Street, as I mentioned, and this problem of legacy, disparate data sources and understanding and knowing your data was a common problem. Banks were just throwing people at the problem. So this use case, with 1,700 applications, a lot of them legacy, fits right into what we do. And cataloging, as he mentioned: we catalog with that discovery and search engine that we have. We enable search across the enterprise, and with discovery we auto-tag and auto-classify the sensitive data into the catalog automatically, and that's a key part of what we do. >>And is that, Dave, something, in thinking of differentiation, wanting to know what is unique about Io-Tahoe, what was the opportunity that you guys saw? Is the cataloging of the sensitive information one of the key things that makes it a difference? >>We enable data governance. So it's not just sensitive information we catalog; it's the entire data set, multiple data sets. And what differentiates us is the machine learning. We interrogate and brute-force the data, every single field value, metadata and beyond. So a billion rows, 100,000 columns, large, complex data sets.
We interrogate every field value, and we tell you: this looks like a phone number, this looks like an address, this looks like a first name, this looks like a last name. And we tag that to the catalog, and then anything that's sensitive in nature will be color-coded red or green: highly sensitive, sensitive. So that's our big differentiator. >>So is that like 100% visibility into the granularity of what is in this data? >>Yes. That's one of the issues we're hearing ahead of us: we're finding a lot of folks are wanting to go to the cloud, but they can't get access to the data. They don't know their data, they don't understand it. And so that's where we're that bridge; we're a key strategic partner for AWS, and we're excited about the opportunity that's come about in the last six months with AWS, because we're going to be that key piece for migration to the cloud. >>So the data lake. I love the name, Io-Tahoe. But in your opinion, you know, you hear so many different things about data lakes turning into data swamps. Is there still a lot of value in data lakes that customers, just like you were saying before, just don't know what they have? >>Well, what's interesting, and this transitions to one of our other clients, but I just want to make a note that we actually started in the relational world, so we're already across heterogeneous environments. But Tahoe does have more to do with the lake. At a time a few years back, everybody was just dumping data into the lake. They didn't understand what was in there, and that's created, in this era of privacy, a big issue, and Comcast had this problem. The large Teradata instance, just dumping into the lake, not understanding data flows, how their data's flowing, not understanding what's in the lake, sensitivity-wise. And they want to start, you know, they want to enable BI, they want to start doing analytics, but you've got to understand and know the data, right?
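David's description above, interrogating every field value, tagging what looks like a phone number or a name, and color-coding sensitivity, can be sketched in miniature. To be clear, this is not Io-Tahoe's actual algorithm (he describes nine machine learning algorithms over 14 feature sets); the regex patterns, tag names and sensitivity levels below are invented for illustration:

```python
import re

# Illustrative patterns only; a real product scores every field value with
# ML models, not just regexes. All names here are hypothetical.
PATTERNS = {
    "phone_number": re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$"),
    "email":        re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "ssn":          re.compile(r"^\d{3}-\d{2}-\d{4}$"),
}

# Hypothetical sensitivity levels, mimicking the red/green color-coding
SENSITIVITY = {"ssn": "red", "phone_number": "amber", "email": "amber"}

def classify_column(values):
    """Guess a semantic tag for a column by voting across its field values."""
    votes = {}
    for v in values:
        for tag, pat in PATTERNS.items():
            if pat.match(str(v).strip()):
                votes[tag] = votes.get(tag, 0) + 1
    if not votes:
        return None, "green"
    tag = max(votes, key=votes.get)
    # Require a majority of values to agree before tagging the column
    if votes[tag] < len(values) / 2:
        return None, "green"
    return tag, SENSITIVITY.get(tag, "green")

tag, level = classify_column(["212-555-0198", "(415) 555-0101", "646.555.0123"])
```

The majority-vote threshold is one simple way to avoid tagging a whole column on a single coincidental match; a real classifier would weigh many more features per value.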
So for Comcast, we enable DataOps for them automatically with our machine learning. So that was one of the use cases, and then they put the information in, and we integrated with Apache Atlas; they have a large AWS instance, and they're able to then better govern their data. And so, GE Digital, one other customer: a very complex use case around their data, 36 ERPs being migrated to one virtual ERP in the lake. And think about finance data, how difficult that is to manage and understand. So we were a key piece in helping that migration happen in weeks rather than months. >>David, you mentioned cloud. Clearly we're at a cloud show, but you mentioned knowing your data. One of the aspects of the cloud is that it moves fast, and it's a much bigger scale than what we've been used to. So I'm interested, maybe, Eddie, you can fill us in here as well, about the use of a tool to help you know your data when we're not creating any less data; there's just more and more data. So at this speed and this scale, how important is it that you actually have tooling to provide to the humans who have to go and operate on all of this data? >>Building on what David was saying around the speed and the agility side, you know, now all our information for the North America home business is in AWS, held on an S3 bucket. We're already starting work with AWS Connect on the call-center side, being able to stream that information through, so we're getting to the point now as an organization where we're able to profile the data in real time and take that information both to predict what the customer is going to do and, as part of that machine learning side, we're starting to trial where we will interject into a call to say, well, you know, a customer might be on your digital site trying to do a journey, you can see the challenges around their data, and you can then go in with a chat, using, say, the new AWS chat that's just coming through at the moment.
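Eddie's point about profiling streaming call-center data in near real time can be sketched with a toy incremental profiler that updates per-field statistics as each record arrives. This is an invented illustration, not Direct Energy's pipeline or an AWS Connect API:

```python
from collections import defaultdict

class StreamProfiler:
    """Toy incremental profiler: keeps running stats per field as records
    stream in, so data quality can be inspected at any point in the stream
    instead of waiting for a batch job. Illustrative only."""

    def __init__(self):
        self.count = 0
        self.nulls = defaultdict(int)      # missing values per field
        self.distinct = defaultdict(set)   # distinct values per field

    def observe(self, record):
        self.count += 1
        for field, value in record.items():
            if value is None or value == "":
                self.nulls[field] += 1
            else:
                self.distinct[field].add(value)

    def report(self, field):
        return {
            "completeness": 1 - self.nulls[field] / self.count,
            "cardinality": len(self.distinct[field]),
        }

p = StreamProfiler()
for rec in [{"state": "TX", "plan": "fixed"},
            {"state": "NY", "plan": ""},
            {"state": "TX", "plan": "variable"}]:
    p.observe(rec)
```

Because every statistic is updated per record, `p.report("plan")` is available mid-stream, which is what makes interjecting into a live call conceivable at all.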
So. >>One of the things, the opportunities I'm hearing here, Eddie, is the opportunity to leverage the insights into the data to deliver more. You mentioned, like, customer rewards, a more personalized experience, or a call-center agent knowing: this is the problem this customer is experiencing, this is what we have tried, X, Y and Z, to resolve; or, this customer is loyal, pays their bills on time, they should be eligible for some sort of reward program. I think Amazon.com has created this demanding consumer: we expect you to know us, I expect you to serve us up things that you think we want. Talk to me about the opportunity that Io-Tahoe is giving your business to be able to delight customers in ways that you probably couldn't even have predicted. >>Well, it touches on the tagging earlier, you know, based on the understanding of the data that's coming through. Being able to use the data-flow technology and the categorization, we're able to then tie that in with the wider estate. So David mentioned Comcast; around the 36 ERPs, you know, we've just gone through the same in other parts of our organization. We're driving an additional level of value, turning away from it being a manually labor-intensive task. So I used to have 20 architects that daily would go through trying to build an understanding of the relationships. I don't need that now; I just have a couple of people that are able to take the outputs and then be able to validate the information using the product. >>And I like that. There's just so much. You mentioned customer 360, the example at a call centre. There's so much DataOps that has to happen to make that happen, and that's the most difficult challenge to solve, and that's where we come in. And after you catalogue the data, I just want to touch on this: we enable search for the enterprise, so you're now connected to 50, 100, 150 sources with our software. Now you've catalogued it. You've profiled it.
Now you can search: Karen Kim, Kim Smith. So your engineers, your architects, your data stewards, and your business analysts, these folks can now search anything they want and find anything sensitive, find that person, find an invoice, and that helps enable, but you mentioned the customer 360. >>But what I'm also hearing is it has the potential to enable a better relationship between IT and the business. >>Absolutely. It brings those both together, because they're so siloed. In this day and age your data is siloed, and your business is siloed in different business units. So this helps exactly that: collaborate, crowdsource, bring it all together. One platform. >>And how many, so, 1,700 applications, and you mentioned the 36 or so ERPs: what percentage, if you can guess, have you been able to reduce duplicate, triplicate, et cetera applications? And what are some of the overarching business benefits that Direct Energy is achieving? >>So in terms of the Direct Energy side, we're just at the beginning of our journey. We're about four months in. We've already decommissioned 12 of the applications, and we're starting to move out to the wider estate. In terms of benefits, the ROI is probably around 300% at the moment. >>A 300% ROI in just a few months. >>Just now, you know, you've got some of the basic savings around the storage side. We're also getting large savings from some of the existing support agreements that we have in place. David touched on DataOps: I've been able to reduce the amount of people that are required to support the team. There is now a more common understanding within the organization, and the ability to turn it more into a self-service opportunity with the business operations, by pushing it along from being a technical problem to a business challenge. And at the end of the day, they're the experts; they understand the data better than any IT folk that sat in a corner, right? 
So I'm
So I'm >>gonna ask you one more question. What gave you the confidence that I Oh, Tahoe was the right solution for you >>purely down Thio three Open Soul site. So we come from a you know I've been using. I'll tell whole probably for about two years in parts of the organization. We were very early. Adopters are over technologies in the open source market, and it was just the ability thio on the proof of concept to be able to turn it around iTunes, where you'll go to a traditional vendor, which would take a few months large business cases. They need any of that. We were able to show results within 24 48 hours on now buys the confidence. And I'm sure David would take the challenge of being able to plug in some day. It says on to show you the day. >>Cool stuff, guys. Well, thank you for sharing with us what you guys are doing. And I have a Iot Tahoe keeping up data Lake Blue and the successes that you're cheating in such a short time, but direct energy. I appreciate your time, guys. Thank you. Excellent. Our pleasure. >>No, you'll day. >>Exactly know your data. My guests and my co host, Justin Warren. I'm Lisa Martin. I'm gonna go often. Learn my data. Now you've been watching the Cube and AWS reinvent 19. Thanks for watching
Kickoff | Global Cloud & Blockchain Summit 2018
>> Live from Toronto, Canada, it's theCUBE, covering Global Cloud and Blockchain Summit 2018. Brought to you by theCUBE. >> Hello everyone, welcome to the live coverage here in Toronto for the Global Cloud and Blockchain Summit, put on prior to the big event this week, the Futurist Conference. TheCUBE will be here all week with live coverage. I'm John Furrier with Dave Vellante as we expand our coverage with theCUBE into the blockchain and crypto token economics world. We're here on the ground. We're covering the best events. We started 2018 by initiating CUBE coverage of the sector; of course we've been covering Bitcoin and blockchain going back to 2011 on SiliconANGLE.com. Dave, we're here to kick off what is the inaugural event of its kind, combining cloud computing coverage with blockchain, and as we had on our fireside chat last night, we discussed this in detail: cloud computing and blockchain, either going to be a collision course or it's going to be a nice integration. And we discussed that. This is what this show is all about; it's really about connecting the dots to the future, the role that cloud computing will play with blockchain and token economics, a variety of different perspectives. But again, this is the first time we in the industry are starting to unpack the mega-trend of cloud computing, which we know is like a freight train powering and disrupting, and we cover it in detail. But blockchain is certainly transforming and reimagining business and process coming together. >> Well, we're here in Toronto, which of course is the birthplace of Ethereum, and it's interesting to see how Toronto has attracted so many developers in the software and engineering space, and there's a huge crypto community here. I'll give you my take on the cloud and blockchain. I don't see them on a collision course. I see blockchain, and we've talked about this, and crypto as a part of this other layer that's emerging.
You had the internet, you had the web. On top of that you had cloud, mobile, social, big data, and it was essentially a cloud of remote services. What we're seeing now is this ubiquitous set of digital services, of which blockchain is one. And to me it's all about automation, machine intelligence, blockchain being able to do things without middlemen. You made that point last night on the fireside chat. And I think it's complementary. You need cloud for scale. Everything's digital, which means data. And you need machine intelligence for automation. And that is the new era that we're entering, and blockchain is playing a big part of that because of its inherent encryption, its immutability, and its ability to show proof of work. So it's a key component of a number of different digital services that are going to transform virtually every industry. >> Certainly, then, that's a tailwind for the industry, and certainly we see that. All the alpha entrepreneurs, alpha geeks, and a lot of the business pros see blockchain and token economics as a dynamic that will certainly change things. Here in Toronto this week, certainly not a good week for pricing of currencies. The crypto market is down, Ethereum and Ripple are at yearly lows, and communities are kind of getting scared. We talked with Matt Roszak, an early investor and founder of BloQ, last night about the price declines, and he said, "I've seen this pattern before. These price selloffs also kick off the next wave of growth." So there's a kind of a weeding out, was his perspective. But you can't deny that over the past 24 hours, 30 billion has been erased from the crypto market caps, and the greatest declines are happening outside Bitcoin, whose dominance has still increased, over 56% on the year. So Bitcoin seems to be holding more value than, say, Ethereum. Ethereum and Ripple are really under a lot of pressure. So the insiders, some are scared, some are like, hey, we've seen this movie before.
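Dave's point earlier in this segment about blockchain's immutability can be made concrete with a toy hash-linked chain. There is no consensus, mining or proof of work here; the sketch only shows the tamper-evidence property, and every name in it is hypothetical:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, which include the previous block's hash;
    # that link is what makes history tamper-evident.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(transactions):
    chain, prev = [], "0" * 64
    for tx in transactions:
        block = {"tx": tx, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain, prev  # prev is now the head hash

def verify(chain, head):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return prev == head

chain, head = build_chain(["alice->bob:5", "bob->carol:2"])
ok_before = verify(chain, head)
chain[0]["tx"] = "alice->bob:500"   # tamper with history
ok_after = verify(chain, head)
```

Changing any historical transaction changes that block's hash, which breaks the link stored in the next block, so verification against the head hash fails.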
Waves are a little bit rough right now, but they're in for the long game. So this is a long game going on and then there's also money being lost. >> Well, Matt was saying bet the farm now. He said he's seen this before. Take everything, the mortgage, the house. I'm not sure I would advise doing that, but this is the time, buy low. So just for the numbers, Bitcoin's high last November/December was 19,000, it's down at 6,000 now. So as you say, it's still up almost 50% for the year, but if you compress that timeframe to nine months, it's down 60%, so very, very volatile. Ethereum, on the other hand, last September was trading at around 240, 250, and today it's in the 260s. So back to where it was last September. The curve on Ethereum sort of looks like it did end of last summer, whereas Bitcoin is still almost 70% up from where it was last September. So quite a bit of difference between the two cryptocurrencies. And you mentioned Ripple, IOTA, many of the cryptocurrencies-- >> Ripple's dropping 90% from its 2018 highs. 90%. (both laugh) Some money was made and lost on that one, so again, we always say when the music stops you better be sitting in a chair. Otherwise this is bubble behavior, but you know Matt and others and the insiders are saying they're still bullish because of the pattern. Even though it's a selloff, it's a weeding out process and they see still good deals going on. And again, this is going to come fundamentally down to whose technology's going to be adopted, what kind of application can be written on blockchain, which is seeing some promise in the enterprise. Just yesterday Microsoft announced a blockchain as a service kind of thing with proof of authority and new concepts. IBM, we've been covering IBM with blockchain, their work with the Hyperledger standards. You've got the enterprises. Amazon has kind of telegraphed, they actually put a professional service note out where they are doing some blockchain. 
The big clouds are getting into the game, so the question is, will the clouds suck all the oxygen out of the blockchain room, and will there be room for other blockchains? Again, this is the big debate. Is it going to be a fragmentation of a series of blockchains, or will there be some sort of set of standards? Again, we don't know what the stack's going to look like, because the best thing about blockchain is you could roll it out and implement a portion of the stack and still coexist with whatever standards emerge. So again, these are the questions. >> Well, one of the conversations that of course is going on is, actually, the number of transactions that's occurring with Bitcoin is way down, it's probably down 20% year to date. The other conversation is, we all know that with Bitcoin and Ethereum, the transaction volumes can't really support what we do with Visa or even Amazon. There's a discussion in the industry going around about, what if Amazon chose some other coin? Like Ripple, for example, which has much higher transaction volumes. Or what if Amazon tokenized its own business, came up with its own cryptocurrency? What would that do to the price of Bitcoin, if all of a sudden you could transact in Prime using AmazonCoin or something like that? And we know that Amazon understands how to scale, it obviously understands cloud. That's why I do see cloud and blockchain as complementary. It's very difficult to predict the future. There are those who say Bitcoin is the standard, it's got the brand. There are those who say that Ethereum, because it's much more flexible and you can program distributed apps with it, has a great future. And then everybody points to the transaction volumes and says, this is just a Petri dish for the future, where new technologies will emerge that scale better and can produce.
>> What's interesting last night on the, we had a fireside chat with Al Burgio, serial entrepreneur, founder of DigitalBits, and Matt Roszak, obviously founder of BloQ and investor, he's on the Forbes billionaire list, super active, very engaged on a lot of advisory boards, Binance is one, and many other deals he's done, it's interesting, you got two perspectives. Al is the networking guy who knows plumbing, knows how networks work, and Matt's a token economics genius. So the two have interesting perspectives and the battle royal going on right now, in my opinion, is two things. I think token economics is a wonderful thing that's going to happen no matter what the standards are, 'cause token economics really is the value to me of the cryptocurrency that can be applied to new business models and efficiencies. The blockchain is a land grab, and here's why. I think whoever can nail the plumbing and the pipes of the infrastructure reminds me of the early days of the dial-up web, when you had points of presence and the infrastructure had to be laid down. Although slow, people could dial up and get the internet, then obviously the internet got faster and faster. Blockchain's struggling with those scalability and performance issues, and so the question is, on a public blockchain, you've got to have the supernodes, you've got to have the core infrastructure plumbing nailed. I think Al Burgio takes that perspective. Then everything else just will flourish from there. So the question is, what do those hurdles look like? And this is where the cloud guys could either be an enabler or they could be a foe against the core community. Like you said, Amazon could just snap their fingers tomorrow and take out the entire industry with one move. Just, we're going to do our own blockchain as a service. Everyone uses it, here's our token, and then a set of sub-tokens would have to be coexisting with that. And that could be a good thing, we don't know. This is the discussion.
>> And governments around the world could do the same. US government could do Fedcoin, the Chinese government could do Chinacoin. I mean, what would that do to the prices of cryptocurrencies? I mean, it would send it into a tailspin, you would presume. And it was interesting. Matt Roszak on your panel last night, I asked the question, will traditional banks lose control of the payment systems? And granted, he's biased, and he was definitive. Yes, absolutely. But the counterargument to that, John, and I'd love your thoughts on this, is the US government and the banks have a lot to lose. And they're kind of in bed together and always have been. So one would think, with the backing of the US, its might, its military, et cetera, that they're not just going to let the banks lose control. Now, his point is, why do you need to pay transaction fees to a bank? But you're paying transaction fees to somebody, even in crypto. >> I think our government in the United States is really asleep at the wheel on this one. And here's why. One of the beautiful things about the internet was it was started through collaboration in the universities in the United States. The United States enabled the internet to happen, and the Department of Commerce managed it. The Domain Name System was managed in a very community-oriented way. Again, community, keyword. As opposed to all this, that history is well-documented. If people aren't familiar with the history of the Domain Name System, DNS, go check out the Wikipedia, research it. It was run by a bunch of people who managed the database of website names. And that became sacred and was distributed. >> And funded by the US government. >> Funded by the US government, but the community managed it. The problem with the US government today is that they are meddling in areas that they actually shouldn't even be playing in.
You got the SEC, it's shutting down everything right now just by the threat of subpoenas in the ICO market, which puts the overall country into a handicapped position, because now the innovation of blockchain and the entrepreneurial innovation that's happening is stunted, and it's just shifting outside the United States. So what's happening is the money flow and the energy and the activity is so high that incubation's not happening in the United States, although a lot of people are working on it. There's no funding mechanism. The capital formation of blockchain's different than venture. It's not super different, but somewhat different, but it's happening outside the United States. Certainly the Chinese will benefit from this. And if the Chinese wanted to shut down blockchain they would have done it by now. They're actually fostering it, and it's an opportunity for someone on the international stage to gain leverage over the United States. So that's one. The second thing is they can enable crypto if they wanted to, and I think they really should look at that, and I think the banks, the central organizations, the World Bank, they're under a lot of pressure. They don't know what to do. So when I talk to people, that's the same answer in so many words: the government and the regulators really just don't know what to do. >> Well, and Matt made the point last night, Matt Roszak, that when he talks to these banks they're talking about using blockchain and they're very excited because they're going to take hundreds of millions of dollars of cost out of their, you know, infrastructure and their processes that are just not very efficient, and that's going to drop right to the bottom line. And of course they're in the money business, so that gets them very excited. His point was that's really not what it's about. Yeah, that's nice, but it's really about transforming the businesses, and that's why I asked the question about banks losing control of the payment systems.
Opens up whole new opportunities, whether it's financial services, healthcare, automotive. And again, to me, it comes back to digital, which is data plus machine intelligence plus cloud for scale. You called it. I think at IBM Think, you coined it the innovation sandwich. Data plus machine intelligence plus cloud for scale. Put that together, that is the innovation engine for the next decade plus. >> The innovation sandwich, unlike a wish sandwich, where you wish you had some meat in the middle. You know, this is a good point. Let's end this kickoff and get into some of the interviews here with these really early thought leaders in this new conference. This is the first of its kind, cloud and blockchain, and we're going to certainly continue this in Silicon Valley with theCUBE summit coming up and our events that we do. But let's get some predictions out, because remember, this is theCUBE. Everything's going to be out there, it's going to be on the record, so we can look back and say, hey Dave, remember in 2018 when I asked you what's going to happen? So let's get into a prediction. What do you think's going to happen? I'll start and you can think up an answer. So here's my prediction on this whole blockchain world. Not so much crypto or token economics. It's really two predictions. With respect to blockchain, I think you're going to see the exact movement that the cloud market took, and I think it's going to happen in three phases. Phase one is all the energy's going to go into public blockchain, and public blockchain will be figured out first, and people are going to get excited by the new operational models of blockchain, specifically the decentralization of how that works and the benefits of decentralized blockchain: immutability, no central authority, and all the benefits of blockchain. I think there's going to be very rapid growth in the fixing of blockchain's problems. Speed, scale, that's going to happen very quickly. And it's going to happen publicly.
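The immutability being described comes from each block committing to a cryptographic hash of its predecessor, so rewriting any old block invalidates every later link. A toy sketch of the idea (illustrative only; real chains add consensus, signatures, and Merkle trees):

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's canonical JSON form.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev": prev_hash}

def verify(chain):
    # Each block must record the hash of the block before it.
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != block_hash(prev):
            return False
    return True

genesis = make_block("genesis", "0" * 64)
b1 = make_block("tx: alice -> bob", block_hash(genesis))
b2 = make_block("tx: bob -> carol", block_hash(b1))
chain = [genesis, b1, b2]

print(verify(chain))          # True
genesis["data"] = "tampered"  # rewriting history...
print(verify(chain))          # False: every later link breaks
```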
Then you're going to see private blockchains. You're going to see on-premises kind of like blockchain. Kind of like the cloud, people have onsite, private. And then you're going to see a hybrid. The hybrid will look like multi-chain solutions. This is almost an exact trajectory that cloud computing took, because blockchain feels like a cousin of cloud or a brother or a sister. So it's related, but not exactly, but I think it's kind of the same trajectory. Public, private, hybrid, which is a multi-chain model, and I think that's going to be the standards. That's going to be the market track. On the token side, I think you're going to see a couple key tokens, like certainly Bitcoin's not going away. I'd be doubling down on Bitcoin under 6,000, like everything on that. That should hit 20,000, in my opinion, over the next timeframe. But there's going to be a lot of token integrations. My token integrates with your token, and almost native and secondary tokens kind of blending together, where people will have coexisting tokens on one platform. So it's just too powerful not to have that happen. So that's my prediction. What do you think? >> I think as it relates to blockchain, I think blockchain becomes, in the enterprise I think it becomes an invisible component of virtually every industry. 'Cause every industry has waste, can improve efficiencies, and blockchain becomes a way to, whether it's supply chain or settlements or shared ledgers, I mean, there are dozens of applications for them and I think blockchain becomes a fundamental component of a digital infrastructure, and it's starting now and I think it's here to stay for many decades and beyond. And you won't even see it. It's just going to be there. It's going to become a fundamental part of how we do business. On the token side, very interesting, obviously, hard to predict. I think that you're going to see continued volatility, of course, I think that's a safe bet.
But I also think it's potentially going to get worse before it gets better. I think there's going to be a shakeout. I think you're going to see, there continues to be pump and dump scams going on, the US government's getting more aggressive, a bunch of subpoenas went out, and people are still trying to understand what that all means. So I think it's going to be rocky roads for a while. I think you're going to see a big shakeout, like a big dip, and then I think it's going to power back. I think the crypto is here to stay. And it's very, very hard to time these markets, so my advice is just buy, trickle buys on the way down and hold. HODL, as they say in this world. And I think 10 years from now it's going to be worth a lot. >> Alright, you got it here, theCUBE. We are in Toronto for the first inaugural Global Cloud and Blockchain Summit. Of course, part of the big event here in Toronto, Futurist Conference, which we'll be there live. Wednesday and Thursday, the kickoff is Tuesday night for the opening reception. It's theCUBE coverage continuing for blockchain and crypto markets. I'm John Furrier with Dave Vellante. Stay with us for more live coverage here in Toronto.
Carlo Vaiti | DataWorks Summit Europe 2017
>> Announcer: You are CUBE Alumni. Live from Munich, Germany, it's theCUBE. Covering DataWorks Summit Europe 2017. Brought to you by Hortonworks. >> Hello, everyone, welcome back to live coverage at DataWorks 2017, I'm John Furrier with my cohost, Dave Vellante. Two days of coverage here in Munich, Germany, covering Hortonworks and Yahoo, presenting Hadoop Summit, now called DataWorks 2017. Our next guest is Carlo Vaiti, who's the HPE chief technology strategist, EMEA Digital Solutions, Europe, Middle East, and Africa. Welcome to theCUBE. >> Thank you, John. >> So we were just chatting before we came on, of your historic background at IBM, Oracle, and now HPE, and now back into the saddle there. >> Don't forget Sun Microsystems. >> Sun Microsystems, sorry, Sun, yeah. I mean, great, great run. >> It was a long run. >> You've seen the computer revolution happen. I worked at HP for nine years, from '88 to '97. Again, Dave was a premier analyst during that run of client-server. We've seen the computer revolution happen. Now we're seeing the digital revolution where the iPhone is now 10 years old, Cloud is booming, data's at the center of the value proposition, so a completely new disruptive capability. >> Carlo: Sure, yes. >> So what are you doing as the CTO, chief technologist for HPE, how are you guys bringing this story together? 'Cause there's so much going on at HPE. You got the services split, you got the software split, and HP's focusing on the new style of IT, as Meg Whitman calls it. >> So, yeah. My role in EMEA is actually all about having basically a visionary kind of strategy role for what's going to be HP in the future, in terms of IT. And one of the things that we are looking at is, we split our strategy in three different aspects, so three transformation areas. The first one which we usually talk about is what I call hybrid IT, right, which is basically making services around either On-Premise or on Cloud for our customer base.
The second one is actually powering the Intelligent Edge, which is actually looking after our collaboration business and the Aruba components we acquired. And the third one, which is in the middle, and that's why I'm here at the DataWorks Summit, is actually the data-analytics aspects. And we have a couple of solutions in there. One is the Enterprise-grade Hadoop, which is part of this. This is actually how we frame the overall strategy for HP. >> It's interesting, Dave and I were talking yesterday, being in Europe, it's obviously a different sideshow, it's smaller than the DataWorks or Hadoop Summit in North America in San Jose, but there's a ton of Internet of things, IoT or IIoT, 'cause here in Germany, obviously, a lot of industrial nations, but in Europe in general, a lot of smart cities initiatives, a lot of mobility, a ton of Internet of things opportunity, more than in the US. >> Absolutely. >> Can you comment on how you guys are tackling the IoT? Because it's an Intelligent Edge, certainly, but it's also data, it's in your wheelhouse. >> Yes, sure. So I'm actually working, it's a good question, because I'm actually working a couple of projects in Eastern Europe, where it's all about Industrial IoT Analytics, IIoTA. That's the new terminology we use. So what we do is actually, we analyze from a business perspective, what are the business pain points, in an oil and gas company for example. And we understand for example, what kind of things that they need and must have. And what I'm saying here is, one of the aspects for example, is the drilling opportunity. So how much oil you can extract from a specific rig in the middle of the North Sea, for example. This is one of the key questions, because the customer wants to understand, in the future, how much oil they can extract. The other one is for example, the upstream business.
So on the retail side, say, when my customer is stopping in a gas station, I want to go in the shop, immediately giving, I dunno, my daughter, a kind of campaign for the Barbie, because they like the Barbie. So IoT, Industrial IoT helps us in actually making a much better customer experience, and that's the case of the upstream business, but it is also helping us in actually much faster business outcomes. And that's what the customer wants, right? 'Cause, as I was saying to your colleague before, I'm talking to the business guy. I'm not talking to the IT anymore in these kinds of places, and that's how IoT allows us a chance to change the conversation at the industry level. >> These are first-time conversations too. You're getting at the kinds of business conversations that weren't possible five years ago. >> Carlo: Yes, sure. >> I mean 10 years ago, they would have seemed fantasy. Now they're reality. >> The role of analytics in my opinion, is becoming extremely key, and as I said this morning, for me the data is the cornerstone of the digital economy. I continue to repeat this terminology, because it's actually where everything is starting from. So what I mean is, let's take a look at the analytic aspect. So if I'm able to analyze the data close to the shop floor, okay, close to the shop manufacturing floor, if I'm able to analyze my data on the rig, in the oil and gas industry, if I'm able to do preprocessing analytics, with Kafka, Druid, these kinds of open-source software, close to the Intelligent Edge, then my customer is going to be happy, because I give them very fast response, and the decision-maker can get to a decision in a faster time. Today, it takes a long time to take these types of decisions. So that's why we want to move into the power Intelligent Edge. >> So you're saying, data's foundational, but if you get to the Intelligent Edge, it's dynamic.
So you have a dynamic, reactive, realtime time series or presence of data, but you need the foundational pre-data. >> Perfect. >> Is that kind of what you're getting at? >> Yes, that's the first step. Preprocessing analytics is what we do. In the next generation, which we think is going to be Industrial IoT Analytics, we're going to actually put massive amounts of compute close to the shop manufacturing floor. We call it, internally and actually externally, convergent planned infrastructure. And that's the key point, right? >> John: Convergent plan? >> Convergent planned infrastructure, CPI. If you look it up in Google, you will find it. It's a solution we brought to the market a few months ago. We announced it in December last year. >> Yeah, Antonio's smart. He also had a converged systems as well. One of the first ones. >> Yeah, so that's converged compute at the edge basically. >> Correct, converged compute-- >> Very powerful. >> Very powerful, and we run analytics on the edge. That's the key point. >> Which we love, because that means you don't have to send everything back to the Cloud because it's too expensive, it's going to take too long, it's not going to work. >> Carlo: The bandwidth on the network is much less. >> There's no way that's going to be successful, unless you go to the edge and-- >> It takes time. >> With a cost. >> Now the other thing is, of course, you've got the Aruba asset, to be able to, I always say, joke, connect the windmill. But, Carlo, can we go back to the IoTA example? >> Carlo: Correct, yeah. >> I want to help, help our audience understand, sort of, the new HP, post these spin merges. So previously you would say, okay, we have Vertica. You still have partnership, or you still own Vertica, but after September 1st-- >> Absolutely, absolutely. It's part of the columnar side-- >> Right, yes, absolutely, but, so. But the new strategy is to be more of a platform for a variety of technology.
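The edge pattern discussed here, crunching raw readings locally and shipping only compact summaries upstream, can be sketched in a few lines (a hypothetical sensor feed for illustration, not any HPE or CPI API):

```python
from statistics import mean

def summarize_window(readings):
    """Reduce one window of raw sensor readings to a compact summary.

    Instead of shipping every sample over the network, the edge node
    forwards only count/min/mean/max per window, so bandwidth scales
    with the number of windows rather than the number of samples.
    """
    return {
        "n": len(readings),
        "min": min(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
    }

# 60 raw pressure samples from a rig sensor (hypothetical values)...
raw = [100 + (i % 7) for i in range(60)]
# ...become a single record to send upstream.
print(summarize_window(raw))
```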
So how for instance would you solve, or did you solve, that problem that you described? What did you actually deliver? >> So again, as I said, we're, especially in the Industrial IoT, we are an ecosystem, okay? So we're one element of the ecosystem solution. For the oil and gas specifically, we're working with other system integrators. We're working with oil and gas industry expertise, like the DXC company, right, the company that we just split off a few days ago, and we're working with them. They're providing the industry expertise. We are an infrastructure provider around that, and the services around that for the infrastructure element. But for the industry expertise, we try to have a kind of little bit of knowledge, to start the conversation with the customer. But again, my role in the strategy is actually to be an ecosystem digital integrator. That's the new terminology we like to bring to the market, because we really believe that's the way the HP role is going to be. And the relevance of HP is totally dependent on whether we are going to be successful in these types of things. >> Okay, now a couple other things you talked about in your keynote. I'm just going to list them, and then we can go wherever we want. There was Data Lake 3.0, Storage Disaggregation, which is kind of interesting, 'cause it's been a problem. Hadoop as a service, Realtime Everywhere, and then Analytics at the Edge, which we kind of just talked about. Let's pick one. Let's start with Data Lake 3.0. What is that? John doesn't like the term data lake. He likes data ocean. >> I like data ocean. >> Is Data Lake 3.0 becoming an ocean? >> It's becoming an ocean. So, Data Lake 3.0 for us is actually following what is going to be the future for HDFS 3.0. So we have three elements. The erasure coding feature, which is coming in HDFS. The second element is around having an HDFS data tier, a multi-data tier. So we're going to have faster SSD drives. We're going to have big memory nodes. We're going to have GPU nodes.
And the reason why I say disaggregation is because some of the workload will be only compute, and some of the workload will be only storage, okay? So we're going to bring, and the customers require this, because they're getting more data, and they need to have for example, YARN applications running on compute nodes, and at the same level, they want to have storage compute block, sorry, storage components, running on the storage model, like HBase for example, like HDFS 3.0 with the multi-tier option. So that's why the data disaggregation, or disaggregation between compute and storage, is the key point. We call this asymmetric, right? Hadoop is becoming asymmetric. That's what it means. >> And the problem you're solving there, is when I add a node to a cluster, I don't have to add compute and storage together, I can disaggregate and choose whatever I need, >> Everyone that we did. >> based on the workload. >> They are all multitenancy kinds of workloads, and they are independent and they scale out. Of course, it's much more complex, but we have actually proved that this is the way to go, because that's what the customer is demanding. >> So, 3.0 is actually functional. It's erasure coding, you said. There's a data tier. You've got different memory levels. >> And I forgot to mention, the containerization of the applications. Having dockerized applications for example. Using Mesosphere for example, right? So having the containerization of the applications is what all of that means, because what we do in Hadoop, we actually build the different clusters, they need to talk to each other, and exchange data in a faster way. And a solution like, a product like SQL Manager, from Hortonworks, is actually helping us to get this connection between the clusters faster and faster. And that's what the customer wants. >> And then Hadoop as a service, is that an on-premise solution, is that a hybrid solution, is it a Cloud solution, all three? >> I can offer all of them.
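The erasure coding feature Carlo mentions is the main storage saver in HDFS 3.0: plain 3x replication stores three raw bytes per logical byte, while a Reed-Solomon 6-data/3-parity layout stores 1.5 and still survives three lost blocks. A back-of-envelope comparison (illustrative arithmetic only):

```python
def replication_overhead(replicas):
    # Raw bytes stored per logical byte under plain replication.
    return float(replicas)

def erasure_overhead(data_blocks, parity_blocks):
    # Raw bytes stored per logical byte under Reed-Solomon striping.
    return (data_blocks + parity_blocks) / data_blocks

print(replication_overhead(3))  # 3.0 -> 200% overhead, tolerates 2 lost copies
print(erasure_overhead(6, 3))   # 1.5 -> 50% overhead, tolerates 3 lost blocks
```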
Hadoop as a service could be run on-premise, could be run on a public Cloud, could be run on Azure, or could be a mix of them, partially on-premise, and partially on public. >> And what are you seeing with regard to customer adoption of Cloud, and specifically around Hadoop and big data? >> I think the way I see that adoption is all the customers want to start very small. The maturity is actually better from a technology standpoint. If you were asking me the same question maybe a year ago, I would say, it's difficult. Now I think they've got the point. Every large customer, they want to build this big data ocean, or lake, whatever you want to call it. >> John: Love that. (laughs) >> All right. They want to build this data ocean, and the point I want to make is, they want to start small, but they want to think very high. Very big, right, from their perspective. And the way they approach us is, we have a kind of methodology. We establish the maturity assessment. We do a kind of capability maturity assessment, where we find out if the customer is actually a pioneer, or is actually a very traditional one, so it's very slow-going. Once we determine where the stage of the customer is, we propose some specific proof of concept. And in three months usually, we're putting this in place. >> You also talked about realtime everywhere. We in our research, we talk about the, historically, you had batch or interactive, and now you have what we call continuous, or realtime streaming workloads. How prevalent is that? Where do you see it going in the future? >> So I think this is another trend for the future, as I mentioned this morning in my presentation. Spark, the open-source in-memory engine, is actually the core of this stuff. We see 60 to 70 times faster analytics, compared to not using Spark. So many customers implemented Spark because of this.
The requirements are that the customer needs an immediate response time, okay, for a specific decision-making that they have to do, in order to improve their business, in order to improve their life. But this requires a different architecture. >> I have a question, 'cause you, you've lived in the United States, you're obviously global, and spent a lot of time in Europe as well, and a lot of times, people want to discuss the differences between, let's make it specific here, the European continent and North America, and from a sophistication standpoint, same, we can agree on that, but there are still differences. Maybe, greater privacy concerns. The whole thing with the Cloud and the NSA in the United States, created some concerns. What do you see as the differences today between North America and Europe? >> From my perspective, I think we are much more, for example take IoT, Industrial IoT. I think in Europe we are much more advanced. I think in the manufacturing and the automotive space, the connected car kind of things, autonomous driving, this is something that we know already how to manage, how to do it. I mean, Tesla in the US is a good example that what I'm saying is not true, but if I look at for example, the large German car manufacturers, they always implemented these types of things already today. >> Dave: For years, yeah. >> That's the difference, right? I think the second step is about the faster analytic approach. So what I mentioned before. The Power the Intelligent Edge, in my opinion at the moment, is much more advanced in the US compared to Europe. But I think Europe is starting to run back, and going on the same route. Because we believe that putting compute capacity on the edge is what actually the customer wants. But those are the two big differences I see. >> The other two big external factors that we like to look at, are Brexit and Trump. So (laughs) how 'about Brexit?
Now that it's starting to actually begin the process, how should we think about it? Is it overblown? Is it critical? What's your take? >> Well, I think it's too early to say. The UK just split a few days ago, right, officially. It's going to take another 18 months before it's going to be completed. From a commercial standpoint, we don't see any difference so far. We're actually working the same way. For me it's too early to say if there's going to be any implication on that. >> And we don't know about Trump. We don't have to talk about it, but the, but I saw some data recently that's, European sentiment, business sentiment is trending stronger than the US, which is different than it's been for the last many years. What do you see in terms of just sentiment, business conditions in Europe? Do you see a pick up? >> It's getting better, it is getting better. I mean, if I look at the major countries, the P&L is going positive, 1.5%. So I think from that perspective, we are getting better. Of course we are still suffering from the Chinese, and Japanese market sometimes. Especially in some of the big large deals. The inclusion of the Japanese market, I feel it, and the Chinese market, I feel that. But I think the economy is going to be okay, so it's going to be good. >> Carlo, I want to thank you for coming on and sharing your insight, final question for you. You're new to HPE, okay. We have a lot of history, obviously I was, spent a long part of my career there, early in my career. Dave and I have covered the transformation of HP for many, many years, with theCUBE certainly. What attracted you to HP and what would you say is going on at HP from your standpoint, that people should know about? >> So I think the number one thing is that for us the word is going to be hybrid. It means that some of the services that you can implement, either on-premise or on Cloud, could be done very well by the new Pointnext organization. I'm not part of Pointnext.
I'm in the EG, Enterprise Group division. But I am a fan of Pointnext because I believe this is the future of our company, it's on the services side, that's where it's going. >> I would just point out, Dave and I, our commentary on the spin merge has been, create these highly cohesive entities, very focused. Antonio now running EG, big fans, of where it's actually an efficient business model. >> Carlo: Absolutely. >> And Chris Hsu is running the Micro Focus, CUBE Alumni. >> Carlo: It's a very efficient model, yes. >> Well, congratulations and thanks for coming on and sharing your insights here in Europe. And certainly it is an IoT world, IIoT. I love the analytics story, foundational services. It's going to be great, open source powering it, and this is theCUBE, opening up our content, and sharing that with you. I'm John Furrier, Dave Vellante. Stay with us for more great coverage, here from Munich after the short break.