Yusef Khan


 

(gentle music)
>> From around the globe, it's theCUBE, presenting Building Immersive Customer Experiences with Customer Data 360. Brought to you by Io-Tahoe.
>> Hello everyone, and welcome back to Io-Tahoe's seventh installment of their Data Automation series, Building Immersive Customer Experiences with Customer Data 360. Now, in this first segment, we're going to catch up with Yusef Khan, who is Io-Tahoe's Head of Data Services. Yusef, always great to see you. Welcome back to theCUBE.
>> Thank you, Dave. It's great to be back. Thank you for having me.
>> Our pleasure. So let's talk about Customer Data 360. What does that actually mean in terms of the data? Give us a little background here.
>> Well, Dave, we're living in a world now where customer expectations are really, really high. A world in which the customer ethos, if you like, is almost "talk to me like you love me," and that attitude is pretty common. So it's a world in which, if you've shared your data with an organization, you absolutely expect that organization, that company, to optimize your experience using that data. And when it comes to data, these very high expectations can be challenging to meet, and there are several reasons for that. To mention just a few: an enterprise can have many diverse data sources, it can have customer records that are duplicated or incomplete, and the data quality itself can be poor. What Customer Data 360 does is help enterprises understand their data estates, get more insight on their customer base, improve data quality, and then ultimately improve their customer experience and bring it in line with the expectations of today's customers.
>> Great, thank you for that. Well, so maybe not "love me," but at least "know me," right? So, poor data quality, and I think we can all relate to this: you call a service provider, they either have old data or bad data, you sometimes get double billed, and it's up to you to figure that out.
So, can the 360-degree view help with this problem? How so? What data does it generate to address this?
>> Yeah, absolutely, it can help. Customer Data 360 allows organizations to produce a fundamentally more personalized experience for customers. It helps eliminate the often generic sales pitches people get in email or in social media ads, and it helps curate recommendations that add genuine value to that specific customer. So for example, if you typically buy three products from a certain brand every month, that data is going to be tracked, saved for the future, and it will make the next month's shopping more convenient by suggesting the same products or complementary products. Not only that, Customer Data 360 will track purchases across all touchpoints and understand the customer in the round, so across in-store, online, and mobile app, tracking all those patterns. At the same time, all your data is kept secure and private, and it's only used in ways that you expect it to be used.
>> Well, to me, this is really, really important. I mean, especially after this year, we've seen online purchases go through the roof. (chuckles) Every time I buy something, I get an ad for that something for the next week, until I turn it off. I mean, it's clear that the state of data still has a way to go based on the quality, and so you're addressing that. But take us through the process of identifying, for instance, incorrect data or duplicate customer data. How do you do that?
>> Well, Dave, customer data changes so frequently. For example, people get married and there are name changes. People move homes, so the address changes. Emails change or get updated, people change phones or phone numbers. The list goes on. Customer Data 360 identifies records that probably belong to the same customer and offers a unified view of the customer for insights and for campaigns. It also offers a single household view, helping to link together data from customers based at the same address.
And then finally, it gives you a data target operating model to help drive continuous improvement through the enterprise. This means it helps embed the right process and culture within the organization's people, as well as the technology.
>> So Yusef, just a quick aside, if I may. Essentially, I presume you're using some kind of machine intelligence, which we've talked about before, to infer from and triangulate different data points and identify the probability that this individual is the same person, right? And then making that call.
>> Yeah. Using machine learning and algorithms, you're able to do this much more quickly, much more effectively, and much more cost-effectively than doing it via manual methods. Sometimes, using manual methods, it's not really possible to do this type of work at all. So absolutely, there is a technological core backend that enables this work.
>> Yeah, the manual approach just doesn't scale, and humans frankly aren't that good at it. So besides incorrect customer data, what other kinds of challenges are companies facing, and how are you addressing those?
>> There are lots of different challenges. The data quality itself may be poor, so you've got the classic "I've got the wrong address for that customer" or "the wrong email address," and that can happen multiple times over if you've got multiple records for each customer as well. The customer's age might not be there, which can be quite critical for streaming and other online services: who's really a child and who's an adult? That can be very, very key for consent and things like that. Data relationships and data lineage may be unclear. Updating one system may not flow through into another system. Marketing and other permissions may not be captured correctly. And even sensitive data, PII, Personally Identifiable Information, may be spread through the enterprise with no real understanding of where it is.
And finally, there are cultural factors. Individual functions may jealously guard their own databases, and they may not share data in a way that's collaborative or useful for the whole enterprise.
>> Great, thank you for that. So, the big picture is that this drops right to my bottom line. I mean, if I'm sending duplicate communications, physical flyers, snail mail to the same household, people are just tossing it and they get frustrated. Or if I'm unknowingly giving minors access to restricted information, and we've seen horror shows like that before, if that happens, you're going to lose customers and you're going to lose money. We all know the cost of losing customers is much, much higher (chuckles) than the cost of acquiring them, and once you have to win them back, forget it: it's three or four times what it originally cost. Where is Io-Tahoe going to address this and remediate these problems?
>> Well, Customer Data 360 really starts by understanding and fixing the fundamentals. So it starts by helping the customer understand their data estate: mapping the data relationships and the data lineage, automatically populating a data catalog so the customer knows what they have in terms of data, automatically assessing data quality and recommending how it can be improved, and automatically analyzing data record duplication and data source redundancy. The customer can then get to a single view of the customer and the household, as we said, and this is enabled by the data target operating model, which embeds the process and drives continuous improvement. The enterprise can then deploy raw data for analytics, model building, and data science, productionize those models and related pipelines, and use them to start pushing out relevant messages and offers to customers. Obviously then, you capture the results. You use those to refine the offering and continuously improve, win customers, win friends, influence people, and grow revenue, times a thousand.
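The probabilistic record matching Yusef describes, scoring how likely it is that two records belong to the same customer, can be illustrated with a toy rule. This is a minimal sketch using simple string similarity: the field names, weights, and the 0.8 threshold are invented for the example, and Io-Tahoe's actual matching is machine-learning-driven rather than a hand-tuned rule like this.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted evidence that two customer records refer to the same person.
    The field weights here are illustrative only."""
    weights = {"name": 0.4, "email": 0.4, "address": 0.2}
    return sum(w * similarity(rec_a.get(f, ""), rec_b.get(f, ""))
               for f, w in weights.items())

def is_same_customer(rec_a: dict, rec_b: dict, threshold: float = 0.8) -> bool:
    """Make the call Dave asks about: same person, or not?"""
    return match_score(rec_a, rec_b) >= threshold

a = {"name": "Yusef Khan", "email": "y.khan@example.com", "address": "1 High St"}
b = {"name": "Yusef Kahn", "email": "y.khan@example.com", "address": "1 High Street"}
c = {"name": "Dave Smith", "email": "dave@example.org", "address": "99 Oak Ave"}
print(is_same_customer(a, b))  # a typo and an abbreviation still match: True
print(is_same_customer(a, c))  # clearly a different person: False
```

A production matcher would also normalize addresses, use phonetic encodings, and learn the weights from labeled pairs, but the shape of the decision, a weighted evidence score against a threshold, is the same.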
>> So, I've got to ask you another aside, if I may. We've talked about this in previous episodes. A lot of this, correct me if I'm wrong, involves data source issues as well. You may not know that an address has changed, but there may be other data sources that you can ingest where the address has changed, and you can bring that into your platform. But oftentimes, organizations don't want to do that. They don't want to add the data source; it's too complex, it adds more data quality issues, so it's somewhat of a challenge. So, I'm just kind of connecting the dots from previous conversations that we've had. You know, we're at number seven now, and I can start to see this coming together. Maybe you could comment on that data source challenge.
>> Yeah, absolutely. Organizations often have, I suppose you could call it, dark data, or data that they don't know that they have. So it does partly start with going back to the fundamentals: what data do you hold? Rationalizing that data, using automated processes and machine learning to do it more rapidly and effectively, getting to a single view of the customer, and then using that in all the ways that advanced analytics and data science give you these days, to get to a better customer experience and a better customer outcome. But as you say, a lot of that starts with identifying and understanding your data sources in the first place.
>> Well, I've been watching your progress since COVID began, and you're making some good moves here, Yusef. Always great to catch up. I really appreciate your time and insights.
>> Thank you, Dave. Nice to speak to you. Thanks for having me.
>> Our pleasure. Okay, don't go away folks. Up next, we've got Ajay Vohora. He's the CEO of Io-Tahoe, and he's going to be joined by MongoDB's principal solutions architect, talking through how to build modern apps using data RPA. Keep it right there, we'll be right back. (gentle music)

Published Date : Jun 22 2021



Glenn Grossman and Yusef Khan | Io-Tahoe ActiveDQ Intelligent Automation


 

>> From around the globe, it's theCUBE, presenting ActiveDQ Intelligent Automation for data quality, brought to you by Io-Tahoe.
>> Welcome to the sixth episode of the Io-Tahoe Data Automation series on theCUBE. We're going to start off with a segment on how to accelerate the adoption of Snowflake, with Glenn Grossman, who is an enterprise account executive from Snowflake, and Yusef Khan, the Head of Data Services from Io-Tahoe. Gentlemen, welcome.
>> Good afternoon, good morning, good evening, Dave.
>> Good to see you, Dave. Good to see you.
>> Okay, Glenn, let's start with you. theCUBE hosted the Snowflake Data Cloud Summit in November, and we heard from customers; I love the tagline, going from zero to Snowflake in 90 minutes, you know, very quickly. And of course you want to make it simple and attractive for enterprises to move data and analytics onto the Snowflake platform. But help us understand: once the data is there, how is Snowflake helping to achieve savings compared to the data lake?
>> Absolutely, Dave, it's a great question. You know, it starts off first with the notion of what we've coined in the industry as T-shirt-size pricing. You don't necessarily always need the performance of a high-end sports car when you're just trying to go get some groceries, driving down the street at 20 mph. The T-shirt-size pricing really aligns, depending on what your operational workload is, to support the business and the value that you need from that business. You don't need data every second of the moment every day; it might be once a day, or once a week. Through that T-shirt-size price, we can align the performance according to the environmental needs of the business, what those drivers are, and the key performance indicators that drive the insight to make better decisions. It allows us to control that cost. So to my point, you don't always need the performance of a Ferrari.
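To make the T-shirt-size model concrete: Snowflake bills warehouses in credits, and each size step roughly doubles the hourly credit rate, so matching the size and running time to the workload is the cost lever being described. A minimal sketch, where the doubling pattern follows Snowflake's published rates but the price per credit and the workload numbers are purely illustrative:

```python
# Credits consumed per hour by warehouse size (Snowflake's documented doubling pattern).
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def monthly_cost(size: str, hours_per_day: float,
                 price_per_credit: float = 3.0, days: int = 30) -> float:
    """Estimated monthly spend for one warehouse, assuming it is suspended
    when idle (per-second billing details are ignored for simplicity)."""
    return CREDITS_PER_HOUR[size] * hours_per_day * days * price_per_credit

# A nightly batch job needs one hour of "L", not 24 hours of "XL".
ferrari = monthly_cost("XL", hours_per_day=24)  # always-on and oversized
civic = monthly_cost("L", hours_per_day=1)      # right-sized and on-demand
print(f"always-on XL: ${ferrari:,.0f}/month")
print(f"1h/day L:     ${civic:,.0f}/month")
```

The point of the analogy falls out of the arithmetic: the same work, sized to the workload, is a small fraction of the always-on cost.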
Maybe you need the performance and gas mileage of a Honda Civic, if that will deliver the value to the business, knowing that you have that entire performance landscape available at a moment's notice. And that's really what allows us to get away from "How much is it going to cost me?" in a data lake type of environment.
>> Got it. Thank you for that. Yusef, where does Io-Tahoe fit into this equation? I mean, what's unique about the approach that you're taking towards this notion of mobilizing data on Snowflake?
>> Well, Dave, in the first instance we profile the data itself at the data level, so not just at the level of metadata, and we do that wherever that data lives. It could be structured data, it could be semi-structured data, it could be unstructured data, and that data could be on premise, in the cloud, or on some kind of SaaS platform. And so we profile this data at the source systems that feed Snowflake, within Snowflake itself, and within the end applications and reports that the Snowflake environment is serving. What we've done here is take our machine learning discovery technology and make Snowflake itself the repository for knowledge and insights on data, and this is pretty unique. Automation in the form of our RPA is being applied to the data before, after, and within Snowflake. So the ultimate outcome is that business users can have a much greater degree of confidence that the data they're using can be trusted. The other thing we do which is unique is employ data RPA to proactively detect and recommend fixes to data quality, and that removes the manual time, effort, and cost it takes to fix those data quality issues if they're left unchecked and untouched.
>> So that's key: two things there, trust, because nobody's going to use the data if it's not trusted, but also context. If you think about it, we've contextualized our operational systems but not our analytic systems. So there's a big step forward, Glenn. I wonder if you can tell us how customers are managing data quality when they migrate to Snowflake, because there's a lot of baggage in traditional data warehouses, data lakes, and data hubs. Maybe you can talk about why this is a challenge for customers, and, for instance, whether you can proactively address some of those challenges that customers face.
>> We certainly can, Dave. You know, legacy data sources are always inherent with data quality issues. Even with master data management and data stewardship programs running over the last almost two decades, you still have systemic data issues. You have siloed data, you have operational data stores and data marts: it became a hodgepodge. When organizations are starting their journey to migrate to the cloud, one of the first things we do is that inspection of data. We even look to retire legacy data sources that aren't used across the enterprise but stayed there because they were part of the long-running operational on-premise technology. When we start to look at data pipelines as we onboard a customer, we want to do QA and quality assurance so that we can, as our ultimate goal, eliminate the garbage-in, garbage-out scenarios we've been plagued with over the last 40 or 50 years of data in general. So we have to take an inspection. Traditionally it was ETL; now, in the world of Snowflake, it's really ELT. We're extracting, we're loading, we're inspecting, and then we're transforming out to the business, so that these routines can be done once, and again give great business value back to making decisions around the data, instead of spending all this time always re-architecting the data pipeline to serve the business.
>> Got it. Thank you, Glenn. Yusef, of course, Snowflake is renowned; customers tell me all the time it's so easy.
It's so easy to spin up a data warehouse, it helps with my security, it simplifies everything. But, you know, getting started is one thing, and adoption is also key. So I'm interested in the role that Io-Tahoe plays in accelerating adoption for new customers.
>> Absolutely, David. I mean, as Glenn said, every migration to Snowflake is going to have a business case, and that is going to be partly about reducing spend on legacy IT: servers, storage, licenses, support, all those good things that CIOs want to be able to turn off entirely, ultimately. And what Io-Tahoe does is help discover all the legacy undocumented silos that have been built up, as Glenn says, on the data estate across a period of time, build intelligence around those silos, and help reduce those legacy costs sooner by accelerating that whole process. Because obviously, the quicker that IT and CDOs can turn off legacy data sources, the more funding and resources are going to be available to them to manage the new Snowflake-based data estate on the cloud. And so turning off the old and building the new go hand in hand, to make sure those numbers stack up, the program is delivered, and the benefits are delivered. So what we're doing here with Io-Tahoe is improving the customer's ROI by accelerating their ability to adopt Snowflake.
>> Great. And I mean, we're talking a lot about data quality here, but in a lot of ways that's table stakes. Like I said, if you don't trust the data, nobody's going to use it. And Glenn, I look at Snowflake and I see, obviously, the ease of use and the simplicity; you guys are nailing that. The data sharing capabilities, I think, are really exciting, because everybody talks about sharing data, but then we talk about data as an asset, and everyone wants to hold on to it. And so sharing is something that I see as a paradigm shift, and you guys are enabling that.
So, what are some of the things beyond data quality that are notable, that customers are excited about, that maybe you're excited about?
>> David, I think you just called it out: it's this massive data sharing play as part of the Data Cloud platform. You know, just as of last year we had a little over 100 vendors in our data marketplace. That number today is well over 450. It is all about democratizing and sharing data in a world that is no longer held back by FTPs and CSVs, and the organization then having to take that data and ingest it into their systems. If you're a Snowflake customer and want to subscribe to an S&P data source, as an example, go subscribe to it. It's in your account; there was no data engineering, there was no physical lift of data. And that becomes the most important thing when we talk about getting broader insights and data quality: the data has already been inspected by your vendor and is just available in your account. It's obviously a very simplistic thing to describe; behind the scenes is what our founders have created to make it very, very easy for us to democratize, not only internally with private sharing of data, but with this notion of the marketplace, sharing across your customers. The marketplace is certainly top of mind for all of my customers. And probably another area you might have heard about out of the recent cloud summit is the introduction of Snowpark, and being able to go where all this data is heading, towards AI and ML, you know, along with our partners at Io-Tahoe and RPA automation. What do we do with all this data? How do we put the algorithms against it now? In the future we'll be able to run R and Python scripts and Java libraries directly inside Snowflake, which allows you to accelerate even faster than what people found traditionally when we started off eight years ago just as a data warehousing platform.
>> Yeah, I think we're on the cusp of a new way of thinking about data.
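Snowpark, mentioned above, is about running this kind of logic inside Snowflake (for instance as a UDF) rather than extracting data to process it. The registration step is omitted here, so this is just a sketch of the row-level function itself, with an invented masking rule of the sort you might push down before sharing data:

```python
def mask_email(email: str) -> str:
    """Keep the first character of the local part and the full domain;
    hide the rest. Per-row logic of the kind worth running where the
    data lives instead of exporting the data to run it."""
    if not email or "@" not in email:
        return "***"
    local, _, domain = email.partition("@")
    masked_local = local[0] + "***" if local else "***"
    return f"{masked_local}@{domain}"

print(mask_email("glenn.grossman@example.com"))  # g***@example.com
print(mask_email("not-an-email"))                # ***
```

The function is plain Python on purpose: the pushdown model is that the code stays the same and only where it executes changes.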
I mean, obviously simplicity is a starting point, but data by its very nature is decentralized. You talk about democratizing data; I like this idea of the global mesh. It's a very powerful concept, and again it's early days, but a key part of this is automation and trust. Yusef, you've worked with Snowflake, and you're bringing ActiveDQ to the market. What are customers telling you so far?
>> Well, David, the feedback so far has been great, which is brilliant. So firstly, there's a point about speed and acceleration, the speed to insight really. Where you have inherent data quality issues, whether that's with data that was on premise and is being brought into Snowflake, or on Snowflake itself, we're able to show the customer results and help them understand their data quality better within day one, which is a fantastic acceleration. Related to that, there's the cost and effort to get that insight: it's a massive productivity gain versus what you see with customers who've been struggling, sometimes, to remediate legacy data and legacy decisions that they've made over the past couple of decades, so that cost and effort is much lower than it would otherwise have been. Thirdly, there's confidence and trust: CDOs and CIOs have got demonstrable results showing they've been able to improve data quality across a whole bunch of use cases, for business users in marketing and customer services, for commercial teams, for financial teams. So there's that very quick growth in confidence and credibility as the projects get moving. And then finally, really all the use cases for Snowflake depend on data quality, whether it's data science or the kind of Snowpark applications that Glenn has talked about. All those use cases work better when we're able to accelerate the ROI for our joint customers by very quickly pushing out these data quality insights.
And I think one of the things that Snowflake have recognized is that in order for CIOs to really adopt enterprise-wide, as well as the great technology Snowflake offers, it's about cleaning up that legacy data estate, freeing up the budget for CIOs to spend on the new, modern data estate that lets them mobilise their data with Snowflake.
>> So you're seeing a sort of progression: we're simplifying the analytics from a tech perspective, you bring in federated governance, which brings more trust, and then you bring in the automation of the data quality piece, which is fundamental. And now you can really start to, as you guys are saying, democratize, scale, and share data. Very powerful. Guys, thanks so much for coming on the program. Really appreciate your time.
>> Thank you. I appreciate it as well. Yeah.

Published Date : Apr 29 2021



Yusef Khan & Suresh Kanniappan | Io Tahoe Enterprise Digital Resilience on Hybrid & Multicloud


 

>> From around the globe, it's theCUBE, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io-Tahoe. Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience within an as-a-service consumption model. We're now joined by Yusef Khan, who heads data services for Io-Tahoe, and Suresh Kanniappan, who's the Vice President and Head of U.S. Sales at Happiest Minds. Gents, welcome to the program. Great to have you in theCUBE.
>> Thank you, David.
>> Suresh, you guys talk about Happiest Minds, this notion of born digital, born agile. I like that. But talk about your mission at the company.
>> Sure. Formed in 2011, Happiest Minds was born digital and born agile as a company. The reason is that we are focused on customers. Our customer-centric approach and delivering digital, seamless solutions have helped us be in the race along with the Tier 1 providers. Our mission, "happiest people, happiest customers," is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the Great Places to Work survey, and our Glassdoor rating of 4.1 against a rating of five is among the top of the Indian IT services companies. That shows the mission and the culture we have built on our values: sharing, mindfulness, integrity, learning, and social responsibility are the core values of our company, and that's where the entire culture of the company has been built.
>> That's great. That sounds like a happy place to be. Now, Yusef, you head up data services for Io-Tahoe. We've talked in the past; of course, you're out of London. What's your day-to-day focus with customers and partners? What are you focused on?
>>Well, David, my team work daily with customers and partners to help them better understand their data, improve their data quality, their data governance on help them make that data more accessible in a self service kind of way. To the stakeholders within those businesses on dis is all a key part of digital resilience that will will come on to talk about but later. You're >>right, e mean, that self service theme is something that we're gonna we're gonna really accelerate this decade, Yussef and so. But I wonder before we get into that, maybe you could talk about the nature of the partnership with happiest minds. You know, why do you guys choose toe work closely together? >>Very good question. Um, we see Io Tahoe on Happiest minds as a great mutual fit. A Suresh has said happiest minds are very agile organization. Um, I think that's one of the key things that attracts their customers on Io. Tahoe is all about automation. We're using machine learning algorithms to make data discovery data cataloging, understanding, data, redundancy, uh, much easier on. We're enabling customers and partners to do it much more quickly. So when you combine our emphasis on automation with the emphasis on agility, the happiest minds have that. That's a really nice combination. Work works very well together, very powerful. I think the other things that a key are both businesses, a serious have said are really innovative digital native type type companies. Um, very focused on newer technologies, the cloud etcetera, uh, on. Then finally, I think that both challenger brands Andi happiest minds have a really positive, fresh ethical approach to people and customers that really resonates with us that I have tied to its >>great thank you for that. So Russia, Let's get into the whole notion of digital resilience. I wanna I wanna sort of set it up with what I see. And maybe you can comment be prior to the pandemic. 
A lot of customers that kind of equated disaster recovery with their business continuance or business resilient strategy, and that's changed almost overnight. How have you seen your clients respond to that? What? I sometimes called the forced march to become a digital business. And maybe you could talk about some of the challenges that they faced along the way. >>Absolutely. So, uh, especially during this pandemic times when you see Dave customers have been having tough times managing their business. So happiest minds. Being a digital Brazilian company, we were able to react much faster in the industry, apart from the other services company. So one of the key things is the organizations trying to adopt onto the digital technologies right there has bean lot off data which has been to managed by these customers on. There have been lot off threats and risk, which has been to manage by the CEO Seo's so happiest minds digital resilient technology fight the where we're bringing the data complaints as a service, we were ableto manage the resilience much ahead off other competitors in the market. We were ableto bring in our business community processes from day one, where we were ableto deliver our services without any interruption to the services what we were delivering to our customers. >>So >>that is where the digital resilience with business community process enabled was very helpful for us who enable our customers continue there business without any interruptions during pandemics. >>So, I mean, some of the challenges that that customers tell me they obviously had to figure out how to get laptops to remote workers and that that whole remote, you know, work from home pivot figure out how to secure the end points. 
And, you know, looking back, those were kind of table stakes. But it sounds like you get it: a digital business means a data business, putting data at the core, I like to say. But so, I wonder if you could talk a little bit more about maybe the philosophy you have toward digital resilience, and the specific approach you take with clients? >> Absolutely. Dave, see, in any organization, data becomes the key, and thus the first step is to identify the critical data. So this is a six-step process that we follow at Happiest Minds. First of all, we take stock of the current state: though the customers think that they have clear visibility of their data, we do, more often, an assessment from an external point of view and see how critical their data is. Then we help the customers to strategize that. The most important thing is to identify the most important critical assets. Data being the most critical asset for any organization, identification of the data is key for the customers. Then we help in building a viable operating model to ensure these identified critical assets are secured and monitored duly, so that they are consumed well as well as protected from external threats. Then, as a fourth step, we try to bring in awareness to the people. We train them at all levels in the organization. That is the P for people, to understand the importance of the digital assets. And then, as a fifth step, we work on a backup plan, in terms of bringing in a very comprehensive and holistic testing approach on people, process, as well as technology, to see how the organization can withstand during a crisis time. And finally, we do a continuous governance of this data, which is key. It is not just a one-step process. We set up the environment.
We do the initial analysis, set up the strategy, and continuously govern this data to ensure that it is not only managed well and secured, but also meets the compliance requirements of the organization. That is where we help organizations to secure and meet the regulations, as per the privacy laws. So this is a constant process. It's not a one-time effort; we do a constant process, because every organization goes through the digital journey, and they have to face all these as part of the evolving environment on the digital journey. And that's where they should be kept ready in terms of recovering, rebounding and moving forward if things go wrong. >> So let's stick on that for a minute, and then I want to bring Yusef into the conversation. So you mentioned compliance and governance. When you're a digital business here, as you say, you're a data business, so that brings up issues: data sovereignty, there's governance, there's compliance, there's things like the right to be forgotten, there's data privacy, so many things. These were often kind of afterthoughts for businesses, bolted on, if you will. I know a lot of executives are very much concerned that these are built in, and it's not a one-shot deal. So do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics there. >> Sure, so we have offered multiple services to our customers on digital resilience, and one of the key services is data compliance as a service. Here we help organizations to map the key data against the data compliance requirements. Some of the features include the continuous discovery of data, because organizations keep adding data when they move more digital, and helping in understanding the actual data in terms of the residency of the data. It could be heterogeneous data sources.
It could be on databases, or it could be even on the data lakes, or it could be even on-premise or in the cloud environment. So identifying the data across the various heterogeneous environments is a very key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevalent laws have to be mapped based on the business rules. So we define those rules and help map those data so that organizations know how critical their digital assets are. Then we work on continuous monitoring of data for anomalies, because that's one of the key features of the solution, which needs to be implemented on a day-to-day operational basis. So we help in monitoring those anomalies of data, for data quality management, on an ongoing basis. And finally, we also bring in the automated data governance, where we can manage the sensitive data policies and their data relationships in terms of mapping, and manage the business rules. And we drive remediations and also suggest appropriate actions for the customers to take on those specific data sets. >> Great. Thank you. Yusef, thanks for being patient. I want to bring Io-Tahoe into the discussion and understand where your customers and Happiest Minds can leverage your data automation capability that you and I have talked about in the past. And it'd be great if you had an example as well, but maybe you could pick it up from there. >> Sure. I mean, at a high level, as Suresh has clearly articulated, Io-Tahoe delivers business agility. That's by accelerating the time to operationalize data, automating, putting in place controls, and ultimately helping put in place digital resilience. I mean, if we step back a little bit in time, traditional resilience in relation to data often meant manually making multiple copies of the same data. So you'd have a DBA, they would copy the data to various different places, and then business
users would access it in those functional silos. And of course, what happened was you ended up with lots of different copies of the same data around the enterprise. Very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach; it's very hard to know where everything is. And I really liked that expression you used, David, the idea of the forced march to digital. So with enterprises that are going on this forced march, what they're finding is they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and with containers that enables a big leap forward, so you can break applications down into microservices. Updates are available via APIs, and so you don't have the same need to build and to manage multiple copies of the data. So you have an opportunity to just have a single version of the truth. Then your challenge is, how do you deal with these large legacy data estates that Suresh has been referring to, where you have to consolidate? And that's really where Io-Tahoe comes in. We massively accelerate that process of putting a single version of the truth into place. So by automatically discovering the data, discovering what's duplicate, what's redundant, that means you can consolidate it down to a single trusted version much more quickly. We've seen many customers who have tried to do this manually, and it's literally taken years using manual methods to cover even a small percentage of their IT estates. With Io-Tahoe you can do it really very quickly, and you can have tangible results within weeks and months. And then you can apply controls to the data based on context. So who's the user? What's the content? What's the use case? Things like data quality validations or access permissions. And then once you've done that, your applications and your enterprise are much more secure, much more resilient.
As a result, you've got to do these things whilst retaining agility, though. So, coming full circle, this is where the partnership with Happiest Minds really comes in as well. You've got to be agile, you've got to have controls, and you've got to drive towards the business outcomes. And it's doing those three things together that really delivers for the customer. >> Thank you, Yusef. I mean, you and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved. We know that you can't scale that, but also there's that compression of time to get to the next step in terms of ultimately getting to the outcome. And we've talked to a number of customers in theCUBE, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver. Reducing complexity, automating and getting to insights faster, that's where you see telephone numbers in terms of business impact. So my question is, where should customers start? I mean, how can they take advantage of some of these opportunities that we've discussed today? >> Well, we've tried to make that easy for customers. So with Io-Tahoe and Happiest Minds, you can very quickly do what we call a data health check. This is a two- to three-week process to really quickly start to understand and deliver value from your data. So Io-Tahoe deploys into the customer environment; data doesn't go anywhere. We would look at a few data sources and a sample of data, and we can very rapidly demonstrate how data discovery, data cataloging and understanding duplicate data and redundant data can be done, using machine learning, and how those problems can be solved. And so what we tend to find is that we can very quickly, as I say in a matter of a few weeks, show a customer how they can get to a more resilient outcome, and
then how they can scale that up, take it into production, and then really understand their data estate better and build resilience into the enterprise. >> Excellent. There you have it. We'll leave it right there. Guys, great conversation. Thanks so much for coming on the program. Best of luck to you in the partnership. Be well. >> Thank you, David. >> Thank you. >> And thank you for watching, everybody. This is Dave Vellante for theCUBE and our ongoing series on Data Automation with Io-Tahoe.

Published Date : Jan 27 2021



Yusef Khan & Suresh Kanniappan


 

>> Announcer: From around the globe, it's theCUBE. Presenting Enterprise Digital Resilience on Hybrid and Multicloud. Brought to you by Io-Tahoe. >> Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience, within an as-a-service consumption model. We're now joined by Yusef Khan, who heads data services for Io-Tahoe, and Suresh Kanniappan, who's the vice president and head of US sales at Happiest Minds. Gents, welcome to the program, great to have you in theCUBE. >> Thank you, David. >> Suresh, you guys talk about at Happiest Minds this notion of born digital, born agile. I like that, but talk about your mission at the company. >> Sure. Founded in 2011, Happiest Minds is a born digital, born agile company. The reason is that we are focused on customers. Our customer-centric approach and delivering digital and seamless solutions have helped us be in the race along with the Tier 1 providers. Our mission, Happiest People, Happiest Customers, is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the great places to work survey. Our Glassdoor rating of 4.1 against a maximum of five is among the top of the Indian IT services companies, and that shows the mission and the culture we have built on our values: sharing, mindful, integrity, learning and social responsibility are the core values of our company. And that's where the entire culture of the company has been built. >> That's great, sounds like a happy place to be. Now Yusef, you head up data services for Io-Tahoe; we've talked in the past year, and of course you're based in London. What's your day-to-day focus with customers and partners? What are you focused on?
>> Well David, my team works daily with customers and partners to help them better understand their data, improve their data quality and their data governance, and help them make that data more accessible, in a self-service kind of way, to the stakeholders within those businesses. And this is all a key part of digital resilience that we will come on to talk about a bit later. >> You're right, I mean that self-service theme is something that we're going to really accelerate this decade, Yusef. But I wonder, before we get into that, maybe you could talk about the nature of the partnership with Happiest Minds. Why do you guys choose to work closely together?
Prior to the pandemic, a lot of customers kind of equated disaster recovery with their business continuity or business resilience strategy, and that's changed almost overnight. How have you seen your clients respond to that, what I sometimes call the forced march to become a digital business? And maybe you could talk about some of the challenges that they've faced along the way. >> Absolutely. So especially during these pandemic times, you see, Dave, customers have been having tough times managing their business. So Happiest Minds, being a digitally resilient company, we were able to react much faster than the other services companies in the industry. So one of the key things is, the organizations are trying to adapt to digital technologies. There has been a lot of data which has to be managed by these customers, and there have been a lot of threats and risks which have to be managed by the CIOs. So with Happiest Minds' digital resilience technology, we're bringing in data compliance as a service. We were able to manage the resilience much ahead of other competitors in the market. We were able to bring in our business continuity processes from day one, where we were able to deliver our services without any interruption to the services we are delivering to our customers. So that is where digital resilience, with business continuity processes enabled, was very helpful for us to enable our customers to continue their business without any interruptions during the pandemic. >> So, I mean, some of the challenges that customers tell me, if I may: they obviously had to figure out how to get laptops to remote workers, that whole remote, work-from-home pivot, figure out how to secure the endpoints, and, looking back, those were kind of table stakes. And it sounds like, I mean, digital business means a data business, putting data at the core, I like to say.
But so, I wonder if you could talk a little bit more about maybe the philosophy you have toward digital resilience, and the specific approach you take with clients. >> Absolutely, Dave. See, in any organization, data becomes the key, and thus the first step is to identify the critical data. So this is a six-step process that we follow at Happiest Minds. First of all, we take stock of the current state: though the customers think that they have clear visibility of their data, we do, more often, an assessment from an external point of view and see how critical their data is. Then we help the customers to strategize that. The most important thing is to identify the most important critical assets. Data being the most critical asset for any organization, identification of the data is key for the customers. Then, as a third step, we help in building a viable operating model to ensure these identified critical assets are secured and monitored duly, so that they are consumed well as well as protected from external threats. Then, as a fourth step, we try to bring in awareness to the people. We train them at all levels in the organization. That is the P for people, to understand the importance of the digital assets. And then, as a fifth step, we work on a backup plan, in terms of bringing in a very comprehensive and holistic testing approach on people, process, as well as technology, to see how the organization can withstand during a crisis time. And finally, we do a continuous governance of this data, which is key. It is not just a one-step process. We set up the environment, we do the initial analysis, set up the strategy, and continuously govern this data to ensure that it is not only managed well and secured, but also meets the compliance requirements of the organization. That is where we help organizations to secure and meet the regulations, as per the privacy laws.
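The first two steps Suresh describes, taking stock of the current state and identifying the critical data, can be sketched in miniature as a scoring exercise. The asset fields, weights and threshold below are illustrative assumptions for this sketch, not Happiest Minds' actual model:

```python
# Illustrative sketch of "identify the critical data": score each data asset
# by its sensitivity classification and how widely it is consumed, then flag
# the assets whose score clears a threshold. All values here are hypothetical.

def criticality_score(asset: dict) -> float:
    """Weighted score: sensitive fields and heavy usage raise criticality."""
    sensitivity = {"public": 0.0, "internal": 0.4, "pii": 1.0}[asset["classification"]]
    usage = min(asset["consumers"] / 10.0, 1.0)   # cap at 10 consuming teams
    return round(0.6 * sensitivity + 0.4 * usage, 2)

def identify_critical(assets: list[dict], threshold: float = 0.5) -> list[str]:
    """Return the names of assets that warrant the full governance treatment."""
    return [a["name"] for a in assets if criticality_score(a) >= threshold]

assets = [
    {"name": "customer_master", "classification": "pii",      "consumers": 12},
    {"name": "branch_lookup",   "classification": "public",   "consumers": 3},
    {"name": "gl_postings",     "classification": "internal", "consumers": 8},
]
print(identify_critical(assets))  # ['customer_master', 'gl_postings']
```

The point of the sketch is only that "critical" is decided by explicit, repeatable rules rather than by whoever shouts loudest; the later steps (operating model, awareness, testing, governance) then apply to the flagged assets.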
So this is a constant process. It's not a one-time effort; we do a constant process, because every organization goes through their digital journey, and they have to face all these as part of the evolving environment on the digital journey. And that's where they should be kept ready in terms of recovering, rebounding and moving forward if things go wrong. >> So, let's stick on that for a minute, and then I want to bring Yusef into the conversation. So, you mentioned compliance and governance. When you're a digital business here, as you say, you're a data business, so that brings up issues: data sovereignty, there's governance, there's compliance, there's things like the right to be forgotten, there's data privacy, so many things. These were often kind of afterthoughts for businesses, bolted on, if you will. I know a lot of executives are very much concerned that these are built in, and it's not a one-shot deal. So, do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics there. >> Sure, we offer multiple services to our customers on digital resilience, and one of the key services is data compliance as a service. Here, we help organizations to map the key data against the data compliance requirements. Some of the features include the continuous discovery of data, because organizations keep adding data as they move more digital, and help in understanding the actual data in terms of the residency of the data. It could be heterogeneous data sources: it could be on databases, or it could be even on the data lakes, or it could be even on-prem or in the cloud environment. So, identifying the data across the various heterogeneous environments is a very key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevalent laws have to be mapped based on the business rules.
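The compliance mapping Suresh is outlining can be sketched very simply: classify discovered columns as sensitive, then map each sensitivity class to the applicable regulations via business rules. The column names, classes, rule table and choice of regulations below are illustrative assumptions for this sketch, not the actual Happiest Minds service:

```python
# Toy sketch of "data compliance as a service": tag discovered columns with a
# sensitivity class, then map each class to prevalent regulations through a
# business-rule table. Classes, rules and regulation picks are hypothetical.

SENSITIVE_CLASSES = {"name": "pii", "email": "pii", "nationality": "pii",
                     "card_number": "financial"}

REGULATION_RULES = {
    "pii": ["GDPR", "NDPR"],     # e.g. EU and Nigerian data-protection regimes
    "financial": ["PCI-DSS"],
}

def classify(columns: list[str]) -> dict[str, str]:
    """Tag each discovered column with a sensitivity class, if any."""
    return {c: SENSITIVE_CLASSES[c] for c in columns if c in SENSITIVE_CLASSES}

def map_to_regulations(columns: list[str]) -> dict[str, list[str]]:
    """For each sensitive column, list the regulations that apply to it."""
    return {c: REGULATION_RULES[cls] for c, cls in classify(columns).items()}

discovered = ["customer_id", "name", "nationality", "branch", "card_number"]
print(map_to_regulations(discovered))
# {'name': ['GDPR', 'NDPR'], 'nationality': ['GDPR', 'NDPR'], 'card_number': ['PCI-DSS']}
```

In a real deployment the classification step would run continuously against heterogeneous sources, as Suresh notes, rather than over a fixed column list.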
So we define those rules and help map those data, so that organizations know how critical their digital assets are. Then we work on continuous monitoring of data for anomalies, because that's one of the key features of the solution, which needs to be implemented on a day-to-day operational basis. So we help in monitoring those anomalies of data, for data quality management, on an ongoing basis. And finally, we also bring in the automated data governance, where we can manage the sensitive data policies and their data relationships in terms of mapping, and manage the business rules. And we drive remediations and also suggest appropriate actions for the customers to take on those specific data assets. >> Great, thank you. Yusef, thanks for being patient. I want to bring Io-Tahoe into the discussion and understand where your customers and Happiest Minds can leverage your data automation capability that you and I have talked about in the past. And I mean, it'd be great if you had an example as well, but maybe you could pick it up from there. >> Sure. I mean, at a high level, as Suresh articulated really well, Io-Tahoe delivers business agility. That's by accelerating the time to operationalize data, automating, putting in place controls, and also helping put in place digital resilience. I mean, if we step back a little bit in time, traditional resilience in relation to data often meant manually making multiple copies of the same data. So you'd have a DBA, they would copy the data to various different places, and then business users would access it in those functional silos. And of course, what happened was you ended up with lots of different copies of the same data around the enterprise. Very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach; it's very hard to know where everything is. And I really liked that expression you used, David, the idea of the forced march to digital.
So, with enterprises that are going on this forced march, what they're finding is, they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and with containers that enables a big leap forward. So you can break applications down into microservices, updates are available via APIs, and so you don't have the same need to build and to manage multiple copies of the data. So you have an opportunity to just have a single version of the truth. Then your challenge is, how do you deal with these large legacy data estates that Suresh has been referring to, where you have to consolidate? And that's really where Io-Tahoe comes in. We massively accelerate that process of putting a single version of the truth into place. So by automatically discovering the data, discovering what's duplicate, what's redundant, that means you can consolidate it down to a single trusted version much more quickly. We've seen many customers who've tried to do this manually, and it's literally taken years using manual methods to cover even a small percentage of their IT estates. With Io-Tahoe you can do it really very quickly, and you can have tangible results within weeks and months. And then you can apply controls to the data based on context. So, who's the user? What's the content? What's the use case? Things like data quality validations or access permissions. And then once you've done that, your applications and your enterprise are much more secure, much more resilient as a result. You've got to do these things whilst retaining agility, though. So, coming full circle, this is where the partnership with Happiest Minds really comes in as well. You've got to be agile, you've got to have controls, and you've got to drive towards the business outcomes. And it's doing those three things together that really delivers for the customer. >> Thank you, Yusef.
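The duplicate-data discovery Yusef describes can be illustrated with a toy sketch: fingerprint each dataset's normalized contents and group the datasets whose fingerprints collide. This is exact matching only, under hypothetical dataset names; Io-Tahoe's actual machine-learning-driven discovery is far more sophisticated:

```python
import hashlib

# Toy sketch of duplicate-data discovery: fingerprint each dataset's
# normalized rows, then group datasets whose fingerprints collide.
# Real discovery tooling uses ML and fuzzy matching; this is exact-match only.

def fingerprint(rows: list[tuple]) -> str:
    # Normalize: stringify, trim and lowercase values, then sort the rows, so
    # casing and row-order noise cannot hide a copied dataset.
    canon = sorted(tuple(str(v).strip().lower() for v in row) for row in rows)
    return hashlib.sha256(repr(canon).encode()).hexdigest()

def find_duplicates(datasets: dict[str, list[tuple]]) -> list[set[str]]:
    """Group dataset names that share the same content fingerprint."""
    groups: dict[str, set[str]] = {}
    for name, rows in datasets.items():
        groups.setdefault(fingerprint(rows), set()).add(name)
    return [names for names in groups.values() if len(names) > 1]

datasets = {
    "crm.customers":     [("Ada", "UK"), ("Bola", "NG")],
    "finance.customers": [("bola", "ng"), ("ADA", "uk")],   # a copied silo
    "risk.exposures":    [("Ada", "UK", 1_000_000)],
}
print(find_duplicates(datasets))  # one duplicate group: crm/finance customers
```

Each duplicate group is a candidate for consolidation into the single trusted version Yusef mentions, with the redundant copies retired.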
I mean, you and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved; we know that you can't scale that, but also there's that compression of time to get to the next step, in terms of ultimately getting to the outcome. And we've talked to a number of customers in theCUBE, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver. Reducing complexity, automating and getting to insights faster, that's where you see telephone numbers in terms of business impact. So my question is, where should customers start? I mean, how can they take advantage of some of these opportunities that we've discussed today? >> Well, we've tried to make that easy for customers. So, with Io-Tahoe and Happiest Minds, you can very quickly do what we call a data health check. And this is a two- to three-week process to really quickly start to understand and deliver value from your data. So, Io-Tahoe deploys into the customer environment; data doesn't go anywhere. We would look at a few data sources and a sample of data, and we can very rapidly demonstrate how data discovery, data cataloging and understanding duplicate data or redundant data can be done, using machine learning, and how those problems can be solved. And so what we tend to find is that we can very quickly, as I said, in a matter of a few weeks, show a customer how they can get to a more resilient outcome, and then how they can scale that up, take it into production, and then really understand their data estate better, and build resilience into the enterprise.
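In the spirit of the data health check Yusef describes, a first pass over a sample might profile null rates, duplicate rows and likely-sensitive columns. This is a hypothetical sketch, not the actual Io-Tahoe deployment; the column names and keyword hints are assumptions:

```python
# Minimal sketch of a data health check on a sample: per-column null rate,
# duplicate-row rate, and a naive flag for likely-sensitive columns.
# Column names and the sensitive-keyword list are illustrative assumptions.

SENSITIVE_HINTS = ("name", "email", "dob", "nationality", "account")

def health_check(columns: list[str], rows: list[tuple]) -> dict:
    n = len(rows)
    null_rate = {
        col: sum(1 for r in rows if r[i] in (None, "")) / n
        for i, col in enumerate(columns)
    }
    dup_rate = (n - len(set(rows))) / n            # share of exact-duplicate rows
    sensitive = [c for c in columns if any(h in c.lower() for h in SENSITIVE_HINTS)]
    return {"null_rate": null_rate, "dup_rate": dup_rate, "sensitive": sensitive}

cols = ["customer_name", "email", "branch"]
rows = [
    ("Ada", "ada@example.com", "Lagos"),
    ("Ada", "ada@example.com", "Lagos"),      # exact duplicate record
    ("Bola", None, "London"),
    ("Chi", "chi@example.com", ""),
]
report = health_check(cols, rows)
print(report["dup_rate"])    # 0.25
print(report["sensitive"])   # ['customer_name', 'email']
```

Even this crude report surfaces the three things the health check is after: quality gaps, redundancy, and where the sensitive data sits.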

Published Date : Jan 13 2021



Santiago Castro, Gudron van der Wal and Yusef Khan | Io-Tahoe Adaptive Data Governance


 

>> Presenter: From around the globe, it's theCUBE. Presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> Our next segment here is an interesting panel: you're going to hear from three gentlemen about Adaptive Data Governance, and we're going to talk a lot about that. Please welcome Yusef Khan, the global director of data services for Io-Tahoe. We also have Santiago Castro, the chief data officer at the First Bank of Nigeria, and Gudron van der Wal, Oracle's senior manager of digital transformation and industries. Gentlemen, it's great to have you joining us in this panel. (indistinct)
In the beginning, the early days were really kind of (indistinct) — a top-down approach, where data governance was seen as implementing a set of rules, policies and procedures, but really from the top down. And it is important to have the backing of your C-level, of your directors, whatever it is; but on its own that way fails. You really need a complementary approach — I often say bottom-up. And actually, as a CDO, I'm really trying to decentralize data governance, instead of imposing a framework that some people in the business don't understand or don't care about. It really needs to come from them. So what I'm trying to say is that data basically supports business objectives, and every business area needs information on particular decisions to actually be able to be more efficient, create value, et cetera. Now, depending on the business questions they have to solve, they will need certain data sets. So they actually need to be able to have data quality for their own purposes, because once they understand that, they naturally become the stewards of their own data sets. And that is where my bottom-up is meeting my top-down. You can guide them from the top, but they need themselves to be empowered and, in a way, flexible to adapt to the different questions they have, in order to be able to respond to the business needs. And I think that is where this adaptive data governance starts. I'll give you an example. In the bank, imagine a Venn diagram. We have information that is provided to finance, other information to risk, or information for business development. And in this Venn diagram, there are going to be parts of every circle that intersect with each other. So what you want as data governance is to help provide what is in common, and then let them do their own analysis on what is really related to their own area. As an example: nationality.
In a bank, when you open an account, you record the nationality of your customer. That's fine for finance when they want to do a balance sheet, accounting, or a P&L. But for risk, they want that type of analysis plus the nationality of exposure, meaning where you are actually exposed as a risk: you can have a customer that is domiciled in the UK but trades with Africa, and in Africa they are exposing their credit. So what I'm trying to say is they have these pieces in common and pieces that are different. Now, I cannot impose one definition for everyone. I need them to adapt and to bring their answers to their own business questions. That is adaptive data governance. And all of that is possible because — as I was saying at the very beginning, just to finalize the point — we have new technologies that allow you to do this metadata classification in a very sophisticated way, so that you can actually create analytics on your metadata. You can understand your different data sources, in order to be able to create those classifications, like nationalities, and ways of classifying your customers, your products, et cetera. But you will need to understand which areas need what type of nationality or classification, and which others will need them all the time. And the more you create that understanding, that intelligence about how your people are using your data, the more you create building blocks — with labels, if you want — where you provide them with those definitions, those catalogs; you understand how they are used, or you let them compose, like Lego. They play their way to build their analysis, and they will be adaptive. And I think the new technologies are allowing that. This is a real game changer, and I will say that over and over. >> So one of the things that you just said, Santiago, kind of struck me: to enable the users to be adaptive, they probably don't want to be logging a support ticket.
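The metadata classification Santiago describes — shared business terms layered over scanned schemas, with each area adapting its own view — can be sketched as a simple rule-based column classifier. Everything here (the rule patterns, the column names) is hypothetical, a minimal illustration rather than how Io-Tahoe actually implements classification:

```python
import re

# Hypothetical rules mapping a shared business term to column-name patterns.
# Real classifiers would also sample the data values, not just the names.
RULES = {
    "nationality": re.compile(r"nationality|citizenship|country", re.I),
    "customer_id": re.compile(r"cust(omer)?_?(id|no)", re.I),
}

def classify_columns(columns):
    """Return {column: [matched business terms]} for a scanned schema."""
    return {
        col: [term for term, pat in RULES.items() if pat.search(col)]
        for col in columns
    }

schema = ["CustomerID", "ResidenceCountry", "Balance"]
print(classify_columns(schema))
# {'CustomerID': ['customer_id'], 'ResidenceCountry': ['nationality'], 'Balance': []}
```

Each business area could then extend the shared rule set — risk adding its "nationality of exposure" variant without disturbing finance's definition.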
So how do you support that sort of self-service to meet the demand of the users, so that they can be adaptive? >> Yeah, that's a really good question, and it goes along with that type of approach. As I was saying, more and more business users want autonomy, and they want to basically be able to grab the data and answer their questions. Now, when you have that, that's great, because then you have demand: the business is asking for data, asking for insight. So how do you actually support that? I will say there is a change in culture happening more and more, and I would say even the current pandemic has helped a lot with that. Of course, technology is one of the biggest winners — without technology we couldn't have been working remotely, with people logging in from their homes and still having data marketplaces where they self-serve their information. But even beyond that, data is a big winner. The pandemic has shown us that crises happen, that we cannot predict everything, and that we are actually facing a new kind of situation, out of our comfort zone, where we need to explore, we need to adapt, and we need to be flexible. How do we do that? With data. As a good example of this, every country, every government, is publishing data every day on what's happening in their countries with COVID and the pandemic, so they can understand how to react, because this is new. You need facts in order to learn and adapt. Now, companies are the same. Every single company either saw revenue going down, or revenue going way up for those companies that were already very digital. The reality changed, so they needed to adapt, but for that they needed information in order to think and innovate and try to create responses.
So that type of self-service of data, (indistinct) for data, in order to be able to understand what's happening when the context is changing, is something that is becoming more of a topic today — because of the pandemic, and because of the new capabilities of technologies that allow it. And then you are able to help your data citizens, as I call them in the organization: people who know their business and can actually start playing with data and answering their own questions. So these technologies give more accessibility to the data; they give some cataloging so we can understand where to go, or where to find lineage and relationships. All this is basically the new type of platform or tool that allows you to create what I call a data marketplace. Once you create that marketplace, they can play with it. And I was talking about a new culture, and I'm going to finish with that idea. I think these new tools are really strong because they now allow people who are not technology or IT people to play with data, because in the digital world they are used to it. I'll give you an example with Io-Tahoe, where you have a very interesting search functionality: when you want to find your data and you want to self-serve, you go to that search and you actually go and look for your data. Everybody knows how to search in Google; everybody searches the internet. So this is part of the data culture, the digital culture — they know how to use those tools. Similarly, in that data marketplace in Io-Tahoe you can, for example, see which data sources are most used. So when I'm doing an analysis and I see that people in my area are also using these sources, I trust those sources. It's a little bit like Amazon, when it suggests what to buy next; again, this is the digital culture that people very easily understand. Similarly, you can actually "like" some types of data sets that are working — that's Facebook.
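The marketplace features Santiago lists — seeing which sources are most used, Amazon-style suggestions — come down to ranking sources by usage. A toy sketch, with a made-up access log (nothing here reflects Io-Tahoe's actual implementation):

```python
# Hypothetical access log: (user, data_source) pairs, as a data marketplace
# might record each time someone queries a source.
ACCESS_LOG = [
    ("ana", "crm.customers"), ("ben", "crm.customers"),
    ("ana", "risk.exposures"), ("carl", "crm.customers"),
]

def popular_sources(log, top=3):
    """Rank data sources by how many distinct users consult them."""
    users_per_source = {}
    for user, source in log:
        users_per_source.setdefault(source, set()).add(user)
    ranked = sorted(users_per_source.items(),
                    key=lambda kv: len(kv[1]), reverse=True)
    return [(src, len(users)) for src, users in ranked[:top]]

print(popular_sources(ACCESS_LOG))  # [('crm.customers', 3), ('risk.exposures', 1)]
```

The ranking signals trust: a source three colleagues already use is a safer starting point than one nobody touches.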
So what I'm trying to say is you have some very easy, user-friendly technologies that allow you to understand how to interact with them. And then, with the type of digital knowledge that you have, you are able to self-serve, play, collaborate with your peers, and collaborate on the data queries and analysis. So it's really enabling, very easily, that transition to becoming data-savvy without actually needing too much knowledge of IT, or coding, et cetera. And I think that is a game changer as well. >> And enabling that speed that we're all demanding today during these unprecedented times. Gudron, I wanted to go to you. As we talk about, in the spirit of evolution, technology changing: talk to us a little bit about Oracle Digital. What are you guys doing there? >> Yeah, thank you. Well, Oracle Digital is a business unit at Oracle EMEA, and we focus on emerging countries, as well as lower-end enterprises and the mid-market in more developed countries. Four years ago, we started with the idea of engaging digitally with our customers via central hubs across EMEA. That means engaging with video, having conference calls, having a wall where we stand in front and engage with our customers. No one at that time could have foreseen the situation today, and it helps us engage with our customers in the way we were already doing. And then about my team: the focus of my team is to have early-stage conversations with our customers on digital transformation and innovation. We also have a team of industry experts who engage with our customers and share expertise across EMEA, and we inspire our customers. The outcome of these conversations, for Oracle, is a deep understanding of our customers' needs, which is very important so we can help the customer; and for the customer, it means that we will help them with our technology and our resources to achieve their goals. >> It's all about outcomes, right, Gudron?
So in terms of automation, what are some of the things Oracle is doing to help your clients leverage automation to improve agility, so that they can innovate faster — which, in these interesting times, is in demand? >> Yeah, thank you. Well, traditionally Oracle is known for its databases, which have been innovated year over year since the first launch. The latest innovation is the autonomous database and autonomous data warehouse. For our customers, this means a reduction in operational costs by 90%, with a multimodal converged database and machine-learning-based automation for full lifecycle management. Our database is self-driving: we automate database provisioning, tuning and scaling. The database is self-securing: that means ultimate data protection and security. And it is self-repairing: automatic failure detection, failover and repair. And the question is, for our customers, what does it mean? It means they can focus on their business instead of maintaining their infrastructure and their operations. >> That's absolutely critical. Yusef, I want to go over to you now. Some of the things we've talked about — the massive progression in technology, the evolution of that — but we know that whether we're talking about data management or digital transformation, a one-size-fits-all approach doesn't work to address the challenges that the business has, that the IT folks have. As you look at the industry, with what Santiago told us about First Bank of Nigeria, what are some of the changes that Io-Tahoe has seen throughout the industry? >> Well, Lisa, I think the first way I'd characterize it is to say that the traditional top-down approach to data, where you have almost a data policeman who tells you what you can and cannot do, just doesn't work anymore. It's too slow; it's too resource-intensive. Data management, data governance, digital transformation itself — it has to be collaborative.
And there has to be an element of personalization for users. In the environment we find ourselves in now, it has to be about enabling self-service as well. A one-size-fits-all model, when it comes to those things around data, doesn't work. As Santiago was saying, it needs to be adaptive to how the data is used and who is using it. And in order to do this, companies, enterprises, organizations really need to know their data. They need to understand what data they hold, where it is, and what its sensitivity is. They can then, in a more agile way, apply appropriate controls and access, so that people and groups within businesses are agile and can innovate. Otherwise, everything grinds to a halt, and you risk falling behind your competitors. >> Yeah, a one-size-fits-all approach doesn't apply when you're talking about adaptivity and agility. So we heard from Santiago about some of the impact that they're making with First Bank of Nigeria. Yusef, talk to us about some of the business outcomes that you're seeing other customers achieve, leveraging automation, that they could not achieve before. >> I guess one of the key ones is being able to automatically classify terabytes, or even petabytes, of data across different sources to find duplicates, which you can then remediate and delete. Now, with the capabilities that Io-Tahoe offers, and Oracle offers, you can do things not just with a five-times or ten-times improvement; it actually enables you to do projects, full stop, that would otherwise fail, or that you would just not be able to do. Classifying multi-terabyte and multi-petabyte estates across different sources and formats, at very large volumes of data — in many scenarios you just can't do that manually. We've worked with government departments, and the issues there, as you'd expect, are the result of fragmented data. There are a lot of different sources; there are a lot of different formats.
And without these newer technologies to address it, with automation and machine learning, the project isn't doable. But now it is, and that can lead to a revolution in some of these businesses and organizations. >> To enable that revolution, there's got to be the right cultural mindset. Santiago was talking about really adopting that, and I always call it getting comfortably uncomfortable. But that's hard for organizations to do. The technology is here to enable it. But when you're talking with customers, how do you help them build the trust and the confidence that the new technologies and new approaches can deliver what they need? How do you help drive that change in the culture? >> It's a really good question, because it can be quite scary. I think the first thing we'd start with is to say: look, the technology is here, with businesses like Io-Tahoe and like Oracle — it's already arrived. What you need to be comfortable doing is experimenting, being agile around it, and trying new ways of doing things, if you don't want to get left behind. And Santiago and the team at FBN are a great example of embracing it, testing it on a small scale and then scaling up. At Io-Tahoe we offer what we call a data health check, which can be done very quickly, in a matter of a few weeks. We'll work with the customer, pick a use case, install the application, analyze data, and drive out some quick wins. For example, we worked in the last few weeks with a large energy supplier, and in about 20 days we were able to give them an accurate understanding of their critical data elements, help them apply data protection policies, minimize copies of the data, and work out what data they needed to delete to reduce their infrastructure spend. So it's about experimenting on a small scale, being agile, and then scaling up in a very modern way. >> Great advice. Santiago, I'd like to go back to you.
As we look again at that topic of culture, and the need to get that mindset in place to facilitate these rapid changes, I want to understand — kind of a last question for you — how you're doing that. From a digital transformation perspective, we know everything is accelerating in 2020. So how are you building resilience into your data architecture, and also driving the cultural change that can help everyone in this shift to remote working, and through the digital challenges we're all going through? >> That's a really interesting transition, I would say. Just going back to some of the points before, to transition to this: I said that the new technologies allowed us to discover the data in a new way, to catalog and see information very quickly, to have new models of (indistinct) data — we are talking about data (indistinct) — and to give autonomy to our different data units. Well, from that autonomy, they can then compose and innovate in their own ways. So now we're talking about resilience, because in a way, autonomy and flexibility in our organization, in our data structure and platform, give you resilience. The organizations and business units that, in my experience of the pandemic, are working well are those where, because people are not physically present in the office anymore, you give them their autonomy, you let them actually engage on their own side and do their own job, and you trust them in a way. And as you give them that, they start innovating, and they start having really interesting ideas. So autonomy and flexibility, I think, are key components of the new infrastructure. The new reality the pandemic shows is that, yes, we used to be very structured — policies, procedures — and they're important, but now we learn flexibility and adaptability at the same time.
Now, when you have that, another key component of resilience is speed. Of course, people want to access the data, access it fast, and decide fast; especially as things are changing so quickly nowadays, you need to be able to interact and iterate with your information to answer your questions quickly. And coming back maybe to what Yusef was saying — I completely agree — it's about experimenting and iterating. You will not get it right the first time, especially as the world is changing too fast and we don't have answers already set for everything. So we need to just go play, have ideas, fail, fail fast, and then learn and go for the next one. So technology that allows you to be flexible and iterate in a very fast, agile way will allow you to be resilient, because you're flexible, you adapt, you are agile, and you continue answering questions as they come, without having everything set in a structure that is too rigid. Now, coming back to your idea about the culture changing in employees and in customers: our employees and our customers are more and more digitally served, and in a way the pandemic has accelerated that. We had many branches of the bank that people used to go to, to ask for things; now they cannot go. Here in Europe, with the lockdown, you physically cannot go to the branches, and the shops have been closed. So they had to use our mobile apps and go onto internet banking — which is great, because that was the acceleration we wanted. Similarly, our employees needed to work remotely, so they needed to engage with a digital platform. Now, what that means — and this is, I think, the really strong point for the cultural change towards resilience — is that more and more we have several types of connectivity happening with data. The first I call employees connecting to data.
Then there is employees connecting with each other — the collaboration Yusef was talking about, which allows people to share ideas, learn and innovate; because the more you have platforms where people can actually find each other and play with the data, the more they can bring ideas to the analysis. And then there are employees connecting to algorithms, and this is the other part that is really interesting. We are also a partner of Oracle, and Oracle (indistinct) is great: they have embedded many algorithms within the transactional system that allow us to calculate as the transactions happen. What happens there is that when our people engage with algorithms — and again with Io-Tahoe as well, the machine learning that is there to speed up the automation of how you find your data — it allows you to create an alliance with the machine. The machine is there to be, in a way, your best friend: to handle more volume of data, calculated faster, and in a way to discover more variety. We couldn't cope without being connected to these algorithms. And then finally we get to the last connection I was mentioning: the customers themselves connecting with the data. As I was saying, they're more and more engaging with our app and our website, and they're digitally self-serving. The expectation of the customer has changed. I work in a bank, where the industry is completely challenged. You used to have people going to a branch; as I was saying, not only can they not go there, they're going from branch to digital, to now wanting business services in every single app they are using. So the data becomes a service for them. They want to see how they spend their money, and the data of their transactions will tell them whether their spending goes well with their lifestyle — for example, for a normal, healthy person.
I want to see that I'm spending on eating good food, in the right, healthy environment, where I'm mentally engaged. Now, all this is metadata: knowing how to classify your data according to my values, my lifestyle. It is the algorithms — and I'm coming back to my connections here — that allow me to very quickly analyze that metadata; and it is actually my staff in the background creating that understanding of the customer journey, to give customers the service they expect on a digital channel, which lets them understand how they are engaging with financial services. >> Engagement is absolutely critical, Santiago. Thank you for sharing that. I do want to wrap really quickly. Gudron, one last question for you. Santiago talked about Oracle, and you've talked about it a little bit. As we look at digital resilience, talk to us, in the last minute, about the evolution of Oracle — what you guys are doing to help your customers get the resilience they have to have, to not just survive but thrive. >> Yeah. Well, Oracle has a cloud offering for infrastructure, database and platform services, and complete solutions offered as SaaS. And as Santiago also mentioned, we are using AI across our entire portfolio, and with this we help our customers focus on their business innovation and capitalize on data by enabling new business models. Oracle has global coverage with our cloud regions, and it is massively investing in innovating and expanding its cloud. By offering cloud as public cloud, in our data centers, and also as private cloud with Cloud at Customer, we can meet every sovereignty and security requirement. In this way, we help people to see data in new ways, discover insights, and unlock endless possibilities. And maybe one of my takeaways: when I speak with customers, I always tell them, you'd better start collecting your data now. We enable this, and businesses like Io-Tahoe help us as well.
If you collect your data now, you are ready for tomorrow. You can never collect your data backwards. So that is my takeaway for today. >> You can't collect your data backwards — excellent, Gudron. Gentlemen, thank you for sharing all of your insights; a very informative conversation. All right, this is theCUBE, the leader in live digital tech coverage. (upbeat music)

Published Date : Dec 10 2020



Yusef Khan, Io Tahoe | Enterprise Data Automation


 

>> From around the globe, it's theCUBE, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. Everybody, we're back. We're talking about enterprise data automation. The hashtag is #DataAutomated, and we're going to really dig into data migrations. Data migrations are risky, they're time-consuming, and they're expensive. Yusef Khan is here. He's the head of partnerships and alliances at Io-Tahoe, coming again from London. Hey, good to see you, Yusef. >> Thanks very much. Thank you. >> So your role is interesting. We're talking about data migrations, and you're head of partnerships. What is your role specifically, and how is it relevant to what we're going to talk about today? >> I work with various businesses, such as cloud companies, systems integrators, and companies that sell operating systems and middleware, all of whom are often quite well embedded within a company's IT infrastructure and have existing relationships. Because what we do fundamentally makes migrating to the cloud easier, and data migration easier, a lot of businesses are interested in partnering with us, and we're interested in partnering with them. >> So let's set up the problem a little bit, and then I want to get into some of the data. I said that migrations are risky, time-consuming, expensive. They're oftentimes a blocker for organizations to really get value out of data. Why is that? >> I think all migrations have to start with knowing the facts about your data, and you can try to do this manually. But when you have an organization that may have been going for decades or longer, it will probably have a pretty large legacy data estate: everything from on-premise mainframes, to stuff which is already in the cloud, to hundreds if not thousands of applications and potentially hundreds of different data stores. Now, their understanding of what they have can be limited.
It's often quite limited, because you can try to draw manual maps, but they're outdated very quickly — every time the data changes, the manual map is out of date — and people obviously leave organizations over time, so the tribal knowledge that gets built up is limited as well. So you can try to map that manually: you might need a DBA, a data analyst or a business analyst, and they'll go in and explore the data for you. But doing that manually is very, very time-consuming; it can take teams of people months and months. Or you can use automation, just like Webster Bank did with Io-Tahoe, and they managed to do this with a relatively small team, in a timeframe of days. >> Yeah, we talked to Paul from Webster Bank — awesome discussion. So I want to dig into this migration, and let's pull up a graphic that will show what a typical migration project looks like. So what you see here is very detailed — I know it's a bit of an eye test — but let me call your attention to some of the key aspects, and then, Yusef, I want you to chime in. At the top here, you see that area graph: that's operational risk for a typical migration project, and you can see the timeline and the milestones. That blue bar is the time to test, so you can see the second step, data analysis, taking 24 weeks — very time-consuming. And then, let's not dig into the fine print in the middle, although there's some real good detail there, but go down to the bottom: that's labor intensity, and you can see "high" is that sort of brown. A number of phases — data analysis, data staging, data prep, the trial, the implementation, post-implementation fixes, and the transition to BAU, which is business as usual — are all very labor-intensive. So what are your takeaways from this typical migration project? What do we need to know, Yusef?
>> I mean, I think the key thing is, when you don't understand your data up front, it's very difficult to scope and set up a project. You go to business stakeholders and decision makers and you say, okay, we want to migrate these data stores, we want to put them in the cloud, most often. But actually, you probably don't know how much data is there. You don't necessarily know how many applications it relates to. You don't know the relationships between the data, and you don't know the flow of the data — the direction in which the data is going between different data stores and tables. So you start from a position of pretty high risk, and to alleviate that risk you stack up a project team of lots and lots of people to do the next phase, which is analysis. So you set up a project which has a pretty high cost: a big project, more people, heavy governance, obviously. Then you're in the phase where you're trying to do lots and lots of manual analysis. And manual analysis, as we all know — the idea of trying to relate data that's in different data stores, relating individual tables and columns — is very, very time-consuming and expensive. You might be hiring in resource from consultants or systems integrators externally; you might need to buy or use third-party tools; and, as I said earlier, the people who understand some of those systems may have left a while ago. So you have a quite high-risk, quite high-cost situation from the off, and the same issues persist as you develop through the project. What are we doing at Io-Tahoe? We're able to automate a lot of this process from the very beginning, because we can do the initial data discovery run, for example, automatically. You very quickly have an automated, validated data map, and the data flow is generated automatically — much less time and effort, and dramatically less cost.
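The automated discovery Yusef describes — relating tables and columns across data stores without manual cross-referencing — is often approached by comparing the values columns actually contain. A rough sketch using set overlap (all table and column names are invented; real tools like Io-Tahoe apply far more sophisticated machine learning than this):

```python
def jaccard(a, b):
    """Overlap of two value sets: 1.0 means identical, 0.0 means disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def candidate_joins(tables, threshold=0.5):
    """Flag column pairs across tables whose sampled values overlap strongly.

    `tables` maps "table.column" -> sampled values.
    """
    names = list(tables)
    pairs = []
    for i, left in enumerate(names):
        for right in names[i + 1:]:
            score = jaccard(tables[left], tables[right])
            if score >= threshold:
                pairs.append((left, right, round(score, 2)))
    return pairs

sample = {
    "crm.customer_id": [101, 102, 103, 104],
    "billing.cust_ref": [102, 103, 104, 105],
    "hr.badge_no": [9001, 9002],
}
print(candidate_joins(sample))  # [('crm.customer_id', 'billing.cust_ref', 0.6)]
```

Run over sampled values from every store, this kind of pass yields the automated data map and flow candidates that would otherwise take analysts months to assemble by hand.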
>> Okay, so I want to bring back that first chart, and I want to call your attention again to the area graph, the blue bars, and then, down below that, labor intensity. And now let's bring up the same chart, but with automation injected — so you now see it accelerated by Io-Tahoe. Okay, great, and we're going to talk about this. But look what happens to the operational risk: a dramatic reduction in that graph. Then look at the bars, those blue bars: data analysis went from 24 weeks down to four weeks. And then look at the labor intensity: data analysis, data staging, data prep, trial, post-implementation fixes, and the transition to BAU all went from high labor intensity to low labor intensity. Explain how that magic happened. >> Take the example of a data catalog. Every large enterprise wants to have some kind of repository where they put all their understanding about their data — a data estate catalog, if you like. Imagine trying to do that manually. You need to go into every individual data store; you need a DBA or business analyst for each data store; they need to extract the data tables individually; and they need to cross-reference that with other data stores and schemas and tables. You'd probably end up with the mother of all Excel spreadsheets. It would be a very, very difficult exercise to do. In fact, one of our reflections, as we automate lots of these things, is that automation doesn't just accelerate what enterprises are already doing — in some cases it makes things possible for enterprise customers with legacy systems that otherwise wouldn't be. Take banks, for example. They quite often end up staying on mainframe systems that they've had in place for decades.
Uh, no migrating away from them because they're not able to actually do the work of understanding the data g duplicating the data, deleting data isn't relevant and then confidently going forward to migrate. So they stay where they are with all the attendant problems assistance systems that are out of support. Go back to the data catalog example. Um, whatever you discover invades, discovery has to persist in a tool like a data catalog. And so we automate data catalog books, including Out Way Cannot be others, but we have our own. The only alternative to this kind of automation is to build out this very large project team or business analysts off db A's project managers processed analysts together with data to understand that the process of gathering data is correct. To put it in the repository to validate it except etcetera, we've got into organizations and we've seen them ramp up teams off 2030 people costs off £234 million a year on a time frame, 15 20 years just to try and get a data catalog done. And that's something that we can typically do in a timeframe of months, if not weeks. And the difference is using automation. And if you do what? I've just described it. In this manual situation, you make migrations to the cloud prohibitively expensive. Whatever saving you might make from shutting down your legacy data stores, we'll get eaten up by the cost of doing it. Unless you go with the more automated approach. >>Okay, so the automated approach reduces risk because you're not gonna, you know you're going to stay on project plan. Ideally, it's all these out of scope expectations that come up with the manual processes that kill you in the rework andan that data data catalog. People are afraid that their their family jewels data is not going to make it through to the other side. So So that's something that you're you're addressing and then you're also not boiling the ocean. You're really taking the pieces that are critical and stuff you don't need. 
You don't have to pay for >>process. It's a very good point. I mean, one of the other things that we do and we have specific features to do is to automatically and noise data for a duplication at a rover or record level and redundancy on a column level. So, as you say before you go into a migration process. You can then understand. Actually, this stuff it was replicated. We don't need it quite often. If you put data in the cloud you're paying, obviously, the storage based offer compute time. The more data you have in there that's duplicated, that is pure cost. You should take out before you migrate again if you're trying to do that process of understanding what's duplicated manually off tens or hundreds of bases stores. It was 20 months, if not years. Use machine learning to do that in an automatic way on it's much, much quicker. I mean, there's nothing I say. Well, then, that costs and benefits of guitar. Every organization we work with has a lot of money existing, sunk cost in their I t. So have your piece systems like Oracle or Data Lakes, which they've spent a good time and money investing in. But what we do by enabling them to transition everything to the strategic future repositories, is accelerate the value of that investment and the time to value that investment. So we're trying to help people get value out of their existing investments on data estate, close down the things that they don't need to enable them to go to a kind of brighter, more future well, >>and I think as well, you know, once you're able to and this is a journey, we know that. But once you're able to go live on, you're infusing sort of a data mindset, a data oriented culture. I know it's somewhat buzzword, but when you when you see it in organizations, you know it's really and what happens is you dramatically reduce that and cycle time of going from data to actually insights. 
Data's plentiful, but insights aren't, and that is what's going to drive competitive advantage over the next decade and beyond. >>Yeah, definitely. And you could only really do that if you get your data estate cleaned up in the first place. Um, I worked with the managed teams of data scientists, data engineers, business analysts, people who are pushing out dashboards and trying to build machine learning applications. You know, you know, the biggest frustration for lots of them and the thing that they spend far too much time doing is trying to work out what the right data is on cleaning data, which really you don't want a highly paid thanks to scientists doing with their time. But if you sort out your data stays in the first place, get rid of duplication. If that pans migrate to cloud store, where things are really accessible on its easy to build connections and to use native machine learning tools, you're well on the way up to date the maturity curve on you can start to use some of those more advanced applications. >>You said. What are some of the pre requisites? Maybe the top few that are two or three that I need to understand as a customer to really be successful here? Is it skill sets? Is it is it mindset leadership by in what I absolutely need to have to make this successful? >>Well, I think leadership is obviously key just to set the vision of people with spiky. One of the great things about Ayatollah, though, is you can use your existing staff to do this work. If you've used on automation, platform is no need to hire expensive people. Alright, I was a no code solution. It works out of the box. You just connect to force on your existing stuff can use. It's very intuitive that has these issues. User interface? >>Um, it >>was only to invest vast amounts with large consultants who may well charging the earth. Um, and you already had a bit of an advantage. 
If you've got existing staff who are close to the data subject matter experts or use it because they can very easily learn how to use a tool on, then they can go in and they can write their own data quality rules on. They can really make a contribution from day one, when we are go into organizations on way. Can I? It's one of the great things about the whole experience. Veritas is. We can get tangible results back within the day. Um, usually within an hour or two great ones to say Okay, we started to map relationships. Here's the data map of the data that we've analyzed. Harrison thoughts on where the sensitive data is because it's automated because it's running algorithms stater on. That's what they were really to expect. >>Um, >>and and you know this because you're dealing with the ecosystem. We're entering a new era of data and many organizations to your point, they just don't have the resources to do what Google and Amazon and Facebook and Microsoft did over the past decade To become data dominant trillion dollar market cap companies. Incumbents need to rely on technology companies to bring that automation that machine intelligence to them so they can apply it. They don't want to be AI inventors. They want to apply it to their businesses. So and that's what really was so difficult in the early days of so called big data. You have this just too much complexity out there, and now companies like Iot Tahoe or bringing your tooling and platforms that are allowing companies to really become data driven your your final thoughts. Please use it. >>That's a great point, Dave. In a way, it brings us back to where it began. In terms of partnerships and alliances. I completely agree with a really exciting point where we can take applications like Iot. Uh, we can go into enterprises and help them really leverage the value of these type of machine learning algorithms. 
And and I I we work with all the major cloud providers AWS, Microsoft Azure or Google Cloud Platform, IBM and Red Hat on others, and we we really I think for us. The key thing is that we want to be the best in the world of enterprise data automation. We don't aspire to be a cloud provider or even a workflow provider. But what we want to do is really help customers with their data without automated data functionality in partnership with some of those other businesses so we can leverage the great work they've done in the cloud. The great work they've done on work flows on virtual assistants in other areas. And we help customers leverage those investments as well. But our heart, we really targeted it just being the best, uh, enterprised data automation business in the world. >>Massive opportunities not only for technology companies, but for those organizations that can apply technology for business. Advantage yourself, count. Thanks so much for coming on the Cube. Appreciate. All right. And thank you for watching everybody. We'll be right back right after this short break. >>Yeah, yeah, yeah, yeah.

Published Date : Jun 23 2020


Yusef Khan


 

>> Commentator: From around the globe, it's theCUBE with digital coverage of Enterprise Data Automation. An event series brought to you by Io-Tahoe. >> Hi everybody, we're back, we're talking about Enterprise Data Automation. The hashtag is data automated, and we're going to really dig into data migrations. Data migrations are risky, they're time consuming, and they're expensive. Yusef Khan is here, he's the head of partnerships and alliances at Io-Tahoe, coming again from London. Hey, good to see you, Yusef, thanks very much. >> Thanks, Dave, great to be here. >> So your role is interesting. We're talking about data migrations, you're the head of partnerships, what is your role specifically and how is it relevant to what we're going to talk about today? >> Well, I work with various businesses, such as cloud companies, systems integrators, and companies that sell operating systems and middleware, all of whom are often quite well embedded within a company's IT infrastructure and have existing relationships. Because what we do fundamentally makes migration to the cloud easier and data migration easier, there are lots of businesses that are interested in partnering with us, and that we're interested in partnering with. >> So let's set up the problem a little bit and then I want to get into some of the data. You know, you said that migrations are risky, time consuming, and expensive, and they're oftentimes a blocker for organizations to really get value out of data. Why is that? >> Well, I think all migrations have to start with knowing the facts about your data, and you can try and do this manually, but when you have an organization that may have been going for decades or longer, it will probably have a pretty large legacy data estate. So it'll have everything from on-premise mainframes, it may have stuff which is partly in the cloud, but it probably has hundreds, if not thousands, of applications and potentially hundreds of different data stores.
Now their understanding of what they have is often quite limited, because you can try and draw manual maps, but they're out of date very quickly: every time the data changes, the manual maps go out of date, and people obviously leave organizations all the time, so the tribal knowledge that gets built up is limited as well. So you can try and map all that manually; you might need a DBA, a database analyst, or a business analyst, and they might go in and explore the data for you. But doing that manually is very, very time consuming. It can take teams of people months and months, or you can use automation, just like Webster Bank did with Io-Tahoe, and they managed to do this with a relatively small team in a timeframe of days. >> Yeah, we talked to Paul from Webster Bank, awesome discussion. So I want to dig into this migration, so let's pull up a graphic that we'll talk about: what a typical migration project looks like. What you see here is very detailed, I know, it's a bit of an eye test, but let me call your attention to some of the key aspects of this, and then, Yusef, I want you to chime in. So at the top here you see that area graph; that's operational risk for a typical migration project, and you can see the timeline and the milestones. That blue bar is the time to test, so you can see the second step, data analysis, is taking 24 weeks, so, you know, very time consuming. I won't dig into the stuff in the middle, the fine print, but there's some real good detail there. But go down to the bottom; that's labor intensity at the bottom, and you can see high is that sort of brown, and you can see a number of them: data analysis, data staging, data prep, the trial, the implementation, post-implementation fixes, the transition to BAU, which is Business As Usual. Those are all very labor intensive. So what are your takeaways from this typical migration project? What do we need to know, Yusef?
>> I mean, I think the key thing is, when you don't understand your data upfront, it's very difficult to scope and to set up a project, because you go to business stakeholders and decision makers and you say, "okay, we want to migrate these data stores, we want to put them into the cloud most often", but actually, you probably don't know how much data is there, you don't necessarily know how many applications it relates to, you don't know the relationships between the data, you don't know the flow of the data, so the direction in which the data is going between different data stores and tables. So you start from a position where you have pretty high risk, and to alleviate that risk you probably stack your project team with lots and lots of people to do the next phase, which is analysis, and so you've set up a project which has got a pretty high cost. The bigger the project, the more people, the heavier the governance, obviously, and then you're in the phase where they're trying to do lots and lots of manual analysis. Manual analysis, as we all know, trying to relate data that's in different data stores, relating individual tables and columns, is very, very time consuming, and expensive if you're hiring in resource from consultants or systems integrators externally; you might also need to buy or to use third-party tools. As I said earlier, the people who understand some of those systems may have left a while ago, and so you are in a high-risk, high-cost situation from the off, and the same thing sort of develops through the project. What you find with Io-Tahoe is that we're able to automate a lot of this process from the very beginning, because we can do the initial data discovery run, for example, automatically. So you very quickly have an automated view of the data: a data map and the data flow that have been generated automatically, with much less time and effort and much less cost.
>> Okay, so I'm going to bring back that first chart, and I want to call your attention to, again, that area graph, the blue bars, and then down below that, labor intensity, and now let's bring up the same chart, but with a sort of automation injection in here, so you now see the same project accelerated by Io-Tahoe. Okay, great, we're going to talk about this, but look what happens to the operational risk: a dramatic reduction in that graph. And then look at the bars, those blue bars: you know, data analysis went from 24 weeks down to four weeks. And then look at the labor intensity. All these were high: data analysis, data staging, data prep, trial, post-implementation fixes, and the transition to BAU. All those went from high labor intensity, so we've now attacked that and gone to low labor intensity. Explain how that magic happened. >> Take the example of a data catalog. Every large enterprise wants to have some kind of repository where they put all their understanding about their data estate, a data catalog, if you like. Imagine trying to do that manually: you need to go into every individual data store, you need a DBA and a business analyst for each data store, they need to do an extract of the data, profile tables individually, and cross-reference that with other data stores and schemas and tables. You'd probably end up with the mother of all Excel spreadsheets, and it would be a very, very difficult exercise to do.
In fact, one of our reflections as we automate lots of these things is that it accelerates the ability to migrate, but in some cases it also makes it possible at all for enterprise customers with legacy systems. Take banks, for example: they quite often end up staying on mainframe systems that they've had in place for decades, not migrating away from them because they're not able to actually do the work of understanding the data, deduplicating the data, deleting data that isn't relevant, and then confidently going forward to migrate. So they stay where they are, with all the attendant problems of systems that are out of support. Go back to the data catalog example. Whatever you discover in data discovery has to persist in a tool like a data catalog, and so we automate data catalogs, including our own; we can also feed others, but we have our own. The only alternative to this kind of automation is to build out a very large project team of business analysts, DBAs, project managers, and process analysts to gather all the data, to check that the process of gathering the data is correct, to put it in the repository, to validate it, etcetera, etcetera. We've gone into organizations and we've seen them ramp up teams of 20-30 people, at a cost of 2, 3, 4 million pounds a year, on a timeframe of 15 to 20 years, just to try and get a data catalog done, and that's something that we can typically do in a timeframe of months, if not weeks, and the difference is using automation. And if you do what I've just described in this manual situation, you make migrations to the cloud prohibitively expensive: whatever saving you might make from shutting down your legacy data stores will get eaten up by the cost of doing it, unless you go with a more automated approach.
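As an aside, the per-store cataloging work Yusef describes can be made concrete with a toy sketch. This is not Io-Tahoe's product or API (no technical detail of it is given here); it is a minimal, hypothetical illustration of the kind of metadata harvesting a data catalog automates, using an in-memory SQLite database as a stand-in for one store in a much larger estate:

```python
import sqlite3

def harvest_catalog(conn):
    """Collect basic metadata for every table in one data store:
    the per-store work that a data catalog automates at scale."""
    catalog = {}
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        columns = conn.execute(f"PRAGMA table_info({table})").fetchall()
        row_count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        catalog[table] = {
            "row_count": row_count,
            # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
            "columns": [{"name": c[1], "type": c[2]} for c in columns],
        }
    return catalog

# A tiny demo store; real estates have hundreds of these.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "ann@example.com"), (2, "bob@example.com")])
print(harvest_catalog(conn))
```

Run across hundreds of stores and persisted into a repository, this loop is exactly the step the interview argues is infeasible to do by hand.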
Okay, so the automated approach reduces risk because, ideally, you know, you're going to stay on project plan; it's all these out-of-scope expectations that come up with the manual processes that kill you in the rework. And then that data catalog: people are afraid that their family-jewels data is not going to make it through to the other side. So, that's something that you're addressing, and then you're also not boiling the ocean; you're really taking the pieces that are critical, and the stuff that you don't need, you don't have to pay for as part of this process. >> It's a very good point. I mean, one of the other things that we do, and we have specific features to do, is to automatically analyze data for duplication at a row or record level, and redundancy at a column level. So as you say, before you go into a migration process, you can then understand: actually, this stuff here is duplicated, we don't need it. Quite often, if you put data in the cloud, you're paying obviously for storage space or for compute time, and the more data you have in there that's duplicated, that's pure cost you should take out before you migrate. Again, if you're trying to do that process of understanding what's duplicated manually across tens or hundreds of data stores, it will take you months, if not years; use machine learning to do it in an automated way and it's much, much quicker. I mean, there's also something I'd say about the net cost and benefit of Io-Tahoe: every organization we work with has a lot of money in existing sunk cost in their IT, so they'll have ERP systems like Oracle, or data lakes, which they've spent good time and money investing in. What we do, by enabling them to transition everything to their strategic future repositories, is accelerate the value of that investment and the time to value of that investment.
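The row-level duplication and column-level redundancy analysis just described can be sketched in a few lines. Io-Tahoe's actual approach uses machine learning and fuzzy matching, which is not detailed here; the version below is a deliberately simplified, exact-match stand-in over invented records:

```python
from collections import Counter

# Hypothetical customer records: row 3 duplicates row 2, and
# "name_copy" mirrors "name" column for column.
rows = [
    {"id": 1, "name": "Ann", "name_copy": "Ann", "country": "UK"},
    {"id": 2, "name": "Bob", "name_copy": "Bob", "country": "UK"},
    {"id": 2, "name": "Bob", "name_copy": "Bob", "country": "UK"},
]

def duplicate_rows(rows):
    """Count surplus identical records: pure storage cost on migration."""
    counts = Counter(tuple(sorted(r.items())) for r in rows)
    return sum(n - 1 for n in counts.values())

def redundant_columns(rows):
    """Find column pairs whose values are identical in every record."""
    cols = list(rows[0].keys())
    return {(a, b)
            for a in cols for b in cols
            if a < b and all(r[a] == r[b] for r in rows)}

print(duplicate_rows(rows))      # one surplus record
print(redundant_columns(rows))   # name / name_copy carry the same values
```

Real tooling has to scale this across tens or hundreds of stores and tolerate near-duplicates, which is where the machine learning Yusef mentions comes in.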
So we're trying to help people get value out of their existing investments and data estate, close down the things that they don't need, and enable them to go to a kind of brighter and more present future. >> Well, I think as well, you know, once you're able to, and this is a journey, we know that, but once you're able to go live, you're infusing sort of a data mindset, a data-oriented culture. I know it's somewhat buzzwordy, but when you see it in organizations, you know it's real, and what happens is you dramatically reduce that end-to-end cycle time of going from data to actually getting insights. Data is plentiful, but insights aren't, and that is what's going to drive competitive advantage over the next decade and beyond. >> Yeah, definitely, and you can only really do that if you get your data estate cleaned up in the first place. I've worked with and managed teams of data scientists, big data engineers, and business analysts, people who are pushing out dashboards and trying to build machine learning applications, and the biggest frustration for lots of them, the thing that they spend far too much time doing, is trying to work out what the right data is and cleaning data, which really you don't want a highly paid data scientist doing with their time. But if you sort out your data estate in the first place, get rid of duplication, and perhaps migrate to a cloud store where things are more readily accessible and it's easy to build connections and to use native machine learning tools, you're well on the way up the maturity curve and you can start to use some of those more advanced applications. >> Yusef, what are some of the prerequisites, maybe the top few, two or three, that I need to understand as a customer to really be successful here? I mean, is it skill sets? Is it mindset, leadership buy-in? What do I absolutely need to have to make this successful?
>> Well, I think leadership is obviously key; being able to sort of set the vision for people is obviously key. One of the great things about Io-Tahoe, though, is you can use your existing staff to do this work. If you use our automation platform, there's no need to hire expensive people: Io-Tahoe is a no-code solution, it works out of the box, you just connect to source and then your existing staff can use it. It's very intuitive, with an easy-to-use user interface, so there's no need to invest vast amounts with large consultancies, who may well charge the earth. And you actually have a bit of an advantage if you've got existing staff who are close to the data, who are subject matter experts or users, because they can very easily learn how to use the tool, and then they can go in and write their own data quality rules, and they can really make a contribution from day one. When we go into organizations and we connect, one of the great things about the whole experience with Io-Tahoe is we can get tangible results back within the day, usually within an hour or two. We're able to say, okay, we've started to map the relationships, here's a data map of the data that we've analyzed, and here are some thoughts on where your sensitive data is, because it's automated, because it's running algorithms across the data, and that's what people really should expect. >> And you know this because you're dealing with the ecosystem. We're entering a new era of data, and many organizations, to your point, just don't have the resources to do what Google and Amazon and Facebook and Microsoft did over the past decade to become, you know, data-dominant, trillion-dollar market cap companies. Incumbents need to rely on technology companies to bring that automation, that machine intelligence, to them so they can apply it. They don't want to be AI inventors; they want to apply it to their businesses.
And that's what really was so difficult in the early days of so-called big data: you had just too much complexity out there, and now companies like Io-Tahoe are bringing, you know, tooling and platforms that are allowing companies to really become data driven. Your final thoughts, please, Yusef. >> That's a great point, Dave. In a way, it brings us back to where it began, in terms of partnerships and alliances. I completely agree. We're at a really exciting point where we can take applications like Io-Tahoe, and we can go into enterprises and help them really leverage the value of these types of machine learning algorithms and AI. We work with all the major cloud providers, AWS, Microsoft Azure, Google Cloud Platform, IBM, Red Hat, and others, and I think, for us, the key thing is that we want to be the best in the world at Enterprise Data Automation. We don't aspire to be a cloud provider or even a workflow provider, but what we want to do is really help customers with their data, with our automated data functionality, in partnership with some of those other businesses, so we can leverage the great work they've done in the cloud, the great work they've done on workflows, on virtual assistants, and in other areas, and we help customers leverage those investments as well. But at heart we're really targeted at just being the best enterprise data automation business in the world. >> Massive opportunities not only for technology companies, but for those organizations that can apply technology for business advantage. Yusef Khan, thanks so much for coming on theCUBE. >> Thank you, much appreciated. >> All right, and thank you for watching, everybody. We'll be right back right after this short break. (upbeat music)
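A footnote on one point from the interview: Yusef says subject-matter experts can write their own data quality rules. The rule names, fields, and checks below are invented for illustration (nothing here reflects Io-Tahoe's actual rule syntax); the underlying idea is just a named predicate per rule, applied record by record:

```python
import re

# Hypothetical rules a subject-matter expert might register:
# each rule is a (description, predicate) pair.
RULES = [
    ("email has a valid shape",
     lambda r: re.match(r"[^@]+@[^@]+\.[^@]+", r.get("email", "")) is not None),
    ("customer id is present",
     lambda r: bool(r.get("id"))),
]

def run_rules(rows, rules=RULES):
    """Return, per rule, the indices of the rows that fail it."""
    return {desc: [i for i, row in enumerate(rows) if not check(row)]
            for desc, check in rules}

rows = [
    {"id": 1, "email": "ann@example.com"},
    {"id": None, "email": "not-an-email"},
]
print(run_rules(rows))  # row 1 fails both rules
```

The appeal of this shape is the one Yusef points at: the people closest to the data can add rules without touching the engine that runs them.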

Published Date : Jun 4 2020
