

Santiago Castro, Gudron van der Wal and Yusef Khan | Io-Tahoe Adaptive Data Governance


 

>> Presenter: From around the globe, it's theCUBE. Presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> Our next segment here is an interesting panel, you're going to hear from three gentlemen about Adaptive Data Governance. We're going to talk a lot about that. Please welcome Yusef Khan, the global director of data services for Io-Tahoe. We also have Santiago Castro, the chief data officer at the First Bank of Nigeria, and Gudron van der Wal, Oracle's senior manager of digital transformation and industries. Gentlemen, it's great to have you joining us in this panel. (indistinct) >> All right, Santiago, we're going to start with you. Can you talk to the audience a little bit about the First Bank of Nigeria and its scale? This is beyond Nigeria, talk to us about that. >> Yes. So First Bank of Nigeria was created 125 years ago; it's one of the oldest, if not the oldest, bank in Africa. And because of that history, it grew everywhere in the region, and beyond the region. I'm currently based in London, which is kind of the European headquarters. And it really promotes trade finance, institutional banking, corporate banking and private banking around the world, in particular in relation to Africa. We are also in Asia and in the Middle East. And yes, it's a very active bank in all these regions. >> So Santiago, talk to me about what adaptive data governance means to you, and how it helps the First Bank of Nigeria innovate faster with the data that you have. >> Yes, I like that concept of adaptive data governance, because it's, I would say, an approach that can really happen today with the new technology; before, it was much more difficult to implement. So just to give you a little bit of context, I used to work in consulting for 16-17 years before joining the First Bank of Nigeria. And I saw many organizations trying to apply different types of approaches to data governance.
In the beginning, the early days were really kind of (indistinct), a top-down approach, where data governance was seen as implementing a set of rules, policies and procedures, but really from the top down. And it's important, it's important to have the buy-in of your C-level, of your directors, whatever it is, but on its own that way it fails. You really need to have a complementary approach, I often say bottom-up, and actually, as a CDO, I'm really trying to decentralize data governance, instead of imposing a framework that some people in the business don't understand or don't care about. It really needs to come from them. So what I'm trying to say is that data basically supports business objectives. Every business area needs information for particular decisions to actually be able to be more efficient, create value, et cetera. Now, depending on the business questions they have to solve, they will need certain data sets. So they actually need to be able to have data quality for their own area, 'cause when they understand that, they become the stewards, naturally, of their own data sets. And that is where my bottom-up is meeting my top-down. You can guide them from the top, but they need themselves to also be empowered, and be, in a way, flexible to adapt to the different questions that they have, in order to be able to respond to the business needs. And I think that is where this adaptive data governance starts. Because, if you want, I'll give you an example. In the bank, imagine a Venn diagram. So we have information that is provided to finance, other information to risk, or information for business development. And in this Venn diagram, there are going to be parts of every circle that are going to intersect with each other. So what you want as data governance is to help provide what is in common, and then let them do their own analysis of what is really related to their own area. As an example, nationality.
You will say in a bank, when you open an account, there is the nationality of your customer; that's fine for finance when they want to do a balance sheet, accounting or a P&L. But for risk, they want that type of analysis plus the nationality of exposure, meaning where you are actually exposed as a risk. You can have a customer that is based in the UK, but then trades with Africa, and in Africa they're exposed on their credit. So what I'm trying to say is, they have these pieces in common and pieces that are different. Now I cannot impose a definition for everyone. I need them to adapt and to bring their answers to their own business questions. That is adaptive data governance. And all that is possible because, as I was saying at the very beginning, just to finalize the point, we have new technologies that allow you to do this metadata classification in a very sophisticated way, so that you can actually create analytics of your metadata. You can understand your different data sources, in order to be able to create those classifications, like nationalities, and ways of classifying your customers, your products, et cetera. But you will need to understand which areas need what type of nationality or classification, and which others will need that all the time. And the more you create that understanding, that intelligence about how your people are using your data, you create, in a way, building blocks, like Lego, if you want, where you provide them with those definitions, those catalogs. You understand how they are used, or you let them compose, like Lego. They will play their way to build their analysis. And they will be adaptive. And I think the new technologies are allowing that. And this is a real game changer. I will say that over and over. >> So one of the things that you just said, Santiago, kind of struck me: to enable the users to be adaptive, they probably don't want to be logging a support ticket.
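Santiago's Venn-diagram example, where finance and risk share a core "nationality" definition but each area extends it for its own questions, can be sketched as a tiny data model. This is a hypothetical illustration only; the dictionaries, customer IDs and function names are invented and are not anything First Bank of Nigeria or Io-Tahoe actually runs.

```python
# Hypothetical sketch of the Venn-diagram idea: a shared, centrally
# governed definition that each business area extends on its own.
# All names and data are invented for illustration.

# The shared piece: customer -> legal nationality.
core_nationality = {
    "cust-001": "GB",   # customer based in the UK
    "cust-002": "NG",
}

# Risk extends the shared definition with nationality *of exposure*:
# where the customer's credit is actually exposed.
risk_exposure = {
    "cust-001": ["NG", "KE"],   # UK customer trading with Africa
    "cust-002": ["NG"],
}

def finance_view(cust_id):
    """Finance only needs the legal nationality for the balance sheet."""
    return core_nationality[cust_id]

def risk_view(cust_id):
    """Risk combines the shared core with its own exposure analysis."""
    return {
        "nationality": core_nationality[cust_id],
        "exposure": risk_exposure.get(cust_id, []),
    }

print(finance_view("cust-001"))  # GB
print(risk_view("cust-001"))     # {'nationality': 'GB', 'exposure': ['NG', 'KE']}
```

The point of the shape is that governance owns `core_nationality`, while each area composes its own view on top, rather than one definition being imposed on everyone.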
So how do you support that sort of self-service to meet the demand of the users so that they can be adaptive? >> Yeah, that's a really good question. And that goes along with that type of approach. As I was saying, in a way, more and more business users want autonomy, and they want to basically be able to grab the data and answer their questions. Now, when you have that, that's great, because then you have demand. The business is asking for data. They're asking for the insight. So how do you actually support that? I will say there is a changing culture that is happening more and more. I would say even the current pandemic has helped a lot with that. Of course, technology is one of the biggest winners; without technology we couldn't have been working remotely, without this technology where people can actually log in from their homes and still have data marketplaces where they self-serve their information. But even beyond that, data is a big winner. Data, because the pandemic has shown us that crises happen, that we cannot predict everything, and that we are actually facing a new kind of situation, out of our comfort zone, where we need to explore, we need to adapt and we need to be flexible. How do we do that? With data. As a good example of this, every country, every government, is publishing data every day on what's happening in their countries with COVID and the pandemic, so they can understand how to react, because this is new. So you need facts in order to learn and adapt. Now, companies are the same. Every single company either saw revenue going down, or revenue going very much up for those companies that are very digital already. It changed the reality. So they needed to adapt, but for that they needed information in order to think and innovate and try to create responses.
So that type of self-service of data, (indistinct) for data, in order to be able to understand what's happening when the context is changing, is something that is becoming more of the topic today, because of the pandemic and because of the new capabilities of technologies that allow it. And then, you are allowed to basically help your data citizens, as I call them in the organization: people that know their business and can actually start playing and answering their own questions. So these technologies that give more accessibility to the data, that give some cataloging so we can understand where to go, or where to find lineage and relationships. All this is basically the new type of platform or tool that allows you to create what I call a data marketplace. So once you create that marketplace, they can play with it. And I was talking about new culture, and I'm going to finish with that idea. I think these new tools are really strong because they are now allowing people that are not technology or IT people to be able to play with data, because in the digital world they are used to it. I'll give you an example with Io-Tahoe, where you have a very interesting search functionality. When you want to find your data and you want to self-serve, you go there in that search, and you actually go and look for your data. Everybody knows how to search in Google, everybody searches the internet. So this is part of the data culture, the digital culture; they know how to use those tools. Now similarly, in that data marketplace in Io-Tahoe, you can for example see which data sources are most used. So when I'm doing an analysis, I see that peers in my area are also using these sources, so I trust those sources. It's a little bit like Amazon, when it suggests what to buy next; again, this is the digital kind of culture that people will very easily understand. Similarly, you can actually 'like' some types of data sets that are working; that's Facebook.
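The "most used data sources" signal Santiago describes, where an analyst trusts a source because peers in their area also use it, amounts to ranking sources by access frequency. A minimal sketch with an invented access log; the source and analyst names are hypothetical, not anything from Io-Tahoe's actual product:

```python
# Rank data sources by how often peers access them, so analysts can
# see (and trust) the popular ones. Log entries are invented.
from collections import Counter

access_log = [
    ("trades_db", "analyst_a"),
    ("trades_db", "analyst_b"),
    ("crm_export", "analyst_a"),
    ("trades_db", "analyst_c"),
]

def most_used_sources(log, top_n=3):
    """Count accesses per source and return the most popular ones."""
    counts = Counter(source for source, _ in log)
    return counts.most_common(top_n)

print(most_used_sources(access_log))
# [('trades_db', 3), ('crm_export', 1)]
```

The same counting shape underpins the Amazon-style "peers also used" suggestion he mentions: popularity is just aggregated usage metadata.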
So what I'm trying to say is, you have some very easy, user-friendly technologies that allow you to understand how to interact with them. And then, with the type of digital knowledge that you have, you're able to self-serve, play, collaborate with your peers, collaborate on the data-quality analysis. So it's really enabling, very easily, that transition to become data savvy without actually needing too much knowledge of IT, or coding, et cetera, et cetera. And I think that is a game changer as well. >> And enabling that speed that we're all demanding today during these unprecedented times. Gudron, I wanted to go to you. As we talk about, in the spirit of evolution, technology's changing. Talk to us a little bit about Oracle Digital. What are you guys doing there? >> Yeah, thank you. Well, Oracle Digital is a business unit at Oracle EMEA. And we focus on emerging countries, as well as low-end enterprises in the mid-market in more developed countries. And four years ago, we started with the idea to engage digitally with our customers via central hubs across EMEA. That means engaging with video, having conference calls, having a wall, a green wall, where we stand in front and engage with our customers. No one at that time could have foreseen the situation today. And this helps us to engage with our customers in the way we're already doing. And then about my team. The focus of my team is to have early-stage conversations with our customers on digital transformation and innovation. And we also have a team of industry experts who engage with our customers and share expertise across EMEA. And we inspire our customers. The outcome of these conversations, for Oracle, is a deep understanding of our customer needs, which is very important, so we can help the customer. And for the customer it means that we will help them with our technology and our resources to achieve their goals. >> It's all about outcomes. Right, Gudron?
So in terms of automation, what are some of the things Oracle is doing there to help your clients leverage automation to improve agility, so that they can innovate faster? Which, in these interesting times, is in demand. >> Yeah. Thank you. Well, traditionally, Oracle is known for its databases, which have been innovated year over year since the first launch. And the latest innovation is the autonomous database and autonomous data warehouse. For our customers, this means a reduction in operational costs by 90%, with a multimodel converged database, and machine-learning-based automation for full lifecycle management. Our database is self-driving. This means we automate database provisioning, tuning and scaling. The database is self-securing. This means ultimate data protection and security. And it's self-repairing: the ultimate failure detection, failover and repair. And the question is, for our customers, what does it mean? It means they can focus on their business instead of maintaining their infrastructure and their operations. >> That's absolutely critical. Yusef, I want to go over to you now. Some of the things that we've talked about: just the massive progression in technology, the evolution of that. But we know that whether we're talking about data management or digital transformation, a one-size-fits-all approach doesn't work to address the challenges that the business has, that the IT folks have. As you are looking at the industry, with what Santiago told us about First Bank of Nigeria, what are some of the changes that Io-Tahoe has seen throughout the industry? >> Well, Lisa, I think the first way I'd characterize it is to say the traditional kind of top-down approach to data, where you have almost a data policeman who tells you what you can and cannot do, just doesn't work anymore. It's too slow, it's too resource intensive. Data management, data governance, digital transformation itself, it has to be collaborative.
And there has to be an element of personalization today for users. In the environment we find ourselves in now, it has to be about enabling self-service as well. A one-size-fits-all model, when it comes to those things around data, doesn't work. As Santiago was saying, it needs to be adaptive to how the data is used and who is using it. And in order to do this, companies, enterprises, organizations really need to know their data. They need to understand what data they hold, where it is, and what the sensitivity of it is. They can then, in a more agile way, apply appropriate controls and access, so that people themselves, and groups within businesses, are agile and can innovate. Otherwise, everything grinds to a halt, and you risk falling behind your competitors. >> Yeah, a one-size-fits-all term doesn't apply when you're talking about adaptive and agility. So we heard from Santiago about some of the impact that they're making with First Bank of Nigeria. Yusef, talk to us about some of the business outcomes that you're seeing other customers make, leveraging automation, that they could not do before. >> I guess one of the key ones is around just automatically being able to classify terabytes of data, or even petabytes of data, across different sources to find duplicates, which you can then remediate and delete. Now, with the capabilities that Io-Tahoe offers, and Oracle offers, you can do things not just with a five-times or 10-times improvement, but it actually enables you to do projects, full stop, that otherwise would fail, or you would just not be able to do. Classifying multi-terabyte and multi-petabyte estates across different sources and formats, very large volumes of data: in many scenarios, you just can't do that manually. We've worked with government departments. And the issues there, as you'd expect, are the result of fragmented data. There's a lot of different sources, there's a lot of different formats.
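The duplicate-finding that Yusef describes can be sketched in miniature: fingerprint each record after normalization so that identical content from different sources collides. Real products like Io-Tahoe apply machine learning at terabyte and petabyte scale; this toy version, with invented records and function names, only shows the basic idea:

```python
# Toy sketch of cross-source duplicate detection: normalize each
# record, hash it, and group records whose fingerprints collide.
# Records and names are invented for illustration.
import hashlib

def fingerprint(record):
    """Normalize a record and hash it so identical content collides."""
    normalized = "|".join(
        f"{k}={str(v).strip().lower()}" for k, v in sorted(record.items())
    )
    return hashlib.sha256(normalized.encode()).hexdigest()

def find_duplicates(records):
    """Group records by fingerprint; groups larger than one are duplicates."""
    seen = {}
    for rec in records:
        seen.setdefault(fingerprint(rec), []).append(rec)
    return [group for group in seen.values() if len(group) > 1]

# Two invented sources holding overlapping customer records.
source_a = [{"name": "Ada Obi ", "country": "NG"}]
source_b = [{"name": "ada obi", "country": "ng"},
            {"name": "John Kay", "country": "GB"}]

dupes = find_duplicates(source_a + source_b)
print(len(dupes))  # 1 duplicate group found across the two sources
```

Exact-hash matching like this breaks down on near-duplicates (typos, different formats), which is where the ML-based classification Yusef mentions earns its keep.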
And without these newer technologies to address it, with automation and machine learning, the project isn't doable. But now it is. And that could lead to a revolution in some of these businesses and organizations. >> To enable that revolution now, there's got to be the right cultural mindset. When Santiago was talking about really kind of adopting that, I always call that getting comfortably uncomfortable. But that's hard for organizations to do. The technology is here to enable that. But when you're talking with customers, how do you help them build the trust and the confidence that the new technologies and new approaches can deliver what they need? How do you help drive that kind of change in the culture? >> It's a really good question, because it can be quite scary. I think the first thing we'd start with is to say, look, the technology is here, with businesses like Io-Tahoe, and like Oracle, it's already arrived. What you need to be comfortable doing is experimenting, being agile around it and trying new ways of doing things, if you don't want to get left behind. And Santiago and the team at FBN are a great example of embracing it, testing it on a small scale and then scaling up. At Io-Tahoe we offer what we call a data health check, which can actually be done very quickly, in a matter of a few weeks. So we'll work with the customer, pick a use case, install the application, analyze data, drive out some quick wins. We worked, in the last few weeks, with a large energy supplier. And in about 20 days, we were able to give them an accurate understanding of their critical data elements, help them apply data protection policies, minimize copies of the data, and work out what data they needed to delete to reduce their infrastructure spend. So it's about experimenting on that small scale, being agile, and then scaling up in a kind of very modern way. >> Great advice. Santiago, I'd like to go back to you.
As we kind of look at, again, that topic of culture, and the need to get that mindset there to facilitate these rapid changes, I want to understand, kind of a last question for you, how you're doing that. From a digital transformation perspective, we know everything is accelerating in 2020. So how are you building resilience into your data architecture, and also driving that cultural change that can help everyone in this shift to remote working and a lot of the digital challenges that we're all going through? >> That's a really interesting transition, I would say. I was mentioning, just going back to some of the points before to transition to this, I said that the new technologies allowed us to discover the data in a new way, to look at and see information very quickly, to have new models of (indistinct) data, we are talking about data (indistinct), and giving autonomy to our different data units. Well, from that autonomy, they can then compose and innovate in their own ways. So for me, now we're talking about resilience, because, in a way, autonomy and flexibility in our organization, in our data structure and platform, gives you resilience. The organizations and the business units that, in my experience, are working well in the pandemic are those where, because people are not physically present anymore in the office, you give them their autonomy and let them actually engage on their own side and do their own job, and trust them, in a way. And as you give them that, they start innovating, and they start having really interesting ideas. So autonomy and flexibility, I think, is a key component of the new infrastructure, where we get to the new reality the pandemic shows us: that yes, we used to be very much about structure, policies, procedures, and they're important, but now we learn flexibility and adaptability at the same time.
Now, when you have that, another key component of resilience is speed, of course. People want to access the data, and access it fast, and decide fast, especially as things are changing so quickly nowadays that you need to be able to interact and iterate with your information to answer your questions quickly. And coming back maybe to what Yusef was saying, I completely agree: it's about experimenting and iterating. You will not get it right the first time, especially as the world is changing too fast, and we don't have answers already set for everything. So we need to just go play and have ideas, fail, fail fast, and then learn, and then go for the next. So, technology that allows you to be flexible, iterate, and continue in a very fast, agile way will allow you to actually be resilient, because you're flexible, you adapt, you are agile and you continue answering questions as they come, without having everything set in a structure that is too hard. Now, coming back to your idea about the culture: it is changing, in employees and in customers. Our employees, our customers are more and more digitally savvy. And in a way the pandemic has accelerated that. We had many branches of the bank that people used to go to to ask for things; now they cannot go. Here in Europe, with the lockdown, you physically cannot be going to the branches, and the shops have been closed. So they had to use our mobile apps, and they have to go onto internet banking, which is great, because that was the acceleration we wanted. Similarly, our employees needed to work remotely, so they needed to engage with a digital platform. Now, what that means, and this is, I think, the really strong point of the cultural change for resilience, is that more and more we have two types of connectivity that are happening with data. And I call it employees connecting to data.
The second, as we're talking about, is employees connecting with each other, the collaboration that Yusef was talking about, which is allowing people to share ideas, learn and innovate. Because the more you have platforms where people can actually find themselves and play with the data, the more they can bring ideas to the analysis. And then there are employees actually connecting to algorithms. And this is the other part that is really interesting. We are also a partner of Oracle. And Oracle (indistinct) is great; they have embedded, within the transactional system, many algorithms that are allowing us to calculate as the transactions happen. What happens there is that our employees engage with algorithms, and again, with Io-Tahoe as well, the machine learning that is there for speeding up the automation of how you find your data allows you to create an alliance with the machine. The machine is there to actually, in a way, be your best friend, to have more volume of data calculated faster, in a way, to discover more variety. And we couldn't cope without being connected to these algorithms. And then, finally, the last connection I was mentioning is the customers themselves connecting with the data. As I was saying, they're more and more engaging with our app and our website, and they're digitally self-serving. The expectation of the customer has changed. I work in a bank, where the industry is completely challenged. You used to have people going to a branch; as I was saying, not only can they not go there, but they're even going from branch to digital, to now even wanting to have business services actually in every single app that they are using. So the data becomes a service for them. With the data, they want to see how they spend their money, and the data of their transactions will tell them whether their spending is going well with their lifestyle. For example, we talk about a normal, healthy person.
I want to see that I'm exercising, eating good food, and in the right, healthy environment, where I'm mentally engaged. Now, all this is metadata: knowing how to classify your data according to my values, my lifestyle. And it's algorithms, coming back to my three connections, it's the algorithms that allow me to very quickly analyze that metadata, and actually my stack in the background, creating that understanding of the customer journey to give them the service that they expect on a digital channel, which is actually allowing them to understand how they are engaging with financial services. >> Engagement is absolutely critical, Santiago. Thank you for sharing that. I do want to wrap really quickly. Gudron, one last question for you. Santiago talked about Oracle, and you've talked about it a little bit. As we look at digital resilience, talk to us a little bit, in the last minute, about the evolution of Oracle, what you guys are doing there to help your customers get the resilience that they have to have to not just survive, but thrive. >> Yeah. Well, Oracle has a cloud offering for infrastructure, database, platform services, and complete solutions offered as SaaS. And as Santiago also mentioned, we are using AI across our entire portfolio, and by this we help our customers to focus on their business innovation and capitalize on data by enabling new business models. And Oracle has global coverage with our cloud regions. It's massively investing in innovating and expanding its cloud. And by offering cloud as public cloud in our data centers, and also as private cloud with Cloud at Customer, we can meet every sovereignty and security requirement. In this way, we help people to see data in new ways, discover insights and unlock endless possibilities. And maybe one of my takeaways is, if I speak with customers, I always tell them: you'd better start collecting your data now. We enable this, and partners like Io-Tahoe help us as well.
If you collect your data now, you are ready for tomorrow. You can never collect your data backwards. So that is my takeaway for today. >> You can't collect your data backwards. Excellent, Gudron. Gentlemen, thank you for sharing all of your insights, a very informative conversation. All right. This is theCUBE, the leader in live digital tech coverage. (upbeat music)

Published Date : Dec 10 2020



Santiago Aldana, Avianca | Adobe Summit 2019


 

>> Live from Las Vegas it's theCUBE covering Adobe Summit 2019. Brought to you by Accenture Interactive. >> Welcome back everybody, Jeff Frick here with theCUBE. We're in Vegas at the Adobe Summit 2019. I think there's about 17,000 people. The first time we've been here but we've been excited to be here. There's a crazy good buzz and energy and actually a ton of CUBE alumni here at Adobe. We're greeting old friends but it's always great to meet new friends. And coming off of great mention in the keynote this morning we're excited to have Santiago Aldana. He is the CDO and CTO of Avianca. Welcome. >> Thank you, John. >> So I was surprised this morning, we were watching the keynote and there's Satya Nadella and he has a shout-out for you guys. >> It was quite a surprise. It was very engaging and I'm happy to hear that. >> Yeah, congratulations. >> Thank you. >> And a little fact, you guys are the second oldest commercial airline, he said. I had no idea. >> That's right, we've been flying for almost 100 years. It's our 100th anniversary this year. >> Awesome. >> So great, great. >> Well, congratulations. >> Thank you. >> So air travel's a really interesting industry because it's growing like crazy in terms of the total number of passenger miles, right? More people are flying all the time. But it's got to be super competitive. You got to worry about fuel costs. A seat mile is a seat mile. So there's all kinds of interesting ways to compete. You guys are really into it, you've been successful for 100 years, so how do you differentiate what you guys are doing and continue to evolve and be successful? >> So there's several things. If you look 10 years back we used to be a domestic airline. We used to have around 30 planes, now we're around 170 planes. We're the second largest airline in Latin America. That has been a huge growth. >> Wait, how long did you do that? >> That's for the last 10 years. >> 10 years you went from 30 planes to 170. >> To 170, 180. 
>> And domestic to international. >> To a Latin American airline. >> That's a big move. >> That's a big move but we're shifting our emphasis, going more, rather than growth, going to profitability. And to make that profitability we have to make the strong transformation to make that happen. >> So for profitability there's all kinds of things that go in there, there's higher utilization, there's hopefully everybody buys Teslas so the gas doesn't cost as much for the airplanes. How are you focusing on profitability? 'Cause here we're at Adobe, all the talk's about experience, experience, experience. If I'm flying on your plane, I want to get a good deal and keep everything good but I'm not necessarily that worried about your profitability. >> So let me tell you a little bit about that. If you think about an airline we're just the distance between our customers and their dreams. We're just the distance. So the customer doesn't want to go to security. The customer doesn't want to go to the whole hassle of planning the trip. Our purpose is reducing that distance, reducing that effort, and when we reduce that effort we're going to self-service, we're going to personalize, to make life easier for our customers. That's the basic challenge. And that has to do with three main areas. One, knowing our customer. The other one is, making sure that the value proposal for that customer journey is proper. So that's operational work. And the other one is providing our employees with enough information to make that happen. All of those are working along data, and data to be able to provide a real value proposal to making that happen. The customer has to be in the center of our strategy and that's where we have to be working all the time. And when you do this, it's not about technology, it's about the customer. And being that, about the customer, the strongest challenge is not technology but people, making people change so we that can provide the value proposals to our customers. 
>> So what are some of the things that you did to enable the experience of my engagement, whether it's electronic or whether it's when I'm talking to that person at the counter, checking in or getting on my flight, how have you helped them provide me a better experience? >> You talk as if it was part of the past. To be honest, it's a journey, we're still working on that. There are several things that we did last year, a whole bunch of things. We changed our app, we changed our website, we changed our interaction with our customers with data. And regarding Adobe, we're here at Adobe, we implemented a whole set of tools. So AEM, the website is a new thing. Regarding Microsoft we implemented a CRM to know about our customers. We changed our app, and the app is like a platform with which we're transforming the customer journey. What we have to do at the end of the game is changing those touch points so that those require less effort from our customer, they're more seamless and we are able to personalize and know in advance what the customer is looking for to provide alternatives. And that makes it more seamless. So we're in that process of doing data-centered decision making to reduce that effort from our customers and make things happen. >> So as you've gone through this journey to date what are some of the surprising things that came up that you just didn't expect at all, on a positive side? And then what were some of the negative things that you didn't know, that were so negative that now you've kind of removed? >> Okay, so I've been here in this business and Avianca just for the last two years, so if you talk about surprises this is my first time in airlines. I wasn't expecting this to be so challenging. >> Well, it's good to come at it from a fresh point of view, absolutely. >> I've been in banking, I've been in telcos. 
Believe me, there's a huge amount of technical debt, there's a lot of complexity, and bringing this customer information up to the table has been challenging. Lots of things going on together. Surprises? Yeah, we have to work with our employees. We have to transform that culture. We have to move towards more testing, having experiments, iterating and learning from that process. And that takes time, and that requires a lot of culture change. The other one is being more agile, and that's more easily said than done. So making the teams be more collaborative. And working with partners. We decided to choose a handful of partners to make this transformation work. And those partners, it's not something that you just plug and play; you have to make it work, and that requires a lot of effort. Even if they're big, strong, world-class sponsors and partners, it requires engaging them and making them work together. At the end it's about people in every part. And making people work together, that's a challenge. That's a challenge. >> You've got the whole gamut too, 'cause you've got the front line people that are directly engaged with the client, whether again it be the gate agent or on the telephone or processing those things, all the way back to the senior management and the operations, which I'm sure are not only regulated but very, very finely detailed for safety and everything else. So that's got to be hard, to try to drive transformation in what was probably a pretty rigid situation. >> It is, and that's why you have to choose what to do. And probably you don't know how to do it at the beginning, but you know what you want to achieve. And that requires a more iterative way of learning, experimenting and finding a way. That's regarding agility. And that's where you work with partners, to also leapfrog and move forward faster on this. That's where we chose partners like Accenture, Adobe, Microsoft, SAP, and Amadeus. And they're moving us forward on that.
>> So what are some of the ways that you're trying to measure success? What are some of the things you're tracking as you go through this transformation? >> Well, several of them. Let me talk about just a couple. One of the things that we have to do is make the buying process easier. We're starting way behind, with strong technical debt, and we have to decouple our systems to make the steps our customer has to take fewer and easier, changing the whole booking flow. But to do that, we don't have the answer up front. First we have to decouple the legacy systems, and then we have to learn from our customers. We have to do a lot of A/B testing to see what works better. Test and see if the process is better accepted by our customers, learn from that, probably fail, do it again, iterate and do it again. That's the process we have to engage in. So that's one of the areas. But the other one is, how can we make sure that the operational value proposition takes place? We have been growing so much for the last 10 years, we went from a local airline to the second biggest airline in Latin America, but that growth was a little bit disorganized, and we have to set things up to make it work. We have to provide a lot more data and connectivity to all our employees at the airports, at the counters, at the call center, providing them with more customer information to make it happen. >> Right, so you're on that process, so you're starting to deliver new data to the gate agents and the people on the front line? >> Sure. >> So how are they reacting to that? Do they like to be empowered, are they afraid of being empowered, are they saying, "Ah, finally I have the information in front of me so I can take care of this traveler"? >> So there's not one answer for that. In some cases we empower them and they enjoy it a lot and they say, "Hey, finally we got this." For example, we are giving our ...
This is a recent project that we launched at the airport. We're providing them data through mobility, making the turnaround of our planes faster, and we're giving them much more data. Before, they had to call everywhere to find out what was happening. Now they have it at their fingertips, and that's different. So that changes the whole thing, and they look forward to that. At other times, we sometimes make mistakes too. We provided more information through the apps to our pilots. They were finding that awesome. But then some of the information that they used to have wasn't included. So we had to iterate, add it back, and then they started loving it. Regarding our customers, which is the other side, not internal employees, we do some things that we test, and sometimes they say, "Oh, that was not what we were expecting." So we have to learn from that. I mean, it's not about running a huge waterfall project. It's about learning in the process, failing, and iterating and making it happen again and again. It's a whole journey. >> We just had our last guest, he talked about trying to move this stuff to the cloud. It's like, first time didn't work, second time didn't work, third time, hey, now it's working. So you don't know until you know, right? And what we hear over and over is, as you start this top-level transformation project you uncover a bunch of stuff under the covers that has to be reworked to support what you're trying to do on the front end. >> That's right. >> I assume it's a lot of the same thing that you found? >> You're exactly right, there are a lot of things that way, on all three areas. On customer, we didn't have customer information, we didn't even have a CRM. So we implemented our CRM at a huge pace; in a year we already had it. The app and the website, we had to totally remake them, getting more information from that and getting personalized information out of it.
That's technical debt, and I was not expecting it to be there. >> So I'm just curious, what was the catalyst of this transformation and this growth? Were you trying to put in systems to support the growth that you had, going from a relatively small domestic airline to an international one, or are you trying to set the table for continued growth, to continue on that growth path? That's a pretty aggressive growth path. >> It's a little bit simpler than that, and I'm going to be blunt here. Three years ago the board at Avianca was doing a search for a new CEO. That's my boss right now. He came over three years ago. He used to be the president for Microsoft in Latin America. In the interviews they told him a lot of things. And after he was questioned and doing the interview, he said, "Okay, let me ask this now. Are you asking me to make Avianca a digital company flying airplanes?" And they said, "Yeah, that's exactly right. That's what we want." So that was the initial phase. That was three years ago. I joined the team two years ago. There was already a vision, and that vision is making things easier and effortless for the customer. That's part of what we're trying to build. And that is before, during, and after the trip. If we are able to do that, we're reducing costs, we're making it simpler. The whole process is about being simpler, taking away complexity, making sure that our operations are better. You can do that through technology too. But again, the biggest challenge is probably not technology. It's a cultural change, and it's the leadership required to move on and make our employees and our customers take advantage of it. >> Bold move by the board and a bold move by the CEO, but we hear it all the time. Everybody's a digital company now, it's just what product or service do you happen to wrap it around? So what a great story. >> Thank you. And yeah, again, we've got to go more data-centered, we have to know our customer better.
If we want to do something personalized, the only way is through the data. We have to know in advance what our customers are requesting and try to make it easier for all of them, and that's the data. >> Well Santiago, thanks for sharing your story. And again, congratulations on the keynote shout-out. >> Thank you, thanks a lot. >> All right. He's Santiago, I'm Jeff, you're watching theCUBE. We're at Adobe Summit 2019 in Las Vegas. Thanks for watching, we'll see you next time. (lively electronic music)

Published Date : Mar 28 2019


Luke Hinds, Red Hat | KubeCon + CloudNativeCon NA 2021


 

>> Welcome to this CUBE conversation. I'm Dave Nicholson, and we're having this conversation in advance of KubeCon + CloudNativeCon North America 2021. We are going to be talking specifically about a subject near and dear to my heart, and that is security. We have a very special guest from Red Hat, the security lead from the Office of the CTO, Luke Hinds. Welcome to theCUBE, Luke. >> Oh, it's great to be here. Thank you, David. Really looking forward to this conversation. >> So you have a session at KubeCon + CloudNativeCon this year. And frankly, I look at the title and, based on everything that's going on in the world today, I'm going to accuse you of clickbait, because the title of your session is "A Secure Supply Chain Vision." What other than supply chains is in the news today, with all of these things going on? But you're talking about the software supply chain, aren't you? Tell us about this vision, where it came from. Fill us in. >> Yes, very much. So I do agree, it is a bit of a buzzword at the moment, and there is a lot of attention. It is the hot topic, secure supply chains, thanks to things such as the executive order. And we're starting to see an increase in attacks as well. There's a recent statistic that came out: I believe a 620% increase since last year in supply chain attacks involving the open source ecosystem. So things are certainly ramping up. So there is a bit of clickbait, you got me there. So, supply chains. Let's consider what a supply chain is, and we'll do this within the context of cloud native technology, because there are many, many different software supply chains. But if we look at a cloud native one, predominantly it's a mix of people and machines. So you'll have your developers, and they will write code.
They will change code, and they'll typically use a code revision control system like Git, so they'll make their changes there, then push those changes up to some sort of repository, typically GitHub or GitLab, something like that. Then another human will engage and review the code; somebody that's perhaps a maintainer will look at the code and approve it. And at the same time, the machines start to get involved. So you have your build servers that run tests and integration tests, and they check the code is linted correctly. And then you have this chain of events that starts to happen, these machines, these various actors, that start to play their parts in the chain. So your build system might generate a container image, which is a very common thing within a cloud native supply chain.
And then that image is typically deployed to production, or it's hosted on a container registry, and then somebody else might utilize that container image because it has software that you've packaged within that container. And then there's this prolific expansion of the use of open source, where people start to rely on other software projects for their own dependencies within their code. And you've got this kind of big spaghetti of actors that are dependent on each other and feed from each other. And then eventually that is deployed into production. These machines, a lot of them, run on open source code. Even if there is a commercial vendor that manages that as a service, it's all based predominantly on open source code. And the security aspect of the supply chain is that there are many junctures where you can exploit that supply chain. So you can exploit the human, or you could be a nefarious human in the first place, you could steal somebody's identity. And then there are the build systems themselves, where they generate these artifacts and they run jobs.
And then there are the production systems, which pull these down. And then there's the element we touched upon around libraries and dependencies. So if you look at a lot of projects, they will have approximately a hundred, perhaps 500 dependencies that they pull in from. So then you have the supply chains within each one of those; they've got their own set of humans and machines. And so it's a very large spaghetti beast of dependencies and actors and various identities that make it up. >> Yeah, you're describing a nightmarish scenario here, so I definitely appreciate the setup there. It's a chain of custody nightmare. >> Yes, but it's also a wonderful thing, because it's allowed us to develop in the paradigms that we have now very fast. You can prototype and design and build and ship very fast thanks to these tools. So they're wonderful; it's not to say that there isn't a gift there, but security has arguably been left as a bit of an afterthought, essentially. So security is always at the back of the race, it's always trying to catch up. See what I mean? >> Well, is there a specific reason why this is particularly timely? When we talk about deployment of cloud native applications, something like 75% of what we think of as IT is still on premises, but definitely moving in the direction of what we loosely call cloud. Why is this particularly timely? >> I think really because of the rampant adoption that we see. As you rightly say, a lot of IT companies are still running on more of a legacy model, where deployments are more monolithic and static. I mean, we've both been around for a while; when we started, somebody would rack a server, plug in a network cable, and you'd spend a week deploying the app, getting it to run, and then you'd walk away and leave it, to a degree. Whereas now, obviously, that's really been turned on its head. So there is an element of not everybody having adopted this new development paradigm, but it is increasing, there is rapid adoption here. And many rather haven't made that change yet, to migrate to a sort of cloud-type infrastructure.
Um, and then I also, I assume that it makes a lot of sense for the open source community to attack this problem, because you're talking about so many things in that chain of custody that you described where one individual private enterprise is not likely to be able to come up with something that handles all of it. So, so what's your, what's your vision for how we address this issue? I know I've seen in, um, uh, some of the content that you've produced an allusion to this idea that it's very similar to the concept of a secure HTTP. And, uh, and so, you know, imagine a world where HTTP is not secure at any time. It's something we can't imagine yet. We're living in this parallel world where, where code, which is one of the four CS and cloud security, uh, isn't secure. So what do we do about that? And, and, and as you share that with us, I want to dive in as much as we can on six store explain exactly what that is and, uh, how you came up with this. >>Yes, yes. So, so the HTTP story's incredibly apt for where we are. So around the open source ecosystem. Okay. We are at the HTTP stage. Okay. So a majority of code is pulled in on trusted. I'm not talking about so much here, somebody like a red hat or, or a large sort of distributor that has their own sign-in infrastructure, but more sort of in the, kind of the wide open source ecosystem. Okay. The, um, amount of code that's pulled in on tested is it's the majority. Okay. So, so it is like going to a website, which is HTTP. Okay. And we sort of use this as a vision related to six store and other projects that are operating in this space where what happened effectively was it was very common for sites to run on HTTP. So even the likes of Amazon and some of the e-commerce giants, they used to run on HTTP. >>Okay. And obviously they were some of the first to, to, uh, deploy TLS and to utilize TLS, but many sites got left behind. Okay. Because it was cumbersome to get the TLS certificate. 
I remember doing this myself, you would have to sort of, you'd have to generate some keys, the certificate signing request, you'd have to work out how to run open SSL. Okay. You would then go to an, uh, a commercial entity and you'd probably have to scan your passport and send it to them. And there'll be this kind of back and forth. Then you'll have to learn how to configure it on your machine. And it was cumbersome. Okay. So a majority just didn't bother. They just, you know, they continue to run their, their websites on protected. What effectively happened was let's encrypt came along. Okay. And they disrupted that whole paradigm okay. >>Where they made it free and easy to generate, procure, and set up TLS certificates. So what happened then was there was a, a very large change that the kind of the zeitgeists changed around TLS and the expectations of TLS. So it became common that most sites would run HTTPS. So that allowed the browsers to sort of ring fence effectively and start to have controls where if you're not running HTTPS, as it stands today, as it is today is kind of socially unacceptable to run a site on HTTP is a bit kind of, if you go to HTTP site, it feels a bit, yeah. You know, it's kind of, am I going to catch a virus here? It's kind of, it's not accepted anymore, you know, and, and it needed that disruptor to make that happen. So we want to kind of replicate that sort of change and movement and perception around software signing where a lot of software and code is, is not signed. And the reason it's not signed is because of the tools. It's the same story. Again, they're incredibly cumbersome to use. And the adoption is very poor as well. >>So SIG stores specifically, where did this, where did this come from? And, uh, and, uh, what's your vision for the future with six? >>Sure. So six door, six doors, a lockdown project. Okay. It started last year, July, 2020 approximately. And, uh, a few people have been looking at secure supply chain. Okay. 
Around that time, we really started to look at it. So there was various people looking at this. So it's been speaking to people, um, various people at Purdue university in Google and, and other, other sort of people trying to address this space. And I'd had this idea kicking around for quite a while about a transparency log. Okay. Now transparency logs are actually, we're going back to HTTPS again. They're heavily utilized there. Okay. So when somebody signs a HTTPS certificate as a root CA, that's captured in this thing called a transparency log. Okay. And a transparency log is effectively what we call an immutable tamper proof ledger. Okay. So it's, it's kind of like a blockchain, but it's different. >>Okay. And I had this idea of what, if we could leverage this technology okay. For secure supply chain so that we could capture the provenance of code and artifacts and containers, all of these actions, these actors that I described at the beginning in the supply chain, could we utilize that to provide a tamper resistant publicly or DePaul record of the supply chain? Okay. So I worked on a prototype wherever, uh, you know, some, uh, a week or two and got something basic happening. And it was a kind of a typical open source story there. So I wouldn't feel right to take all of the glory here. It was a bit like, kind of, you look at Linux when he created a Linux itself, Linus, Torvalds, he had an idea and he shared it out and then others started to jump in and collaborate. So it's a similar thing. >>I, um, shared it with an engineer from Google's open source security team called Dan Lawrence. Somebody that I know of been prolific in this space as well. And he said, I'd love to contribute to this, you know, so can I work this? And I was like, yeah, sure though, you know, the, the more, the better. And then there was also Santiago professor from Purdue university took an interest. So a small group of people started to work on this technology. 
So we built this project that's called Rico, and that was effectively the transparency log. So we started to approach projects to see if they would like to, to utilize this technology. Okay. And then we realized there was another problem. Okay. Which was, we now have a storage for signed artifacts. Okay. A signed record, a Providence record, but nobody's signing anything. So how are we going to get people to sign things so that we can then leverage this transparency log to fulfill its purpose of providing a public record? >>So then we had to look at the signing tools. Okay. So that's where we came up with this really sort of clever technology where we've managed to create something called ephemeral keys. Okay. So we're talking about a cryptographic key pair here. Okay. And what we could do we found was that we could utilize other technologies so that somebody wouldn't have to manage the private key and they could generate keys almost point and click. So it was an incredibly simple user experience. So then we realized, okay, now we've got an approach for getting people to sign things. And we've also got this immutable, publicly audited for record of people signing code and containers and artifacts. And that was the birth of six store. Then. So six store was created as this umbrella project of all of these different tools that were catering towards adoption of signing. And then being able to provide guarantees and protections by having this transparency log, this sort of blockchain type technology. So that was where we really sort of hit the killer application there. And things started to really lift off. And the adoption started to really gather steam then. >>So where are we now? And where does this go into the future? One of the, one of the wonderful things about the open source community is there's a sense of freedom in the creativity of coming up with a vision and then collaborating with others. Eventually you run headlong into expectations. 
So look, is this going to be available for purchase in Q1? What's the, >>Yeah, I, I will, uh, I will fill you in there. Okay. So, so with six door there's, um, there's several different models that are at play. Okay. I'll give you the, the two predominant ones. So one, we plan, we plan to run a public service. Okay. So this will be under the Linux foundation and it'll be very similar to let's encrypt. So you as a developer, if you want to sign your container, okay. And you want to use six door tooling that will be available to you. There'll be non-profit three to use. There's no specialties for anybody. It's, it's there for everybody to use. Okay. And that's to get everybody doing the right thing in signing things. Okay. The, the other model for six stories, this can be run behind a firewall as well. So an enterprise can stand up their own six store infrastructure. >>Okay. So the transparency log or code signing certificates, system, client tools, and then they can sign their own artifacts and secure, better materials, all of these sorts of things and have their own tamper-proof record of everything that's happened. So that if anything, untoward happens such as a key compromise or somebody's identity stolen, then you've got a credible source of truth because you've got that immutable record then. So we're seeing, um, adoption around both models. We've seen a lot of open source projects starting to utilize six store. So predominantly key, um, Kubernetes is a key one to mention here they are now using six store to sign and verify their release images. Okay. And, uh, there's many other open-source projects that are looking to leverage this as well. Okay. And then at the same time, various people are starting to consider six door as being a, sort of an enterprise signing solution. So within red hat, our expectations are that we're going to leverage this in open shift. So open shift customers who wish to sign their images. Okay. 
Uh, they want to sign their conflicts that they're using to deploy within Kubernetes and OpenShift. Rather they can start to leverage this technology as open shift customers. So we're looking to help the open source ecosystem here and also dog food, this, and make it available and useful to our own customers at red hat. >>Fantastic. You know, um, I noticed the red hat in the background and, uh, and, uh, you know, I just a little little historical note, um, red hat has been there from the beginning of cloud before, before cloud was cloud before there was anything credible from an enterprise perspective in cloud. Uh, I, I remember in the early two thousands, uh, doing work with tree AWS and, uh, there was a team of red hat folks who would work through the night to do kernel level changes for the, you know, for the Linux that was being used at the time. Uh, and so a lot of, a lot of what you and your collaborators do often falls into the category of, uh, toiling in obscurity, uh, to a certain degree. Uh, we hope to shine light on the amazing work that you're doing. And, um, and I, for one appreciate it, uh, I've uh, I've, I've suffered things like identity theft and, you know, we've all had brushes with experiences where compromise insecurity is not a good thing. So, um, this has been a very interesting conversation. And again, X for the work that you do, uh, do you have any other, do you have any other final thoughts or, or, uh, you know, points that we didn't cover on this subject that come to mind, >>There is something that you touched upon that I'd like to illustrate. Okay. You mentioned that, you know, identity theft and these things, well, the supply chain, this is critical infrastructure. Okay. So I like to think of this as you know, there's, sir, they're serving, you know, they're solving technical challenges and, you know, and the kind of that aspect of software development, but with the supply chain, we rely on these systems. 
When we wake up each morning, we rely on them to stay in touch with our loved ones. You know, we are our emergency services, our military, our police force, they rely on these supply chains, you know, so I sort of see this as there's a, there's a bigger vision here really in protecting the supply chain is, is for the good of our society, because, you know, a supply chain attack can go very much to the heart of our society. You know, it can, it can be an attack against our democracies. So I, you know, I see this as being something that's, there's a humanistic aspect to this as well. So that really gets me fired up to work on this technology., >>it's really important that we always keep that perspective. This isn't just about folks who will be attending CubeCon and, uh, uh, uh, cloud con uh, this is really something that's relevant to all of us. So, so with that, uh, fantastic conversation, Luke, it's been a pleasure to meet you. Pleasure to talk to you, David. I look forward to, uh, hanging out in person at some point, whatever that gets me. Uh, so with that, uh, we will sign off from this cube conversation in anticipation of cloud con cube con 2021, north America. I'm Dave Nicholson. Thanks for joining us.

Published Date : Oct 14 2021



IO TAHOE EPISODE 4 DATA GOVERNANCE V2


 

>> Presenter: From around the globe, it's theCUBE. Presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> And we're back with the Data Automation series. In this episode, we're going to learn more about what Io-Tahoe is doing in the field of adaptive data governance, how it can help achieve business outcomes, and how it can mitigate data security risks. I'm Lisa Martin, and I'm joined by Ajay Vohora, the CEO of Io-Tahoe, and Lester Waters, the CTO of Io-Tahoe. Gentlemen, it's great to have you on the program. >> Thank you, Lisa. It's good to be back. >> Likewise, very socially distant, of course, as we are. Lester, we're going to start with you. What's going on at Io-Tahoe? What's new? >> Well, I've been with Io-Tahoe for a little over a year, and one thing I've learned is that every customer's needs are just a bit different. So we've been working on the next major release of the Io-Tahoe product to really address these customer concerns, because we want to be flexible enough to come in and not just profile the data, and not just understand data quality and lineage, but also to address the unique needs of each and every customer that we have. And that required a platform rewrite of our product, so that we could extend the product without building a new version of it. We wanted to have pluggable modules. We also focused a lot on performance. That's very important with the bulk of data that we deal with, that we're able to pass through that data in a single pass and do the analytics that are needed, whether it's lineage, data quality, or just identifying the underlying data. And we're incorporating all that we've learned. We're tuning up our machine learning, we're analyzing on more dimensions than we've ever done before, and we're able to do data quality without doing an initial regex, for example, just out of the box.
So I think it's all of these things coming together to form the next version of our product. We're really excited by it. >> So it's exciting. Ajay, from the CEO's level, what's going on? >> Well, just building on what Lester mentioned there: we're growing pretty quickly with our partners, and today, here with Oracle, we're excited to explain how that's shaping up. There's lots of collaboration already with Oracle in government, in insurance, and in banking, and we're excited because we get to have an impact. It's really satisfying to see how we're able to help businesses transform and redefine what's possible with their data, and having Oracle there as a partner to lean in with is definitely helping. >> Excellent. We're going to dig into that a little bit later. Lester, let's go back over to you. Explain adaptive data governance; help us understand that. >> Really, adaptive data governance is about achieving business outcomes through automation. It's also about establishing a data-driven culture and pushing what's traditionally managed in IT out to the business. And to do that, you've got to enable an environment where people can actually access and look at the information about the data, not necessarily access the underlying data, because we've got privacy concerns, of course. But they need to understand what kind of data they have, what shape it's in, and what's dependent on it upstream and downstream, so that they can make educated decisions on what they need to do to achieve those business outcomes. A lot of frameworks these days are hardwired, so you can set up a set of business rules, and that set of business rules works for a very specific database and a specific schema.
But imagine a world where you could just say, you know, the start date of a loan must always be before the end date of a loan, and have that generic rule applied regardless of the underlying database, even when a new database comes online. That's what adaptive data governance is about. I like to think of it as the intersection of three circles, really: the technical metadata, coming together with policies and rules, coming together with the business ontologies that are unique to that particular business. Bringing all of this together allows you to enable rapid change in your environment. So it's a mouthful, adaptive data governance, but that's what it comes down to. >> So, Ajay, help me understand this. Is this what enterprise companies are doing now? Are they not quite there yet? >> Well, you know, Lisa, I think every organization is going at its own pace. But markets are changing, and the speed at which some of the changes in the economy are happening is compelling more businesses to look at being more digital in how they serve their own customers. So what we're seeing is a number of trends here, from heads of data and chief data officers stepping back from a one-size-fits-all approach, because they've tried that before and it just hasn't worked. They've spent millions of dollars on IT programs trying to drive value from their data, and they've ended up with large teams manually processing data to try to hardwire these policies to fit the context of each line of business, and that hasn't worked. So the trends that we're seeing emerge really relate to: how do I, as a chief data officer, inject more automation into a lot of these common tasks? And we've been able to see that impact. I think the news here is, if you're trying to create a knowledge graph, a data catalog, or a business glossary,
and you're trying to do that manually, well, stop. You don't have to do that manually anymore. The best example I can give is this: Lester and I, we like Chinese food and Japanese food, and if you were sitting there with your chopsticks, you wouldn't eat the bowl of rice one grain at a time. What you'd want to do is find a more productive way to enjoy that meal before it gets cold. And that's similar to how we're able to help organizations digest their data: get through it faster, and enjoy the benefits of putting that data to work. >> And if it was me eating that food with you guys, I would not be using chopsticks; I would be using a fork and probably a spoon. So, Lester, how does Io-Tahoe go about doing this and enabling customers to achieve it? >> Let me show you a little story I have here. If you take a look at the challenges most customers have, they're very similar, but every customer is on a different data journey. It all starts with: what data do I have? What shape is that data in? How is it structured? What's dependent on it upstream and downstream? What insights can I derive from that data? And how can I answer all of those questions automatically? So if you look at the challenges for these data professionals, they're either on a journey to the cloud, maybe they're doing a migration to Oracle, maybe they're making some data governance changes, and it's about enabling all of this. So with these challenges in mind, I'm going to take you through a story here. I want to introduce Amanda. Amanda lives like anyone in any large organization. She's looking around and she just sees stacks of data: different databases, the ones she knows about, the ones she doesn't know about, the ones she should know about, various different kinds of databases. And Amanda is tasked with understanding all of this, so that she can embark on her data journey program.
So Amanda goes through and says, great, I've got some handy tools; I can start looking at these databases and getting an idea of what we've got. Well, as she digs into the databases, she starts to see that not everything is as clear as she might have hoped. Property names, or column names, have ambiguous names like attribute1 and attribute2, or maybe date1 and date2. So Amanda is starting to struggle. Even though she's got tools to visualize and look at these databases, she knows she's got a long road ahead, and with 2,000 databases in her large enterprise, it's going to be a long journey. But Amanda's smart, so she pulls out her trusty spreadsheet to track all of her findings, and for what she doesn't know about, she raises a ticket or tries to track down the owner to find out what the data means, and she tracks all of that information too. Clearly, this doesn't scale well for Amanda. So maybe the organization will get ten Amandas, to divide and conquer the work, but even that doesn't work well, because there are still ambiguities in the data. With Io-Tahoe, what we do is actually profile the underlying data. By looking at the underlying data, we can quickly see that attribute1 looks very much like a US Social Security number, and attribute2 looks like an ICD-10 medical code. We do this by using ontologies, dictionaries, and algorithms to help identify the underlying data and then tag it. Key to this automation is being able to normalize things across different databases, so that where there are differences in column names, I know that they in fact contain the same data. And by going through this exercise with Io-Tahoe, not only can we identify the data, but we can also gain insights about the data.
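The profiling step Lester describes can be sketched in miniature as follows. The column values, the regex, the tag name, and the match threshold here are all illustrative assumptions, not Io-Tahoe's actual implementation:

```python
import re

# Hypothetical sample of a column whose name ("attribute1") says nothing useful.
attribute1 = ["123-45-6789", "987-65-4321", "123456789x", "321-54-9876"]

# Toy pattern for a formatted US Social Security number.
SSN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def tag_column(values, pattern, tag, threshold=0.9):
    """Tag a column when most values match a known pattern, and flag the
    non-matching values as candidate data-quality issues."""
    matches = sum(1 for v in values if pattern.match(v))
    if matches / len(values) >= threshold:
        return tag, [v for v in values if not pattern.match(v)]
    return None, []

tag, suspects = tag_column(attribute1, SSN, "us_ssn", threshold=0.7)
print(tag, suspects)  # → us_ssn ['123456789x']
```

The point of the threshold is exactly the 97%-versus-3% split Lester goes on to describe: a mostly-matching column earns the tag, and the stragglers surface as data-quality suspects rather than silently disappearing.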
So, for example, we can see that 97% of the time, that column named attribute1 that's got US Social Security numbers in it has something that looks like a Social Security number, but 3% of the time it doesn't quite look right. Maybe there's a dash missing, maybe there's a digit dropped, or maybe there are even characters embedded in it. That may be indicative of a data quality issue, so we try to find those kinds of things. Going a step further, we also try to identify data quality relationships. So, for example, we have two columns, date1 and date2. Through observation, we can see that date1, 99% of the time, is less than date2; 1% of the time it's not, which is probably indicative of a data quality issue. Going a step further still, we can build a business rule that says date1 must be less than date2, and then when the issue pops up again, we can quickly identify and remediate the problem. So these are the kinds of things that we can do with Io-Tahoe. Going even a step further, you can take your favorite data science solution, productionize it, and incorporate it into our next version as what we call a worker process, to do your own bespoke analytics. >> Bespoke analytics, excellent. Lester, thank you. So Ajay, talk us through some examples of where you're putting this to use, and also some of the feedback from customers. >> I think it will help bring this to life a little bit, Lisa, to talk through a case study we pulled together; I know it's available for download. A well-known telecommunications and media company had a lot of the issues that Lester spoke about: lots of teams of Amandas, super bright data practitioners, looking to get more productivity out of their day and to deliver a good result for their own customers, cell phone subscribers and broadband users.
So some of the examples that we can see here are how we went about auto-generating a lot of that understanding of the data within hours. Amanda had her data catalog populated automatically and a business glossary built up on it, and could really then start to see: okay, where do I want to apply some policies to the data, to set in place some controls? Where do we want to adapt how different lines of business, maybe tax versus customer operations, have different access or permissions to that data? What we've been able to do there is build up that picture, to see how data moves across the entire organization, across the estate, and monitor that over time for improvement. So we've taken it from being reactive, let's do something to fix something, to now being more proactive: we can see what's happening with our data, who's using it, who's accessing it, how it's being used, how it's being combined. And from there, taking a proactive approach is a really smart use of the talents in that telco organization, and of the folks that work there with data. >> Okay, Ajay, dig into that a little bit deeper. One of the things I was thinking about when you were talking through some of those outcomes that you're helping customers achieve is ROI. How do customers measure ROI? What are they seeing with Io-Tahoe's solution? >> Yeah, right now the big-ticket item is time to value, and in data, a lot of the upfront investment has been quite expensive to date with a lot of the larger vendors and technologies. So what a CDO and an economic buyer really need to be certain of is how quickly they can get that ROI. I think we've got something we can show, just pulling up a before and after, and it really comes down to hours, days, and weeks where we've been able to have that impact. And this playbook that we pulled together, the before-and-after picture, really shows it.
You know, those savings come through getting data into some actionable form within hours and days, to drive agility, but at the same time being able to enforce the controls that protect the use of that data and who has access to it. So that's the number one thing I'd have to say: it's time. And we can see that on the graphic that we've just pulled up here. >> We talk about achieving adaptive data governance, and Lester, you guys talk about automation, you talk about machine learning. How are you seeing those technologies being a facilitator of organizations adopting adaptive data governance? >> Well, as we see it, the days of manual effort are over. I think this is a multi-step process, but the very first step is understanding what you have and normalizing that across your data estate. So you couple this with the ontologies that are unique to your business, there are the algorithms, and you basically go across and identify and tag that data. That allows the next steps to happen. So now I can write business rules not in terms of named columns, but in terms of the tags. Being able to automate that is a huge time-saver, and the fact that we can suggest it as a rule, rather than waiting for a person to come along and say, "Oh wow, okay, I need this rule," these are steps that increase the ROI, or I should say decrease the time to value, that Ajay talked about. And then lastly, couple that with machine learning, because even with great automation, being able to profile all of your data and get a good understanding brings you to a certain point, but there are still ambiguities in the data. So, for example, I might have two columns, date1 and date2. I may even have observed that date1 should be less than date2, but I don't really know what date1 and date2 are, other than dates.
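The tag-based rules Lester mentions can be sketched like this; the tag names, the `Column` type, and the rule-suggestion function are hypothetical illustrations, not Io-Tahoe's actual API:

```python
from dataclasses import dataclass

@dataclass
class Column:
    name: str
    tags: set  # tags assigned by the profiling step, e.g. {"loan_start"}

def columns_for_tag(schema, tag):
    """Find every column carrying a given tag, whatever it happens to be named."""
    return [c for c in schema if tag in c.tags]

def suggest_rules(schema):
    """Suggest a generic ordering rule whenever both tags are present,
    regardless of what the underlying columns are called."""
    rules = []
    if columns_for_tag(schema, "loan_start") and columns_for_tag(schema, "loan_end"):
        rules.append("loan_start < loan_end")
    return rules

# Two databases with different column names but the same tags:
db_a = [Column("date1", {"loan_start"}), Column("date2", {"loan_end"})]
db_b = [Column("start_dt", {"loan_start"}), Column("end_dt", {"loan_end"})]

print(suggest_rules(db_a))  # ['loan_start < loan_end']
print(suggest_rules(db_b))  # ['loan_start < loan_end']
```

Because the rule is written against tags rather than column names, the same rule fires on both schemas, which is the automation and time-saving Lester is describing.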
So this is where machine learning comes in. I might ask the user, "Can you help me identify what date1 and date2 are in this table?" Turns out they're a start date and an end date for a loan. That gets remembered and cycled into the machine learning, so if I start to see this pattern of date1 and date2 elsewhere, I'm going to ask: is it a start date and an end date? Bringing all of these things together, with all this automation, is really what's key to enabling this adaptive data governance. >> Great, thanks. Lester and Ajay, I want to wrap things up with something that you mentioned in the beginning, about what you guys are doing with Oracle. Take us out by telling us what you're doing there. How are you working together? >> Yeah, I think those of us who have worked in IT for many years have learned to trust Oracle's technology, and they're shifting now to a hybrid on-prem, Gen 2 cloud platform, which is exciting, and their existing customers and new customers are moving to Oracle on a journey. So Oracle came to us and said, you know, we can see how quickly you're able to help us change mindsets, mindsets that are locked into a way of thinking around operating models of IT that may be non-agile and siloed, and that want to break free of that and adopt a more agile, API-driven approach. A lot of the work that we're doing with Oracle now is around accelerating what customers can do with understanding their data, and building digital apps by identifying the underlying data that has value. And being able to do that in hours, days, and weeks, rather than many months, is opening the eyes of chief data officers and CEOs to say: well, maybe we can do this whole digital transformation this year. Maybe we can bring that forward and transform who we are as a company. That's driving innovation, which we're excited about,
and I know Oracle is keen to drive that through as well. >> Helping businesses transform digitally is so incredibly important in this time, as we look to things changing in 2021. Ajay, Lester, thank you so much for joining me on this segment, explaining adaptive data governance, how organizations can use it, benefit from it, and achieve ROI. Thanks so much, guys. >> Thank you. >> Thanks again, Lisa. >> In a moment, we'll look at adaptive data governance in banking. This is theCUBE, your global leader in high-tech coverage. >> Innovation, impact, influence. Welcome to theCUBE. Disruptors, developers, and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on theCUBE, your global leader in high-tech digital coverage. >> Our next segment here is an interesting panel. You're going to hear from three gentlemen about adaptive data governance; we're going to talk a lot about that. Please welcome Yusef Khan, the global director of data services for Io-Tahoe. We also have Santiago Castor, the chief data officer at the First Bank of Nigeria, and Gudron Van Der Wal, Oracle's senior manager of digital transformation and industries. Gentlemen, it's great to have you joining us in this panel. (indistinct) >> All right, Santiago, we're going to start with you. Can you talk to the audience a little bit about the First Bank of Nigeria and its scale? This is beyond Nigeria, talk to us about that. >> Yes. So First Bank of Nigeria was created 125 years ago, it's one of the oldest, if not the oldest bank in Africa. And because of the history, it grew, everywhere in the region, and beyond the region. I'm currently based in London, where it's kind of the European headquarters. And it really promotes trade finance, institutional banking, corporate banking, private banking around the world, in particular in relationship to Africa. We are also in Asia, and in the Middle East.
>> So Santiago, talk to me about what adaptive data governance means to you, and how does it help the First Bank of Nigeria to be able to innovate faster with the data that you have. >> Yes, I like that concept of adaptive data governance, because it's kind of, I would say, an approach that can really happen today with the new technology; before, it was much more difficult to implement. So just to give you a little bit of context, I used to work in consulting for 16-17 years before joining the First Bank of Nigeria. And I saw many organizations trying to apply different types of approaches in data governance. In the early days it was really a top-down approach, where data governance was seen as implementing a set of rules, policies, and procedures, really from the top down. And that's important; it's important to have the backing of your C-level, of your directors. But I saw the way it fails. You really need to have a complementary approach, you could say bottom-up. As a CDO, I'm really trying to decentralize the governance: instead of imposing a framework that some people in the business don't understand or don't care about, it really needs to come from them. So what I'm trying to say is that data basically supports business objectives, and every business area needs information to make decisions, to actually be more efficient, or to create value, etcetera. Now, depending on the business questions they have to solve, they will need certain data sets, so they actually need to be able to have data quality for their own purposes. And when they understand that, they naturally become the stewards of their own data sets. And that is where my bottom-up is meeting my top-down.
You can guide them from the top, but they need to be empowered themselves, and, in a way, flexible to adapt to the different questions they have, in order to be able to respond to the business needs. I cannot impose one definition for everyone; I need them to adapt and to bring their own answers to their own business questions. That is adaptive data governance. And all of that is possible, as I was saying at the very beginning, just to finalize the point, because we have new technologies that allow you to do this: metadata classification done in a very sophisticated way, so that you can actually create analytics over your metadata. You can understand your different data sources in order to be able to create classifications like nationalities, a way of classifying your customers, your products, etcetera. >> So one of the things that you just said, Santiago, kind of struck me: to enable the users to be adaptive, they probably don't want to be logging support tickets. So how do you support that sort of self-service, to meet the demand of the users, so that they can be adaptive? >> More and more, business users want autonomy, and they want to basically be able to grab the data and answer their own questions. When you have that, it's great, because then you have demand from the business, asking for data, asking for insight. So how do you actually support that? I would say there is a changing culture that is happening more and more, and I would say even the current pandemic has helped a lot with that, because, of course, technology is one of the biggest winners. Without technology we couldn't have been working remotely, without these technologies where people can actually log in from their homes and still have data marketplaces where they self-serve their information. But even beyond that, data is a big winner.
The pandemic has shown us that crises happen, that we cannot predict everything, and that we are actually facing new kinds of situations, out of our comfort zone, where we need to explore, we need to adapt, and we need to be flexible. How do we do that with data? Every single company either saw its revenue go down, or saw its revenue go way up, for those companies that are already very digital. The reality changed, so they needed to adapt, and for that they needed information, in order to think, to innovate, and to try to create responses. So that kind of self-service of data, that hunger for data, in order to understand what's happening when the landscape is changing, is something that is becoming more of a topic today, because of the pandemic, and because of the new capabilities, the technologies, that allow it. And then you are able to basically help your data citizens, as I call them, people in the organization who know their business and can actually start playing with data and answering their own questions. So these technologies give more accessibility to the data: there's cataloging, so they can understand where to go or what to find, lineage, and relationships. All of this is basically the new type of platform and tools that allow you to create what's called a data marketplace. I think these new tools are really strong, because they now allow people who are not technology or IT people to play with data, because it comes in the digital form they're used to. To give an example: with Io-Tahoe you have a very interesting search functionality, where if you want to find your data, you self-serve; you go to that search and you actually go and look for your data. Everybody knows how to search in Google; everybody searches the Internet. So this is part of the data culture, the digital culture; they know how to use those tools.
Now, similarly, in that data marketplace you can, for example, see which data sources are the most used. >> And enabling that speed that we're all demanding today during these unprecedented times. Gudron, I wanted to go to you. As we talk about, in the spirit of evolution, technology is changing. Talk to us a little bit about Oracle Digital. What are you guys doing there? >> Yeah, thank you. Well, Oracle Digital is a business unit within Oracle EMEA. We focus on emerging countries, as well as on enterprises in the mid-market in more developed countries. Four years ago, this started with the idea to engage digitally with our customers, from central hubs across EMEA. That means engaging with video, having conference calls, having a wall, a green wall, where we stand in front and engage with our customers. No one at that time could have foreseen the situation we're in today, and this helps us to engage with our customers in the way we were already doing. And then, about my team: the focus of my team is to have early-stage conversations with our customers on digital transformation and innovation, and we also have a team of industry experts who engage with our customers and share expertise across EMEA, and we inspire our customers. The outcome of these conversations, for Oracle, is a deep understanding of our customer needs, which is very important, so we can help the customer. And for the customer, it means that we will help them with our technology and our resources to achieve their goals. >> It's all about outcomes, right, Gudron? So in terms of automation, what are some of the things Oracle is doing to help your clients leverage automation to improve agility, so that they can innovate faster, which in these interesting times is demanded? >> Yeah, thank you. Well, traditionally, Oracle is known for its databases, which have been innovated year over year.
So the latest innovation is the Autonomous Database and the Autonomous Data Warehouse. For our customers, this means a reduction in operational costs by 90%, with a multi-model converged database and machine-learning-based automation for full lifecycle management. Our database is self-driving: this means we automate database provisioning, tuning, and scaling. The database is self-securing: this means ultimate data protection and security. And it's self-repairing: it automates failure detection, failover, and repair. And then the question is, for our customers, what does this mean? It means they can focus on their business, instead of maintaining their infrastructure and their operations. >> That's absolutely critical. Yusef, I want to go over to you now. Some of the things that we've talked about: the massive progression in technology, the evolution of that. But we know that whether we're talking about data management or digital transformation, a one-size-fits-all approach doesn't work to address the challenges that the business has, that the IT folks have. As you're looking across the industry, with what Santiago told us about the First Bank of Nigeria, what are some of the changes that Io-Tahoe is seeing throughout the industry? >> Well, Lisa, I think the first way I'd characterize it is to say the traditional kind of top-down approach to data, where you have almost a data policeman who tells you what you can and can't do, just doesn't work anymore. It's too slow, and it's too resource-intensive. Data management, data governance, digital transformation itself: it has to be collaborative, and there has to be a personalization to data users. In the environment we find ourselves in now, it has to be about enabling self-service as well. A one-size-fits-all model, when it comes to those things around data, doesn't work. As Santiago was saying, it needs to be adapted to how the data is used,
and to who is using it. In order to do this, companies, enterprises, organizations really need to know their data. They need to understand what data they hold, where it is, and what its sensitivity is. They can then, in a more agile way, apply appropriate controls and access, so that people themselves, and groups within businesses, can get on and innovate. Otherwise, everything grinds to a halt, and you risk falling behind your competitors. >> Yeah, that one-size-fits-all term just doesn't apply when you're talking about adaptive and agility. So we heard from Santiago about some of the impact that they're making with the First Bank of Nigeria. Yusef, talk to us about some of the business outcomes that you're seeing other customers achieve, leveraging automation, that they could not achieve before. >> It's automatically being able to classify terabytes, or even petabytes, of data across different sources, to find duplicates, which you can then remediate and delete. Now, with the capabilities that Io-Tahoe and Oracle offer, you can do things not just with a five-times or ten-times improvement; it actually enables you to do projects, full stop, that would otherwise fail, or that you would just not be able to do. I mean, classifying multi-terabyte and multi-petabyte estates across different sources and formats, at very large volumes of data: in many scenarios, you just can't do that manually. We've worked with government departments, and the issues there are, as you'd expect, the result of fragmented data. There are a lot of different sources, there are a lot of different formats, and without these newer technologies to address that with automation and machine learning, the project isn't doable. But now it is, and that can lead to a revolution in some of these businesses and organizations. >> To enable that revolution, there's got to be the right cultural mindset. And one of the things Santiago was talking about is folks really kind of adopting that.
It's the thing I always call getting comfortably uncomfortable, and that's hard for organizations to do. The technology is here to enable it. But when you're talking with customers, Yusef, how do you help them build the trust and the confidence that the new technologies and new approaches can deliver what they need? How do you help drive that kind of change in the culture? >> It's a really good question, because it can be quite scary. I think the first thing we'd start with is to say: look, the technology is here, with businesses like Io-Tahoe and Oracle; it's already arrived. What you need to be comfortable doing is experimenting, being agile around it, and trying new ways of doing things, if you don't want to get left behind. Santiago and the team at FBN are a great example of embracing it, testing it on a small scale, and then scaling up. At Io-Tahoe, we offer what we call a data health check, which can actually be done very quickly, in a matter of a few weeks. So we'll work with a customer, pick a use case, install the application, analyze the data, and drive out some quick wins. For example, we worked in the last few weeks with a large energy supplier, and in about 20 days we were able to give them an accurate understanding of their critical data elements, help apply data protection policies, minimize copies of the data, and work out what data they needed to delete to reduce their infrastructure spend. So it's about experimenting on that small scale, being agile, and then scaling up in a very modern way. >> Great advice. Santiago, I'd like to go back to you as we look again at that topic of culture and the need to get that mindset in place to facilitate these rapid changes. I want to understand, as kind of a last question for you, how you're doing that from a digital transformation perspective. We know everything is accelerating in 2020.
So how are you building resilience into your data architecture, and also driving that cultural change that can help everyone in this shift to remote working and all of the digital challenges and changes that we're going through? >> The new technologies allow us to discover the data, find and see information very quickly, have new models of governing the data, and give autonomy to our different data units. From that autonomy, they can then compose and innovate in their own ways. So for me, now we're talking about resilience, because in a way, autonomy and flexibility in an organization, in a data structure, in a platform, give you resilience. The organizations and business units that I have seen working well in the pandemic are those that, because people are not physically present in the office, give them their autonomy, let them actually engage and do their own job, and trust them. And as you give them that, they start innovating, and they start having really interesting ideas. So autonomy and flexibility, I think, are key components of the new infrastructure, and indeed of the new reality. It shows us that, yes, we used to be very structured; policies and procedures were very important. But now we learn flexibility and adaptability at the same time. Then, when you have that, another key component of resilience is speed, because people want to access the data fast, and act on it fast. Things are changing so quickly nowadays that you need to be able to interact and iterate with your information to answer your questions.
So technology that allows you to be flexible, iterating in a very fast and continuous way, will allow you to actually be resilient, because you are flexible, you adapt your job, and you continue answering questions as they come, without having everything set in a structure that is too rigid. We are also a partner of Oracle, and what Oracle embeds is great: they have embedded within the transactional system many algorithms that allow us to calculate as the transactions happen. What happens there is that our customers engage with those algorithms. And again, with Io-Tahoe as well, the machine learning that is there for speeding up the automation of how you find your data allows you to create a new alliance with the machine. The machine is there, in a way, to be your best friend: to have more volume of data calculated faster, and to cover more variety. I mean, we couldn't cope without being connected to these algorithms. >> That engagement is absolutely critical, Santiago. Thank you for sharing that. I do want to wrap really quickly. Gudron, one last question for you. Santiago talked about Oracle; you've talked about it a little bit. As we look at digital resilience, talk to us a little bit, in the last minute, about the evolution of Oracle. What are you guys doing there to help your customers get the resilience that they have to have, to not just survive but thrive? >> Yeah. Oracle has a cloud offering for infrastructure, database, platform services, and complete solutions offered as SaaS. As Santiago also mentioned, we are using AI across our entire portfolio, and by this we help our customers to focus on their business innovation and capitalize on data by enabling new business models. And Oracle has global coverage with our cloud regions; it's massively investing in innovating and expanding those clouds.
And by offering our cloud as public cloud, in our data centers, and also as private cloud with Cloud@Customer, we can meet every sovereignty and security requirement. In this way, we help people to see data in new ways, discover insights, and unlock endless possibilities. And maybe one of my takeaways, when I speak with customers, is that I always tell them: you'd better start collecting your data now. We enable this, and partners like Io-Tahoe help us as well. If you collect your data now, you are ready for tomorrow. You can never collect your data backwards. So that is my takeaway for today. >> You can't collect your data backwards. Excellently put, Gudron. Gentlemen, thank you for sharing all of your insights. Very informative conversation. In a moment, we'll address the question: do you know your data? >> Are you interested in test-driving the Io-Tahoe platform? Kick-start the benefits of data automation for your business through the Io-Tahoe Data Health Check program: a flexible, scalable sandbox environment on the cloud of your choice, with setup, service, and support provided by Io-Tahoe. Book time with a data engineer to learn more and see Io-Tahoe in action. >> From around the globe, it's theCUBE, presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> In this next segment, we're going to be talking to you about getting to know your data. And specifically, you're going to hear from two folks at Io-Tahoe. We've got enterprise account executive Sabita Davis here, as well as enterprise data engineer Patrick Samit. They're going to be sharing insights, tips, and tricks for how you can get to know your data, and quickly. We also want to encourage you to engage with Sabita and Patrick: use the chat feature to the right to send comments, questions, or feedback so you can participate. All right, Sabita, Patrick, take it away. >> Thanks, Lisa. Great to be here. As Lisa mentioned, guys, I'm the enterprise account executive here at Io-Tahoe. You, Pat? >> Yeah.
Hey, everyone. So great to be here. As she said, my name is Patrick Samit. I'm the enterprise data engineer here at Io-Tahoe, and we're so excited to be here and talk about this topic, as one thing we're really trying to perpetuate is that data is everyone's business. >> So, guys, Pat and I have actually had multiple discussions with clients from different organizations, with different roles. So we've spoken with both technical and non-technical audiences, and while they were interested in different aspects of our platform, we found that what they had in common was they wanted to make data easy to understand and usable. So that comes back to Pat's point of data being everybody's business, because no matter your role, we're all dependent on data. So what Pat and I wanted to do today was walk you guys through some of those client questions and pain points that we're hearing from different industries and different roles, and demo how our platform here at Io-Tahoe is used for automating those data-related tasks. So with that said, are you ready for the first one, Pat? >> Yeah, let's do it. >> Great. So I'm going to put my technical hat on for this one. So I'm a data practitioner. I just started my job at ABC Bank. I have, like, over 100 different data sources. So I have data kept in data lakes, legacy data sources, even the cloud. My issue is, I don't know what those data sources hold, I don't know what data is sensitive, and I don't even understand how that data is connected. So how can Io-Tahoe help? >> Yeah, I think that's a very common experience many are facing, and definitely something I've encountered in my past. Typically, the first step is to catalog the data and then start mapping the relationships between your various data stores. Now, more often than not, this is tackled through numerous meetings and a combination of Excel and something similar to Visio, which are two great tools in their own right. But they're very difficult to maintain
just due to the rate at which we are creating data in the modern world. It starts to beg for a tool that can scale with your business needs, and this is where a platform like Io-Tahoe becomes so appealing. You can see here a visualization of the data relationships created by the Io-Tahoe service. Now, what is fantastic about this is that it's not only laid out in a very human and digestible format; in the same action of creating this view, the data catalog was constructed. >> So is the data catalog automatically populated? >> Correct. >> Okay, so what I'm getting using Io-Tahoe, Pat, is this complete, unified, automated platform without the added cost? >> Exactly, and that's at the heart of Io-Tahoe. A great feature of that data catalog is that Io-Tahoe will also profile your data as it creates the catalog, assigning some meaning to those pesky column_1s and custom_variable_10s. They're always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exist. Personally, I'm a big fan of this view, as it really just helps the eye naturally jump to the focal points that coincide with these key columns. Following that train of thought, let's examine the customer ID column that seems to be at the center of a lot of these relationships. We can see that it's a fairly important column, as it's maintaining the relationship between at least three other tables. Now, you'll notice all the connectors are in this blue color. This means they're system-defined relationships. But Io-Tahoe goes the extra mile and actually creates these orange-colored connectors as well. These are ones that our machine learning algorithms have predicted to be relationships, and you can leverage them to try and make new and powerful relationships within your data. >> So this is really cool, and I can see how this could be leveraged quickly now.
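As an aside, the catalog-plus-connectors idea Patrick describes can be sketched in miniature. This is not Io-Tahoe's actual algorithm, just an illustration under simple assumptions: profile each column as it is cataloged, then propose a candidate relationship wherever one column's values mostly appear in another column, the analog of the predicted "orange" connectors. All table and column names here are invented.

```python
def profile_column(values):
    """Basic profile: inferred type, distinct count, null count."""
    non_null = [v for v in values if v is not None]
    inferred = "numeric" if non_null and all(isinstance(v, (int, float)) for v in non_null) else "text"
    return {"type": inferred, "distinct": len(set(non_null)), "nulls": len(values) - len(non_null)}

def predict_relationships(tables, overlap_threshold=0.9):
    """Propose table.column links where one column's values largely appear in another's."""
    columns = {
        (table, col): {v for v in vals if v is not None}
        for table, cols in tables.items()
        for col, vals in cols.items()
    }
    predicted = []
    for (t1, c1), v1 in columns.items():
        for (t2, c2), v2 in columns.items():
            if t1 != t2 and v1 and len(v1 & v2) / len(v1) >= overlap_threshold:
                predicted.append((f"{t1}.{c1}", f"{t2}.{c2}"))
    return predicted

# Invented toy sources standing in for the cataloged estate.
tables = {
    "customers": {"customer_id": [1, 2, 3], "name": ["Ada", "Ben", "Cy"]},
    "accounts": {"customer_id": [1, 2, 3, 3], "balance": [10.0, 20.0, 5.0, 7.5]},
}
catalog = {t: {c: profile_column(v) for c, v in cols.items()} for t, cols in tables.items()}
links = predict_relationships(tables)
```

A real platform would add heuristics (name similarity, type compatibility, sampling for large columns), but the shape of the output is the same: a catalog of profiles plus a list of predicted links for a human to confirm or reject.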
What if I added new data sources, or there are multiple data sources, and I need to identify what data is sensitive? Can Io-Tahoe detect that? >> Yeah, definitely. Within the Io-Tahoe platform there are already over 300 predefined policies, such as HIPAA, CCPA, and the like. One can choose which of these policies to run against their data, allowing for flexibility and efficiency in running the policies that affect your organization. >> Okay, so 300 is an exceptional number, I'll give you that. But what about internal policies that apply to my organization? Is there any ability for me to write custom policies? >> Yeah, that's no issue, and it's something that clients leverage fairly often. To utilize this function, one simply has to write a regex, which our team has helped many deploy. After that, the custom policy is stored for future use. To profile sensitive data, one then selects the data sources they're interested in and selects the policies that meet their particular needs. The interface will automatically tag your data according to the policies it detects, after which you can review the discoveries, confirming or rejecting the tagging. All of these insights are easily exported through the interface, so one can work them into the action items within your project management systems. And I think this lends itself to collaboration, as a team can work through the discoveries simultaneously, and as each item is confirmed or rejected, they can see it nigh instantaneously. All this translates to a confidence that, with Io-Tahoe, you can be sure you're in compliance. >> So I'm glad you mentioned compliance, because that's extremely important to my organization. So what you're saying is, when I use the Io-Tahoe automated platform, we'd be 90% more compliant than if we were going to be using a human? >> Yeah, definitely. The collaboration and documentation that the Io-Tahoe interface lends itself to really help you build that confidence that your compliance is sound.
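Patrick notes that a custom policy boils down to writing a regular expression. A minimal sketch of that idea follows; the policy names and patterns are illustrative placeholders, not Io-Tahoe's built-in HIPAA or CCPA rules:

```python
import re

# Hypothetical custom policies: each is just a named regex, as described above.
POLICIES = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\+44\s?\d{4}\s?\d{6}"),
}

def tag_sensitive(rows, policies=POLICIES):
    """Return {policy_name: set of column names} for columns whose values match a policy."""
    tags = {}
    for row in rows:
        for column, value in row.items():
            for name, pattern in policies.items():
                if isinstance(value, str) and pattern.search(value):
                    tags.setdefault(name, set()).add(column)
    return tags

rows = [
    {"contact": "ada@example.com", "notes": "call +44 7911 123456"},
    {"contact": "ben@example.org", "notes": "n/a"},
]
tags = tag_sensitive(rows)  # a human then confirms or rejects each tagging
```

As in the product walkthrough, the tagging output is only a discovery step: the confirm-or-reject review by the team is what turns it into something you can stand behind for compliance.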
>> So we're planning a migration, and I have a set of reports I need to migrate. But what I need to know is: what data sources are those reports dependent on, and what's feeding those tables? >> Yeah, that's a fantastic question, Sabita. Identifying critical data elements, and the interdependencies within the various databases, can be a time-consuming but vital process in a migration initiative. Luckily, Io-Tahoe does have an answer, and again, it's presented in a very visual format. >> So what I'm looking at here is my entire data landscape? >> Yes, exactly. >> Let's say I add another data source. Can I still see that unified 360 view? >> Yeah. One feature that is particularly helpful is the ability to add data sources after the data lineage discovery has finished, allowing for the flexibility and scope necessary for any data migration project. Whether you only need to select a few databases or your entire estate, this service will provide the answers you're looking for. This visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections are driven both by system-defined flows as well as those predicted by our algorithms, the confidence of which can actually be customized to make sure that they're meeting the needs of the initiative you have in place. This also provides tabular output, in case you need it for your own internal documentation or for your action items, which we can see right here. In this interface, you can actually also confirm or deny the predictions, allowing you to make sure that the data is as accurate as possible. Does that help with your data lineage needs? >> Definitely. So, Pat, my next big question here is: now I know a little bit about my data, but how do I know I can trust it? What I'm interested in knowing, really, is: is it in a fit state for me to use? Is it accurate? Does it conform to the right format?
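An aside on the lineage walk just demonstrated: the migration question Sabita raised, which sources feed a report, is at heart an upstream traversal over a lineage graph. A toy sketch, with invented table names:

```python
def upstream_sources(lineage, node):
    """Collect every transitive upstream dependency of `node` in a {target: [sources]} graph."""
    found, stack = set(), [node]
    while stack:
        for parent in lineage.get(stack.pop(), []):
            if parent not in found:
                found.add(parent)
                stack.append(parent)
    return found

# Invented lineage: the report reads a mart, which is fed by two source systems.
lineage = {
    "quarterly_report": ["sales_mart"],
    "sales_mart": ["crm.orders", "erp.invoices"],
}
deps = upstream_sources(lineage, "quarterly_report")
```

In a real platform the edges come from system-defined flows plus predicted links (with tunable confidence), but answering "what feeds this report?" is exactly this kind of walk.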
Yeah, that's a great question, and I think that is a pain point felt across the board, be it by data practitioners or data consumers alike. Another service that Io-Tahoe provides is the ability to write custom data quality rules and understand how well the data conforms to those rules. This dashboard gives a unified view of the strength of these rules and your data's overall quality. >> Okay, so Pat, on the accuracy scores there: if my marketing team needs to run a campaign, can they depend on those accuracy scores to know which tables have quality data to use for our marketing campaign? >> Yeah, this view would allow you to understand your overall accuracy, as well as dive into the minutiae to see which data elements are of the highest quality. So for that marketing campaign, if you need everything in a strong form, you'll be able to see that very quickly with these high-level numbers. But if you're only dependent on a few columns to get that information out the door, you can find that within this view. So you no longer have to rely on reports about reports; instead, you just come to this one platform to help drive conversations between stakeholders and data practitioners. >> So I get now the value Io-Tahoe brings by automatically capturing all that technical metadata from sources. But how do we match that with the business glossary? >> Yeah, within the same data quality service that we just reviewed, one can actually add business rules detailing the definitions and the business domains that they fall into. What's more, the data quality rules we were just looking at can then be tied into these definitions. It's this service that empowers stakeholders across the business to be involved with the data lifecycle and take ownership of the rules that fall within their domain. >> Okay, so those custom rules: can I apply them across data sources?
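The accuracy scores in the exchange above can be read as the share of values in a column that pass its quality rules. A hedged sketch, with the rules and columns invented purely for illustration:

```python
def accuracy(values, rule):
    """Percentage of non-null values that satisfy `rule`."""
    checked = [v for v in values if v is not None]
    if not checked:
        return 0.0
    return 100.0 * sum(1 for v in checked if rule(v)) / len(checked)

# Hypothetical rules, the kind that would be tied to business-glossary definitions.
rules = {
    "customer_id": lambda v: isinstance(v, int) and v > 0,
    "country": lambda v: v in {"NG", "GB", "NL"},
}
columns = {
    "customer_id": [1, 2, -5, 4],
    "country": ["NG", "GB", "XX", "NL"],
}
scores = {col: accuracy(vals, rules[col]) for col, vals in columns.items()}
```

A marketing team reading these numbers gets exactly the two views Pat describes: an overall figure per table, and per-column scores when only a few fields matter for the campaign.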
Yeah, you can bring in as many data sources as you need, so long as you can tie them to that unified definition. >> Okay, great. Thanks so much, Pat. And we just want to quickly say to everyone working in data: we understand your pain, so please feel free to reach out to us through our website or the chat, and let's get a conversation started on how Io-Tahoe can help you guys automate all those manual tasks, to help save you time and money. Thank you. >> Thank you. >> Before we wrap, if I could ask you one quick question: how do you advise customers to get started? You just walked through this great banking example. How do you advise customers to get started? >> Yeah, I think the number one thing that customers can do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as these quality rules, as well as identifying those kinds of tricky columns that might exist in your data, those custom_variable_10s I mentioned before. >> Last question, Sabita: anything to add to what Pat just described as a starting place? >> No, I think Pat actually summed that up pretty well. I mean, just by automating all those manual tasks, it definitely can save your company a lot of time and money, so we encourage you to just reach out to us and let's get that conversation started. >> Excellent. So, Sabita and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data and make sure that it's quality data, so you can maximize the value of it. Thanks for watching.
Thanks again, Lisa, for that very insightful and useful deep dive into the world of adaptive data governance with Io-Tahoe, Oracle, and First Bank of Nigeria. This is Dave Vellante. You won't want to miss Io-Tahoe's fifth episode in the data automation series, in which we'll talk to experts from Red Hat and Happiest Minds about their best practices for managing data across hybrid-cloud, inter-cloud, and multi-cloud IT environments. So mark your calendar for Wednesday, January 27th; that's episode five. You're watching theCUBE, the global leader in digital tech event coverage.

Published Date: Dec 10, 2020


Chris Bannocks, ING & Steven Eliuk, IBM | IBM CDO Fall Summit 2018


 

(light music) >> Live from Boston, it's theCUBE, covering the IBM Chief Data Officer Summit. Brought to you by IBM. >> Welcome back everyone, to theCUBE's live coverage of the IBM CDO Summit here in Boston, Massachusetts. I'm your host, Rebecca Knight, and I'm joined by my co-host, Paul Gillin. We have two guests for this segment. We have Steven Eliuk, who is Vice President of Deep Learning in the Global Chief Data Office at IBM, and Christopher Bannocks, Group Chief Data Officer at ING. Thanks so much for coming on theCUBE. >> My pleasure. >> Before we get started, Steve, I know you have some very important CUBE fans that you need-- >> I do. >> To give a shout-out to. Please. >> For sure. So I missed them on the last three runs of theCUBE, so I'd like to just shout out to Santiago, my son, five years old, and the shortest one, which is Elana. Miss you guys tons, and now you're on the air. (all laughing) >> Excellent. To get that important piece of business out. >> Absolutely. >> So, let's talk about metadata. What's the problem with metadata? >> The one problem, or the many (chuckles)? >> (laughing) There are a multitude of problems. >> How long have you got? The problem is, it's everywhere, and there's lots of it. And bringing context to that, and understanding it from an enterprise-wide perspective, is a huge challenge. Just connecting to it, finding it, or collecting it centrally, and then understanding the context and what it means. And the standardization of it, or the lack of standardization, across the board. >> Yeah, it's incredibly challenging: the immense scale of metadata, and at the same time dealing with the issues Chris mentioned. Just coming up with your own company's glossary of terms to describe your own data is kind of step one in the journey of making your data discoverable and governed. So it's challenging, and it's not well understood, and I think we're very early on in these stages of describing our data. >> Yeah. >> But we're getting there.
Slowly but surely. >> And perhaps, in that context, it's not only the fact that it's everywhere, but that we've not created structural solutions in a consistent way across industries to be able to structure it and manage it in an appropriate way. >> So, help people do it better. What are some of the best practices for creating and managing metadata? >> Well, I mean, it's such a broad space that you can look at different parts of it. Let's just take the work we do around describing our data, which we do for the purposes of regulation, for the purposes of GDPR, et cetera. It's really about discovering and providing context to the data that we have in the organization today. So, in that respect, it's creating a catalog and making sure that we have the descriptions and the structures of the data that we manage and use in the organization. And to give you perhaps a practical example: when you have a data quality problem, you need to know how to fix it. So you create and structure metadata around, well, where does it come from, first of all: what's the journey it's taken to get to the point where you've identified that there's a problem? But also then: who do we go to to fix it? Where did it go wrong in the chain? And who's responsible for it? Those are very simple examples of the metadata around the transformations the data might have come through to get to its end point, the quality metrics associated with it, and then the owner or the data steward it has to be routed back to to get fixed. >> Now all of those are metadata elements. >> All of those, yeah. >> Right?
>> So where do organizations make mistakes? Do they create too much metadata? Do they create poor, is it poorly labeled? Is it not federated? >> Yes. (all laughing) >> I think it's a mix of all of them. One of the things that you know Chris alluded to and you might of understood is that it's incredibly labor-intensive task. There's a lot of people involved. And when you get a lot of people involved in sadly a quite time-consuming, slightly boring job there's errors and there's problem. And that's data quality, that's GDPR, that's government owned entities, regulatory issues. Likewise, if you can't discover the data 'cause it's labeled wrong, that's potential insight that you've now lost. Because that data's not discoverable to a potential project that's looking for similar types of data. Alright, so, kind of step one is trying to scribe your metadata to the organization. Creating a taxonomy of metadata. And getting everybody on board to label that data whether it be short and long descriptions, having good tools et cetera. >> I mean look, the simple thing is... we struggle as... As a capability in any organization we struggle with these terms, right? Metadata, well ya know, if you're talking to the business they have no idea what you're talking about. You've already confused them the minute you mentioned meta. >> Hashtag. >> Yeah (laughs) >> It's a hashtag. >> That's basically what it is. >> Essentially what it is it's just data about data. It's the descriptive components that tell you what it is you're dealing with. If you just take a simple example from finance; An interest rate on it's own tells you nothing. It could be the interest rate on a savings account. It can the interest rate on a bond. But on its own you have no clue, what you're talking about. A maturity date, or a date in general. You have to provide the context. And that is it's relationships to other data and the contexts that it's in. But also the description of what it is you're looking at. 
And if that comes from two different systems in an organization, let's say one in Spain and one in France, and you just receive a date, you don't know what you're looking at. You have no context of what you're looking at. And you simply have to have that context. So, you have to be able to label it there, and then map it to a generic standard that you implement across the organization, in order to create the control that you need to govern your data. >> Are there standards? I'm sorry, Rebecca. >> Yes. >> Are there standards efforts underway, industry-wide? >> There are open metadata standards that are underway and gaining a great deal of traction. And there are internal standards you have to settle on anyway, irrespective of what's happening across the industry. You don't have the time to wait for external standards to exist in order to make sure you standardize internally. >> Another difficult point is it can be region- or country-specific. >> Yeah. >> Right, so it makes it incredibly challenging, 'cause for every region you might work in, you might have to have your own sub-glossary of terms for that specific region. And you might have to control the export of certain data with certain terms between regions and between countries. It gets very, very challenging. >> Yeah. And then somehow you have to connect it all, to be able to see what it all is. Because the usefulness of this is: if one system maps its local term, let's say "date," to maturity date, whereas someone else maps "date" to birthdate, you know you've got a problem. You just know you've got a problem. And exposing the problem is part of the process: understanding, hey, that mapping's wrong, guys. >> So, where do you begin? If your mission is to transform your organization to be one that is data-centric, and the business side is sort of eyes glazing over at the mention of metadata, what kind of communication needs to happen?
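The conflicting-mapping problem described here, the same local term mapped to different standard glossary terms in different systems, is mechanical to expose once the mappings are collected in one place. A minimal sketch, with entirely hypothetical system and term names:

```python
# Each system maps its local field names to a shared, organization-wide
# glossary. Systems and mappings below are hypothetical, for illustration.
system_mappings = {
    "spain_core":  {"fecha": "maturity_date"},
    "france_core": {"date":  "maturity_date"},
    "uk_crm":      {"date":  "birth_date"},   # conflicting use of "date"
}

def find_conflicts(mappings):
    """Expose local terms mapped to different standard terms across systems."""
    seen = {}  # local term -> set of standard terms it maps to
    for system, local_map in mappings.items():
        for local, standard in local_map.items():
            seen.setdefault(local, set()).add(standard)
    return {local: terms for local, terms in seen.items() if len(terms) > 1}

print(find_conflicts(system_mappings))  # "date" is flagged; "fecha" is not
```

Exposing the conflict is, as the speaker says, part of the process: the report is the starting point for a steward to decide which mapping is wrong.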
What kind of teamwork, collaboration? >> So, I mean, teamwork and collaboration are absolutely key. The communication takes time. Don't expect one blast of communication to solve the problem. It is going to take education and working with people to actually get them to realize the importance of things. And to do that you need to start something. Just communicating the theory doesn't work; no one can ever connect to it. You have to have people who are working on the data for a reason that is business-critical, and you need to have them experience the problem to recognize that metadata is important. Until they experience the problem you don't get the right amount of traction. So you have to start small and grow. >> And you can potentially use the whip as well. Governance, the regulatory requirements: that's a nice one to push things along. That's often helpful. >> It's helpful, but not necessarily popular. >> No, no. >> So you have to give-- >> Balance. >> We're always struggling with that balance. There's a lot of regulation that drives the need for this. But equally, that same regulation essentially drives all of the same needs that you have for analytics: for good measurement of the data, for growth of customers, for delivering better services to customers. All of these things are important. Just the web click information you have, that's all essentially metadata. The way we interact with our clients online and through mobile, that's all metadata. So it's not all whip or stick. There's some real value in there as well. >> This would seem to be a domain that is ideal for automation. Through machine learning and contextualization, machines should be able to figure a lot of this stuff out. Am I wrong? >> No, absolutely right. And we're working on proofs of concept to prove that case. And we have IBM AMG as well.
The automatic metadata generation capability, using machine learning and AI to be able to start to auto-generate some of this insight by using existing catalogs, et cetera. And we're starting to see real value through that. It's still very early days, but I think we're really starting to see that one of the solutions can be machine learning and AI, for sure. >> I think there are various degrees of automation that will come in waves. Immediately, right now, we have certain degrees where we have a very small term set with very high-confidence predictions. But then you want to get to the specificity of a company, which sometimes has 30,000 terms. Internally, we have 6,000 terms at IBM. And at that level of specificity, for complete automation, we're not there yet. But it's coming; it's in trial. >> It takes time because the machine is learning. And you have to give the machine enough inputs, and it gradually takes time. Humans are involved as well. It's not about just throwing the machine at something and letting it churn. You have to have that human involvement. It takes time to have the machine continue to learn and grow, to give it more terms and give it more context. But over time I think we're going to see good results. >> I want to ask about that human-in-the-loop, as IBM so often calls it. One of the things that Inderpal Bhandari was talking about is how the CDO needs to be a change engine in chief. So how are the rank and file interpreting this move to automation and increase in machine learning in their organizations? Is it accepted? Is it (chuckles) a source of paranoia and worry? >> I think it's a mix. I think we're kind of blessed, at least in the CDO office at IBM, the global CDO: everyone's kind of on board for that mission. That's what we're doing. >> Right, right. >> There are team members 25, 30 years on IBM's roster, and they're just as excited as I am, and I've only been there for 16 months.
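The pattern described here, auto-labeling only the high-confidence predictions and routing the rest to a human, can be sketched in a few lines. To be clear, this is not IBM's AMG implementation; the glossary, the 0.70 threshold, and the toy trigram scorer below are all stand-ins for whatever trained model a real system would use.

```python
# Confidence-gated auto-labeling with a human in the loop (illustrative only).
GLOSSARY = ["maturity_date", "interest_rate", "account_id"]
CONFIDENCE_THRESHOLD = 0.70   # hypothetical cut-off, tuned per deployment

def score(column_name: str, term: str) -> float:
    """Toy similarity: Jaccard overlap of character trigrams.
    A real system would use a trained classifier instead."""
    grams = lambda s: {s[i:i + 3] for i in range(len(s) - 2)}
    a, b = grams(column_name.lower()), grams(term)
    return len(a & b) / len(a | b) if a | b else 0.0

def label_column(column_name: str):
    """Return (decision, best glossary term, confidence) for one column."""
    term, conf = max(((t, score(column_name, t)) for t in GLOSSARY),
                     key=lambda pair: pair[1])
    if conf >= CONFIDENCE_THRESHOLD:
        return ("auto", term, conf)        # machine labels it
    return ("human_review", term, conf)    # routed to a data steward

print(label_column("interest_rate_pct"))   # strong overlap: labeled automatically
print(label_column("dt"))                  # too ambiguous: sent for human review
```

The design point matches the conversation: the machine handles the bulk, and humans only see the outliers and exceptions, which is what lets the approach scale across tens of thousands of terms.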
But it kind of depends on the project, too. For ones that have a high impact, everyone's really gung ho, because we've seen process times go from 90 days down to a couple of days. That's a huge reduction. And that's on the governance and regulatory aspects, but for us it's more about the linkage and availability of data, so that we can get more insights from that data and better outcomes for different types of enterprise use cases. >> And a more satisfying workday. >> Yeah, it's fun. >> That's a key point. Much better to be involved in this than doing the job itself. The job of tagging and creating metadata associated with the vast number of data elements is very hard work. >> Yeah. >> It's very difficult. And it's much better to be working with machine learning to do it, and dealing with the outliers or the exceptions, than it is chugging through. Realistically it just doesn't scale. You can't do this across 30,000 elements in any meaningful way, or a way that really makes sense from a financial perspective. So you really do need to be able to scale this quickly, and machine learning is the way to do it. >> Have you found a way to make data governance fun? Can you gamify it? >> Are you suggesting that data governance isn't fun? (all laughing) Yes. >> But can you gamify it? Can you compete? >> We're using gamification in many ways. We haven't been using it in terms of data governance yet. Governance is just a horrible word, right? People have really negative connotations associated with it. But actually, if you just step one degree away, we're talking about quality. Quality means better decisions. And that's actually all governance is. Governance is knowing where your data is, knowing who's responsible for fixing it if it goes wrong, and being able to measure whether it's right or wrong in the first place. And it being better means we make better decisions. Our customers have better engagement with us.
We please our customers more, and therefore they hopefully engage with us more and buy more services. I think the issue is that governance is something we invented through the need for regulation and the need for control, and it comes from that background. But realistically, we should be proud about the data that we use in the organization, and we should want the best results from it. It's not about governance; it's about us being proud of what we do. >> Yeah, a great note to end on. Thank you so much, Christopher and Steven. >> Thank you. >> Cheers. >> I'm Rebecca Knight, for Paul Gillin. We will have more from the IBM CDO Summit here in Boston coming up just after this. (electronic music)

Published Date : Nov 15 2018

