Evaristus Mainsah, IBM & Kit Ho Chee, Intel | IBM Think 2020
>> Announcer: From theCUBE studios in Palo Alto and Boston, it's theCUBE, covering IBM Think, brought to you by IBM. >> Hi there, this is Dave Vellante. We're back at the IBM Think 2020 Digital Event Experience, where we're socially responsible and distant. I'm here in the studios in Marlborough, our team is in Palo Alto, and we've been going wall to wall with coverage of IBM Think. Kit Chee is here; he is the Vice President and General Manager of Cloud and Enterprise Sales at Intel. Kit, thanks for coming on. Good to see you. >> Thank you, Dave. Thank you for having me on. >> You're welcome, and Evaristus Mainsah is here. He is the General Manager of the IBM Cloud Pak Ecosystem for the IBM Cloud. Evaristus, it's good to see you again. Thank you very much, I appreciate your time. >> Thank you, Dave. Thank you very much. Thanks for having me. >> You're welcome. So Kit, let me start with you. How are you guys doing? You know, there's this pandemic, never seen it before. How're things where you are? >> Yeah, so we were quite fortunate. Intel has had a pandemic leadership team for about 15 years now. We have a team consisting of medical, safety and operational professionals, and this same team, which has navigated us across several other health issues like bird flu, Ebola, Zika and H1N1, is navigating us at this point through this pandemic. Obviously, our top priority, as it would be for IBM, is protecting the health and well-being of employees while keeping the business running for our customers. The company has taken the following measures to take care of its direct and indirect workforce, Dave, and to ensure business continuity throughout the developing situation. These range from work-from-home policies, to keeping hourly workers paid while at home and reimbursing for daycare and elderly care, to helping with WiFi policies.
So that's what we've been up to. Intel's manufacturing and supply chain operations around the world are working hard to meet demand, and we are collaborating with the supply chains of our customers and partners globally as well. And more recently, we have committed about $16 million to support communities, from frontline health care workers to technology initiatives like online education, telemedicine and compute for research. So that's what we've been up to date. Pretty much, you know, busy. >> You know, I have to say, my entire career has been in the technology business, and you know, sometimes you hear negativity toward big tech, but I've got to say, just as Kit was saying, big tech has really stepped up in this crisis. IBM has been no different and, you know, tech for good; I'm actually really proud. How are you doing in New York City? >> Evaristus: No, thank you, Dave, for that. You know, we're doing great, and our focus has been absolutely the same. Obviously, because we provide services to clients, at a time like this your clients need you even more, but we need to focus on our employees to make sure that their health and their safety and their well-being is protected. And so we've taken this really seriously, and actually, we have two ways of doing this. One of them is just honoring our purpose as a company to our clients, but the other is trying to activate the ecosystem, because problems of this magnitude require you to work across a broad ecosystem to bring forth solutions that are long lasting. For example, we have Call for Code, where we go out and we ask developers to use their skills and open source technologies to help solve some technical problems. This year, the focus was on initiatives around computing resources, how you track the coronavirus, and other services that are provided free of charge to our clients.
Let me give you a bit more color. So IBM recently formed the High Performance Computing Consortium, made up of federal government, industry and academic leaders, focused on providing high performance computing to solve the COVID-19 problem. So currently we have 33 members, we have 27 active projects, and we're deploying something like 400 petaflops of compute to solve the problem. >> Well, it certainly is challenging times, but at the same time, you're both in the sweet spot, which is Cloud. I've talked to a number of CIOs who have said, you know, we had a cloud strategy before, but we're really accelerating our cloud strategy now, and we see this as sort of a permanent effect. I mean, Kit, you guys are big on ecosystem; you want, frankly, a level playing field. The more optionality that you can give to customers, you know, the better, and Cloud has really been exploding, and you guys are powering, you know, all the world's Clouds. >> We are, Dave, and honestly, that's a huge responsibility that we undertake. Before the pandemic, we saw the market through the lens of four key mega trends, and the experiences we are all having now deepen our belief in the importance of addressing these mega trends. Specifically, we see marketplace needs around a few key areas. Cloudification of everything is the first: the amount of online activity that has spiked just in the last 60 days is a testimony to that. Pervasive AI is the second big area that we have seen, and we are now resolute on investments in that area; then 5G network transformation and the edge build out.
Applications run the business, and we know enterprise IT faces challenges when deploying applications that require data movement between Clouds. Cloud native technologies like containers and Kubernetes will be key enablers in delivering end-to-end data analytics, AI, machine learning and other critical workloads in Cloud environments and at the edge. Pairing Intel's data-centric portfolio, including Intel Optane SSDs, with Red Hat OpenShift and IBM Cloud Paks, enterprises can now break through storage bottlenecks and have unconstrained data availability in hybrid and multicloud environments, so we're pretty happy with the progress we're making there together with IBM. >> Yeah, Evaristus, I mean, you guys are making some big bets. I've, you know, written and discussed in my breaking analysis that I think a lot of people misunderstand IBM Cloud. Ginni Rometty, Arvind and others have said, hey, you know, only 20% of the workloads are in cloud; we're going after the really difficult-to-move workloads and the hybrid workloads. That's really the fourth foundation that Arvind, you know, talks about, that you and IBM have built: you have your mainframes, you have middleware services, and hybrid Cloud is really that fourth sort of platform that you're building out. But you're making some bets in AI, you've got other services in the Cloud like blockchain, you know, quantum (we've been having really interesting discussions around quantum), so I wonder if you can talk a little bit about where you're allocating resources, some of the big bets that you're making for the next decade. >> Well, thank you very much, Dave, for that.
I think what we're seeing with clients is that there's increasing focus on, and really an acceptance, that the best way to take advantage of the Cloud is through a hybrid cloud strategy, infused with data. So it's not just the Cloud itself, but actually what you need to do with data in order to make sure that you can really, truly transform yourself digitally, to enable you to improve your operations, and to use your data to improve the way that you work and the way that you serve your clients. And what we see, and you see studies out there that say this, is that a hybrid cloud strategy is 2.5 times more effective than a public cloud only strategy. And why is that? Well, you get things such as, you know, the opportunity to move your applications, the extent to which you move your applications, to the Cloud. You get things such as, you know, reduction in risk, and you get a more flexible architecture, especially if you focus on open standards, plus a reduction in some of the tools that you use. And so we see clients looking at that. The other thing that's really important, especially in this moment, is business agility and resilience. Business agility says that if my customers used to come in, and now they can't come in anymore because we need them to stay at home, we still need to figure out a way to serve them. Can we rewrite our applications quickly enough in order to serve this new client, to serve the client in a new way?
And well, if your applications haven't been modernized, even if you've moved to the Cloud, you don't have the opportunity to do that. And so many clients that have made that transformation figure out they're much more agile, they can move more easily in this environment, and we're seeing a whole set of clients saying, yes, I do need to move to the Cloud, but I need somebody to help improve my business agility, so that I can transform, I can change with the needs of my clients and with the demands of competition. And this leads you then to, you know, what sort of platform do you need to enable you to do this? It's something that's open, so that you can write that application once and you can run it anywhere, which is why I think the IBM position with our ecosystem and Red Hat, with this open container Kubernetes environment that allows you to write an application once and deploy it anywhere, is really important for clients in this environment especially. And the Cloud Paks, and I'm, you know, General Manager of the Cloud Pak Ecosystem, the logic of the Cloud Paks is exactly that: clients want to modernize once, write applications that are cloud native so that they can react more quickly to market conditions and to what their clients need, but in doing so, they're not locked into a specific infrastructure that keeps them away from some of the technologies that may be available in other Clouds. So we have talked about blockchain; we've got, you know, Watson AI technologies, which are available on our Cloud. We've got The Weather Company assets; those are key assets for many, many clients, because weather influences more than we realize. So if you were locked in a Cloud that didn't give you access to any of those, because you hadn't written on the same platform, you know, that's not something that you want to support.
So Red Hat's platform, which is our platform, which is open, allows you to write your application once and deploy it anywhere, which matters particularly to our customers in this environment, together with the data pieces that come on top of that, so that you can scale. Because, you know, you've got six people, but you need 600 of them; how do you scale them, or how can they use data and AI to do it? >> Okay, this must be music to your ears, this whole notion of, you know, multicloud, because, you know, Intel's pervasive, and the more Clouds that are out there, the better for you, better for your customers, as I said before, the more optionality. Can you talk a little bit about the relationship today between IBM and Intel? It's obviously evolved over the years: the PC, servers, you know, other collaboration, and now the Cloud is, you know, the latest and probably the most relevant part of your collaboration. But talk more about what you guys are doing together that's interesting and relevant. >> You know, IBM and Intel have had a very rich history of collaboration, starting with the invention of the PC. So for those of us who may take the PC for granted, that was an invention over 40 years ago between the two companies, all the way to optimizing leadership IBM software like DB2 to run best on Intel's data center products today, right? But what's more germane today is the Red Hat piece of the story and how that plays into the partnership with IBM going forward. Intel was one of Red Hat's earliest investors back in 1998, again, something that most people may not realize, that we were an early investor in Red Hat. And we've been a longtime pioneer of open source.
In fact, Navin Shenoy, Intel's Executive Vice President of the Data Platforms Group, was part of Paul Cormier's keynote at Red Hat Summit just last week; you should definitely go listen to that session. But in summary, together Intel and Red Hat have made commercial open source viable in enterprise and worldwide computing. Basically, it's now used by nearly every vertical and horizontal industry. We are bringing our customers choice, scalability and speed of innovation for key technologies today, such as security, Telco NFV and containers, even at the edge, and most recently Red Hat OpenShift. We're very excited to see IBM Cloud Paks, for example, standardized on top of OpenShift, as that builds the foundation for IBM's chapter two, and allows Intel's value to scale to the Cloud Paks and ultimately to IBM customers. Intel began partnering with IBM on what is now called Cloud Paks over two years ago, and we are committed to that success and to scaling it through our ecosystem: hardware partners, ISVs and our channel. >> Yeah, so theCUBE, by the way, covered Red Hat Summit last week; Stu Miniman and I did a detailed analysis. It was awesome, if we do say so ourselves, but awesome in the sense that it allowed us to really unpack what's going on at Red Hat and what's happening at IBM. Evaristus, I want to come back to you on this Cloud Pak; it's the kind of brand that you guys have, you've got Cloud Paks all over the place: Cloud Paks for applications, data, integration, automation, multicloud management. What do we need to know about Cloud Paks? What are the relevant components there?
>> Evaristus: I think the key components are, so think of this as, you know, software that is cloud native, designed for specific core use cases, and it's built on Red Hat Enterprise Linux with the Red Hat OpenShift container Kubernetes environment. On top of that, you get a set of common services that work right across all of them, and then on top of that, you've got specific software, both open source and IBM software, that deals with specific client situations. So if you're dealing with applications, for example, the open source and IBM software would be the runtimes that you need to write and to deploy applications. If you're dealing with data, then you've got the Cloud Pak for Data. The foundation is still Red Hat Enterprise Linux, with the Red Hat OpenShift container Kubernetes environment sitting on top of that, providing you with a set of common services, and then you get a combination of IBM's own software, open source and third party software that sits on top of that, as well as all of our AI infrastructure and machine learning, to enable you to do everything that you need to do with data and get insights from it. You've got automation, to speed things up and to enable us to do work more efficiently and more effectively, to make your knowledge workers better, to make management easier, and to help management manage work and processes. And then you've got multicloud management, which allows you to see, from a single pane, all of the applications that you've deployed in the different Clouds, because the idea here, of course, is that they're not all sitting in the same Cloud: some of it is on prem, some of it is in other Clouds, and you want to be able to see and deploy applications across all of those. And then you've got the Cloud Pak for Security, which has a combination of third party offerings, as well as ISV offerings, as well as AI offerings.
Again, the structure is the same: RHEL, Red Hat OpenShift, and then you've got the software that enables you to manage all aspects of security and to deal with incidents when they arise. So that gives you data and applications, and then there's integration, because every time you start writing an application, you need to integrate: you need to access data securely from someplace, you need to bring two pieces together for them to communicate, and we use the Cloud Pak for Integration to allow us to do that. You can open up APIs and expose those APIs so others writing applications can gain access to them. And again, this idea of resilience, this idea of agility, so you can make changes and you can adapt as things change around you. So that's what the Cloud Paks provide for you, and Intel has been an absolutely fantastic partner for us. One of the things that we do with Intel, of course, is to work on the reference architectures to support our certification program for our hardware OEMs, so that we can scale that process and get many more OEMs adopting and ready for the Cloud Paks, and then we work with them and some of the ISV partners right up front. >> Got it. Let's talk about the edge. Kit, you mentioned 5G. I mean, it's a really exciting time. (laughs) You've got windmills, you've got autonomous vehicles, you've got factories, you've got, you know, shipping containers. I mean, everything's getting instrumented, data everywhere, and so I'm interested in, let's start with Intel's point of view on the edge, how that's going to evolve, you know, what it means to Cloud. >> You know, Dave, it's definitely the future, and we're excited to partner with IBM here. In addition to the enterprise edge, the communication service providers, think of the Telcos, can take advantage of running standardized open software at the Telco edge, enabling a range of new workloads via scalable services, something that, you know, didn't happen in the past, right?
Earlier this year, Intel announced new second-generation Xeon Scalable and Atom-based processors targeting the 5G radio access network, so this is a new area for us in terms of investment, going into the 5G RAN. By deploying these new technologies with Cloud native platforms like Red Hat OpenShift and IBM Cloud Paks, comm service providers can now make full use of their network investments and bring new services such as artificial intelligence, augmented reality, virtual reality and gaming to the market. We've only scratched the surface when it comes to 5G and Telco, but with IBM, Red Hat and Intel coming together, I would say, you know, this space is super, super interesting, and we're just getting started. >> Evaristus, what do you think this means for Cloud and how that will evolve? Is this sort of a new Cloud that will form at the edge? Obviously, a lot of data is going to stay at the edge, probably new architectures are going to emerge, and again, to me, it's all about data: you can create more data, push more data back to the Cloud so you can model it. Some of the data is going to have to be processed in real time at the edge, but it just really extends the network to new horizons. >> Evaristus: It does exactly that, Dave, and we think of it in exactly the same way, which is why the platform is the same, right? You won't be surprised to see that the platform is based on open containers and that container Kubernetes environment provided by Red Hat, so whether your data ends up living at the edge, or your data lives in a private data center, or it lives in some public Cloud, and however it flows between all of them, we want to make it easy for our clients to handle that. So this is very exciting for us. We just announced IBM Edge Application Manager, which allows you to basically deploy and manage applications at endpoints of all kinds. So we're not talking about 20 or 30, we're talking about thousands or hundreds of thousands.
And in fact, we're integrating Intel's device onboarding technology, which will enable us to onboard devices very, very easily at scale. If you combine that with IBM Edge Application Manager, then it helps you onboard the devices and provision them centrally. So we think this is really important. We see lots of workloads moving onto edge devices; many of these devices and endpoints now have sufficient compute to be able to run them. But right now, if they are IoT devices, the data is being transferred hundreds of miles away to some data center to be processed, at enormous cost, and then only 1% of that is actually useful, right? 99% of it gets thrown away. Some of that data actually has residency requirements, so you may not be able to move it for processing, so why wouldn't you just process the data where it is created and run your analytics where the data sits? Or you have situations that are disconnected as well, so you can't actually do that. You don't want the tills in the supermarket to stop because you've lost connectivity with your data center, and so the ability to work offline matters: IBM Edge Application Manager actually allows you to do that. It's autonomous, so you can do all of this without using lots of people, because the process is all sort of automated, and you can work whether you're connected or you're disconnected, and then you get replication when you reconnect, which gets really, really powerful. >> All right, I think the developer model is going to be really interesting here. There are so many new use cases and applications. Of course, Intel's always had a very strong developer ecosystem, and, you know, IBM understands the importance of developers. Guys, we've got to wrap up, but I wonder if you could each, maybe start with Kit,
give us your sense as to where you want to see this partnership go. What can we expect over the next, you know, two to five years and beyond? >> I think it's the area of, you know, 5G and how that plays out in terms of the edge build out that we just touched on. I think that's a really interesting space. What Evaristus said is spot on, you know; the processing and the analytics at the edge are still fairly nascent today, and that's growing. So that's one area. Building out the Cloud for the different enterprise applications is the other one, and obviously, it's going to be a hybrid world. It's not just a public Cloud world or an on-prem world. So the whole hybrid build out, what I'd call hybrid 2.0, is ahead of us, and the work that both of us need to do, IBM and Intel, will be critical to ensure that, you know, enterprise IT has solutions across the hybrid spectrum. >> Great. Evaristus, give us the last word, bring us home. >> Evaristus: And I would agree with that as well, Kit. I will say the work that we do around Intel's market ready solutions, right, where we can bring our ecosystem together to do even more on the edge for some of these use cases, and the work that we're doing around blockchain, which I think, you know, is another important piece of work. And I think what we really need to do is to focus on helping clients, because many of them are working through those early use cases right now: identifying use cases that work, with a commitment to open standards, using exactly the same standards across, like what you've got with your Open Retail Initiative. The work we're going to do together, I think, is going to be really important to help us scale. But I wanted to just add one more thing, Dave, if you permit me. >> Yeah.
>> Evaristus: In this COVID era, one of the things that we've been able to do for customers, which has been really helpful, is providing free technology for 90 days to enable them to work in an offline situation, to work away from the office. One example is just the ability to transfer files when bandwidth is an issue, because the parents and the kids are all working from home. We have a product, IBM Aspera, which we'll make available to customers for 90 days at no cost. You don't need to give us your credit card; just log on and use it to improve the way that you work, so your bandwidth feels as if you are in the office. We have Watson Assistant, which is now helping clients in more than 18 countries do the same kind of thing, basically providing COVID information. So those are all available. There's a slew of offerings that we have. We just want listeners to know that they can go on the IBM website, get those offerings, and deploy and use them now. >> That's huge. I knew about the 90-day program; I didn't realize Aspera was part of that, and that's really important, because you're like, okay, how am I going to get this file there? And so thank you for sharing that, and guys, great conversation. You know, hopefully next year we could be face to face, even if we still have to be socially distant, but it was really a pleasure having you on. Thanks so much. Stay safe, and good stuff. I appreciate it. >> Evaristus: Thank you very much, Dave. Thank you, Kit. Thank you. >> Thank you, thank you. >> All right, and thank you for watching, everybody. This is Dave Vellante for theCUBE, our wall to wall coverage of the IBM Think 2020 Digital Event Experience. We'll be right back right after this short break. (upbeat music)
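The edge pattern Evaristus describes, processing data where it's created so that only the useful fraction (he cites roughly 1%) ever travels to a data center, can be sketched roughly like this. This is a minimal illustration only; the reading format, threshold, and function name are assumptions for the sketch, not anything from IBM Edge Application Manager:

```python
def filter_at_edge(readings, threshold=100.0):
    """Keep only anomalous sensor readings for upload; discard the rest locally.

    Mirrors the idea that most raw IoT data never needs to leave the device:
    analytics run where the data is created, and only exceptions travel.
    """
    to_upload = [r for r in readings if r["value"] > threshold]
    discarded = len(readings) - len(to_upload)
    return to_upload, discarded

# Hypothetical sensor readings; only values above the threshold leave the device.
readings = [{"id": i, "value": v} for i, v in enumerate([20.0, 105.5, 30.1, 99.9, 250.0])]
uploaded, dropped = filter_at_edge(readings)
# uploaded holds the 2 readings above 100.0; the other 3 are dropped on-device
```

In a disconnected scenario, the same filter keeps running locally and the `uploaded` batch is simply replicated back to the data center when connectivity returns.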
Nhung Ho, Intuit | Stanford Women in Data Science (WiDS) Conference 2020
>> Announcer: Live from Stanford University, it's theCUBE, covering Stanford Women in Data Science 2020. Brought to you by Silicon Angle Media. >> Hi, and welcome to theCUBE. I'm your host, Sonia Tagare, and we're live at Stanford University for the fifth annual WiDS, Women in Data Science, Conference. Joining us today is Nhung Ho, the Director of Data Science at Intuit. Nhung, welcome to theCUBE. >> Thank you for having me here. >> So tell us a little bit about your role at Intuit. >> So I lead the applied machine learning teams for our QuickBooks product lines and also for our customer success organization. Within my team, we specialize in building machine learning products and delivering them into our products for our users. >> Great. Today you're giving a talk about how organizations want to achieve greater flexibility, speed and cost efficiencies, and you're giving a technical vision talk about data science in the cloud world. So what should data scientists know about data science in a cloud world? >> Well, I'll just give you a little bit of a preview of my talk later, because I don't want to spoil anything. But I think one of the most important things about being a data scientist in a cloud world is that you have to fundamentally change the way you work. A lot of us start on our laptops or a server and do our work there, but when you move to the cloud, it's like all bets are off, all the limiters are off. And so how do you fully take advantage of that? How do you change your workflow? What are some of the things that are available to you that you may not know about? And in addition to that, what are some things that you have to rewire in your brain to operate in this new environment? I'm going to share some experiences that I learned firsthand, and also from my team, in Intuit's cloud migration over the past six years. >> That's great.
Excited to hear that. And so, Intuit has sponsored WiDS for many years now; last year we spoke with your colleague from Intuit. So tell us about Intuit's sponsorship. >> Yeah, so at Intuit, we are a champion of gender diversity, and also all sorts of diversity. And when we first learned about WiDS, we said we need to be a champion of the Women in Data Science conference, because for me personally, oftentimes when I'm in a room going over technical details, I'm the only woman, and not just that, I'm often the only woman executive. And so part of the sponsorship is to create this community of very technical women in this field, to share our work together, to build this community, and also to show the great diversity of work that's going on across the field of data science. >> And Intuit has always been really great about embracing diversity. Tell us a little bit about your experience being part of Intuit, and also about the Tech Women at Intuit group. >> Yeah, so one of the things at Intuit that I really appreciate is we have employee groups around specific interests, and one of those employee groups is Tech Women at Intuit. The goal of Tech Women at Intuit is to create a community of women who can provide coaching, mentorship, technical development and leadership development, and I think one of the unique things about it is that it's not just focused on the technical development side, but on helping women develop into leadership positions. For me, when I first started out, there were very few women in executive positions in our field, and data science is a brand new field, so it takes time to get there. Now that I'm on the other side, one of the things that I want to do is be able to give back and coach the next generation.
And so the Tech Women at Intuit group allows me to do that through a very strong mentorship program that matches me with early-career mentees across multiple different fields, so that I can provide that coaching and that leadership development. >> And speaking of diversity, in the opening address we heard that diversity creates perspectives and also takes away bias. So why is gender diversity so important at Intuit, and how does it help take away that bias? >> Yeah, so one of the important things that I think a lot of people don't realize is that when you go and build your products, you bring in a lot of biases in how you build the product, and ultimately the people who use your products are the general population. For us, we serve consumers, small businesses and the self-employed, and if you take a look at the diversity of our customers, it mirrors the general population. And so when you think about building products, you need to bring in those diverse perspectives so you can build the best products possible, because the people who are using those products come from diverse backgrounds as well. >> Right. And so now at Intuit, you've gone from a desktop-based application to a cloud-based application, which is a big part of your talk. How do you use data for A/B testing, and why is it important? >> Yeah, A/B testing, that is a personal passion of mine, actually, because as scientists, what we like to do is run a lot of experiments and say, okay, what is the best thing out there, so that ultimately, when you ship a new product or feature, you ship the best thing possible that's verified by data, and you know exactly how users are going to react to it. When we were on desktop, that was incredibly difficult, because those were back in the days, and I don't know if you remember this, back in the days when you had a floppy disk, right, or even CD-ROMs. That's how we shipped our products.
And so all the changes that you wanted to make had to be contained, and in the end, you really only shipped once per year. So if there was any type of testing we did, we'd bring in our users, have them use our products a little bit, and then say, okay, we know exactly what we need to do, and ship that out. You only got one chance. Now that we're in the cloud, what that allows us to do is test continuously, via A/B testing, every new feature that comes out. We have a champion-challenger model, and we can say, okay, the new version that we're shipping is this much better than the previous one, we know it performs in this way, and then we get to make the decision: is this the best thing to do for the customer? So you turn what was once a one-time change management process into one that's distributed throughout the entire year, and at any one time we're running hundreds of tests to make sure that we're shipping exactly the best things for our customers. >> That's awesome. So what advice would you give to the next generation of women who are interested in STEM, but maybe feel like, oh, I might be the only woman, I don't know if I should do this? >> Yeah, I think the biggest thing for me was finding mentorship. Initially, when I was very early in my career, and even when I was doing my graduate studies, for me a mentor was someone who was in my field. But when I first joined Intuit, an executive in another group, who was a woman, said, hey, I'd like to take you aside, provide you some feedback, and here's some coaching I want to give you. And that was when I realized you don't actually need that person to be in your field to guide you to the next step.
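The champion-challenger comparison described above boils down to asking whether the challenger's conversion rate significantly beats the champion's. Here is a minimal sketch, assuming a simple two-proportion z-test and invented traffic numbers; this is illustrative, not Intuit's actual experimentation tooling:

```python
import math

def challenger_beats_champion(champ_conv, champ_n, chall_conv, chall_n, z_crit=1.96):
    """Two-proportion z-test: does the challenger's conversion rate
    significantly beat the champion's (z threshold of 1.96)?"""
    p1 = champ_conv / champ_n          # champion conversion rate
    p2 = chall_conv / chall_n          # challenger conversion rate
    # Pooled proportion under the null hypothesis of no difference
    pooled = (champ_conv + chall_conv) / (champ_n + chall_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / champ_n + 1 / chall_n))
    z = (p2 - p1) / se
    return z > z_crit

# Champion converts 500/10,000 (5%); challenger converts 600/10,000 (6%)
print(challenger_beats_champion(500, 10_000, 600, 10_000))  # True
```

At hundreds of concurrent tests, a real system would also correct for multiple comparisons, but the per-test decision looks roughly like this.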
And so, for women who are going through their journey early on, I recommend finding a mentor who is at the stage where you want to go, regardless of which field they're in, because everybody has diverse perspectives and things they can teach you as you go along. >> And how do you think WiDS is helping women feel like they can do data science and be a part of the community? >> Yeah, I think what you'll see in the program today is a huge diversity of speakers and panelists, at all different stages of their careers and in all different fields. So what we get to see is the full timeline, from women who are doing their PhDs all the way to very, very well established women. The provost of Stanford University was here today, which is amazing, to see someone at the very top of her career who's been around the block. But the other thing is also the diversity of fields. When you think about data science, a lot of us think about just the tech industry, but you see it in healthcare, you see it in academia, and you see that wide diversity of where data science, and where the women who practice data science, come from. I think it's really empowering, because you can see yourself in it, and representation does matter quite a bit. >> Absolutely. And where do you see data science going forward? >> Oh, that is a tough and interesting question, actually. In the current environment today, we could talk about where it could go wrong or where it could open doors, and for me, I'm an eternal optimist. One of the things that I think is really, really exciting for the future is that we're getting to a stage where we're building models not just for the general population. We have enough data and enough compute that we can build a model tailored just for you. And for me, I think that is really, really powerful, because we can build exactly the right solution to help our customers and our users succeed.
Specifically, working in the personal finance and small business finance space, that means I can help that cupcake shop owner actually manage her cash flow and help her succeed. To me, that's really powerful, and that's where data science is headed. >> Nhung, thank you so much for being on theCUBE, and thank you for your insight. >> Thank you so much. >> Thanks for watching theCUBE. Stay tuned for more.
Lisa Ho | Data Privacy Day 2017
>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at the Twitter headquarters for the Data Privacy Day Event. It's a full-day event with a lot of seminars and presentations, really talking about data privacy, something that's getting increasingly important every day, especially as we know RSA's coming up in a couple of weeks, with a lot of talk about phishing, increased attack surface area, et cetera. So privacy is really important, and we're excited to have Lisa Ho, Campus Privacy Officer at UC Berkeley. Welcome, Lisa. >> Thank you, glad to be here. >> So what does the Campus Privacy Officer do? >> Well, really anything that has to do with privacy that comes across. So making sure that we're in compliance, or doing what I can to help the campus keep in compliance with privacy laws. But beyond that, also making sure that we stay aligned with our privacy values, and when I say that, I mean privacy is really important. It's critical for creativity and for intellectual freedom. So at the university, we need to make sure we hold on to those when we're dealing with new ideas and new scenarios that come up. We have to balance privacy with all the other priorities and obligations we have. >> Yeah, I don't know if Berkeley and Stanford get enough credit as really being two of the big drivers of Silicon Valley. They attract a lot of smart people. They come, they learn, and then, more importantly, they stay. So you've got a lot of cutting-edge innovation, you've got a ton of open source technologies that have come out of Berkeley over the years, Spark, et cetera. So you guys are really at the leading edge, but at the same time, you're an old, established academic institution. So what role do you have formally, as an academic institution of higher education, to help set some of these standards and norms as the world changes around it so very, very quickly?
Yeah, well, as I say, the environment needs to be set for creativity and for allowing that intellectual freedom. So when we think about the university, the things that we do there are pretty much what we want to have in the community as a whole, in our culture and environment. Some of the things that we think about particularly: first, when you think about school, you think about grades, or the written evaluations that you get. Learning, when you come down to it, is a personal endeavor; you're developing internally, it's a transformation that's internal. And so the feedback you get, the critical evaluation, those need to happen in a space where you have the privacy of not having a reputation to either live up to or live down. Those are things that you keep private, and that's why, as we've agreed as a society, school information and student data is something that needs to stay private. So that's one area: learning is personal, and that's why the university is so important in that discussion. And secondly, I'd say, as we talked about, creativity requires time to develop, and it requires freedom for taking risks. So whether you're working on a book, or a piece of art, or, if you're a scientist, a formula, an algorithm, a theory, those are things you need to set aside time for, to be in your own head, without the eyes of others until you're ready, without judgment before it's ready for release. Those are the kinds of things you want to have space for in creativity, so that you can move beyond the status quo and take those risks to go to the next space and beyond. >> Jeff: Right.
>> And I think lastly, I'd say, and this is not specific to the university, but something we hold dear, particularly at Berkeley, is our fundamental rights, and privacy is one of those fundamental rights. As Ed Snowden said so famously, saying I don't care about privacy because I have nothing to hide is like saying I don't care about freedom of speech because I have nothing to say. So just because you may not have something to say doesn't mean that you can take away the rights of someone else, and you may find that you need those rights at some point in your life in the future, and no one has to justify why they need a fundamental right. So those essentials that come up in our university environment are applicable beyond just the learning space of the university, to the kind of society that we want to build. That's why the university is in a position to lead in these areas. >> Right, 'cause Berkeley's got a long history, right, of activism, and this goes back decades and decades. Is privacy starting to get elevated to the level where you're going to see more active, vocal points of view and statements, and I don't want to say marches, but potentially marches, in terms of making sure this is taken care of? Because unfortunately, I think most privacy applications, at least historically, maybe it's changing, are really opt-out, not opt-in. So do you see this? Is it becoming a more important policy area, versus just an execution detail in an application? >> Yeah, we have a lot of really great professors working on these ideas around privacy and cybersecurity; those working on security and other things also have privacy in their backgrounds and are advocating in that area as well. As far as marches, you pretty much rely on the students for that, and you can't dictate what the students are going to find important. But there are.
There's definitely a cadre of students that care and are interested in these topics, and when you tie them together with fundamental rights like free speech and academic freedom and creativity, that's where it becomes important and people get interested. >> Right. One of the real sticky areas that this bounces into is security, and unfortunately there have been way too many instances at campuses over the last several years of someone grabbing a gun and shooting people, which, you know, hopefully won't happen today. And that's really where the privacy and security thing runs up against: should we have known? Should we have seen this person coming? If we had had access to whatever they were doing, maybe we would have known and been able to prevent it. So when you look at, I don't want to say the balance, but really the conflict between security and privacy, what are some of the rules coming out? How do you guys execute that, to both provide a safe environment for people to study and learn and grow, as you mentioned, but at the same time keep an eye out, because unfortunately there are bad characters in the world? >> Right, yeah, well, I don't want to say that there's a dichotomy; I don't want to create a false dichotomy of it's either privacy or security. That's not the frame of mind that we want to be in. It's important for both, and security is clearly important. Preventing unauthorized access to information, or to your personal information, is clearly a part of privacy, so it's necessary for privacy, and those are things that you would do to protect privacy. Two-factor authentication, antivirus, network segmentation, those are all important parts of protecting privacy as well.
So it's not a dichotomy of one or the other, but there are things that you do for security purposes, whether cybersecurity or personal security, that may in a given conflict have a different purpose than what you would do for privacy, and monitoring is one of those areas specifically. In particular, now that we have continuous monitoring for attacks, and the use of that monitoring data as a forensic source to look for information after the fact, those things lie in contrast with the privacy idea of least perusal: not looking for information until you need it, having that distance, the privacy of not being under surveillance. So the University of California has outlined a privacy balancing analysis that's necessary for these kinds of scenarios that are new and untested, where we don't have laws around them, to balance the many priorities and obligations. What you need to do is look at what the security measure provides, look at the benefits together with the risks, and do that balancing. And so you need to go through a series of questions. What is the utility you're really getting out of that monitoring, not just in the normal scenario, the way you're expecting to use it, but in the use cases you maybe didn't intend but can anticipate it will be wanted for? What about when we're required to turn it over for a subpoena or another kind of legal demand, what are the use cases and the privacy impacts in those cases? What are the privacy impacts if it's hacked, or of abuse by an employee, or of sharing it with partners? So you need to balance the utility together with those impacts and look at those differences, and then also look at: what's the scope of that?
Does the scope change? If you change the scope of what you're monitoring, does it change the privacy impact? Does it change the utility? You look at those kinds of factors and keep them all in line, not just the utility of what you're trying to do, but the other impacts in the privacy analysis, and then the alternatives: could you do the same thing another way, and is it appropriate? Does it give you the same kind of value that the proposed monitoring provides? Being transparent and accountable about what you're doing is really the key once you've done that analysis: making sure that you've worked through those questions, that you're doing the least amount of perusal necessary to achieve the goals you're trying to accomplish with that monitoring, and then, for transparency and accountability, coming back to whatever your decisions are and making them available to the community that's being monitored.
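The balancing analysis described above is a policy exercise, but its shape, utility and privacy impact per use case, plus a transparency requirement, can be captured as structured data. The following is a purely illustrative sketch with a hypothetical record type, not the University of California's actual process:

```python
from dataclasses import dataclass, field

@dataclass
class BalancingAnalysis:
    """One entry per use case: what utility does the monitoring provide,
    and what is the privacy impact if it's hacked, subpoenaed, abused
    by an employee, or shared with partners?"""
    proposal: str
    utility: dict = field(default_factory=dict)         # use case -> benefit
    privacy_impact: dict = field(default_factory=dict)  # use case -> impact
    alternatives: list = field(default_factory=list)    # other ways to get the value
    published_to_community: bool = False                # transparency/accountability

    def unreviewed_use_cases(self):
        # Use cases with claimed utility but no assessed privacy impact
        return sorted(set(self.utility) - set(self.privacy_impact))

    def ready_for_review(self):
        return not self.unreviewed_use_cases() and self.published_to_community

a = BalancingAnalysis("continuous network monitoring")
a.utility["incident forensics"] = "trace attacks after the fact"
a.privacy_impact["incident forensics"] = "retained traffic logs"
a.utility["subpoena response"] = "legal compliance"
print(a.unreviewed_use_cases())  # ['subpoena response']
print(a.ready_for_review())     # False
```

The point of the sketch is only that every use case with claimed utility needs a matching privacy-impact assessment, and that the whole analysis is published to the monitored community before it proceeds.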
So as you look at wearables and the impact of wearables on this whole privacy conversation, what are some of the big gotcha issues that are really starting to surface as these things get more popular? >> Yeah, I think a lot of the same kinds of questions apply to wearables as to cybersecurity monitoring: what kind of monitoring you're doing, what's the utility, what's the privacy impact, and how you balance those in the various scenarios and use cases that come up. We're finding that college athletics and university-sponsored use of wearable technology is really just in its infancy right now. It's not a big thing that we're working on yet, but it very much parallels the other kinds of questions we've been talking about around learning data. How you jump, or how your body functions, is very private, very intimate. How you think, how you learn, that's right up there on the privacy and intimacy scale. So we've been talking quite a bit in the university space about learning data and how we protect it. Some of the questions are: who owns that data? It's about me, the student, so should I have control over how that information is used? With learning data, for the average student there may not be outside folks interested in that information, but when you're talking about student athletes, potentially going pro, that's very valuable data that people may want to pay for. Maybe the student should have some say in the use of that data, in monetizing that data. Who owns it? Is it the student, is it the university, is it the company that we work with to provide that kind of monitoring and the analytics on it? >> Jeff: Right, right.
>> Even if it's through the university right now, where we have a contract and hopefully have made really clear who has ownership, where the uses lie, and what kinds of things we can do with it, as we move into more of a consumer space, where you're just clicking a box, students may be asked, oh, use this technology, it's free, and we'll be able to handle it, because of course, how much it costs is important in the university space. >> Give you free slices at the pizza store. >> Right. Once we get into that consumer realm, where you don't even have to click the box, the box is already clicked, that's where students may be giving away data for reasons or uses that they didn't intend, that they're not getting any compensation for, and in particular cases, when you talk about student athletes, that could be something very meaningful for their career and beyond. >> Yeah, or is it the guy who's come up with the unique and innovative training methodology that they're testing, or is it Berkeley's information, to see how people are learning so you can incorporate that into your lesson plans and the way that you teach? There are so many angles, but it always comes back, as you said, to context: what's the context for the specific application, and should you or should you not have rights in that context? It's a really interesting space, a lot of interesting challenges, and like I said, job security for you for the unforeseeable future. >> Yeah, we're not going to run out of new and exciting applications and things to think about in terms of privacy. It's just nonstop. >> Right, 'cause these are not technology questions, right? These are policy questions and rules questions.
We heard a talk last night at the center, and one of the topics was that we need a lot more rules around these types of things, because the technology is outpacing the governance rules, and really the thought processes around the ways these things can all be used. >> It's a culture question, really. It's more than just what you allow or not, it's how we feel about it, and the idea that privacy is dead is only true if we don't care about it anymore. So if we care about it and we pay attention to it, then privacy is not dead. >> Alright, well Lisa, we'll leave it there. Lisa Ho from UC Berkeley, fantastic. Thank you for stopping by, and good luck at your wearables panel later this afternoon. >> Thank you. >> Alright, I'm Jeff Frick. You're watching theCUBE, thanks for watching. (upbeat music)
Breaking Analysis: Supercloud2 Explores Cloud Practitioner Realities & the Future of Data Apps
>> Narrator: From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> Enterprise tech practitioners, like most of us, want to make their lives easier so they can focus on delivering more value to their businesses. To do so, they want to tap best-of-breed services in the public cloud, but at the same time connect their on-prem intellectual property to emerging applications that drive top-line revenue and bottom-line profits. But creating a consistent experience across clouds and on-prem estates has been an elusive capability for most organizations, forcing trade-offs and injecting friction into the system. The need to create seamless experiences is clear, and the technology industry is starting to respond with platforms, architectures, and visions of what we've called the Supercloud. Hello, and welcome to this week's Wikibon CUBE Insights powered by ETR. In this Breaking Analysis we give you a preview of Supercloud 2, the second event of its kind that we've had on the topic. Yes, folks, that's right, Supercloud 2 is here. As of this recording, it's just about four days away: 33 guests, 21 sessions, combining live discussions and fireside chats from theCUBE's Palo Alto studio with prerecorded conversations on the future of cloud and data. You can register for free at supercloud.world, and we are super excited about the Supercloud 2 lineup of guests. Whereas Supercloud22 in August was all about refining the definition of Supercloud, testing its technical feasibility, and understanding various deployment models, Supercloud 2 features practitioners, technologists, and analysts discussing what customers need, with real-world examples of Supercloud, and it will expose thinking around a new breed of cross-cloud apps, data apps if you will, that change the way machines and humans interact with each other.
Now, as an example, think about applications today, say a CRM system. Sales reps, what are they doing? They're entering data into opportunities, they're choosing products, they're importing contacts, et cetera. And sure, the machine can then take all that data and spit out a forecast by rep, by region, by product, et cetera. But today's applications are largely about filling in forms and/or codifying processes. In the future, the Supercloud community sees a new breed of applications emerging, where data resides on different clouds, in different data stores, databases, lakehouses, et cetera, and the machine uses AI to inspect the e-commerce system, the inventory data, supply chain information, and other systems, and puts together a plan without any human intervention whatsoever. Think about a system that orchestrates people, places, and things, like an Uber for business. So at Supercloud 2, you'll hear about this vision along with some of today's challenges facing practitioners. Zhamak Dehghani, the founder of data mesh, is a headliner. Kit Colbert also is headlining; he laid out at the first Supercloud an initial architecture for what that's going to look like. That was last August, and he's going to present his most current thinking on the topic. Veronika Durgin of Saks will be featured, talking about data sharing across clouds and what she needs in the future. One of the main highlights of Supercloud 2 is a dive into Walmart's Supercloud. Other featured practitioners include Western Union, Ionis Pharmaceuticals, and Warner Media. We've got deep, deep technology dives with folks like Bob Muglia, David Flynn, Tristan Handy of DBT Labs, Nir Zuk, the founder of Palo Alto Networks, focused on security, and Thomas Hazel, who's going to talk about a new type of database for Supercloud. There are several analysts, including Keith Townsend, Maribel Lopez, George Gilbert, and Sanjeev Mohan, and so many more guests we don't have time to list them all.
They're all up on supercloud.world with a full agenda, so you can check that out. Now let's take a look at some of the things that we're exploring in more detail, starting with the Walmart Cloud Native Platform, which they call WCNP. We definitely see this as a Supercloud, and we dig into it with Jack Greenfield. He's the head of architecture at Walmart. Here's a quote from Jack: "WCNP is an implementation of Kubernetes for the Walmart ecosystem. We've taken Kubernetes off the shelf as open source." By the way, they do the same thing with OpenStack. "And we have integrated it with a number of foundational services that provide other aspects of our computational environment. Kubernetes off the shelf doesn't do everything." And so what Walmart chose to do was take a do-it-yourself approach to build a Supercloud, for a variety of reasons that Jack will explain, along with Walmart's so-called triplet architecture connecting on-prem, Azure, and GCP. No surprise, there's no Amazon at Walmart, for obvious reasons. And what they do is create a common experience for devs across clouds. Jack is going to talk about how Walmart is evolving its Supercloud in the future. You don't want to miss that. Next, let's take a look at how Veronika Durgin of Saks thinks about data sharing across clouds. Data sharing, we think, is a potential killer use case for Supercloud. In fact, let's hear it in Veronika's own words. Please play the clip. >> How do we talk to each other? And more importantly, how do we data share? You know, I work with data, this is what I do. So if I want to get data from a company that's using, say, Google, how do we share it in a smooth way, where it doesn't have to be this crazy, I don't know, SFTP file moving? So that's where I think Supercloud comes in, in my mind: practical applications. How do we create that mesh, that network, so that we can easily share data with each other?
>> Now data mesh is a possible architectural approach that will enable more facile data sharing and the monetization of data products. You'll hear Zhamak Dehghani live in studio, talking about what standards are missing to make this vision a reality across the Supercloud. Now one of the other things that we're really excited about is digging deeper into the right approach for Supercloud adoption, and we're going to share a preview of a debate that's going on right now in the community. Bob Muglia, former CEO of Snowflake and Microsoft exec, was kind enough to spend some time looking at the community's Supercloud definition, and he felt that it needed to be simplified. So in near real time he came up with the following definition that we're showing here. I'll read it: "A Supercloud is a platform that provides programmatically consistent services hosted on heterogeneous cloud providers." So not only did Bob simplify the initial definition, he stressed that the Supercloud is a platform versus an architecture, implying that the platform provider, e.g., Snowflake, VMware, Databricks, Cohesity, et cetera, is responsible for determining the architecture. Now interestingly, in the shared Google doc that the working group uses to collaborate on the Supercloud definition, Dr. Nelu Mihai, who is actually building a Supercloud, responded as follows to Bob's assertion: "We need to avoid creating many Supercloud platforms with their own architectures. If we do that, then we create other proprietary clouds on top of existing ones. We need to define an architecture of how Supercloud interfaces with all other clouds. What is the information model? What is the execution model? And how will users interact with Supercloud?" What does this seemingly nuanced point tell us and why does it matter?
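Bob's phrase "programmatically consistent services hosted on heterogeneous cloud providers" can be pictured with a toy sketch: one interface, with a different provider implementation behind it for each cloud. The store classes below are in-memory stand-ins, not real AWS or GCP SDK calls:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """One programmatic interface; each heterogeneous provider
    supplies its own implementation behind it."""
    @abstractmethod
    def put(self, key: str, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class FakeS3Store(ObjectStore):      # stand-in for an AWS-backed implementation
    def __init__(self): self._data = {}
    def put(self, key, value): self._data[key] = value
    def get(self, key): return self._data[key]

class FakeGCSStore(ObjectStore):     # stand-in for a GCP-backed implementation
    def __init__(self): self._data = {}
    def put(self, key, value): self._data[key] = value
    def get(self, key): return self._data[key]

def replicate(key: str, value: bytes, stores: list) -> None:
    # The caller sees one consistent API regardless of which cloud backs it
    for s in stores:
        s.put(key, value)

stores = [FakeS3Store(), FakeGCSStore()]
replicate("report.csv", b"q1,q2", stores)
print(all(s.get("report.csv") == b"q1,q2" for s in stores))  # True
```

The debate above is essentially about who owns this interface: in the platform view, a vendor defines `ObjectStore`; in the architecture view, it would be a standard that every provider implements.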
Well, history suggests that de facto standards emerge more quickly to resolve real-world practitioner problems, and catch on faster than consensus-based and standards-based architectures. But in the long run, the latter may serve customers better. So we'll be exploring this topic in more detail in Supercloud 2, and of course we'd love to hear what you think. Platform, architecture, both? Now one of the real technical gurus that we'll have in studio at Supercloud 2 is David Flynn. He's one of the people behind the movement that enabled enterprise flash adoption, that craze. And he did that with Fusion-io, and he is now working on a system to enable read-write data access for any user, in any application, in any data center or on any cloud, anywhere. So think of this company as a Supercloud enabler. Allow me to share an excerpt from a conversation David Floyer and I had with David Flynn last year. He as well gave a lot of thought to the Supercloud definition and was really helpful with an opinionated point of view. He said something to us that was, we thought, relevant: "What is the operating system for a decentralized cloud? The main two functions of an operating system or an operating environment are, one, the process scheduler, and two, the file system. The strongest argument for Supercloud is made when you go down to the platform layer and talk about it as an operating environment on which you can run all forms of applications." So a couple of implications here that we'll be exploring with David Flynn in studio. First, we're inferring from his comment that he's in the platform camp, where the platform owner is responsible for the architecture, and there are obviously trade-offs there and benefits, but we'll have to clarify that with him. And second, he's basically saying you kill the concept the further you move up the stack, because the further up the stack you go, the weaker the Supercloud argument becomes; it's just becoming SaaS.
Now this is something we're going to explore to better understand his thinking on this, but also whether the existing notion of SaaS is changing and whether or not a new breed of Supercloud apps will emerge. Which brings us to this really interesting fellow that George Gilbert and I riffed with ahead of Supercloud 2. Tristan Handy, he's the founder and CEO of dbt Labs, and he has a highly opinionated and technical mind. Here's what he said: "One of the things that we still don't know how to API-ify is concepts that live inside of your data warehouse, inside of your data lake. These are core concepts that the business should be able to create applications around very easily. In fact, that's not the case, because it involves a lot of data engineering pipeline and other work to make these available. So if you really want to make it easy to create these data experiences for users, you need to have an ability to describe these metrics and then to turn them into APIs to make them accessible to application developers who have literally no idea how they're calculated behind the scenes, and they don't need to." A lot of implications to this statement that we'll explore at Supercloud 2. This is where Zhamak Dehghani's data mesh comes into play, with her critique of hyper-specialized data pipeline experts with little or no domain knowledge. Also the need for simplified self-service infrastructure, which Kit Colbert is likely going to touch upon. Veronika Durgin of Saks and her ideal state for data sharing, along with Harveer Singh of Western Union. They've got to deal with 200 locations around the world, and data privacy issues, data sovereignty. How do you share data safely? Same with Nick Taylor of Ionis Pharmaceuticals. And not to blow your mind, but Thomas Hazel and Bob Muglia posit that to make data apps a reality across the Supercloud, you have to rethink everything. You can't just let in-memory databases and caching architectures take care of everything in a brute-force manner.
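Tristan's "describe these metrics and then turn them into APIs" idea can be sketched in a few lines. This is a hedged illustration with made-up names, not dbt's actual semantic-layer interface: the business definition lives in one registry, and application developers call it by name without knowing how it's computed.

```python
# Minimal sketch of a metrics layer: define once, expose as a callable API.
# Hypothetical names; not dbt's actual semantic-layer interface.

METRICS = {}

def metric(name):
    """Register a metric definition under a stable, business-facing name."""
    def register(fn):
        METRICS[name] = fn
        return fn
    return register

@metric("revenue")
def _revenue(rows):
    # The "behind the scenes" calculation app developers never see.
    return sum(r["price"] * r["qty"] for r in rows if not r["refunded"])

def get_metric(name, rows):
    """The API surface: callers only know the metric's name."""
    return METRICS[name](rows)

sales = [
    {"price": 10.0, "qty": 2, "refunded": False},
    {"price": 99.0, "qty": 1, "refunded": True},
    {"price": 5.0, "qty": 4, "refunded": False},
]
print(get_metric("revenue", sales))  # 40.0
```

The point of the sketch is the separation: if the definition of "revenue" changes, every caller of `get_metric("revenue", ...)` picks it up without any pipeline rework on their side.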
Instead, they argue, you have to get down to really detailed levels, even things like how data is laid out on disk, i.e. flash, and think about rewriting applications for the Supercloud and the ML/AI era. All of this and more at Supercloud 2, which wouldn't be complete without some data. So we pinged our friends from ETR, Eric Bradley and Daren Brabham, to see if they had any data on Supercloud that we could tap, and so we're going to be analyzing a number of the players as well at Supercloud 2. Now, many of you are familiar with this graphic here. We show some of the players involved in delivering or enabling Supercloud-like capabilities. On the Y axis is spending momentum, and on the horizontal axis is market presence, or pervasiveness in the data. So Net Score versus what they call Overlap, or N, in the data. And the table insert shows how the dots are plotted. Now, not to steal ETR's thunder, but the first point is you really can't have Supercloud without the hyperscale cloud platforms, which is shown on this graphic. But the exciting aspect of Supercloud is the opportunity to build value on top of that hyperscale infrastructure. Snowflake here continues to show strong spending velocity, as do Databricks, HashiCorp and Rubrik. VMware Tanzu, which we all put under the magnifying glass after the Broadcom announcements, is also showing momentum. Unfortunately, due to a scheduling conflict we weren't able to get Red Hat on the program, but they're clearly a player here. And we've put Cohesity and Veeam on the chart as well, because backup is a likely use case across clouds and on-premises. Now, one other call-out that we drill down on at Supercloud 2 is Cloudflare, which actually uses the term Supercloud, maybe in a different way. They look at Supercloud really as, you know, serverless on steroids. And so the data brains at ETR will have more to say on this topic at Supercloud 2, along with many others. Okay, so why should you attend Supercloud 2? What's-in-it-for-me kind of thing?
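A quick note for readers unfamiliar with the chart's axes: ETR's Net Score is, roughly, the share of survey respondents increasing spend on a vendor minus the share decreasing, and the horizontal axis reflects how often the vendor shows up in the survey. Here is a simplified sketch of that arithmetic; this is our approximation for illustration, not ETR's exact methodology.

```python
# Simplified sketch of how a dot lands on an ETR-style chart.
# Our rough approximation of Net Score, not ETR's exact methodology.

def net_score(responses):
    """responses: list of 'more', 'flat', or 'less' spending intentions."""
    n = len(responses)
    more = sum(1 for r in responses if r == "more")
    less = sum(1 for r in responses if r == "less")
    return 100.0 * (more - less) / n

def chart_position(responses, total_survey_n):
    # y = spending momentum, x = pervasiveness in the data ("N")
    return net_score(responses), len(responses) / total_survey_n * 100.0

answers = ["more"] * 55 + ["flat"] * 35 + ["less"] * 10  # 100 responses
y, x = chart_position(answers, total_survey_n=1000)
print(y, x)  # 45.0 10.0
```

So a vendor where far more customers are adding spend than cutting it plots high on the vertical axis, and one cited by many respondents plots far to the right.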
So first of all, if you're a practitioner and you want to understand what the possibilities are for doing cross-cloud services, for monetizing data, how your peers are doing data sharing, how some of your peers are actually building out a Supercloud, you're going to get real-world input from practitioners. If you're a technologist and you're trying to figure out various ways to solve problems around data, data sharing and cross-cloud service deployment, there are going to be a number of deep technology experts who are going to share how they're doing it. We're also going to drill down with Walmart into a practical example of Supercloud, with some other examples of how practitioners are dealing with cross-cloud complexity. Some of them, by the way, are kind of throwing up their hands and saying, hey, we're going mono-cloud. And we'll talk about the potential implications, dangers and risks of doing that, and also some of the benefits. You know, there's a question, right? Is Supercloud the same wine in a new bottle, or is it truly something different that can drive substantive business value? So look, go to supercloud.world. It's January 17th at 9:00 AM Pacific. You can register for free and participate directly in the program. Okay, that's a wrap. I want to give a shout out to the Supercloud supporters. VMware has been a great partner as our anchor sponsor, and ChaosSearch, Proximo and Alura as well, for contributing to the effort. I want to thank Alex Myerson, who's on production and manages the podcast, with Ken Schiffman as his supporting cast, and Kristen Martin and Cheryl Knight, who help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at SiliconANGLE. Thank you all. Remember, these episodes are all available as podcasts. Wherever you listen, we really appreciate the support that you've given. We just saw some stats from Buzzsprout: we hit the top 25% and we're almost at 400,000 downloads last year. So really appreciate your participation.
All you've got to do is search "Breaking Analysis podcast" and you'll find those. I publish each week on wikibon.com and siliconangle.com. Or if you want to get a hold of me, you can email me directly at David.Vellante@siliconangle.com, DM me @DVellante, or comment on our LinkedIn posts. And do check out etr.ai. They've got the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching. We'll see you next week at Supercloud 2, or next time on Breaking Analysis. (light music)
Breaking Analysis: Grading our 2022 Enterprise Technology Predictions
>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> Making technology predictions in 2022 was tricky business, especially if you were projecting the performance of markets, identifying IPO prospects, or making binary forecasts on data, AI, the macro spending climate and other related topics in enterprise tech. 2022, of course, was characterized by a seesaw economy where central banks were restructuring their balance sheets, the war in Ukraine fueled inflation, supply chains were a mess, and the unintended consequences of the forced march to digital and the acceleration are still being sorted out. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis we continue our annual tradition of transparently grading last year's enterprise tech predictions. And you may or may not agree with our self-grading system, but look, we're gonna give you the data and you can draw your own conclusions, and tell you what, tell us what you think. >> All right, let's get right to it. So our first prediction was tech spending increases by 8% in 2022. And as we exited 2021, CIOs were optimistic about their digital transformation plans. You know, they rushed to make changes to their business and were eager to sharpen their focus, continue to iterate on their digital business models and plug the holes in the learnings that they had. And so we predicted that 8% rise in enterprise tech spending, which looked pretty good until Ukraine, and the Fed decided that, you know, it had to rush and make up for lost time. We kind of nailed the momentum in the energy sector, but we can't give ourselves too much credit for that layup. And as of October, Gartner had IT spending growing at just over 5%, I think it was 5.1%. So we're gonna take a C plus on this one and move on. >> Our next prediction was basically kind of a slow ground ball.
To second base, if I'm being honest, but we felt it was important to highlight that security would remain front and center as the number one priority for organizations in 2022. As is our tradition, you know, we try to up the degree of difficulty by specifically identifying companies that are gonna benefit from these trends. So we highlighted some possible IPO candidates, which of course didn't pan out. Snyk was on our radar. The company had just had to do another raise, and they recently took a valuation hit and it was a down round. They raised 196 million, so a good chunk of cash, but not the IPO that we had predicted. Aqua Security's focus on containers and cloud native, that was a trendy call, and we thought maybe an MSSP, or multiple managed security service providers like Arctic Wolf, would IPO, but no way that was happening in the crummy market. >> Nonetheless, we think these types of companies are still faring well, as the talent shortage in security remains really acute, particularly in the sort of mid-size and small businesses that often don't have a SOC. Lacework laid off 20% of its workforce in 2022, and co-CEO Dave Hatfield left the company, so that IPO didn't happen. It was probably too early for Lacework anyway. Meanwhile, you've got Netskope, which we've cited as strong in the ETR data, particularly in the Emerging Technology Survey. And then, you know, Illumio holding its own. You know, we never liked that 7 billion dollar price tag that Okta paid for Auth0, but we loved the TAM expansion strategy to target developers beyond Okta's enterprise strength. But we've gotta take some points off for the failure thus far of Okta to really nail the integration and the go-to-market model with Auth0 and bring that into the core Okta.
>> So the focus on endpoint security, that was a winner in 2022, as CrowdStrike led that charge with others holding their own, not the least of which was Palo Alto Networks as it continued to expand beyond its core network security and firewall business, you know, through acquisition. So overall we're gonna give ourselves an A minus for this relatively easy call, but again, we had some specifics associated with it to make it a little tougher. And of course we're watching very closely this coming year, in 2023, the vendor consolidation trend. You know, according to a recent Palo Alto Networks survey of 1,300 SecOps pros, on average organizations have more than 30 security tools to manage. So this is a logical way to optimize cost: consolidating redundant vendors. The ETR data shows that's clearly a trend that's on the upswing. >> Now moving on, a big theme of 2020 and 2021 of course was remote work, hybrid work, new ways to work and return to work. So we predicted in 2022 that hybrid work models would become the dominant protocol, which clearly is the case. We predicted that about 33% of the workforce would come back to the office in 2022. In September, the ETR data showed that figure was at 29%, but organizations expected that 32% would be in the office, you know, pretty much full-time by year end. That hasn't quite happened, but we were pretty close with the projection, so we're gonna take an A minus on this one. Now, supply chain disruption was another big theme that we felt would carry through 2022. And sure, that sounds like another easy one, but as is our tradition, again, we try to put some binary metrics around our predictions to put some meat on the bone, so to speak, and allow us, and you, to say, okay, did it come true or not? >> So we had some data that we presented last year on supply chain issues impacting hardware spend.
We said at the time, and you can see this on the left-hand side of this chart, that PC and laptop demand would remain above pre-COVID levels, which would reverse a decade of year-on-year declines, which I think started in around 2011, 2012. Now, while demand is down this year pretty substantially relative to 2021, IDC has worldwide unit shipments for PCs at just over 300 million for '22. If you go back to 2019, you're looking at around, let's say, 260 million units shipped globally, you know, roughly. So, you know, pretty good call there. Definitely much higher than pre-COVID levels. But so what, you might be asking, why the B? Well, we projected that 30% of customers would replace security appliances with cloud-based services, and that more than a third would replace their internal data center server and storage hardware with cloud services, like 30 and 40% respectively. >> And we don't have explicit survey data on exactly these metrics, but anecdotally we see this happening in earnest. And we do have some data that we're showing here on cloud adoption from ETR's October survey, where the midpoint of workloads running in the cloud is around 34% and forecast, as you can see, to grow steadily over the next three years. So look, we understand it's not a one-to-one correlation with our prediction, but it's a pretty good bet that we were right. We've gotta take some points off, though, we think, for the lack of unequivocal proof, 'cause again, we always strive to make our predictions in ways that can be measured as accurate or not. Is it binary? Did it happen, did it not? Kind of like an OKR. And, you know, we strive to provide data as proof, and in this case it's a bit fuzzy. We have to admit that, and although we're pretty comfortable that the prediction was accurate, look, when you make a hard forecast, sometimes you've gotta pay the price. >> All right, next. We said in 2022 that the big four cloud players would generate 167 billion dollars in IaaS and PaaS revenue, combining for 38% market growth. And our current forecasts are shown here with a comparison to our January 2022 figures. So coming into this year, where we are today, we currently expect 162 billion in total revenue and a 33% growth rate. Still very healthy, but not on our mark. So we think AWS is gonna miss our predictions by about a billion dollars. Not bad for an 80 billion dollar company, but they're not gonna hit that expectation of getting really close to a hundred billion run rate. We thought they'd exit the year, you know, closer to 25 billion a quarter, and we don't think they're gonna get there. >> Look, we pretty much nailed Azure, and even though our prediction was correct about Google Cloud Platform surpassing Alibaba, we way overestimated the performance of both of those companies. So we're gonna give ourselves a C plus here. And we think, yeah, you might think it's a little bit harsh, we could argue for a B minus to the professor, but the misses on GCP and Alibaba we think warrant a self-penalty on this one. All right, let's move on to our prediction about Supercloud. We said it becomes a thing in 2022, and we think by many accounts it has. Despite the naysayers, we're seeing clear evidence that the concept of a layer of value-add that sits above and across clouds is taking shape. And on this slide we showed just some of the pickup in the industry. I mean, one of the most interesting is Cloudflare, the biggest Supercloud antagonist. >> Charles Fitzgerald even predicted that no vendor would ever use the term in their marketing, and that would be proof, if it happened, that Supercloud was a thing, and he said it would never happen. Well, Cloudflare has, and they launched their version of Supercloud at their Developer Week. Chris Mellor of The Register put out a Supercloud block diagram, something else that Charles Fitzgerald was pushing us for, and rightly so, it was a good call on his part. And Chris actually came up with one that's pretty good. David Linthicum also has produced a block diagram, kind of similar. David uses the term metacloud and the term supercloud kind of interchangeably to describe that trend, and so we're aligned on that front. Brian Gracely has covered the concept on the popular Cloudcast podcast. Berkeley launched the Sky Computing initiative. >> You read through that white paper, and many of the concepts highlighted in the Supercloud 3.0 community-developed definition align with it. Walmart launched a platform with many of the salient Supercloud attributes. So did Goldman Sachs, so did Capital One, so did Nasdaq. So, you know, sorry, you can hate the term, but very clearly the evidence is gathering for the Supercloud storm. We're gonna take an A plus on this one. Sorry, haters. All right, let's talk about data mesh. In our '21 predictions post, we said that in the 2020s, 75% of large organizations are gonna re-architect their big data platforms. So kind of a decade-long prediction. We don't like to do that always, but sometimes it's warranted. And because it was a longer-term prediction, at the time, coming into '22 when we were evaluating our '21 predictions, we took a grade of incomplete, because it's a better-part-of-the-decade prediction. So earlier this year we said our number seven prediction was data mesh gains momentum in '22, but largely confined to narrow data problems with limited scope, as you can see here with some of the key bullets. So there's a lot of discussion in the data community about data mesh, and while there are an increasing number of examples, JPMorgan Chase, Intuit, HSBC, HelloFresh and others that are completely re-architecting parts of their data platforms, completely re-architecting entire data platforms is non-trivial. There are organizational challenges, data ownership debates and technical considerations, and in particular, two of the four fundamental data mesh principles, the need for self-serve infrastructure and federated computational governance, are challenging. Look, democratizing data and facilitating data sharing creates conflicts with regulatory requirements around data privacy. As such, many organizations are being really selective with their data mesh implementations, hence our prediction of narrowing the scope of data mesh initiatives. >> I think that was right on. JPMC is a good example of this, where you've got a single group within a division narrowly implementing the data mesh architecture. They're using AWS, they're using data lakes, they're using AWS Glue, creating a catalog, and a variety of other techniques to meet their objectives. They're kind of automating data quality, and it was a pretty well thought out and interesting approach, and I think it's gonna be made easier by some of the announcements that Amazon made at the recent re:Invent, particularly trying to eliminate ETL, better connections between Aurora and Redshift, and better data sharing with the data clean room. So a lot of that is gonna help. Of course, Snowflake has been on this for a while now. Many other companies are facing, you know, limitations, as we said here in this slide, with their Hadoop data platforms. They need to do some new thinking around that to scale. HelloFresh is a really good example of this. >> Look, the bottom line is that organizations want to get more value from data, and having centralized, highly specialized teams that own the data problem has been a barrier and a blocker to success. The data mesh starts with organizational considerations, as described in great detail by Ash Nair of Warner Brothers. So take a listen to this clip. >> Yeah, so when people think of Warner Brothers, you always think of like the movie studio, but we're more than that, right? I mean, you think of HBO, you think of TNT, you think of CNN. We have 30-plus brands in our portfolio and each has their own needs. So the idea of a data mesh really helps us, because what we can do is we can federate access across the company so that, you know, CNN can work at their own pace. You know, when there's election season, they can ingest their own data and they don't have to, you know, bump up against, as an example, HBO if Game of Thrones is going on. >> So it's often the case that data mesh is in the eyes of the implementer. And while a company's implementation may not strictly adhere to Zhamak Dehghani's vision of data mesh, and that's okay, the goal is to use data more effectively. And despite Gartner's attempts to de-position data mesh in favor of the somewhat confusing, or frankly far more confusing, data fabric concept that they stole from NetApp, data mesh is taking hold in organizations globally today. So we're gonna take a B on this one. The prediction is shaping up the way we envisioned, but as we previously reported, it's gonna take some time, the better part of a decade in our view. New standards have to emerge to make this vision become reality, and they'll come in the form of both open and de facto approaches. Okay, our eighth prediction last year focused on the face-off between Snowflake and Databricks. >> And we realize this is a popular topic, and maybe one that's getting a little overplayed, but these are two companies that initially, you know, looked like they were shaping up as partners, and by the way, they are still partnering in the field. But if you go back a couple years, the idea of using AWS infrastructure and Databricks machine intelligence, applying that on top of Snowflake as a facile data warehouse, is still very viable. But both of these companies have much larger ambitions. They've got big total available markets to chase and large valuations that they have to justify. So what's happening is, as we've previously reported, each of these companies is moving toward the other firm's core domain, and they're building out an ecosystem that'll be critical for their future. So as part of that effort, we said each is gonna become an aggressive investor and maybe start doing some M&A, and they have, in various companies. >> And on this chart that we produced last year, we listed some of the companies that were targets, and we've added some recent investments of both Snowflake and Databricks. As you can see, they've both, for example, invested in Alation. Snowflake's put money into Lacework, the security firm, ThoughtSpot, which is trying to democratize data with AI, and Collibra, a governance platform. And you can see Databricks' investments in data transformation with dbt Labs, Matillion doing simplified business intelligence, and Hunters, so that's, you know, their security investment, and so forth. So other than our thought that we'd see a Databricks IPO last year, this prediction has been pretty spot on. So we'll give ourselves an A on that one. Now, observability has been a hot topic, and we've been covering it for a while with our friends at ETR, particularly Eric Bradley. Our number nine prediction last year was basically that if you're not cloud native in observability, you are gonna be in big trouble. >> So everybody's gotta go cloud native, and that's clearly been the case. Splunk, the big player in the space, has been transitioning to the cloud, and it hasn't always been pretty, as we reported. Datadog has real momentum, the ELK stack, that's the open source model, and you've got new entrants that we've cited before, like Observe, Honeycomb, ChaosSearch and others that we've reported on. They're all born in the cloud. So we're gonna take another A on this one. Admittedly, yeah, it's a reasonably easy call, but you gotta have a few of those in the mix. Okay, our last prediction, our number 10, was around events, something theCUBE knows a little bit about. We said that a new category of events would emerge as hybrid, and that for the most part has happened. So that's gonna be the mainstay is what we said, that pure-play virtual events are gonna give way to hybrid. >> And the narrative is that virtual-only events are, you know, good for quick hits but lousy replacements for in-person events. And, you know, that said, organizations of all shapes and sizes learned how to create better virtual content and support remote audiences during the pandemic. So when we said pure play is gonna give way to hybrid, we implied, or specified, that the physical event, that VIP experience, is going to define the overall experience, and those VIP events would create a little FOMO, fear of missing out, and a virtual component would overlay that, serving an audience 10x the size of the physical. We saw two really good examples of that. Red Hat Summit in Boston, a small event, a couple thousand people, served tens of thousands, you know, online. Second was the Google Cloud Next VIP event in New York City. Everything else was virtual. You know, even examples of our prediction of metaverse-like immersion have popped up, and, you know, other companies are doing roadshows, as we predicted. A lot of companies are doing it. You're seeing that as a major trend, where organizations are going with their sales teams out into the regions and doing a little belly-to-belly action as opposed to the big giant event. That's definitely a trend that we're seeing. So in reviewing this prediction, the grade we gave ourselves is, you know, maybe a bit unfair. You could argue for a higher grade, but organizations still haven't figured it out. They have hybrid experiences, but they generally do a really poor job of leveraging the afterglow of an event. It still tends to be one and done: let's move on to the next event or the next city, and let the sales team pick up the pieces, if they were paying attention. So because of that, we're only taking a B plus on this one. Okay, so that's the review of last year's predictions. You know, overall, if you average out our grades on the 10 predictions, that comes out to a B plus. I dunno why we can't seem to get that elusive A, but we're gonna keep trying. Our friends at ETR and we are starting to look at the data for 2023 from the surveys and all the work that we've done on theCUBE and our analysis, and we're gonna put together our predictions. We've had literally hundreds of inbounds from PR pros pitching us. We've got this huge, thick folder that we've started to review with our yellow highlighter, and our plan is to review it this month, take a look at all the data, get some ideas from the inbounds, and then the ETR January survey is in the field. It's probably got a little over a thousand responses right now; you know, they'll get up to, you know, 1,400 or so. And once we've digested all that, we're gonna go back and publish our predictions for 2023 sometime in January. So stay tuned for that. All right, we're gonna leave it there for today. I want to thank Alex Myerson, who's on production and manages the podcast, and Ken Schiffman as well, out of our Boston studio. A really heartfelt thank you to Kristen Martin and Cheryl Knight and their team, who help get the word out on social and in our newsletters. Rob Hof is our editor-in-chief over at SiliconANGLE, who does some great editing for us. Thank you all. Remember, all these episodes are available as podcasts. Wherever you listen, just search "Breaking Analysis podcast." We're really getting some great traction there, so appreciate you guys subscribing. I publish each week on wikibon.com and siliconangle.com, or you can email me directly at david.vellante@siliconangle.com, DM me @DVellante, or comment on my LinkedIn posts. And please check out etr.ai for the very best survey data in the enterprise tech business. Some awesome stuff in there. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.
SUMMARY :
From theCUBE Studios in Palo Alto and Boston, a review of last year's predictions under a self-grading system, with the data laid out so viewers can draw their own conclusions. The team nailed the momentum and the energy but not the IPOs predicted for the likes of Aqua Security, with Illumio holding its own; the focus on endpoint security was a winner in 2022, with CrowdStrike leading that charge; PC and laptop demand remained soft as called; evidence kept gathering for the supercloud concept despite critics of the term; and data mesh stayed largely confined to narrow data problems of limited scope, while big virtual-only events gave way to smaller regional hybrid gatherings. Averaged across the 10 predictions, the self-assigned grade comes out to a B plus, with the 2023 predictions, informed by the January ETR survey, due in January.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Alex Myerson | PERSON | 0.99+ |
Cheryl Knight | PERSON | 0.99+ |
Ken Schiffman | PERSON | 0.99+ |
Chris Miller | PERSON | 0.99+ |
CNN | ORGANIZATION | 0.99+ |
Rob Ho | PERSON | 0.99+ |
Alibaba | ORGANIZATION | 0.99+ |
Dave Valante | PERSON | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
5.1% | QUANTITY | 0.99+ |
2022 | DATE | 0.99+ |
Charles Fitzgerald | PERSON | 0.99+ |
Dave Hatfield | PERSON | 0.99+ |
Brian Gracely | PERSON | 0.99+ |
2019 | DATE | 0.99+ |
Lacework | ORGANIZATION | 0.99+ |
two | QUANTITY | 0.99+ |
GCP | ORGANIZATION | 0.99+ |
33% | QUANTITY | 0.99+ |
Walmart | ORGANIZATION | 0.99+ |
David | PERSON | 0.99+ |
2021 | DATE | 0.99+ |
20% | QUANTITY | 0.99+ |
Kristen Martin | PERSON | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
2020 | DATE | 0.99+ |
Ash Nair | PERSON | 0.99+ |
Goldman Sachs | ORGANIZATION | 0.99+ |
162 billion | QUANTITY | 0.99+ |
New York City | LOCATION | 0.99+ |
Databricks | ORGANIZATION | 0.99+ |
October | DATE | 0.99+ |
last year | DATE | 0.99+ |
Arctic Wolf | ORGANIZATION | 0.99+ |
two companies | QUANTITY | 0.99+ |
38% | QUANTITY | 0.99+ |
September | DATE | 0.99+ |
Fed | ORGANIZATION | 0.99+ |
JP Morgan Chase | ORGANIZATION | 0.99+ |
80 billion | QUANTITY | 0.99+ |
29% | QUANTITY | 0.99+ |
32% | QUANTITY | 0.99+ |
21 predictions | QUANTITY | 0.99+ |
30% | QUANTITY | 0.99+ |
HBO | ORGANIZATION | 0.99+ |
75% | QUANTITY | 0.99+ |
Game of Thrones | TITLE | 0.99+ |
January | DATE | 0.99+ |
2023 | DATE | 0.99+ |
10 predictions | QUANTITY | 0.99+ |
both | QUANTITY | 0.99+ |
22 | QUANTITY | 0.99+ |
ThoughtSpot | ORGANIZATION | 0.99+ |
196 million | QUANTITY | 0.99+ |
30 | QUANTITY | 0.99+ |
each | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
Palo Alto Networks | ORGANIZATION | 0.99+ |
2020s | DATE | 0.99+ |
167 billion | QUANTITY | 0.99+ |
Okta | ORGANIZATION | 0.99+ |
Second | QUANTITY | 0.99+ |
Gartner | ORGANIZATION | 0.99+ |
Eric Bradley | PERSON | 0.99+ |
Aqua Securities | ORGANIZATION | 0.99+ |
Dante | PERSON | 0.99+ |
8% | QUANTITY | 0.99+ |
Warner Brothers | ORGANIZATION | 0.99+ |
Intuit | ORGANIZATION | 0.99+ |
Cube Studios | ORGANIZATION | 0.99+ |
each week | QUANTITY | 0.99+ |
7 billion | QUANTITY | 0.99+ |
40% | QUANTITY | 0.99+ |
Snowflake | ORGANIZATION | 0.99+ |
Breaking Analysis: Even the Cloud Is Not Immune to the Seesaw Economy
>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> Have you ever been driving on the highway and traffic suddenly slows way down, and then after a little while it picks up again, and you're cruising along and you're thinking, okay, hey, that was weird, but it's clear sailing now, off we go, only to find out in a bit that the traffic is building up ahead again, forcing you to pump the brakes as the traffic pattern ebbs and flows? Well, welcome to the seesaw economy. The Fed-induced fire that prompted an unprecedented rally in tech is being purposefully extinguished now by that same Fed. And virtually every sector of the tech industry is having to reset its expectations, including the cloud segment. Hello, and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we'll review the implications of the earnings announcements from the big three cloud players, Amazon, Microsoft, and Google, who all announced this week. >> And we'll update you on our quarterly IaaS forecast and share the latest from ETR with a focus on cloud computing. Now, before we get into the new data, we wanna review something we shared with you on October 14th, just a couple weeks back. This is sort of a we-told-you-it-was-coming slide. It's an XY graph that shows ETR's proprietary net score methodology on the vertical axis, that's a measure of spending momentum, spending velocity, and an overlap or presence in the dataset on the X axis, that's really a measure of pervasiveness in the survey. The table insert there shows Wikibon's Q2 estimates of IaaS revenue for the big four hyperscalers with their year-on-year growth rates. Now, we told you at the time, this is data from the July 2022 ETR survey, and ETR hadn't released its October survey results at that time. >> This was just a couple weeks ago.
And while we couldn't share the specific data from the October survey, we were able to get a glimpse, and we depicted the slowdown that we saw in the October data with those dotted arrows kind of down and to the right. We said at the time that we were seeing an across-the-board slowdown, even for the big three cloud vendors. Now, fast forward to this past week, and we saw earnings releases from Alphabet, Microsoft, and, just last night, Amazon. Now, you may be thinking, okay, big deal, the ETR survey data didn't really tell us anything we didn't already know. But judging from the negative reaction in the stock market to these earnings announcements, the degree of softness surprised a lot of investors. Now, at the time we didn't update our forecast; it doesn't make sense for us to do that when we're that close to earnings season. >> And now that all of the big four, with the exception of Alibaba, have announced, we've updated. And so here's that data. This chart lays out our view of the IaaS and PaaS worldwide revenue. Basically, it's cloud infrastructure with an attempt to exclude any SaaS revenue, so we can make an apples-to-apples comparison across all the clouds. Now, the reason that 'actual' is in quotes is because Microsoft and Google don't report IaaS revenue, but they do give us clues and kind of directional commentary, which we then triangulate with other data that we have from the channel, ETR surveys, and just our own intelligence. Now, the second column there, after the vendor name, shows our previous estimates for Q3, and then next to that we show our actuals, same with the growth rates. And then we round out the chart with that lighter blue color, which highlights the full-year estimates for revenue and growth. So the key takeaways are that we shaved about $4 billion in revenue and roughly 300 basis points of growth off of our full-year estimates. AWS had a strong July but exited Q3 in the mid-20% growth rate, year over year.
So we're using that guidance, you know, for our Q4 estimates. Azure came in below our earlier estimates, but Google actually exceeded our expectations. Now, the compression in the numbers is, in our view, a function of the macro demand climate. We've made every attempt to adjust for constant currency, so FX should not be a factor in this data, but you can be sure that the currency effects are weighing on those companies' income statements. And so look, this is the fundamental dynamic of a cloud model, where you can dial down consumption when you need to and dial it up when you need to. >> Now, you may be thinking that many big cloud customers have a committed level of spending in order to get better discounts, and that's true. But what's happening, we think, is they'll reallocate that spend toward, let's say, for example, lower-cost storage tiers, or they may take advantage of better price-performance processors like Graviton, for example. That is a clear trend that we're seeing, and smaller companies that were perhaps paying by the drink, just on demand, are moving to reserved instance models to lower their monthly bill. So instead of taking the easy way out and just spending more, companies are reallocating their reserved capacity toward those sorts of lower-cost services. They're spending time and effort optimizing to get more for less, or get more for the same is really how we should phrase it. Whereas during the pandemic, many companies perhaps were not as focused on doing that, because business was booming and they had to respond, so they'd just, you know, spend more, dial it up. So in general, as they say, customers are doing more with the same. Now, let's look at the growth dynamic and spend some time on that; I think this is important. This data shows worldwide quarterly revenue growth rates back to Q1 2019 for the big four. So, a couple of interesting things.
The data tells us that during the pandemic, you saw both AWS and Azure buck the law of large numbers and actually accelerate growth. AWS especially saw progressively increasing growth rates throughout 2021, each quarter. Now that trend, as you can see, has reversed in 2022 for AWS. We saw Azure come down a bit, but it's still in the low forties in terms of percentage growth, while Google actually saw an uptick in growth this last quarter for GCP, by our estimates, as GCP is becoming an increasingly large portion of Google's overall cloud business. Now, unfortunately, Google Cloud continues to lose north of $850 million per quarter, whereas AWS and Azure are profitable cloud businesses, even as Alibaba is suffering its woes from China, and we'll see how they come in when they report in mid-November. The overall hyperscale market grew at 32% in Q3 in terms of worldwide revenue. So the slowdown isn't due to repatriation or competition from on-prem vendors, in our view; it's a macro-related trend, and cloud will continue to significantly outperform other sectors despite its massive size. You know, on the repatriation point, it just still doesn't show up in the data. The a16z article from Sarah Wang and Martin Casado claimed that repatriation was inevitable as a means to lower cost of goods sold for SaaS companies. You know, while that was thought-provoking, it hasn't shown up in the numbers. And if you read the financial statements of both AWS and its partners like Snowflake, and you dig into the quarterly reports, you'll see little notes and comments about their ongoing negotiations to lower cloud costs for customers. >> AWS, and no doubt execs at Azure and GCP, understand that the lifetime value of a customer is worth much more than near-term gross margin. And you can expect the cloud vendors to strike a balance between profitability, near-term profitability anyway, and customer retention.
Now, even though Google Cloud Platform saw accelerated growth, we need to put that in context for you. So GCP, by our estimate, has now crossed over the $3 billion per quarter mark; it actually did so last quarter, but its growth rate accelerated to 42% this quarter, and so that's a good sign in our view. But let's do a quick little comparison with when AWS and Azure crossed the $3 billion mark, and compare their growth rates at the time. So if you go back to Q2 2016, as we're showing in this chart, that's around the time that AWS hit $3 billion per quarter, and at the same time it was growing at 58%. Azure, by our estimates, crossed that mark in Q4 2018, and at that time was growing at 67%. Again, compare that to Google's 42%. So one would expect Google's growth rate to be higher than its competitors' at this point in the maturity of its cloud, which, you know, it's really not when compared to Azure. I mean, they're kind of comparable today, but go back to when each crossed that $3 billion mark and, looking at history, you'd like to see its growth rate at this point of a maturity model at least over 50%, which we don't believe it is. And one other point on this topic: you know, my business friend Matt Baker from Dell often says it's not a zero-sum game, meaning plenty of opportunity exists to build value on top of the hyperscalers. >> And I would totally agree; it's not a dollar-for-dollar swap if you can continue to innovate. But history will show that the first company in makes the most money, number two can do really well, and number three tends to break even. Now, maybe cloud is different, because you have Microsoft's software estate and the power behind that driving its IaaS business, and Google's ads are funding technology buildouts for Google and GCP. So, you know, we'll see how that plays out.
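As an aside, the comparison being made here, each vendor's year-over-year growth rate in the quarter it crossed the $3 billion mark, is simple arithmetic on a quarterly revenue series. A minimal Python sketch follows; the revenue figures are made up purely for illustration and are not the episode's estimates:

```python
def yoy_growth(quarterly_revenue):
    """Year-over-year growth (%) for each quarter that has a
    quarter four periods earlier to compare against."""
    return [
        (curr - prev) / prev * 100
        for prev, curr in zip(quarterly_revenue, quarterly_revenue[4:])
    ]

# Hypothetical revenue series in $B, oldest quarter first,
# crossing the ~$3B/quarter mark partway through:
revenue = [1.9, 2.1, 2.4, 2.7, 3.0, 3.3, 3.7, 4.1]
print([round(g) for g in yoy_growth(revenue)])  # -> [58, 57, 54, 52]
```

Run against a real series, the quarter where revenue first exceeds 3.0 gives you the growth rate at the crossing, which is the number being compared across AWS, Azure, and GCP.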
But right now, by this one measurement, Google is four years behind Microsoft and six years behind AWS. Now, to the point that cloud will continue to outpace other markets, let's break this down a bit in spending terms and see why this claim holds water. This is data from ETR's latest October survey that shows the granularity of its net score, or spending velocity, metric. The lime green is new adoptions, so they're adding the platform; the forest green is spending more, 6% or more; the gray bars, spending is flat, plus or minus, you know, 5%; the pinkish colors represent spending less, down 6% or worse; and the bright red shows defections, or churn, off the platform. You subtract the reds from the greens and you get what's called net score, which is that blue dot that you can see on each of the bars. So what you see in the table insert is that all three have net scores above 40%, which is a highly elevated measure. Microsoft's net score is above 60%, AWS is well into the fifties, and GCP is in the mid-forties. So all good. Now, what's happening with all three is more customers are keeping their spending flat. So a higher percentage of customers are saying our spending is now flat than in previous quarters, and that's what's accounting for the compression. But the churn of all three, even GCP, which we reported from last quarter's survey was 5x the other two, is actually very low, in the single digits, so that might have been an anomaly. So that's a very good sign in our view. You know, again, customers aren't repatriating in droves; it's just not a trend that we would bet on. Maybe it makes for FUD or, you know, a good marketing headline, but it's just not a big deal. And you can't help but be impressed with both Microsoft and AWS's performance in the survey. And as we mentioned before, these companies aren't going to give up customers to try and preserve a little bit of gross margin.
They'll do what it takes to keep people on their platforms, 'cause they'll make up for it over time with added services and improved offerings. >> Now, once these companies acquire a customer, they'll be very aggressive about keeping them. So customers, take note: you have negotiating leverage, so use it. Okay, let's look at another cut at the cloud market from the ETR data set. Here's the two-dimensional view again; it's back, it's one of our favorites. Net score, or spending momentum, plotted against presence in the data set; that's the X axis, with net score on the vertical axis. This is a view of ETR's cloud computing sector. You can see we put that magic 40% dotted red line in, and the table insert shows how the data are plotted, with net score against presence, i.e., the N in the survey. Notably, only the big three are above the 40% line of the names that we're showing here. There are others. I mean, if you put Snowflake on there, it'd be higher than any of these names, but we'll dig into that name in a later Breaking Analysis episode. Now, this is just another way of quantifying the dominance of AWS and Azure, not only relative to Google, but to the other cloud platforms out there. So we've taken the opportunity here to plot IBM and Oracle, which both own a public cloud. Their performance is largely a reflection of them migrating their install bases to their respective public clouds and/or hybrid clouds. And you know, that's fine, they're in the game. That's a point that we've made, you know, a number of times: they were able to make it to the cloud and they at least have one. But they simply don't have the business momentum of AWS and Azure, which is actually quite impressive, because AWS and Azure are now as large or larger than IBM and Oracle.
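The net score arithmetic described above, greens minus reds, can be sketched in a few lines. The following Python is an illustrative approximation of the methodology as described in this episode, with hypothetical respondent shares rather than actual survey figures:

```python
def net_score(adopting, spending_more, flat, spending_less, replacing):
    """ETR-style net score: share of accounts adding the platform or
    increasing spend, minus share decreasing spend or churning off.
    All inputs are percentages of survey respondents."""
    greens = adopting + spending_more
    reds = spending_less + replacing
    return greens - reds  # 'flat' responses don't move the score

# Hypothetical shares loosely echoing the episode's shape: an elevated
# score, with a growing 'flat' bucket compressing the number.
print(net_score(adopting=10, spending_more=48, flat=36,
                spending_less=4, replacing=2))  # -> 52
```

Note how a respondent shifting from "spending more" into "flat" lowers the score even with zero churn, which is exactly the compression dynamic described for the big three.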
>> And to show this type of continued growth that Azure and AWS show at their size is quite remarkable. And customers are starting to recognize the viability of on-prem, you know, hybrid clouds like HPE GreenLake and Dell's APEX. You know, you may say, well, that's not cloud, but if the customer thinks it is, and is reporting in the survey that it is, we're gonna continue to report this view. You know, I don't know what's happening with HPE. They had a big downtick this quarter, and I don't read too much into that, because their N is still pretty small at 53. So big fluctuations are not uncommon with those types of smaller Ns, but it's over 50. So, you know, we did notice a negative within giant public and private, which is often a bellwether; giant public and private is big public companies and large private companies, like Mars, for example. So it, you know, it looks like for HPE it could be an outlier. We saw within the Fortune 1000 that HPE's cloud looked actually really good, and it had good spending momentum in that sector. When you dig into the industry data within the ETR dataset, obviously we're not showing that here, but we'll continue to monitor that. Okay, so where does this leave us? Well, look, this is really a tactical story of currency and macro headwinds, as you can see. You know, we've laid out some of the points on this slide. The action in the stock market today, which is Friday, after some of the soft earnings reports, is really robust. You know, we'll see how it ends up in the day. So maybe this is a sign that the worst is over, but we don't think so. The visibility from tech companies is murky right now, as most are guiding down, which indicates that their conservative outlook last quarter was still too optimistic. But as it relates to cloud, that platform is not going anywhere anytime soon.
Sure, there are potential disruptors on the horizon, especially at the edge, but we're still a long ways off from the possibility that a new economic model emerges from the edge to disrupt the cloud, and the opportunities in the cloud remain strong. I mean, what other path is there, really? Private cloud? It was kind of a bandaid until the on-prem guys could get their as-a-service models rolled out, which is just now happening. The hybrid thing is real, but it's, you know, defensive for the incumbents until they can get their supercloud investments going. Supercloud implying capturing value above the hyperscaler CapEx; you know, call it what you want, what multi-cloud should have been, the metacloud, the Uber cloud, whatever you like. But there are opportunities to play offense, and that's clearly happening in the cloud ecosystem with the likes of Snowflake, Mongo, HashiCorp. Hammerspace is a startup in this area. Aviatrix, CrowdStrike, Zscaler, Okta, many, many more. And even the projects we see coming out of enterprise players like Dell, with Project Alpine, and what Pure Storage is doing, along with a number of the other backup vendors. So Q4 should be really interesting, but the real story is that the investments companies are making now to leverage the cloud for digital transformations will be paying off down the road. This is not 1999. We, you know, might have had some good ideas back then, and admittedly a lot of bad ones too, but you didn't have the infrastructure to service customers at a low enough cost like you do today. The cloud is that infrastructure, and so far it's been transformative, but it's likely the best is yet to come. Okay, let's call this a wrap. >> Many thanks to Alex Morrison, who does production and manages the podcast. Also, Ken Schiffman is our newest addition to the Boston studio. Kristin Martin and Cheryl Knight helped get the word out on social media and in our newsletters.
And Rob Hof is our editor in chief over at SiliconANGLE, who does some wonderful editing for us. Thank you. Remember, all these episodes are available as podcasts; wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com. And you can email me at david.vellante@siliconangle.com, DM me @dvellante, or comment on my LinkedIn posts. And please do check out etr.ai; they've got the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.
SUMMARY :
From theCUBE Studios in Palo Alto and Boston: the Fed-induced seesaw economy is forcing every tech sector, including cloud, to reset expectations. Following earnings from the big three (and all of the big four except Alibaba), the quarterly IaaS forecast comes down, with AWS guiding to mid-20% growth for Q4, Azure below earlier estimates, and GCP accelerating past the $3 billion per quarter mark, though more slowly than AWS and Azure grew at the same milestone. Customers are optimizing and reallocating spend rather than repatriating workloads; ETR net scores for all three hyperscalers remain elevated with low churn; hybrid clouds like HPE GreenLake and Dell's APEX are gaining customer recognition, with HPE's downtick likely an outlier; and opportunities to build supercloud value above the hyperscalers remain strong.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Alex Morrison | PERSON | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
Alibaba | ORGANIZATION | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Alphabet | ORGANIZATION | 0.99+ |
ORGANIZATION | 0.99+ | |
Rob Ho | PERSON | 0.99+ |
Cheryl Knight | PERSON | 0.99+ |
Matt Baker | PERSON | 0.99+ |
October 14th | DATE | 0.99+ |
Dell | ORGANIZATION | 0.99+ |
Oracle | ORGANIZATION | 0.99+ |
Dave Valante | PERSON | 0.99+ |
October | DATE | 0.99+ |
$3 billion | QUANTITY | 0.99+ |
Sarah Wong | PERSON | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
42% | QUANTITY | 0.99+ |
32% | QUANTITY | 0.99+ |
Friday | DATE | 0.99+ |
1999 | DATE | 0.99+ |
40% | QUANTITY | 0.99+ |
Snowflake | ORGANIZATION | 0.99+ |
5% | QUANTITY | 0.99+ |
six years | QUANTITY | 0.99+ |
3 billion | QUANTITY | 0.99+ |
2022 | DATE | 0.99+ |
Mongo | ORGANIZATION | 0.99+ |
last quarter | DATE | 0.99+ |
67% | QUANTITY | 0.99+ |
Martin Martin Kasa | PERSON | 0.99+ |
Kristin Martin | PERSON | 0.99+ |
Aviatrix | ORGANIZATION | 0.99+ |
July | DATE | 0.99+ |
CrowdStrike | ORGANIZATION | 0.99+ |
58% | QUANTITY | 0.99+ |
four years | QUANTITY | 0.99+ |
Okta | ORGANIZATION | 0.99+ |
second column | QUANTITY | 0.99+ |
Zeke Scaler | ORGANIZATION | 0.99+ |
2021 | DATE | 0.99+ |
last quarter | DATE | 0.99+ |
each week | QUANTITY | 0.99+ |
over@siliconangle.com | OTHER | 0.99+ |
Dave Ante | PERSON | 0.99+ |
Project Alpine | ORGANIZATION | 0.99+ |
Wiki Bond | ORGANIZATION | 0.99+ |
mid forties | DATE | 0.99+ |
Hashi Corp. | ORGANIZATION | 0.99+ |
one | QUANTITY | 0.99+ |
mid-November | DATE | 0.99+ |
today | DATE | 0.99+ |
each | QUANTITY | 0.99+ |
Azure | ORGANIZATION | 0.99+ |
about $4 billion | QUANTITY | 0.98+ |
Chris Wolf, VMware | VMware Explore 2022
>> Hey guys, good morning, and welcome back to theCUBE. Lisa Martin here with John Furrier. This is theCUBE's third day of wall-to-wall coverage of VMware Explore. We're very pleased to welcome one of our alumni back to the program. Chris Wolf joins us, chief research and innovation officer at VMware. Chris, welcome back to theCUBE. >> Yeah, thanks, Lisa. It's always a pleasure. >> This has been a great event. The keynote was standing room only on Tuesday morning. We've had great conversations with VMware's ecosystem, and VMware of course. What are some of the hot things going on from an R&D perspective? >> Yeah, there's a lot. I mean, we have about four or five different priorities, and these are looking at sovereign clouds and multi-cloud, edge computing, modern applications, and data services. We're doing quite a bit of work in machine learning, as well as in security. So we're a relatively large organization, but at the same time, we really look to pick our bets. So when we're doing something in ML or security, we wanna make sure that it's high quality, it's differentiated, and it adds value for VMware, our partners, and our customers. >> Where are our customers in the mix in terms of being influential in the roadmap? >> Very, very much in the mix. What we like to do in early-stage R&D is have five to 10 customers as design partners, and that really helps. And in addition to that, as we get closer to go-to-market, we look to line up between one and three of our SI partners as well, to really help us. You know, in a large company, sometimes your organic innovations can get lost in the shuffle, and when we have passionate SIs that are like, yes, we want to take this forward with you together, that's just awesome. And it also helps us to understand, at a very early stage, what are the integration requirements?
So we're not just thinking about the core product itself; how it would play in the ecosystem is equally important. >> We had Kit Colbert on, the CTO, great work he's doing with the white paper and cross-cloud. Obviously vSphere had a big release, a lot of stuff. Dave Vellante had mentioned in the analyst session that you had a lot of good stuff you were talking about that's coming around the corner, that's shipping, coming out of the oven. And a big theme this year is multi-cloud and cloud native, and the relationship between them. Which one's ahead, the lead dog? No one? You kinda get a feel that multi-cloud is out front right now, but cloud native's got the most history. What's coming out of the oven right now, in terms of hitting the market, that's not yet in the numbers, in terms of sales? Like, there's some key cloud native stuff coming out. Where's the action? Can you share what you shared at the analyst meeting? >> Yeah. So at the analyst meeting, what I was going through was a number of our new innovation projects, and these are things that are typically close to being a product or service at VMware, you know, somewhere in the year-out timeframe; some of these are just a few months out. So let me just go through some of them. I'll start with Project Keek. So Keek is super exciting, because when you think about edge, what we're hearing from customers is the notion of a single platform, a single piece of hardware, that can run their cloud services, their containers, their VMs, and their network and security functions. Doing all of this on one platform gives them the flexibility that, as changes happen, it's a software update; they don't have to buy another piece of hardware. But if we step back, what's the management experience you want, right? Simple, GitOps-oriented, simple lifecycle and configuration management, very low touch. I don't need technical skills to deploy these types of devices.
So this is where Keek comes in. What Keek is doing is exposing a Kubernetes API above the ESXi hypervisor and taking a complete GitOps style of management. So imagine now, when you need to do an update for infrastructure, you're logging into GitHub, you're editing a YAML file, and you're pushing the update. We're doing the same thing for the applications that reside there; I can do all of this through GitHub. So this is, I would say, even internally disruptive to VMware, but super exciting for our customers and partners that we've shared this with. >> What else is happening? What else on the cloud native side? Tanzu, Monterey, those areas. >> Oh, there's so much. So if we look at Project Monterey, I had a presentation with Nvidia yesterday where we were really talking through this. And what I'm seeing now is there's a couple of really interesting inflection points with DPUs. The first thing is the performance that you're getting, and the number of cores that you can save on an x86 host is actually providing a very strong business case now to bring DPUs into the servers, into the data center. So that's one; now you have a positive ROI. Number two, you start to decouple core services from the x86 host itself. So think about a distributed firewall that I can run on a PCI adapter. Now that's decoupled physically from the server, and it really allows me to scale out east-west security in a way that I could not do before. So again, I think that's really exciting, and that's where we're seeing a lot of buzz from customers.
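The GitOps flow described for Keek, commit a desired-state change in Git and let an agent converge the infrastructure toward it, can be sketched generically. This Python sketch is purely illustrative of the pattern; the state keys and values are invented for the example and are not VMware's actual API:

```python
# Desired state, as it might be declared in a YAML file committed to Git.
# Keys here are hypothetical, chosen only to illustrate the pattern.
desired = {"hypervisor_version": "8.0u1", "app_replicas": 3}

def reconcile(desired, observed):
    """Return the actions a GitOps agent would take to converge the
    observed state toward the declared desired state."""
    return [
        f"update {key}: {observed.get(key)} -> {want}"
        for key, want in desired.items()
        if observed.get(key) != want
    ]

observed = {"hypervisor_version": "8.0", "app_replicas": 3}
print(reconcile(desired, observed))
# -> ['update hypervisor_version: 8.0 -> 8.0u1']
```

The point of the pattern is that the Git commit, not an imperative console session, is the management action; the same loop covers both infrastructure and the applications on top of it.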
So what I can now do is take a, a firewall and run it isolated from the X 86 host that it's trying to protect. So it's right next to the host. I can get line rate speeds in terms of analytics and processing of my network and security traffic. So that's also huge. So I'm running line rate on the host and I'm able to run one of these firewall instances on every host in my data center, you cannot do that. You can never afford it with physical appliances. So to me, this is an inflection point because this is the start of network and security functions moving off of hardware appliances and onto DPU. And if you're the ecosystem vendors, this is how they're going to be able to scale some of their services and offerings into the public >>Cloud. So a lot of good stuff happening within the VMware kind of the hardware, low level atoms and the bits as well as the software. The other thing I wanna get your thoughts on relative to the next question is that takes to the next level is the super cloud world we're living in is about cloud native developers, which is DevOps dev security ops and data ops are now big parts of the, the challenges that the people are reigning in the chaos that that's being reigned in. How does VMware look at the relationship to the cloud providers? Cause we heard cloud universal. We had the cloud. If you believe in multi-cloud, which you guys are saying, people are agreeing with, then you gotta have good tight couple coupled relationships with the cloud services, >>A hundred percent. >>We can be decoupled, but highly cohesive, but you gotta connect in via APIs. What's the vision for the VMware customers who want to connect say AWS, for instance, is that seamless? What makes that happen? What's that roadmap look like for taking that VMware on premises hybrid and making it like turbo charging it to be like public cloud hybrid together? >>Yeah, I think there's some lessons that can be learned here. 
You know, an analogy I've been using lately is look at the early days of virtualization, when VMware had vCenter, right? What was happening was you saw the enterprise management vendors try to do this overlay above virtualization management and say, we can manage all hypervisors. And at the end of the day, no one bought these multi-hypervisor managers, because they could do 20% of the functionality of a tool from VMware or Microsoft. And that's the lesson that we have to take to multi-cloud. We don't have to overlay every functionality. There are really good capabilities that the cloud providers are offering through their own tooling and APIs, right? But if you step back, you say, well, what do I want to centralize? I want to have a centralized, secure software supply chain, and I can get that through VMware Tanzu and where we're going with Kubernetes. When you're going with native cloud services, you might say, you know what, I want to have a central view of visibility for compliance, and that's what we're doing with Secure State, or a central view of cost management, and we're doing that with CloudHealth. So you can have some brokering and governance, but then you also have to look from a surgical perspective at what are the things that I really need to centralize versus what I don't need to centralize. >> One of the themes that we heard in the keynote on Tuesday was the different phases, and that a lot of customers are still in the cloud chaos phase. We talked a lot about that in the last couple days with VMware and its partner ecosystem. But the goal is getting to cloud smart. How is the R and D organization helping customers navigate that journey from the chaos that they're in, maybe a multi-cloud environment they've inherited, to getting to cloud smart? And what does cloud smart mean from your perspective? >> Cloud smart, from my perspective, means pragmatism.
It means really thinking about what I should do here first, right? I don't want to just go somewhere because I can. I want to be really mindful of the steps I'm going to take. So one example of this is I met with a customer this morning, and we were talking about using our vRealize Network Insight tool, because what that allows them to do is get a map of all of their application dependencies in their data center. And they can learn, well, I can move this to the cloud, or maybe I can't move this because it has all these other dependencies and it would be really difficult. So that's one example. It also means really thinking through issues around data sovereignty. You know, what do I want to hold onto? A customer I just met with yesterday was talking about how valuable their data is. There are services that they want to use via SaaS in the cloud, but there are also services around their core research, and they want to make sure that they can maintain those in their data centers and maintain full control, because they see that researchers will leave, and all of a sudden that intellectual property has gone with the person, and they need to have, you know, better accountability there. >> Yeah. One of the things we discovered at our supercloud event was that, you know, we didn't really put too much structure on it other than our vision. It's not just SaaS on cloud, and it's not just multi-cloud. It's a new kind of application end state, or reality: if you believe in digital transformation, then technology is everywhere. In the old days it powered the back office, and then terminals and PCs and whatnot; it obviously wasn't powering the boardroom or the rest of the business. But if digital transformation happens like that, the company is the app and the app is the company. So you're all digital.
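The dependency-mapping exercise Chris describes, deciding what can move to the cloud by discovering what an application actually talks to, amounts to building a graph from observed flows and walking it. The sketch below is illustrative only; the flow records and app names are made up, and a real tool like vRealize Network Insight works from live telemetry.

```python
from collections import defaultdict

# Sketch of dependency discovery from observed network flows: build a
# graph of who talks to whom, then ask what must move together.
# The flow data here is invented for illustration.

flows = [
    ("billing", "postgres"),
    ("billing", "auth"),
    ("auth", "ldap"),
    ("reporting", "postgres"),
]

graph = defaultdict(set)
for src, dst in flows:
    graph[src].add(dst)

def dependencies(app):
    """All transitive dependencies of an app, i.e. what must move with it."""
    seen, stack = set(), [app]
    while stack:
        for dep in graph[stack.pop()]:
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(dependencies("billing")))  # ['auth', 'ldap', 'postgres']
```

An app with a small closure is an easy migration candidate; a large closure is exactly the "maybe I can't move this" case described above.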
So that means the operating expense has to drive the income statement, and the CapEx being handled by the cloud provides a lot of goodness. So I think everyone's going to realize that AWS and the hyperscalers are providing great CapEx gifts. They do all the work, and you only pay when you've made your success. So that's a great business model. >> Absolutely. >> And then combine that with open source, which is now growing so fast, going next level. The software industry is open source; that's not even a debate anymore in some circles, maybe outside of telco. The cloud's got the CapEx. The new operating model is this cloud layer that's finally going to transform companies. Okay, that's supercloud. If that's the case, does it really matter who provides the electricity or the power? It's the coders that are in charge. It's the developers that have to make the calls, because if the application is the core, the developers are not only the front lines, they are the company. This is really kind of where the sea change is. So if we believe that, and I'm sure you agree with that generally? >> Yeah, of >> Course. Okay. So then what's the VMware customer roadmap here? To me, that's the big story here at the show: we're at this point in time where the VMware customers have to go there. >> A hundred percent. >> What's that path? What is the path for the VMware customer to go from here to there? What's the order of operations, or is there a roadmap? Can you share your thoughts on >> That? Yeah, I think part of it is that with these disruptive technologies, you have to start small, whether it's in your data center or in the cloud. You have to build the institutional knowledge of your team members in the organization. It's much easier than trying to attract outside talent, at least for many of our customers. So I think that's important.
The other part of this is with the developer in control. In my organization, I want my innovators to innovate without any other noise around them. I don't want them to have to worry about it. And it's the same thing with our customers. So if your developers are building the technologies that are really differentiating your company, then things like security and cryptography shouldn't have to be things they worry about. So we've been doing a lot of work there. One of the projects we announced this week was around being able to decouple cryptography from the applications themselves, and we can expose that through a proxy, through service mesh. And that's really exciting, because now IT ops can make these changes, our SecOps teams can make these changes, without having to impact the application. So that's really key: focusing the developers on innovation, and then really being mindful about how you can build the right automation around everything else. And certainly open source is key to all >> That. So then, if that's happening, which I'm not going to debate, in essence what's really going on here is that companies are decomposing their entire businesses down to levels that are manageable, completely different than the way they did it 20, 30 years ago. >> Absolutely. You could take a modular approach to how you're solving business problems. And we do the same thing with technology, where there might be ML algorithms that we've developed that we're exposing as a service, but then all of the interconnects around that service are open source and very flexible, so that the businesses and the customers and the VMware partners can decide the right way to build a puzzle for a given problem. >> We were talking on day one; I was riffing with the executives, Raghu and Vittorio.
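The cryptography-decoupling idea mentioned a moment ago, moving crypto out of the application and into a proxy whose configuration ops teams own, can be illustrated with a small sketch. This is not the announced VMware project or a real service-mesh API; it just shows the separation of concerns, using a keyed HMAC as a stand-in for whatever crypto the proxy applies.

```python
import hmac

# Illustration of decoupling crypto from the app: the application hands
# plaintext to a local proxy, and the proxy's configuration decides the
# crypto (here just an HMAC algorithm). Ops can rotate keys or swap
# algorithms by changing proxy config; the app code never mentions crypto.

class CryptoProxy:
    def __init__(self, key: bytes, algorithm: str):
        self.key, self.algorithm = key, algorithm  # ops-controlled config

    def send(self, payload: bytes):
        tag = hmac.new(self.key, payload, self.algorithm).hexdigest()
        return {"payload": payload, "auth": tag, "alg": self.algorithm}

def app_logic(proxy):
    # The application knows nothing about keys or algorithms.
    return proxy.send(b"order=42;total=99.50")

msg_old = app_logic(CryptoProxy(b"k1", "sha256"))
msg_new = app_logic(CryptoProxy(b"k2", "sha512"))  # ops swapped the config
print(msg_old["alg"], msg_new["alg"])  # sha256 sha512
```

The point of the design is visible in the last two lines: the algorithm change happened entirely in the proxy's configuration, with `app_logic` untouched.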
And the concept around cross-cloud was, if you get to this Nirvana state that people want to get to, this composability mode, you're not coding, you're composing, 'cause the coding's kind of happening in open source. It's not the old classic write-some-code, write-that-app; it's more compose and orchestrate. What's your thoughts on >> That? Yeah, I agree. And I would add one more part to it too, which is scope. You know, I think sometimes we see projects fail because the initial scope is just too big. What is the problem that you need to solve? Scope it properly and then continuously calibrate. So even our customers have to listen to their customers, and we have to be thinking about our customers' customers, right? Because that's really how we innovate, because then we can really be mindful of a holistic solution for them. >> You know, Lisa, when we had our supercloud event, one of the panels was called the innovator's dilemma, with a question mark. And of course everyone kind of quotes that book, The Innovator's Dilemma, but one of the panelists, Chris Hoff, Beaker on Twitter, said, let's change the name from the innovator's dilemma to the integrator's dilemma. And we all kind of chuckled, we all kind of paused and said, hey, that's actually a good point. If you're now in a cloud and you're seeing some of the ecosystem vendors on the floor talking in this game too, they're all kind of fitting in, snapping in, almost like modular, like you said. So this is a Lego game now. It feels like, you know, let's compose, let's orchestrate, let's integrate. Now integration's API-driven. Now you're seeing a lot more about API security in the news, and I've probably interviewed six companies in the past six months that are doing API security. Who would've thought? APIs, that's the link, frankly, with the web. Now that's a target area for hackers. >> Oh.
And that's such an innovation area for VMware, John. Okay. >> There it is. So, I mean, this means the connective tissue is being attacked, yet we need it to grow. No one's debating that it's wrong, but it's under siege. >> Yes. Yes. So something else we introduced this week was a project we called Project Trinidad. And the way you can think about it is, a lot of the anomaly detection software today is looking at point-based anomalies, like this API header looks funny. Where we've gone further is we can look at full sequence-based anomalies, so we can learn the sequences of transactions that an application takes and really understand what is expected behavior within those API calls, within the headers, within the payloads. And we can model legitimate application behavior based on what those expectations are. So a common sequence might be doing an e-commerce checkout, right? There are lots of operations that happen: logging into the site, searching, finding a product, going through the cart, all of those things. So if something's out of sequence, like all of a sudden somebody's just trying to do a checkout but they haven't actually added to the cart, that just seems odd, right? And that's a simplistic example, but we're able now to use our algorithms to model legitimate application behavior through the entire sequence of how applications behave, and then we can start to trap on anomalies. That's very differentiating IP, and we think it's going to be really important for the industry. Yeah. >> Because a lot of the hacks, sometimes on the API side, as an example, are not necessarily on the API itself; it's the business logic in them. That's what you're getting at here. >> Yes. >> The APIs are hardened: oh, our APIs are secure, right? Well, yeah, but you're not actually securing the business logic internally. That's what you're getting at, if I read >> That right. Exactly. Exactly. Yeah.
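The sequence-based idea behind the e-commerce example above can be sketched very simply: learn which API-call transitions occur in legitimate sessions, then flag a session that jumps out of sequence, such as a checkout with no add-to-cart. This toy model is an illustration of the concept only, not VMware's actual Trinidad algorithms.

```python
from collections import defaultdict

# Toy sequence-based anomaly detection: learn legitimate API-call
# transitions, then flag transitions never seen in normal traffic.

legit_sessions = [
    ["login", "search", "view_item", "add_to_cart", "checkout"],
    ["login", "search", "add_to_cart", "checkout"],
]

allowed = defaultdict(set)
for session in legit_sessions:
    for a, b in zip(session, session[1:]):
        allowed[a].add(b)

def anomalies(session):
    """Transitions in this session that were never seen in legitimate traffic."""
    return [(a, b) for a, b in zip(session, session[1:]) if b not in allowed[a]]

print(anomalies(["login", "search", "add_to_cart", "checkout"]))  # []
print(anomalies(["login", "checkout"]))  # [('login', 'checkout')]
```

A point-based check on headers would pass both sessions; only the sequence view catches the second one, which is the business-logic angle discussed next.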
And that's the thing, right? It's great that you can look at a header, but what's the payload? What's the actual data flow that's associated with the call? That's what we want to really hone in on. And that's a far different level of sophistication in being able to understand east-west vulnerabilities, you know, Log4j exploits and these kinds of things. So we have some really interesting technology >> There. Security conversations now are not about security, they're about defensibility, because security is a state in time. You're secure here, and then you're not secure; someone might be in the network or in the app. But can you defend yourself? >> That's it. You know, the malware software that we're building to prevent and respond has to be more dynamic than the threats we face. And this is why machine learning is so essential in these types of applications. >> Let me ask you a question, just zooming out and riffing here, since day three's our conversational day, where we debate and just riff, more like a podcast style. If you had to do a supercloud, build a next-gen multi-cloud with an abstraction layer that's, you know, all singing and dancing and open, everyone's happy, the hardware below it's working, the ISVs too, and the apps are killer: what's in it? What does it look like to you if you had to architect the ultimate supercloud enabler, something that would disrupt the next 10 years? What would it look like, assuming you're trying to do it where everybody wins? Go, you have 10 seconds. No, >> Yeah, yeah. So, first of all, there has to be open source at all of the intersections. I think that's really important. And this goes from networking constructs to our database-as-a-service layers, everything in between. The participants should be able to win on merit there.
The other part of supercloud, though, that hasn't happened, and that I think is probably the most important area of innovation, is going to be decoupled control planes. We have a number of organizations building sovereign cloud initiatives. They want to have flexibility in where their services physically run, and you're not going to have that with a limited number of control planes that live in very specific public cloud data centers. So that's an area... >> Give an example of a narrowly defined control plane. >> Yeah, sure. So my database-as-a-service layer: the actual portal that the customer is going into to provision databases, managed replication, et cetera. I should be able to run that in a colo. I should be able to run that somewhere in region, where it's guaranteed that my data is going to stay physically in region. You know, we still have some of these challenges in networking in terms of being able to constrain traffic flows and being able to predict and audit them within a particular region as well. >> It's interesting. You bring up region again; more complexity. You know, you've got catalogs here, catalogs there. I mean, this is where the chaos really comes in. It's advancing the state of functionality, but making it hella complex. I mean, come on, don't you think it's pretty amazingly hard to rein that in? Or are you guys maybe making it easier? My mind just went, oh my God, I've got to provision to that region, but then it's got to be the same over there. And >> When you go back to modular architecture constructs, it gets far easier. This has been really key for how VMware is even building our own clouds internally: we have a shared services platform for the different apps and services that we're building, so that you do have that modularized approach.
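The database-as-a-service example above, a control plane you can run in a colo or in region with a guarantee that data stays there, reduces to a placement check enforced by the control plane itself. The sketch below is hypothetical; the class and region names are invented to show the idea, not any actual sovereign-cloud API.

```python
# Toy model of a region-pinned database-as-a-service control plane: the
# portal itself runs in a given region and refuses placements that would
# move data out of it. Names are invented for illustration.

class RegionalControlPlane:
    def __init__(self, region):
        self.region = region
        self.databases = []

    def provision(self, name, placement_region):
        if placement_region != self.region:
            raise ValueError(f"{name}: data must stay in {self.region}")
        self.databases.append(name)
        return f"{name} provisioned in {self.region}"

cp = RegionalControlPlane("eu-central")
print(cp.provision("orders-db", "eu-central"))
```

Running one such instance per region, instead of one global control plane in a single public cloud data center, is the decoupling being argued for.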
Like I said, the examples of innovation projects I've shared have been really driven by the fact that, you know what, I don't know how customers are going to consume it, and I don't have to know. If you have the right modular architecture and the right APIs around it, you don't have to limit a particular project or technology's future at the time you build >> It. Okay. So your supercloud would have multiple control planes that you can manage within one place. I get that. What about the data control plane? That seems to be something that used to be the land grab in conversations from vendors, but it seems to be much more of a customer-side concern, 'cause if I'm a customer, I want my control plane and data plane to be, you know, mine. I don't want anyone else in there, 'cause data's got to move around and it's got to be secure. >> Oh, exactly. >> And that's going to be complicated. How do you see the data planes emerging? >> Yeah. We see an opportunity really around having a centralized view that can give me consistent indexing and consistent awareness of data, no matter where it resides, and then being able to have that level of integration between my data services and my applications. Because you're right: right now we have data in different places, but we could have a future where data's more perpetually in motion. We're already looking at time-sensitive fabrics where we're expecting microservices to sometimes run in different cell towers depending on the SLA that they need to achieve. So then you have data that's going to follow, right? It may not always be in the same cloud data center. So this is enormously complicated, not just in terms of meeting application SLAs, but auditing and security make it even harder.
So having these types of data layers that can give me a consistent purview of data, regardless of where it is, and allow me to manage and lifecycle data globally, that's going to be super important, I believe, going forward. >> Yeah. Awesome. Well, my one last question. Lisa, I'm going to get a question in here; I'm getting all the questions in, sorry, Lisa. >> That's okay. >> What's your favorite, most exciting thing that you think is going on right now that people should pay attention to, of all the things you're looking at? The most important thing that's happening, and maybe something that's super important that people aren't talking about, or it could be the same thing. So the most important thing you think is happening in the industry for cloud today, and maybe something you think people should look at and pay more attention to. >> Okay, yeah, those are good questions. And that's hard to answer, because there's probably so much happening. I've been on here before and I've talked about edge. I still think that's really important. I think the value of edge velocity being defined by software updates is quite powerful, and that's what we're building towards, and I would say the industry is as well. If you look at AWS and Azure, when they're packaging a service to go out to the edge, it's packaged as a container. So it's already quite flexible, and being able to think about how I can have a single platform that gives me all of this flexibility, I think, is really essential. We're building these capabilities into cars. We have a version of our VeloCloud edge device that's able to run on ruggedized hardware in a police car today; we're piloting that with a customer. So there is a shift happening where you can have a core platform that now allows you to layer on applications that you're not thinking about today. So I think that's probably obvious.
A lot of people are like, yeah, okay, yes, let's talk about edge, big deal. >> Oh, it's big. Yes. It's >> Exploding, but >> It's complicated too. It's not easy. It's not obvious. Right. And it's emerging. >> There are new things coming every day. Yeah. And related to that, though, there is this kind of tension that exists between machine learning and privacy, and that's really important. So an area of investment that I don't think enough people are paying attention to today is federated machine learning. There are really good projects in open source that are having tangible impact in a lot of industries, and at VMware we're investing in a couple of those projects, namely FATE in the Linux Foundation and OpenFL. And in these use cases, like the security product I mentioned to you that is analyzing API call sequences, we architected that originally so that it can run in public cloud, but we're also leveraging federated machine learning now so that we can ensure that those API calls and the metadata associated with them stay on premises for the customers, to ensure privacy. So I think those intersections are really important. Federated learning, I think, is an area not getting enough attention. All right. All >> Right, Chris, thanks so much for coming on. Unfortunately, we are out of time. I know you guys could keep going. Good stuff. But thank you for sharing what's going on in R and D, the customer impact, and the outcomes that you're enabling customers to achieve. We appreciate your >> Insights. We're just getting started. >> In early innings, right? Yeah. Awesome. Good stuff. For our guest and John Furrier, I'm Lisa Martin. You're watching theCUBE live from VMware Explore 2022. Our next guest joins us momentarily. >> Okay.
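The federated-learning pattern raised at the end of the interview, raw records staying on premises while only model parameters travel, reduces to the federated-averaging idea in miniature. The sketch below is an illustration of that idea with invented sites and numbers; it is not the FATE or OpenFL API.

```python
# Minimal federated-averaging sketch: each site computes a model update
# on its own data, and only the parameters (never the raw records) are
# shared and combined, weighted by sample count.

sites = {
    "hospital_a": [2.0, 4.0, 6.0],  # raw data never leaves the site
    "hospital_b": [10.0, 10.0],
}

def local_update(data):
    # Stand-in for local training: fit the 1-parameter model w = mean(data).
    return sum(data) / len(data), len(data)

updates = [local_update(d) for d in sites.values()]  # only (w, n) is shared

# The aggregator combines parameters, weighting by each site's sample count.
total = sum(n for _, n in updates)
global_w = sum(w * n for w, n in updates) / total
print(global_w)  # 6.4, the mean of all records, computed without pooling them
```

The privacy property is in the flow of information: the aggregator sees `(4.0, 3)` and `(10.0, 2)`, never the individual records, yet the combined parameter equals what centralized training on the pooled data would produce.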
Rachel Wolfson, CoinTelegraph | Monaco Crypto Summit 2022
(upbeat music) >> Okay, welcome back everyone to theCube's live coverage in Monaco. I'm John Furrier, host of theCube. Monaco Crypto Summit is the event, and there's a big conversation later at the yacht club with Prince Albert, and everyone else will be there, and it'll be quite the scene. And Rachel Wolfson is here. She's with Cointelegraph. They're the media partner of the event, the official media partner of the Monaco Crypto Summit. She's also MCing the event on stage, presented by DigitalBits. Rachel, thanks for coming on. >> Thanks for having me, John. >> So I know you're busy; thanks for taking the time, 'cause you've got to go jump back in and moderate and keep things on track. This isn't just any event. DigitalBits has exploded on the scene. I just saw a thing on YouTube news around this soccer team in Rome that has the DigitalBits logo on their jerseys. They're a big to-do, 'cause everyone's popular, and they've got a couple teams. So real-world assets, kind of, coming together. What's going on in the event that you're MCing? What's the focus? What's the agenda? What are some of the conversations like? >> Yeah, definitely. Well, it's a great event. It's my first time here in Monaco, and I'm loving it. And I think that Monaco is really becoming the next crypto hotspot, definitely in terms of Metaverse and Web3 innovation. I think that we're going to start seeing a lot of that here. That's what we're seeing today at the Summit. So a lot of the presentations that we're seeing are really focused on Web3 and NFT platforms, so, for instance, obviously, what DigitalBits is doing. We watched a video before the break on their ecosystem and the Metaverse that people can join and be a part of, in terms of real estate, but we're seeing a lot of innovation here today with that. I moderated a great panel with Brittany Kaiser, Lauren Bissell, and Taross, I'm blanking on his last name, but it was about blockchain and how governments are implementing blockchain.
So that was also really interesting, hearing about what the Ukrainian government is doing with blockchain. So there's kind of a mix, but I'd say that the overall theme is Web3 and NFTs. >> Yeah, Brittany was mentioning some of that, how they're going to preserve buildings and artifacts, so that in case they're looted or destroyed, they can preserve them. >> Right. I think it's called the Heritage Fund. And I just think it's such an interesting use case in terms of how governments are using blockchain, because the best use for blockchain, in my opinion, is recording data and having that data be permanent. And so when we can have artifacts in Ukraine recorded on the blockchain, you know, by being scanned, it's really revolutionary. And I think that a lot of governments around the world are going to see that use case and say, "Oh wow, blockchain is a great technology for things like that." >> So DigitalBits had a press conference this morning, and they talked about their exchange and some other things. Did you attend that press conference, or did you get briefed on it? >> I did not attend the press conference. I was prepping for my MC role. >> So they've got this exchange thing, and then there's real interest from Prince Albert's foundations to bring this into Monaco. So Monaco's got this vibe, big time. >> Rachel: Right. There's a vibe. (John chuckles) >> What does it all mean, when you're putting it in your reporting? What do you see happening? >> So, I mean, I honestly haven't covered Monaco ever in my reporting, and John, you know I've been reporting since 2017. But the vibe that I'm getting just from this summit today is that Web3 and NFTs are going to be huge here. There's a panel coming up about crypto regulations, and so we're going to talk a little bit about laws being passed here in Monaco in terms of the Metaverse and digital identity.
So I think there are a few laws around that here that the government is looking at, to kind of add clarity for those topics. >> I had a couple guests on earlier. We were talking about the old days, a couple years ago. You mentioned 2017; so much has changed. >> Yes. >> You know, we had an up and down. 2018 was a good year, and then it kind of dived back and changed a little bit. Then NFTs brought it back up again. It's been a great hype cycle, but also movement. What's your take on the real progress that's been made? If you zoom out and look at the landscape, what's happened? >> Right. I mean, well, a lot has happened. When I first entered the space, I initially came in interested in enterprise blockchain and private networks being utilized by enterprises to record data. And then we saw public blockchains come in, like Ethereum, and enterprises using them. And then we saw a mix. And now I feel like we're mostly seeing public blockchains, (John chuckles) though there are still private blockchains. But today, I mean, we've gone from that in 2017 to right now, where we're seeing a lot of these centralized exchanges kind of collapsing, what we've seen with Celsius, for instance, and people moving their crypto to hardware wallets. I think that the space is really undergoing a lot of transformation. It's really revolutionary, actually, to see the hardware wallet market growing rapidly, and I think that's going to continue to grow. I think centralized exchanges are still going to exist and custody crypto for enterprises and institutions, and, you know, for individuals as well. But we are seeing a shift from centralized exchanges to hardware wallets. NFTs, although the space is not as big as it was a year ago, are still quite relevant.
But I think with the way the market is looking today, we're only seeing the top projects lead the way now, versus all of the noise that we were seeing previously. So yeah, I think it's... >> So corrections, basically? >> Right. Exactly. Corrections. And I think it's necessary, right? It's very necessary. >> Yeah, it's interesting. You know, you mentioned the big players; you've got Bitcoin and Ethereum driving a lot. I remember interviewing the CryptoKitties folks when they first came out; it was kind of first-gen Ethereum, and then it just exploded from there. And I remember saying to myself, what if the NFTs and the decentralized applications could have that scale? But then it felt like, okay, there was a lot of jockeying going on under the covers, under the hood, so to speak. And now you've got massive presence from all the VCs; Jason Ho has like another crypto fund. I mean, >> Right. >> you can't go a day without another big crypto fund from, you know, traditional venture capitalists. Meanwhile, you've got investors who have made billions on crypto, and they're investing. So you've kind of got a diversity of investor base going on, and different instruments. So the investor community's changing and evolving too. >> Right. >> How do you see that evolving? >> Well, it's a really good point you mentioned. So Cointelegraph Research recently released a report showing that Web3 is the most sought-after investment sector this year. So it was DeFi before, and Web3 is now leading the way over DeFi. And so we're seeing a lot of these venture capitalist funds, as you mentioned, create funds allocated just to Web3 growth. And that's exactly the vibe I'm getting from the Monaco Crypto Summit here today: this is all about Web3. It's all about NFTs, it's all about the Metaverse. You know, this is really revolutionary. So I think we're definitely going to see that trend kind of, you know, conquer all of these other sectors that we're seeing in blockchain right now.
>> Has Web3 become the coin term for Metaverse and NFTs? Or is that being globalized as all shifted, decentralized? What's the read on it? It seems to be like, kind of all inclusive but it tends to be more like NFT's the new thing and the young Gen Zs >> Yeah want something different than the Millennials and the Xs and the Boomers, who screwed everything up for everybody. >> Yeah. (John chuckles) No, I mean, it's a great question. So when I think of Web3, I categorize NFTs and the Metaverse in there. Obviously it's just, you know the new form of the internet. It's the way the internet is- >> Never fight fashion, as I always say, right? >> Right. Yeah. Right. (John chuckles) It's just decentralization. The fact that we can live in these virtual worlds and own our own assets through NFT, it's all decentralized. And in my opinion, that all falls under the category of Web3. >> Well, you're doing a great job MCing. Great to have you on theCube. >> Rachel: Thanks. I'd like to ask you a personal question if you don't mind. COVID's impacted us all with no events. When did you get back onto the events circuit? What's on your calendar? What have you been up to? >> Yeah, so gosh, with COVID, I think when COVID, you know, when it was actually really happening, (John chuckles) and it still is happening. But when it was, you know, >> John: Like, when it was >> impacting- shut down mode. >> Right. When we were shut down, there were virtual events. And then, I think it was late last year or early this year when the events started happening again. So most recently I was at NFT NYC. Before that, I was at Consensus, which was huge. >> Was that the one in Austin or Miami? >> In Austin. >> That's right, Austin. >> Right. Were you there? >> No, I missed it. >> Okay. It was a very high level, great event. >> Huge numbers, I heard. >> Yes. Massive turnout. (John chuckles) Tons of speakers. It was really informative. >> It feels like a festival. actually. >> It was. 
It was just like South by Southwest, except for crypto and blockchain. (John chuckles) And then coming up, gosh, there are a lot of events. I'll be at an event in Miami, it's an NFT event that's in a few months. I know that there's a summit happening, I think in Turkey, that I may be at as well. >> You're on the road. You're traveling. You're doing a lot of hopping around. >> Yes I am. And there's a lot of events happening in Europe. I'm US-based, but I'm hoping to spend more time in Europe just so I can go to those events. But there's a lot happening. >> Yeah. Cool. What's the most important story people should be paying attention to, in your mind? >> Wow. That's... (Rachel chuckles) That's a big question. It's a good question. I think, you know, the transition that we're seeing now. So in terms of prices, I think people need to focus less on the price of Bitcoin and Ethereum and more on the innovation that's happening. So for instance, Web3 innovation, what we're seeing here today. You know, innovation isn't about prices, it's more about, like, actually now is the time to build. >> Yeah, because the prices are a bit down. >> Yeah. I mean, as you know, Lewis Hamilton, the F1 driver, had a quote, you know, "It takes a team. No matter who's in the driver's seat, it's a team." So, community. Wayne Gretzky, skate to where the puck is going to be, I think, is much more what I'm hearing now. Seeing what you're saying is, don't try to count the price trade of Bitcoin. This is an evolution. >> Right. >> And the dots are connecting. >> Exactly. And like I said, now is the time to build. What we're seeing with the project Britney mentioned, putting the heritage, you know, on the blockchain from Ukraine, like, that's a great use case for what we're seeing now. I want to see more of those real-world use cases. >> Right. Well, Rachel, thanks for coming on theCube. I really appreciate it. Great to see you. >> Thanks, John. >> And thanks for coming out of your schedule. 
I know you're busy. >> Thanks. Now you get some lunchtime now and get some break. >> Yeah. Get back on stage. Thanks for coming on. >> Rachel: Thank you. >> All right. We're here at the Monaco Crypto Summit. Rachel's MCing the event as part of the official media partner, Cointelegraph. Rachel Wolfson here on theCube. I'm John Furrier. More coverage coming after this short break. >> Thank you. (upbeat music)
Breaking Analysis: How Snowflake Plans to Make Data Cloud a De Facto Standard
>> From theCUBE studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. 
>> When Frank Slootman took ServiceNow public, many people undervalued the company, positioning it as just a better help desk tool. You know, it turns out that the firm actually had a massive TAM expansion opportunity in ITSM, customer service, HR, logistics, security, marketing, and service management generally. Now, the stock price followed, over the years, the stellar execution under Slootman and CFO Mike Scarpelli's leadership. When they took the reins at Snowflake, expectations were already set that they'd repeat the feat, but this time, if anything, the company was overvalued out of the gate. The thing is, people didn't really understand the market opportunity this time around, other than that it was a bet on Slootman's track record of execution and on data, pretty good bets. But folks really didn't appreciate that Snowflake wasn't just a better data warehouse, that it was building what they call a data cloud, and what we've termed a data super cloud. 
>> Hello, and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis, we'll do four things. First, we're gonna review the recent narrative and concerns about Snowflake and its value. Second, we're gonna share survey data from ETR that will confirm precisely what the company's CFO has been telling anyone who will listen. Third, we're gonna share our view of what Snowflake is building, i.e., trying to become the de facto standard data platform. And fourth, convey our expectations for the upcoming Snowflake Summit next week at Caesars Palace in Las Vegas. Snowflake's most recent quarterly results have been well covered and well documented. 
It basically hit its targets, which for Snowflake investors was bad news. Wall Street piled on, expressing concerns about Snowflake's consumption pricing model, slowing growth rates, lack of profitability, and valuation given the current macro market conditions. The stock dropped below its IPO offering price, which you couldn't touch on day one, by the way, as the stock opened well above that and certainly closed well above that price of $120. And folks expressed concerns about some pretty massive insider selling throughout 2021 and early 2022. All this caused the stock price to drop quite substantially, and today it's down around 63% or more year to date. But the only real substantive change in the company's business is that some of its largest consumer-facing companies, while still growing, dialed back their consumption this past quarter. The tone of the earnings call, I wouldn't say it was contentious, but Scarpelli, I think, was getting somewhat annoyed with the implication from some analyst questions that something is fundamentally wrong with Snowflake's business. So let's unpack this a bit. First, I wanna talk about the consumption pricing. On the earnings call, one of the analysts asked if Snowflake would consider more of a subscription-based model so that they could better weather such fluctuations in demand. Before the analyst could even finish the question, CFO Scarpelli emphatically interrupted and said no. <laugh> The analyst might as well have asked, hey Mike, have you ever considered changing your pricing model and screwing your customers the same way most legacy SaaS companies lock their customers in, so you could squeeze more revenue out of them and make my forecasting life a little bit easier? 
<laugh> Consumption pricing is one of the things that makes a company like Snowflake so attractive, because customers, especially large customers facing fluctuating demand, can dial down usage for certain workloads that are maybe not yet revenue-producing or critical. Now let's jump to insider trading. There was a lot of insider selling going on last year and into 2022. Now, I mean a lot: Slootman, Scarpelli, Christian Kleinerman, Mike Speiser, several board members. They sold stock worth, you know, many hundreds of millions of dollars or more, at prices in the two hundreds and three hundreds and even four hundreds. You remember, the company at one point was valued at a hundred billion dollars, surpassing the value of ServiceNow, which is just stupid at this point in the company's tenure, and the insiders' cost basis was very often in the single digits. 
So on the one hand, I can't blame them. You know what a gift the market gave them last year. Now, also, famed investor Peter Lynch famously said, insiders sell for many reasons, but they only buy for one. But I have to say, there wasn't a lot of insider buying of the stock when it was in the three hundreds and above. And so, yeah, this pattern is something to watch: are insiders buying now? I'm not sure. We'll keep watching. Snowflake is pretty generous with stock-based compensation, and insiders still own plenty of stock, so, you know, maybe not, but we'll see in future disclosures. But the bottom line is, Snowflake's business hasn't dramatically changed, with the exception of these large consumer-facing companies. Now, another analyst pointed out that companies like Snap, Peloton, Netflix, and Facebook have been cutting back. And Scarpelli said, in what was a bit of a surprise to me, well, I'm not gonna name the customers, but it's not the ones you mentioned. 
So I thought, you know, if I were the analyst, I would've followed up with: how about Walmart, Target, Visa, Amex, Expedia, Priceline, or Uber? Any of those, Mike? I doubt he would've answered me anything. Anyway, the one thing that Scarpelli did do is update Snowflake's fiscal year 2029 outlook, to emphasize the long-term opportunity that the company sees. This chart shows a financial snapshot of Snowflake's current business, using a combination of quarterly and full-year numbers, and a model of what the business will look like according to Scarpelli, and Dave Vellante with a little bit of judgment, in 2029. So this is essentially based on the company's framework. Snowflake this year will surpass $2 billion in revenues and is targeting $10 billion by 2029. 
Its current growth rate is 84%, and its target is 30% in the out years, which is pretty impressive. Gross margins are gonna tick up a bit, but remember, Snowflake's cost of goods sold is dominated by its cloud costs, so it's got a governor there: it has to pay AWS, Azure, and Google for its infrastructure. But high seventies is a good target. It's not like the historical Microsoft, you know, 80 to 90% gross margin. Not that Microsoft is there anymore, but Snowflake, you know, is gonna be limited in how much it can push gross margin because of that factor. It's got a tiny operating margin today, and it's targeting 20% in 2029, so that would be $2 billion. And you would certainly expect its operating leverage in the out years to enable much, much lower SG&A than the current 54%. I'm guessing R&D's gonna stay healthy, you know, coming in at 15% or so. 
But the real interesting number to watch is free cash flow: 16% this year for the full fiscal year, growing to 25% by 2029. So $2.5 billion in free cash flow in the out years, which I believe is up from Scarpelli's previous forecast in that, you know, out-year 2029 view. And expect the net revenue retention, the NRR, is gonna moderate. It's gonna come down, but it's still gonna be well over a hundred percent. We pegged it at 130%, based on some of Mike's guidance. Now, today, Snowflake and every other stock is well off this morning. The company had a $40 billion value but dropped well below that midday. But let's stick with the 40 billion on this sad Friday on the stock market. We'll go with 40 billion, and who knows what the stock is gonna be valued at in 2029? No idea, but let's say between 40 and 200 billion. And look, it could get even ugly in the market as interest rates rise. 
And if inflation stays high, you know, until we get a Paul Volcker-like action from the Fed chair, which is gonna be painful. You know, let's hope we don't have a repeat of the long, drawn-out 1970s stagflation, but that is a concern among investors. We're gonna try to keep it positive here, and we'll do a little sensitivity analysis of Snowflake based on Scarpelli's and Vellante's 2029 projections. What we've done here in this chart is calculated today's current valuation at about 40 billion and run a CAGR through 2029 with our estimates of valuation at that time. So if it stays at a 40 billion valuation, can you imagine, Snowflake grows into a 10 billion revenue company with no increase in valuation by fiscal 2029? That would be a major bummer, and investors would get a 0% return. At 50 billion, a 4% CAGR; at 60 billion, 7%. 
Now, a 7% market return is historically not bad relative to, say, the S&P 500, but with that kind of revenue and profitability growth projected by Snowflake, combined with inflation, that would again be kind of a buzzkill for investors. The picture at a 75 billion valuation isn't much brighter, but it picks up at a hundred billion; even with inflation, that should outperform the market. And as you get to 200 billion, which would track, by the way, revenue growth, you get a 30%-plus return, which would be pretty good. Could Snowflake beat these projections? Absolutely. 
Could the market perform at the optimistic end of the spectrum? Sure, it could. It could outperform these levels. Could it not perform at these levels? You bet. But hopefully this gives a little context and framework to what Scarpelli was talking about in his framework. Notwithstanding the market's unpredictability, you're on your own there. 
I can't help. Snowflake looks like it's going to continue, either way, an amazing run compared to other software companies historically, and whether that's reflected in the stock price, again, I can't predict. Okay, let's look at some ETR survey data, which aligns really well with what Snowflake is telling the street. This chart shows the breakdown of Snowflake's net score. And net score, remember, is ETR's proprietary methodology that measures the percent of customers in their survey that are adopting the platform new, that's the lime green at 19%; existing Snowflake customers that are spending 6% or more on the platform relative to last year, that's the forest green, that's 55%, that's a big number; flat spend, that's the gray at 21%; decreasing spending, that's the pinkish at 5%; and churning, that's the red, only 1% moving off the platform, tiny, tiny churn. Subtract the reds from the greens and you get a net score that nets out to 68%. 
That's a very impressive net score by ETR standards, but it's down from the highs of the high seventies and mid-eighties, where Snowflake has been since January of 2019. Note that this survey of 1,500 or so organizations includes 155 Snowflake customers. What was really interesting is when we cut the data by industry sector: two of Snowflake's most important verticals are finance and healthcare, and both of those sectors are holding a net score in the ETR survey at its historic range, 83%. Hasn't really moved off that, you know, 80%-plus number. Really encouraging. But retail/consumer showed a dramatic decline this past survey, from 73% in the previous quarter down to 54% in just three months' time. So this data aligns almost perfectly with what CFO Scarpelli has been telling the street, so I give a lot of credibility to that narrative. 
Now, here's a time series chart for the net score and the pervasion in the data set, meaning how penetrated Snowflake is in the survey. Again, net score measures spending velocity on a specific platform, and pervasion measures the presence in the data set. You can see the steep downward trend in net score this past quarter. Now, for context, note the red dotted line on the vertical axis at 40%. That's a bit of a magic number; anything above that is best-in-class in our view. Snowflake is still well, well above that line, but the April survey, as we reported on May 7th in quite a bit of detail, shows a meaningful break in the Snowflake trend, as shown by ETR's callout. On the bottom line, you can see a steady rise in the survey, which is a proxy for Snowflake's overall market penetration, so steadily moving up and up. 
Here's a bit of a different view on that data, bringing in some of Snowflake's peers and other data platforms. This XY graph shows net score on the vertical axis and pervasion on the horizontal, with the red dotted line at 40%. You can see from the ETR callouts, again, that Snowflake, while declining in net score, still holds the highest net score in the survey, so of course the highest of the data platforms. And while the spending velocity on Snowflake outperforms that of the AWS and Microsoft data platforms, those two are still well above the 40% line, with a stronger market presence in the category. That's impressive because of their size. And you can see Google Cloud and MongoDB right around the 40% line. Now, we reported on Mongo last week and discussed the commentary on consumption models. 
And we referenced Raimo Lenschow's research, which we thought was quite thoughtful, that rewarded MongoDB for its forecasting transparency and accuracy, and less likelihood of facing consumption headwinds. And I'll reiterate what I said last week: Snowflake, while seeing demand fluctuations this past quarter from those large customers, is not like a data lake, where you're just gonna shove data in and figure it out later, no schema on write, just throw it into the pond. That's gonna be more discretionary, and you can turn that stuff off more likely. Now, you bring data into the Snowflake data cloud with the intent of driving insights, which leads to actions, which leads to value creation. And as Snowflake adds capabilities and expands its platform features and innovations and its ecosystem, more and more data products are gonna be developed in the Snowflake data cloud. And by data products, 
we mean products and services that are conceived by business users and that can be directly monetized, not just via analytics, but through governed data sharing and direct monetization. Here's a picture of that opportunity as we see it. This is our spin on our Snowflake total available market chart that we've published many, many times. The key point here goes back to our opening statements: the Snowflake data cloud is evolving well beyond just being a simpler, easier-to-use, and more elastic cloud database. Snowflake is building what we often refer to as a super cloud, that is, an abstraction layer that comprises rich features and leverages the underlying primitives and APIs of the cloud providers, but hides all that complexity and adds new value beyond that infrastructure. That value is seen in the left example in terms of compressed cycle time; Snowflake often uses the example of pharmaceutical companies compressing time to discover a drug by years. 
Great example; there are many others. And then, through organic development and ecosystem expansion, Snowflake will accelerate feature delivery. Snowflake's data cloud vision is not about vertically integrating all the functionality into its platform. Rather, it's about creating a platform and delivering secure, governed, facile, and powerful analytics and data sharing capabilities to its customers and partners in a broad ecosystem, so they can create additional value on top of that. Ecosystem is how Snowflake fills the gaps in its platform. By building the best cloud data platform in the world in terms of collaboration, security, governance, developer friendliness, machine intelligence, etcetera, Snowflake believes in and plans to create a de facto standard, in our view, in data platforms. Get your data into the data cloud, and all these native capabilities will be available to you. Now, is that a walled garden? Some might say it is. It's an interesting question, and <laugh> it's a moving target. 
It's definitely proprietary in the sense that Snowflake is building something that is highly differentiable and is building a moat around it. But the more open Snowflake can make its platform, the more open source it uses, the more developer-friendly it is, the greater the likelihood people will gravitate toward Snowflake. Now, my new friend Zhamak Dehghani, she's the creator of the data mesh concept. She might bristle at this narrative in favor of a more open source version of what Snowflake is trying to build, but practically speaking, I think she'd recognize that we're a long ways off from that. And I also think she'd recognize the benefits of a platform that, despite requiring data to be inside of the data cloud, can distribute data globally, enable facile, governed, and computational data sharing, and to a large degree be a self-service platform for data product builders. 
So this is how we see the Snowflake data cloud vision evolving. The question is: is edge part of that vision, on the right-hand side? 
Well, again, we think that is going to be a future challenge, where the ecosystem is gonna have to come into play to fill those gaps. If Snowflake can tap the edge, it'll bring even more clarity as to how it can expand into what we believe is a massive $200 billion TAM. Okay, let's close on next week's Snowflake Summit in Las Vegas. theCUBE is very excited to be there. I'll be hosting with Lisa Martin, and we'll have Frank Slootman as well as Christian Kleinerman and several other Snowflake experts; analysts are gonna be there, customers, and we're gonna have a number of ecosystem partners on as well. Here's what we'll be looking for, at least some of the things: evidence that our view of Snowflake's data cloud is actually taking shape and evolving in the way that we showed on the previous chart. We also wanna figure out where Snowflake is with its 
Streamlit acquisition. Remember, Streamlit is a data science play and an expansion into Databricks territory; Databricks and Snowflake have been going at it for a while. Streamlit brings an open source Python library and a machine learning, kind of developer-friendly, data science environment. We also expect to hear some discussion, hopefully a lot of discussion, about developers. Snowflake has a dedicated developer conference in November, so we expect to hear more about that, and how it's gonna be further leveraging Snowpark, which it has previously announced, including a public preview of programming for unstructured data, and data monetization along the lines of what we suggested earlier, that is, building data products that have the bells and whistles of native Snowflake and can be directly monetized by Snowflake's customers. Snowflake's already announced a new workload this past week in security, and we'll be watching for others. 
And finally, what's happening in the all-important ecosystem? One of the things we noted when we covered ServiceNow, 'cause we use ServiceNow as an example because Frank Slootman and Mike Scarpelli and others, you know, their DNA were there and they're improving on that, is that ServiceNow in its post-IPO early adult years had a very slow pace of ecosystem development; in our view, that was often one of our criticisms. They had some niche SIs, uh, like Cloud Sherpas, and eventually the big guys came in and began to really lean in. And you had some other innovators kind of circling the mothership, some smaller companies. But generally, we see Slootman emphasizing ecosystem growth much, much more than at his previous company. And that is a fundamental requirement, in our view, of any modern cloud company. Now, to paraphrase the crazy man Steve Ballmer, developers, developers, developers, 'cause he screamed it and ranted and ran around the stage and was sweating: <laugh> ecosystem, ecosystem, ecosystem equals optionality for developers, and that's what they want. 
And that's how we see the current and future state of Snowflake. Thanks today. If you're in Vegas next week, please stop by and say hello to theCUBE. Thanks to my colleagues: Stephanie Chan, who sometimes helps research Breaking Analysis topics; Alex Myerson, who is on production; and today, Andrew Frick, Sarah Hiney, Steven Conti, Anderson Hill, Chuck, and the entire team in Palo Alto, including Christian. Sorry, didn't mean to forget you, Christian Writer, of course. Kristin Martin and Cheryl Knight, they help get the word out. And Rob Hof is our EIC over at SiliconANGLE. Remember, all these episodes are available as podcasts; wherever you listen, search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com. You can email me directly anytime at david.vellante@siliconangle.com. If you got something interesting, I'll respond. 
If not, I won't. Or DM me @dvellante, or comment on my LinkedIn posts. And please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next week, I hope. If not, we'll see you next time on Breaking Analysis.
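As a footnote, the two bits of arithmetic that anchor this analysis, ETR's net score and the valuation CAGR sensitivity, can be sketched in a few lines of Python. This is illustrative only: the survey percentages and valuation figures come from the episode, while the six-year horizon to fiscal 2029 is an assumption made here so the implied growth rates roughly match the ones quoted.

```python
def net_score(adopting, spending_more, flat, spending_less, churning):
    """ETR-style net score: the greens minus the reds, in percentage points."""
    return adopting + spending_more - spending_less - churning

def cagr(start_value, end_value, years):
    """Compound annual growth rate implied by going from start to end."""
    return (end_value / start_value) ** (1 / years) - 1

# Snowflake's net score from the April survey: 19% new adoptions, 55% spending
# 6% or more, 21% flat, 5% decreasing, 1% churning -> nets out to 68.
print(net_score(19, 55, 21, 5, 1))  # 68

# Valuation sensitivity: from a ~$40B valuation today, roughly six years out.
for target in (50, 60, 100, 200):
    print(target, round(cagr(40, target, 6) * 100, 1))
```

Running the loop reproduces the episode's rough figures: about 4% at $50 billion, 7% at $60 billion, and a 30%-plus return at $200 billion.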
Glenn Grossman and Yusef Khan | Io-Tahoe ActiveDQ Intelligent Automation
>>from around the globe, it's theCUBE, presenting ActiveDQ intelligent automation for data quality, brought to you by Io-Tahoe. >>Welcome to the sixth episode of the Io-Tahoe data automation series on theCUBE. We're gonna start off with a segment on how to accelerate the adoption of Snowflake, with Glenn Grossman, who is the enterprise account executive from Snowflake, and Yusef Khan, the head of data services from Io-Tahoe. Gentlemen, welcome. >>Good afternoon. Good morning, good evening, Dave. >>Good to see you, Dave. Good to see you. >>Okay, Glenn, let's start with you. I mean, theCUBE hosted the Snowflake Data Cloud Summit in November, and we heard from customers, and, going from, love the tagline, zero to Snowflake in 90 minutes, very quickly. And of course you want to make it simple and attractive for enterprises to move data and analytics into the Snowflake platform, but help us understand: once the data is there, how is Snowflake helping to achieve savings compared to the data lake? >>Absolutely, Dave. It's a great question. You know, it starts off first with the notion, and we kind of coined it in the industry, of t-shirt size pricing. You don't necessarily always need the performance of a high-end sports car when you're just trying to go get some groceries and drive down the street at 20 mph. The t-shirt size pricing really aligns, depending on what your operational workload is, to support the business and the value that you need from that business. Not every day do you need data every second of the moment; it might be once a day, once a week. Through that t-shirt size price we can align the performance according to the environmental needs of the business, what those drivers are, the key performance indicators to drive that insight to make better decisions. It allows us to control that cost. So to my point, not always do you need the performance of a Ferrari?
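In concrete terms, the t-shirt sizing Glenn describes follows Snowflake's credit ladder, where each warehouse size step doubles the credits consumed per hour, and a warehouse is only billed while it runs. A minimal sketch of the cost intuition (the per-credit price below is an illustrative assumption, not a quoted rate):

```python
# Sketch of t-shirt-size warehouse economics. Snowflake's standard sizes
# double in credits per hour at each step; the dollar price per credit
# varies by contract, so the default below is purely illustrative.

CREDITS_PER_HOUR = {
    "XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16, "2XL": 32,
}

def estimated_cost(size: str, hours_active: float, price_per_credit: float = 3.0) -> float:
    """Estimate spend for a warehouse that is billed only while running."""
    return CREDITS_PER_HOUR[size] * hours_active * price_per_credit

# "Groceries at 20 mph": a small warehouse, one hour a day for 30 days.
daily_reporting = estimated_cost("S", hours_active=30 * 1.0)
# The "Ferrari": an XL left running around the clock for the same month.
always_on_xl = estimated_cost("XL", hours_active=30 * 24)

print(f"S, 1h/day for 30 days: ${daily_reporting:,.0f}")   # $180
print(f"XL, 24x7 for 30 days: ${always_on_xl:,.0f}")       # $34,560
```

The point of the sketch is that right-sizing plus auto-suspend, not raw discounting, is where the data lake comparison gets won.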
Maybe you need the performance and gas mileage of the Honda Civic, if that will just get and deliver the value to the business, but knowing that you have that entire performance landscape at a moment's notice. And that's really what allows us to control cost and get away from "how much is it going to cost me?" in a data lake type of environment. >>Got it. Thank you for that. Yusef, where does Io-Tahoe fit into this equation? I mean, what's unique about the approach that you're taking towards this notion of mobilizing data on Snowflake? >>Well, Dave, in the first instance we profile the data itself at the data level, so not just at the level of metadata, and we do that wherever that data lives. So it could be structured data, it could be semi-structured data, it could be unstructured data, and that data could be on-premise, it could be in the cloud, or it could be on some kind of SaaS platform. And so we profile this data at the source system that is feeding Snowflake, within Snowflake itself, and within the end applications and the reports that the Snowflake environment is serving. So what we've done here is take our machine learning discovery technology and make Snowflake itself the repository for knowledge and insights on data. And this is pretty unique. Automation in the form of RPA is being applied to the data before, after and within Snowflake. And so the ultimate outcome is that business users can have a much greater degree of confidence that the data they're using can be trusted. The other thing we do, which is unique, is employ data RPA to proactively detect and recommend fixes to data quality, so that removes the manual time and effort and cost it takes to fix those data quality issues if they're left unchecked and untouched. >>So that's key, two things there: trust, because nobody's gonna use the data if it's not trusted, but also context. If you think about it, we've contextualized our operational systems but not our analytic systems.
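The profile-and-recommend loop Yusef describes can be pictured with a toy sketch. Io-Tahoe's actual discovery uses machine learning; the simple rules, thresholds and column names below are stand-ins to show the shape of the idea, profiling the values themselves and attaching recommended fixes:

```python
# Toy data-profiling pass: inspect actual values (not just metadata),
# score quality, and recommend fixes. The 20% null threshold and the
# email rule are illustrative stand-ins for learned checks.
import re

def profile_column(name, values):
    non_null = [v for v in values if v not in (None, "", "N/A")]
    null_rate = 1 - len(non_null) / len(values)
    looks_like_email = all(
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) for v in non_null
    )
    issues = []
    if null_rate > 0.2:
        issues.append(f"fix: backfill or quarantine {null_rate:.0%} missing values")
    if name.lower().endswith("email") and not looks_like_email:
        issues.append("fix: malformed emails found; apply format repair rule")
    return {"column": name, "null_rate": round(null_rate, 2), "recommended_fixes": issues}

report = profile_column("customer_email",
                        ["a@x.com", "b@y.org", None, "not-an-email"])
print(report)
```

The same pass can run at the source system, inside Snowflake, and against the reports downstream, which is what gives a single trusted view.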
So there's a big step forward, Glenn. I wonder if you can tell us how customers are managing data quality when they migrate to Snowflake, because there's a lot of baggage in traditional data warehouses and data lakes and data hubs. Maybe you can talk about why this is a challenge for customers. And, like, for instance, can you proactively address some of those challenges that customers face? >>We certainly can. You know, with data quality, legacy data sources are always inherent with DQ issues. Whether it's been master data management or data stewardship programs over the last almost two decades now, you do have systemic data issues. You have siloed data, you have informational and operational data stores and data marts. It became a hodgepodge. When organizations are starting their journey to migrate to the cloud, one of the things that we're first doing is that inspection of data, you know, first and foremost even looking to retire legacy data sources that aren't even used across the enterprise, but because they were part of the systemic, long-running, operational on-premise technology, it stayed there. When we start to look at data pipelines as we onboard a customer, we want to do that early. We want to do QA and quality assurance so that we can, and this is our ultimate goal, eliminate the garbage-in, garbage-out scenarios that we've been plagued with really over the last 40, 50 years of just data in general. So we have to take an inspection, where traditionally it was ETL; now, in the world of Snowflake, it's really ELT. We're extracting, we're loading, we're inspecting, then we're transforming out to the business, so that these routines can be done once, and again give great business value back to making decisions around the data, instead of spending all this long time always re-architecting the data pipeline to serve the business. >>Got it, thank you. Glenn, Yusef, of course, Snowflake's renowned; customers tell me all the time, it's so easy.
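The ELT-with-inspection flow Glenn outlines, land the raw data first, run the QA gate once inside the platform, then transform for the business, can be sketched like this (table shape, rules and field names are illustrative):

```python
# Sketch of ELT with an inspection step: raw rows are landed as-is,
# a QA gate quarantines the "garbage in", and only inspected rows are
# transformed into the business-facing shape.

RAW = [
    {"order_id": "1001", "amount": "250.00"},
    {"order_id": "", "amount": "oops"},        # the "garbage in"
    {"order_id": "1002", "amount": "99.50"},
]

def inspect(rows):
    """QA gate: keep rows that pass every rule, quarantine the rest."""
    good, quarantined = [], []
    for r in rows:
        try:
            valid = r["order_id"] != "" and float(r["amount"]) >= 0
        except ValueError:
            valid = False
        (good if valid else quarantined).append(r)
    return good, quarantined

def transform(rows):
    """Business-facing shape, applied only to rows that passed inspection."""
    return [{"order_id": int(r["order_id"]), "amount": float(r["amount"])} for r in rows]

good, quarantined = inspect(RAW)
clean = transform(good)
print(f"{len(clean)} rows served, {len(quarantined)} quarantined")  # 2 rows served, 1 quarantined
```

Because the inspection runs once, after the load, the same vetted rows can feed every downstream transformation instead of each pipeline re-checking its own copy.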
It's so easy to spin up a data warehouse. It helps with my security. Again, it simplifies everything. But, you know, getting started is one thing; adoption is also key. So I'm interested in the role that Io-Tahoe plays in accelerating adoption for new customers. >>Absolutely, David. I mean, as Glenn said, you know, every migration to Snowflake is going to have a business case, and that is going to be partly about reducing spend on legacy IT: servers, storage, licenses, support, all those good things that CIOs want to be able to turn off entirely, ultimately. And what Io-Tahoe does is help discover all the legacy undocumented silos that have been built up, as Glenn says, on the data estate across a period of time, build intelligence around those silos, and help reduce those legacy costs sooner by accelerating that whole process. Because obviously, the quicker that IT and CDOs can turn off legacy data sources, the more funding and resources are going to be available to them to manage the new Snowflake-based data estate on the cloud. And so turning off the old and building the new go hand in hand, to make sure those numbers stack up, the program is delivered, and the benefits are delivered. And so what we're doing here with Io-Tahoe is improving the customer's ROI by accelerating their ability to adopt Snowflake. >>Great. And I mean, we're talking a lot about data quality here, but in a lot of ways that's table stakes; like I said, if you don't trust the data, nobody's going to use it. And Glenn, I mean, I look at Snowflake and I see obviously the ease of use, the simplicity, you guys are nailing that. The data sharing capabilities I think are really exciting, because, you know, everybody talks about sharing data, but then we talk about data as an asset, and everyone wants to hold onto it. And so sharing is something that I see as a paradigm shift, and you guys are enabling that.
So what are the things beyond data quality that are notable, that customers are excited about, that maybe you're excited about? >>David, I think you just called it out. It's this massive data sharing play, part of the Data Cloud platform. You know, just as of last year we had a little over 100, 100 vendors, in our data marketplace. That number today is well over 450. It is all about democratizing and sharing data in a world that is no longer held back by FTPs and CSVs, and then the organization having to take that data and ingest it into their systems. You're a Snowflake customer and want to subscribe to an S&P data source, as an example? Go subscribe to it. It's in your account. There was no data engineering, there was no physical lift of data, and that becomes the most important thing when we talk about getting broader insights. Data quality? Well, the data has already been inspected by your vendor; it's just available in your account. It's obviously a very simplistic thing to describe; behind the scenes is what our founders have created to make it very, very easy for us to democratize, not only internally with private sharing of data, but with this notion of marketplace sharing across your customers. Marketplace is certainly top of mind for all of my customers, and something others might have heard out of a recent cloud summit is the introduction of Snowpark, and where all this data is going towards AI and ML, you know, along with our partners at Io-Tahoe and RPA automation. What do we do with all this data? How do we put the algorithms to work on it now? We'll be able to run, in the future, R and Python scripts and Java libraries directly inside Snowflake, which allows you to accelerate even faster, which people found traditionally when we started off eight years ago just as a data warehousing platform. >>Yeah, I think we're on the cusp of just a new way of thinking about data.
I mean, obviously simplicity is a starting point, but data by its very nature is decentralized. You talk about democratizing data; I like this idea of the global mesh. I mean, it's a very powerful concept, and again it's early days, but a key part of this is automation and trust. Yusef, you've worked with Snowflake and you're bringing ActiveDQ to the market. What are customers telling you so far? >>Well, David, the feedback so far has been great, which is brilliant. So, I mean, firstly there's a point about speed and acceleration, so that's the speed to insight, really. So where you have inherent data quality issues, whether that's with data that was on-premise and being brought into Snowflake, or on Snowflake itself, we're able to show the customer results and help them understand their data quality better within day one, which is a fantastic acceleration. Related to that, there's the cost and effort to get that insight; it's a massive productivity gain versus where you're seeing customers who've been struggling, sometimes, to remediate legacy data and legacy decisions that they've made over the past couple of decades, so that cost and effort is much lower than it would otherwise have been. Thirdly, there's confidence and trust. So you can see CDOs and CIOs have got demonstrable results, in that they've been able to improve data quality across a whole bunch of use cases: for business users in marketing and customer services, for commercial teams, for financial teams. So there's that very quick growth in confidence and credibility as the projects get moving. And then finally, really all the use cases for Snowflake depend on data quality, whether it's data science or the kind of Snowpark applications that Glenn has talked about. All those use cases work better when we're able to accelerate the ROI for our joint customers by very quickly pushing out these data quality insights.
And I think one of the things that Snowflake have recognized is that in order for CIOs to really adopt enterprise-wide, as well as the great technology that Snowflake offers, it's about cleaning up that legacy data estate, freeing up the budget for CIOs to spend on the new, modern data estate that lets them mobilize their data with Snowflake. >>So you're seeing this sensible progression: we're simplifying the analytics from a tech perspective, you bring in federated governance, which brings more trust, then you bring in the automation of the data quality piece, which is fundamental. And now you can really start to, as you guys are saying, democratize and scale and share data. Very powerful, guys. Thanks so much for coming on the program. Really appreciate your time. >>Thank you. I appreciate it as well. Yeah.
SUMMARY :
It's the head of data services from Io-Tahoe. Good afternoon. Good to see you. I mean the Cube hosted the Snowflake Data Cloud Summit and the value that you need from that business? Thank you for that, Yusef. so not just at the level of metadata and we do that wherever that data lives. so that's key to things their trust, nobody's gonna use the data. Always re-architecting the data pipeline to serve the business. Again it simplifies everything but so you know, getting started is one thing but then I mean as Glenn said, you know every migration to Snowflake is going I see obviously the ease of use the simplicity you guys are nailing that the data sharing that might have heard out of a recent cloud summit is the introduction of Snowpark and I mean it's very powerful concept and again it's early days but you know, Um So that's the speed to insight And now you can really start to, as you guys are saying, democratized and scale uh and I appreciate as well.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
David | PERSON | 0.99+ |
Glenn Grossman | PERSON | 0.99+ |
Ben | PERSON | 0.99+ |
Io Tahoe | ORGANIZATION | 0.99+ |
Yusef Khan | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
20 mph | QUANTITY | 0.99+ |
Glenn | PERSON | 0.99+ |
CIA | ORGANIZATION | 0.99+ |
IOS | TITLE | 0.99+ |
Glenda | PERSON | 0.99+ |
90 minutes | QUANTITY | 0.99+ |
100 vendors | QUANTITY | 0.99+ |
Ferrari | ORGANIZATION | 0.99+ |
last year | DATE | 0.99+ |
One | QUANTITY | 0.99+ |
first | QUANTITY | 0.99+ |
first instance | QUANTITY | 0.99+ |
November | DATE | 0.99+ |
sixth episode | QUANTITY | 0.99+ |
once a day | QUANTITY | 0.99+ |
once a week | QUANTITY | 0.98+ |
Senate | ORGANIZATION | 0.98+ |
today | DATE | 0.98+ |
both | QUANTITY | 0.98+ |
eight years ago | DATE | 0.97+ |
Yusef Khan | PERSON | 0.97+ |
over | QUANTITY | 0.96+ |
one | QUANTITY | 0.95+ |
R. P. A. Automation | ORGANIZATION | 0.95+ |
python | TITLE | 0.95+ |
Tahoe | ORGANIZATION | 0.94+ |
I. O. Tahoe | TITLE | 0.93+ |
Honda | ORGANIZATION | 0.93+ |
Io-Tahoe | ORGANIZATION | 0.93+ |
one thing | QUANTITY | 0.91+ |
Io Tahoe | PERSON | 0.87+ |
firstly | QUANTITY | 0.87+ |
Civic | COMMERCIAL_ITEM | 0.87+ |
Snowflake | TITLE | 0.86+ |
Tahoe | PERSON | 0.85+ |
Ayatollah | PERSON | 0.84+ |
Snowflake | EVENT | 0.83+ |
past couple of decades | DATE | 0.82+ |
about 100 people | QUANTITY | 0.81+ |
two decades | QUANTITY | 0.8+ |
over 450 | QUANTITY | 0.79+ |
40, 50 years | QUANTITY | 0.76+ |
Day one | QUANTITY | 0.75+ |
glenn | PERSON | 0.74+ |
java | TITLE | 0.72+ |
snowflake | EVENT | 0.7+ |
Iota Ho | ORGANIZATION | 0.68+ |
P. | ORGANIZATION | 0.62+ |
ActiveDQ Intelligent Automation | ORGANIZATION | 0.61+ |
snowflake data cloud summit | EVENT | 0.6+ |
Iota | LOCATION | 0.58+ |
FTp | TITLE | 0.56+ |
Snowflake | ORGANIZATION | 0.54+ |
zero | QUANTITY | 0.53+ |
R | TITLE | 0.52+ |
O. | EVENT | 0.41+ |
C. | EVENT | 0.34+ |
Io-Tahoe Episode 5: Enterprise Digital Resilience on Hybrid and Multicloud
>>from around the globe, it's theCUBE, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io-Tahoe. Hello, everyone, and welcome to our continuing series covering data automation, brought to you by Io-Tahoe. Today we're gonna look at how to ensure enterprise resilience for hybrid and multi-cloud. Let's welcome in Ajay Vohora, who is the CEO of Io-Tahoe. AJ, always good to see you again. Thanks for coming on. >>Great to be back, David. Pleasure. >>And he's joined by Fozzie Coons, who is a global principal architect for the financial services vertical at Red Hat. He's got deep experience in that sector. Welcome, Fozzie. Good to see you. >>Thank you very much. Happy to be here. >>Fozzie, let's start with you. Look, there are a lot of views on cloud and what it is. I wonder if you could explain to us how you think about what a hybrid cloud is and how it works. >>Sure, yes. So the hybrid cloud is an IT architecture that incorporates some degree of workload portability, orchestration and management across multiple clouds. Those clouds could be private cloud or public cloud, or even your own data centers. And how does it all work? It's all about secure interconnectivity and on-demand allocation of resources across clouds, and separate clouds can become hybrid when they're seamlessly interconnected. And it is that interconnectivity that allows the workloads to be moved and how management can be unified and orchestrated. How well you have these interconnections has a direct impact on how well your hybrid cloud will work. >>Okay, so Fozzie, staying with you for a minute. So in the early days of cloud, that term private cloud was thrown around a lot, but it often just meant virtualization of an on-prem system and a network connection to the public cloud. Let's bring it forward. What, in your view, does a modern hybrid cloud architecture look like? >>Sure.
So for modern hybrid clouds, we see that teams and organizations need to focus on the portability of applications across clouds. That's very important, right? And when organizations build applications, they need to build and deploy these applications as small collections of independent, loosely coupled services, and then have those things run on the same operating system, which means, in other words, running it on Linux everywhere, and building cloud-native applications, and being able to manage and orchestrate these applications with platforms like Kubernetes or Red Hat OpenShift, for example. >>Okay, so that's definitely different from building a monolithic application that's fossilized and doesn't move. So what are the challenges for customers, you know, to get to that modern cloud as you've just described it? Is it skill sets? Is it the ability to leverage things like containers? What's your view there? >>So, I mean, from what we've seen around the industry, especially around financial services, where I spend most of my time, the first thing we see is management, right, because you have all these clouds and all these applications. You have a massive array of connections, of interconnections. You also have a massive array of integrations, portability and resource allocations as well, and then orchestrating all those different moving pieces, things like storage and networks, those are really difficult to manage, right? That's one. So management is the first challenge. The second one is workload placement. Where do you place these cloud-native applications? Do you keep them on-site, on-prem, and what do you put in the cloud? That is the other challenge, a major one. The third one is security. Security now becomes the key challenge and concern for most customers, and we could talk about that for hours. >>Yeah, we're definitely gonna dig into that.
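The cloud-native pattern Fozzie describes, small, independently deployable services that run unchanged on any conformant cluster, is typically expressed as a Kubernetes manifest. A minimal illustrative example (the service name and image are placeholders, not from the discussion):

```yaml
# Illustrative only: a small, loosely coupled service whose one manifest
# can run on any conformant cluster, on-prem OpenShift, AWS, or GCP,
# which is the portability being described above.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: payments-svc            # placeholder service name
spec:
  replicas: 3                   # scaled independently of other services
  selector:
    matchLabels:
      app: payments-svc
  template:
    metadata:
      labels:
        app: payments-svc
    spec:
      containers:
        - name: payments-svc
          image: registry.example.com/payments-svc:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Because the manifest describes desired state rather than a specific host, the same file is what makes the workload portable across the clouds Fozzie lists.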
Let's bring AJ into the conversation. AJ, you know, you and I have talked about this in the past. One of the big problems that virtually every company faces is data fragmentation. Talk a little bit about how Io-Tahoe unifies data across both traditional legacy systems and how it connects to these modern IT environments. >>Yeah, sure, Dave. I mean, Fozzie just nailed it. It used to be about the volume of data and the different types of data, but as applications become more connected and interconnected, the location of that data really matters, and so does how we serve that data up to those apps. So, working in our partnership with Red Hat, being able to inject our data discovery machine learning into these multiple different locations, whether it be in AWS, on IBM Cloud, or GCP, or on-prem, and being able to automate that discovery and pull that single view of where all my data is, then allows the CIO to manage costs. They can do things like keep the data where it is, on-premise or in my Oracle Cloud or in my IBM Cloud, and connect the application that needs to feed off that data. And the way in which you do that is machine learning that learns over time, as it recognizes different types of data, applies policies to classify that data, and brings it all together with automation. >>Right, and that's one of the big themes, and we've talked about this on earlier episodes: it's really simplification, really abstracting a lot of that heavy lifting away so we can focus on things, AJ, as you just mentioned. Now, Fozzie, one of the big challenges that, of course, we all talk about is governance across these disparate data sets. I'm curious as to your thoughts: how does Red Hat really think about helping customers adhere to corporate edicts and compliance regulations, which, of course, are particularly acute within financial services?
So for banks and the payment providers, like you've just mentioned their insurers and many other financial services firms, Um, you know, they have to adhere Thio standards such as a PC. I. D. S s in Europe. You've got the G g d p g d p r, which requires strange and tracking, reporting documentation. And you know, for them to to remain in compliance and the way we recommend our customers to address these challenges is by having an automation strategy. Right. And that type of strategy can help you to improve the security on compliance off the organization and reduce the risk after the business. Right. And we help organizations build security and compliance from the start without consulting services residencies. We also offer courses that help customers to understand how to address some of these challenges. And that's also we help organizations build security into their applications without open sources. Mueller, where, um, middle offerings and even using a platform like open shift because it allows you to run legacy applications and also continue rights applications in a unified platform right And also that provides you with, you know, with the automation and the truly that you need to continuously monitor, manage and automate the systems for security and compliance >>purposes. Hey, >>Jay, anything. Any color you could add to this conversation? >>Yeah, I'm pleased. Badly brought up Open shift. I mean, we're using open shift to be able. Thio, take that security application of controls to to the data level. It's all about context. So, understanding what data is there being able to assess it to say who should have access to it. Which application permission should be applied to it. Um, that za great combination of Red Hat tonight. Tahoe. >>But what about multi Cloud? Doesn't that complicate the situation even even further? Maybe you could talk about some of the best practices to apply automation across not only hybrid cloud, but multi >>cloud a swell. Yeah, sure. >>Yeah. 
So the right automation solution, you know, can be the difference between cultivating an automated enterprise or automation chaos. And some of the recommendations we give our clients is to look for an automation platform that can offer, and the first thing is, complete support. So that means an automation solution that promotes IT availability and reliability with your platform, so that you can provide, you know, enterprise-grade support, including security and testing, integration and clear roadmaps. The second thing is vendor interoperability, in that you are going to be integrating multiple clouds, so you're going to need a solution that can connect to multiple clouds seamlessly, right? And with that comes the challenge of maintainability, so you're going to need to look into an automation solution that is easy to learn or has an easy learning curve. And then the fourth idea that we tell our customers is scalability. In the hybrid cloud space, scale is a big, big deal here, and you need to deploy an automation solution that can span across the whole enterprise in a consistent manner, right? And then also one that allows you, finally, to integrate the multiple data centers that you have. >>So AJ, I mean, this is a complicated situation, for if a customer has to make sure things work on AWS or Azure or Google, they're gonna spend all their time doing that, huh? What can you add to really simplify that multi-cloud and hybrid cloud equation? >>Yeah, I can give a few customer examples here. One being a manufacturer that we've worked with to drive that simplification, and the real bonus for them has been a reduction in cost. We worked with them late last year to bring the cost base down by $10 million in 2021 so they could hit that reduced budget.
And what we brought to that was the ability to deploy, using OpenShift templates, into their different environments, whether that is on-premise and/or, as you mentioned, AWS. They had GCP as well for their marketing team, and across those different platforms, being able to use a template, use pre-built scripts, to get up and running, and catalog and discover that data within minutes. It takes away the legacy of having teams of people having to jump on workshop calls, and I know we're all on a lot of Teams and Zoom calls in these current times; there just simply aren't enough hours in the day to manually perform all of this. So yeah, working with Red Hat, applying machine learning into those templates, those little recipes, we can put that automation to work regardless of which location the data is in, and that allows us to pull that unified view together, right? >>Thank you, Fozzie, I wanna come back to you. So in the early days of cloud, you were in the Big Apple, you know financial services really well, and cloud was like an evil word within financial services; obviously that's changed, it's evolved. We talked about how the pandemic has even accelerated that. And when you really dug into it, when you talked to customers about their experiences with security in the cloud, it was not that it wasn't good. It was great, whatever, but it was different. And there's always this issue of a lack of skills, and multiple tools suck up teams; they're really overburdened. But the cloud requires new thinking. You've got the shared responsibility model, you've obviously got specific corporate requirements and compliance. So this is even more complicated when you introduce multiple clouds. So what are the differences that you can share from your experience running either on-prem or on a mono cloud versus across clouds? What do you suggest there?
Yeah, you know, because of these complexities that you have explained here, misconfigurations and inadequate change control are the top security threats. So human error is what we want to avoid, because, you know, as your clouds grow in complexity and you put humans in the mix, then the rate of errors is going to increase, and that is going to increase exposure to security threats. So this is where automation comes in, because automation will streamline and increase the consistency of your infrastructure management, and also application development and even security operations, to improve your protection, compliance and change control. So you want to consistently configure resources according to pre-approved policies, and you want to proactively maintain them in a repeatable fashion over the whole lifecycle. And then you also want to rapidly identify systems that require patches and reconfiguration, and automate that process of patching and reconfiguring, so that you don't have humans doing this type of thing, right? And you want to be able to easily apply patches and change system settings according to predefined policies, based on, like I explained before, you know, the pre-approved policies, and you also want ease of auditing and troubleshooting, right? And from a Red Hat perspective, we provide tools that enable you to do this. We have, for example, a tool called Ansible that enables you to automate data center operations and security and also deployment of applications, and OpenShift itself, you know, automates most of these things and abstracts the human beings from putting their fingers on things and potentially introducing errors. Now, looking into the new world of multiple clouds and so forth, the differences that we're seeing here between running in a single cloud or on-prem are in three main areas, which are control, security and compliance.
Right, control here means: if you're on-premise or you have one cloud, you know, in most cases you have control over your data and your applications, especially if you're on-prem. However, if you're in the public cloud, there is a difference there. The ownership is still yours, but your resources are running on somebody else's, the public cloud's, you know, AWS's and so forth, infrastructure. So people that are going to do this, especially banks and governments, need to be really aware of the regulatory constraints of running those applications in the public cloud, and we also help customers rationalize some of these choices. And also on security, you will see that if you're running on-premises or in a single cloud, you have more control, especially if you're on-prem. You can control the sensitive information that you have. However, in the cloud, that's a different situation, especially for personal information of employees and things like that. You need to be really careful of that, and, again, we help you rationalize some of those choices. And then the last one is compliance. As well, you see that if you're running on-prem or in a single cloud, regulations come into play again, right? And if you're running on-prem, you have control over that. You can document everything; you have access to everything that you need. But if you're gonna go to the public cloud, again, you need to think about that. We have automation, and we have standards that can help you, you know, address some of these challenges for security and compliance.
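The repeatable, policy-driven patching Fozzie describes is the kind of thing an Ansible play captures: the approved change is written once and applied identically everywhere. A minimal sketch (the inventory group and the specific tasks are assumptions for illustration, not from the discussion):

```yaml
# Illustrative Ansible play: apply pre-approved security patches the same
# way on every host, so no human fingers touch individual servers.
# "app_servers" is a placeholder inventory group.
- name: Apply approved security patches
  hosts: app_servers
  become: true
  tasks:
    - name: Install latest security-related updates only
      ansible.builtin.dnf:
        name: "*"
        security: true
        state: latest

    - name: Ensure the audit daemon is running for compliance reporting
      ansible.builtin.service:
        name: auditd
        state: started
        enabled: true
```

Running the same play in every environment is what turns patching from a per-server manual task into the auditable, consistent process described above.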
And there was a lot of misunderstanding in the early days of cloud. I mean, yeah, maybe a W s is gonna physically secure the, you know, s three, but in the bucket. But we saw so many Miss configurations early on. And so it's key to have partners that really understand this stuff and can share the experiences of other clients. So this all sounds great. A j. You're sharp, you know, financial background. What about the economics? >>You >>know, our survey data shows that security it's at the top of the spending priority list, but budgets are stretched thin. E especially when you think about the work from home pivot and and all the areas that they had toe the holes that they had to fill their, whether it was laptops, you know, new security models, etcetera. So how do organizations pay for this? What's the business case look like in terms of maybe reducing infrastructure costs so I could, you know, pay it forward or there's a There's a risk reduction angle. What can you share >>their? Yeah. I mean, the perspective I'd like to give here is, um, not being multi cloud is multi copies of an application or data. When I think about 20 years, a lot of the work in financial services I was looking at with managing copies of data that we're feeding different pipelines, different applications. Now what we're saying I talk a lot of the work that we're doing is reducing the number of copies of that data so that if I've got a product lifecycle management set of data, if I'm a manufacturer, I'm just gonna keep that in one location. But across my different clouds, I'm gonna have best of breed applications developed in house third parties in collaboration with my supply chain connecting securely to that. That single version of the truth. What I'm not going to do is to copy that data. So ah, lot of what we're seeing now is that interconnectivity using applications built on kubernetes. 
Decoupled from the data source, that allows us to reduce those copies of data, and with that you gain security capability and resilience, because you're not leaving yourself open through multiple copies of data; and with those copies comes cost: the cost of storage and the cost of compute. So what we're seeing is using multi-cloud to leverage the best of what each cloud platform has to offer, and that goes all the way to Snowflake and Heroku and cloud-managed databases, too. >> Well, and the people cost as well, when you think about, yes, the copy creep, but then when something goes wrong a human has to come in and figure it out. You brought up Snowflake and this vision of the data cloud. I think we're going to be rethinking, AJ, data architectures in the coming decade, where data stays where it belongs, it's distributed, and you're providing access; like you said, you're separating the data from the applications, and applications, as we talked about with Fozzie, are much more portable. So really, the last 10 years will be different from the next 10 years. >> Definitely. I think the people-cost point is huge. Gone are the days where you needed a dozen people governing and managing data policies; a lot of that repetitive work, those tasks, can now be automated. We've seen examples in insurance where teams of 15 people working in the back office trying to apply security controls and compliance were reduced down to just a couple of people looking at the exceptions that don't fit. And that's really important, because maybe two years ago the emphasis was on regulatory compliance of data, with policies such as GDPR and CCPA; last year it was very much the economic effect of reduced headcounts, with enterprises running lean and looking to reduce that cost.
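The copy reduction AJ describes, keeping one trusted copy and spotting the duplicates, can be sketched with simple content hashing. This is an illustrative sketch only; it is not Io Tahoe's actual approach, which the guests describe as machine-learning driven, and the sample records are invented:

```python
import hashlib

def fingerprint(record: bytes) -> str:
    """Content hash used to spot byte-identical copies of a record."""
    return hashlib.sha256(record).hexdigest()

def find_duplicates(datasets: dict[str, list[bytes]]) -> dict[str, list[str]]:
    """Map each content fingerprint to the locations holding a copy of it."""
    seen: dict[str, list[str]] = {}
    for location, records in datasets.items():
        for record in records:
            seen.setdefault(fingerprint(record), []).append(location)
    # Keep only fingerprints that appear in more than one place.
    return {h: locs for h, locs in seen.items() if len(locs) > 1}

# Example: the same (made-up) PLM record copied into two environments.
stores = {
    "on_prem": [b"plm:part-42 rev-A"],
    "cloud_a": [b"plm:part-42 rev-A", b"orders:1001"],
    "cloud_b": [b"orders:1002"],
}
dupes = find_duplicates(stores)
print(list(dupes.values()))  # -> [['on_prem', 'cloud_a']]
```

Each duplicate location surfaced this way is a candidate for consolidation back to the single version of the truth, which is where the storage and compute savings mentioned above come from.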
This year we can already see some of the more proactive companies looking at initiatives such as net-zero emissions, using data to understand how they can have a better social impact, and driving that across all of their operations and supply chain. So for those regulatory compliance issues that may have been external, we see similar patterns emerging for internal initiatives that benefit the environment and social impact. >> Great perspectives. Jeff Hammerbacher once famously said that the best minds of his generation were trying to get people to click on ads, and AJ, those examples you just gave of social good and moving things forward are really critical. I think that's where data is going to have the biggest societal impact. Okay, guys, great conversation. Thanks so much for coming on the program; really appreciate your time. Keep it right there for more insight and conversation around creating a resilient digital business model. You're watching theCUBE. >> Announcer: Digital resilience: automated compliance, privacy, and security for your multi-cloud. Congratulations, you're on the journey. You have successfully transformed your organization by moving to a cloud-based platform to ensure business continuity in these challenging times. But as you scale your digital activities, there is an inevitable influx of users that outpaces traditional methods of cybersecurity, exposing your data to underlying threats and making your company susceptible to ever greater risk. To become digitally resilient, have you applied controls to your data continuously throughout the data lifecycle? What are you doing to keep your customer and supplier data private and secure? Io Tahoe's automated sensitive data discovery is pre-programmed with over 300 existing policies that meet government-mandated risk and compliance standards.
These automate the process of applying policies and controls to your data. Our algorithm-driven recommendation engine alerts you to risk exposure at the data level and suggests the appropriate next steps to remain compliant and ensure sensitive data is secure. Unsure about where your organization stands in terms of digital resilience? Sign up for our minimal-cost-commitment free data health check. Let us run our sensitive data discovery on key unmapped data silos and sources to give you a clear understanding of what's in your environment. Book time with an Io Tahoe engineer now. >> Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience within an as-a-service consumption model. We're now joined by Yusuf Khan, who heads data services for Io Tahoe, and Shirish, who's the vice president and head of U.S. sales at Happiest Minds. Gents, welcome to the program; great to have you in theCUBE. >> Thank you, David. >> Shirish, you guys talk about Happiest Minds being born digital, born agile. I like that. But talk about your mission at the company. >> Sure. Formed in 2011, Happiest Minds is a born-digital, born-agile company. The reason is that we are focused on customers; our customer-centric approach and delivery of seamless digital solutions have helped us be in the race along with the Tier 1 providers. Our mission, happiest people, happiest customers, is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the Great Place to Work survey, with a Glassdoor rating of 4.1 out of 5, which is among the top of the Indian IT services companies. That shows the mission and the culture. We have built the company on our values: sharing, mindfulness, integrity, learning, and social responsibility are our core values, and that's where the entire culture of the company has been built. >> That's great. That sounds like a happy place to be. Now, Yusuf, you head up data services for Io Tahoe; we've talked in the past, and of course you're out of London. What's your day-to-day focus with customers and partners? What are you focused on? >> Well, David, my team works daily with customers and partners to help them better understand their data, improve their data quality and their data governance, and help them make that data more accessible, in a self-service kind of way, to the stakeholders within those businesses. And this is all a key part of digital resilience, which we'll come on to talk about later. >> Right. That self-service theme is something we're going to really accelerate this decade, Yusuf. But before we get into that, maybe you could talk about the nature of the partnership with Happiest Minds. Why do you guys choose to work closely together? >> Very good question. We see Io Tahoe and Happiest Minds as a great mutual fit. As Shirish has said, Happiest Minds is a very agile organization, and I think that's one of the key things that attracts their customers. Io Tahoe is all about automation: we're using machine learning algorithms to make data discovery, data cataloging, and understanding data much easier, and we're enabling customers and partners to do it much more quickly. So when you combine our emphasis on automation with the emphasis on agility that Happiest Minds has, that's a really nice combination that works very well together; very powerful. I think the other thing that's key is that both businesses, as Shirish has said, are really innovative, digital-native type companies, very focused on newer technologies, the cloud, et cetera.
And then finally, I think they're both challenger brands, and Happiest Minds has a really positive, fresh, ethical approach to people and customers that really resonates with us at Io Tahoe, too. >> Great, thank you for that. So, Shirish, let's get into the whole notion of digital resilience. I want to set it up with what I see, and maybe you can comment. Prior to the pandemic, a lot of customers kind of equated disaster recovery with their business continuity or business resilience strategy, and that changed almost overnight. How have you seen your clients respond to what I sometimes call the forced march to become a digital business? And maybe you could talk about some of the challenges they faced along the way. >> Absolutely. Especially during these pandemic times, Dave, customers have been having a tough time managing their business. Happiest Minds, being a digitally resilient company, was able to react much faster than others in the industry, apart from the other services companies. One of the key things is that organizations are trying to adopt digital technologies; there has been a lot of data to be managed by these customers, and a lot of threats and risks to be managed by the CIOs and CISOs. With Happiest Minds' digital resilience offering, where we bring in data compliance as a service, we were able to manage resilience well ahead of other competitors in the market. We were able to bring in our business continuity processes from day one and deliver our services without any interruption to our customers. That is where digital resilience, with business continuity processes enabled, was very helpful for us in letting our customers continue their business without interruptions during the pandemic. >> So some of the challenges customers tell me about: they obviously had to figure out how to get laptops to remote workers and handle that whole work-from-home pivot, and figure out how to secure the endpoints. Looking back, those were kind of table stakes. But it sounds like a digital business means a data business, putting data at the core, as I like to say. So I wonder if you could talk a little more about your philosophy toward digital resilience and the specific approach you take with clients. >> Absolutely. You see, for any organization data becomes the key, and so the first step is to identify the critical data. This is a six-step process that we follow at Happiest Minds. First of all, we take stock of the current state: though customers think they have clear visibility of their data, we do an assessment from an external point of view and see how critical their data is. Then we help the customers strategize; the most important thing is to identify the most critical assets, and with data being the most critical asset for any organization, identification of the data is key for the customers. Third, we help build a viable operating model to ensure those identified critical assets are secured and monitored regularly, so they are consumed well as well as protected from external threats. Then, as the fourth step, we bring in awareness: we train people at all levels of the organization, that is the P for people, to understand the importance of these digital assets. As the fifth step, we work out a backup plan, bringing in a comprehensive and holistic testing approach across people, process, and technology to see how the organization can withstand a crisis. And finally, we do continuous governance of this data, which is key; it is not just a one-step process.
We set up the environment, we do the initial analysis, we set up the strategy, and then we continuously govern that data to ensure it is not only managed and secured well but also meets the compliance requirements of the organization. That is where we help organizations secure data and meet regulations, as per the privacy laws. So this is a constant process, not a one-time effort, because every organization goes on its own digital journey and has to face all of this as part of an evolving environment; that's where they should be kept ready, in terms of recovering, rebounding, and moving forward if things go wrong. >> So let's stick on that for a minute, and then I want to bring Yusuf into the conversation. You mentioned compliance and governance. When you're a digital business you are, as you say, a data business, and that brings up issues: data sovereignty, governance, compliance, things like the right to be forgotten, data privacy; so many things. These were often afterthoughts for businesses, bolted on, if you will, and I know a lot of executives are very much concerned that they be built in; and it's not a one-shot deal. So do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics. >> Sure. We have offered multiple services to our customers around digital resilience, and one of the key services is data compliance as a service. Here we help organizations map their key data against data compliance requirements. Some of the features include continuous discovery of data, because organizations keep adding data as they go more digital, and understanding the actual data in terms of its residency, across heterogeneous data sources.
They could be databases, or data lakes, or even on-premise or in any cloud environment, so identifying the data across those heterogeneous environments is a key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevailing laws have to be mapped based on the business rules, so we define those rules and help map that data so organizations know how critical their digital assets are. Then we do continuous monitoring of the data for anomalies, because that's one of the key features of the solution, and it needs to run on a day-to-day operational basis; we help monitor those anomalies as part of data quality management on an ongoing basis. And finally, we also bring in automated data governance, where we can manage the sensitive-data policies and the data relationships, map and manage the business rules, and drive recommendations that suggest appropriate actions for customers to take on specific data sets. >> Great. Thank you, Yusuf, thanks for being patient. I want to bring Io Tahoe into the discussion and understand where your customers and Happiest Minds can leverage the data automation capability that you and I have talked about in the past. It would be great if you had an example as well, but maybe you could pick it up from there. >> Sure. At a high level, as Shirish articulated, Io Tahoe delivers business agility by accelerating the time to operationalize data, automating, putting controls in place, and actually helping put digital resilience in place. If we step back a little in time, traditional resilience in relation to data often meant manually making multiple copies of the same data. You'd have a DBA;
they would copy the data to various different places, and business users would access it in those functional silos. And of course what happened was you ended up with lots of different copies of the same data around the enterprise; very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach, because it's very hard to know where everything is. And I liked that expression you used, David, the idea of the forced march to digital: with enterprises that are going on this forced march, what they're finding is they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and containers enable a big leap forward: you can break applications down into microservices, updates are available via APIs, and so you don't have the same need to build and manage multiple copies of the data. You have an opportunity to just have a single version of the truth. Then your challenge is how you deal with the large legacy data estates that Shirish has been referring to, where you have to consolidate, and that's really where Io Tahoe comes in. We massively accelerate the process of putting a single version of the truth in place, by automatically discovering the data, discovering what's duplicate and what's redundant; that means you can consolidate down to a single trusted version much more quickly. We've seen many customers who have tried to do this manually, and it has literally taken years, using manual methods, to cover even a small percentage of their IT estates. With Io Tahoe you can do it very quickly, and you can have tangible results within weeks and months. And then you can apply controls to the data based on context: who's the user, what's the content, what's the use case, things like data quality validations or access permissions. Then, once you've done that,
your applications and your enterprise are much more secure and much more resilient as a result. You've got to do these things while retaining agility, though, so coming full circle, this is where the partnership with Happiest Minds really comes in as well. You've got to be agile, you've got to have controls, and you've got to drive toward the business outcomes; it's doing those three things together that really delivers for the customer. >> Thank you, Yusuf. You and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved; we know that you can't scale that way, but there's also that compression of time to get to the next step and ultimately to the outcome. We've talked to a number of customers in theCUBE, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver, reducing complexity, automating, and getting to insights faster. That's where you see telephone numbers in terms of business impact. So my question is, where should customers start? How can they take advantage of some of these opportunities that we've discussed today? >> Well, we've tried to make that easy for customers. With Io Tahoe and Happiest Minds, you can very quickly do what we call a data health check. This is a two-to-three-week process to really quickly start to understand and deliver value from your data. Io Tahoe deploys into the customer environment; the data doesn't go anywhere. We look at a few data sources and a sample of data, and we can very rapidly demonstrate how data discovery, cataloging, and understanding duplicate and redundant data can be done using machine learning, and how those problems can be solved.
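The policy-driven discovery step of such a health check can be illustrated with a small rule-based scanner. This is a hedged sketch with two made-up regex policies; the actual product is described as shipping hundreds of policies and using machine learning, which this does not attempt to reproduce:

```python
import re

# Two illustrative policies (hypothetical; real products ship hundreds).
POLICIES = {
    "email_address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn":        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def discover(records: list[str]) -> dict[str, list[int]]:
    """Return, per policy, the indices of records that match it."""
    hits: dict[str, list[int]] = {name: [] for name in POLICIES}
    for i, text in enumerate(records):
        for name, pattern in POLICIES.items():
            if pattern.search(text):
                hits[name].append(i)
    return {name: idxs for name, idxs in hits.items() if idxs}

rows = [
    "order 1001 shipped",
    "contact: jane.doe@example.com",
    "applicant ssn 123-45-6789",
]
print(discover(rows))  # -> {'email_address': [1], 'us_ssn': [2]}
```

Running a scan like this over a sample of a few sources is, in miniature, what a short discovery engagement does: it tells you where sensitive data sits before any controls are applied.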
And so what we tend to find is that we can very quickly, as I say in a matter of a few weeks, show a customer how they can get to a more resilient outcome, and then how they can scale that up, take it into production, really understand their data estate better, and build resilience into the enterprise. >> Excellent. There you have it. We'll leave it right there. Guys, great conversation; thanks so much for coming on the program. Best of luck to you and the partnership. Be well. >> Thank you, David. >> Thank you. >> And thank you for watching, everybody. This is Dave Vellante for theCUBE and our ongoing series on data automation with Io Tahoe. >> Okay, now we're going to go into the demo. We want to get a better understanding of how you can leverage OpenShift and Io Tahoe to facilitate faster application deployment. Let me pass the mic to Sabina. Take it away. >> Thanks, Dave. Happy to be here again, guys. As Dave mentioned, I'm Sabina, the enterprise account executive here at Io Tahoe. Today we just wanted to give you a general overview of how we're using OpenShift. >> Hey, I'm Noah, Io Tahoe's data operations engineer. I've been learning the internals of OpenShift for the past few months, and I'm here to share what I've learned. >> Okay, so before we begin, I'm sure everybody wants to know: Noah, what are the benefits of using OpenShift? >> Well, there are five that I can think of: faster time to operation, simplicity, automation, control, and digital resilience. >> That's really interesting, because those are the exact same benefits that we at Io Tahoe deliver to our customers. But let's start with faster time to operation. By running Io Tahoe on OpenShift, is it faster than, say, using Kubernetes on other platforms? >> Our objective is for Io Tahoe to be accessible across multiple cloud platforms, and by hosting our application in containers we're able to achieve this. So to answer your question: it's faster to create and use your application images using container tooling like Kubernetes with OpenShift, as compared to Kubernetes with Docker, CRI-O, or containerd. >> Okay, we got a bit technical there. Can you explain that in a bit more detail? >> Yeah, there's a bit of vocabulary involved. Basically, containers are used in developing things like databases, web servers, or applications such as Io Tahoe.
What's great about containers is that they split up the workload, so developers can select their libraries without breaking anything, and sysadmins can update the host without interrupting the programmers. Now, OpenShift works hand in hand with Kubernetes to provide a way to build those containers for applications. >> Okay, got it. So basically containers make life easier for developers and sysadmins. How does OpenShift differ from other platforms? >> Well, this kind of leads into the second benefit I want to talk about, which is simplicity. Basically, there are a lot of steps involved when using Kubernetes with Docker, but OpenShift simplifies this with its source-to-image process, which takes source code and turns it into a container image. But that's not all: OpenShift has a lot of automation and features that simplify working with containers, an important one being its web console. Here I've set up a light version of OpenShift called CodeReady Containers, and I was able to set up our application right from the web console; and I was able to do this entire thing on Windows, Mac, and Linux, so it's environment-agnostic in that sense. >> Okay, so I see in the top left that this is a developer's view. What would a systems admin view look like? >> Good question. Here's the administrator view, and this kind of ties into the benefit of control. This view gives insights into each one of the applications and containers that are running, and you can make changes without affecting deployment. You can also, within this view, set up each layer of security, and there are multiple layers you can put up, though I haven't fully messed around with that, because with my luck I'd probably lock myself out. >> That seems pretty secure. Is there a single point of security, such as a user login, or are there multiple layers of security? >> Yeah, there are multiple layers of security.
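The layered, role-based access Noah alludes to can be sketched in miniature. This is a generic illustration of role-based checks, not OpenShift's actual RBAC, which is configured declaratively through roles and role bindings in the cluster; the users and roles here are invented:

```python
# Minimal role-based access control sketch: users -> roles -> permissions.
# Generic illustration only; OpenShift's real RBAC is declarative cluster config.
ROLE_PERMISSIONS = {
    "admin":     {"view", "edit", "delete", "manage_security"},
    "developer": {"view", "edit"},
    "viewer":    {"view"},
}

USER_ROLES = {"dave": "admin", "noah": "developer"}

def allowed(user: str, action: str) -> bool:
    """Check whether a user's role grants the requested action."""
    role = USER_ROLES.get(user)
    return action in ROLE_PERMISSIONS.get(role, set())

print(allowed("noah", "edit"))             # -> True
print(allowed("noah", "manage_security"))  # -> False
```

The "multiple layers" point is that a check like this sits on top of the login layer and alongside container-level isolation, rather than replacing either.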
There's your user login, security groups, and general role-based access controls, but there's also a ton of layers of security around the containers themselves; for the sake of time I won't get too far into it. >> Okay, so you mentioned simplicity and time to operation as two of the benefits. You also briefly mentioned automation, and as you know, automation is the backbone of our platform here at Io Tahoe, so that certainly grabbed my attention. Can you go a bit more in depth on automation? >> OpenShift provides extensive automation that speeds up time to operation. The latest versions of OpenShift come with a built-in CRI-O container engine, which basically means you get to skip the container-engine installation step, and you don't have to log into each individual container host and configure networking, registry servers, storage, et cetera. So I'd say it automates the more boring, tedious processes. >> Okay, so I see the Io Tahoe template there. What does it allow me to do, in terms of automation in application development? >> We've created an OpenShift template that contains our application, and it allows developers to instantly set up our product from that template. >> So, Noah, last question: speaking of vocabulary, you mentioned earlier digital resilience, a term we're hearing especially in the banking and finance world. It seems from what you've described that industries like banking and finance would be more resilient using OpenShift, correct? >> Yeah. In terms of digital resilience, OpenShift gives you better control over the resources each container is consuming. In addition, the benefit of containers is that, as I mentioned earlier, sysadmins can troubleshoot servers without bringing down the application, and if the application does go down, it's easy to bring it back up using templates and the other automation features that OpenShift provides.
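An application template of the sort described here can be thought of as a parameterized manifest. The structure below is a hypothetical, stripped-down illustration rendered with Python's standard library, not Io Tahoe's actual template or the full OpenShift template schema; all names and the registry URL are placeholders:

```python
import json

def render_template(app_name: str, image: str, replicas: int = 1) -> dict:
    """Build a minimal, hypothetical deployment manifest from a few parameters,
    mimicking how a template stamps out an application from its inputs."""
    return {
        "kind": "Deployment",
        "metadata": {"name": app_name},
        "spec": {
            "replicas": replicas,
            "template": {
                "spec": {"containers": [{"name": app_name, "image": image}]}
            },
        },
    }

manifest = render_template("iotahoe-demo", "registry.example.com/iotahoe:latest", replicas=2)
print(json.dumps(manifest, indent=2))
```

Because the manifest is generated from parameters rather than hand-edited, redeploying after a failure is the "easy to bring it back up" property Noah describes: rerun the template with the same inputs.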
Okay, thanks so much, Noah. Any final thoughts you want to share? >> Yeah, I just want to give a quick recap of the five benefits you gain by using OpenShift: time to operation, automation, control, security, and simplicity. You can deploy applications faster, you can simplify the workload, you can automate a lot of the otherwise tedious processes, you can maintain full control over your workflow, and you can assert digital resilience within your environment. >> Guys, thanks for that; appreciate the demo. I wonder, you've been talking about the combination of Io Tahoe and Red Hat: can you tie that into digital resilience specifically, Sabina? >> Yeah, sure, Dave. When we speak to the benefits of security controls in terms of digital resilience, at Io Tahoe we automate detection and apply controls at the data level, so this provides for more enhanced security. >> Okay, but if you were trying to do all these things manually, what does that do? How much time can I compress? What's the time to value? >> With our latest versions of Io Tahoe, we're taking advantage of the faster deployment times associated with containerization and Kubernetes. This speeds up the time it takes for customers to start using our software, as they're able to quickly spin up Io Tahoe in their own on-premise environment or in their own cloud environment, including AWS, Azure, Google Cloud Platform, and IBM Cloud; quick-start templates allow the flexibility to deploy into multi-cloud environments with just a few clicks. >> Okay, and let me just quickly add: what we've done here at Io Tahoe is really move our customers away from the whole idea of needing a team of engineers to apply controls to data, as compared to other manually driven workflows. With templates, automation, pre-programmed policies, and data controls,
one person can be fully operational within a few hours and achieve results straight out of the box, on any cloud. >> Yeah, we've been talking about this theme of abstracting the complexity; that's really what we're seeing as a major trend in this coming decade. Okay, great. Thanks, Sabina. Noah, how can people get more information, or if they have any follow-up questions, where should they go? >> Yeah, sure, Dave. If you're interested in learning more, reach out to us at info@iotahoe.com to speak with one of our sales engineers. We'd love to hear from you, so book a meeting as soon as you can. >> All right, thanks, guys. Keep it right there for more CUBE content.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
David | PERSON | 0.99+ |
Jeff Hammer | PERSON | 0.99+ |
John | PERSON | 0.99+ |
Eva Hora | PERSON | 0.99+ |
David Suresh | PERSON | 0.99+ |
Sabina | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Yusuf Khan | PERSON | 0.99+ |
Europe | LOCATION | 0.99+ |
London | LOCATION | 0.99+ |
2021 | DATE | 0.99+ |
two | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Dave Volonte | PERSON | 0.99+ |
Siri | TITLE | 0.99+ |
ORGANIZATION | 0.99+ | |
Fozzie | PERSON | 0.99+ |
2 | QUANTITY | 0.99+ |
five | QUANTITY | 0.99+ |
David Pleasure | PERSON | 0.99+ |
iata ho dot com | ORGANIZATION | 0.99+ |
Jay | PERSON | 0.99+ |
Five | QUANTITY | 0.99+ |
six step | QUANTITY | 0.99+ |
five benefits | QUANTITY | 0.99+ |
15 people | QUANTITY | 0.99+ |
Yousef | PERSON | 0.99+ |
$10 million | QUANTITY | 0.99+ |
This year | DATE | 0.99+ |
first step | QUANTITY | 0.99+ |
Ideo Tahoe | ORGANIZATION | 0.99+ |
last year | DATE | 0.99+ |
Andre | PERSON | 0.99+ |
hundreds | QUANTITY | 0.99+ |
One | QUANTITY | 0.99+ |
one cloud | QUANTITY | 0.99+ |
2011 | DATE | 0.99+ |
Tahoe | ORGANIZATION | 0.99+ |
Today | DATE | 0.99+ |
Noel | PERSON | 0.99+ |
Red Hat | ORGANIZATION | 0.99+ |
Prem | ORGANIZATION | 0.99+ |
today | DATE | 0.99+ |
tonight | DATE | 0.99+ |
Io Tahoe | ORGANIZATION | 0.99+ |
second benefit | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
Iota A J. | ORGANIZATION | 0.99+ |
one step | QUANTITY | 0.99+ |
both | QUANTITY | 0.98+ |
third one | QUANTITY | 0.98+ |
Siris | TITLE | 0.98+ |
Aziz | PERSON | 0.98+ |
red hat | ORGANIZATION | 0.98+ |
each layer | QUANTITY | 0.98+ |
both businesses | QUANTITY | 0.98+ |
fourth idea | QUANTITY | 0.98+ |
apple | ORGANIZATION | 0.98+ |
1/5 step | QUANTITY | 0.98+ |
Toyota Ho | ORGANIZATION | 0.98+ |
first challenge | QUANTITY | 0.98+ |
41 | QUANTITY | 0.98+ |
azure | ORGANIZATION | 0.98+ |
Io Tahoe | PERSON | 0.98+ |
One person | QUANTITY | 0.98+ |
one location | QUANTITY | 0.98+ |
single | QUANTITY | 0.98+ |
Noah | PERSON | 0.98+ |
over 300 existing policies | QUANTITY | 0.98+ |
Iot Tahoe | ORGANIZATION | 0.98+ |
Thio | PERSON | 0.98+ |
Lenox | ORGANIZATION | 0.98+ |
two years ago | DATE | 0.98+ |
A. J A. Z. | PERSON | 0.98+ |
single point | QUANTITY | 0.98+ |
first thing | QUANTITY | 0.97+ |
Yussef | PERSON | 0.97+ |
Jupiter | LOCATION | 0.97+ |
second thing | QUANTITY | 0.97+ |
three things | QUANTITY | 0.97+ |
about 20 years | QUANTITY | 0.97+ |
single cloud | QUANTITY | 0.97+ |
First | QUANTITY | 0.97+ |
Suresh | PERSON | 0.97+ |
3 week | QUANTITY | 0.97+ |
each container | QUANTITY | 0.97+ |
each cloud platform | QUANTITY | 0.97+ |
Yusef Khan & Suresh Kanniappan | Io Tahoe Enterprise Digital Resilience on Hybrid & Multicloud
>>from around the globe. It's theCUBE, presenting Enterprise Digital Resilience on Hybrid and Multicloud, brought to you by Io Tahoe. Okay, let's now get into the next segment, where we'll explore data automation, but from the angle of digital resilience within an as-a-service consumption model. We're now joined by Yusef Khan, who heads data services for Io Tahoe, and Suresh Kanniappan, who's the vice president and head of U.S. sales at Happiest Minds. Gents, welcome to the program. Great to have you in theCUBE. >>Thank you, David. >>Suresh, you guys talk about Happiest Minds, this notion of born digital, born agile. I like that. But talk about your mission at the company. >>Sure. Formed in 2011, Happiest Minds is a born-digital, born-agile company. The reason is that we are focused on customers. Our customer-centric approach and delivering digital and seamless solutions have helped us be in the race along with the Tier 1 providers. Our mission, happiest people, happiest customers, is focused on enabling customer happiness through people happiness. We have been ranked among the top 25 IT services companies in the Great Places to Work survey, with a Glassdoor rating of 4.1 against a rating of five, which is among the top of the Indian IT services companies. That shows the mission and the culture. What we have built on the values: sharing, mindfulness, integrity, learning and social responsibility are the core values of our company, and that's what the entire culture of the company has been built on. >>That's great. That sounds like a happy place to be. Now, Yusef, you head up data services for Io Tahoe. We've talked in the past; of course, you're out of London. What's your day-to-day focus with customers and partners? What are you focused on?
Well, David, my team works daily with customers and partners to help them better understand their data, improve their data quality and their data governance, and help them make that data more accessible in a self-service kind of way to the stakeholders within those businesses. And this is all a key part of digital resilience, which we'll come on to talk about later. >>You're right. I mean, that self-service theme is something that we're gonna really accelerate this decade, Yusef. But I wonder, before we get into that, maybe you could talk about the nature of the partnership with Happiest Minds. Why do you guys choose to work closely together? >>Very good question. We see Io Tahoe and Happiest Minds as a great mutual fit. As Suresh has said, Happiest Minds are a very agile organization, and I think that's one of the key things that attracts their customers. And Io Tahoe is all about automation. We're using machine learning algorithms to make data discovery, data cataloging, and understanding data redundancy much easier, and we're enabling customers and partners to do it much more quickly. So when you combine our emphasis on automation with the emphasis on agility that Happiest Minds have, that's a really nice combination; it works very well together, very powerful. I think the other things that are key, as Suresh has said, are that both are really innovative, digital-native type companies, very focused on newer technologies, the cloud, etcetera. And then finally, I think both challenger brands, Io Tahoe and Happiest Minds, have a really positive, fresh, ethical approach to people and customers, and that really resonates with us at Io Tahoe. >>Great, thank you for that. So Suresh, let's get into the whole notion of digital resilience. I wanna sort of set it up with what I see, and maybe you can comment. Prior to the pandemic,
a lot of customers kind of equated disaster recovery with their business continuance or business resilience strategy, and that's changed almost overnight. How have you seen your clients respond to that, what I sometimes call the forced march to become a digital business? And maybe you could talk about some of the challenges that they faced along the way. >>Absolutely. So, especially during these pandemic times, you see, Dave, customers have been having tough times managing their business. So Happiest Minds, being a digitally resilient company, we were able to react much faster than the other services companies in the industry. So one of the key things is the organizations trying to adapt to the digital technologies. There has been a lot of data which has to be managed by these customers, and there have been a lot of threats and risks which have to be managed by the CIOs and CISOs. So with Happiest Minds' digital resilience technology, where we're bringing data compliance as a service, we were able to manage the resilience much ahead of other competitors in the market. We were able to bring in our business continuity processes from day one, where we were able to deliver our services without any interruption to the services we were delivering to our customers. >>So >>that is where digital resilience, with business continuity processes enabled, was very helpful for us to enable our customers to continue their business without any interruptions during pandemics. >>So, I mean, some of the challenges that customers tell me about: they obviously had to figure out how to get laptops to remote workers, and that whole remote, you know, work-from-home pivot, figure out how to secure the endpoints.
And, you know, looking back, those were kind of table stakes. But it sounds like you've got, you know, a digital business means a data business, putting data at the core, I like to say. So I wonder if you could talk a little bit more about maybe the philosophy you have toward digital resilience and the specific approach you take with clients? >>Absolutely. You see, in any organization, data becomes the key, and so the first step is to identify the critical data. This is a six-step process we follow at Happiest Minds. First of all, we take stock of the current state. Though the customers think that they have clear visibility of their data, we do an assessment from an external point of view and see how critical their data is. Then we help the customers to strategize. The most important thing is to identify the most critical assets; data being the most critical asset for any organization, identification of the data is key for the customers. Then we help in building a viable operating model to ensure these identified critical assets are secured and monitored regularly, so that they are consumed well as well as protected from external threats. Then, as a fourth step, we try to bring in awareness to the people; we train them at all levels in the organization, that is the P for people, to understand the importance of these digital assets. And then, as a fifth step, we work out a backup plan in terms of bringing in a very comprehensive and holistic testing approach, on people, process, as well as technology, to see how the organization can withstand during a crisis time. And finally, we do a continuous governance of this data, which is key. It is not just a one-step process. We set up the environment.
We do the initial analysis and set up the strategy, and we continuously govern this data to ensure that it is not only managed well and secured, but also meets the compliance requirements of the organization. That is where we help organizations to secure data and meet the regulations, as per the privacy laws. >>So >>this is a constant process. It's not a one-time effort. We do a constant process because every organization goes through the digital journey; they have to face all these as part of the evolving environment on the digital journey, and that's where they should be kept ready in terms of recovering, rebounding and moving forward if things go wrong. >>So let's stick on that for a minute, and then I wanna bring Yusef into the conversation. So you mentioned compliance and governance. When you're a digital business, here, as you say, you're a data business, so that brings up issues: data sovereignty, governance, compliance, things like the right to be forgotten, data privacy, so many things. These were often kind of afterthoughts for businesses, bolted on, if you will. I know a lot of executives are very much concerned that these are built in, and it's not a one-shot deal. So do you have solutions around compliance and governance? Can you deliver that as a service? Maybe you could talk about some of the specifics there. >>So we have offered multiple services to our customers on digital resilience, and one of the key services is data compliance as a service. Here we help organizations to map the key data against the data compliance requirements. Some of the features include continuous discovery of data, right, because organizations keep adding data as they become more digital, and helping in understanding the actual data in terms of the residency of the data. It could be heterogeneous data sources:
It could be on databases, or it could even be on data lakes, or it could even be on-premise or in the cloud environment. So identifying the data across the various heterogeneous environments is a very key feature of our solution. Once we identify and classify this sensitive data, the data privacy regulations and the prevailing laws have to be mapped based on the business rules. So we define those rules and help map that data, so that organizations know how critical their digital assets are. Then we work on continuous monitoring of data for anomalies, because that's one of the key features of the solution, which needs to be implemented on a day-to-day operational basis. So we're helping monitor those anomalies of data for data quality management on an ongoing basis. And finally, we also bring in automated data governance, where we can manage the sensitive data policies and their data relationships in terms of mapping, and manage their business rules, and we drive recommendations to also suggest appropriate actions for the customers to take on those specific data sets. >>Great. Thank you. Yusef, thanks for being patient. I want to bring Io Tahoe into the discussion and understand where your customers and Happiest Minds can leverage your data automation capability that you and I have talked about in the past. And it would be great if you had an example as well, but maybe you could pick it up from there. >>Sure. I mean, at a high level, as Suresh clearly articulated, Io Tahoe delivers business agility. So that's by accelerating the time to operationalize data, automating, putting in place controls, and ultimately helping put in place digital resilience. I mean, if we step back a little bit in time, traditional resilience in relation to data was often met by manually making multiple copies of the same data. So you would have a DBA; they would copy the data to various different places, and business
users would access it in those functional silos. And of course, what happened was you ended up with lots of different copies of the same data around the enterprise: very inefficient, and of course it ultimately increases your risk profile, your risk of a data breach. It's very hard to know where everything is. And I liked that expression you used, David, the idea of the forced march to digital. So with enterprises that are going on this forced march, what they're finding is they don't have a single version of the truth, and almost nobody has an accurate view of where their critical data is. Then you have containers, and with containers that enables a big leap forward, so you can break applications down into microservices. Updates are available via APIs, and so you don't have the same need to build and to manage multiple copies of the data. So you have an opportunity to just have a single version of the truth. Then your challenge is, how do you deal with these large legacy data estates that Suresh has been referring to, where you have to consolidate? And that's really where Io Tahoe comes in. We massively accelerate that process of putting a single version of the truth into place. So by automatically discovering the data, discovering what's duplicate, what's redundant, that means you can consolidate it down to a single trusted version much more quickly. We've seen many customers who have tried to do this manually, and it's literally taken years using manual methods to cover even a small percentage of their IT estates. With Io Tahoe, you can do it really very quickly, and you can have tangible results within weeks and months. And then you can apply controls to the data based on context: who's the user, what's the content, what's the use case? Things like data quality validations or access permissions. And then, once you've done that, your applications and your enterprise are much more secure, much more resilient.
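The automatic discovery of duplicate and redundant copies described here can be pictured with a small, hypothetical sketch: fingerprint each column by hashing its contents, so exact copies group together no matter which system they live in. This is an illustration of the general idea only, not Io Tahoe's implementation, and every source and column name in it is made up:

```python
import hashlib
from collections import defaultdict

# Hypothetical sketch: fingerprint each column by hashing its sorted values,
# so exact duplicate copies across data stores fall into the same group.
def fingerprint(values):
    # Sort first so that row order does not change the fingerprint.
    joined = "\x1f".join(sorted(str(v) for v in values))
    return hashlib.sha256(joined.encode()).hexdigest()

def find_duplicates(datasets):
    """datasets: {source.table.column: [values]} -> groups of duplicate columns."""
    groups = defaultdict(list)
    for name, values in datasets.items():
        groups[fingerprint(values)].append(name)
    return [sorted(names) for names in groups.values() if len(names) > 1]

datasets = {
    "crm.customers.email": ["a@b.com", "c@d.org"],
    "dwh.cust_copy.email": ["c@d.org", "a@b.com"],  # same data, reordered
    "erp.orders.order_id": [101, 102, 103],
}
print(find_duplicates(datasets))  # [['crm.customers.email', 'dwh.cust_copy.email']]
```

A real discovery service would also catch near-duplicates and subsets, but even this exact-match pass shows how redundant copies can surface automatically rather than through manual survey work.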
As a result, you've got to do these things whilst retaining agility, though. So, coming full circle, this is where the partnership with Happiest Minds really comes in as well. You've got to be agile, you've got to have controls, and you've got to drive towards the business outcomes; and it's doing those three things together that really delivers for the customer. >>Thank you, Yusef. I mean, you and I, in previous episodes, have looked in detail at the business case. You were just talking about the manual labor involved; we know that doesn't scale. But also there's that compression of time to get to the next step in terms of ultimately getting to the outcome. And we've talked to a number of customers in theCUBE, and the conclusion is really consistent: if you can accelerate the time to value, that's the key driver. Reducing complexity, automating and getting to insights faster, that's where you see telephone numbers in terms of business impact. So my question is, where should customers start? I mean, how can they take advantage of some of these opportunities that we've discussed today? >>Well, we've tried to make that easy for customers. So with Io Tahoe and Happiest Minds, you can very quickly do what we call a data health check. This is a two-to-three-week process to really quickly start to understand and deliver value from your data. So Io Tahoe deploys into the customer environment; data doesn't go anywhere. We would look at a few data sources and a sample of data, and we can very rapidly demonstrate how data discovery, data cataloging, and understanding duplicate and redundant data can be done, using machine learning, and how those problems can be solved. And so what we tend to find is that we can very quickly, as I say, in a matter of a few weeks, show a customer how they can get to a more resilient outcome, and
then how they can scale that up, take it into production, and then really understand their data estate better and build resilience into the enterprise. >>Excellent. There you have it. We'll leave it right there, guys. Great conversation. Thanks so much for coming on the program. Best of luck to you in the partnership. Be well. >>Thank you, David. >>Thank you. >>Thank you for watching, everybody. This is Dave Vellante for theCUBE and our ongoing series on data automation with Io Tahoe.
SUMMARY :
Great to have you in the Cube. But talk about your mission at the company. digital born a child company. I t services company in the great places to work serving hour glass to ratings mission on the culture. What do you what's your day to day focus To the stakeholders within those businesses on dis is all a key part of digital of the partnership with happiest minds. So when you combine our emphasis I sometimes called the forced march to become a digital business. So one of the key things that is where the digital resilience with business community process enabled was very putting data at the core, I like to say, but so I wonder if you could talk a little bit more about maybe for the first step is to identify the critical data. They have to face all these as part off the evolving environment So do you have solutions around compliance and governance? So identifying the data across the various no heterogeneous is well, but maybe you could pick it up from there. So by automatically discovering the data, um, And the conclusion is really consistent that if you could accelerate the time to value, So with our Tahoe and happiest minds, you can very quickly do what we call Best of luck to you in the partnership. Thank you. you for watching everybody, This is Dave Volonte for the Cuban Are ongoing Siris on data Automation without
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
David | PERSON | 0.99+ |
Yusuf Khan | PERSON | 0.99+ |
Yusef Khan | PERSON | 0.99+ |
2 | QUANTITY | 0.99+ |
London | LOCATION | 0.99+ |
Suresh Kanniappan | PERSON | 0.99+ |
Yousef | PERSON | 0.99+ |
one step | QUANTITY | 0.99+ |
Dave Volonte | PERSON | 0.99+ |
first step | QUANTITY | 0.99+ |
2011 | DATE | 0.99+ |
1/5 step | QUANTITY | 0.99+ |
4.1 | QUANTITY | 0.99+ |
Yussef | PERSON | 0.99+ |
Iot Tahoe | ORGANIZATION | 0.99+ |
both | QUANTITY | 0.99+ |
both businesses | QUANTITY | 0.98+ |
one | QUANTITY | 0.98+ |
two | QUANTITY | 0.98+ |
five | QUANTITY | 0.98+ |
single | QUANTITY | 0.98+ |
Dave | PERSON | 0.98+ |
1/6 | QUANTITY | 0.98+ |
today | DATE | 0.97+ |
3 weeks | QUANTITY | 0.97+ |
Suresh | PERSON | 0.97+ |
Jupiter | LOCATION | 0.96+ |
Io Tahoe | ORGANIZATION | 0.96+ |
one shot | QUANTITY | 0.96+ |
single version | QUANTITY | 0.96+ |
Russia | LOCATION | 0.96+ |
1/4 step | QUANTITY | 0.96+ |
First | QUANTITY | 0.96+ |
Siris | TITLE | 0.96+ |
Tahoe | PERSON | 0.94+ |
Cube | ORGANIZATION | 0.93+ |
Iota | ORGANIZATION | 0.92+ |
day one | QUANTITY | 0.9+ |
one time | QUANTITY | 0.88+ |
Iota Ho | ORGANIZATION | 0.87+ |
three things | QUANTITY | 0.85+ |
Brazilian | OTHER | 0.84+ |
Tier one | QUANTITY | 0.84+ |
forced | EVENT | 0.82+ |
Shirish County | LOCATION | 0.81+ |
Seo | PERSON | 0.81+ |
Cuban | OTHER | 0.81+ |
Tahoe | ORGANIZATION | 0.73+ |
Bean | PERSON | 0.72+ |
Iota | TITLE | 0.69+ |
pandemic | EVENT | 0.67+ |
U. S. Sales | ORGANIZATION | 0.66+ |
top 25 I t | QUANTITY | 0.64+ |
Thio | PERSON | 0.61+ |
Io | ORGANIZATION | 0.57+ |
Indian | OTHER | 0.55+ |
teachers | QUANTITY | 0.55+ |
Andi | PERSON | 0.54+ |
minute | QUANTITY | 0.53+ |
CEO | PERSON | 0.52+ |
Onda | LOCATION | 0.51+ |
Cube | COMMERCIAL_ITEM | 0.45+ |
service | QUANTITY | 0.45+ |
march | EVENT | 0.44+ |
nineties | DATE | 0.41+ |
Sabita Davis and Patrick Zeimet | Io-Tahoe Adaptive Data Governance
>>from around the globe. It's theCUBE, presenting Adaptive Data Governance, brought to you by Io-Tahoe. In this next segment, we're gonna be talking to you about getting to know your data, and specifically you're gonna hear from two folks at Io-Tahoe. We've got enterprise account exec Sabita Davis here, as well as enterprise data engineer Patrick Zeimet. They're gonna be sharing insights and tips and tricks for how you can get to know your data, and quickly. We also want to encourage you to engage with Sabita and Patrick: use the chat feature to the right, send comments, questions or feedback, so you can participate. All right, Sabita and Patrick, take it away. >>All right. Thanks, Lisa. Great to be here. As Lisa mentioned, guys, I'm the enterprise account executive here at Io-Tahoe. And you, Pat? >>Yeah. Hey, everyone, so great to be here. As she said, my name's Patrick Zeimet. I'm the enterprise data engineer here at Io-Tahoe, and we're so excited to be here and talk about this topic, as one thing we're really trying to perpetuate is that data is everyone's business. >>I couldn't agree more, Pat. So, guys, Pat and I have actually had multiple discussions with clients from different organizations with different roles, so we spoke with both your technical and your non-technical audience. And while they were interested in different aspects of our platform, we found that what they had in common was they wanted to make data easy to understand and usable. So that comes back to Pat's point of it being everybody's business, because no matter your role, we're all dependent on data. So what Pat and I wanted to do today was walk you guys through some of those client questions, slash pain points, that we're hearing from different industries and different roles, and demo how our platform here at Io-Tahoe is used for automating those data-related tasks. So with that said, are you ready for the first one, Pat? >>Yeah, let's do it. >>Great.
So I'm gonna put my technical hat on for this one. So I'm a data practitioner; I just started my job at ABC Bank. I have over 100 different data sources, so I have data kept in data lakes, legacy data sources, even the cloud. So my issue is, I don't know what those data sources hold, I don't know what data is sensitive, and I don't even understand how that data is connected. So how can Io-Tahoe help? >>Yeah, I think that's a very common experience many are facing, and definitely something I've encountered in my past. Typically, the first step is to catalog the data and then start mapping the relationships between your various data stores. Now, more often than not, this is tackled through numerous meetings and a combination of Excel and something similar to Visio, which are two great tools in their own right, but they're very difficult to maintain, just due to the rate that we are creating data in the modern world. It starts to beg for an idea that can scale with your business needs, and this is where a platform like Io-Tahoe becomes so appealing. You can see here a visualization of the data relationships created by the Io-Tahoe service. Now, what is fantastic about this is that it's not only laid out in a very human and digestible format; in the same action of creating this view, the data catalog was constructed. >>Um, so is the data catalog automatically populated? >>Correct. >>Okay, so what I'm getting is that using Io-Tahoe, I get this complete, unified, automated platform, without the added cost, of course. >>Exactly. And that's at the heart of Io-Tahoe. A great feature with that data catalog is that Io-Tahoe will also profile your data as it creates the catalog, assigning some meaning to those pesky column_1s and custom_variable_10s that are always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exist.
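The relationship mapping described here, with system-defined connections plus machine-predicted ones, can be pictured with a small sketch. This is a hypothetical illustration of one common technique (scoring candidate joins by value overlap), not Io-Tahoe's actual algorithm, and all table and column names in it are made up:

```python
# Hypothetical sketch: score candidate relationships between table columns
# by value overlap (Jaccard similarity). Not Io-Tahoe's actual algorithm.
from itertools import combinations

def jaccard(a, b):
    """Similarity of two columns: size of the value intersection over the union."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def predict_relationships(tables, threshold=0.5):
    """tables: {table: {column: [values]}} -> (col_a, col_b, score) candidates."""
    columns = [(f"{t}.{c}", vals)
               for t, cols in tables.items() for c, vals in cols.items()]
    candidates = []
    for (name_a, vals_a), (name_b, vals_b) in combinations(columns, 2):
        score = jaccard(vals_a, vals_b)
        if score >= threshold:
            candidates.append((name_a, name_b, round(score, 2)))
    return sorted(candidates, key=lambda x: -x[2])

tables = {
    "customers": {"customer_id": [1, 2, 3, 4], "name": ["Ann", "Bo", "Cy", "Di"]},
    "orders":    {"customer_id": [2, 3, 4, 5], "order_id": [10, 11, 12, 13]},
}
print(predict_relationships(tables))
# [('customers.customer_id', 'orders.customer_id', 0.6)]
```

A production catalog would layer much more signal on top (column names, data types, profiling statistics, and trained models), but even this simple overlap score is enough to surface a customer-ID style link between tables, the kind of predicted connector discussed below.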
Personally, I'm a big fan of this view, as it really just helps the eye naturally jump to these focal points that coincide with these key columns. Following that train of thought, let's examine the customer ID column that seems to be at the center of a lot of these relationships. We can see that it's a fairly important column, as it's maintaining the relationship between at least three other tables. Now, you'll notice all the connectors are in this blue color. This means that they're system-defined relationships. But Io-Tahoe goes that extra mile and actually creates these orange-colored connectors as well. These are ones that our machine learning algorithms have predicted to be relationships, and you can leverage them to try and make new and powerful relationships within your data. So I hope that answers the first part of your question. >>So this is really cool, and I can see how this could be leveraged quickly. Now, what if I added new data sources, or have multiple data sources, and needed to identify what data is sensitive? Can Io-Tahoe detect that? >>Yeah, definitely. Within the Io-Tahoe platform, there are already over 300 predefined policies, such as HIPAA, FERPA, CCPA and the like. One can choose which of these policies to run against their data, allowing for flexibility and efficiency in running the policies that affect your organization. >>Okay, so 300 is an exceptional number, I'll give you that. But what about internal policies that apply to my organization? Is there any ability for me to write custom policies? >>Yeah, that's no issue, and it's something that clients leverage fairly often. To utilize this function, one simply has to write a regex, which our team has helped many deploy. After that, the custom policy is stored for future use to profile sensitive data. One then selects the data sources they're interested in and selects the policies that meet their particular needs.
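A regex-driven policy of the kind just described can be sketched in a few lines. The policy names, patterns, and hit-rate threshold below are illustrative assumptions for the sketch, not Io-Tahoe's built-in rules:

```python
import re

# Hypothetical sketch of regex-driven sensitive-data tagging; the policy
# names and patterns below are illustrative, not Io-Tahoe's actual rules.
POLICIES = {
    "US_SSN":  re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL":   re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CARD_16": re.compile(r"\b\d{4}([ -]?)\d{4}\1\d{4}\1\d{4}\b"),
}

def tag_column(values, policies=POLICIES, min_hit_rate=0.5):
    """Tag a column with every policy matching at least min_hit_rate of values.
    Each discovery would then be reviewed and confirmed or rejected by a person."""
    tags = []
    for name, pattern in policies.items():
        hits = sum(bool(pattern.search(str(v))) for v in values)
        if values and hits / len(values) >= min_hit_rate:
            tags.append(name)
    return tags

col = ["alice@example.com", "bob@example.org", "n/a", "carol@example.net"]
print(tag_column(col))  # ['EMAIL'] -- 3 of 4 values match
```

Using a hit rate rather than a single match keeps a stray email inside a free-text column from mislabeling the whole column, which is why the confirm-or-reject review flow the conversation goes on to describe still matters.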
The interface will automatically tag your data according to the policies it detects, after which you can review the discoveries, confirming or rejecting the tagging. All of these insights are easily exported through the interface, so one can work these into the action items within your project management systems. And I think this lends itself to collaboration, as a team can work through the discoveries simultaneously, and as each item is confirmed or rejected, they can see it near instantaneously. All this translates to a confidence that with Io-Tahoe you can be sure you're in compliance. >>Um, so I'm glad you mentioned compliance, because that's extremely important to my organization. >>So >>what you're saying is, when I use the Io-Tahoe automated platform, we'd be 90% more compliant than before, compared to if we were going to be using a manual, human-driven process. >>Yeah, definitely. The collaboration and documentation that the Io-Tahoe interface lends itself to can really help you build that confidence that your compliance is sound. >>Does >>that answer your question about sensitive data? >>Definitely. So, Pat, I have the next question for you. So we're planning a migration, and I have a set of reports I need to migrate. But what I need to know is, well, what data sources are those reports dependent on, and what's feeding those tables? >>Yeah, that's a fantastic question, Sabita. Identifying critical data elements and the interdependencies within the various databases can be a time-consuming but vital process in a migration initiative. Luckily, Io-Tahoe does have an answer, and again, it's presented in a very visual format. >>So what I'm looking at here is my entire data landscape. >>Yes, exactly. >>So let's say I add another data source. I can still see that unified 360 view. >>Yeah. One feature that is particularly helpful is the ability to add data sources after the data lineage
discovery has finished, allowing for the flexibility and scope necessary for any data migration project. Whether you only need to select a few databases or your entire estate, this service will provide the answers you're looking for. This visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections are driven by both system-defined flows as well as those predicted by our algorithms, the confidence of which can actually be customized to make sure that they're meeting the needs of the initiative that you have in place. Now, this also provides tabular output in case you need it for your own internal documentation or for your action items, which we can see right here. In this interface, you can actually also confirm or reject the predicted relationships, to make sure that the data is as accurate as possible. Does that help with your data lineage needs? >>Definitely. So, Pat, my next big question here is, now I know a little bit about my data, but how do I know I can trust it? So what I'm interested in knowing, really, is: is it in a fit state for me to use? Is it accurate? Does it conform to the right format? >>Yeah, that's a great question. I think that is a pain point felt across the board, be it by data practitioners or data consumers alike. Another service that Io-Tahoe provides is the ability to write custom data quality rules and understand how well the data conforms to these rules. This dashboard gives a unified view of the strength of these rules and your data's overall quality. >>Okay, so, Pat, on the accuracy scores there: if my marketing team needs to run a campaign, can we depend on those accuracy scores to know which tables have quality data to use for our marketing campaign? >>Yeah, this view would allow you to understand your overall accuracy as well as dive into the minutiae to see which data elements are of the highest quality.
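Rule-based accuracy scoring of this kind can be sketched in miniature. The rules and columns here are hypothetical illustrations of the idea (a score is simply the fraction of values passing each column's rule), not Io-Tahoe's dashboard logic:

```python
import re

# Hypothetical sketch: score each column against its data quality rule as
# the share of values passing. The rules below are illustrative only.
RULES = {
    "email": lambda v: re.fullmatch(r"[\w.+-]+@[\w-]+\.[\w.]+", str(v)) is not None,
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def accuracy_scores(table, rules=RULES):
    """table: {column: [values]} -> {column: fraction of values passing its rule}."""
    scores = {}
    for column, check in rules.items():
        values = table.get(column, [])
        if values:
            scores[column] = sum(check(v) for v in values) / len(values)
    return scores

table = {"email": ["a@b.com", "bad-address", "c@d.org", "e@f.net"],
         "age":   [34, 29, -5, 61]}
print(accuracy_scores(table))  # {'email': 0.75, 'age': 0.75}
```

A per-column breakdown like this is what lets a team depend on only the columns their campaign actually needs, rather than requiring the whole table to score perfectly.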
So for that marketing campaign, if you need everything in a strong form, you'll be able to see that very quickly with these high-level numbers. But if you're only dependent on a few columns to get that information out the door, you can find that within this view. >>So you >>no longer have to rely on reports about reports, but instead just come to this one platform to help drive conversations between stakeholders and data practitioners. I hope that helps answer your questions about data quality. >>Oh, definitely. So I have another one for you here, Pat. So I get now the value Io-Tahoe brings by automatically capturing all that technical metadata from sources. But how do we match that with the business glossary? >>Yeah, within the same data quality service that we just reviewed, one can actually add business rules detailing the definitions and the business domains that these fall into. What's more is that the data quality rules we were just looking at can then be tied into these definitions, allowing insight into the strength of these business rules. It is this service that empowers stakeholders across the business to be involved with the data life cycle and take ownership over the rules that fall within their domain. >>Okay, so those custom rules, can I apply them across data sources? >>Yeah. You can bring in as many data sources as you need, so long as you can tie them to that unified definition. >>Okay, great. Thanks so much, Pat. And we just want to quickly say to everyone working in data: we understand your pain, so please feel free to reach out to us via our website or the chat, and let's get a conversation started on how Io-Tahoe can help you automate all those manual tasks to help save you time and money. Thank you. >>Thank you, Erin. >>And Pat, if I could ask you one quick question: how do you advise customers? You just walked through this great banking example, you and Sabita.
How do you advise customers to get started? >>Yeah, I think the number one thing customers can do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as these quality rules, as well as identifying those kinds of tricky columns that might exist in your data, like those custom 'variable_n'-style columns I mentioned before. >>Last question, Savita: anything to add to what Pat just described as a starting place? >>No, I think Pat actually summed that up pretty well. Just by automating all those manual tasks, it can definitely save your company a lot of time and money, so we encourage you to just reach out to us and get that conversation started. >>Excellent. Savita and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data and make sure that it's quality data, so that you can maximize the value of it. Thanks for watching.
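The starting point Pat suggests, running discovery to populate a catalog and surface ambiguously named columns for review, might look roughly like the sketch below. The source layout, the path convention, and the "ambiguous name" pattern are assumptions for illustration, not Io-Tahoe's actual interface:

```python
import re

# Illustrative pattern for generically named columns (attribute1, var_10, ...).
AMBIGUOUS = re.compile(r"^(var|variable|attribute|field|col)_?\d+$", re.I)

def inventory(sources):
    """Return catalog entries plus a review list of ambiguously named columns.

    `sources` maps source name -> table name -> list of column names."""
    entries, review = [], []
    for source, tables in sources.items():
        for table, columns in tables.items():
            for col in columns:
                path = f"{source}.{table}.{col}"
                entries.append(path)
                if AMBIGUOUS.match(col):
                    review.append(path)
    return entries, review

sources = {
    "crm": {"customers": ["id", "email", "attribute1"]},
    "billing": {"loans": ["loan_id", "var_10", "start_date"]},
}
entries, review = inventory(sources)
print(len(entries), review)  # 6 ['crm.customers.attribute1', 'billing.loans.var_10']
```

The review list is where a tool would then bring value-level profiling to bear, rather than leaving it to a spreadsheet.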
IO TAHOE EPISODE 4 DATA GOVERNANCE V2
>>From around the globe, it's theCUBE, presenting adaptive data governance, brought to you by Io-Tahoe. >>And we're back with the Data Automation series. In this episode, we're going to learn more about what Io-Tahoe is doing in the field of adaptive data governance, how it can help achieve business outcomes and mitigate data security risks. I'm Lisa Martin, and I'm joined by Ajay Vohora, the CEO of Io-Tahoe, and Lester Waters, the CTO of Io-Tahoe. Gentlemen, it's great to have you on the program. >>Thank you, Lisa. It's good to be back. >>Great to see you. >>Likewise; very socially distant, of course, as we are. Lester, we're going to start with you. What's going on at Io-Tahoe? What's new? >>Well, I've been with Io-Tahoe for a little over a year, and one thing I've learned is that every customer's needs are just a bit different. So we've been working on the next major release of the Io-Tahoe product to really address these customer concerns, because we want to be flexible enough to come in and not just profile the data, and not just understand data quality and lineage, but also address the unique needs of each and every customer that we have. That required a platform rewrite of our product, so that we could extend the product without building a new version of it each time; we wanted to be able to have pluggable modules. We also focused a lot on performance. That's very important with the bulk of data that we deal with, so that we're able to pass through that data in a single pass and do the analytics that are needed, whether it's lineage, data quality, or just identifying the underlying data. And we're incorporating all that we've learned: we're tuning up our machine learning, and we're analyzing on more dimensions than we've ever done before. We're able to do data quality without doing an initial regex, for example, just out of the box.
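The single-pass analytics Lester mentions can be sketched as one loop that accumulates several profile metrics at once, instead of re-scanning the data per metric. The row layout and the choice of metrics here are illustrative assumptions, not Io-Tahoe's implementation:

```python
# One pass over the data, accumulating several profile metrics at once.

def profile_single_pass(rows):
    """Collect null counts, distinct counts, and min/max per column in one scan."""
    stats = {}
    for row in rows:
        for col, value in row.items():
            s = stats.setdefault(col, {"nulls": 0, "distinct": set(),
                                       "min": None, "max": None})
            if value is None:
                s["nulls"] += 1
                continue
            s["distinct"].add(value)
            s["min"] = value if s["min"] is None else min(s["min"], value)
            s["max"] = value if s["max"] is None else max(s["max"], value)
    return {c: {"nulls": s["nulls"], "distinct": len(s["distinct"]),
                "min": s["min"], "max": s["max"]} for c, s in stats.items()}

rows = [{"age": 34, "city": "Leeds"},
        {"age": None, "city": "York"},
        {"age": 41, "city": "Leeds"}]
print(profile_single_pass(rows))  # one scan yields nulls, distinct counts, ranges
```

With large estates, scanning once and fanning out the metrics is what keeps profiling tractable.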
So I think it's all of these things coming together to form the next version of our product, and we're really excited by it. >>So it's exciting. AJ, from the CEO's level, what's going on? >>Wow, just building on what Lester mentioned there: we're growing pretty quickly with our partners, and today, here with Oracle, we're excited to explain how that's shaping up. There's lots of collaboration already with Oracle in government, in insurance, and in banking, and we're excited because we get to have an impact. It's really satisfying to see how we're able to help businesses transform and redefine what's possible with their data, and having Oracle there as a partner to lean in with is definitely helping. >>Excellent. We're going to dig into that a little bit later. Lester, let's go back over to you. Explain adaptive data governance; help us understand that. >>Really, adaptive data governance is about achieving business outcomes through automation. It's also about establishing a data-driven culture and pushing what's traditionally managed in IT out to the business. And to do that, you've got to enable an environment where people can actually access and look at the information about the data, not necessarily access the underlying data itself, because we've got privacy concerns. But they need to understand what kind of data they have, what shape it's in, and what's dependent on it upstream and downstream, so that they can make educated decisions on what they need to do to achieve those business outcomes. A lot of frameworks these days are hardwired, so you can set up a set of business rules, and that set of business rules works for a very specific database and a specific schema.
But imagine a world where you could just say, you know, the start date of a loan must always be before the end date of the loan, and have that generic rule apply regardless of the underlying database, even when a new database comes online. That's what adaptive data governance is about. I like to think of it as the intersection of three circles, really: the technical metadata coming together with policies and rules, and coming together with the business ontologies that are unique to that particular business. Bringing all of this together allows you to enable rapid change in your environment. So it's a mouthful, adaptive data governance, but that's what it comes down to. >>So, AJ, help me understand this: is this something enterprise companies are doing now, or are they not quite there yet? >>Well, you know, Lisa, I think every organization is going at its own pace. But markets are changing, and the speed at which some of the changes in the economy are happening is compelling more businesses to look at being more digital in how they serve their own customers. So what we're seeing is a number of trends here: heads of data, chief data officers, and CEOs stepping back from a one-size-fits-all approach, because they've tried that before and it just hasn't worked. They've spent millions of dollars on IT programs trying to drive value from their data, and they've ended up with large teams doing manual processing around data, trying to hardwire these policies to fit the context of each line of business, and that hasn't worked. So the trends we're seeing emerge really relate to: how do I, as a chief data officer or a CEO, inject more automation into a lot of these common tasks? And we've been able to see that impact. I think the news here is, you know, if you're trying to create a knowledge graph, a data catalog, or a business glossary,
and you're trying to do that manually, we'll stop you: you don't have to do that manually anymore. The best example I can give is that Lester and I like Chinese food and Japanese food, and if you were sitting there with your chopsticks, you wouldn't eat the bowl of rice one grain at a time. What you'd want to do is find a more productive way to enjoy that meal before it gets cold. That's similar to how we're able to help organizations digest their data: getting through it faster and enjoying the benefits of putting that data to work. >>And if it were me eating that food with you guys, I would not be using chopsticks; I would be using a fork and probably a spoon. So, Lester, how does Io-Tahoe go about doing this and enabling customers to achieve it? >>Let me show you a little story I have here. If you take a look at the challenges most customers have, they're very similar, but every customer is on a different data journey. It all starts with: what data do I have? What shape is that data in? How is it structured? What's dependent on it upstream and downstream? What insights can I derive from that data? And how can I answer all of those questions automatically? If you look at the challenges for these data professionals, they're either on a journey to the cloud, maybe they're doing a migration to Oracle, or maybe they're making some data governance changes, and it's about enabling all of this. So, looking at these challenges, I'm going to take you through a story here. I want to introduce Amanda. Amanda is not unlike anyone in any large organization: she looks around and just sees stacks of data. Different databases, the ones she knows about, the ones she doesn't know about but should know about, various different kinds of databases. And Amanda is tasked with understanding all of this so that she can embark on her data journey program.
So Amanda goes through and says, great, I've got some handy tools; I can start looking at these databases and getting an idea of what we've got. Well, as she digs into the databases, she starts to see that not everything is as clear as she might have hoped it would be. Property names or column names have ambiguous names like attribute1 and attribute2, or maybe date1 and date2. So Amanda is starting to struggle, even though she's got tools to visualize and look at these databases. She knows she's got a long road ahead, and with 2,000 databases in her large enterprise, it's going to be a long journey. But Amanda is smart, so she pulls out her trusty spreadsheet to track all of her findings, and for what she doesn't know about, she raises a ticket or maybe tries to track down the owner to find out what the data means, tracking all this information. Clearly, this doesn't scale well for Amanda. So maybe the organization will get ten Amandas to divide and conquer that work, but even that doesn't work that well, because there are still ambiguities in the data. With Io-Tahoe, what we do is actually profile the underlying data. By looking at the underlying data, we can quickly see that attribute1 looks very much like a US Social Security number and attribute2 looks like an ICD-10 medical code. We do this by using ontologies and dictionaries and algorithms to help identify the underlying data and then tag it. Key to doing this automation is really being able to normalize things across different databases, so that where there are differences in column names, I know that, in fact, they contain the same data. And by going through this exercise with Io-Tahoe, not only can we identify the data, but we can also gain insights about the data.
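The profiling-and-tagging step Lester describes, matching column values against dictionaries and patterns, can be sketched roughly as below. The two patterns and the 90% threshold are simplified assumptions for illustration; Io-Tahoe's real ontologies and algorithms are far richer:

```python
import re

# Hypothetical value patterns; real ontologies would carry many more.
PATTERNS = {
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "icd10_code": re.compile(r"^[A-TV-Z]\d{2}(\.\d{1,4})?$"),
}

def tag_column(values, threshold=0.9):
    """Tag a column when most of its non-null values match a known pattern."""
    non_null = [str(v) for v in values if v is not None]
    if not non_null:
        return None
    for tag, pattern in PATTERNS.items():
        hits = sum(1 for v in non_null if pattern.match(v))
        if hits / len(non_null) >= threshold:
            return tag
    return None

print(tag_column(["123-45-6789", "987-65-4321"]))  # us_ssn
print(tag_column(["E11.9", "I10", "J45.909"]))     # icd10_code
```

Tags produced this way are what let two differently named columns in two databases be recognized as the same kind of data.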
So, for example, we can see that 97% of the time, that column named attribute1 that holds US Social Security numbers has something that looks like a Social Security number, but 3% of the time it doesn't quite look right: maybe there's a dash missing, maybe there's a digit dropped, or maybe there are even characters embedded in it. That may be indicative of a data quality issue, so we try to find those kinds of things. Going a step further, we also try to identify data quality relationships. For example, we have two columns, date1 and date2. Through observation, we can see that date1 is less than date2 99% of the time; the 1% of the time it's not is probably indicative of a data quality issue. Going a step further still, we can build a business rule that says date1 must be less than date2, and then when the problem pops up again, we can quickly identify and remediate it. So these are the kinds of things that we can do with Io-Tahoe. Going even a step further, you could take your favorite data science solution, productionize it, and incorporate it into our next version as what we call a worker process, to do your own bespoke analytics. >>Bespoke analytics, excellent. Lester, thank you. So, AJ, talk us through some examples of where you're putting this to use, and also, what is some of the feedback from customers? >>I think it will help to bring this to life a little bit, Lisa, if we talk through a case study we've pulled together; I know it's available for download. At a well-known telecommunications and media company, they had a lot of the issues that Lester just spoke about: lots of teams of Amandas, super-bright data practitioners, who were looking to get more productivity out of their day and deliver a good result for their own customers, cell phone subscribers and broadband users.
So some of the examples that we can see here are how we went about auto-generating a lot of that understanding of that data within hours. Amanda had her data catalog populated automatically, with a business glossary built up on it, and could really then start to see: okay, where do I want to apply some policies to the data to set in place some controls, and where do we want to adapt how different lines of business, maybe tax versus customer operations, have different access or permissions to that data. What we've been able to do there is to build up that picture, to see how data moves across the entire organization, across the estate, and to monitor that over time for improvement. So they've taken it from being reactive, 'let's do something to fix something,' to now being more proactive: we can see what's happening with our data, who's using it, who's accessing it, how it's being used, and how it's being combined. From there, taking a proactive approach is a really smart use of the talents in that telco organization and of the folks that work there with data. >>Okay, AJ, dig into that a little bit deeper. One of the things I was thinking about when you were talking through some of those outcomes that you're helping customers achieve is ROI. How do customers measure ROI? What are they seeing with Io-Tahoe's solution? >>Yeah, right now the big-ticket item is time to value. In data, a lot of the upfront investment has been quite expensive, as it is today with a lot of the larger vendors and technologies. So what a CDO and an economic buyer really need to be certain of is: how quickly can I get that ROI? I think we've got something we can show, a before-and-after, and it really comes down to hours, days, and weeks where we've been able to have that impact, and in this playbook that we pulled together, the before-and-after picture really shows that.
You know, those savings are achieved through providing data in some actionable form within hours and days to drive agility, while at the same time being able to enforce the controls that protect the use of that data and who has access to it. So time is the number one thing I'd have to say, and we can see that on the graphic that we've just pulled up here. >>We talk about achieving adaptive data governance. Lester, you guys talk about automation; you talk about machine learning. How are you seeing those technologies being a facilitator of organizations adopting adaptive data governance? >>Well, as we see it, the days of manual effort are over. I think, you know, this is a multi-step process, but the very first step is understanding what you have and normalizing that across your data estate. So you couple this with the ontologies that are unique to your business, then run the algorithms, and you basically go across and identify and tag that data; that allows the next steps to happen. So now I can write business rules, not in terms of named columns, but in terms of the tags. Being able to automate that is a huge time saver, and so is the fact that we can suggest that as a rule, rather than waiting for a person to come along and say, 'Oh, wow, okay, I need this rule, I need this rule.' These are steps that decrease that time to value that AJ talked about. And then, lastly, you couple that with machine learning, because even with great automation and being able to profile all of your data and get a good understanding, that only brings you to a certain point: there are still ambiguities in the data. So, for example, I might have two columns, date1 and date2. I may have even observed that date1 should be less than date2, but I don't really know what date1 and date2 are, other than dates.
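The rule suggestion Lester describes, turning an observed "date1 is almost always less than date2" pattern into a proposed rule with the exceptions flagged, can be sketched like this. The 95% support threshold and the rule wording are illustrative assumptions:

```python
from datetime import date

def propose_order_rule(pairs, threshold=0.95):
    """If a <= b holds for nearly all pairs, propose it as a rule and
    return the exceptions as likely data-quality issues."""
    observed = [(a, b) for a, b in pairs if a is not None and b is not None]
    if not observed:
        return None, []
    violations = [(a, b) for a, b in observed if a > b]
    support = 1 - len(violations) / len(observed)
    if support >= threshold:
        return "start_date <= end_date", violations
    return None, violations

loans = [
    (date(2020, 1, 1), date(2021, 1, 1)),
    (date(2019, 6, 1), date(2020, 6, 1)),
    (date(2022, 3, 1), date(2020, 3, 1)),  # suspicious: reversed dates
] + [(date(2018, 1, 1), date(2019, 1, 1))] * 97
rule, bad = propose_order_rule(loans)
print(rule, len(bad))  # start_date <= end_date 1
```

Once suggested, such a rule can be applied wherever the same tags appear, without a person having to notice the pattern first.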
So this is where it comes in: I might ask the user, can you help me identify what date1 and date2 are in this table? It turns out they're a start date and an end date for a loan. That gets remembered and cycled into the machine learning, so if I start to see this pattern of date1 and date2 elsewhere, I'm going to ask: is it a start date and an end date? Bringing all these things together, with all this automation, is really what's key to enabling this data governance. >>Great, thanks. Lester and AJ, I want to wrap things up with something that you mentioned in the beginning about what you guys are doing with Oracle. Take us out by telling us what you're doing there; how are you working together? >>Yeah, I think those of us who've worked in IT for many years have learned to trust Oracle's technology, and they're shifting now to a hybrid on-prem-and-cloud Generation 2 platform, which is exciting, with their existing customers and new customers moving to Oracle on a journey. So Oracle came to us and said, you know, we can see how quickly you're able to help change mindsets, and mindsets are often locked into a way of thinking around operating models of IT that may be non-agile and somewhat siloed, wanting to break free of that and adopt a more agile, API-driven approach. A lot of the work that we're doing with Oracle now is around accelerating what customers can do with understanding their data, and building digital apps by identifying the underlying data that has value. Being able to do that in hours, days, and weeks, rather than many months, is opening the eyes of chief data officers and CEOs to say: well, maybe we can do this whole digital transformation this year; maybe we can bring that forward and transform who we are as a company. That's driving innovation, which we're excited about.
And I know Oracle is keen to drive that through with us.
>>So, Sanjay, go talk to me about what adaptive data governance means to you. And how does it help the first Bank of Nigeria to be able to innovate faster with the data that you have? >>Yes, I like that concept off adaptive data governor, because it's kind of Ah, I would say an approach that can really happen today with the new technologies before it was much more difficult to implement. So just to give you a little bit of context, I I used to work in consulting for 16, 17 years before joining the president of Nigeria, and I saw many organizations trying to apply different type of approaches in the governance on by the beginning early days was really kind of a year. A Chicago A. A top down approach where data governance was seeing as implement a set of rules, policies and procedures. But really, from the top down on is important. It's important to have the battle off your sea level of your of your director. Whatever I saw, just the way it fails, you really need to have a complimentary approach. You can say bottom are actually as a CEO are really trying to decentralize the governor's. Really, Instead of imposing a framework that some people in the business don't understand or don't care about it, it really needs to come from them. So what I'm trying to say is that data basically support business objectives on what you need to do is every business area needs information on the detector decisions toe actually be able to be more efficient or create value etcetera. Now, depending on the business questions they have to solve, they will need certain data set. So they need actually to be ableto have data quality for their own. For us now, when they understand that they become the stores naturally on their own data sets. And that is where my bottom line is meeting my top down. 
You can guide them from the top, but they themselves need to be empowered and, in a way, flexible to adapt to the different questions that they have, in order to be able to respond to the business needs. I cannot impose everything from the top for everyone; I need them to adapt and to bring their own answers to their own business questions. That is adaptive data governance. And all of that is possible, as I was saying at the very beginning, just to finalize the point, because we have new technologies that allow you to do this: metadata classification in a very sophisticated way, so that you can actually create analytics on your metadata. You can understand your different data sources in order to be able to create those classifications, like nationalities, a way of classifying your customers, your products, et cetera. >>So one of the things that you just said, Santiago, kind of struck me: to enable the users to be adaptive, they probably don't want to be logging support tickets. So how do you support that sort of self-service, to meet the demand of the users so that they can be adaptive? >>More and more, business users want autonomy, and they basically want to be able to grab the data and answer their own questions. Now, when you have that, it's great, because then you have demand, businesses asking for data, asking for insight. So how do you actually support that? I would say there is a changing culture that is happening more and more, and even the current pandemic has helped a lot with that, because, of course, technology has been one of the biggest winners: without technology, we couldn't have been working remotely, without these technologies where people can actually work from their homes and still have data marketplaces where they self-serve their information. But even beyond that, data is a big winner.
Data, because the pandemic has shown us that crises happen, that we cannot predict everything, and that we are actually facing new kinds of situations out of our comfort zone, where we need to explore, we need to adapt, and we need to be flexible. How do we do that? With data. Every single company either saw its revenue going down, or saw its revenue going very much up, for those companies that are already very digital. It changed reality, so they needed to adapt, and for that they needed information, in order to think and innovate and try to create responses. So that type of self-service of data, that hunger for data, in order to be able to understand what's happening when the landscape is changing, is something that is becoming more of a topic today, because of the pandemic and because of the new capabilities, the technologies that allow it. And then you are able to help your 'data citizens,' as I call them, the people in the organization who know the business and can actually start playing with data and answering their own questions. So these are technologies that give more accessibility to the data: there is some cataloging, so they can understand where to go or what to find; there is lineage and relationships; all of this is basically the new type of platform and tooling that allows you to create what is called a data marketplace. I think these new tools are really strong, because they now allow people who are not technology or IT people to play with data, because it comes in the digital world they're used to. To give an example with Io-Tahoe: you have a very interesting search functionality, where if you want to find your data, you go to that search and you actually go and look for your data. Everybody knows how to search in Google; everybody searches the Internet. So this is part of the data culture, the digital culture, and they know how to use those tools. Now, similarly, in that data marketplace you can, for example, see which data sources are the most used.
Now, similarly, that data marketplace is, uh, in you can, for example, see which data sources they're mostly used >>and enabling that speed that we're all demanding today during these unprecedented times. Goodwin, I wanted to go to you as we talk about in the spirit of evolution, technology is changing. Talk to us a little bit about Oracle Digital. What are you guys doing there? >>Yeah, Thank you. Um, well, Oracle Digital is a business unit that Oracle EMEA on. We focus on emerging countries as well as low and enterprises in the mid market, in more developed countries and four years ago. This started with the idea to engage digital with our customers. Fear Central helps across EMEA. That means engaging with video, having conference calls, having a wall, a green wall where we stand in front and engage with our customers. No one at that time could have foreseen how this is the situation today, and this helps us to engage with our customers in the way we were already doing and then about my team. The focus of my team is to have early stage conversations with our with our customers on digital transformation and innovation. And we also have a team off industry experts who engaged with our customers and share expertise across EMEA, and we inspire our customers. The outcome of these conversations for Oracle is a deep understanding of our customer needs, which is very important so we can help the customer and for the customer means that we will help them with our technology and our resource is to achieve their goals. >>It's all about outcomes, right? Good Ron. So in terms of automation, what are some of the things Oracle's doing there to help your clients leverage automation to improve agility? So that they can innovate faster, which in these interesting times it's demanded. >>Yeah, thank you. Well, traditionally, Oracle is known for their databases, which have bean innovated year over year. 
So, the latest launch and innovation is the Autonomous Database and Autonomous Data Warehouse. For our customers, this means a reduction in operational costs by 90%, with a multi-model converged database and machine-learning-based automation for full lifecycle management. Our database is self-driving: this means we automate database provisioning, tuning, and scaling. The database is self-securing: this means ultimate data protection and security. And it's self-repairing: it automates failure detection, failover, and repair. And then the question is, for our customers, what does it mean? It means they can focus on their business instead of maintaining their infrastructure and their operations. >>That's absolutely critical. Yusuf, I want to go over to you now. Some of the things that we've talked about are the massive progression in technology and the evolution of that. But we know that whether we're talking about data management or digital transformation, a one-size-fits-all approach doesn't work to address the challenges that the business has and that the IT folks have. As you're looking across the industry, with what Santiago told us about First Bank of Nigeria, what are some of the changes that Io-Tahoe is seeing throughout the industry? >>Well, Lisa, I think the first way I'd characterize it is to say that the traditional, top-down approach to data, where you have almost a data policeman who tells you what you can and can't do, just doesn't work anymore. It's too slow, and it's too resource-intensive. Data management, data governance, digital transformation itself: it has to be collaborative, and there has to be a personalization to data users. In the environment we find ourselves in now, it has to be about enabling self-service as well. A one-size-fits-all model, when it comes to those things around data, doesn't work. As Santiago was saying, it needs to be adapted to how the data is used.
And who is using it. In order to do this, enterprises and organizations really need to know their data. They need to understand what data they hold, where it is, and what the sensitivity of it is. They can then, in a more agile way, apply appropriate controls and access, so that people and groups within businesses are agile and can innovate. Otherwise, everything grinds to a halt and you risk falling behind your competitors. >> Yeah, that one-size-fits-all term just doesn't apply when you're talking about adaptability and agility. So we heard from Santiago about some of the impact that they're making with First Bank of Nigeria. Yusuf, talk to us about some of the business outcomes that you're seeing other customers achieve leveraging automation that they could not achieve before. >> It's automatically being able to classify terabytes, terabytes of data, or even petabytes of data across different sources to find duplicates, which you can then remediate and delete. Now, with the capabilities that Io-Tahoe offers, and that Oracle offers, you can do things that aren't just a five-times or ten-times improvement; it actually enables you to do projects, full stop, that otherwise would fail or that you would just not be able to do. I mean, classifying multi-terabyte and multi-petabyte estates across different sources and formats, very large volumes of data: in many scenarios, you just can't do that manually. We've worked with government departments, and the issues there are the result of fragmented data. There's a lot of different sources, there's a lot of different formats, and without these newer technologies to address that with automation and machine learning, the project isn't doable. But now it is, and that can lead to a revolution in some of these businesses and organizations. >> To enable that revolution, there's got to be the right cultural mindset, and Santiago was talking about folks really kind of adopting that.
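The cross-source duplicate detection Yusuf describes above, over an estate too large to tackle manually, is commonly approached by hashing normalized records so that the same entity matches across differently formatted sources. A minimal sketch follows; the field names and source names are invented for illustration, and this is not Io-Tahoe's actual implementation:

```python
import hashlib
from collections import defaultdict

def record_fingerprint(record, keys=("name", "email")):
    """Hash a normalized subset of fields so the same entity matches
    across differently formatted sources (the keys are illustrative)."""
    normalized = "|".join(str(record.get(k, "")).strip().lower() for k in keys)
    return hashlib.sha256(normalized.encode()).hexdigest()

def find_duplicates(sources):
    """Group records from many sources by fingerprint; any group with
    more than one member is a candidate for remediation or deletion."""
    groups = defaultdict(list)
    for source_name, records in sources.items():
        for r in records:
            groups[record_fingerprint(r)].append((source_name, r))
    return {h: g for h, g in groups.items() if len(g) > 1}

# Two sources holding the same person with cosmetic differences.
sources = {
    "crm":  [{"name": "Ada Lovelace", "email": "ADA@example.com"}],
    "lake": [{"name": "ada lovelace ", "email": "ada@example.com"}],
}
print(len(find_duplicates(sources)))  # 1
```

At real scale the fingerprints would be computed in a distributed job and the grouping done in a key-value store, but the shape of the computation is the same.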
That's the thing I always call getting comfortably uncomfortable, and that's hard for organizations to do. The technology is here to enable it, but when you're talking with customers, Yusuf, how do you help them build the trust and the confidence that the new technologies and new approaches can deliver what they need? How do you help drive that kind of change in the culture? >> It's a really good question, because it can be quite scary. I think the first thing we'd start with is to say: look, the technology is here, with businesses like Io-Tahoe and, um, like Oracle, it's already arrived. What you need to be comfortable doing is experimenting, being agile around it, and trying new ways of doing things. If you don't want to get left behind, Santiago and the team at FBN are a great example of embracing it, testing it on a small scale, and then scaling up. At Io-Tahoe, we offer what we call a data health check, which can actually be done very quickly, in a matter of a few weeks. We'll work with a customer, pick a use case, install the application, analyze the data, and drive out some quick wins. For instance, we worked in the last few weeks with a large energy supplier, and in about 20 days we were able to give them an accurate understanding of their critical data elements, help apply data protection policies, minimize copies of the data, and work out what data they needed to delete to reduce their infrastructure spend. So it's about experimenting on that small scale, being agile, and then scaling up in a very modern way. >> Great advice. Santiago, I'd like to go back to you as we look again at that topic of culture and the need to get that mindset in place to facilitate these rapid changes. I want to understand, kind of a last question for you, how you're doing that from a digital transformation perspective. We know everything is accelerating in 2020.
So how are you building resilience into your data architecture, and also driving that cultural change that can help everyone in this shift to remote working and all of the digital challenges and changes we're going through? >> The new technologies allow us to discover the data in any way, to flow and see information very quickly, to have new models of governance over the data, and to give autonomy to our different data units. From that autonomy, they can then compose and innovate in their own ways. So for me, when we're talking about resilience: in a way, autonomy and flexibility in an organization, in a data structure, in a platform, give you resilience. The organizations and the business units that I have seen working well in the pandemic are those that, because people are not physically present in the office anymore, give them their autonomy, let them actually engage on their own side, do their own job, and trust them. And as you give them that, they start innovating and they start having really interesting ideas. So autonomy and flexibility, I think, are key components of the new infrastructure, and of the new reality. It shows us that, yes, we used to be very structured: policies and procedures were very important. But now we have learned flexibility and adaptability at the same time. Then, when you have that, another key component of resilience is speed, because people want, you know, to access the data fast and act on it fast. Especially as things are changing so quickly nowadays, you need to be able to, you know, iterate with your information to answer your questions.
So technology that allows you to be flexible, iterating in a very fast way, will allow you to actually be resilient, because you are flexible: you adapt your work and you continue answering questions as they come, without having everything set in a structure that is too rigid. We are also a partner of Oracle, and Oracle has been great: they have embedded within the transactional system many algorithms that allow us to calculate as the transactions happen. What happens there is that when our customers engage with those algorithms, and again with Io-Tahoe as well, the machine learning that is there for speeding up the automation of how you find your data allows you to create a new alliance with the machine. The machine is there to actually, in a way, be your best friend: to have more volume of data calculated faster and, in a way, to cover more variety. I mean, we couldn't cope without being connected to these algorithms. >> That engagement is absolutely critical. Santiago, thank you for sharing that. I do want to wrap really quickly. Goodwin, one last question for you. Santiago talked about Oracle, and you've talked about it a little bit. As we look at digital resilience, talk to us in the last minute about the evolution of Oracle: what you guys are doing there to help your customers get the resilience that they have to have, to not just survive but thrive. >> Yeah. Oracle has a cloud offering for infrastructure, database, platform services, and complete solutions offered as SaaS. As Santiago also mentioned, we are using AI across our entire portfolio, and this will help our customers to focus on their business innovation and capitalize on data by enabling new business models. And Oracle has a global footprint with our cloud regions; it's massively investing in innovating and expanding the cloud.
And by offering cloud as public cloud, in our data centers, and also as private cloud with Cloud at Customer, we can meet every sovereignty and security requirement. In this way we help people to see data in new ways, discover insights, and unlock endless possibilities. And maybe one of my takeaways, if I speak with customers, is that I always tell them: you'd better start collecting your data now. We enable this, and partners like Io-Tahoe help us as well. If you collect your data now, you are ready for tomorrow. You can never collect your data backwards. So that is my takeaway for today. >> You can't collect your data backwards. Excellent. Gentlemen, thank you for sharing all of your insights. A very informative conversation. In a moment, we'll address the question: do you know your data? >> Are you interested in test-driving the Io-Tahoe platform? Kick-start the benefits of data automation for your business through the Io-Tahoe Data Health Check program: a flexible, scalable sandbox environment on the cloud of your choice, with setup, service, and support provided by Io-Tahoe. Book time with a data engineer to learn more and see Io-Tahoe in action. >> From around the globe, it's theCUBE, presenting Adaptive Data Governance, brought to you by Io-Tahoe. >> In this next segment, we're going to be talking to you about getting to know your data. Specifically, you're going to hear from two folks at Io-Tahoe: we've got enterprise account executive Sabita Davis here, as well as enterprise data engineer Patrick Simon. They're going to be sharing insights and tips and tricks for how you can get to know your data, and quickly. We also want to encourage you to engage with Sabita and Patrick: use the chat feature to the right to send comments, questions, or feedback so you can participate. All right, Sabita, Patrick, take it away. >> Thanks, Lisa. Great to be here. As Lisa mentioned, guys, I'm the enterprise account executive here at Io-Tahoe. How are you, Pat? >> Yeah.
Hey, everyone, so great to be here. As I said, my name is Patrick Simon, and I'm the enterprise data engineer here at Io-Tahoe. We're so excited to be here and talk about this topic, as one thing we're really trying to perpetuate is that data is everyone's business. >> So, guys, Pat and I have actually had multiple discussions with clients from different organizations, with different roles, so we've spoken with both technical and non-technical audiences. While they were interested in different aspects of our platform, we found that what they had in common was that they wanted to make data easy to understand and usable. That comes back to Pat's point about data being everybody's business, because no matter your role, we're all dependent on data. So what Pat and I wanted to do today was walk you guys through some of those client questions, slash pain points, that we're hearing from different industries and different roles, and demo how our platform here at Io-Tahoe is used for automating those data-related tasks. With that said, are you ready for the first one, Pat? >> Yeah, let's do it. >> Great. So I'm going to put my technical hat on for this one. I'm a data practitioner; I just started my job at ABC Bank. I have, like, over 100 different data sources. I have data kept in data lakes, legacy data sources, even the cloud. My issue is that I don't know what those data sources hold, I don't know what data is sensitive, and I don't even understand how that data is connected. So how can Io-Tahoe help? >> Yeah, I think that's a very common experience many are facing, and definitely something I've encountered in my past. Typically, the first step is to catalog the data and then start mapping the relationships between your various data stores. Now, more often than not, this is tackled through numerous meetings and a combination of Excel and something similar to Visio, which are two great tools in their own right, but they're very difficult to maintain.
Just due to the rate at which we're creating data in the modern world, it starts to beg for a solution that can scale with your business needs. And this is where a platform like Io-Tahoe becomes so appealing. You can see here a visualization of the data relationships created by the Io-Tahoe service. What is fantastic about this is that it's not only laid out in a very human and digestible format; in the same action of creating this view, the data catalog was constructed. >> So is the data catalog automatically populated? >> Correct. >> Okay, so by using Io-Tahoe, what I'm getting is this complete, unified, automated platform, without the added cost, of course. >> Exactly, and that's at the heart of Io-Tahoe. A great feature of that data catalog is that Io-Tahoe will also profile your data as it creates the catalog, assigning some meaning to those pesky column_1s and custom_variable_10s; they're always such a joy to deal with. Now, by leveraging this interface, we can start to answer the first part of your question and understand where the core relationships within our data exist. Personally, I'm a big fan of this view, as it really helps the eye naturally jump to the focal points that coincide with the key columns. Following that train of thought, let's examine the customer ID column, which seems to be at the center of a lot of these relationships. We can see that it's a fairly important column, as it's maintaining the relationship between at least three other tables. Now, you'll notice all the connectors are in this blue color: these are system-defined relationships. But Io-Tahoe goes that extra mile and actually creates these orange-colored connectors as well. These are ones that our machine learning algorithms have predicted to be relationships, and you can leverage them to try and make new and powerful relationships within your data. >> So this is really cool, and I can see how this could be leveraged quickly now.
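The catalog behavior walked through here, system-defined "blue" relationships plus machine-predicted "orange" ones, can be loosely illustrated by inferring candidate joins from column value overlap. This is a toy sketch over invented tables, not Io-Tahoe's actual algorithm; real products use far richer signals than a single similarity score:

```python
def jaccard(a, b):
    """Similarity of two column value sets; 1.0 means identical domains."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Toy "catalog": table -> {column: sampled values} (all data invented).
catalog = {
    "customers": {"customer_id": [1, 2, 3, 4], "name": ["Ann", "Bo", "Cy", "Di"]},
    "orders":    {"cust_id":     [2, 3, 4, 5], "total": [10, 20, 30, 40]},
}

def predict_relationships(catalog, threshold=0.5):
    """Suggest cross-table join candidates by value overlap, in the spirit
    of the machine-suggested 'orange' connectors described above."""
    suggestions = []
    tables = list(catalog)
    for i, t1 in enumerate(tables):
        for t2 in tables[i + 1:]:
            for c1, v1 in catalog[t1].items():
                for c2, v2 in catalog[t2].items():
                    score = jaccard(v1, v2)
                    if score >= threshold:
                        suggestions.append((f"{t1}.{c1}", f"{t2}.{c2}", round(score, 2)))
    return suggestions

print(predict_relationships(catalog))
# [('customers.customer_id', 'orders.cust_id', 0.6)]
```

A human reviewer would then confirm or reject each suggestion, exactly as the demo describes for the predicted connectors.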
What if I added new data sources, or have multiple data sources, and need to identify what data is sensitive? Can Io-Tahoe detect that? >> Yeah, definitely. Within the Io-Tahoe platform there are already over 300 predefined policies, such as HIPAA, CCPA, and the like. One can choose which of these policies to run against their data, allowing for flexibility and efficiency in running the policies that affect your organization. >> Okay, so 300 is an exceptional number, I'll give you that. But what about internal policies that apply to my organization? Is there any ability for me to write custom policies? >> Yeah, that's no issue, and it's something that clients leverage fairly often. To utilize this function, one simply has to write a regex, which our team has helped many deploy. After that, the custom policy is stored for future use. To profile sensitive data, one then selects the data sources they're interested in and selects the policies that meet their particular needs. The interface will automatically tag your data according to the policies it detects, after which you can review the discoveries, confirming or rejecting the tagging. All of these insights are easily exported through the interface, so one can work them into the action items within your project management systems. And I think this lends itself to collaboration, as a team can work through the discoveries simultaneously, and as each item is confirmed or rejected, they can see it nigh-instantaneously. All of this translates to a confidence that, with Io-Tahoe, you can be sure you're in compliance. >> I'm glad you mentioned compliance, because that's extremely important to my organization. So what you're saying is, when I use the Io-Tahoe automated platform, we'd be 90% more compliant than before, versus using a manual, human-driven approach? >> Yeah, definitely. The collaboration and documentation that the Io-Tahoe interface lends itself to really help you build that confidence that your compliance is sound.
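The write-a-regex custom policy workflow Patrick describes can be sketched roughly as below. The policy names, patterns, and match threshold here are illustrative assumptions, not Io-Tahoe's actual API or policy syntax:

```python
import re

# A "policy" here is just a name plus a compiled regex, mirroring the
# write-a-regex workflow described above (names/patterns are invented).
POLICIES = {
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email":  re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def tag_column(values, policies=POLICIES, threshold=0.8):
    """Return the policy names matching at least `threshold` of the sampled
    values; the tags would then go to a human for confirm/reject review."""
    tags = []
    for name, pattern in policies.items():
        hits = sum(1 for v in values if pattern.match(str(v)))
        if values and hits / len(values) >= threshold:
            tags.append(name)
    return tags

sample = ["123-45-6789", "987-65-4321", "000-11-2222"]
print(tag_column(sample))  # ['us_ssn']
```

Predefined packs like HIPAA or CCPA would, in this framing, simply ship as large libraries of such named patterns alongside the custom ones a team writes itself.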
>> So, we're planning a migration, and I have a set of reports I need to migrate. But what I need to know is, well, what data sources those reports are dependent on, and what's feeding those tables. >> Yeah, that's a fantastic question, Sabita. Identifying critical data elements and the interdependencies within the various databases can be a time-consuming but vital process in a migration initiative. Luckily, Io-Tahoe does have an answer, and again, it's presented in a very visual format. >> So what I'm looking at here is my entire data landscape? >> Yes, exactly. >> Let's say I add another data source; can I still see that unified 360 view? >> Yeah. One feature that is particularly helpful is the ability to add data sources after the data lineage discovery has finished, allowing for the flexibility and scope necessary for any data migration project. Whether you only need to select a few databases or your entire estate, this service will provide the answers you're looking for. The visual representation of the connectivity makes the identification of critical data elements a simple matter. The connections are driven both by system-defined flows and by those predicted by our algorithms, the confidence of which can actually be customized to make sure they're meeting the needs of the initiative you have in place. This also provides tabular output, in case you need it for your own internal documentation or for your action items, which we can see right here. In this interface, you can also confirm or reject the pair predictions, allowing you to make sure the data is as accurate as possible. Does that help with your data lineage needs? >> Definitely. So, Pat, my next big question here is: now I know a little bit about my data, but how do I know I can trust it? What I'm interested in knowing, really, is: is it in a fit state for me to use? Is it accurate? Does it conform to the right format?
>> Yeah, that's a great question, and I think it's a pain point felt across the board, be it by data practitioners or data consumers alike. Another service that Io-Tahoe provides is the ability to write custom data quality rules and understand how well the data adheres to those rules. This dashboard gives a unified view of the strength of these rules and your data's overall quality. >> Okay, so Pat, on the accuracy scores there: if my marketing team needs to run a campaign, can we depend on those accuracy scores to know which tables have quality data to use for our marketing campaign? >> Yeah, this view allows you to understand your overall accuracy, as well as dive into the minutiae to see which data elements are of the highest quality. So for that marketing campaign, if you need everything in a strong form, you'll be able to see that very quickly with these high-level numbers. But if you're only dependent on a few columns to get that information out the door, you can find that within this view too. So you no longer have to rely on reports about reports; instead, you just come to this one platform to help drive conversations between stakeholders and data practitioners. >> So I now get the value Io-Tahoe brings by automatically capturing all that technical metadata from sources. But how do we match that with the business glossary? >> Yeah, within the same data quality service that we just reviewed, one can actually add business rules detailing the definitions and the business domains that these fall into. What's more, the data quality rules we were just looking at can then be tied into these definitions, allowing insight into the strength of these business rules. It's this service that empowers stakeholders across the business to be involved with the data life cycle and take ownership of the rules that fall within their domain. >> Okay, so those custom rules, can I apply them across data sources?
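As a rough illustration of the per-column quality rules and accuracy scores discussed in this exchange: a rule can be modeled as a predicate over a column, with accuracy as the share of rows that pass. This is an invented sketch, not Io-Tahoe's actual rule syntax; the rules and rows are made up:

```python
import re

# A rule maps a column to a predicate; accuracy = share of rows that pass.
RULES = {
    "phone": lambda v: bool(re.fullmatch(r"\d{3}-\d{4}", str(v))),
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def accuracy_scores(rows, rules=RULES):
    """Per-column accuracy, like the high-level dashboard scores above."""
    scores = {}
    for col, rule in rules.items():
        values = [r[col] for r in rows if col in r]
        scores[col] = sum(map(rule, values)) / len(values) if values else None
    return scores

rows = [
    {"phone": "555-1234", "age": 41},
    {"phone": "none",     "age": 200},
]
print(accuracy_scores(rows))  # {'phone': 0.5, 'age': 0.5}
```

Tying a rule like `age` to a glossary definition ("customer age, in years, 0 to 120") is what lets a business stakeholder own the rule while practitioners consume the score.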
>> Yeah, you can bring in as many data sources as you need, so long as you can tie them to that unified definition. >> Okay, great. Thanks so much, Pat. And we just want to quickly say to everyone working in data: we understand your pain, so please feel free to reach out to us on our website and let's get a conversation started on how Io-Tahoe can help you automate all those manual tasks, to help save you time and money. Thank you. >> Thank you both. If I could ask you one quick question: how do you advise customers? You just walked through this great banking example; how do you advise customers to get started? >> Yeah, I think the number one thing that customers can do to get started with our platform is to just run the tag discovery and build up that data catalog. It lends itself very quickly to the other needs you might have, such as those quality rules, as well as identifying those kind of tricky columns that might exist in your data, those custom_variable_10s I mentioned before. >> Last question, Sabita: anything to add to what Pat just described as a starting place? >> No, I think Pat said it pretty well. I mean, just by automating all those manual tasks, it definitely can save your company a lot of time and money, so we encourage you to just reach out to us and let's get that conversation started. >> Excellent. Sabita and Pat, thank you so much. We hope you have learned a lot from these folks about how to get to know your data and make sure it's quality data, so you can maximize the value of it. Thanks for watching. >> Thanks again, Lisa, for that very insightful and useful deep dive into the world of adaptive data governance with Io-Tahoe, Oracle, and First Bank of Nigeria. This is Dave Vellante. You won't want to miss Io-Tahoe's fifth episode in the data automation series, in which we'll talk to experts from Red Hat and Happiest Minds about their best practices for managing data across hybrid cloud, inter-cloud, and multi-cloud IT environments. So mark your calendar for Wednesday, January 27th; that's episode five. You're watching theCUBE, your global leader in high tech coverage.
Ajay Vohora, Io Tahoe | Enterprise Data Automation
>> From around the globe, it's theCUBE, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Okay, we're back. Welcome back to Data Automated. Ajay Vohora is CEO of Io-Tahoe. Ajay, good to see you. How are things in London? >> Things are well; we're making progress. Good to see you, Dave. I hope you're doing well, and it's a pleasure being back here on theCUBE. >> Yeah, it's always great to talk to you. We're talking enterprise data automation. As you know, within our community we've been pounding the whole data ops conversation. This is a little different, though, and we're going to dig into that a bit. But let's start with, Ajay, how you've seen the response to COVID, and I'm especially interested in the role that data has played in this pandemic. >> Yeah, absolutely. I think everyone's adapting, both personally and in business, and the customers that I speak to day in, day out, that we partner with, are busy adapting their businesses to serve their customers. It's very much a game of ensuring that we can serve our customers to help their customers. You know, the adaptation that's happening here is trying to be more agile, trying to be more flexible. There's a lot of pressure on data, a lot of demand on data, to deliver more value to the business, to serve that customer. >> Yeah. I mean, data, machine intelligence, and cloud are really three huge factors that have helped organizations in this pandemic. And, you know, the machine intelligence, or AI, piece: that's what automation is all about. How do you see automation helping organizations evolve, maybe faster than they thought they might have to? >> Sure. I think it's the necessity of these times. There's a lot of demand for doing something with data, and a lot of businesses talk about being data-driven. It's interesting: I sort of look behind that when we work with our customers, and it's all about the customer.
Whether it's CIOs, investors, or shareholders, the common theme here is the customer. That customer experience starts and ends with data: being able to move from a point of reacting to what the customer is expecting, and taking it a step forward to where you can be proactive in serving that customer's expectations. And that's definitely come alive now in the current times. >> Yes. So, as I said, we've been talking about data ops a lot, the idea being DevOps applied to the data pipeline. But talk about enterprise data automation: what is it to you, and how is it different from data ops? >> Yeah, great question, thank you. I think we're all familiar with the growing awareness around DevOps as it's applied to the processes and methodologies that have matured over the past five years: managing change, managing application life cycles, managing software development. Data ops, you know, has been great at breaking down the silos between different roles and functions and bringing people together to collaborate. And we definitely see those tools, those methodologies, those processes, that kind of thinking, lending itself to data, which is exciting. We're excited about that, and about shifting the focus from IT versus business users to, you know, who are the data producers and who are the data consumers; in a lot of cases, they can sit in many different lines of business. So with data ops, those methods, tools, and processes, what we look to do is build on top of that with data automation. It's the nuts and bolts: the algorithms, the models behind the machine learning, the functions. That's where we invest our R&D, bringing that in to build on top of those methods and those ways of thinking that break down silos, and injecting that automation into the business processes that are going to drive a business to serve its customers. It's a layer beyond DevOps and data ops.
You can get to the point where, the way I think about it, it's the automation behind the automation. I'll give you an example: a bank where we did a lot of work to accelerate their digital transformation. What we're finding is that as we're able to automate the jobs related to data, managing that data and serving that data, that feeds into them, as a business, automating their processes for their customers. So it's definitely having a compound effect. >> Yeah. DataOps, for a lot of people who are somewhat new to the whole DevOps and DataOps thing, is a nice framework, a good methodology. There's obviously a level of automation in there and collaboration across different roles. But it sounds like you're talking about supercharging it, if you will: the automation behind the automation. You know, organizations talk about being data-driven. You hear that thrown around a lot. People sit back and say, "We don't make decisions without data." Okay, but really being data-driven has a lot of aspects to it. There's the cultural piece, but it's also putting data at the core of your organization and understanding how it affects monetization. And, as you know well, silos have been built up, whether through M&A, data sprawl, or outside data sources. So I'm interested in your thoughts on what data-driven means and, specifically, how Io-Tahoe plays
there. >> Yeah, happy to, Dave. Look, we've come a long way in the last four years. We started out by automating some of the simpler tasks that are easy to codify but have a high impact on an organization, across the data estate and the data warehouse: data-related tasks that classify data. A lot of our original patterns and AI value were built up around automating the classification of data across different sources and then serving that up for some purpose. Originally, some of those simpler challenges our customers had were around data privacy: "I've got a huge data lake here, I'm a telecoms business, I've got millions of subscribers." Quite often the chief data officer's challenge is: how do I cover the operational risk when I've got so much data that I need to simplify my approach to automating and classifying it? The reason is you can't do that manually. You can throw people at it, but the scale of it is prohibitive. Often, if you had to do it manually, by the time you got a good picture of it, it would already be out of date. Starting with those simple challenges that we've been able to address, we're then going on to build on that and ask: what else do we serve? What else do we serve the chief data officer, the chief marketing officer, and the CFO? In these times, those decision makers have a lot of choices in platform options and tooling, and they're very much looking for that Swiss army knife. Being able to do one thing really well is great, but more and more, where the cost pressure challenge is coming in, it's about how we offer more across the organization and bring in those lines-of-business activities that depend on data, not just IT. >> Okay, so on theCUBE we like to talk about: what is it, how does it work, and what's the business impact? We've covered what it is, but I'd love to get into the tech a little bit in terms of how it works, and I think we have a graphic here that gets into that. So, guys, if you bring that up, I wonder if you could tell us: what is the secret sauce behind Io-Tahoe? And if you could, take us through this slide. >> Sure.
Right there in the middle, at the heart of what we do, is the intellectual property that's been built up over time. It takes from heterogeneous data sources: your Oracle relational database, your mainframe, your data lake, and increasingly APIs and devices that produce data. That creates the ability to automatically discover that data and classify it, and after it's classified, to form relationships across those different source systems, silos, and lines of business. Once we've automated that, we can start to do some cool things, putting context and meaning around that data. So it's moving now from being data-driven to, increasingly, empowering the really smart people in our customer organizations who want to do advanced knowledge tasks: data scientists, and the quants in some of the banks that we work with. The onus is then on putting everything we've done there with automation, classification, relationship understanding, and the quality policies that you apply to that data, into context. Once a professional who is using data has the ability to put that data in context and search across the entire enterprise estate, they can start to do some exciting things and piece together the tapestry, that fabric, across different systems. It could be a CRM or ERP system such as SAP, and some of the newer cloud databases that we work with; Snowflake is a great example. >> Yes. So you're describing one of the reasons why there are so many stovepipes in organizations: data gets locked in the silos of applications. I also want to point out that previously, to do discovery, to do that classification you talked about, to form those relationships and glean context from data, a lot of that, if not most of it, and in some cases all of it, would have been manual. And of course, it's out of date so quickly.
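The automated classification step described above can be sketched in outline. The snippet below is a minimal, hypothetical illustration of rule-based column classification, not Io-Tahoe's implementation: the patterns, the 80% threshold, and the tag names are all assumptions, and a production system would combine many more signals (column names, value distributions, trained models).

```python
import re

# Hypothetical tag patterns; a real classifier would use far richer signals.
# date_iso is checked before phone so digit-and-dash dates aren't mis-tagged.
PATTERNS = {
    "email":    re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "date_iso": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone":    re.compile(r"^\+?[\d\s()-]{7,15}$"),
}

def classify_column(values, threshold=0.8):
    """Tag a column with the first pattern matching >= threshold of its values."""
    sample = [v for v in values if v]  # ignore empty cells
    if not sample:
        return "unknown"
    for tag, pattern in PATTERNS.items():
        hits = sum(1 for v in sample if pattern.match(v))
        if hits / len(sample) >= threshold:
            return tag
    return "unclassified"
```

Run over every column in every source, a rule like this yields the first layer of the classified estate that the relationship-forming step then builds on.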
Nobody wants to do it because it's so hard. So this, again, is where automation comes into the idea of really becoming data-driven. >> Sure. If I look back maybe five years ago, we had a prevalence of data lake technologies at the cutting edge, and those have since converged into some of these cloud platforms, so we work with Google and AWS. As you said, those manual attempts to grasp such a complex challenge at scale quickly run out of steam, because once you've got your fingers on the details of what's in your data estate, it's changed. You've onboarded a new customer, you've signed up a new partner, a customer has adopted a new product that you've just launched, and that slew of data keeps coming. So to keep pace with it, the only answer really is some form of automation. What we've found is that if we can tie automation to the subject matter expertise that sometimes goes back many years within an organization's people, that augmentation between machine learning, AI, and the knowledge that sits inside the organization really tends to unlock a lot of value in data. >> Yes. So, Ajay, as a smaller company you can't be all things to all people, so your ecosystem is critical. You're working with AWS, you're working with Google, you've got Red Hat and IBM as partners. What is attracting those folks to your ecosystem, and give us your thoughts on the importance of ecosystem. >> Yeah, that's fundamental.
When I came in as CEO, one of the trends I wanted us to be part of was being open, having an open architecture. That allowed for something that's close to my heart, which is this: as a CIO, you've got a budget, a vision, and you've already made investments into your organization, and some of those are pretty long-term bets. They could be going out five to ten years, sometimes, with a CRM system, training up your people, getting everybody working together around a common business platform. What I wanted to ensure is that we could openly plug in, using the APIs that are available, to leverage the investment and the cost that has already gone into managing an organization's IT, so business users benefit too. So part of the reason we've been able to be successful with partners like Google, AWS, and increasingly a number of technology players, Red Hat, MongoDB is another one we're doing a lot of good work with, and Snowflake as well, is that those investments have been made by the organizations that are our customers, and we want to make sure we're adding to that, so they're leveraging the value they've already committed to. >> Okay, so we've talked about what it is and how it works, and I want to get into the business impact. What I would be looking for from this would be: can you help me lower my operational risk? I've got tasks that I do, many sequential, some in parallel. Can you reduce my time to task? Can you help me reduce the labor intensity and, ultimately, my labor costs, so I can put those resources elsewhere? And ultimately, I want to reduce the end-to-end cycle time, because that is going to drive telephone-number ROI. Am I missing anything? Can you do those things? And maybe you could give us some examples of the ROI and the business impact. >> Yeah.
The ROI, Dave, is built upon the three things I mentioned. It's a combination of leveraging the existing investment in the existing estate, whether that's on Microsoft Azure, AWS, Google, or IBM, and putting that to work, because the customers we work with have made those choices. On top of that, it's ensuring that the automation is working right down to the level of the data, at the column level or the file level. We don't just deal with metadata; it's being very specific, at the most granular level. So as we've grown our processes and our automation, classification, tagging, applying policies from across the different compliance and regulatory needs an organization has to the data, everything that happens downstream from that is ready to serve a business outcome. It could be a customer who wants an experience on a mobile device, a tablet, or face-to-face within a store. The game is: can we provision the right data and enable our customers to serve their customers with the right data, data they can trust, at the right time, just in that real-time moment where a decision or an action is expected? That's driving ROI of, in some cases, 20x, and that's really satisfying to see. That kind of impact is taking years down to months and, in many cases, months of work down to days; in some cases, hours is the time to value. I'm impressed with how quickly, out of the box, with very little training, a customer can get to search and discovery, a knowledge graph, and finding duplicate and redundant data right off the bat, within hours. >> Well, it's why investors are interested in this space. They're looking for a big total available market and a significant return: 10x you've got to have, and 20x is better. So that's exciting, and obviously strong management and a strong team.
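The "finding duplicates right off the bat" capability mentioned above can be illustrated in miniature. This is a hedged sketch, not Io-Tahoe's method: it groups records whose normalized key fields collide, which is the simplest form of record-level duplicate detection; the normalization rules and field names are assumptions.

```python
import hashlib
import re
from collections import defaultdict

def normalize(value: str) -> str:
    """Crude canonical form: lowercase, drop punctuation and extra spaces."""
    return re.sub(r"[^a-z0-9 ]", "", value.lower()).strip()

def find_duplicates(records, key_fields):
    """Group records whose normalized key fields collide; return groups with
    more than one member (candidate duplicates for human review)."""
    buckets = defaultdict(list)
    for rec in records:
        key = "|".join(normalize(str(rec[f])) for f in key_fields)
        digest = hashlib.sha1(key.encode()).hexdigest()
        buckets[digest].append(rec)
    return [group for group in buckets.values() if len(group) > 1]
```

Exact-match hashing like this only catches the easy cases; production systems typically add fuzzy matching, but even this level of automation beats scanning millions of rows by hand.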
I want to ask you about people and culture. So you've got people, process, and technology. We've seen with this pandemic that processes are really unpredictable, and the technology has to be able to adapt to any process, not the reverse. You can't force your process into some static software, so that's very, very important. But at the end of the day, you've got to get people on board. So I wonder if you could talk about this notion of culture, and a data-driven culture. >> Yeah, that's so important. Current times are forcing the necessity of the moment: adapt. But as we work our way through these changes and adapt with our customers through changing economic times, what we're seeing here is the ability to
I mean, the whole notion of self service and the lines of business actually feeling like they have ownership of the data as opposed to, you know, I t or some technology group owning the data because then you've got data quality issues or if it doesn't line up there their agenda, you're gonna get a lot of finger pointing. So so that is a really important. You know a piece of it. I'll give you last word A J. Your final thoughts, if you would. >>Yeah, we're excited to be the only path. And I think we've built great customer examples here where we're having a real impact in in a really fast pace, whether it helping them migrate to the cloud, helping the bean up their legacy, Data lake on and write off there. Now the conversation is around data quality as more of the applications that we enable to a more efficiently could be data are be a very robotic process automation along the AP, eyes that are now available in the cloud platforms. A lot of those they're dependent on data quality on and being able to automate. So business users, um, to take accountability off being able to so look at the trend of their data quality over time and get the signals is is really driving trust. And that trust in data is helping in time. Um, the I T teams, the data operations team, with do more and more quickly that comes back to culture being out, supply this technology in such a way that it's visual insensitive. Andi. How being? Just like Dev Ops tests with with a tty Dave drops putting intelligence in at the data level to drive that collaboration. We're excited, >>you know? You remind me of something. I lied. I don't want to go yet. It's OK, so I know we're tight on time, but you mentioned migration to the cloud. And I'm thinking about conversation with Paula from Webster Webster. Bank migrations. Migrations are, you know, they're they're a nasty word for for organizations. So our and we saw this with Webster. 
How are you able to help minimize the migration pain, and why is that something that you guys are good at? >> Yeah. There are many large, successful companies that we've worked with, and Webster's a great example. I'd like to give you an analogy. If you're running a business as a CEO, you've got a lot of people in your teams, and it's a bit like a living brain. But imagine if the different parts of your brain were not connected; that would diminish how you're able to perform. What we're seeing, particularly with migration, is that banks, retailers, and manufacturers have grown over the last ten years through acquisition and through different initiatives to drive customer value, and that sprawl in their data estate hasn't been fully dealt with. It's sometimes been seen as a good thing to leave whatever you acquired sitting side by side with that legacy mainframe and your Oracle estate. What we're able to do very quickly with that migration challenge is shine a light on all the different parts of the data and applications, at the column level or higher if it's a data lake, and show an enterprise architect or a CDO how everything's connected, where there may not be any documentation. The bright people that created some of those systems have long since moved on, retired, or been promoted into senior roles. Within days, we can automatically generate and refresh the state of that data across that landscape and put it into context. That then allows you to look at a migration with confidence, backed by facts, rather than what we've often seen in the past: teams of consultants and business analysts spending months getting an approximation, a good idea of the current state, and trying their very best to map that to the future target state.
Now, with Io-Tahoe, you can run those processes within hours of getting started, visualize that picture, and bring it to life. The ROI is right off the bat: finding data that should have been deleted, data that's been copied, and enabling the architect, whether it's a migration to GCP or to any other cloud such as AWS, or a multi-cloud landscape, to proceed with confidence. >> Yeah, that visibility is key to reducing operational risk, giving people confidence that they can move forward, and being able to do that and update it on an ongoing basis means you can scale. Ajay, thanks so much for coming on theCUBE and sharing your insights and your experience. It's great to have you. >> Thank you, David. Look forward to speaking again. >> All right, keep it right there, everybody. We're here with Data Automated on theCUBE. This is Dave Vellante, and we'll be right back after a short break.
SUMMARY :
Ajay Vohora, CEO of Io-Tahoe, joins Dave Vellante to discuss enterprise data automation: the "automation behind the automation" that builds on DataOps by automatically discovering, classifying, and relating data across heterogeneous sources such as relational databases, mainframes, data lakes, and APIs. He describes how automated classification addresses data privacy and operational risk at a scale manual approaches cannot reach; how an open, API-based architecture underpins partnerships with Google, AWS, Red Hat, MongoDB, and Snowflake; and how granular, column-level automation drives ROI of up to 20x, taking years of work down to months and months down to days. The conversation closes on data-driven culture, self-service for business users, and how automated, column-level visibility of the data estate takes the risk out of cloud migrations.
Yusef Khan, Io Tahoe | Enterprise Data Automation
>> From around the globe, it's theCUBE with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Everybody, we're back. We're talking about enterprise data automation; the hashtag is #DataAutomated, and we're going to really dig into data migrations. Data migrations are risky, time-consuming, and expensive. Yusef Khan is here. He's the head of partnerships and alliances at Io-Tahoe, coming again from London. Hey, good to see you, Yusef. >> Thanks very much, Dave. Thank you. >> So your role is interesting. We're talking about data migrations, and you're head of partnerships. What is your role specifically, and how is it relevant to what we're going to talk about today? >> I work with various businesses such as cloud companies, systems integrators, and companies that sell operating systems and middleware, all of whom are often quite well embedded within a company's IT infrastructure and have existing relationships. Because what we do fundamentally makes migrating to the cloud easier, and data migration easier, a lot of businesses are interested in partnering with us, and we're interested in partnering with them. >> Let's set up the problem a little bit, and then I want to get into some of the data. As I said, migrations are risky, time-consuming, and expensive, and oftentimes a blocker for organizations trying to really get value out of data. Why is that? >> I think all migrations have to start with knowing the facts about your data, and you can try to do that manually. But when you have an organization that may have been going for decades or longer, it will probably have a pretty large legacy data estate: everything from on-premises mainframes to stuff that's already in the cloud, probably hundreds if not thousands of applications, and potentially hundreds of different data stores. Now, their understanding of what they have
is often quite limited, because you can try to draw up manual maps, but they're outdated very quickly; every time the data changes, the manual map is out of date. And people leave organizations over time, so the tribal knowledge that's been built up is limited as well. You can try to map it all manually: you might need a DBA, a data analyst, or a business analyst, and they'll go in and explore the data for you. But doing that manually is very, very time-consuming; it can take teams of people months and months. Or you can use automation, just like Webster Bank did with Io-Tahoe, and they managed to do it with a relatively small team in a timeframe of days. >> Yeah, we talked to Paul from Webster Bank; awesome discussion. So I want to dig into this migration, and let's pull up a graphic that shows what a typical migration project looks like. What you see here is very detailed, and I know it's a bit of an eye test, but let me call your attention to some of the key aspects, and then, Yusef, I want you to chime in. At the top here you see that area graph; that's operational risk for a typical migration project, and you can see the timeline and the milestones. The blue bars show the time each phase takes, and you can see the second step, data analysis, taking 24 weeks, so very time-consuming. I won't dig into the fine print in the middle, though there's some real good detail there, but go down to the bottom: that's labor intensity, and you can see "high" is that sort of brown. A number of phases, data analysis, data staging, data prep, the trial, the implementation, post-implementation fixes, and the transition to BAU, business as usual, are all very labor-intensive. So what are your takeaways from this typical migration project? What do we need to know, Yusef?
>> I think the key thing is that when you don't understand your data up front, it's very difficult to scope and set up a project. You go to business stakeholders and decision makers and say, "Okay, we want to migrate these data stores, we want to put them in the cloud," most often, but you probably don't know how much data is there, you don't necessarily know how many applications it relates to, you don't know the relationships between the data, and you don't know the flow of the data: the direction in which data moves between different data stores and tables. So you start from a position of pretty high risk, and to alleviate that risk you stack the project team with lots and lots of people to do the next phase, which is analysis. So you set up a project with a pretty high cost: a big project, more people, heavier governance, obviously. Then, in the phase where they're trying to do lots and lots of manual analysis, the work of trying to relate data that's in different data stores, relating individual tables and columns, is, as we all know, very time-consuming and expensive, especially if you're hiring in resource from consultants or systems integrators externally, or if you need to buy third-party tools. And, as I said earlier, the people who understood some of those systems may have left a while ago. So you're in a high-risk, high-cost situation from the off, and the same issues recur throughout the project. What we do at Io-Tahoe is automate a lot of this process from the very beginning, because we can run the initial data discovery automatically. Very quickly you have an automatically validated data map, and the data flow is generated automatically, with much less time and effort and much lower cost altogether.
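The automatically generated data map described above can be sketched at toy scale. This is a hypothetical illustration, not Io-Tahoe's algorithm: it proposes candidate join keys, the edges of a data map, by measuring value overlap between columns in different tables; the 0.8 overlap threshold and the input shape are assumptions.

```python
def candidate_links(tables, min_overlap=0.8):
    """tables: {table_name: {column_name: list_of_values}}.
    Flag column pairs across tables whose value sets overlap heavily;
    these are candidate join keys / relationships for the data map."""
    links = []
    names = list(tables)
    for i, t1 in enumerate(names):
        for t2 in names[i + 1:]:
            for c1, v1 in tables[t1].items():
                s1 = set(v1)
                for c2, v2 in tables[t2].items():
                    s2 = set(v2)
                    if not s1 or not s2:
                        continue
                    overlap = len(s1 & s2) / min(len(s1), len(s2))
                    if overlap >= min_overlap:
                        links.append((f"{t1}.{c1}", f"{t2}.{c2}", round(overlap, 2)))
    return links
```

Run across hundreds of data stores, a profiling pass like this surfaces undocumented relationships automatically, which is exactly the analysis that otherwise consumes those 24 weeks of manual effort.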
>> Okay, so I want to bring back that first chart, and I want to call your attention again to the area graph, the blue bars, and, down below that, the labor intensity. And now let's bring up the same chart, but with an automation injection, so you now see "Accelerated by Io-Tahoe." Okay, great, and we're going to talk about this. But look what happens to the operational risk: a dramatic reduction in that graph. Then look at the blue bars: data analysis went from 24 weeks down to four weeks. And look at the labor intensity: data analysis, data staging, data prep, the trial, post-implementation fixes, and the transition to BAU all went from high labor intensity to low. Explain how that magic happened. >> Take the example of a data catalog. Every large enterprise wants some kind of repository where they put all their understanding about their data, a catalog of the whole data estate, if you like. Imagine trying to do that manually: you need to go into every individual data store, you need a DBA or a business analyst per data store, they need to extract the data tables individually and cross-reference them with the other data stores, schemas, and tables. You'd probably end up with the mother of all Excel spreadsheets, and it would be a very, very difficult exercise. In fact, one of our reflections as we automate lots of these things is that automation doesn't just accelerate the work; in some cases it's what makes the work possible at all for enterprise customers with legacy systems. Take banks, for example: they quite often end up staying on mainframe systems that they've had in place for decades,
not migrating away from them because they're not able to do the work of understanding the data, deduplicating it, deleting data that isn't relevant, and then confidently going forward to migrate. So they stay where they are, with all the attendant problems of systems that are out of support. To go back to the data catalog example: whatever you discover in data discovery has to persist in a tool like a data catalog, and so we automate data catalogs. We can feed others, though we have our own. The only alternative to this kind of automation is to build out a very large project team of business analysts, DBAs, project managers, and process analysts to gather the data, check that the process of gathering it is correct, put it in the repository, validate it, and so on. We've gone into organizations and seen them ramp up teams of 20 to 30 people, at costs of millions of pounds a year, on timeframes of years, just to try to get a data catalog done. That's something we can typically do in a timeframe of months, if not weeks, and the difference is automation. If you take the manual route I've just described, you make migrations to the cloud prohibitively expensive: whatever saving you might make from shutting down your legacy data stores gets eaten up by the cost of doing it, unless you go with the more automated approach. >> Okay, so the automated approach reduces risk because you're going to stay on the project plan; it's all those out-of-scope surprises that come up with manual processes that kill you in the rework. And that data catalog: people are afraid that their family jewels, their data, won't make it through to the other side, so that's something you're addressing. And you're also not boiling the ocean; you're really taking the pieces that are critical, and the stuff you don't need,
you don't have to pay for. >> Precisely; it's a very good point. One of the other things we do, and we have specific features to do it, is automatically analyze data for duplication at the row or record level and redundancy at the column level. So, as you say, before you go into a migration process you can understand: this stuff is replicated, we don't need it. Quite often, if you put data in the cloud, you're paying for the storage and the compute time, and any duplicated data in there is pure cost you should take out before you migrate. Again, if you try to work out what's duplicated manually across tens or hundreds of data stores, it takes months, if not years; using machine learning to do it in an automated way is much, much quicker. On the costs and benefits: every organization we work with has a lot of money in existing sunk cost in their IT, ERP systems like Oracle, or data lakes, which they've spent a good deal of time and money investing in. What we do, by enabling them to transition everything to their strategic future repositories, is accelerate the value of that investment and the time to value. So we're trying to help people get value out of their existing investments in the data estate, and close down the things they don't need, to enable them to move to a brighter future. >> And I think as well, once you're able to go live, you're infusing a data mindset, a data-oriented culture. I know it's somewhat of a buzzword, but when you see it in organizations, you know it's real. What happens is you dramatically reduce the end-to-end cycle time of going from data to actual insights.
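The column-level redundancy check described above can be shown in its simplest possible form. This is a hedged sketch under stated assumptions, not Io-Tahoe's feature: it only flags columns carrying byte-for-byte identical data, whereas a real tool would also catch near-duplicates and derived columns.

```python
def redundant_columns(tables):
    """tables: {table: {column: list_of_values}}.
    Group columns that carry identical data (same values, same order):
    the simplest form of the column-level redundancy check."""
    seen = {}     # fingerprint -> first column seen with that content
    groups = {}   # first column -> later columns duplicating it
    for table, cols in tables.items():
        for col, values in cols.items():
            fingerprint = tuple(values)
            full_name = f"{table}.{col}"
            if fingerprint in seen:
                groups.setdefault(seen[fingerprint], []).append(full_name)
            else:
                seen[fingerprint] = full_name
    return groups
```

Every column a pass like this flags is storage and compute you would otherwise pay to migrate and then keep paying for in the cloud, which is why the check belongs before the migration, not after.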
Data's plentiful, but insights aren't, and that is what's going to drive competitive advantage over the next decade and beyond. >> Yeah, definitely. And you can only really do that if you get your data estate cleaned up in the first place. I've worked with and managed teams of data scientists, data engineers, business analysts, people who are pushing out dashboards and trying to build machine learning applications. The biggest frustration for lots of them, and the thing they spend far too much time doing, is trying to work out what the right data is and cleaning data, which really you don't want highly paid data scientists doing with their time. But if you sort out your data estate in the first place, get rid of duplication, and then migrate to a cloud store where things are really accessible and it's easy to build connections and use native machine learning tools, you're well on the way up the data maturity curve, and you can start to use some of those more advanced applications. >> What are some of the prerequisites, maybe the top two or three that I need to understand as a customer, to really be successful here? Is it skill sets? Is it mindset, leadership buy-in? What do I absolutely need to have to make this successful? >> Well, I think leadership is obviously key, just to set the vision for people. One of the great things about Io-Tahoe, though, is that you can use your existing staff to do this work. If you use an automation platform, there's no need to hire expensive people; it's a no-code solution, it works out of the box. You just connect to source, and your existing staff can use it. It's very intuitive, with an easy-to-use user interface. There's no need to invest vast amounts with large consultants who may well charge the earth. And you're already at a bit of an advantage
if you've got existing staff who are close to the data, subject matter experts or users, because they can very easily learn how to use the tool. Then they can go in and write their own data quality rules, and they can really make a contribution from day one. When we go into organizations, one of the great things about the whole experience is that we can get tangible results back within the day. Usually within an hour or two we're able to say: okay, we've started to map relationships, here's the data map of the data that we've analyzed, here are our thoughts on where the sensitive data is. Because it's automated, because it's running algorithms against the data, that's what they really ought to expect. >> And you know this because you're dealing with the ecosystem: we're entering a new era of data, and many organizations, to your point, just don't have the resources to do what Google and Amazon and Facebook and Microsoft did over the past decade to become data-dominant, trillion-dollar-market-cap companies. Incumbents need to rely on technology companies to bring that automation, that machine intelligence, to them so they can apply it. They don't want to be AI inventors; they want to apply it to their businesses. And that's what really was so difficult in the early days of so-called big data: there was just too much complexity out there. Now companies like Io-Tahoe are bringing tooling and platforms that are allowing companies to really become data driven. Your final thoughts, please. >> That's a great point, Dave. In a way, it brings us back to where we began, in terms of partnerships and alliances. I completely agree. We're at a really exciting point, where we can take technologies like these, go into enterprises, and help them really leverage the value of these types of machine learning algorithms.
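The "subject matter experts write their own data quality rules" idea mentioned in this exchange can be sketched as rules-as-data: each rule is a plain-language name plus a check, so a non-developer only supplies field names and thresholds. The rules and fields below are hypothetical examples, not the product's actual rule format.

```python
# Each rule: a human-readable name, the field it applies to, and a check.
RULES = [
    {"name": "balance is non-negative",
     "field": "balance",
     "check": lambda v: v is not None and v >= 0},
    {"name": "state code has two letters",
     "field": "state",
     "check": lambda v: isinstance(v, str) and len(v) == 2},
]

def run_rules(record, rules):
    """Return the names of rules that the record violates."""
    return [r["name"] for r in rules if not r["check"](record.get(r["field"]))]

print(run_rules({"balance": -10, "state": "CT"}, RULES))
# ['balance is non-negative']
```

Because the rules are declarative, adding one is an edit to a list rather than new code, which is the sense in which existing staff can contribute from day one.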
And we work with all the major cloud providers: AWS, Microsoft Azure, Google Cloud Platform, IBM, Red Hat and others. I think for us, the key thing is that we want to be the best in the world at enterprise data automation. We don't aspire to be a cloud provider or even a workflow provider. What we want to do is really help customers with their data, with our automated data functionality, in partnership with some of those other businesses, so we can leverage the great work they've done in the cloud, the great work they've done on workflows and on virtual assistants in other areas. And we help customers leverage those investments as well. But at our heart, we're really targeted at just being the best enterprise data automation business in the world. >> Massive opportunities, not only for technology companies but for those organizations that can apply technology for business advantage. Yusef, thanks so much for coming on theCUBE. Appreciate it. All right, and thank you for watching, everybody. We'll be right back after this short break.
SUMMARY :
of enterprise data automation, an event Siri's brought to you by Iot. And how is it relevant to what we're gonna talk about today? fundamentally makes migrating to the cloud easier on data migration easier. a blocker for organizations to really get value out of data. And they managed to do this with a relatively small team. That blue bar is the time to test so you can see the second step data analysis talking 24 I mean, I think the key thing is, when you don't understand So you now see the So let's go Said Accelerated by Iot, You need a DB a business analyst, rich data store they need to do in extracted the data processes that kill you in the rework andan that data data catalog. close down the things that they don't need to enable them to go to a kind of brighter, and I think as well, you know, once you're able to and this is a journey, And you could only really do that if you get your data estate cleaned up in I need to understand as a customer to really be successful here? One of the great things about Ayatollah, though, is you can use Um, and you already had a bit of an advantage. and and you know this because you're dealing with the ecosystem. And and I I we work And thank you for watching everybody.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Paul | PERSON | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
ORGANIZATION | 0.99+ | |
Amazon | ORGANIZATION | 0.99+ |
London | LOCATION | 0.99+ |
Oracle | ORGANIZATION | 0.99+ |
ORGANIZATION | 0.99+ | |
Yusef Khan | PERSON | 0.99+ |
Seth | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
20 months | QUANTITY | 0.99+ |
Aziz | PERSON | 0.99+ |
hundreds | QUANTITY | 0.99+ |
tens | QUANTITY | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Webster Bank | ORGANIZATION | 0.99+ |
24 weeks | QUANTITY | 0.99+ |
two | QUANTITY | 0.99+ |
four weeks | QUANTITY | 0.99+ |
three | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
Io Tahoe | PERSON | 0.99+ |
Marley | PERSON | 0.99+ |
Harrison | PERSON | 0.99+ |
Data Lakes | ORGANIZATION | 0.99+ |
Siri | TITLE | 0.99+ |
Excel | TITLE | 0.99+ |
Veritas | ORGANIZATION | 0.99+ |
second step | QUANTITY | 0.99+ |
15 20 years | QUANTITY | 0.98+ |
Tahoe | PERSON | 0.98+ |
One | QUANTITY | 0.98+ |
first chart | QUANTITY | 0.98+ |
an hour | QUANTITY | 0.98+ |
Red Hat | ORGANIZATION | 0.98+ |
one | QUANTITY | 0.97+ |
Tom | PERSON | 0.96+ |
hundreds of bases | QUANTITY | 0.96+ |
first | QUANTITY | 0.95+ |
next decade | DATE | 0.94+ |
first place | QUANTITY | 0.94+ |
Iot | ORGANIZATION | 0.94+ |
Iot | TITLE | 0.93+ |
earth | LOCATION | 0.93+ |
day one | QUANTITY | 0.92+ |
Mackel | ORGANIZATION | 0.91+ |
today | DATE | 0.91+ |
Ayatollah | PERSON | 0.89+ |
£234 million a year | QUANTITY | 0.88+ |
data | QUANTITY | 0.88+ |
Iot | PERSON | 0.83+ |
hundreds of | QUANTITY | 0.81+ |
thousands of applications | QUANTITY | 0.81+ |
decades | QUANTITY | 0.8+ |
I o ta ho | ORGANIZATION | 0.75+ |
past decade | DATE | 0.75+ |
Microsoft Azure | ORGANIZATION | 0.72+ |
two great ones | QUANTITY | 0.72+ |
2030 people | QUANTITY | 0.67+ |
Doctor | PERSON | 0.65+ |
States | LOCATION | 0.65+ |
Iot Tahoe | ORGANIZATION | 0.65+ |
a year | QUANTITY | 0.55+ |
Yousef | PERSON | 0.45+ |
Cloud Platform | TITLE | 0.44+ |
Cube | ORGANIZATION | 0.38+ |
Enterprise Data Automation | Crowdchat
>> From around the globe, it's theCUBE, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Welcome, everybody, to Enterprise Data Automation, a co-created digital program on theCUBE with support from Io-Tahoe. My name is Dave Vellante, and today we're using the hashtag #DataAutomated. You know, organizations really struggle to get more value out of their data; time to the data-driven insights that drive cost savings or new revenue opportunities simply takes too long. So today we're going to talk about how organizations can streamline their data operations through automation, machine intelligence, and really simplifying data migrations to the cloud. We'll be talking to technologists, visionaries, hands-on practitioners and experts that are not just talking about streamlining their data pipelines; they're actually doing it. So keep it right there. We'll be back shortly with Ajay Vohora, who's the CEO of Io-Tahoe, to kick off the program. You're watching theCUBE, the leader in digital global coverage. We're right back after this short break. >> Innovation, impact, influence. Welcome to theCUBE. Disruptors, developers and practitioners learn from the voices of leaders who share their personal insights from the hottest digital events around the globe. Enjoy the best this community has to offer on theCUBE, your global leader in high-tech digital coverage. >> From around the globe, it's theCUBE, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Okay, we're back. Welcome back to Data Automated. Ajay Vohora is CEO of Io-Tahoe. Ajay, good to see you. How are things in London? >> Thanks, Dave, doing well. The customers that I speak to day in, day out, that we partner with, they're busy adapting their businesses to serve their customers. It's very much a game of ensuring that we can serve our customers to help their customers.
You know, the adaptation that's happening here is trying to be more agile, to be more flexible. There's a lot of pressure on data, a lot of demand on data, to deliver more value to the business and to those customers. >> As I said, we've been talking about DataOps a lot, the idea being DevOps applied to the data pipeline. But talk about enterprise data automation: what is it to you, and how is it different from DataOps? >> DevOps, you know, has been great for breaking down those silos between different roles and functions and bringing people together to collaborate. And we definitely see those tools, those methodologies, those processes, that kind of thinking, lending itself to data. What we look to do is build on top of that with data automation. It's the nuts and bolts of the algorithms, the models behind machine learning, the functions; that's where we invest our R&D, bringing that in to build on top of the methods and ways of thinking that break down those silos, and injecting that automation into the business processes that are going to drive a business to serve its customers. It's a layer beyond DevOps and DataOps. The way I think about it is the automation behind a new dimension. We've come a long way in the last few years: where we started out was automating some of those simple but high-impact data-related tasks across the data estate in a cost-effective way, tasks that classify data, and a lot of our original patents and IP value that we built up is very much around that. >> I'd love to get into the tech a little bit, in terms of how it works. And I think we have a graphic here that gets into that a little bit. So, guys, if you bring that up,
I mean right there in the middle that the heart of what we do it is, you know, the intellectual property now that we've built up over time that takes from Hacha genius data sources. Your Oracle Relational database. Short your mainframe. It's a lay and increasingly AP eyes and devices that produce data and that creates the ability to automatically discover that data. Classify that data after it's classified. Them have the ability to form relationships across those different source systems, silos, different lines of business. And once we've automated that that we can start to do some cool things that just puts of contact and meaning around that data. So it's moving it now from bringing data driven on increasingly where we have really smile, right people in our customer organizations you want I do some of those advanced knowledge tasks data scientists and ah, yeah, quants in some of the banks that we work with, the the onus is on, then, putting everything we've done there with automation, pacifying it, relationship, understanding that equality, the policies that you can apply to that data. I'm putting it in context once you've got the ability to power. Okay, a professional is using data, um, to be able to put that data and contacts and search across the entire enterprise estate. Then then they can start to do some exciting things and piece together the the tapestry that fabric across that different system could be crm air P system such as s AP and some of the newer brown databases that we work with. Snowflake is a great well, if I look back maybe five years ago, we had prevalence of daily technologies at the cutting edge. Those are converging to some of the cloud platforms that we work with Google and AWS and I think very much is, as you said it, those manual attempts to try and grasp. 
But it's such a complex challenge at scale that it quickly runs out of steam, because once you've got your fingers on the details of what's in your data estate, it's changed. You've onboarded a new customer, you've signed up a new partner, a customer has adopted a new product that you've just launched, and that slew of data keeps coming. So it's about keeping pace with that, and the only answer really is some form of automation. >> You're working with AWS, you're working with Google, you've got Red Hat, IBM as partners. What is attracting those folks to your ecosystem? And give us your thoughts on the importance of ecosystem. >> That's fundamental. I mean, when I came in to Io-Tahoe as the CEO, one of the trends that I wanted us to be part of was being open, having an open architecture. That was one thing that was close to my heart, because as a CIO, where you've got a budget, a vision, and you've already made investments into your organization, some of those are pretty long-term bets. They could be going out 5 to 10 years, sometimes, with a CRM system: training up your people, getting everybody working together around a common business platform. What I wanted to ensure is that we could openly plug in, using the APIs that were available, to leverage the investment and the cost that has already gone into managing an organization's IT and serving its business users. So part of the reason we've been able to be successful with partners like Google and AWS, and increasingly a number of technology players, Red Hat, MongoDB is another one we're doing a lot of good work with, and Snowflake, is that those investments have been made by the organizations that are our customers, and we want to make sure we're adding to that, and that they're leveraging the value they've already committed to.
>> Yeah, and maybe you could give us some examples of the ROI and the business impact. >> Yeah, I mean, the ROI, David, is built upon the three things that I mentioned. It's a combination of leveraging the existing investment in the existing estate, whether that's on Microsoft Azure or AWS or Google or IBM, and putting that to work, because the customers that we work with have made those choices. On top of that, it's ensuring that the automation is working right down to the level of the data, at a column level or the file level; we don't just deal with metadata, it's being very specific at the most granular level. So as we run our processes and the automation, classification, tagging, applying policies from across the different compliance and regulatory needs that an organization has to the data, everything that then happens downstream from that is ready to serve a business outcome. Without overstating it, we can run those processes within hours of getting started and build that picture, visualize that picture, and bring it to life. You know, the ROI that's off the bat is finding data that should have been deleted, data that was copies, and being able to allow the architect, whether we're working on GCP or a migration to any other cloud such as AWS, or a multi-cloud landscape, to get that map right off the bat. >> Ajay, thanks so much for coming on theCUBE and sharing your insights and your experience. Great to have you. >> Thank you, David. Look forward to speaking again.
We envisage a world where data users consume accurate, up to date unified data distilled from many silos to deliver transformational outcomes, activate your data and avoid manual processing. Accelerate data projects by enabling non I t resources and data experts to consolidate categorize and master data. Automate your data operations Power digital transformations by automating a significant portion of data management through human guided machine learning. Yeah, get value from the start. Increase the velocity of business outcomes with complete accurate data curated automatically for data, visualization tours and analytic insights. Improve the security and quality of your data. Data automation improves security by reducing the number of individuals who have access to sensitive data, and it can improve quality. Many companies report double digit era reduction in data entry and other repetitive tasks. Trust the way data works for you. Data automation by our Tahoe learns as it works and can ornament business user behavior. It learns from exception handling and scales up or down is needed to prevent system or application overloads or crashes. It also allows for innate knowledge to be socialized rather than individualized. No longer will your companies struggle when the employee who knows how this report is done, retires or takes another job, the work continues on without the need for detailed information transfer. Continue supporting the digital shift. Perhaps most importantly, data automation allows companies to begin making moves towards a broader, more aspirational transformation, but on a small scale but is easy to implement and manage and delivers quick wins. Digital is the buzzword of the day, but many companies recognized that it is a complex strategy requires time and investment. 
Once you get started with data automation, the digital transformation initiated and leaders and employees alike become more eager to invest time and effort in a broader digital transformational agenda. Yeah, >>everybody, we're back. And this is Dave Volante, and we're covering the whole notion of automating data in the Enterprise. And I'm really excited to have Paul Damico here. She's a senior vice president of enterprise Data Architecture at Webster Bank. Good to see you. Thanks for coming on. >>Nice to see you too. Yes. >>So let's let's start with Let's start with Webster Bank. You guys are kind of a regional. I think New York, New England, uh, leave headquartered out of Connecticut, but tell us a little bit about the >>bank. Yeah, Webster Bank is regional, Boston. And that again in New York, Um, very focused on in Westchester and Fairfield County. Um, they're a really highly rated bank regional bank for this area. They, um, hold, um, quite a few awards for the area for being supportive for the community. And, um, are really moving forward. Technology lives. Currently, today we have, ah, a small group that is just working toward moving into a more futuristic, more data driven data warehouse. That's our first item. And then the other item is to drive new revenue by anticipating what customers do when they go to the bank or when they log into there to be able to give them the best offer. The only way to do that is you have timely, accurate, complete data on the customer and what's really a great value on off something to offer that >>at the top level, what were some of what are some of the key business drivers there catalyzing your desire for change >>the ability to give the customer what they need at the time when they need it? And what I mean by that is that we have, um, customer interactions and multiple weights, right? And I want to be able for the customer, too. 
Walk into a bank, um, or online and see the same the same format and being able to have the same feel, the same look and also to be able to offer them the next best offer for them. >>Part of it is really the cycle time, the end end cycle, time that you're pressing. And then there's if I understand it, residual benefits that are pretty substantial from a revenue opportunity >>exactly. It's drive new customers, Teoh new opportunities. It's enhanced the risk, and it's to optimize the banking process and then obviously, to create new business. Um, and the only way we're going to be able to do that is that we have the ability to look at the data right when the customer walks in the door or right when they open up their app. >>Do you see the potential to increase the data sources and hence the quality of the data? Or is that sort of premature? >>Oh, no. Um, exactly. Right. So right now we ingest a lot of flat files and from our mainframe type of runnin system that we've had for quite a few years. But now that we're moving to the cloud and off Prem and on France, you know, moving off Prem into, like, an s three bucket Where that data king, we can process that data and get that data faster by using real time tools to move that data into a place where, like, snowflake Good, um, utilize that data or we can give it out to our market. The data scientists are out in the lines of business right now, which is great, cause I think that's where data science belongs. We should give them on, and that's what we're working towards now is giving them more self service, giving them the ability to access the data in a more robust way. And it's a single source of truth. So they're not pulling the data down into their own like tableau dashboards and then pushing the data back out. 
I have eight engineers, data architects, they database administrators, right, um, and then data traditional data forwarding people, Um, and because some customers that I have that our business customers lines of business, they want to just subscribe to a report. They don't want to go out and do any data science work. Um, and we still have to provide that. So we still want to provide them some kind of read regiment that they wake up in the morning and they open up their email. And there's the report that they just drive, um, which is great. And it works out really well. And one of the things. This is why we purchase I o waas. I would have the ability to give the lines of business the ability to do search within the data, and we read the data flows and data redundancy and things like that and help me cleanup the data and also, um, to give it to the data. Analysts who say All right, they just asked me. They want this certain report and it used to take Okay, well, we're gonna four weeks, we're going to go. We're gonna look at the data, and then we'll come back and tell you what we dio. But now with Iot Tahoe, they're able to look at the data and then, in one or two days of being able to go back and say, Yes, we have data. This is where it is. This is where we found that this is the data flows that we've found also, which is what I call it is the birth of a column. It's where the calm was created and where it went live as a teenager. And then it went to, you know, die very archive. >>In researching Iot Tahoe, it seems like one of the strengths of their platform is the ability to visualize data the data structure, and actually dig into it. But also see it, um, and that speeds things up and gives everybody additional confidence. And then the other pieces essentially infusing ai or machine intelligence into the data pipeline is really how you're attacking automation, right? >>Exactly. 
So you're able to let's say that I have I have seven cause lines of business that are asking me questions. And one of the questions I'll ask me is, um, we want to know if this customer is okay to contact, right? And you know, there's different avenues so you can go online to go. Do not contact me. You can go to the bank And you could say, I don't want, um, email, but I'll take tests and I want, you know, phone calls. Um, all that information. So seven different lines of business asked me that question in different ways once said Okay to contact the other one says, You know, just for one to pray all these, you know, um, and each project before I got there used to be siloed. So one customer would be 100 hours for them to do that and analytical work, and then another cut. Another of analysts would do another 100 hours on the other project. Well, now I can do that all at once, and I can do those type of searches and say yes we already have that documentation. Here it is. And this is where you can find where the customer has said, You know, you don't want I don't want to get access from you by email, or I've subscribed to get emails from you. I'm using Iot typos eight automation right now to bring in the data and to start analyzing the data close to make sure that I'm not missing anything and that I'm not bringing over redundant data. Um, the data warehouse that I'm working off is not, um a It's an on prem. It's an oracle database. Um, and it's 15 years old, so it has extra data in it. It has, um, things that we don't need anymore. And Iot. Tahoe's helping me shake out that, um, extra data that does not need to be moved into my S three. So it's saving me money when I'm moving from offering on Prem. >>What's your vision or your your data driven organization? >>Um, I want for the bankers to be able to walk around with on iPad in their hands and be able to access data for that customer really fast and be able to give them the best deal that they can get. 
I want Webster to be right there on top, with being able to add new customers and to be able to serve our existing customers who had bank accounts. Since you were 12 years old there and now our, you know, multi. Whatever. Um, I want them to be able to have the best experience with our our bankers. >>That's really what I want is a banking customer. I want my bank to know who I am, anticipate my needs and create a great experience for me. And then let me go on with my life. And so that's a great story. Love your experience, your background and your knowledge. Can't thank you enough for coming on the Cube. >>No, thank you very much. And you guys have a great day. >>Next, we'll talk with Lester Waters, who's the CTO of Iot Toe cluster takes us through the key considerations of moving to the cloud. >>Yeah, right. The entire platform Automated data Discovery data Discovery is the first step to knowing your data auto discover data across any application on any infrastructure and identify all unknown data relationships across the entire siloed data landscape. smart data catalog. Know how everything is connected? Understand everything in context, regained ownership and trust in your data and maintain a single source of truth across cloud platforms, SAS applications, reference data and legacy systems and power business users to quickly discover and understand the data that matters to them with a smart data catalog continuously updated ensuring business teams always have access to the most trusted data available. Automated data mapping and linking automate the identification of unknown relationships within and across data silos throughout the organization. Build your business glossary automatically using in house common business terms, vocabulary and definitions. Discovered relationships appears connections or dependencies between data entities such as customer account, address invoice and these data entities have many discovery properties. 
At a granular level: data signals dashboards. Get up-to-date feeds on the health of your data for faster, improved data management. See trends, view history, compare versions, and get accurate and timely visual insights from across the organization. Automated data flows: automatically capture every data flow to locate all the dependencies across systems, visualize how they work together collectively, and know who within your organization has access to data. Understand the source and destination for all your business data, with comprehensive data lineage constructed automatically during the data discovery phase, and continuously load results into the smart data catalog. Automated data quality assessments: ensure data is fit for consumption and meets the needs of enterprise data users, and keep information about the current data quality state readily available for faster, improved decision making. Data policy governance: automate data governance end to end over the entire data lifecycle, with automation, instant transparency and control. Automate data policy assessments with glossaries, metadata and policies for sensitive data discovery that automatically tag, link and annotate with metadata to provide enterprise-wide search for all lines of business. Self-service knowledge graph: digitize and search your enterprise knowledge. Turn multiple siloed data sources into machine-understandable knowledge from a single data canvas, searching and exploring data content across systems, including ERP, CRM, billing systems and social media, to fuel data pipelines. >> We're focusing on enterprise data automation, and we're going to talk about the journey to the cloud. Remember, the hashtag is #DataAutomated, and we're here with Lester Waters, who's the CTO of Io-Tahoe. Lester, give us a little background. You've got deep expertise in a lot of different areas, but what do we need to know?
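The automated lineage capture described in the feature rundown above reduces to a small graph problem: record each captured flow as a source-to-destination edge, then answer "where did this table's data come from?" by walking upstream. The table names are made up for illustration.

```python
from collections import deque

def upstream(flows, target):
    """flows: list of (src, dst) edges -> the set of ancestors feeding target."""
    parents = {}
    for src, dst in flows:
        parents.setdefault(dst, set()).add(src)
    seen, queue = set(), deque([target])
    while queue:
        node = queue.popleft()
        for parent in parents.get(node, ()):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen

flows = [("mainframe.accounts", "staging.accounts"),
         ("staging.accounts", "warehouse.balances"),
         ("crm.customers", "warehouse.balances")]
print(sorted(upstream(flows, "warehouse.balances")))
# ['crm.customers', 'mainframe.accounts', 'staging.accounts']
```

The same structure, walked in the other direction, answers the impact question: which downstream reports break if a source system is retired.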
>> Well, David, I started my career basically at Microsoft, where I started the Information Security Cryptography group, the very first one that the company had, and that led to a career in information security. And of course, as you go along with information security, data is the key element to be protected. So I always had my hands in data, and that naturally progressed into a role at Io-Tahoe as their CTO. >> What's the prescription for that automation journey, and for simplifying that migration to the cloud? >> Well, I think the first thing is understanding what you've got. So: discover and catalog your data and your applications. If I don't know what I have, I can't move it, I can't improve it, I can't build upon it, and I have to understand the dependencies. So building that data catalog is the very first step: what have I got? >> Okay, so we've done the audit, we know what we've got. What's next? Where do we go next? >> So the next thing is remediating that data. You know, where do I have duplicate data? Oftentimes in an organization, data will get duplicated: somebody will take a snapshot of the data and then end up building a new application which suddenly becomes dependent on that data. So it's not uncommon for an organization to have 20 master instances of a customer, and you can see where that will go; trying to keep all that stuff in sync becomes a nightmare all by itself. So you want to understand where all your redundant data is, so that when you go to the cloud, maybe you have an opportunity to consolidate that data. >> Then what? You figure out what to get rid of, or actually get rid of it? What's next? >> Yes, that would be the next step. So figure out what you need and what you don't need. Oftentimes I've found that there are obsolete columns of data in your databases that you just don't need, or maybe a column has been superseded by another.
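Lester's "20 master instances of a customer" problem can be shown in miniature: group records from different systems by a normalized natural key and report every key that appears in more than one place. The normalization rule and record shape here are illustrative assumptions.

```python
def normalize(name, email):
    """Crude natural key: trimmed, lowercased name and email."""
    return (name.strip().lower(), email.strip().lower())

def find_master_duplicates(records):
    """records: list of (system, name, email) -> key -> systems holding it."""
    index = {}
    for system, name, email in records:
        index.setdefault(normalize(name, email), []).append(system)
    return {key: systems for key, systems in index.items() if len(systems) > 1}

dupes = find_master_duplicates([
    ("crm", "Ann Lee", "ann@example.com"),
    ("billing", "ann lee ", "ANN@example.com"),
    ("crm", "Bob Roy", "bob@example.com"),
])
print(dupes)  # {('ann lee', 'ann@example.com'): ['crm', 'billing']}
```

Each surviving entry is a consolidation candidate: one customer, several systems claiming to be the master, which is precisely the sync nightmare the answer describes.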
You've got tables that have been superseded by other tables in your database, so you've got to understand what's being used and what's not. And then from that you can decide: I'm going to leave this stuff behind, or I'm going to archive this stuff because I might need it for data retention, or I'm just going to delete it because I don't need it at all. >> We're plowing through your steps here. What's next on the journey? >> The next one, in a nutshell: preserve your data format. Don't boil the ocean here, to use a cliché. You want to do a certain degree of lift and shift, because you've got application dependencies on that data and on the data format: the tables, the columns and the way they're named. So to some degree you are going to be doing a lift and shift, but it's an intelligent lift and shift. >> The data lives in silos. So how do you deal with that problem? Is that part of the journey? >> That's a great point, because you're right, the data silos happen because this business unit is chartered with this task, another business unit has that task, and that's how you get those instantiations of the same data occurring in multiple places. So as part of your cloud migration, you really want to plan where there's an opportunity to consolidate your data, because that means there will be less to manage, less data to secure, and it will have a smaller footprint, which means reduced costs. >> Maybe you could address data quality. Where does that fit in on the journey? >> That's a very important point. First of all, you don't want to bring your legacy issues with you. As the point I made earlier, if you've got data quality issues, this is a good time to find, identify and remediate them. But that can be a laborious task, and it will take a lot of work to accomplish manually.
So the opportunity to use tools to automate that process will really help you find those outliers. >> What's next? I think I've counted six. What's the lucky seventh? >> Lucky seven: involve your business users. Really, when you think about it, your data is in silos, and part of this migration to cloud is an opportunity to break down the silos. These silos that naturally occur are the business's, and you've got to break the cultural barriers that sometimes exist between business units. So, for example, I always advise that there's an opportunity here to consolidate your sensitive data, your PII, personally identifiable information. If three different business units have the same source of truth, there's an opportunity to consolidate that into one. >> Well, great advice, Lester. Thanks so much. I mean, it's clear that the capex investments in data centers are generally not a good investment for most companies. Really appreciate it: Lester Waters, CTO of Io-Tahoe. Let's watch this short video and we'll come right back. >> Use cases. Data migration: accelerate digitization of business by providing automated data migration workflows that save time in achieving project milestones, eradicate operational risk, and minimize labor-intensive manual processes that demand costly overhead. Data quality: clean up the data swamp and re-establish trust in the data to enable data science and data analytics. Data governance: ensure that business and technology understand critical data elements and have control over the enterprise data landscape. Data analytics enablement: data discovery to enable data scientists and data analytics teams to identify the right data sets through self-service, for business demands or analytical reporting, from basic to complex. Regulatory compliance: government-mandated data privacy requirements such as GDPR, CCPA, ePR and HIPAA. And data lake management:
Identify lake contents, clean up, and manage ongoing activity. Data mapping and knowledge graph: create BKG models on business enterprise data, with automated mapping to a specific ontology, enabling semantic search across all sources in the data estate. DataOps: scale as a foundation to automate data management processes. >> Are you interested in test-driving the Io-Tahoe platform? Kick-start the benefits of data automation for your business through the Io-Tahoe Labs program: a flexible, scalable sandbox environment on the cloud of your choice, with set-up, service and support provided by Io-Tahoe. Click on the link and connect with a data engineer to learn more and see Io-Tahoe in action. >> Okay, everybody, we're back. We're talking about enterprise data automation. The hashtag is #DataAutomated, and we're going to really dig into data migrations. Data migrations are risky, they're time consuming and they're expensive. Yusef Khan is here. He's the head of partnerships and alliances at Io-Tahoe, joining again from London. Hey, good to see you, Yusef. Thanks very much. >> Thank you. >> So let's set up the problem a little bit, and then I want to get into some of the data. You said that migrations are risky, time consuming and expensive, and they're oftentimes a blocker for organizations really getting value out of data. Why is that? >> I think all migrations have to start with knowing the facts about your data. You can try to do this manually, but when you have an organization that may have been going for decades or longer, it will probably have a pretty large legacy data estate: everything from on-premise mainframes, maybe some things already in the cloud, but probably hundreds if not thousands of applications and potentially hundreds of different data stores. >> So I want to dig into this migration, and let's pull up a graphic that will talk about what a typical migration project looks like.
So what you see here is very detailed. I know it's a bit of an eye test, but let me call your attention to some of the key aspects of this, and then, Yusef, I want you to chime in. So at the top here you see that area graph: that's operational risk for a typical migration project, and you can see the timeline and the milestones. That blue bar is the time to test, so you can see the second step, data analysis, is 24 weeks, so very time consuming. Let's not dig into the stuff in the middle, the fine print, though there's some real good detail there. Go down to the bottom: that's labor intensity, and you can see high is that sort of brown, and a number of phases — data analysis, data staging, data prep, the trial, the implementation, post-implementation fixes, the transition to BAU, which is business as usual — are all high. >> The key thing is, when you don't understand your data up front, it's very difficult to scope and set up a project, because you go to business stakeholders and decision makers and you say, okay, we want to migrate these data stores, we want to put them in the cloud most often. But actually, you probably don't know how much data is there. You don't necessarily know how many applications it relates to, or the relationships between the data. You don't know the flow of the data, the direction in which the data is going between different data stores and tables. So you start from a position of pretty high risk, and to mitigate that risk you stack your project team with lots and lots of people to do the next phase, which is analysis. And so you set up a project which has a pretty high cost.
Big projects mean more people and heavier governance, obviously. And then you're in the phase where you're trying to do lots and lots of manual analysis. Manual processes, as we all know, and the work of trying to relate data that's in different data stores, relating individual tables and columns, are very time consuming and expensive. You might be hiring in resource from consultants or systems integrators externally, and you might need to buy or use third-party tools. As I said earlier, the people who understand some of those systems may have left a while ago. So you have a pretty high-risk, high-cost situation from the off, and the same issues persist through the project. What we're doing with Io-Tahoe is automating a lot of this process from the very beginning, because we can do the initial data discovery run, for example, automatically. You very quickly have automated validation; a data map and the data flows have been generated automatically, with much less time and effort and much less cost. >> Yeah. And now let's bring up the same chart, but with automation injected in here. So you now see the cycle accelerated by Io-Tahoe. And we're going to talk about this, but look what happens to the operational risk: a dramatic reduction in that graph. And then look at the bars, those blue bars: data analysis went from 24 weeks down to four weeks. And then look at the labor intensity: data analysis, data staging, data prep, trialling, post-implementation fixes and the transition to BAU all went from high labor intensity to low labor intensity. Explain how that magic happened. >> Take the example of a data catalog. Every large enterprise wants to have some kind of repository where they put all their understanding about their data: an enterprise data catalog.
If you like, imagine trying to do that manually. You'd need a DBA and a business analyst for each data store; they'd need to do an extract of the data and look at the tables individually, and they'd need to cross-reference that with other data stores and schemas and tables. You'd probably end up with the mother of all Excel spreadsheets. It would be a very, very difficult exercise to do. In fact, one of our reflections as we automate lots of these things is that it accelerates the ability to automate, but in some cases it also makes migration possible at all for enterprise customers with legacy systems. Take banks, for example: they quite often end up staying on mainframe systems that they've had in place for decades, not migrating away from them, because they're not able to actually do the work of understanding the data, de-duplicating the data, deleting data that isn't relevant, and then confidently going forward to migrate. So they stay where they are, with all the attendant problems of systems that are out of support. You know, the biggest frustration for lots of data scientists, and the thing they spend far too much time doing, is trying to work out what the right data is and cleaning data, which really you don't want highly paid data scientists doing with their time. But if you sort out your data in the first place, get rid of duplication, and migrate to a cloud store where things are really accessible, it's easy to build connections and to use native machine learning tools, and you're well on the way up the maturity curve; you can start to use some of the more advanced applications. >> Massive opportunities, not only for technology companies but for those organizations that can apply technology for business advantage. Yusef Khan, thanks so much for coming on theCUBE. Much appreciated.
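The manual-versus-automated cataloging contrast described here can be illustrated with a toy sketch that introspects a single SQLite database's own schema tables. Real discovery tooling spans mainframes and hundreds of store types, so treat this as a minimal, assumption-laden illustration rather than anyone's actual implementation:

```python
# Toy sketch of automated cataloging: introspect a database's own schema
# tables instead of having a DBA document each store by hand. Real tooling
# spans many store types; this only walks a single SQLite database.
import sqlite3

def catalog_sqlite(conn):
    """Return {table_name: [column_name, ...]} for every user table."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [col[1] for col in cols]  # index 1 holds the column name
    return catalog

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("CREATE TABLE invoices (id INTEGER, customer_id INTEGER)")
print(catalog_sqlite(conn))
# {'customers': ['id', 'email'], 'invoices': ['id', 'customer_id']}
```

Pointing the same loop at each store and merging the results is the crude equivalent of the "mother of all Excel spreadsheets," built in seconds rather than weeks of DBA time.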
Team D3c0ders, Albania | Technovation World Pitch Summit 2019
>> Announcer: From Santa Clara, California, it's theCUBE, covering Technovation World Pitch Summit 2019, brought to you by Silicon Angle Media. Now here's Sonia Tagare. >> Hi and welcome to theCUBE. I'm your host, Sonia Tagare, and we're here at Oracle's Agnews campus in Santa Clara, California, covering Technovation's World Pitch Summit 2019, a pitch competition in which girls from around the world create mobile apps in order to create positive change in the world. With us today we have Team D3c0ders from Albania. Welcome. >> Thank you. >> The members are Dea Rrozhani, Arla Hoxha and Jonada Shukarasi. Welcome, and congratulations on being finalists. >> Thank you. >> So your app is called GjejZâ. Tell me more about that. >> Okay, so this name is Albanian, and it actually means "find your voice," which is also our motto. GjejZâ is focused on helping women who suffer from domestic and gender-based violence. It has all these features that are based on our three main pillars: helping the user identify the problem, empowering them, and then enabling them to take action. >> That's amazing. And I know in domestic abuse cases, sometimes just identifying the problem is the hardest part, so that's awesome that that's the first part in your app. So can you tell us more about how someone would use the app? >> Yeah. So on the first run after installation, they would face this entrance quiz. At the end, it gives you an evaluation based on these five questions about gender-based violence, but it's more about self-reflection, serving as an early warning mechanism for people questioning their whole perception of gender-based violence. After that, they come to the main menu, which has the 30-day program, which has myths about violence that you can answer; if you get one wrong, it will give you the anti-myth, plus mindfulness exercises and success stories of other women in similar situations.
Besides the program, we have an information menu that has contacts: to municipality coordinators, to nonprofit organizations. It has some basic information about gender-based violence, and it has legislation updates on laws that women can benefit from, and some other additional information. But one of the main points of our app is connecting scattered resources in our country. We have all these NGOs and all these institutions that are designed to help women, but most women do not know that they exist. So when they want to separate from an abusive husband and want to report violence, they don't know where to turn. Serving that, we have the SOS menu, which has the emergency hotlines, because in Albania we have separate hotlines for different situations, not like here in America, where you have 911. We have different numbers, and they change from time to time, and it's really important to have them all in one place when you need them most. You can also connect directly to psychologists, lawyers, doctors and shelters that help women who suffer from domestic violence. >> That's amazing. It just sounds like such a great app. >> And one more thing, which is really important, because this feature that I'm about to mention is for all women: the opportunities menu. We have collaborated with local businesses, and they have agreed to furnish the app with job notices, workshop notices and coupons that only the users of the app can respond to, so they can benefit from that. But the thing is, when a user, even though they do not suffer from domestic violence, enters the app for the opportunities menu, they also go through the entrance questionnaire. So that's when all the screening for violence starts. >> And do you find that domestic violence is a huge problem in your community? Or how did you come up with this idea?
>> Yes, it's actually a really huge problem in Albania. We have grown up seeing all these headlines: the moment we turned on the TV, there would be a headline that would say "Husband kills wife," and it would be for the most absurd reasons. It has all these deep cultural roots, and it's really horrible. We would see early signs of it among our peers, of course, and we would see how that would soon develop into what we see today in the news, and we see it's not getting any better. So we decided we wanted to do something about it. >> That's amazing. And I hope you take the app worldwide and globally. I'm sure it will help a bunch of other people in the world as well. >> Oh, thank you so much. >> That is all the time we have for today. Thank you for being on theCUBE, and good luck tonight. >> Thank you. >> I'm your host, Sonia Tagare. Thank you for watching theCUBE's coverage of Technovation World Pitch 2019. Until next time.
Influencer Panel | IBM CDO Summit 2019
>> Live from San Francisco, California, it's theCUBE covering the IBM Chief Data Officers Summit, brought to you by IBM. >> Welcome back to San Francisco everybody. I'm Dave Vellante and you're watching theCUBE, the leader in live tech coverage. This is the end-of-day panel at the IBM Chief Data Officer Summit. This is the 10th CDO event that IBM has held, and we love to gather these panels. This is a data all-star panel, and I've recruited Seth Dobrin, who is the CDO of the analytics group at IBM. Seth, thank you for agreeing to chip in and be my co-host in this segment. >> Yeah, thanks Dave. Like I said before we started, I don't know if this is a promotion or a demotion. (Dave laughing) >> We'll let you know after the segment. So, the data all-star panel and the data all-star awards that you guys are giving out a little later in the event here, what's that all about? >> Yeah, so this is our 10th CDO Summit. Two a year, so we've been doing this for five years. The data all-stars are those people that have been to at least four of the ten. And these are five of the 16 people that got the award. So thank you all for participating. I attended these before I joined IBM and, like I said earlier, they were immensely valuable to me, and I was glad to see 16 other people who think they're valuable too. >> That is awesome. Thank you guys for coming on. So, here's the format. I'm going to introduce each of you individually and then ask you to talk about your role in your organization: what role you play, how you're using data, however you want to frame that. And the first question I want to ask is, what's a good day in the life of a data person? Or if you want to answer what's a bad day, that's fine too, you choose. So let's start with Lucia Mendoza-Ronquillo. Welcome, she's the Senior Vice President and the Head of BI and Data Governance at Wells Fargo. You told us that you work within the line of business group, right?
So introduce your role and what's a good day for a data person? >> Okay, so my role, again, is basically business intelligence, so I support what's called cards and retail services within Wells Fargo. And I'm also responsible for data governance within the business. We roll up into what's called enterprise data governance, so we comply with all the enterprise policies, and my role is to make sure our line of business complies with data governance policies for the enterprise. >> Okay, good day? What's a good day for you? >> A good day for me is really when I don't get a call that the regulators are knocking on our doors (group laughs) asking for additional reports or with questions on the data. That would be a good day. >> Yeah, especially in your business. Okay, great. Parag Shrivastava is the Director of Data Architecture at McKesson, welcome. Thanks so much for coming on. So we've got a couple of healthcare examples here. But, Parag, introduce yourself, your role, and then what's a good day, or if you want to choose a bad day, it'd be fun to mix that up. >> Yeah, sounds good. So mainly I'm responsible for the data strategy and architecture at McKesson. What that means is McKesson has a lot of data around the pharmaceutical supply chain — around one-third of the world's pharmaceutical supply chain — clinical data, and also pharmacy automation data, and we want to leverage it for the better engagement of the patients and better engagement of our customers. And my team, which includes the data product owners and data architects, is responsible for looking at the data holistically and creating the data foundation layer. I lead the team across North America. So that's my current role. And going back to the question around what's a good day, I'll start with the good day: it's really when the data improves the business.
And the first thing that comes to my mind is an example: McKesson did an acquisition of an eight-billion-dollar pharmaceutical company in Europe, and we were creating the synergy solution, which was based around analytics and data. And actually, IBM was one of the partners in implementing that solution. When the solution got implemented, it was a big deal for me to see that all the effort we put into plumbing the data and doing the analytics was really helping improve the business. I think that is really a good day, I would say. I wouldn't say there's a bad day as such. There are challenges, constant challenges, but I think one of our top priorities right now is to deal with the demand. As we look at the demand around the data, the role of data has multiple facets to it now. For example, there are the very foundational, evidentiary and compliance types of needs, as you just talked about, and then also profitability and cost avoidance and those kinds of aspects. So how to balance that demand is the other aspect. >> All right, good. And we'll get into a lot of that. So Carl Gold is the Chief Data Scientist at Zuora. Carl, tell us a little bit about Zuora. People might not be as familiar with how you guys do software for billing et cetera. Tell us about your role, and what's a good day for a data scientist? >> Okay, sure, I'll start with a little bit about Zuora. Zuora is a subscription management platform. So for any company that wants to offer a product or service as a subscription and doesn't want to build its billing, subscription management and revenue recognition from scratch, you can use a product like ours. I say it lets anyone build a telco, with complicated plans, with tiers and stuff like that. I don't know if that's a good thing or not; you guys will have to make up your own minds. My role is an interesting one.
It's split. As I said, I'm a chief data scientist, and we work about 50% on product features based on data science. Things like churn prediction or predictive payment retries are product areas where we offer AI-based solutions. And because Zuora is a subscription platform, we have an amazing set of data on the actual performance of companies using our product. So a really interesting part of my role has been leading what we call the Subscription Economy Index and subscription economy benchmarks, which are reports around best practices for subscription companies, all based off this amazing dataset created from anonymized data of our customers. So that's a really exciting part of my role. And for me — maybe this speaks to our level of data governance; I might be able to get some tips from some of my co-panelists — a good day is when all the data for me and everyone on my team is where we left it the night before. No schema changes, no records that you were depending on finding removed. >> Pipeline failures. >> Yeah, pipeline failures. And a bad day is a schema change: some crucial data just went missing, and someone on my team is like, "The code's broken." >> And everybody's stressed. >> Yeah, so those are bad days. But data governance issues, maybe. >> Great, okay, thank you. Jung Park is the COO of Latitude Food Allergy Care. Jung, welcome. >> Yeah, hi, thanks for having me and the rest of us here. So I guess I like to put it as: I'm really the support team. I'm part of the support team for the medical practice. Latitude Food Allergy Care is a specialty practice that treats patients with food allergies. I don't know if any of you guys have food allergies, or maybe have friends or kids who have food allergies, but food allergies unfortunately have become a lot more prevalent.
And what we've been able to do is take research and data from clinical trials and other research institutions and really bring that from the clinical trial setting back to the clinical care model, so that we can now treat patients who have food allergies by using a process called oral immunotherapy. It's fascinating, and this is really personal to me, because my son has food allergies and he's been to the ER four times. >> Wow. >> And one of the scariest events was when he went to an ER out of the country. As a parent, you prepare your child, right? With the food — he takes the food. He was 13 years old, you had the chaperones, everyone all set up, but you get this call because accidentally he ate some peanut. And so I saw this unfold, and it scared me so much that this is something I believe we just have to get people treated for. So this process allows people to eat a little bit of the food at a time: you eat the food at the clinic, then you go home and eat it, then you come back two weeks later and eat a little bit more, until your body desensitizes. >> So you build up that immunity. >> Exactly. >> And then you watch the data, obviously. >> Yeah. So what's a good day for me? When our patients are done for the day and they have a smile on their face, because they were able to progress to that next level. >> Now do you have a chief data officer, or are you the de facto CDO? >> I'm the de facto. So my career has been pretty varied: I've been essentially chief data officer, CIO, at companies small and big. And what's unique, I guess, about this role is that I'm able to really think about the data holistically through every component of the practice.
So I like to think of it as a patient journey, and I'm sure you guys all think of it similarly when you talk about your customers, but from a patient's perspective, before they even come in, you have to make sure the data behind the science of whatever you're treating is proper, right? Once that's there, then you have to have the acquisition part. How do you actually work with the community to make sure people are aware of really the services that you're providing? And when they're with you, how do you engage them? How do you make sure that they are compliant with the process? So in healthcare especially, oftentimes patients don't actually succeed all the way through, because they don't continue all the way through. So it's that compliance. And then finally, it's really long-term care. And when you get the long-term care, you know that the patient that you've treated is able to really continue on six months, a year from now, and be able to eat the food. >> Great, thank you for that description. Awesome mission. Rolland Ho is the Vice President of Data and Analytics at Clover Health. Tell us a little bit about Clover Health and then your role. >> Yeah, sure. So Clover is a startup Medicare Advantage plan. So we provide Medicare, private Medicare, to seniors. And because of the way we run our health plan, we're able to really lower a lot of the copay costs and protect seniors against out of pocket. If you're on regular Medicare, you get cancer, you have some horrible accident, your out of pocket is potentially infinite. Whereas with a Medicare Advantage plan, it's limited to like five, $6,000 and you're always protected. One of the things I'm excited about being at Clover is our ability to really look at how we can bring the value of data analytics to healthcare. I've been in this industry for close to 20 years at this point, and there's a lot of waste in healthcare. 
And there's also a lot of very poor application of preventive measures to the right populations. So one of the things that I'm excited about is that with today's models, if you're able to better identify, with precision, the right patients to intervene with, then you fundamentally transform the economics of what can be done. Like if you had to pay $1,000 to intervene, but you were only right 20% of the time, that's very expensive for each success. But now, if your model is 60, 70% right, then it opens up a whole new world of what you can do. And that's what excites me. In terms of my best day? I'll give you two different angles. One, as an MBA, one of my best days was, client calls me up, says, "Hey Rolland, you know, your analytics brought us over $100 million in new revenue last year." And I was like, cha-ching! Excellent! >> Which is my half? >> Yeah, right. And then on the data geek side, the best day was really, you run a model, you train a model, you get a ridiculous AUC score, so area under the curve, and then you expect that to just disintegrate as you go into validation testing and actual live production. But the .98 AUC score held up through production. And it's like, holy cow, the model actually works! And literally we could cut out half of the workload because of how good that model was. >> Great, excellent, thank you. Seth, anything you'd add to the good day, bad day, as a CDO? >> So for me, well as a CDO, or as CDO at IBM? 'Cause at IBM I spend most of my time traveling. So a good day is a day I'm home. >> Yeah, when you're not in an (group laughing) aluminum tube. >> Yeah. Hurtling through space (laughs). No, but a good day is when a GDPR compliance just happened. A good day for me was May 20th of last year, when IBM was done, or as done as we needed to be, for GDPR, so that was a good day for me last year. 
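Rolland's intervention-economics point can be made concrete with a little arithmetic. The sketch below is illustrative, not from the panel: it just divides the cost of one intervention by the model's precision to get the expected cost per successful intervention.

```python
# Illustrative sketch of Rolland's point: the expected cost of one
# *successful* intervention is the per-intervention cost divided by
# the model's precision (the fraction of interventions that were right).
def cost_per_success(cost_per_intervention: float, precision: float) -> float:
    """Expected spend to land one true-positive intervention."""
    return cost_per_intervention / precision

low_precision = cost_per_success(1000, 0.20)   # $5,000 per success at 20% precision
high_precision = cost_per_success(1000, 0.65)  # ~$1,538 per success at ~65% precision
```

At the precision Rolland describes, the cost per success drops by more than a factor of three, which is why a better model "opens up a whole new world" of interventions that were previously uneconomic.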
This year, really, a good day is when we start implementing some new models to help IBM become a more effective company and increase our bottom line or increase our margins. >> Great, all right, so I've got a lot of questions, as you know, and so I want to give you a chance to jump in. >> All right. >> But I can get it started, or have you got something? >> I'll go ahead and get started. So this is the 10th CDO Summit. So five years. I know personally I've had three jobs at two different companies. So over the course of the last five years, how many jobs, how many companies? Lucia? >> One job with one company. >> Oh my gosh, you're boring. (group laughing) >> No, but actually, because I support basically the head of the business, we go into various areas. So, we're not just from an analytics perspective and business intelligence perspective, and of course data governance, right? It's been a real journey. I mean, there's a lot of work to be done. A lot of work has been accomplished, and constantly improving the business, which is the first goal, right? Increasing market share through insights and business intelligence, tracking product performance, to really helping us respond to regulators (laughs). So it's a variety of areas I've had to be involved in. >> So one company, 50 jobs. >> Exactly. So right now I wear different hats depending on the day. So that's really what's happening. >> So it's a good question, have you guys been jumping around? >> Sure, I mean I think of same company, one company, but two jobs. And I think those two jobs have two different layers. When I started at McKesson I was a solution leader, or solution director, for business intelligence, and I think that's how I started. And over the five years I've seen the complete shift towards machine learning, and my new role is actually focused around machine learning and AI. 
That's why we created this layer, so our own data product owners who understand the data science side of things and the ongoing business architecture. So, same company, but I've seen a very different shift of data over the last five years. >> Anybody else? >> Sure, I'll say two companies. I'm going on four years at Zuora. I was at a different company for a year before that, although it was kind of the same job, first at the first company, and then at Zuora I was really focused on subscriber analytics and churn for my first couple of years. And then actually I kind of got a new job at Zuora by becoming the subscription economy expert. I've become like an economist, even though I honestly don't have the background. My PhD's in biology, but now I'm a subscription economy guru. And a book author, I'm writing a book about my experiences in the area. >> Awesome. That's great. >> All right, I'll give a bit of a riddle. Four, how do you have four jobs, five companies? >> In five years. >> In five years. (group laughing) >> Through a series of acquisition, acquisition, acquisition, acquisition. Exactly, so yeah, I have to really, really count on that one (laughs). >> I've been with three companies over the past five years, and I would say I've had seven jobs. But what's interesting is I think it kind of mirrors and kind of mimics what's been going on in the data world. So I started my career in data analytics and business intelligence. But then along with that I had the fortune to work with the IT team. So the IT came under me. And then after that, the opportunity came about in which I was presented to work with compliance. So I became a compliance officer. So in healthcare, it's very interesting, because these things are tied together. When you look at the data, and then the IT, and then the regulations as they relate to healthcare, you have to have the proper compliance, both internal compliance, as well as external regulatory compliance. 
And then from there I became CIO, and then ultimately the chief operating officer. But what's interesting is, as I go through this, it's all still the same common themes. It's how do you use the data? And if anything, it just gets to a level in which you become closer with the business, and that is the most important part. If you stand alone as a data scientist, or a data analyst, or the data officer, and you don't incorporate the business, you alienate the folks. There's a math I like to do. It's different from your basic math, right? I believe one plus one is equal to three, because when you get the data and the business together, you create that synergy, and then that's where the value is created. >> Yeah, I mean if you think about it, data's the only commodity that increases value when you use it correctly. >> Yeah. >> Yeah, so then that kind of leads to a question that I had. There's this mantra, the more data the better. Or is it more of an Einstein derivative? Collect as much data as possible, but not too much. What are your thoughts? Is more data better? >> I'll take it. So, I would say the curve has shifted over the years. Before, it used to be that data was the bottleneck. But now, especially over the last five to 10 years, I feel like data is no longer oftentimes the bottleneck as much as the use case. The definition of what exactly we're going to apply it to, how we're going to apply it. Oftentimes once you have that clear, you can go get the data. And then in the case where there is no data, like with Mechanical Turk, you can always set up experiments, gather data; the cost of that is now so cheap to experiment that I think the bottleneck's really around the business understanding the use case. >> Mm-hmm. >> Mm-hmm. >> And I think the wave that we are seeing, I'm seeing this as, in some cases more data is good, and in some cases more data is not good. And I think I'll start where it is not good. 
I think where quality is more required is the area where more data is not good. For example, regulation and compliance. So for example in McKesson's case, we have to report on opioid compliance for different states. How much opioid drugs we are giving to states, and making sure we have very, very tight reporting and compliance regulations. There, the highest quality of data is important. In our data organization, we have a very, very dedicated focus around maintaining that quality. So, quality is most important, quantity is not, if you will, in that case. Having the right data. Now on the other side of things, where we are doing some kind of exploratory analysis. Like what could be the right category management for our stores? Or where the product pricing could be the right one. A product has around 140 attributes. We would like to look at all of them and see what patterns we are finding in our models. So there you could say more data is good. >> Well you could definitely see a lot of cases. But certainly in financial services and a lot of healthcare, particularly in pharmaceutical, where you don't want work in process hanging around. >> Yeah. >> Some lawyer could find a smoking gun and say, "Ooh, see." And then if that data doesn't get deleted. So, let's see, I would imagine it's a challenge in your business. I've heard people say, "Oh, now we can keep all the data, it's so inexpensive to store." But that's not necessarily such a good thing, is it? >> Well, we're required to store data. >> For N number of years, right? >> Yeah, N number of years. But sometimes they go beyond that number of years, when there's a legal requirement to comply or to answer questions. So we do keep more than, >> Like a legal hold, for example. >> Yeah. So we keep more than seven years, for example, and seven years is the regulatory requirement. But in the case of more data, I'm a data junkie, so I like more data (laughs). Whenever I'm asked, "Is the data available?" 
I always say, "Give me time, I'll find it for you." So that's really how we operate, because again, we're the go-to team. We need to be able to respond to regulators, to the business, and make sure we understand the data. So that's the other key. I mean, more data, but make sure you understand what that means. >> But has that perspective changed? Maybe go back 10 years, maybe 15 years ago, when you didn't have the tooling to be able to say, "Give me more data." "I'll get you the answer." Maybe, "Give me more data." "I'll get you the answer in three years." Whereas today, you're able to, >> I'm going to go get it off the backup tapes (laughs). >> (laughs) Yeah, right, exactly. (group laughing) >> Fortunately for us, Wells Fargo has had a data warehouse implemented for many years, I think more than 10 years. So we do have that capability. There are certainly a lot of platforms you have to navigate through, but if you are able to navigate, you can get to the data >> Yeah. >> within the required timeline. >> So it helps that you have the technology team behind you. Jung, you want to add something? >> Yeah, so that's an interesting question. So, clearly in healthcare, there is a lot of data, and as I've kind of come closer to the business, I also realize that there's a fine line between collecting the data and actually asking our folks, our clinicians, to generate the data. Because if you are focused only on generating data, with electronic medical records systems for example, there's burnout. You don't want the clinicians to be working to make sure you capture every element, because if you do so, yes, on the back end you have all kinds of great data, but on the other side, on the business side, it may not necessarily be a productive thing. And so we have to make a fine-line judgment as to the data that's generated, and who's generating that data, and then ultimately how you end up using it. >> And I think there's a bit of a paradox here too, right? 
The geneticist in me says, "Don't ever throw anything away." >> Right. >> Right? I want to keep everything. But the most interesting insights often come from small data, which are a subset of that larger, keep-everything inclination that we as data geeks have. I think also, as we're moving into kind of the next phase of AI, when you can start really doing things like transfer learning, that small data becomes even more valuable, because you can take a model trained on one thing or a different domain and move it over to yours, to have a starting point where you don't need as much data to get the insight. So, I think in my perspective, the answer is yes. >> Yeah (laughs). >> Okay, go. >> I'll just run with that question. I think it's a little bit of both, 'cause people touched on different definitions of more data. In general, more observations can never hurt you. But more features, or more types of things associated with those observations, actually can hurt you if you bring in irrelevant stuff. So going back to Rolland's answer, the first thing that's good is a good mental model. My PhD is actually in physical science, so I think about physical science, where you actually have a theory of how the thing works and you collect data around that theory. With the approach of just, oh, let's put in 2,000 features and see what sticks, you know you're leaving yourself open to all kinds of problems. >> That's why data science is not democratized, >> Yeah (laughing). >> because (laughing). >> Right, but first Carl, in your world, you don't have to guess anymore, right, 'cause you have real data. >> Well yeah, of course, we have real data, but the collection, I mean for example, I've worked on a lot of customer churn problems. It's very easy to predict customer churn if you capture data that pertains to the value customers are receiving. 
If you don't capture that data, then you'll never predict churn by counting how many times they login or more crude measures of engagement. >> Right. >> All right guys, we got to go. The keynotes are spilling out. Seth thank you so much. >> That's it? >> Folks, thank you. I know, I'd love to carry on, right? >> Yeah. >> It goes fast. >> Great. >> Yeah. >> Guys, great, great content. >> Yeah, thanks. And congratulations on participating and being data all-stars. >> We'd love to do this again sometime. All right and thank you for watching everybody, it's a wrap from IBM CDOs, Dave Vellante from theCUBE. We'll see you next time. (light music)
Jay Chitnis, Nutanix & Michael Cade, Veeam | Nutanix .NEXT EU 2018
>> Live from London, England, it's theCUBE, covering .NEXT Conference Europe 2018. Brought to you by Nutanix. >> Welcome back to London, England. I'm Stu Miniman with my cohost, Joep Piscaer, and we're going to dig into one of the partnerships that Nutanix has. Joining me, two CUBE alums, Michael Cade, who's a technologist with Veeam. Had you on the program last year in Nice, and welcome back a little closer to home for you, here in London. >> Yeah, cheers Stu. Hidey-ho. >> And welcome, six months with Nutanix, someone I've known, a CUBE alum. So, wherever you go, you know, there are CUBE alumni always. So Jay Chitnis, who's the head of Global Strategic Alliances with Nutanix. Jay, thanks for joining us. >> Stu, thanks for having me. It's great to be here, guys. >> Alright, first of all, you know, Michael, what does it mean having the show here in London, and I would love your opinion on how Nutanix is doing with European adoption. >> Yeah, so, obviously being in London means I don't have to go on a plane and travel anywhere, right? So, that's one benefit, but one thing, I was there last year, obviously, we spoke. I think one of the things I can see here is how many people are here. Feels like it's doubled in numbers, doubled in size. Doubled the conversations, obviously with us, with our product coming out in July/August of this year. Only a version one, but we're seeing good feedback, good strong feedback, and lots of questions around that. >> Yeah, absolutely, 3500 is the number I heard here. Jay, we're going to talk about the work with Veeam, so set the stage for us: data protection, what's going on, Nutanix's positioning, and how you look at that. >> Yeah, it's a vibrant landscape, right? So, just to kind of pick up a little bit on the thread around the European side. We've got over 50 partners here. Over 50 technology partners and a number of channel partners. 
There's just a vibrant buzz, and one of the first things that people always talk about is we're in the nation of GDPR. If you start to think about this notion of data and where it resides, data mobility and that sort of thing, that's one of the first things that we get hit with all the time; we get asked a lot. And so, it's really core to what we do. That's where the relationship really comes in. >> I love the little commentary there about GDPR. 'Cause I remember last year, like most of last year, every show that had data protection, everything, we talked about GDPR a lot. To be honest, once we got past May, we didn't talk about it a lot. I mean, we knew it was real when there were some lawsuits, and that happened rather fast to some of the really large companies, but is this still a major conversation with customers, where are we at? >> Yeah, yeah, massively so. That sovereignty of data, where it resides, is something that, speaking to enterprise and mid-market customers over in Europe, is absolutely still top of mind: why are we keeping that data? Where are we keeping that data? How do we leverage our tool set to understand where that data is? And then actually provide some insight into where it is, and report against things like violations between different locations. And we obviously had to go through that process of becoming GDPR compliant ourselves, and obviously as a global company, you have to kind of eat your own dog food. You have to know your own data, understand what it's doing, why we're keeping it, how it's being stored, and we relay that message back into content and let our customers then use that. >> So what does that look like? Maybe from a technology perspective, if you had to deal with GDPR, from a Nutanix standpoint, from a Veeam standpoint. What does it change, right? What does it change in terms of backing up? What does it change in terms of storing it? 
In a cloud or on prem? Have you seen any major changes in how that works for customers? >> Yeah, so the good thing is that they're thinking about what that data is and where it's being stored. They know that in Germany that data may not be able to leave Germany, or that data may not be able to leave the UK or Ireland, and they might have offices in remote locations in various different countries. So, a simple thing that we put in was the ability to put tagging on repositories, on our physical constructs, so that we knew the data path and the workflow. And then be able to use Veeam ONE to report against that, so you understood where that data was going, but also flag up any of those violations, maybe where a backup job has pushed it to a different location. We need to know about that and we need to fix it as fast as possible. So that's one of the areas that we're talking >> So, I can imagine that this has not only had an impact from a technology perspective on the vendor's side, but also in the service provider market. I guess a lot of service providers have gone into that phase to be able to help customers with their GDPR issues. >> Yeah, yeah, absolutely, so we were already aligned with our VCSP program. 20,000 VCSP partners out there, and their model is as a service, so being able to provide that as a service and help them understand what that data is and know where that data is residing is key to those customers that can't necessarily put their workloads into the public cloud but can put them into a trusted VCSP service provider. >> Or a trusted, like an enterprise private cloud. Or, one of the things that we're seeing is, when you start to think about data and where it resides, it's not just the cloud. It's not a discussion of is it on prem, is it in the cloud. 
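The tagging-and-violation workflow Michael describes can be made concrete with a small sketch. This is a hypothetical illustration of the idea, not Veeam's or Veeam ONE's actual API: repositories carry a residency-region tag, and a report flags any backup job whose target sits outside the source's region.

```python
from dataclasses import dataclass

# Hypothetical model of the GDPR tagging idea described above: each
# repository is tagged with a residency region, and a report flags any
# backup job that moves data out of its source's region. All names here
# are illustrative, not Veeam's real objects or calls.
@dataclass
class Repository:
    name: str
    region: str  # e.g. "DE", "UK"

@dataclass
class BackupJob:
    source: Repository
    target: Repository

def residency_violations(jobs):
    """Return the jobs whose target repository lies outside the source's region."""
    return [j for j in jobs if j.target.region != j.source.region]

frankfurt = Repository("frankfurt-repo", "DE")
london = Repository("london-repo", "UK")
jobs = [BackupJob(frankfurt, frankfurt), BackupJob(frankfurt, london)]
flagged = residency_violations(jobs)  # the DE -> UK job gets flagged
```

In a real deployment the tags would live on the repository objects and the report would run continuously, so a mis-targeted job surfaces as soon as it is configured rather than after the data has already moved.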
There's this notion about a distributed cloud, some of this stuff that we talked about earlier this morning, around what that means when you start to think about, first of all, the amount of data that's sitting in everything other than what we would consider an enterprise cloud. That's one. The second thing is, how do you protect it? How do you back it up? What do you do with things at the edge, right? That requires a fundamentally different way of looking at things. Just the size and the volume of the data. >> Yeah, one of the key things that we're seeing is that sprawl of data. It doesn't really matter where that data resides, whether it is on premises or whether it's in the public cloud. It's the data, and that sprawl of data, that can sit on many different platforms. >> Alright, I want to pivot the conversation a little bit, let's talk about AHV. So, in the earnings announcement earlier this week, the number I heard was 38%, looking at the last four quarters trailing, so strong growth. Actually, when I asked Dheeraj about two years ago and said, "Okay, well what's the goal?" He said, "Look, we're going to keep building and do it, and customers will have choice." You know, if we get to 50%, that felt about right to him then; when I talked to him he said, "This seems right." It's not like we're going to eradicate everybody's other virtualization. That's not the goal. It's to do what makes sense. I remember one of the .NEXTs when Veeam said, "We're going to go down the path to adopt AHV." There were actually tears in the audience. So, we know that ecosystem is super important to AHV. So Jay, maybe set the table for us as to where we are with the partner ecosystem. Obviously Veeam's got some good, exciting stuff recently. But overall? >> Look, at the end of the day, the 38% number that you mentioned is critical, right? 
One of the things that we look at is, our philosophy has always been about freedom and, so, some semblance of choice. And it doesn't matter whether you have a preference for a private cloud, a public cloud, a hypervisor. What we really are focused on is, how do we enhance incremental value add, especially in the management stack, right? So it's not necessarily that we absolutely want to become a hypervisor company. That's not the goal here. When you look at our partner landscape and our partner ecosystem, it kind of fits into a few things. First and foremost, it's about customers who, when they buy Nutanix, it's because they're buying Nutanix to fit into a certain environment. Data protection, management, management and orchestration, networking and security. And then there's obviously customers who buy Nutanix for running something on top of us, right? An ISV, an enterprise ISV, big data applications, cloud native applications, and things of that nature. One of the cornerstones for that ecosystem is to support AHV, and we're starting to see a significant amount of our partners not only looking at supporting AHV but actually going further and deeper. So, we look at things in terms of the breadth of the ecosystem, which is great, we want to grow that, but we also look at the depth. And someone like Veeam, who said, "Hey look, we were partnering with you on the breadth, where we were doing some stuff around supporting ESX." But really, the game changer was AHV. AHV support, which was what, August? >> Yeah, yeah, beginning of August. I think it's the same premise as what you were just saying, Jay, so bringing that simplicity model; we don't really care what that is sitting on top of. With a management layer, we're offering this hardware up as a service, or this layer of abstraction. From a Veeam perspective, obviously, it's all about the ease of use, the reliability, but also the flexibility. 
And that's where we kind of have that synergistic approach. >> I think that's a very shared common vision, right? It's making sure that you provide a seamless experience. A one-click sort of experience. But being able to do so in a more cohesive manner. >> Michael, I want you to bring us inside. I remember back when Veeam supported Microsoft Hyper-V. It was a big deal. There's a lot of engineering work that goes into it. And with that move, Veeam was more than just a virtualization company. Today Veeam is multi-cloud, they can play in lots of environments. Give us a little insight as to what happened and what special work has been done for the interface and the technology to fully support AHV with Veeam. >> Yeah, I think, so 12 years ago, Veeam started out protecting those virtual workloads. Virtualization first, VMware first, then Hyper-V. And then the physical agents came, and really that platform started to get broadened. What then happened is the AHV adoption rate from you guys was obviously rising, so we saw that and went in, but we took a different approach in terms of, okay, just because of what we've done in a VMware and Hyper-V world doesn't necessarily mean that that will fit our Nutanix AHV customers. So we went out, we seeded the market, understood what that looked like, how it looked from both a Nutanix point of view and also from existing AHV customers. And then built the new AHV platform that we have to be able to protect them. But we still wanted to keep that agentless approach. And from a management perspective, we offer a web interface that looks very similar to the Prism interface, the management layout. So that an admin doesn't have to shift his mindset; his knowledge of working in that management interface carries over. So, version one, and again, there's a considerable amount of effort gone into it, has a pretty, pretty full-on list of features, and that's going to continue to roll out over 2019 and beyond. 
>> So looking at this from a customer's perspective, you know, back when I built an IaaS platform based on Nutanix, based on AHV, one of the things that was high on my list was AHV support. Simply because the hypervisor became a commodity. I, even as a service provider, even as an IaaS provider, I didn't really care what hypervisor I ran. And so, support from Veeam to actually be able to back up VMs on AHV, that was top priority for me. And seeing you guys use that different UI, even though it was a little bit of a shock, because you know, we've been using Veeam for maybe a decade already. We're used to it. A little bit of a culture shock to start using it, but when you do, it becomes a totally different experience because it is aligned with Nutanix. So maybe tell us why you've taken that approach of integrating with the Nutanix UI instead of staying with your old UI? >> Yeah, so exactly, so mostly around Nutanix admins and their feedback: if we could just have another tab that looks and feels exactly how their management plane looks, then that would be more of a benefit. Now, obviously we did get feedback on replication. There's still visibility of those jobs, but there's no configuration laid out; that's one of the biggest asks that we're getting in the forums, in the public forums, is when can we have exactly what you're asking for there. It's around how we can bring that central management back into VBR, because they may have Nutanix clusters running different hypervisors, and that's all supported from us, but then now we've got to go outside of that single management interface into the Prism-like management for that, so I kind of see it from that perspective. But that was really the main key for version one: get something out that's the same as what our Nutanix administrators are used to. >> So, if we're talking about the future, what's next for Veeam and Nutanix?
Real short question, short answer maybe. >> Yeah, without getting fired, but... (Jay laughs) So, version two, update one, so 1.1. That will be out in the next few, let's say, weeks, months. And that really doesn't bring any major features or changes. That's the generic bug fixes; there's a few things that needed to be ironed out in the interface but also in the process. So that will be relatively soon. Then, the good thing around the ability to develop against what we're doing with AHV is that, because it's so separate from the VBR piece, it allows us to hopefully keep that much more frequent cadence of release. So we'll be starting to see more news about version two as we get into early 2019. >> Just a last thing, wondering what you could say about adoption so far? How much pent-up demand was there? You know, I'd like to hear first from the Veeam standpoint. How many customers, if you can share anything about that? And then, Jay, what this means for AHV adoption? >> So, I don't know specific, up-to-date numbers, but I have seen the Salesforce numbers grow from an opportunity perspective, and that's specifically where Veeam availability and Nutanix AHV is included in that Salesforce opportunity. One of the things, though, that we're seeing, if you're familiar with the Veeam forums, is that that particular forum thread is growing and growing, because people are understanding that we can help shape what we do here; we want those customers that are using it on a daily basis to give us that feedback. >> Do you expect there to be new Veeam customers due to this offering? >> Yeah, yeah, absolutely. >> Yeah, I think we absolutely expect new Veeam customers. I think at the end of the day, going back to your question around AHV, having a healthy ecosystem is really what's going to drive AHV adoption. So partners like Veeam who've done that is really what is providing some choice back.
So your question around what do we expect in the next few months, quarters: what we're seeing is a lot more demand on additional functionality that customers would like to add on top. So AHV is just the beginning of the platform. It's not the end state, and what we're starting to see is a lot of customers, partners who are taking on things like, "Oh, well that's interesting, now I can do something with files, or buckets, or add on top of it where now all of a sudden I can derive even more value." So AHV is just step one, if you will, right? >> Yeah, I think that's important as well. So we've got update four coming out early next year that's going to bring the ability to leverage the Nutanix Buckets that we've heard about this week. There's also other cloud mobility, the option of being able to convert those machines and send them up into Azure or AWS to be able to run test and development up there. But that whole cloud mobility is about movement of data and making it seamless using the same tool set. One of the key differentiators is the VBK format. So those who know Veeam, they use the VBK format, and that's exactly the same format that the Nutanix AHV product uses as well. >> Yeah, yeah, absolutely. Well, congratulations. Really, as I said, this is really opening the door to start the journey as to where your customers are going. I've been hearing feedback from customers that have been waiting for this for a while, and I'm excited to see how this matures as things go forward. So, Jay, Michael, thanks so much for joining us, and stay with us, full day of coverage here at Nutanix .NEXT 2018 in London. Thanks for watching theCUBE. (electronic beat)
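The agentless, snapshot-driven flow discussed in this segment can be modeled in a short sketch. The classes and method names below are hypothetical stand-ins, not Veeam's or Nutanix's real APIs; the sketch only captures the idea: snapshot at the hypervisor layer with no in-guest agent, then write a portable backup artifact that can be restored elsewhere, which is the role a format like VBK plays.

```python
# Hypothetical sketch of an agentless, snapshot-based backup flow.
# None of these classes correspond to Veeam's or Nutanix's actual APIs;
# they only model the concept: a point-in-time snapshot taken at the
# hypervisor layer, and a portable artifact that restores anywhere.
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    disk_blocks: list  # the VM's virtual disk, as a list of block payloads

@dataclass
class Hypervisor:
    """Stand-in for a hypervisor (e.g. AHV) management API."""
    vms: dict = field(default_factory=dict)

    def snapshot(self, vm_name: str) -> list:
        # Crash-consistent, point-in-time copy taken outside the guest:
        # no agent runs inside the VM itself.
        return list(self.vms[vm_name].disk_blocks)

def backup(hypervisor: Hypervisor, vm_name: str) -> dict:
    """Produce a portable backup artifact from a hypervisor-level snapshot."""
    blocks = hypervisor.snapshot(vm_name)
    return {"vm": vm_name, "format": "portable-backup", "blocks": blocks}

def restore(artifact: dict) -> VirtualMachine:
    """Rehydrate the VM anywhere the artifact format is understood."""
    return VirtualMachine(name=artifact["vm"], disk_blocks=list(artifact["blocks"]))
```

Because backup and restore only depend on the artifact, not on the source hypervisor, the same file could in principle be rehydrated on another cluster or in a cloud, which is the portability point made about VBK above.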
Robert Walsh, ZeniMax | PentahoWorld 2017
>> Announcer: Live from Orlando, Florida it's theCUBE covering PentahoWorld 2017. Brought to you by Hitachi Vantara. (upbeat techno music) (coughs) >> Welcome to Day Two of theCUBE's live coverage of PentahoWorld, brought to you by Hitachi Vantara. I'm your host Rebecca Knight along with my co-host Dave Vellante. We're joined by Robert Walsh. He is the Technical Director, Enterprise Business Intelligence at ZeniMax. Thanks so much for coming on the show. >> Thank you, good morning. >> Good to see ya. >> I should say congratulations is in order (laughs) because your company, ZeniMax, has been awarded the Pentaho Excellence Award for the Big Data category. I want to talk about the award, but first tell us a little bit about ZeniMax. >> Sure, so most people know us by the games versus the corporate name. We make a lot of games. We're the third biggest company for gaming in America. And we make a lot of games such as Quake, Fallout, Skyrim, Doom. We have a game launching this week called Wolfenstein. And so, most people know us by the games versus the corporate entity, which is ZeniMax Media. >> Okay, okay. And as you said, you're the third largest gaming company in the country. So, tell us what you do there. >> So, myself and my team, we are primarily responsible for the ingestion and the evaluation of all the data from the organization. That includes really two main buckets. So, very simplistically, we have the business world: the traditional money, users, the demographics, people, sales. And on the other side we have the game. That's where a lot of people see the fun in what we do: what people are doing in the game, where in the game they're doing it, and why they're doing it. So, we get a lot of data on gameplay behavior based on our playerbase. And we try and fuse those two together for the single view of our customer. >> And that data comes from... is it the console? Does it come from the ... What's the data flow?
>> Yeah, so we actually support many different platforms. So, we have games on the console. So, Microsoft, Sony, PlayStation, Xbox, as well as the PC platform. Macs, for example, Android, and iOS. We support all platforms. So, the big challenge that we have is trying to unify that ingestion of data across all these different platforms in a unified way, to facilitate downstream the reporting that we do as a company. >> Okay, so who ... When, let's say, you're playing the game on a Microsoft console, whose data is that? Is it the user's data? Is it Microsoft's data? Is it ZeniMax's data? >> I see. So, many games that we release have a games-as-a-service component. Most of our games are actually an online world. So, if you disconnect today, people are still playing in that world. It never ends. So, in that situation, we have all the servers that people connect to from their desktop, from their console. Not all, but most data we generate for the game comes from the servers that people connect to. We own those. >> Dave: Oh, okay. >> Which simplifies greatly getting that data from the people. >> Dave: So, it's your data? >> Exactly. >> What is the data telling you these days? >> Oh, wow, it depends on the game. I think people realize what people do in games, what games have become. So, we have one game right now called Elder Scrolls Online, and this year we released the ability to buy in-game homes. And you can buy furniture for your in-game homes. So, you can furnish them. People can come and visit. And you can buy items, and weapons, and pets, and skins. And what's really interesting is part of the reason why we exist is to look at patterns and trends based on how people interact with that environment. So for example, we'll see the American playerbase buy very different items compared to, say, the European playerbase, based on social differences.
And so, that helps immensely for the people who continuously develop the game, to add items and features that people want to see and want to leverage. >> That is fascinating, that Americans and Europeans are buying different furniture for their online homes. So, just give us some examples of the differences that you're seeing between these two groups. >> So, it's not just the homes, it applies to everything that they purchase as well. It's quite interesting. So, when it comes to the Americans versus Europeans, for example, what we find is that Europeans prefer much more cosmetic, passive experiences. Whereas the Americans much prefer things that stand out, things that are ... I'm trying to avoid stereotypes right now. >> Right exactly. >> It is what it is. >> Americans like ostentatious stuff. >> Robert: Exactly. >> We get it. >> Europeans are a bit more passive in that regard. And so, we do see that. >> Rebecca: Understated maybe. >> Thank you, that's a much better way of putting it. But games often have to be tweaked based on the environment. A different way of looking at it: a lot of companies in Korea, in Asia, will take these games from the West and they will have to tweak the game completely before it releases in those environments. Because players will behave differently and expect different things. And these games have become global. We have people playing all over the world all at the same time. So, how do you facilitate it? How do you support these different users with different needs in this one environment? Again, that's why BI has grown substantially in the gaming industry in the past five, ten years. >> Can you talk about the evolution of how you've been able to interact and essentially affect the user behavior, or respond to that behavior? You mentioned BI. So, you know, go back ten years, it was very reactive. Not a lot of real-time stuff going on. Are you now in a position to affect the behavior in real time, in a positive way? >> We're very close to that.
We're not quite there yet. So yes, that's a very good point. So, five, ten years ago most games were traditional boxes. You make a game, you get a box at Walmart or GameStop, and then you're finished. The relationship with the customer ends. Now, we have this concept that's used often: games as a service. We provide an online environment, a service around a game, and people will play those games for weeks, months, if not years. And so, the shift, from a BI tech standpoint, is one item where we've been able to streamline the ingest process. So, we're not real time but we can be hourly. Which is pretty responsive. But also, the fact that these games have become these online environments has enabled us to get this information. Five years ago, when the game was in a box, on the shelf, there was no connective tissue between us and them to interact and facilitate. With the games now being online, we can leverage BI. We can be more real time. We can respond quicker. But it's also due to the fact that now games themselves have changed to facilitate that interaction. >> Can you, Robert, paint a picture of the data pipeline? We started there with sort of the different devices. And you're bringing those in as sort of a blender. But take us through the data pipeline and how you're ultimately embedding or operationalizing those analytics. >> Sure. So, between the game and the business information, the game data is most likely 90, 95% of our total data footprint. We generate a lot more game information than we do business information. It's just due to how much we can track. And so, a lot of these games will generate various game events, game logs that we can ingest into a single data lake. We use Amazon S3 for that. But it's not just the game data. So, we have databases for financial information, accounts, users, and so we will ingest the game events as well as the databases into one single location.
At that point, however, it's still very raw. It's still very basic. We enable the analysts to actually interact with that. And they can go in there and get their feet wet, but it's still very raw. The next step is really taking that raw information that is disjointed and separated, and unifying that into a single model that they can use in a much more performant way. In that first step, the analysts have the burden of a lot of the ETL work, to manipulate the data, to transform it, to make it useful. Which they can do. But they should be doing the analysis, not the ingesting of the data. And so, the progression from there into our warehouse is the next step of that pipeline. And so in there, we create these models and structures. And they're often born out of what the analysts are seeing and using in that initial data lake stage. So, if they're repeating analysis, if they're doing this on a regular basis, and the company wants something that's automated and auditable and productionized, then that's a great use case for promotion into our warehouse. You've got this initial staging layer. We have a warehouse where it's structured information. And we allow the analysts into both of those environments. So, they can pick their poison, in some respects. Structured data over here, raw and vast over there, based on their use case. >> And what are the roles ... Just one more follow up, >> Yeah. >> if I may? Who are the people that are actually doing this work? Building the models, cleaning the data, and storing data. You've got data scientists. You've got quality engineers. You've got data engineers. You've got application developers. Can you describe the collaboration between those roles? >> Sure. Yeah, so as a BI organization we have two main groups. We have our engineering team. That's the one I drive. Then we have reporting, and that's a team. Now, we are really one single unit. We work as a team, but we separate those two functions. And so, in my organization we have two main groups.
We have our big data team, which is doing that initial ingestion. Now, we ingest billions of rows of data a day. Terabytes of data a day. And so, we have a team just dedicated to ingestion, standardization, and exposing that first stage. Then we have our second team, who are the warehouse engineers, who are actually here today somewhere. And they're the ones who are doing the modeling, the structuring. I mean the data modeling, making the data usable and promoting that into the warehouse. On the reporting team, basically we are there to support them. We provide these tool sets to engage and let them do their work. And so, in that team there's a split of people who do a lot of report development, visualization, data science. A lot of the individuals there will do all those three, two of the three, one of the three. But they do also have segmentation across your day-to-day reporting, which has to function, as well as the deeper analysis for data science or predictive analysis. >> And that data warehouse is on-prem? Is it in the cloud? >> Good question. Everything that I talked about is all in the cloud. About a year and a half, two years ago, we made the leap into the cloud. We drank the Kool-Aid. As of Q2 next year at the very latest, we'll be 100% cloud. >> And the database infrastructure is Amazon? >> Correct. We use Amazon for all the BI platforms. >> Redshift or is it... >> Robert: Yes. >> Yeah, okay. >> That's where actually I want to go, because you were talking about the architecture. So, I know you've mentioned Amazon Redshift. Cloudera is another one of your solution providers. And of course, we're here at PentahoWorld: Pentaho. You've described Pentaho as the glue. Can you expand on that a little bit? >> Absolutely. So, I've been talking about these two environments, these two worlds, data lake and data warehouse. They're both different in how they're developed, but it's really a single pipeline, as you said.
And so, how do we get data from this raw form into this modeled structure? And that's where Pentaho comes into play. That's the glue. It's the glue between these two environments; while they're conceptually very different, they serve a singular purpose. But we need a way to unify that pipeline. And so, Pentaho we use very heavily to take this raw information, to transform it, ingest it, and model it into Redshift. And we can automate, we can schedule, we can provide error handling. And so it gives us the framework. And it's self-documenting, to be able to track and understand from A to B, from raw to structured, how we do that. And again, Pentaho is allowing us to make that transition. >> Pentaho 8.0 just came out yesterday. >> Hmm, it did? >> What are you most excited about there? Do you see any changes? We keep hearing a lot about the ability to scale with Pentaho. >> Exactly. So, there's three things that really appeal to me on 8.0. Things that were missing that they've actually filled in with this release. So firstly, on the streaming component, the real-time piece we were missing: we're looking at using Kafka and queuing for a lot of our ingestion purposes. And Pentaho is releasing in this new version the mechanism to connect to that environment. That was good timing. We need that. Also, to get into more technical detail, for the logs that we ingest, the data that we handle, we use Avro and Parquet when we can. We use JSON, Avro, and Parquet. Pentaho can handle JSON today. Avro and Parquet are coming in 8.0. And then lastly, to the point you made as well about where they're going with their system: they want to go into streaming, into all this information. It's very large and it has to go big. And so, they're adding the ability to add worker nodes and scale their environment horizontally. And that's really a requirement before these other things can come into play. So, those are the things we're looking for.
Our data lake can scale on demand. Our Redshift environment can scale on demand. Pentaho has not been able to, but with this release they should be able to. And that was something that we'd been hoping for for quite some time. >> I wonder if I can get your opinion on something. A little futures-oriented. You have a choice as an organization. You could just roll your own, best-of-breed open source tools, and slog through that. And if you're an internet giant or a huge bank, you can do that. >> Robert: Right. >> You can take tooling like Pentaho, which is an end-to-end data pipeline, and this dramatically simplifies things. A lot of the cloud guys, Amazon, Microsoft, I guess to a certain extent Google, they're sort of picking off pieces of the value chain. And they're trying to come up with an as-a-service, fully-integrated pipeline. Maybe not best of breed, but convenient. How do you see that shaking out generally? And then specifically, is that a challenge for Pentaho from your standpoint? >> So, you're right. That's why they're trying to fill these gaps in their environment. Compared to what Pentaho does and what they're offering, there's no comparison right now. They're not there yet. They're a long way away. >> Dave: You're saying the cloud guys are not there. >> No way. >> Pentaho is just so much more functional. >> Robert: They're not close. >> Okay. >> So, that's the first step. However, what I've been finding in the cloud is that there's lots of benefits from the ease of deployment, the scaling. You save a lot of DevOps support, DBA support. But the tools that they offer right now feel pretty bare bones. They're very generic. They have a place, but they're not designed for a singular purpose. Redshift is the only real piece of the pipeline that is a true Amazon product, but that came from a company called ParAccel ten years ago. They licensed that from a separate company. >> Dave: What a deal that was for Amazon! (Rebecca and Dave laugh) >> Exactly.
And so, we like it because of the functionality ParAccel put in many years ago. Now, they've developed upon that. And made it easier to deploy. But that's the core reason behind it. Now, for our big data environment, we use Databricks. Databricks is a cloud solution. They deploy into Amazon. And so, what I've been finding more and more is that companies that are specialized in an application or function, who have their product support cloud deployment, are to me the sweet middle ground. So, Pentaho is also talking about next year looking at Amazon deployment solutioning for their tool set. So, to me it's not really about going all Amazon. Oh, let's use all Amazon products. They're cheap and cheerful. We can make it work. We can hire ten engineers and hack out a solution. I think what's more applicable is people like Pentaho, people in the industry who have the expertise and are specialized in that function, who can allow their products to be deployed in that environment and leverage the Amazon advantages: the elastic compute, the storage model, the deployment methodology. That is where I see the sweet spot. So, if Pentaho can get to that point, for me that's much more appealing than looking at Amazon trying to build out some things to replace Pentaho X years down the line. >> So, their challenge, if I can summarize: they've got to stay functionally ahead. Which they are now, way ahead. They've got to maintain that lead. They have to curate best of breed, like Spark, for example, from Databricks. >> Right. >> Whatever's next, and curate that in a way that is easy to integrate. And then look at the cloud's infrastructure. >> Right. Over the years, these companies have been looking at ways to deploy into a data center easily and efficiently. Now, the cloud is the next option. How do they support and implement into the cloud in a way where we can leverage their tool set, but also leverage the cloud ecosystem? And that's the gap.
And I think that's what we look for in companies today. And Pentaho is moving towards that. >> And so, that's a lot of good advice for Pentaho? >> I think so. I hope so. Yeah. If they do that, we'll be happy. So, we'll definitely take that. >> Is it Pen-ta-ho or Pent-a-ho? >> You've been saying Pent-a-ho with your British accent! But it is Pen-ta-ho. (laughter) Thank you. >> Dave: Cheap and cheerful, I love it. >> Rebecca: I know -- >> Bless your cotton socks! >> Yes. >> I've had it-- >> Dave: Cord and Bennett. >> Rebecca: Man, okay. Well, thank you so much, Robert. It's been a lot of fun talking to you. >> You're very welcome. >> We will have more from Pen-ta-ho World (laughter) brought to you by Hitachi Vantara just after this. (upbeat techno music)
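The pipeline Walsh describes, raw game events landing in an S3 data lake and then being transformed and modeled into a Redshift warehouse, can be sketched in miniature. The event fields and schema below are illustrative assumptions rather than ZeniMax's actual model, and plain Python stands in for the transformation role Pentaho plays in his architecture:

```python
# Toy sketch of the two-zone pipeline described in the interview.
# RAW_EVENTS stands in for raw JSON game logs sitting in the S3 data
# lake; to_warehouse_row is the "glue" transform that flattens them
# into typed, warehouse-ready rows. Field names are hypothetical.
import json
from datetime import datetime, timezone

RAW_EVENTS = [  # stand-in for raw JSON event logs in the data lake
    '{"event": "purchase", "player": "p1", "item": "house", "region": "EU", "ts": 1508933000}',
    '{"event": "purchase", "player": "p2", "item": "weapon_skin", "region": "US", "ts": 1508933060}',
]

def to_warehouse_row(raw: str) -> dict:
    """Transform one raw event into a flat, typed row for the warehouse."""
    e = json.loads(raw)
    return {
        "event_type": e["event"],
        "player_id": e["player"],
        "item_sku": e["item"],
        "region": e["region"],
        # normalize the epoch timestamp into an ISO-8601 UTC string
        "event_time": datetime.fromtimestamp(e["ts"], tz=timezone.utc).isoformat(),
    }

rows = [to_warehouse_row(r) for r in RAW_EVENTS]
```

The same shape scales up: the raw zone keeps every event untouched for ad-hoc exploration by analysts, while the automated, auditable transform emits flat rows suited to a columnar warehouse like Redshift.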