
Justin Shirk and Paul Puckett | AWS Executive Summit 2022


 

>>Welcome back here on theCUBE. I'm John Walls. We are in Las Vegas at the Venetian, and this is re:Invent 22 in the Executive Summit sponsored by Accenture. Glad to have you with us here as we continue our conversations. I'm joined by Paul Puckett, who's the former director of the Enterprise Cloud Management Agency at the US Army. Paul, good to see you, sir. >>Hey, you as well, John. Thank you. >>And Justin Shirk, who is managing director and cloud go-to-market lead at Accenture Federal Services. Justin, good morning to you. >>Good morning, John. >>Yeah, glad to have you both here on theCUBE. First time too, I believe, right? >>Yes, sir. >>Well, welcome. I wish we had some kind of baptism or indoctrination, but I'll see what I can come up with in the next 10 minutes for you. Let's talk about the Army, Paul. So enterprise cloud management, US Army. You know, I can't imagine the scale we're talking about here. I can't imagine the solutions we're talking about. I can't imagine the users we're talking about. Just for our folks at home, paint the picture a little bit of what kind of landscape it is that you have to cover with that kind of title. >>Sure. The United States Army, about 1.4 million people. Obviously a global organization, responsible for protecting and defending the United States as part of our sister services in the Department of Defense. And scale often comes up a lot, right? When we talk about any capability or solution for the United States Army, scale is the number one thing, but oftentimes people overlook quality first. And actually, when you think of the partnership between the Army and Accenture Federal, we thought a lot, when it came to establishing the Enterprise Cloud Management Agency, about wanting to deliver quality first when it came to adopting cloud computing, and then scale that quality, and not so much be afraid of the scale of the Army and the size, such that it forces us to make bad decisions.
Because we wanted to make sure that we proved that there was opportunity and value in the cloud first, and then we wanted to truly scale that. And so, no doubt, an immense challenge. The organization's been around for now three years, but I think that we've established irreversible momentum when it comes to modernization, leveraging cloud computing, for the Army. >>So let's back up. You kind of threw it in there, the ECMA. So this agency was a collaboration, right? Created from the ground up, and it's three years in existence. So let's just talk about that. What went into that thinking? What went into the planning, and then how did you actually get it up and running to the extent that it is today? >>Sure. Well, it was once the Enterprise Cloud Management Office. It was a directorate within the CIO/G-6 of the United States Army. So at the headquarters of the Army, the Chief Information Officer and the G-6, which is essentially the military arm for all IT capability, were once a joint organization, and the ECMO was created to catalyze the adoption of cloud computing. The Army had actually been on a cloud adoption journey for many years, but there wasn't a lot of value that was actually derived. And so they created the ECMA. Well, the ECMO at the time brought me in as the director. And so we were responsible for establishing the new strategy for the adoption of cloud. One of the components of that strategy was essentially that we needed an opportunity to be able to buy cloud services at scale. And this was part of our buy, secure, and build model that we had in place. And so for the buy piece, we put an acquisition strategy together around how we wanted to buy cloud at scale. We called it the Cloud Account Management Optimization OTA. >>Just rolls right off the tongue. >>It just rolls right off the tongue. And for those that love acronyms, CAMO. >>Which I liked. When you say CAMO, I loved that.
That was good. >>You always have to have a little piece of that. >>Very good. It was good. >>But at the time it was Novetta. Now, Novetta's been bought up by AFS, but Novetta won that agreement. And so we've had this partnership in place now for just about a year and a half for buying cloud computing at scale. >>So let's talk about what you deal with on the federal services side here, Justin, in terms of the Army. So obviously governance, a major issue; compliance, a major issue; security, you know, of paramount importance; and all of that leads up to the quality that Paul was talking about. So when you were looking at this, and keeping all those factors in your mind, right, I mean, what kind of days did you have? Oh my God, because this was a handful. >>Well, it was, but you could see when we were responding to the acquisition that it was really, you know, forward thinking and forward leaning in terms of how they thought about cloud acquisition and cloud governance and cloud management. And it's really kind of a sleepy area, like cloud account acquisition. Everyone's like, oh, it's easy to get in the cloud, you know, run your credit card on Amazon and you're in, in 30 seconds or less. That's really not the case inside the federal government, whether it's the Army, the Air Force, or whoever, right? There are real challenges in procuring and acquiring cloud. And so it was clear from, you know, Paul's office that they understood those challenges, and we were excited to meet those challenges with them. >>And how, I guess from an institutional perspective, before this, right, I assume it was very protective, very tightly cloistered, right? In terms of being open to a more open environment, there might have been some pushback, was there not, right? So dealing with that, what did you find to be the case? >>Well, so there's kind of a few pieces to unpacking that.
There's a lot of fear and trepidation around something you don't understand, right? And so part of it is the teaching and training, and the capability and the opportunity in the cloud, and the ability to be exceptionally secure when it comes to, no doubt, the sensitivity of the information of the Department of Defense. But also, from an actual acquisition strategy perspective, more from a financial perspective, the DoD is accustomed to buying hardware. We make these big bets on these big things that live in data centers. And so when we talk about consuming cloud as a utility, there's a lot of fear there as well, because they don't really understand how to pay for something by the drink, if you will, because it incentivizes them to be more efficient with their utilization of resources. >>But when you look at the budgeting process of the DoD, there really is not that much of an incentive for efficiency. The PPBE process, the Planning, Programming, Budgeting, and Execution process, cares about execution, which is spending money, and you can spend a lot of money in the cloud, right? But how are you actually utilizing that? And so what we wanted to do is create that feedback loop, so the utilization is actually fed into our financial systems that help us then estimate into the future. And that's the capability that we partnered with AFS on: establishing the closing of that feedback loop. So now we can actually optimize our utilization of the cloud, and that's actually driving better incentives in the PPBE process. >>You know, when you think about these keywords here, modernized, digitized, data-driven, so on, so forth, I don't think a lot of people might connect that to the US government in general, just because, you know, it's a large, intentionally slow-moving bureaucratic machine, right? Is that fair to characterize it that way? >>It is, but not in this case, right? So what we've done... >>You totally juxtaposed that. >>Yeah.
So what we've done is we've really enabled data-driven decision making as it relates to cloud accounts and cloud governance. And so we have a tool called Cloud Tracker. We deployed it for the Army at a number of different classifications, and you get a full 360 view of all of your cloud utilization and cloud spend, you know, really up to date within 24 hours of it occurring, right? And a lot of folks, you know, never went into the console, never looked at what they were spending in cloud previously. And so now you just go to a simple web portal and see the entirety of the Army's cloud spend right there at your fingertips. So that really enables better decision making in terms of, like, purchasing savings plans and reserved instances and other sorts of AWS-specific tools to help you save money. >>So Paul, tell me about Cloud Tracker then, from the client side. Can you just say this dashboard lays it out for you, right? In great detail about what kind of usage, what kind of efficiencies, I assume. Yeah. What's working, what's not? >>Absolutely. Well, I think a few things to unpack that are really important here. Listen, any cloud service provider has a console where you can see what you're actually spending. But when it comes to money in the United States government, there are different colors of money. There are regulations when it comes to how money is identified for different capabilities or incentives, and you've gotta be very explicit in how you track and how you spend that money from an auditability perspective. Beyond that, there is a move when it comes to Technology Business Management (TBM), which is the actual labeling of what we actually spend money on for different services or labor or software. And what Cloud Tracker allows us to do is speak the language of the different colors of money. It allows us to also get very fine-grained in the actual analysis, from a TBM perspective, of what we're spending on.
>>But then also, it has real-time hooks into our financial systems for execution. And so what that really does for us is it allows us to complete the picture: not just be able to see our spend in the cloud, but also be able to see that spend in the context of all things in the PPBE process as well as the execution process. That then really empowers the government to make better investments. And all we're seeing is either cost avoidance or cost savings, simply because we're able to close that loop, like I said. Yep. And then we're able to redirect those funds, retag them, remove them through our actual financial office within the headquarters of the Army, and be able to repurpose that to other modernization efforts that Congress is essentially asking us to invest in. >>Right. So you know how much money you have, basically. >>Exactly. >>Right. You know how much you've already spent, you know how you're spending it, and now you know how much you have left. >>You can provide a reliable forecast for your spend. >>Right. You know, hey, we're halfway through this quarter, we're halfway through the fiscal year, whatever the case might be. >>Exactly. And the focus is on expenditures. You know, the government rates you on, you know, how much you have spent, right? So you have clear, total transparency into what you're going to spend through the rest of the fiscal year. >>Sure. All right. Let's just talk about the relationship quickly then, about going forward, in terms of federal services and then on the US Army side. I mean, now you've laid this great groundwork, right? You have a really solid foundation. Where now? What next? >>We wanna be all things cloud to the Army. I mean, we think there's tremendous opportunity to really aid the modernization efforts and governance across the holistic part of the Army. So, you know, we just want to do it all with the Army as much as we can. It's a fantastic opportunity. >>Yeah.
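The closed feedback loop and "reliable forecast" discussed above reduce to simple run-rate arithmetic: observed utilization feeds a projection of spend for the rest of the fiscal year. The sketch below is illustrative only, with made-up daily figures; it is not the actual Cloud Tracker logic, and the function name is invented for the example.

```python
# Hypothetical sketch of a spend-forecast feedback loop: daily utilization
# records feed a simple run-rate projection for the rest of the fiscal year.
# All figures are illustrative placeholders.

FISCAL_YEAR_DAYS = 365

def forecast_fiscal_spend(daily_spend, days_elapsed, fiscal_year_days=FISCAL_YEAR_DAYS):
    """Project total fiscal-year spend from observed daily spend records."""
    spent_to_date = sum(daily_spend)
    run_rate = spent_to_date / days_elapsed          # average daily burn
    projected_total = spent_to_date + run_rate * (fiscal_year_days - days_elapsed)
    return spent_to_date, projected_total

# Example: 90 days into the fiscal year at a steady $10k/day.
daily = [10_000.0] * 90
spent, projected = forecast_fiscal_spend(daily, days_elapsed=90)
print(spent)      # 900000.0
print(projected)  # 3650000.0
```

A real implementation would replace the flat run rate with per-program, per-color-of-money forecasts, but the feedback-loop shape (utilization in, projection out) is the same.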
AFS is in kind of a strategic role. So as part of the ECMA, we own the greater strategy and execution for adoption of cloud on behalf of the entire Army. Now, when it comes to delivery of individual capabilities for mission here and there, that's all specific to system owners and different organizations. AFS plays a different role in this instance, where they're able to facilitate the greater strategy on the financial side of the house. And what we've done is we've proven the ability to adopt cloud as a utility, rather than this fixed thing where you kind of predict the future, spend a whole bunch of money, and never use the resource. We're seeing the efficiency from the actual utilization of cloud as a utility. This actually came out in one of the previous NDAAs. And so how we actually addressed the NDAA, I believe it was 2018, in the adoption of cloud as a utility, really is now a cornerstone of modernization across all of the DoD, and really feeds into the Joint Warfighting Cloud Capability, a major acquisition on behalf of all of the DoD to establish buying cloud as just a common service for everyone. >>And so we've been fortunate to inform that team of some of our lessons learned. But when it comes to the partnership, we just see CAMO moving into production. We've been live for now a year and a half, and so there's another two and a half years of runway there. And then AFS also plays a strategic role as part of our cloud enablement division, which is essentially back to that teaching part: helping the Army understand the opportunity of cloud computing, align the architectures to actually leverage those resources, and then deliver capabilities that save soldiers' lives. >>Well, you know, we've always known that the Army does its best work on the ground, and you've done all this groundwork for the military, so I'm not surprised, right? It's a winning formula. Thanks to both of you for being with us here in the Executive Summit. Great conversation. >>Awesome.
Thanks for having us. >>A good deal. All right. Thank you. All right. You are watching the Executive Summit, sponsored by Accenture, here at re:Invent 22, and you're catching it all on theCUBE, the leader in high-tech coverage.

Published Date : Nov 29 2022



The Truth About MySQL HeatWave


 

>>When Oracle acquired MySQL via the Sun acquisition, nobody really thought the company would put much effort into the platform, preferring to put all the wood behind its leading Oracle database, arrow pun intended. But two years ago, Oracle surprised many folks by announcing MySQL HeatWave, a new database as a service with a massively parallel, hybrid columnar, in-memory architecture that brings together transactional and analytic data in a single platform. Welcome to our latest database power panel on theCUBE. My name is Dave Vellante, and today we're gonna discuss Oracle's MySQL HeatWave with a who's who of cloud database industry analysts. Holger Mueller is with Constellation Research. Marc Staimer is the Dragon Slayer and Wikibon contributor. And Ron Westfall is with Futurum Research. Gentlemen, welcome back to theCUBE. Always a pleasure to have you on. >>Thanks for having us. >>Great to be here. >>So we've had a number of deep dive interviews on theCUBE with Nipun Agarwal. You guys know him? He's a senior vice president of MySQL HeatWave Development at Oracle. I think you just saw him at Oracle Cloud World, and he's come on to describe what I'll call shock and awe feature additions to HeatWave. You know, the company's clearly putting R&D into the platform, and I think at Cloud World we saw the fifth major release since 2020, when they first announced MySQL HeatWave. So just listing a few: they've brought in analytics, machine learning, autopilot for machine learning, which is automation, onto the basic OLTP functionality of the database. And it's been interesting to watch Oracle's converged database strategy. We've contrasted that amongst ourselves. Love to get your thoughts on Amazon's get-the-right-tool-for-the-right-job approach.
It just shows there are a lot of ways to, to skin a cat cuz you see some traction in the market in, in both approaches. So today we're gonna focus on the latest heat wave announcements and we're gonna talk about multi-cloud with a native MySQL heat wave implementation, which is available on aws MySQL heat wave for Azure via the Oracle Microsoft interconnect. This kind of cool hybrid action that they got going. Sometimes we call it super cloud. And then we're gonna dive into my SQL Heatwave Lake house, which allows users to process and query data across MyQ databases as heatwave databases, as well as object stores. So, and then we've got, heatwave has been announced on AWS and, and, and Azure, they're available now and Lake House I believe is in beta and I think it's coming out the second half of next year. So again, all of our guests are fresh off of Oracle Cloud world in Las Vegas. So they got the latest scoop. Guys, I'm done talking. Let's get into it. Mark, maybe you could start us off, what's your opinion of my SQL Heatwaves competitive position? When you think about what AWS is doing, you know, Google is, you know, we heard Google Cloud next recently, we heard about all their data innovations. You got, obviously Azure's got a big portfolio, snowflakes doing well in the market. What's your take? >>Well, first let's look at it from the point of view that AWS is the market leader in cloud and cloud services. They own somewhere between 30 to 50% depending on who you read of the market. And then you have Azure as number two and after that it falls off. There's gcp, Google Cloud platform, which is further way down the list and then Oracle and IBM and Alibaba. So when you look at AWS and you and Azure saying, hey, these are the market leaders in the cloud, then you start looking at it and saying, if I am going to provide a service that competes with the service they have, if I can make it available in their cloud, it means that I can be more competitive. 
And if I'm compelling, and compelling means at least twice the performance or functionality, or both, at half the price, I should be able to gain market share. >>And that's what Oracle's done. They've taken a superior product in MySQL HeatWave, which is faster and lower cost, does more for a lot less at the end of the day, and they make it available to the users of those clouds. You avoid this little thing called egress fees, you avoid the issue of having to migrate from one cloud to another, and suddenly you have a very compelling offer. So I look at what Oracle's doing with MySQL, and it feels like, I'm gonna use a war term, a flanking maneuver on their competition. They're offering a better service on their platforms. >>All right, so thank you for that. Holger, we've seen this sort of cadence, I sort of referenced it up front a little bit: they sat on MySQL for a decade, then all of a sudden we see this rush of announcements. Why did it take so long? And, more importantly, is Oracle developing the right features that cloud database customers are looking for, in your view? >>Yeah, great question. But first of all, in your intro you said it's the added analytics, right? Analytics is kind of like a marketing buzzword. Reports can be analytics, right? The interesting thing, which they did: first they crossed the chasm between OLTP and OLAP, right, in the same database, right? So a major engineering feat, very much what customers want, and it's all about creating better value for customers, which I think is why they go into the multi-cloud and why they add these capabilities. And certainly with the AI capabilities, it's kind of like getting into an autonomous, self-driving field, now with the lakehouse capabilities, and meeting customers where they are. Like Mark has talked about the egress costs in the cloud.
So that's a significant advantage, creating value for customers, and that's what matters at the end of the day. >>And I believe strongly that, long term, it's gonna be the ones who create better value for customers who will get more of their money. From that perspective, why did it take them so long? I think it's a great question. I think it's largely due to who leads the product, the gentleman he mentioned, Nipun. I used to build products too, so maybe I'm fooling myself a little here, but that made the difference, in my view, right? So since he's been in charge, he's been building things faster than the rest of the competition in the MySQL space, which, in hindsight, we thought was a hot and smoking innovation space, but it was kind of a little self-complacent when it comes to the traditional borders of where people think things are separated, between OLTP and OLAP, or, as an example, JSON support, right? Structured documents versus unstructured documents, or databases, and all of that has been collapsed and brought together for building a more powerful database for customers.
But, but before I do that, Ron, I want to ask you, you, you, you can get, you know, pretty technical and you've probably seen the benchmarks. >>I know you have Oracle makes a big deal out of it, publishes its benchmarks, makes some transparent on on GI GitHub. Larry Ellison talked about this in his keynote at Cloud World. What are the benchmarks show in general? I mean, when you, when you're new to the market, you gotta have a story like Mark was saying, you gotta be two x you know, the performance at half the cost or you better be or you're not gonna get any market share. So, and, and you know, oftentimes companies don't publish market benchmarks when they're leading. They do it when they, they need to gain share. So what do you make of the benchmarks? Have their, any results that were surprising to you? Have, you know, they been challenged by the competitors. Is it just a bunch of kind of desperate bench marketing to make some noise in the market or you know, are they real? What's your view? >>Well, from my perspective, I think they have the validity. And to your point, I believe that when it comes to competitor responses, that has not really happened. Nobody has like pulled down the information that's on GitHub and said, Oh, here are our price performance results. And they counter oracles. In fact, I think part of the reason why that hasn't happened is that there's the risk if Oracle's coming out and saying, Hey, we can deliver 17 times better query performance using our capabilities versus say, Snowflake when it comes to, you know, the Lakehouse platform and Snowflake turns around and says it's actually only 15 times better during performance, that's not exactly an effective maneuver. And so I think this is really to oracle's credit and I think it's refreshing because these differentiators are significant. We're not talking, you know, like 1.2% differences. 
We're talking 17-fold differences; we're talking six-fold differences, depending on, you know, where the spotlight is being shined, and so forth. >>And so I think this is actually something that is almost too good to believe at first blush. If I'm a cloud database decision maker, I really have to prioritize this. I really would now pay a lot more attention to this. And that's why I posed the question to Oracle and others: okay, if these differentiators are so significant, why isn't the needle moving a bit more? And it's for, you know, some of the usual reasons. One is really deep discounting coming from, you know, the other players. That's really kind of, you know, marketing 101; this is something you need to do when there's a real competitive threat, to keep, you know, a customer in your own customer base. Plus there is the usual fear and uncertainty about moving from one platform to another. But I think, you know, the traction, the momentum, is shifting in Oracle's favor. I think we saw that in the Q1 results, for example, where Oracle Cloud grew 44% and generated, you know, $4.8 billion in revenue, if I recall correctly. And so all of this demonstrates that Oracle is making, I think, many of the right moves. Publishing these figures for anybody to look at from their own perspective is something that is, I think, good for the market, and I think it's just gonna continue to pay dividends for Oracle down the road as, you know, competition intensifies. So if I were in... >>Dave, can I interject something on what Ron just said there? >>Yeah, please go ahead. >>A couple things here. One, discounting, which is a common practice when you have a real threat, as Ron pointed out, isn't going to help much in this situation, simply because you can't discount to the point where you improve your performance, and the performance is a huge differentiator. You may be able to get your price down, but the problem that most of them have is they don't have an integrated product service. They don't have an integrated OLTP, OLAP, ML, and data lake. Even if you cut out two of them, they don't have any of them integrated. They have multiple services that require separate integration, and that can't be overcome with discounting. And you have to pay for each one of these. And, oh, by the way, as you grow, the discounts go away. So that's a minor but important detail.
You may be able to get your price down, but the problem that most of them have is they don't have an integrated product service. They don't have an integrated O L T P O L A P M L N data lake. Even if you cut out two of them, they don't have any of them integrated. They have multiple services that are required separate integration and that can't be overcome with discounting. And the, they, you have to pay for each one of these. And oh, by the way, as you grow, the discounts go away. So that's a, it's a minor important detail. >>So, so that's a TCO question mark, right? And I know you look at this a lot, if I had that kind of price performance advantage, I would be pounding tco, especially if I need two separate databases to do the job. That one can do, that's gonna be, the TCO numbers are gonna be off the chart or maybe down the chart, which you want. Have you looked at this and how does it compare with, you know, the big cloud guys, for example, >>I've looked at it in depth, in fact, I'm working on another TCO on this arena, but you can find it on Wiki bod in which I compared TCO for MySEQ Heat wave versus Aurora plus Redshift plus ML plus Blue. I've compared it against gcps services, Azure services, Snowflake with other services. And there's just no comparison. The, the TCO differences are huge. More importantly, thefor, the, the TCO per performance is huge. We're talking in some cases multiple orders of magnitude, but at least an order of magnitude difference. So discounting isn't gonna help you much at the end of the day, it's only going to lower your cost a little, but it doesn't improve the automation, it doesn't improve the performance, it doesn't improve the time to insight, it doesn't improve all those things that you want out of a database or multiple databases because you >>Can't discount yourself to a higher value proposition. >>So what about, I wonder ho if you could chime in on the developer angle. You, you followed that, that market. 
How do these innovations from heatwave, I think you used the term developer velocity. I've heard you used that before. Yeah, I mean, look, Oracle owns Java, okay, so it, it's, you know, most popular, you know, programming language in the world, blah, blah blah. But it does it have the, the minds and hearts of, of developers and does, where does heatwave fit into that equation? >>I think heatwave is gaining quickly mindshare on the developer side, right? It's not the traditional no sequel database which grew up, there's a traditional mistrust of oracles to developers to what was happening to open source when gets acquired. Like in the case of Oracle versus Java and where my sql, right? And, but we know it's not a good competitive strategy to, to bank on Oracle screwing up because it hasn't worked not on Java known my sequel, right? And for developers, it's, once you get to know a technology product and you can do more, it becomes kind of like a Swiss army knife and you can build more use case, you can build more powerful applications. That's super, super important because you don't have to get certified in multiple databases. You, you are fast at getting things done, you achieve fire, develop velocity, and the managers are happy because they don't have to license more things, send you to more trainings, have more risk of something not being delivered, right? >>So it's really the, we see the suite where this best of breed play happening here, which in general was happening before already with Oracle's flagship database. Whereas those Amazon as an example, right? 
And now the interesting thing is, Oracle was always a one-database company — there can be only one — and they're now effectively a two-database company, talking about HeatWave for different market spaces but with the same value proposition: integrating more things very, very quickly to have a universal database — what they call the converged database — for all the needs of an enterprise to run its application use cases. And that's what's attractive to developers. >>It's ironic, isn't it? I mean, the rumor was that TK, Thomas Kurian, left Oracle because he wanted to put the Oracle database on other clouds and other places. Maybe that was the rift; I'm sure there were other things. But Oracle clearly is now trying to expand its TAM with HeatWave into AWS, into Azure. Ron, how do you think Oracle's gonna do? You were at CloudWorld — what was the sentiment from customers and the independent analysts? Is this just Oracle trying to screw with the competition, create a little diversion? Or is this serious business for Oracle? What do you think? >>No, I think it has legs. I think it speaks, again, to Oracle's overall ability to differentiate not only MySQL HeatWave but its overall portfolio. And the fact that they have the alliance with Azure in place is definitely demonstrating their commitment to meeting the multi-cloud needs of their customers, as is the fact that they're now offering MySQL capabilities within AWS natively — and that it can outperform AWS's own offering. I think this all demonstrates that Oracle is not letting up, not resting on its laurels. Clearly we are living in a multi-cloud world, so why not just make it easier for customers to use cloud databases according to their own specific needs?
And I think, to Holger's point, that definitely aligns with being able to bring on more application developers to leverage these capabilities. >>I think one important announcement related to all this was the JSON relational duality capability, where now it's a lot easier for application developers to use a format they're very familiar with — JSON — and not have to worry about going into relational databases to store their JSON application data. So this is, I think, an example of the innovation that's enhancing the overall Oracle portfolio, and certainly all the work with machine learning is paying dividends as well. As a result, I see Oracle continuing to make the inroads we pointed to. But I agree with Mark: the short-term discounting is just a stalling tactic. It doesn't change the fact that Oracle is able not only to deliver price performance differentiators that are dramatic, but also to meet a wide range of customer needs that aren't limited to price performance considerations.
Being able to support multi-cloud according to customer needs; being able to reach out to the application developer community and address a very specific challenge that has plagued them for many years — bringing it all together, I see this as enabling Oracle's story to ring true with customers. The customers who were there — and that was basically all of them — weren't all saying the same things, but they were all basically giving positive feedback. And likewise, I think the analyst community is seeing this. It's always refreshing to talk to customers directly, and at Oracle CloudWorld there was a litany of them, so this is just a difference maker, as is being able to talk to strategic partners. The Nvidia partnership, I think, is also a testament to Oracle's ongoing ability to make the ecosystem more user friendly for the customers out there.
>>Yeah, it's interesting: when you get these all-in-one tools, the Swiss army knife, you expect them not to be best of breed. That's the kind of surprising thing I'm hearing about HeatWave. I want to talk about Lakehouse, because when I think of lakehouse, I think Databricks, and to my knowledge Databricks hasn't been in the sights of Oracle yet — maybe they're next. But Oracle claims that MySQL HeatWave Lakehouse is a breakthrough in terms of capacity and performance. Mark, what are your thoughts on that? Can you double-click on Lakehouse, on Oracle's claims for things like query performance and data loading? What does it mean for the market? Is Oracle really leading in the lakehouse competitive landscape? What are your thoughts? >>Well, the name of the game is: what are the problems you're solving for the customer? More importantly, are those problems urgent or important? If they're urgent, customers want to solve them now; if they're important, they might get around to them. So you look at what they're doing with Lakehouse — or previous to that machine learning, or previous to that automation, or previous to that OLAP with OLTP — and they're merging all this capability together. If you look at Snowflake or Databricks, they're tackling one problem. You look at MySQL HeatWave, they're tackling multiple problems. So when you say their queries are much better against the lakehouse, it's in combination with other analytics, in combination with OLTP — and the fact that there are no ETLs. You're getting all this done in real time; it's doing the query across everything in real time. >>You're solving multiple user and developer problems, you're increasing their ability to get insight faster, you're getting shorter response times. So yeah, they really are solving urgent problems for customers. And by putting it where the customer lives — this is the brilliance of actually being multi-cloud.
And I know I'm backing up here a second, but by making it work in AWS and Azure, where people already live, where they already have applications, what they're saying is: we're bringing it to you. You don't have to come to us to get these benefits, this value. Overall I think it's a brilliant strategy, and I give Nipun Agarwal huge kudos for what he's doing there. So yes, what they're doing with the lakehouse is going to put Databricks and Snowflake — and everyone else, for that matter — on notice. >>Those are the guys, Holger — you and I have talked about this — that are doing sort of the best of breed. They're really focused, and they tend to do well, at least out of the gate. Now you've got Oracle's converged philosophy — we've seen that with the Oracle database, and now it's kicking into gear with HeatWave. This whole thing of suites versus best of breed: long term, customers tend to migrate toward suites, but the new shiny toy tends to get the growth. How do you think this is gonna play out in cloud database? >>Well, it's the forever, never-ending story in software, right? Suites versus best of breed — and so far, in the long run, suites have always won. Sometimes they struggle, because the inherent problem with suites is that you build something larger, it has more complexity, and that means your cycles to get everything working together — to integrate, test, roll it out, certify, whatever it is — take you longer, right? The fascinating part of the effort around MySQL HeatWave is that that's not the case: the team is out-executing the best-of-breed players while bringing things together. Now, whether they can maintain that pace remains to be seen.
But the strategy, like what Mark was saying — bring the software to the data — is of course interesting and unique, and Oracle has done it in the past too, right? Yeah, but then it had to be your database on OCI, and that's the interesting part. The interesting thing on the lakehouse side is that there are three key benefits of a lakehouse. The first one is better reporting and analytics, bringing richer information together — take the case of SiliconANGLE, right? We want to see engagement for this video, we want to know what's happening. That's a mixed transactional and media use case — a typical lakehouse use case. The next one is building richer applications: transactional applications which have video and these elements in there, which are the engaging ones. And the third one — and that's where I'm a little critical and concerned — is that it's really the base platform for artificial intelligence, right? To run deep learning, to run things automatically, because you have all the data in one place. And that's where Oracle — I know Ron talked about Nvidia for a moment — doesn't have the strongest story. Nonetheless, the two other main use cases of the lakehouse are very strong. My only concern is the 400-terabyte limit. It sounds big, but it's an arbitrary limitation — it's a start, it's the first release, and they can make that bigger. You don't want your lakehouse to be limited in the terabyte sizes, or even petabyte sizes, because you want the certainty that you can put everything in there that you think might be relevant, without knowing what questions to ask, and then query it. >>Yeah. And you know, in the early days of no schema-on-write, it just became a mess. But now technology has evolved to allow us to actually get more value out of that data — the data lake, the data swamp — it's, you know, much more logical now.
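The "schema on read" idea in that aside can be made concrete with a tiny sketch — plain Python and JSON here, purely as an illustration, not any particular lakehouse product: heterogeneous raw records land as-is, and a schema (which fields matter, with what defaults) is imposed only when the question is asked.

```python
# Illustrative sketch of schema-on-read: heterogeneous raw records are stored
# as-is, and a schema (the fields a query cares about) is applied only at
# query time rather than enforced on ingest.
import json

# Raw "landed" records, as they might sit in a data lake: shapes differ.
raw_lines = [
    '{"user": "a", "clicks": 3}',
    '{"user": "b", "clicks": 7, "referrer": "ad"}',
    '{"user": "c"}',  # missing field; schema-on-write would reject this or force a default
]

# Schema applied at read time: pick the fields this question needs, with defaults.
records = [json.loads(line) for line in raw_lines]
total_clicks = sum(r.get("clicks", 0) for r in records)
print(total_clicks)  # 10
```

This is why the "put everything in without knowing the questions" property matters: ingest never blocks on schema decisions, and the cost of interpreting the data is paid per query instead.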
And in a moment I want to come back to how you think the competitors are gonna respond — are they gonna have to do more of a converged approach, AWS in particular? But before I do, Ron, I want to ask you a question about Autopilot, because I heard Larry Ellison's keynote, and he was talking about how most security issues are human errors, and with autonomy — the autonomous database and things like Autopilot — we take care of that. It's like autonomous vehicles: they're gonna be safer. And I went, well, maybe someday. Oracle really tries to emphasize this; every time you see an announcement from Oracle, they talk about new autonomous capabilities. How legit is it? Do people care? What's new for HeatWave Lakehouse? How much of a differentiator, Ron, do you really think Autopilot is in this cloud database space? >>Yeah, I think it will definitely enhance the overall proposition. I don't think people are gonna buy Lakehouse exclusively because of Autopilot capabilities, but when they look at the overall picture, I think it will be an added capability bonus to Oracle's benefit. And it's kind of one of these age-old questions: how much do you automate, and what is the balance to strike? I think we all understand from the autonomous car analogy that there are limitations. However, I think it's a tool that basically every organization out there needs to at least have, or at least evaluate, because it helps with ease of use, and it helps make automation more measured — being able to test: all right, let's automate this process and see if it works well, then we can go on and switch on Autopilot for other processes.
>>And that allows, for example, the specialists to spend more time on business use cases versus manual maintenance of the cloud database and so forth. So I think that actually is a legitimate value proposition. It's just gonna be a case-by-case basis: some organizations are gonna be more aggressive with putting automation throughout their processes and their organization; others are gonna be more cautious. But again, it's something that will help the overall Oracle proposition — used with caution by many organizations, while others are gonna say, hey, great, this is something that's really answering a real problem: easing the use of these databases, while also better handling the automation capabilities and benefits that come with it, without having a major screwup happen in the process of transitioning to more automated capabilities. >>Now, I didn't attend CloudWorld — just too many red-eyes recently, so I passed. But one of the things I like to do at those events is talk to customers in the spirit of the truth — you have the hallway track, you talk to customers, and they say, hey, here's the good, the bad, and the ugly. So did you guys talk to any MySQL HeatWave customers at CloudWorld, and what did you learn? Mark, did you have any luck having some private conversations? >>Yeah, I had quite a few private conversations. One thing before I get to that: I want to disagree with one point Ron made. I do believe there are customers out there buying the MySQL HeatWave service because of Autopilot.
Because Autopilot is really revolutionary in many ways for the MySQL developer: it auto-provisions, it auto-parallel-loads, it does automatic data placement, it does automatic shape prediction. It can tell you which machine learning models are going to give you your best results. And candidly, I've yet to meet a DBA who didn't want to give up the pedantic tasks that are a pain in the you-know-what — tasks they'd rather not do, as long as they're done right for them. So yes, I do think people are buying it because of Autopilot, and that's based on some of the conversations I had with customers at Oracle CloudWorld.
>>In fact, it was like: yeah, that's great, we get fantastic performance, but this really makes my life easier — and I've yet to meet a DBA who didn't want to make their life easier. And it does. So yeah, I've talked to a few of them. They were excited. I asked them if they ran into any bugs, whether there were any difficulties in moving to it, and the answer was no in both cases. It's interesting to note that MySQL is the most popular database on the planet. Well, some will argue it's neck and neck with SQL Server, but if you add in MariaDB and Percona, which are forks of MySQL, then yeah, it's by far and away the most popular. And as a result, just about everybody has a MySQL database somewhere in their organization. So this is a brilliant situation for anybody going after MySQL, but especially for HeatWave. And the customers I talked to love it; I didn't find anybody complaining about it. >>What about the migration? We talked about TCO earlier. Does your TCO analysis include the migration cost, or do you kind of conveniently leave that out, or what? >>Well, when you look at migration costs, there are different kinds. By the way, the worst job in the data center is the data migration manager. Forget it, no other job is as bad as that one. You get no attaboys for doing it right.
And when you screw up, oh boy. So in real terms, anything that can limit data migration is a good thing, and when you look at HeatWave, it limits data migration. If you're already a MySQL user, this is pure MySQL as far as you're concerned — it's just a simple transition from one to the other. You may want to make sure nothing broke, that all your tables are correct and your schemas are okay, but it's all the same. So it's a simple migration, pretty much a non-event, right? Whereas when you migrate data from an OLTP to an OLAP system, that's an ETL, and that's gonna take time.
>>But you don't have to do that with MySQL HeatWave, so that's gone. When you start talking about machine learning, again, you may have an ETL or you may not, depending on the circumstances — but with MySQL HeatWave, you don't. And you don't have duplicate storage: you don't have to copy data from one storage container to another to use it in a different database, which, by the way, ultimately adds much more cost than just the other service. So yeah, I looked at the migration, and again, the users I talked to said it was a non-event. It was literally moving from one physical machine to another. If they had a version of MySQL running on something else and just wanted to migrate it over, or just hook it up, or just connect it to the data, it worked just fine.
>>Okay. So it sounds like you guys feel — and we've certainly heard this; my colleague David Floyer, the semi-retired David Floyer, was always very high on HeatWave — that it's got some real legitimacy here, coming from a standing start. But I want to talk about the competition and how they're likely to respond. I mean, if you're AWS and HeatWave is now in your cloud, there are some good aspects of that: the database guys might not like it, but the infrastructure guys probably love it.
Hey, more ways to sell EC2 and Graviton. But the database guys at AWS are gonna respond. They're gonna say, hey, we've got Redshift, we've got AQUA. What are your thoughts on how that's gonna resonate with customers? And — I never say never about AWS — are they gonna try to build, in your view, a converged OLAP and OLTP database? You know, Snowflake is taking an ecosystem approach; they've added transactional capabilities to the portfolio, so they're not standing still. What do you guys see in the competitive landscape in that regard going forward? Maybe Holger, you could start us off, and anybody else who wants to can chime in. >>Happy to. You mentioned Snowflake last, so we'll start there. I think Snowflake is imitating that strategy, right? Building out from the original data warehouse in the cloud, they're now tasked with the proposition of having other data available there, because AI is relevant for everybody — ultimately people keep data in the cloud to run AI. So you see the same suite-level strategy. It's gonna be a little harder because of their original positioning: how many people would know that you're doing other stuff? And as a former developer and manager of developers, I just don't see the speed happening at Snowflake right now to become really competitive with Oracle. On the flip side, putting my Oracle hat on for a moment — back to you, Mark and Ron — what could Oracle still add? Because the big things, the traditional chasms in the database world — they've built everything, right?
>>So I really scratched my head and gave Nipun a hard time at CloudWorld: what could you be building? He was very conservative: let's get the Lakehouse thing done, it's gonna ship next year, right? And AWS is really hard, because the AWS value proposition is these small innovation teams, right?
They build two-pizza teams — teams that can be fed by two pizzas — not large teams, right? And you need large teams to build these suites, with lots of functionality, to make sure everything works together, is consistent, has the same UX on the administration side, can be consumed the same way, has the same API registry — that's where the synergy comes into play with a suite. So it's gonna be really, really hard for them to change that. But AWS is super pragmatic. They pride themselves on listening to customers, and if they learn from customers that the suite is the proposition, I would not be surprised to see AWS trying to bring things closer together.
>>Yeah. Well, how about multi-cloud? Oracle is very Oracle-on-Oracle, as you said before, but let's look forward half a year or a year. What do you think about Oracle's moves in multi-cloud, in terms of what kind of penetration they're gonna have in the marketplace? You saw a lot of presentations at CloudWorld; we've looked pretty closely at the Microsoft Azure deal — I think that's really interesting; I've called it a little bit of the early days of a supercloud. What impact do you think this is gonna have on the marketplace? And think about it both within Oracle's customer base — I have no doubt they'll do great there — and beyond its existing install base. What do you guys think?
>>Ron, do you want to jump on that? Go ahead.
>>That's an excellent point. I think it aligns with what we've been talking about in terms of Lakehouse. I think Lakehouse will enable Oracle to pull more customers onto the Oracle platforms. And I think we're seeing all the signs pointing toward Oracle being able to make more inroads into the overall market.
And that includes garnering customers from the leaders — in other words, because they're coming in as an innovator, an alternative to the AWS proposition and the Google Cloud proposition, they have less to lose, and as a result they can really drive the multi-cloud messaging to resonate not only with their existing customers, but also — to the question Dave's posing — actually attract customers onto their platform. And that naturally includes MySQL, but also OCI and so forth. So that's how I see this playing out. Oracle's reporting is indicating it, and I think what we saw at Oracle CloudWorld definitely validates the idea that Oracle can make more waves in the overall market in this regard.
>>You know, I've floated this idea of supercloud — it's kind of tongue in cheek, but I think there is some merit to it in terms of building on top of hyperscale infrastructure and abstracting some of that complexity. And one of the things I'm most interested in is industry clouds, and Oracle's acquisition of Cerner. I was struck by Larry Ellison's keynote: it was, I don't know, an hour and a half, and an hour and 15 minutes of it was focused on healthcare transformation. >>So, vertical. >>Right. And so you've got Oracle with some industry chops, and then you think about what they're building with not only OCI: you've got MySQL that you can now run in dedicated regions, you've got ADB on Exadata Cloud@Customer that you can put on-prem in your data center. And you look at what the other hyperscalers are doing — I say "other" hyperscalers; I've always said Oracle's not really a hyperscaler, but they've got a cloud, so they're in the game. But you can't get BigQuery on-prem; you look at Outposts, it's very limited in terms of database support, and again, that will evolve.
But now Oracle has announced Alloy, where you can white-label their cloud. So I'm interested in what you guys think about these moves, especially the industry cloud. We see Walmart doing sort of their own cloud, you've got Goldman Sachs doing a cloud. What do you think about that, and what role does Oracle play? Any thoughts?
>>Yeah, let me jump on that for a moment. Especially with MySQL: by making it available in multiple clouds, what they're doing follows the philosophy they've had in the past with Cloud@Customer — taking the application and the data and putting it where the customer lives. If it's on premises, it's on premises; if it's in the cloud, it's in the cloud. By making MySQL HeatWave essentially plug-compatible with any other MySQL as far as your database is concerned, and then giving you that integration with OLAP and ML and the data lake and everything else, what you've got is a compelling offering. You're making it easier for the customer to use. So when I look at the difference between MySQL and the Oracle database, MySQL HeatWave is going to capture more market share for them.
You're not gonna find a lot of new users for the Oracle database. Yeah, there are always gonna be new users, don't get me wrong, but it's not gonna be huge growth. Whereas MySQL HeatWave is probably gonna be a major growth engine for Oracle going forward — not just in their own cloud, but in AWS and in Azure, and on premises over time. It's not there now, but it will get there. They're doing the right thing on that basis: they're taking the services — and this is what multi-cloud means — and making them available where the customer wants them, not forcing customers to go where you want them, if that makes sense. And as far as where they're going in the future, I think they're gonna take a page out of what they've done with the Oracle database.
They'll add things like JSON and XML and time series and spatial; over time they'll make it a complete converged database, like they did with the Oracle database. The difference being that the Oracle database will scale bigger, handle more transactions, and be somewhat faster, and MySQL will be for anyone who's not on the Oracle database. They're not stupid, that's for sure.
>>They've done JSON already, right? But I'll give you that they could add graph and time series to HeatWave, right? Yeah, that's absolutely right. >>A sort of logical move, right? >>Right. But let's not kid ourselves — time has worked in Oracle's favor, right? Ten times, twenty times the amount of R&D in the MySQL space has been poured into trying to snatch workloads away from Oracle — starting with IBM 30 years ago, Microsoft 20 years ago — and it didn't work, right? Database applications are extremely sticky: when they run, you don't want to touch them, right? So that doesn't mean HeatWave is not an attractive offering, but it will be net new things. And what works in MySQL HeatWave's favor a little bit is that it's not the massive enterprise applications with tentacles everywhere — where you might only be running 30% on Oracle, but the connections and the interfaces into it are like 70, 80% of your enterprise.
The multi-cloud advantages when the Uber offering comes out, which allows you to do things across those installations, right? I can migrate data, I can create data across something like Google has done with B query Omni, I can run predictive models or even make iron models in different place and distribute them, right? And Oracle is paving the road for that, but being available on these clouds. But the multi-cloud capability of database which knows I'm running on different clouds that is still yet to be built there. >>Yeah. And >>That the problem with >>That, that's the super cloud concept that I flowed and I I've always said kinda snowflake with a single global instance is sort of, you know, headed in that direction and maybe has a league. What's the issue with that mark? >>Yeah, the problem with the, with that version, the multi-cloud is clouds to charge egress fees. As long as they charge egress fees to move data between clouds, it's gonna make it very difficult to do a real multi-cloud implementation. Even Snowflake, which runs multi-cloud, has to pass out on the egress fees of their customer when data moves between clouds. And that's really expensive. I mean there, there is one customer I talked to who is beta testing for them, the MySQL heatwave and aws. The only reason they didn't want to do that until it was running on AWS is the egress fees were so great to move it to OCI that they couldn't afford it. Yeah. Egress fees are the big issue but, >>But Mark the, the point might be you might wanna root query and only get the results set back, right was much more tinier, which been the answer before for low latency between the class A problem, which we sometimes still have but mostly don't have. Right? And I think in general this with fees coming down based on the Oracle general E with fee move and it's very hard to justify those, right? But, but it's, it's not about moving data as a multi-cloud high value use case. 
It's about doing intelligent things with that data, right? Putting into other places, replicating it, what I'm saying the same thing what you said before, running remote queries on that, analyzing it, running AI on it, running AI models on that. That's the interesting thing. Cross administered in the same way. Taking things out, making sure compliance happens. Making sure when Ron says I don't want to be American anymore, I want to be in the European cloud that is gets migrated, right? So tho those are the interesting value use case which are really, really hard for enterprise to program hand by hand by developers and they would love to have out of the box and that's yet the innovation to come to, we have to come to see. But the first step to get there is that your software runs in multiple clouds and that's what Oracle's doing so well with my SQL >>Guys. Amazing. >>Go ahead. Yeah. >>Yeah. >>For example, >>Amazing amount of data knowledge and, and brain power in this market. Guys, I really want to thank you for coming on to the cube. Ron Holger. Mark, always a pleasure to have you on. Really appreciate your time. >>Well all the last names we're very happy for Romanic last and moderator. Thanks Dave for moderating us. All right, >>We'll see. We'll see you guys around. Safe travels to all and thank you for watching this power panel, The Truth About My SQL Heat Wave on the cube. Your leader in enterprise and emerging tech coverage.

Published Date : Nov 1 2022



Adam Meyers, CrowdStrike | CrowdStrike Fal.Con 2022


 

>> We're back at the ARIA Las Vegas. We're covering CrowdStrike's Fal.Con 22. First one since 2019. Dave Vellante and Dave Nicholson on theCUBE. Adam Meyers is here, he is the Senior Vice President of Intelligence at CrowdStrike. Adam, thanks for coming to theCUBE. >> Thanks for having me. >> Interesting times, isn't it? You're very welcome. Senior Vice President of Intelligence, tell us what your role is. >> So I run all of our intelligence offerings. All of our analysts, we have a couple hundred analysts that work at CrowdStrike tracking threat actors. There's 185 threat actors that we track today. We're constantly adding more of them, and it requires us to really have that visibility and understand how they operate so that we can inform our other products: our XDR, our Cloud Workload Protections, and really integrate all of this around the threat actor. >> So it's that threat hunting capability that CrowdStrike has. That's what you're sort of... >> Well, so think of it this way. When we launched the company 11 years ago yesterday, what we wanted to do was to tell customers, to tell people that, well, you don't have a malware problem, you have an adversary problem. There are humans that are out there conducting these attacks, and if you know who they are, what they're up to, how they operate, then you're better positioned to defend against them. And so that's really at the core what CrowdStrike started with, and all of our products are powered by intelligence. All of our services, our OverWatch and our Falcon Complete, are all powered by intelligence, because we want to know who the threat actors are and what they're doing so we can stop them. >> So for instance, like, you can stop known malware. A lot of companies can stop known malware, but you also can stop unknown malware. And I infer that the intelligence is part of that equation, is that right? >> Absolutely. That's the outcome. 
That's the output of the intelligence, but I could also tell you who these threat actors are, where they're operating out of, show you pictures of some of them; that's the threat intel. We are tracking down to the individual persona in many cases, these various threats, whether they be Chinese nation state, Russian threat actors, Iran, North Korea; we track, as I said, quite a few of these threats. And over time, we develop a really robust, deep knowledge about who they are and how they operate. >> Okay. And we're going to get into some of that, the big four in cyber. But before we do, I want to ask you about the eCrime index stats, the ECX you guys call it, a little side joke for all your nerds out there. Maybe you could explain that, Adam. >> Assembly humor. >> Yeah right, right. So, but, what is that index? You guys, how often do you publish it? What are you learning from that? >> Yeah, so it was modeled off of the Dow Jones Industrial Average. So if you look at the Dow Jones, it's a composite index that was started in the late 1800s. And they took a couple of different companies that were the industrial component of the economy back then, right? Textiles and railroads and coal and steel and things like that. And they used that to approximate the overall health of the economy. So if you take these different stocks together, swizzle 'em together, and figure out some sort of number, you could say, look, it's up, the economy's doing good; it's down, not doing so good. So after World War II, everybody was exuberant and positive about the end of the war, the DJIA goes up; the oil crisis in the seventies, goes down; COVID hits, goes up, sorry, goes down. And then everybody realizes that they can use Amazon still and they can still get the things they need, goes back up. With the eCrime index, we took that approach to say, what is the health of the underground economy? 
When you read about any of these ransomware attacks or data extortion attacks, there are criminal groups that are working together in order to get things spammed out or to buy credentials and things like that. And so what the eCrime index does is it takes 24 different observables, right? The price of a ransom, the number of ransom attacks, the fluctuation in cryptocurrency, how much stolen material is being sold for on the underground. And we're constantly computing this number to understand, is the eCrime ecosystem healthy? Is it thriving or is it under pressure? And that lets us understand what's going on in the world and kind of contextualize it. To give an example, Microsoft on Patch Tuesday releases 56 vulnerabilities, 11 of them critical. Well, guess what comes after Patch Tuesday? Hack Wednesday. All of those 11 vulnerabilities are exploitable, and now you have threat actors that have a whole new array of weapons that they can deploy and bring to bear against their victims after that Patch Tuesday. So that's hack Wednesday. Conversely, we'll get something like the Colonial Pipeline. The Colonial Pipeline attack, May of '21 I think it was, comes out, and all of the various underground forums where these ransomware operators are doing their business, they freak out, because they don't want law enforcement. President Biden is talking about them and he's putting pressure on them. They don't want this ransomware component of what they're doing to bring law enforcement, bring heat on them. So they deplatform them. They kick 'em off. And when they do that, the ransomware stops being as much of a factor at that point in time, and the eCrime index goes down. So we can look at holidays, and right around Thanksgiving, which is coming up pretty soon, it's going to go up, because there's so much online commerce with Cyber Monday and such, right? You're going to see this increase in online activity; eCrime actors want to take advantage of that. 
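The composite-index idea Adam describes, several observables normalized against a baseline and blended into one number, can be sketched in a few lines. This is a hypothetical illustration in the spirit of the Dow Jones analogy; the observable names, baselines, and weights below are invented for the example and are not CrowdStrike's actual formula.

```python
# Hypothetical sketch of a composite "underground economy" index: each
# observable is divided by its baseline value, the ratios are weighted and
# averaged, and the result is scaled so that 100 means "baseline health."
# All names and numbers here are illustrative assumptions.

def composite_index(observables, baselines, weights):
    """Weighted mean of each observable relative to its baseline, scaled to 100."""
    total_weight = sum(weights.values())
    score = sum(weights[name] * observables[name] / baselines[name]
                for name in observables)
    return 100.0 * score / total_weight

baselines = {"ransom_usd": 250_000, "attacks_per_week": 40, "btc_usd": 20_000}
weights = {"ransom_usd": 2.0, "attacks_per_week": 3.0, "btc_usd": 1.0}

# A week with higher ransom demands and more attacks pushes the index above
# 100, signaling a "thriving" criminal ecosystem; pressure from law
# enforcement or deplatforming would show up as a drop below 100.
this_week = {"ransom_usd": 300_000, "attacks_per_week": 55, "btc_usd": 19_000}
print(round(composite_index(this_week, baselines, weights), 1))  # ~124.6
```

The real index reportedly tracks 24 observables; extending the sketch is just a matter of adding entries to the three dictionaries.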
When Christmas comes, they take vacation too; they're going to spend time with their families, so it goes back down and it stays down till around the end of the Russian Orthodox Christmas, which you can probably extrapolate why that is. And then it goes back up. So as it's fluctuating, it gives us the ability to really just start tracking what that economy looks like. >> Realtime indicator of that crypto. >> I mean, you talked about hack Wednesday, and before that you mentioned, you know, the big four, and I think you said 185 threat actors that you're tracking. Is number 185 on that list somebody living in their mom's basement, or are there resources necessary to get on that list, such that it's like, no, no, no, no, this is very, very organized, large groups of people? Hollywood would have you believe that it's a guy with a laptop, hack Wednesday, (Dave Nicholson mimics keyboard clacking noises) and everything's done. >> Right. >> Are there individuals who are doing things like that, or are these typically very well organized? >> That's a great question, and I think it's an important one to ask. And it's both, but it tends to be more the bigger groups. There are some one-off ones where it's one or two people. Sometimes they get big. Sometimes they get small. One of the big challenges. Have you heard of ransomware as a service? >> Of course. Oh my God. Any knucklehead can be a ransomwarist. >> Exactly. So we don't track those knuckleheads as much, unless they get onto our radar somehow, they're conducting a lot of operations against our customers or something like that. But what we do track is that ransomware as a service platform, because the affiliates, the people that are using it, they come, they go, and, you know, it could be they're only there for a period of time. Sometimes they move between different ransomware services, right? 
They'll use the one that's most useful for them that week or that month; they're getting the best rate because it's rev sharing. They get a percentage; that platform gets a percentage of the ransom. So, you know, they negotiate a better deal, they might move to a different ransomware platform. So that's really hard to track. And it's also, you know, I think more important for us to understand the platform and the technology that is being used than the individual that's doing it. >> Yeah. Makes sense. Alright, let's talk about the big four: China, Iran, North Korea, and Russia. Tell us about, you know, how you monitor these folks. Are there different signatures for each? Can you actually tell, you know, based on the hack, who's behind it? >> So yeah, it starts off, you know, motivation is a huge factor. China conducts espionage, they do it for diplomatic purposes. They do it for military and political purposes. And they do it for economic espionage. All of these things map to known policies that they put out, the Five Year Plan, the Made in China 2025, the Belt and Road Initiative; it's all part of their efforts to become a regional and ultimately a global hegemon. >> They're not stealing nickels and dimes. >> No, they're stealing intellectual property. They're stealing trade secrets. They're stealing negotiation points, when there's, you know, a high speed rail or something like that. And they use a set of tools, and they have a set of behaviors, and they have a set of infrastructure and a set of targets, that as we look at all of these things together we can derive who they are by motivation, and the longer we observe them, the more data we get, the more we can get that attribution. I could tell you that there's X number of Chinese threat groups that we track under Panda, right? And they're associated with the Ministry of State Security. There's a whole other set that's associated with the People's Liberation Army Strategic Support Force. 
So, I mean, these are big operations. They're intelligence agencies that are operating out of China. Iran has a different set of targets. They have a different set of motives. They go after North American and Israeli businesses; right now that's kind of their main operation. And they're doing something called hack and lock and leak. With a lock and leak, what they're doing is they're deploying ransomware. They don't care about getting a ransom payment. They're just doing it to disrupt the target. And then they're leaking information that they steal during that operation, that brings embarrassment. It brings compliance, regulatory, legal impact for that particular entity. So it's disruptive. >> The chaos creators, that's... >> Well, you know, I think they're trying to really impact the legitimacy of some of these targets and the trust that their customers and their partners and people have in them. And that is psychological warfare in a certain way. And it, you know, is really part of their broader initiative. Look at some of the other things that they've done: they've hacked into, like, the missile defense system in Israel, and they've turned on the sirens, right? Those are all things that they're doing for a specific purpose, and that's not China, right? Like, as you start to look at this stuff, you can start to really understand what they're up to. Russia has very much been busy targeting NATO and NATO countries and Ukraine. Obviously the conflict that started in February has been a huge focus for these threat actors. And then as we look at North Korea, totally different. They're doing, there was a major crypto attack today. They're going after these crypto platforms, they're going after DeFi platforms. They're going after all of this stuff that most people don't even understand, and they're stealing the cryptocurrency and they're using it for revenue generation. 
These nuclear weapons don't pay for themselves; their research and development doesn't pay for itself. And so they're using that cyber operation to either steal money or steal intelligence. >> They need the cash. Yeah. >> Yeah. And they also do economic targeting, because Kim Jong Un had said back in 2016 that they need to improve the lives of North Koreans. They have this national economic development strategy. And that means that they need, you know, I think only 30% of North Korea has access to reliable power. So having access to clean energy sources and renewable energy sources, that's important to keep the people happy and stop them from rising up against the regime. So that's the type of economic espionage that they're conducting. >> Well, those are the big four. If there were a big five or six, I would presume the US and some Western European countries would be on there. Do you track, I mean, where the United States obviously has, you know, people that are capable of this, we're out doing our thing, and- >> So I think- >> Be that defense or offense, where do we sit in this matrix? >> Well, I think the big five would probably include eCrime. We also track India, Pakistan. We track actors out of Colombia, out of Turkey, out of Syria. So there's a whole, you know, this problem is getting worse over time. It's proliferating. And I think COVID was also, you know, a driver there, because so many of these countries couldn't move human assets around because everything was getting locked down. As machine learning and artificial intelligence and all of this makes its way into the cameras at border and transfer points, it's hard to get a human asset through there. And so cyber is a very attractive, cheap, and deniable form of espionage, and gives them operational capabilities. Not, you know, and to your question about the US and other kind of Five Eyes friendly-type countries, we have not seen them targeting our customers. So we focus on the threats that target our customers. >> Right. 
>> And so, you know, if we were to find them at a customer environment, sure. But you know, when you look at some of the public reporting that's out there, the malware that's associated with them is focused on, you know, real bad people, and it's, it's physically like crypted to their hard drive. So unless you have a sensor on, you know, an Iranian or some other laptop that might be a target or something like that. >> Well, like Stuxnet did. >> Yeah. >> Right, so. >> You won't see it. Right. See, so yeah. >> Well, Symantec saw it, but way back when, right? Back in the day. >> Well, I mean, if you want to go down that route, I think it actually came from a company in the region that was doing the IR, and they were working with Symantec. >> Oh, okay. So, okay. So it was a local- >> Yeah. I think Crisis, I think, was the company that first identified it. And then they worked with Symantec. >> It was, they found it, I guess, a logic controller. I forget what it was. >> It was a long time ago, so I might not have that completely right. >> But it was a seminal moment in the industry. >> Oh. And it was a seminal moment for Iran, because, you know, that I think caused them to get into cyber operations. Right. When they realized that something like that could happen, that bolstered, you know, there was a lot of underground hacking forums in Iran. And, you know, after Stuxnet, we started seeing that those hackers were dropping their hacker names and they were starting businesses. They were starting to try to go after government contracts. And they were starting to build training offensive programs, things like that, because, you know, they realized that there is an opportunity there. >> Yeah. We were talking earlier about this with Shawn, and, you know, in the nuclear war, you know, the Cold War days, you had the mutually assured destruction. It's not as black and white in the cyber world. Right. 'Cause as Robert Gates told me, you know, a few years ago, we have a lot more to lose. 
So we have to be somewhat, as the United States, careful as to how much of an offensive posture we take. >> Well, here's a secret. So I have a background in political science. So mutually assured destruction, I think, is a deterrent strategy where you have two entities that will destroy each other, so they're disinclined to go down that route. >> Right. >> With cyber, I really don't like that mutually assured destruction- >> That doesn't fit, right. >> I think it's deterrence by denial, right? So raising the cost: if they were to conduct a cyber operation, raising that cost so that they don't want to do it, they don't want to incur the impact of that. Right. And think about this in terms of, a lot of people are asking about, would China invade Taiwan? And so as you look at the cost that that would have on the Chinese military, the PLA, the PLA Navy, et cetera, you know, that's that deterrence by denial, trying to make the costs so high that they don't want to do it. And I think that's a better fit for cyber, to try to figure out how can we raise the cost to the adversary if they operate against our customers, against our enterprises, so that they'll go someplace else and do something else. >> Well, that's a retaliatory strike, isn't it? I mean, is that what you're saying? >> No, definitely not. >> It's more of reducing their return on investment, essentially. >> Yeah. >> And incenting them- disincenting them to do X and sending them off somewhere else. >> Right. And threat actors, whether they be criminals or nation states, you know, Bruce Lee had this great quote that was "be like water", right? Like take the path of least resistance, like water will. Threat actors do that too. So, I mean, unless you're a super high value target that they absolutely have to get into by any means necessary, then if you become too hard of a target, they're going to move on to somebody that's a little easier. >> Makes sense. Awesome. 
Really appreciate your time; we'd love to have you back. >> Anytime. >> Go deeper. Adam Meyers. We're here at Fal.Con 22, Dave Vellante, Dave Nicholson. We'll be right back after this short break. (bouncy music plays)

Published Date : Sep 21 2022



Dan Molina, nth, Terry Richardson, AMD, & John Frey, HPE | Better Together with SHI


 

(futuristic music) >> Hey everyone. Lisa Martin here for theCUBE, back with you. Three guests join me: Dan Molina is here, the co-president and chief technology officer at NTH Generation. And I'm joined once again by Terry Richardson, North American channel chief for AMD, and Dr. John Frey, chief technologist, sustainable transformation at HPE. Gentlemen, it's a pleasure to have you on theCUBE. Thank you for joining me. >> Thank you, Lisa. >> Dan, let's have you kick things off. Talk to us about how NTH Generation is addressing the environmental challenges that your customers are having while meeting the technology demands of the future that those same customers are no doubt having. >> It's quite an interesting question, Lisa. In our case, we have been in business since 1991, and we started by providing highly available computing solutions. So this is great for me to be partnered here with HPE and AMD, because we want to provide quality computing solutions. And back in the day, since 1991, saving energy, saving footprint or reducing footprint in the data center, saving on cooling costs was very important. Over time those became even more critical components of our solutions design. As you know, as a society we started becoming more aware of the benefits, and of the fact that we have a responsibility back to society to basically contribute with our social and environmental responsibility. So one of the things that we continue to do, and we started back in 1991, is to make sure that we're designing compute solutions based on clients' actual needs. We go out of our way to collect real performance data, real IT resource consumption data. 
And then we architect solutions using best-in-the-industry components like AMD and HPE to make sure that they are going to be meeting those goals on energy savings, like cooling savings and footprint reduction, knowing that instead of maybe requiring 30 servers, just to mention an example, maybe we're going to go down to 14, and that's going to result in great energy savings. Our commitment to making sure that we're providing optimized solutions goes all the way to achieving the top-level certifications from our great partner, Hewlett Packard Enterprise. We also go deep into microprocessing technologies like AMD, but we want to make sure that the designs that we're putting together actually meet those goals. >> You talked about why sustainability is important to NTH from back in the day. I love how you said that. Dan, talk to us a little bit about what you're hearing from customers, as we are seeing sustainability as a corporate initiative horizontally across industries and really rise up through the C-suite to the board. >> Right, it is quite interesting, Lisa. We do service pretty much horizontally just about any vertical, including the public sector and the private sector, from retail to healthcare, to biotech, to manufacturing, and of course cities and counties. So we have a lot of experience with many different verticals. And across the board, we do see an increased interest in being socially responsible. And that includes not just being responsible on recycling, as an example. In most of our conversations or engagements, that conversation happens: 'What's going to happen with the old equipment?' as we're replacing it with more modern, more powerful, more efficient equipment. And we do a number of different things that go along with social responsibility and environmental protection. And that's basically e-waste programs. 
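The consolidation example Dan gives, going from 30 servers down to 14, can be put into rough numbers. This is a back-of-the-envelope sketch only; the per-server draw, cooling overhead (PUE), and electricity rate below are assumed figures for illustration, not NTH's actual sizing data.

```python
# Rough estimate of the energy impact of consolidating 30 servers to 14.
# Assumptions (hypothetical): 500 W average draw per server, a facility
# PUE of 1.6 to account for cooling overhead, and $0.12/kWh electricity.

HOURS_PER_YEAR = 8760

def annual_kwh(servers, watts_per_server, pue=1.6):
    """Facility-level energy per year: IT load times PUE, converted to kWh."""
    return servers * watts_per_server * pue * HOURS_PER_YEAR / 1000

before = annual_kwh(30, 500)   # legacy footprint
after = annual_kwh(14, 500)    # consolidated footprint
saved_kwh = before - after
saved_usd = saved_kwh * 0.12   # assumed electricity rate

print(f"Saved {saved_kwh:,.0f} kWh/year (~${saved_usd:,.0f}/year)")
```

Under these assumptions the consolidation saves on the order of 112,000 kWh a year; in practice the newer, denser servers often draw more per box but far less per unit of work, which is why sizing from measured consumption data, as Dan describes, matters.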
As an example, we also have a program where we actually donate some of that older equipment to schools, and that is quite something, because we're helping an organization save energy and footprint, basically the things that we've been talking about, but at the same time the older equipment, even though it's not saving that much energy, still serves a purpose in society, where maybe the underprivileged are not as able to afford computing equipment in certain schools and things of that nature. Now they can benefit and be productive to society. So it's not just about energy savings; there are so many other factors around social corporate responsibility. >> So it sounds like, Dan, a very comprehensive end-to-end vision that NTH has around sustainability. Let's bring John and Terry into the conversation. John, we're going to start with you. Talk to us a little bit about how HPE and NTH are partnering together. What are some of the key aspects of the relationship from HPE's perspective that enable you both to meet not just your corporate sustainable IT objectives, but those of your customers? >> Yeah, it's a great question. And one of the things that HPE brings to bear is 20 years' experience on sustainable IT: white papers, executive workbooks, and a lot of expertise for how we bring optimized solutions to market. If the customer doesn't want to manage those pieces himself, we have our as-a-service solutions, HPE GreenLake. But our sales force won't get to every customer across the globe that wants to take advantage of this expertise. So we partner with companies like NTH to know the customer better, to develop the right solution for that customer, and with NTH's relationships with the customers, they can constantly help the customer optimize those solutions and see where there are perhaps areas of opportunity that may be outside of HPE's own portfolio, such as client devices, where they can bring that expertise to bear, to help the customer have a better total customer experience. 
>> And that is critical, that better overall comprehensive total customer experience. As we know, on the other end all customers are demanding customers like us, who want data in real time, we want access. We also want the corporate and the social responsibility of the companies that we work with. Terry, bringing you into the conversation. Talk to us a little about AMD. How are you helping customers to create what really is a sustainable IT strategy from what often starts out as sustainability tactics? >> Exactly. And to pick up on what John and Dan were saying, we're really energized about the opportunity to allow customers to accelerate their ability to attain some of their more strategic sustainability goals. You know, since we started on our current data center CPU and GPU offerings, each generation we continue to focus on increasing the performance capability with great sensitivity to the efficiency, right? So as customers are modernizing their data center and achieving their own digital transformation initiatives, we are able to deliver solutions through HPE that really address a greater performance per watt, which is a core element in allowing customers to achieve the goals that John and Dan talked about. So, you know, as a company, we're fully on board with some very public positions around our own sustainability goals, but working with terrific partners like NTH and HPE allows us to together bring those enabling technologies directly to customers. >> Enabling and accelerating technologies. Dan, let's go back to you. You mentioned some of the things that NTH is doing from a sustainability approach, the social and the community concern, energy use savings, recycling, but this goes all the way, from NTH's perspective, to things like outreach and fairness in the workplace. Talk to us a little bit about some of those other initiatives that NTH has fired up. 
Absolutely. Well, at NTH, since the early days, we have invested heavily in modern equipment, and we have placed that at NTH labs; just like HPE labs, we have NTH labs, and that's where we do a great deal of testing to make sure that our clients, right, our joint clients, are going to have high quality solutions, so we're not just talking about it; we actually test them. So that is definitely an investment in being conscious about energy conservation. We have programs and scripts to shut down equipment that is not needed at the time, right. So we're definitely conscious about it. So I wanted to mention that example. Another one is, we all went through a pandemic, and this is still ongoing from some perspectives. And that forced pretty much all of our employees, at least for some time, to work from home. Being an IT company, we're very proud that we made that transition almost seamlessly. And we're very proud that, you know, people who continue to work from home, they're saving, of course, gasoline, time, traffic, all those benefits that go with reducing transportation. And don't get me wrong, I mean, sometimes it is important to still have face to face meetings, especially with new organizations where you want to establish trust. But for the most part we have become a hybrid workforce type of organization. At the same time, we're also implementing our own hybrid IT approach, which is what we talk to our clients about. So there are certain workloads, there are certain applications, that truly belong in public cloud or Software as a Service. And there are other workloads that truly belong in your data center. So a combination, done correctly, can result in significant savings, not just money, but also, again, energy consumption. 
Other things that we're doing: I mentioned trade-in programs; again, very proud that, you know, we use e-waste programs to make sure that IT equipment is properly disposed of and it's not going to end up in a landfill somewhere, but also, again, donating to schools, right? And very proud about that one. We have other outreach programs. Normally at the end of the year we do some substantial donations, and we encourage our employees, my coworkers, to donate. And we match those donations to organizations like Operation USA; they provide health and education programs to recover from disasters. Another one is the Salvation Army, where basically they fund rehabilitation programs that heal addictions, change lives, and restore families. We also donate to the San Diego Zoo. We also believe in the whole ecosystem, of course, and we're very proud to be part of that. They are supporting more than 140 conservation projects and partnerships in 70 countries, and we're part of that donation. And our owner has been part of the board, or he was for a number of years. Mercy House down in San Diego, where we have our headquarters; they have programs for the homeless that they're servicing. Also the Save a Life Foundation, educating youth to help prevent sudden cardiac arrest. So programs like that. We're very proud to be part of the donations. Again, it's not just about energy savings; it's so many other things as part of our corporate social responsibility program. Other things that I wanted to mention: everything in our buildings, in our offices, multiple locations, we have now turned to LED. So again, we're eating our own dog food, as they say, but that is definitely some significant energy savings. And then lastly, I wanted to mention, this is more what we do for our customers, but the whole HPE GreenLake program: we have a growing number of clients, especially in Southern California. And some of those are quite large, like school districts, like counties. 
And we feel very proud that, in the old days, customers would buy IT equipment for the next three to five years, right? And they would buy extra because, obviously, they're expecting some growth, while all that equipment is consuming energy from day one. With a GreenLake type of program, the solution is sized properly, maybe with a little bit of a buffer for unexpected growth needs. And with a GreenLake program, as customers need more IT resources to continue to expand the workloads for their organizations, then we go in with 'just in time' type of resources, saving energy and footprint and everything else that we've been talking about along the way. So very proud to be one of the go-tos for Hewlett Packard Enterprise on the GreenLake program, which is now a platform, so. >> That's great. Dan, it sounds like NTH Generation has such a comprehensive focus and strategy on sustainability, where you're pulling multiple levers. It's almost like sustainability to the NTH degree? See what I did there? >> (laughing) >> I'd like to talk with all three of you now. And John, I want to start with you about employees. Dan, you talked about the hybrid work environment and some of the silver linings from the pandemic, but I'd love to know, John, Terry and then Dan, in that order, how educated and engaged are your employees where sustainability is concerned? Talk to me about that from their engagement perspective, and also from the ability to retain them and make them proud, as Dan was saying, to work for these companies. John? >> Yeah, absolutely. One of the things that we see in technology, and we hear it from our customers every day when we're meeting with them, is we all have a challenge attracting and retaining new employees. And one of the ways that you can succeed in that challenge is by connecting the work that the employee does to both the purpose of your company and, broader than that, a global purpose. So environmental and social types of activities. 
So for us, we actually do a tremendous amount of education for our employees. At the moment, all of our vice presidents and above are taking climate training as part of our own climate aspirations, to really drive those goals into action. But we're opening that training to any employee in the company. We have a variety of employee resource groups that are focused on sustainability and carbon reduction. And in many cases, they're very loud advocates for why aren't we pushing a roadmap further? Why aren't we doing things in a particular industry segment where they think we're not moving quite as quickly as we should be? But part of the recognition around all of that as well is that customers often ask us, when we suggest a sustainability or sustainable IT solution to them, their first question back is: are you doing this yourselves? So for all of those reasons, we invest a lot of time and effort in educating our employees, listening to our employees on that topic, and really using them to help drive our programs forward. >> That sounds like it's critical, John, for customers to understand, are you doing this as well? Are you using your own technology? Terry, talk to us from the AMD side about the education of your employees, the engagement of them where sustainability is concerned. >> Yeah. So similar to what John said, I would characterize AMD as a very socially responsible company. We kind of share that alignment in point of view along with NTH. Corporate responsibility is something that, you know, most companies have started to see become a lot more prominent, a lot more talked about internally. We've been very public with four key sustainability goals that we've set as an organization. And we regularly provide updates on where we are along the way. Some of those goals extend out to 2025 and in one case 2030, so not too far away, but we're providing milestone updates against some pretty aggressive and important goals. 
I think, you know, as a technology company, regardless of the role that you're in, there's a way that you can connect to what the company's doing, and I think that's kind of a feel-good. I spend more of my time with the customer facing or partner facing resources, and being able to deliver a tool to partners like NTH and strategic partners like HPE that really helps quantify the benefit, you know, in bare metal terms, in terms of greenhouse gas emissions, and a TCO tool to really quantify what an implementation of a new and modern solution will mean to a customer. And for the first time they have choice. So I think employees can really feel good about being able to do something that is for a greater good than just the traditional corporate goals. And of course the engineers that are designing the next generation of products that have these as core competencies clearly can connect to the impact that we're able to make on the broader global ecosystem. >> And that is so important. Terry, you know, employee productivity and satisfaction directly translates to customer satisfaction and customer retention. So I always think of them as inextricably linked. So great to hear what you're all doing in terms of employee engagement. Dan, talk to me about some of the outcomes that NTH is enabling customers to achieve, from an outcomes perspective, those business outcomes, maybe even at a high level or a generic level. Love to dig into some of those. >> Of course. Yes. So again, our mission is really to deliver awesome in everything we do. And we're very proud about that mission, very crisp and clear, short and sweet, and that includes, we don't cut corners. We go through the extent of, again, learning the technology, getting those certifications, testing those in the lab, so that when we're working with our end user organizations they know they're going to have a quality solution. 
And part of our vision has been to provide industry leading transformational technologies and solutions, for example, HPE and AMD, for organizations to go through their own digital transformation. Those two words have been used extensively over the last decade, but this is a multi decade type of trend, a super trend or mega trend. And we're very proud that by offering and architecting and implementing, and in many cases supporting, with our partners, those, you know, best in class IT and cyber security solutions, we're helping those organizations with those business outcomes, their own digital transformation. If you extend that, Lisa, a little bit further, by helping our clients, both public and private sector, become more efficient and more scalable, we're also helping, you know, organizations become more productive. If you scale that out to the entire society in the US, that also helps with the GDP. So it's all interrelated, and we're very proud, through our, again, optimized solutions. We're not just going to sell a box, we're going to understand what the organization truly needs and adapt and architect our solutions accordingly. And we have, again, access to amazing technology, microprocessors. It's just amazing what they can do today, even compared to five years ago. And that enables new initiatives like artificial intelligence through machine learning and things of that nature. You need GPU technology, those specialized microprocessors, and companies like AMD, like I said, are enabling organizations to go down that path faster, right? While saving energy, footprint and everything that we've been talking about. So those are some of the outcomes that I see. >> Hey, Dan, listening to you talk, I can't help but think this is not a stretch for NTH, right? 
Although, you know, terms like sustainability and reducing carbon footprint might be, you know, more in vogue, the type of solutions that you've been architecting for customers, your approach, dates back decades, and you don't have to change a lot. You just have new kinds of toys to play with and new compelling offerings from great vendors like HPE to position to your customers. But it's not a big change in what you need to do. >> We're blessed from that perspective. That's how our founders started the company. We go through, I think, a very extensive interview process to make sure that there will be a fit both ways. We want our new team members to get to know the rest of the team before they actually make the decision. We are very proud as well, Terry, Lisa and John, that our tenure here at NTH is probably well over a decade. People get here and they really value how we help organizations through our dedicated work, providing, again, leading edge technology solutions, and the results that they see in their own organizations, where we have made many friends in the industry because they had a problem, right? Or they had a very challenging initiative for their organization, and we worked together, and the outcome there is something that they're very, very proud of. So you're right, Terry, we've been doing this for a long time. We're also very happy, again, with programs like HPE GreenLake. We were already doing optimized solutions, but something like GreenLake is helping us save more energy consumption from the very beginning by allowing organizations to pay for only what they need, with a little bit of the buffer that we talked about. So what we've been doing since 1991, combined with a program like GreenLake, I think is going to help us even better with our corporate social responsibility. 
>> I think what you guys have all articulated beautifully in the last 20 minutes is how strategic and interwoven the partnerships between HPE, AMD and NTH are, and what you're enabling customers to achieve with those outcomes. What you're also doing internally to do things like reduce waste, reduce carbon emissions, and ensure that your employees are proud of who they're working for. Those are all fantastic, guys. I wish we had more time, because I know we are just scratching the surface here. We appreciate everything that you shared with respect to sustainable IT and what you're enabling the end user customer to achieve. >> Thank you, Lisa. >> Thanks. >> Thank you. >> My pleasure. For my guests, I'm Lisa Martin. In a moment, Dave Vellante will return to give you some closing thoughts on sustainable IT. You're watching theCUBE, the leader in high tech enterprise coverage.

Published Date: Sep 15, 2022


Analyst Power Panel: Future of Database Platforms


 

(upbeat music) >> Once a staid and boring business dominated by IBM, Oracle, and at the time newcomer Microsoft, along with a handful of wannabes, the database business has exploded in the past decade and has become a staple of financial excellence, customer experience, analytic advantage, competitive strategy, growth initiatives, visualizations, not to mention compliance, security, privacy and dozens of other important use cases and initiatives. And on the vendor side of the house, we've seen the rapid ascendancy of cloud databases, most notably from Snowflake, whose massive raises leading up to its IPO in late 2020 sparked a spate of interest and VC investment in the separation of compute and storage and all that elastic resource stuff in the cloud. The company joined AWS, Azure and Google to popularize cloud databases, which have become a linchpin of competitive strategies for technology suppliers. And if I get you to put your data in my database and in my cloud, and I keep innovating, I'm going to build a moat and achieve a hugely attractive lifetime customer value in a really amazing marginal economics dynamic that is going to fund my future. And I'll be able to sell other adjacent services, not just compute and storage, but machine learning and inference and training and all kinds of stuff, dozens of lucrative cloud offerings. Meanwhile, the database leader, Oracle, has invested massive amounts of money to maintain its lead. It's building on its position as the king of mission critical workloads and making typical Oracle-like claims against the competition, most recently just yesterday with another announcement around MySQL HeatWave, an extension of MySQL that is compatible with on-premises MySQL and is setting new standards in price performance. We're seeing a dramatic divergence in strategies across the database spectrum. On the far left, we see Amazon with more than a dozen database offerings, each with its own API and primitives. 
AWS is taking a right tool for the right job approach, often building on open source platforms and creating services that it offers to customers to solve very specific problems for developers. And on the other side of the line, we see Oracle, which is taking the Swiss Army Knife approach, converging database functionality, enabling analytic and transactional workloads to run in the same data store, eliminating the need to ETL, and at the same time adding capabilities into its platform like automation and machine learning. Welcome to this database Power Panel. My name is Dave Vellante, and I'm so excited to bring together some of the most respected industry analysts in the community. Today we're going to assess what's happening in the market. We're going to dig into the competitive landscape and explore the future of database and database platforms and decode what it means to customers. Let me take a moment to welcome our guest analysts today. Matt Kimball is a vice president and principal analyst at Moor Insights and Strategy. Matt knows products, he knows the industry, he's got real world IT expertise, and he's got all the angles, 25-plus years of experience and all kinds of great background. Matt, welcome. Thanks very much for coming on theCUBE. Holgar Mueller, friend of theCUBE, vice president and principal analyst at Constellation Research, has in-depth knowledge on applications and application development, and knows developers. He's worked at SAP and Oracle. And then Bob Evans is Chief Content Officer and co-founder of the Acceleration Economy, founder and principal of Cloud Wars. He covers all kinds of industry topics and great insights. He's got awesome videos, these three minute hits. If you haven't seen 'em, check them out. He knows cloud companies, and his Cloud Wars minutes are fantastic. And then of course, Marc Staimer is the founder of Dragon Slayer Research. A frequent contributor and guest analyst at Wikibon. 
He's got wide ranging knowledge across IT products, knows technology really well, and can go deep. And then of course, Ron Westfall, Senior Analyst and Research Director at Futurum Research, great all around product trends knowledge. He can take, you know, deep technical dives and really understands competitive angles, knows Redshift, Snowflake, and many others. Gents, thanks so much for taking the time to join us on theCUBE today. It's great to have you on, good to see you. >> Good to be here, thanks for having us. >> Thanks, Dave. >> All right, let's start with an around-the-horn, and briefly, if each of you would describe, you know, anything I missed in your areas of expertise, and then answer the following question: how would you describe the state of the database platform market today? Matt Kimball, please start. >> Oh, I hate going first, but that's okay. How would I describe the world today? I would just, in one sentence, I would say, I'm glad I'm not in IT anymore, right? So, you know, it is a complex and dangerous world out there. And I don't envy the IT folks who have to support, you know, these modernization and transformation efforts that are going on within the enterprise. It used to be, you mentioned it, Dave, you would argue about IBM versus Oracle versus this newcomer in the database space called Microsoft. And don't forget Sybase back in the day. But, you know, now it's not just, which SQL vendor am I going to go with? It's all of these different, divergent data types that have to be taken, they have to be merged together, synthesized. And somehow I have to do that cleanly and use this to drive strategic decisions for my business. That is not easy. So, you know, you have to look at it from the perspective of the business user. It's great for them, because as a DevOps person or as an analyst, I have so much flexibility, and I have this thing called the cloud now, where I can go get services immediately. 
As an IT person or a DBA, I am calling up prevention hotlines 24 hours a day, because I don't know how I'm going to be able to support the business. And as an Oracle or a Microsoft or some of the cloud providers and cloud databases out there, I'm licking my chops, because, you know, my market is expanding and expanding every day. >> Great, thank you for that, Matt. Holgar, how do you see the world these days? You always have a good perspective on things, share with us. >> Well, I think it's the best time to be in IT, I'm not sure what Matt is talking about. (laughing) It's easier than ever, right? The direction is going to cloud. Kubernetes has won, Google has the best AI for now, right? So things are easier than ever before. You used to make commitments for five plus years on hardware, networking and so on, on premises, and I got gray hair worrying whether it was the wrong decision. No, just kidding. But you kind of see both sides, just to be controversial, make it interesting, right? So yeah, no, I think the interesting thing specifically with databases, right, is this big suite versus best of breed question, right? Obviously innovation, like you mentioned with Snowflake and others, is happening in the cloud, where the cloud vendors serve their own databases. And then we have one of the few survivors of the old guard, as Bob Evans likes to call them, in Oracle, who's doing well with both their traditional database and now, which is really interesting and remarkable, because Oracle was always the power of one, have one database, add more to it, make it what I call the universal database, this new HeatWave offering coming on the MySQL open source side. So they're getting the second (indistinct), right? So it's interesting that older players, traditional players who are still in the market, are diversifying their offerings. Something we don't see so much from the traditional players on the Microsoft side or the IBM side these days. 
>> Great, thank you Holgar. Bob Evans, you've covered this business for a while. You've worked at, you know, a number of different outlets and companies, and you cover the competition. How do you see things? >> Dave, you know, the other angle to look at this from is the customer side, right? You've now got CEOs running all sorts of businesses across all sorts of industries, and they understand that their future success is going to be dependent on their ability to become a digital company, to understand data, to use it the right way. So as you outlined, Dave, I think in your intro there, it is a fantastic time to be in the database business. And I think we've got a lot of new buyers and influencers coming in. They don't know all this history about IBM and Microsoft and Oracle and, you know, whoever else. So I think they're going to take a long, hard look, Dave, at some of these results and who is able to help these companies not just serve up the best technology, but who's going to be able to help their business move into the digital future. So it's a fascinating time now from every perspective. >> Great points, Bob. I mean, digital transformation has gone from buzzword to imperative. Mr. Staimer, how do you see things? >> I see things a little bit differently than my peers here, in that I see the database market being segmented. There are all the different kinds of databases that people are looking at for different kinds of data, and then there are databases in the cloud. And so database as a cloud service I view very differently than databases, because the traditional way of implementing a database is changing, and it's changing rapidly. So one of the premises that you stated earlier on was that you viewed Oracle as a database company. I don't view Oracle as a database company anymore. 
I view Oracle as a cloud company that happens to have significant expertise and specialty in databases, and they still sell database software in the traditional way, but ultimately they're a cloud company. So database cloud services, from my point of view, is a very distinct market from databases. >> Okay, well, you gave us some good meat on the bone to talk about that. Last but not least-- >> Dave, did Marc just say Oracle's a cloud company? >> Yeah. (laughing) Take away the database, it would be interesting to have that discussion, but let's let Ron jump in here. Ron, give us your take. >> That's a great segue. I think it's truly the era of the cloud database, that's something that's rising. And the key trends that come with it include, for example, elastic scaling. That is the ability to scale on demand, to right size workloads according to customer requirements. And also I think it's going to increase the prioritization for high availability. That is, the player who can provide the highest availability is going to have, I think, a great deal of success in this emerging market. And also I anticipate that there will be more consolidation across platforms in order to enable cost savings for customers, and that's something that's always going to be important. And I think we'll see more of that over the horizon. And then finally security, security will be more important than ever. We've seen a spike (indistinct), we certainly have seen geopolitically originated cybersecurity concerns. And as a result, I see database security becoming all the more important. >> Great, thank you. Okay, let me share some data with you guys. I'm going to throw this at you and see what you think. We have this awesome data partner called Enterprise Technology Research, ETR. They do these quarterly surveys, and each period, with dozens of industry segments, they track client spending, customer spending. 
And this is the database and data warehouse sector, okay, so it's a taxonomy, so it's not perfect, but it's a big kind of chunk. They essentially ask customers, within a category and about a specific vendor, are you spending more or less on the platform? And then they subtract the lesses from the mores and derive a metric called net score. It's like NPS, it's a measure of spending velocity. It's more complicated and granular than that, but that's the basis, and that's the vertical axis. The horizontal axis is what they call market share. It's not like IDC market share, it's just pervasiveness in the data set. And so there are a couple of things that stand out here that we can use as reference points. The first is the momentum of Snowflake. They've been off the charts for many, many quarters now, over two years, and anything above that dotted red line, that 40%, is considered by ETR to be highly elevated, and Snowflake's even way above that. And I think it's probably not sustainable. We're going to see, in the next April survey from those guys, when it comes out next month. And then you see AWS and Microsoft, they're really pervasive on the horizontal axis and highly elevated. Google falls behind them. And then you've got a number of well funded players. You've got Cockroach Labs, Mongo, Redis, MariaDB, which of course is a fork of MySQL started almost as a protest at Oracle when it acquired Sun and got MySQL, and you can see a number of others. Now Oracle, who's the leading database player, despite what Marc Staimer says, we know, (laughs) and they're a cloud player (laughing) who happens to be a leading database player. They dominate in the mission critical space, we know that they're the king of that sector, but you can see here that they're kind of legacy, right? They've been around a long time, they've got a big install base. So they don't have the spending momentum on the vertical axis. 
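The net score arithmetic described above can be sketched in a few lines; the sample responses and bucket sizes below are invented for illustration and are not ETR's actual data or methodology, which is more granular:

```python
# Sketch of a net-score-style spending-velocity metric: the share of
# customers spending more on a platform minus the share spending less.
# (Sample responses are hypothetical; ETR's real method is more granular.)

def net_score(responses):
    """responses: iterable of 'more', 'flat', or 'less', one per customer."""
    responses = list(responses)
    more = responses.count("more")
    less = responses.count("less")
    return 100.0 * (more - less) / len(responses)

# Hypothetical vendor: 70 customers spending more, 20 flat, 10 spending less.
sample = ["more"] * 70 + ["flat"] * 20 + ["less"] * 10
print(net_score(sample))  # 60.0, well above the 40 "highly elevated" line
```

Note that customers who hold spending flat dilute the score without changing the numerator, which is why the metric reads as velocity rather than spending level.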
Now remember, this doesn't capture spending levels, so that understates Oracle, but nonetheless. And it's not a complete picture. SAP, for instance, is not in here, no HANA. I think people are actually buying it, but it doesn't show up here, (laughs) but it does give an indication of momentum and presence. So Bob Evans, I'm going to start with you. You've commented on many of these companies, you know, what does this data tell you? >> Yeah, you know, Dave, I think all these compilations of things like that are interesting, and the folks at ETR do some good work, but I think, as you said, it's a snapshot, sort of a two-dimensional thing of a rapidly changing, three dimensional world. You know, the incidence at which some of these companies are mentioned versus the volume that happens. I think it's, you know, with Oracle, and I'm not going to declare my religious affiliation, either as cloud company or database company, you know, they're all of those things and more, and I think some of our old language of how we classify companies is just not relevant anymore. But I want to ask too, something in here, the autonomous database from Oracle, nobody else has done that. So either Oracle is crazy, they've tried out a technology that nobody other than them is interested in, or they're onto something that nobody else can match. So to me, Dave, within Oracle, trying to identify how they're doing there, I would watch autonomous database growth too, because, right, it's either going to be a big plan that breaks through, or it's going to be caught behind. And the Snowflake phenomenon, as you mentioned, that is a rare, rare bird who comes up and can grow 100% at a billion dollar revenue level like that. So now they've had a chance to come in, scare the crap out of everybody, rock the market with something totally new, the data cloud. 
Will the bigger companies be able to catch up, or is Snowflake going to continue to be this outlier? It's a fascinating time. >> Really interesting points there. Holgar, I want to ask you, I mean, I've talked to, and certainly I'm sure you guys have too, the founders of Snowflake that came out of Oracle, and they don't apologize. They say, "Hey, we're not going to do all that complicated stuff that Oracle does, we were trying to keep it real simple." But at the same time, you know, they don't do sophisticated workload management. They don't do complex joins. They're kind of relying on the ecosystem. So when you look at data like this and the various momentums, and we talked about the diverging strategies, what does this say to you? >> Well, it is a great point. And I think Snowflake is an example of how the cloud can turbocharge a well understood concept, in this case the data warehouse, right? You move that over, you put it on steroids, and you see what could have been for players who've been big in data warehousing, like Teradata, as an example, here in San Diego, in that part of the market. The interesting thing, the problem though, is the cloud hides a lot of complexity too, which you can scale really well as you attract lots of customers. And you don't have to build things like what Bob said, right? One of the fascinating things, right, nobody's answering Oracle on the autonomous database. I don't think it's that they cannot, they just have different priorities, or the database is not such a priority. I would dare to say that's the case for IBM and Microsoft right now, at the moment. And the cloud vendors, you just hide that right through scripts and through scale, because you support thousands of customers and you can deal with a little more complexity, right? It's not against them. Whereas if you have to run it yourself, very different story, right? 
You want to have the autonomous parts, you want to have the powerful tools to do things. >> Thank you. And so Matt, I want to go to you. You said up front, you know, it's just complicated if you're in IT, it's a complicated situation, and you've been on the customer side. And if you're a buyer, it's obviously, it's like Holgar said, "Cloud's supposed to make this stuff easier, but the simpler it gets, the more complicated it gets." So where do you place your bets? Or I guess more importantly, how do you decide where to place your bets? >> Yeah, it's a good question. And to what Bob and Holgar said, you know, around autonomous database, I think, you know, as I play kind of armchair psychologist, if you will, corporate psychologist, I look at what Oracle is doing, and, you know, databases are where they've made their mark, and that's their strong position, right? So it makes sense, if you're making an entry into this cloud and you really want to kind of build momentum, you go with what you're good at, right? So that's kind of the strength of Oracle. Let's put a lot of focus on that. They do a lot more than database, don't get me wrong, but, you know, I'm going to shore up my strength and then kind of pivot from there. With regards to, you know, what IT looks at, and what I would look at, you know, as an IT director or somebody who is, you know, trying to consume services from these different cloud providers: first and foremost, I go with what I know, right? Let's not forget, IT is a conservative group. And when we look at, you know, all the different permutations of database types out there, SQL, NoSQL, all the different types of NoSQL, those are largely being deployed by business users that are looking for agility, or businesses that are looking for agility. You know, the reason why MongoDB is so popular is because of DevOps, right? It's a great platform to develop on, and that's where it kind of gained its traction. 
But as an IT person, I want to go with what I know, where my muscle memory is, and that's my first position. And so as I evaluate different cloud service providers and cloud databases, I look for, you know, what I know and what I've invested in and where my muscle memory is. Is there enough there, and do I have enough belief that that company or that service is going to be able to take me to, you know, where I see my organization in five years, from a data management perspective, from a business perspective? Are they going to be there? And if they are, then I'm a little bit more willing to make that investment, but it is, you know, if I'm kind of going in this blind, or if I'm cloud native, you know, that's where the Snowflakes of the world become very attractive to me. >> Thank you. So Marc, I asked Andy Jassy on theCUBE one time, you have all these, you know, data stores and different APIs and primitives, you know, very granular, what's the strategy there? And he said, "Hey, that allows us, as the market changes, to be more flexible. If we start building abstraction layers, it's harder for us." I think it was also a time to market advantage, but let me ask you, I described earlier on that spectrum from AWS to Oracle. We just saw yesterday, Oracle announced, I think, the third major enhancement in like 15 months to MySQL HeatWave. What do you make of that announcement? How do you think it impacts the competitive landscape, particularly as it relates to, you know, converging transaction and analytics, eliminating ETL? I know you have some thoughts on this. >> So let me back up for a second and defend my cloud statement about Oracle for a moment. (laughing) AWS did a great job in developing the cloud market in general and everything in the cloud market. I mean, I give them lots of kudos on that. And a lot of what they did is they took open source software and they rent it to people who use their cloud. 
So I give 'em lots of credit, they dominate the market. Oracle was late to the cloud market. In fact, they actually poo-pooed it initially. If you look at some of Larry Ellison's statements, they said, "Oh, it's never going to take off." And then they did a 180-degree turn, and they said, "Oh, we're going to embrace the cloud." And they really have. But when you're late to a market, you've got to be compelling. And this ties into the announcement yesterday, but let's deal with this compelling. To be compelling from a user point of view, you've got to be twice as fast, offer twice as much functionality, at half the cost. That's generally what compelling is if you're going to capture market share from the leaders who established the market. It's very difficult to capture market share in a market that's new for you. And you're right. I mean, Bob was correct on this, and Holgar and Matt, in that you look at Oracle, and they did a great job of leveraging their database to move into this market, give 'em lots of kudos for that too. But yesterday they announced, as you said, the third innovation release, and the pace is just amazing of what they're doing on these releases on HeatWave. That ties together initially MySQL with an integrated, built-in analytics engine, so a data warehouse built in. And then they added automation with Autopilot, and now they've added machine learning to it, and it's all in the same service. It's not something you can buy and put on your premises unless you buy their Cloud@Customer stuff. But generally it's a cloud offering, so it's compellingly better as far as the integration. You don't buy multiple services, you buy one, and it's lower cost than any of the other services. But more importantly, it's faster, which again, give 'em credit for, they have more integration of a product. They can tie things together in a way that nobody else does. There are no additional services, ETL services like Glue in AWS.
So from that perspective, they're getting better performance, fewer services, lower cost. Hmm, they're aiming at the compelling side again. So from a customer point of view it's compelling. Matt, you wanted to say something there. >> Yeah, I want to build on what you just said there, Marc, and this is something I've found really interesting, you know. The traditional way that you look at software and, you know, purchasing software in IT is, you look at best of breed solutions, and you have to work on the backend to integrate them all and make them all work well. And generally, you know, the big hit against the "we have one integrated offering" is that you lose capability, or you lose depth of features, right. And to what you were saying, you know, that's the thing I found interesting about what Oracle is doing: they're building in depth as they, you know, build that service. It's not like you're losing a lot of capabilities, because you're going to one integrated service versus having to use A versus B versus C, and I love that idea. >> You're right. Yeah, not only are you not losing, but you're gaining functionality that you can't get by integrating a lot of these. I mean, I can take Snowflake and integrate it in with machine learning, but I also have to integrate in with a transactional database. So I've got to have connectors between all of this, which means I'm adding time. And what it comes down to at the end of the day is expertise, effort, time, and cost. And so what I see as the difference from the Oracle announcements is they're aiming at reducing all of that while increasing performance as well. Correct me if I'm wrong on that, but that's what I saw at the announcement yesterday. >> You know, one thing though, Marc, it's funny you say that, because I started out saying, you know, I'm glad I'm not 19 anymore.
And the reason is because of exactly what you said. It's almost like there's a pseudo level of witchcraft that's required to support the modern data environment right in the enterprise. And I need simpler, faster, better. That's what I need, you know. I am no longer wearing pocket protectors. I have turned from, you know, a break/fix kind of person, to, you know, business consultant. And I need that point-and-click simplicity, but I can't sacrifice, you know, depth of features and functionality on the backend as I play that consultancy role. >> So, I want to bring in Ron. You know, it's funny. So Matt, you mentioned Mongo. I often say, if Oracle mentions you, you're on the map. We saw them yesterday, Ron, (laughing) they hammered Redshift's AutoML, they took swipes at Snowflake, a little bit of BigQuery. What were your thoughts on that? Do you agree with what these guys are saying in terms of HeatWave's capabilities? >> Yes, Dave, I think that's an excellent question. And fundamentally I do agree. And the question is why, and I think it's important to know that all of the Oracle data is backed by the fact that they're using benchmarks. For example, all of the ML and all of the TPC benchmarks, including all the scripts, all the configs, and all the detail, are posted on GitHub. So anybody can look at these results, they're fully transparent, and anybody can replicate them. If you don't agree with this data, then by all means challenge it. And we have not really seen that with all of the new updates in HeatWave over the last 15 months. And as a result, when it comes to these, you know, fundamentals in looking at the competitive landscape, I think that gives validity to outcomes such as Oracle being able to deliver 4.8 times better price performance than Redshift, as well as, for example, 14.4 times better price performance than Snowflake, and also 12.9 times better price performance than BigQuery. And so that is, you know, looking at the quantitative side of things.
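As a side note on how to read figures like "4.8 times better price performance": in benchmarks of this kind the metric is typically the cost to complete the workload, i.e. run time multiplied by the hourly price of the cluster, so a system can win by being faster, cheaper, or both. A minimal sketch with made-up numbers (these are illustrative only, not the actual benchmark inputs):

```python
def price_performance(runtime_hours, hourly_cost):
    # Lower is better: total dollars spent to complete the benchmark workload.
    return runtime_hours * hourly_cost

# Hypothetical systems, illustrative numbers only.
system_a = price_performance(runtime_hours=1.0, hourly_cost=16.0)   # $16.00
system_b = price_performance(runtime_hours=2.4, hourly_cost=32.0)   # $76.80
print(f"System A is {system_b / system_a:.1f}x better on price performance")
```

The point is that a ratio like this compounds speed and price, which is why a faster integrated service at a lower rate can post multi-x advantages.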
But again, I think, you know, to Marc's point and to Matt's point, there are also qualitative aspects that clearly differentiate the Oracle proposition, from my perspective. For example, now the MySQL HeatWave ML capabilities are native, they're built in, and they also support things such as completion criteria. And as a result, that enables them to show that, hey, when you're using Redshift ML, for example, you're having to also use their SageMaker tool, and it's running on a meter. And so, you know, nobody really wants to be running on a meter when, you know, executing these incredibly complex tasks. And likewise, when it comes to Snowflake, they have to use a third-party capability. They don't have it built in, it's not native. So the user, to the earlier point, is having to spend more time, and it increases complexity, to use AutoML capabilities across the Snowflake platform. And I think it also applies to other important features such as data sampling. For example, with HeatWave ML, it's intelligent sampling that's being implemented, whereas in contrast, we're seeing Redshift using random sampling. And again, with Snowflake, you're having to use a third-party library in order to achieve the same capabilities. So I think the differentiation is crystal clear. I think it definitely is refreshing. It's showing that this is where true value can be assigned. And if you don't agree with it, by all means challenge the data. >> Yeah, I want to come to the benchmarks in a minute. By the way, you know, the gentleman who's Oracle's architect, he did a great job on the call yesterday explaining what you have to do. I thought that was quite impressive. But Bob, I know you follow the financials pretty closely, and on the earnings call earlier this month, Ellison said that, "We're going to see HeatWave on AWS." And the skeptic in me said, oh, they must not be getting people to come to OCI.
And then, you remember this chart they showed yesterday that showed the growth of HeatWave on OCI. But of course there was no data on there, it was just sort of, you know, lines up and to the right. So what do you guys think of that? (Marc laughs) Does it signal, Bob, desperation by Oracle that they can't get traction on OCI, or is it just really a smart TAM expansion move? What do you think? >> Yeah, Dave, that's a great question. You know, along the way there, and you know, just inside of that, was something that Ellison said on the earnings call that spoke to a different sort of philosophy or mindset, almost, Marc, where he said, "We're going to make this multicloud," right? With a lot of their other cloud stuff, if you wanted to use any of Oracle's cloud software, you had to use Oracle's infrastructure, OCI, there was no other way out of it. But with this one, I thought it was a classic Ellison line. He said, "Well, we're making this available on AWS. We're making this available, you know, on Snowflake because we're going after those users. And once they see what can be done here." So he's looking at it, I guess you could say, as a concession to customers, because they want multicloud. The other way to look at it, it's a hunting expedition, and it's one of those uniquely Oracle ways, I think. He doesn't say, "Well, there's a big market, there's a lot for everybody, we just want our slice." He said up front, "No, we are going after Amazon, we're going after Redshift, we're going after Aurora. We're going after these users of Snowflake and so on." And I think it's really fairly refreshing these days to hear somebody say that, because now if I'm a buyer, I can look at that and say, you know, to Marc's point, "Do they measure up? Do they crack that threshold ceiling? Or is this just going to be more pain than a few dollars savings is worth?" But you look at those numbers that Ron pointed out and that we all saw in that chart.
I've never seen anything like that, Dave. In a substantive market, a new player coming in here and being able to establish differences that are four, seven, eight, 10, 12 times better than the competition. And as new buyers look at that, they're going to say, "What the hell are we doing paying, you know, five times more to get a poorer result? What's going on here?" So I think this is going to rattle people and force a harder, closer look at what these alternatives are. >> Thank you. Let's just skip ahead to the benchmarks, guys. Bring up the next slide, let's skip ahead a little bit here, which talks to the benchmarks and the benchmarking, if we can. You know, David Floyer, the sort of semi-retired, you know, Wikibon analyst, said, "Dave, this is going to force Amazon and others, Snowflake," he said, "to rethink actually how they architect databases." And this is kind of a compilation of some of the data that they shared. They went after Redshift mostly, (laughs) but also, you know, as I say, Snowflake, BigQuery. And, like I said, you can always tell which companies are doing well, 'cause Oracle will come after you, but they're on the radar here. (laughing) Holgar, should we take this stuff seriously? I mean, or is it, you know, a grain of salt? What are your thoughts here? >> I think you have to take it seriously. I mean, that's a great question, great point on that. Because like Ron said, if there's a flaw in a benchmark, we know this from the database world traditionally, right? If anybody came up with that, everybody would be, "Oh, you ran the wrong benchmark, it wasn't audited right, let us do it again," and so on. We don't see this happening, right? So kudos to Oracle for being aggressive and differentiated, and for seemingly having impeccable benchmarks. But what we really see, I think in my view, is the classic, and we could talk about this for 100 years, right, suite versus best of breed, right? And the key question is about the suite, because the suite's always slower, right?
No matter at which level of the stack you have the suite, there's a best of breed that will come up with something new, use a cloud, put the data warehouse on steroids, and so on. The important thing is that you have to assess as a buyer: what is the speed of my suite vendor? And that's what you guys mentioned before as well, right? Marc said that and so on, like, this is the third release in one year from the HeatWave team, right? So everybody in the open-source database world, Marc, and there are so many MySQL spinoffs, has to a certain point put a shine on the speed of the (indistinct) team, putting out fundamental changes. And the beauty of that, right, is so inherent to the Oracle value proposition: Larry's vision of building the IBM of the 21st century, right from the silicon, from the chip, all the way across the seven stacks to the click of the user. And that's what makes the database, what Bob was saying, tied to the OCI infrastructure, because it's designed for that, it runs uniquely better on that. That's why we see the cross-connect to Microsoft. HeatWave is different, right? Because HeatWave runs on cheap hardware, right, which is the bread-and-butter x86 scale of any cloud provider, right? So Oracle probably needs it to scale OCI in a different category, not the expensive side, but it also allows them to do what we said before, the multicloud capability, which ultimately CIOs really want, because data gravity is real, you want to operate where that is. If you have a fast, innovative offering, which gives you more functionality, and the R&D speed is really impressive for the space, even putting aside the benchmark results, then it's a good bet to look at. >> Yeah, so you're saying, that suite versus best of breed. I just want to sort of play back, Marc, a comment. That suite versus best of breed, there's always been that trade-off. If I understand you, Holgar, you're saying that somehow Oracle has magically cut through that trade-off and they're giving you the best of both.
>> It's the development velocity, right? If the provision of important features, which matter to buyers, by the suite vendor eclipses the best-of-breed vendor, then the best-of-breed vendor is in a hell of a tough spot. >> Yeah, go ahead, Marc. >> Yeah, and I want to add on what Holgar just said there. I mean, the worst job in the data center is data movement. Moving the data sucks. I don't care who you are, nobody likes it. You never get any kudos for doing it well, and you always get the "ah, craps" when things go wrong. So it's in- >> In the data center, Marc, all the time, across data centers, across clouds. That's where the bleeding comes. >> That's right, you get beat up all the time. So nobody likes to move data, ever. So what you're looking at with what they announced with HeatWave, and what I love about HeatWave, is it doesn't matter when you started with it, you get all the additional features they announce as part of the service, all the time. And they don't have to move any of the data. You want to analyze the data that's in your transactional MySQL database? It's there. You want to do machine learning models? It's there, there's no data movement. The data movement is the key thing, and they just eliminate that, in so many ways. And the other thing I wanted to talk about is the benchmarks. As great as those benchmarks are, they're really conservative, 'cause they're underestimating the cost of that data movement. The ETLs, the other services, everything's left out. It's just comparing the MySQL cloud service with HeatWave versus Redshift. Not Redshift and Aurora and Glue, Redshift and Redshift ML and SageMaker, it's just Redshift. >> Yeah, so what you're saying is what Oracle's doing is saying, "Okay, we're going to run MySQL HeatWave benchmarks on analytics against Redshift, and then we're going to run 'em on transactions against Aurora." >> Right.
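The "no data movement" point can be made concrete with a deliberately tiny analogy. Here SQLite stands in for the transactional store; HeatWave's actual mechanics are different, and this is not its API. The idea is simply that the analytical query reads the live transactional table in place, with no export, transform, or load step in between:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# Transactional writes land in the operational table...
rows = [(1, "east", 40.0), (2, "west", 25.0), (3, "east", 35.0)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# ...and the analytical query reads the same table in place.
# No dump, no ETL pipeline, no second data store to keep in sync.
total_by_region = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)
print(total_by_region)
```

Contrast this with the two-store pattern Marc is criticizing, where those three rows would first be extracted, shipped to a separate warehouse, and loaded before the aggregate could run.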
>> But if you really had to look at what you would have to do with the ETL, you'd have to buy two different data stores and all the infrastructure around that, and that goes away, so. >> Due to the nature of the competition, they're running narrow best-of-breed benchmarks. There is no suite-level benchmark (Dave laughs) because they created something new. >> Well, that's your earlier point, they're beating best of breed with a suite. So that's, I guess, to Floyer's earlier point, going to shake things up. But I want to come back to Bob Evans, 'cause I want to tap your Cloud Wars mojo before we wrap. And line up the horses: you've got AWS, you've got Microsoft, Google, and Oracle. Now they all own their own cloud. Snowflake, Mongo, Couchbase, Redis, Cockroach, by the way, they're all doing very well. They run in the cloud, as do many others. I think you guys all saw the Andreessen, you know, commentary from Sarah Wang and company, talking about the cost of goods sold impact of cloud. So owning your own cloud has to be an advantage, because other guys like Snowflake have to pay cloud vendors and negotiate down, versus having the whole enchilada, Safra Catz's dream. Bob, how do you think this is going to impact the market long term? >> Well, Dave, that's a great question about, you know, how this is all going to play out. If I could mention three things. One, Frank Slootman has done a fantastic job with Snowflake. Really good company before he got there, but since he's been there, the growth mindset, the discipline, the rigor, and the phenomenon of what Snowflake has done has forced all these bigger companies to really accelerate what they're doing. And again, it's an example of how this intense competition makes all the different cloud vendors better, and it provides enormous value to customers. Second thing I wanted to mention here was look at the Adam Selipsky effect at AWS. He took over in the middle of May, and in Q2, Q3, Q4, AWS's growth rate accelerated.
And in each of those three quarters, they grew faster than Microsoft's cloud, which had not happened in two or three years, so they're closing the gap on Microsoft. The third thing, Dave, in this, you know, incredibly intense competitive nature here, look at Larry Ellison, right? He's got, you know, the product that for the last two or three years he has said is going to help determine the future of the company, autonomous database. You would think he's the last person in the world who's going to bring in, you know, in some ways another database to think about there, but he has put, you know, his whole effort and energy behind this. The investments Oracle's made, he's riding this horse really hard. So it's not just a technology achievement, but it's also an investment priority for Oracle going forward. And I think it's going to inform a lot of how they position themselves to this new breed of buyer with a new type of need and expectations from IT. So I just think the next two or three years are going to be fantastic for people who are lucky enough to get to do the sorts of things that we do. >> You know, it's a great point you made about AWS. Back in Q3 2018, they were doing about 7.4 billion a quarter and they were growing in the mid forties. They dropped down to like 29% in Q4 2020, I'm looking at the data now. They popped back up last quarter, last reported quarter, to 40%, that is 17.8 billion, so they more than doubled and they accelerated their growth rate. (laughs) So maybe that portends, people are concerned about Snowflake right now decelerating growth, you know, maybe that's going to be different. By the way, I think Snowflake has a different strategy, the whole data cloud thing, data sharing. They're not trying to necessarily take Oracle head on, which is going to make this next 10 years really interesting. All right, we've got to go, last question. 30 seconds or less, what can we expect from the future of data platforms? Matt, please start.
>> I have to go first again? You're killing me, Dave. (laughing) In the next few years, I think you're going to see the major players continue to meet customers where they are, right. Every organization, every environment is, you know, kind of, we use these words bespoke and, pardon the pun, snowflake, right. But you know, they're all opinionated and unique, and what's great as an IT person is, you know, there is a service for me regardless of where I am on my journey, my data management journey. With regards specifically to Oracle, I think you're going to see the company continue along this path of being all things to all people, if you will, or all organizations, without sacrificing, you know, kind of richness of features, and without sacrificing who they are, right. Look, they are the data kings, right? I mean, they've been a database leader for an awful long time. I don't see that going away any time soon, and I love the innovative spirit they've brought in with HeatWave. >> All right, great, thank you. Okay, 30 seconds, Holgar, go. >> Yeah, I mean, the interesting thing that we see is really that trend to autonomous, as Oracle calls it, or self-driving software, right? So the database will have to do more things than just store the data and support the DBA. It will have to provide insights, and it will be able to run machine learning. We haven't really talked about that, how it's just exciting what kind of use cases we can get out of machine learning running in real time on data as it changes, right, which is part of the E5 announcement, right? So we'll see more of that self-driving nature in the database space. And because you said we can promote it, right: check out my report about the latest HeatWave release, which I posted, on oracle.com. >> Great, thank you for that. And Bob Evans, please. You're great at quick hits, hit us. >> Dave, thanks.
I really enjoyed getting to hear everybody's opinion here today, and I think we got a sense of what's going to happen too. I think there's a new generation of buyers, a new set of CXO influencers in here. And I think what Oracle's done with this MySQL HeatWave, those benchmarks that Ron talked about so eloquently here, that is going to become something that forces other companies to not just try to get incrementally better. I think we're going to see a massive new wave of innovation to try to play catch-up. So I really take my hat off to Oracle's achievement in pushing everybody to be better. >> Excellent. Marc Staimer, what do you say? >> Sure, I'm going to leverage off of something Matt said earlier. Those companies that are going to develop faster, cheaper, simpler products that solve customer problems, IT problems, are the ones that are going to succeed, the ones that are going to grow. The ones that are just focused on the technology are going to fall by the wayside. So those who can solve more problems, do it more elegantly, and do it for less money are going to do great. So Oracle's going down that path today, Snowflake's going down that path. They're trying to do more integration with third parties, but as a result, aiming at that simpler, faster, cheaper mentality is where you're going to continue to see this market go. >> Amen, brother Marc. >> Thank you. Ron Westfall, we'll give you the last word, bring us home. >> Well, thank you. And I'm loving it. I see a wave of innovation across the entire cloud database ecosystem, and Oracle is fueling it. We are seeing it with the native integration of AutoML capabilities, elastic scaling, lower entry price points, et cetera. And this is just going to be great news for buyers, but also developers, and increased use of open APIs. And so I think those are really the key takeaways. We're just going to see a lot of great innovation on the horizon here.
>> Guys, fantastic insights, one of the best power panels I've ever done. Love to have you back. Thanks so much for coming on today. >> Great job, Dave, thank you. >> All right, and thank you for watching. This is Dave Vellante for theCube, and we'll see you next time. (soft music)

Published Date: Mar 31 2022



Nandi Leslie, Raytheon | WiDS 2022


 

(upbeat music) >> Hey everyone. Welcome back to theCUBE's live coverage of Women in Data Science, WiDS 2022, coming to you live from Stanford University. I'm Lisa Martin. My next guest is here, Nandi Leslie, Doctor Nandi Leslie, Senior Engineering Fellow at Raytheon Technologies. Nandi, it's great to have you on the program. >> Oh, it's my pleasure, thank you. >> This is your first WiDS, you were saying before we went live. >> That's right. >> What's your take so far? >> I'm absolutely loving it. I love the camaraderie and the community of women in data science. You know, what more can you say? It's amazing. >> It is. It's amazing what they've built since 2015, that this is now reaching 100,000 people, with 200 online events. It's a hybrid event. Of course, here we are in person, and the online event going on, but it's always an inspiring, energy-filled experience, in my experience of WiDS. >> I'm thoroughly impressed at what the organizers have been able to accomplish. And it's amazing that, you know, you've been involved from the beginning. >> Yeah, yeah. Talk to me, so you're Senior Engineering Fellow at Raytheon. Talk to me a little bit about your role there and what you're doing. >> Well, my role is really to think about our customers' most challenging problems, primarily at the intersection of data science and, you know, the intersecting fields of applied mathematics, machine learning, cybersecurity. And then we have a plethora of government clients and commercial clients, and so I address what their needs are beyond those sub-fields as well. >> And your background is mathematics. >> Yes. >> Have you always been a math fan? >> I have, I actually have loved math for many, many years. My dad is a mathematician, and he introduced me to, you know, mathematical research and the sciences at a very early age. And so, yeah, I went on, I did a math degree at Howard undergrad, and then I went on to do my PhD at Princeton in applied math.
And later did a postdoc in the math department at the University of Maryland. >> And how long have you been with Raytheon? >> I've been with Raytheon about six years. Yeah, and before Raytheon, I worked at a small-to-midsize defense contracting company in the DC area, Systems Planning and Analysis. And then prior to that, I taught in a math department where I also did my postdoc, at the University of Maryland College Park. >> You have a really interesting background. I was doing some reading on you, and you have worked with the Navy. You've worked with very interesting organizations. Talk to the audience a little bit about your diverse background. >> Awesome, yeah. I've worked with the Navy on submarine force security, and submarine tracking and localization, sensor performance. Also with the Army and the Army Research Laboratory, doing research at the intersection of machine learning and cybersecurity. Also looking at game-theoretic and graph-theoretic approaches to understand network resilience and robustness. I've also supported the Department of Homeland Security, and other government agencies, other governments, NATO. Yeah, so I've really been excited by the diverse problems that our various customers have, you know, brought to us. >> Well, you get such great experience when you're able to work in different industries and different fields. And that really probably helps you have such diversity of thought, even now with what you're doing at Raytheon. >> Yeah, it definitely does help me build like a portfolio of topics that I can address. And then when new problems emerge, I can pull from a toolbox of capabilities and, you know, the solutions that have previously been developed to address that wide array of problems, but then also innovate new solutions based on those experiences. So I've been really blessed to have those experiences.
>> Talk to me about, one of the things I heard this morning in the session I was able to attend before we came to set was about mentors and sponsors. And, you know, I actually didn't know the difference between the two until a few years ago. But it's so important. Talk to me about some of the mentors you've had along the way that really helped you find your voice in research and development. >> Definitely. I mean, beyond just the mentorship of my family and my parents, I've had amazing opportunities to meet with wonderful people who've helped me navigate my career. One in particular I can think of, and I'll name a number of folks, but Dr. Carlos Castillo-Chavez was one of my earlier mentors. I was an undergrad at Howard University. He encouraged me to apply to his summer research program in mathematical and theoretical biology, which was then at Cornell. And, you know, he just really developed an enthusiasm with me for applied mathematics, and for how it, mathematics that is, can be applied to epidemiological and theoretical immunological problems. And then I had an amazing mentor in my PhD advisor, Dr. Simon Levin at Princeton, who just continued to inspire me in how to leverage mathematical approaches and computational thinking for ecological conservation problems. And then since then, I've had amazing mentors, you know, through just a variety of people that I've met, through customers, who've inspired me to write these papers that you mentioned in the beginning. >> Yeah, you've written 55 different publications so far. 55 and counting, I'm sure, right? >> Well, I hope so. I hope to continue to contribute to the conversation and the community, you know, within research, and specifically research that is computationally driven and that really is applicable to the problems that we face, whether it's cybersecurity, or machine learning problems, or others in data science. >> What are some of the things... you're giving a tech vision talk this afternoon.
Talk to me a little bit about that, and maybe the top three takeaways you want the audience to leave with. >> Yeah, so my talk is entitled "Unsupervised Learning for Network Security," or "Network Intrusion Detection," I believe. And essentially three key areas I want to convey are the following. First, that unsupervised learning, that is, the mathematical and statistical approach which tries to derive patterns from unlabeled data, is a powerful one, and one can still innovate new algorithms in this area. Secondly, that network security, and specifically anomaly detection and anomaly-based methods, can be really useful for discerning and ensuring that there is information confidentiality, availability, and integrity in our data. >> The CIA triad. >> There you go, you know. And in addition to that, you know, there is this wealth of data that's out there. It's coming at us quickly. You know, there are millions of packets to represent communications. And that data is mixed, in terms of there's categorical or qualitative data, text data, along with numerical data. And it is streaming, right. And so we need methods that are efficient and that are capable of being deployed in real time, in order to detect these anomalies, which we hope are representative of malicious activities, so that we can alert on them and thwart them. >> It's so interesting that, you know, the amount of data that's being generated and collected is growing exponentially. There are also, you know, some concerning challenges, not just with respect to data that's reinforcing social biases, but also with cyber warfare. I mean, that's a huge challenge right now. We've seen from a cybersecurity perspective in the last couple of years, during the pandemic, a massive explosion in anomalies, and in social engineering. And companies in every industry have to be super vigilant, and help people understand how to interact with it, right. There's a human component. >> Oh, for sure.
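The streaming, unsupervised anomaly detection Nandi describes above, efficient methods that flag deviations in packet features in real time, can be illustrated with a toy sketch. This is my own minimal illustration, not code from her talk: it keeps a running per-feature mean and variance using Welford's online algorithm, so each observation costs O(features) time and memory, and flags any point more than a few standard deviations from the running mean.

```python
import math

class StreamingAnomalyDetector:
    """Flags numeric feature vectors that deviate sharply from the running
    per-feature mean. Welford's online update keeps the model O(features)
    in memory, so it can run on a live packet stream."""

    def __init__(self, n_features, threshold=4.0, warmup=30):
        self.n = 0
        self.mean = [0.0] * n_features
        self.m2 = [0.0] * n_features       # sum of squared deviations
        self.threshold = threshold          # z-score cutoff
        self.warmup = warmup                # observations before alerting

    def observe(self, x):
        """Return True if x looks anomalous, then fold x into the model."""
        anomalous = False
        if self.n >= self.warmup:
            for i, xi in enumerate(x):
                std = math.sqrt(self.m2[i] / (self.n - 1)) or 1e-9
                if abs(xi - self.mean[i]) / std > self.threshold:
                    anomalous = True
                    break
        self.n += 1
        for i, xi in enumerate(x):
            delta = xi - self.mean[i]
            self.mean[i] += delta / self.n
            self.m2[i] += delta * (xi - self.mean[i])
        return anomalous

# Simulated packet features: (payload bytes, inter-arrival ms).
detector = StreamingAnomalyDetector(n_features=2)
normal = [(500 + (i % 7), 10 + (i % 3)) for i in range(100)]
flags = [detector.observe(p) for p in normal]
burst = detector.observe((50_000, 0.01))   # exfiltration-sized burst
print(sum(flags), burst)
```

Real intrusion-detection systems also have to handle the categorical and text fields she mentions, which is where more sophisticated unsupervised models come in; the sketch only covers the numeric, streaming core of the idea.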
There's a huge human component. You know, there are these phishing attacks that are really a huge source of the vulnerability that corporations, governments, and universities face. And so to be able to close that gap, and the understanding of the role that each individual plays in the vulnerability of a network, is key. And then also seeing the link between network activities, or the cyber realm, and physical systems, right. And so, you know, especially in cyber warfare, as in a remote cyber attack, unauthorized network activities can have real implications for physical systems. They can, you know, stop a vehicle from running properly in an autonomous vehicle. They can impact a SCADA system that's, you know, there to provide HVAC, for example. And much more grievous implications. And so, you know, definitely there's the human component. >> Yes, and humans are so vulnerable to the social engineering that goes on in those phishing attacks. And we've seen them get more and more personal, which is challenging. You're talking about, you know, sensitive data, personally identifiable data; using that against someone in cyber warfare is a huge challenge. >> Oh yeah, certainly. And it's one that computational thinking and mathematics can be leveraged to better understand, and to predict those patterns. And that's a very rich area for innovation. >> What would you say is the power of computational thinking in the industry? >> In industry at large? >> At large. >> Yes, I think that it is such a benefit to, you know, a burgeoning scientist, if they want to get into industry. There are so many opportunities, because computational thinking is needed. We need to be more objective, and it provides that objectivity, and it's so needed right now, especially with the emergence of data, you know, across industries. So there are so many opportunities for data scientists, whether it's in aerospace and defense, like Raytheon, or in the health industry.
And we saw with the pandemic the utility of mathematical modeling. There are just so many opportunities. >> Yeah, there are a lot of opportunities, and that's one of the themes, I think, of WiDS: just the opportunities, not just in data science, and for women. And there are obviously even high school girls that are here, which is so nice to see those young, fresh faces, but also opportunities to build your own network and your own personal board of directors, your mentors, your sponsors. There's tremendous opportunity in data science, and it's really all-encompassing, at least from my seat. >> Oh yeah, no, I completely agree with that. >> What are some of the things that you've heard at this WiDS event that inspire you, going, we're going in the right direction? If we think about International Women's Day tomorrow, "Breaking the Bias" is the theme; do you think we're on our way to breaking that bias? >> Definitely. You know, there was a panel today talking about the bias in data, in a variety of fields, and how we are, you know, discovering that bias and creating solutions to address it. So there was that panel. There was another talk by a speaker from Pinterest, who had presented some solutions that she and her team had derived to address bias there, in, you know, image recognition and search. And so I think that we've realized this bias, and, you know, in AI ethics, not only in these topics that I've mentioned, but also in the implications for, like, getting a loan, so economic implications as well. And so we're realizing those issues and bias now in AI, and we're addressing them. So I definitely am optimistic. I feel encouraged by the talks today at WiDS that, you know, not only are we recognizing the issues, but we're creating solutions. >> Right, taking steps to remediate those, so that we're ultimately going forward. You know, we know it's not possible to have unbiased data. That's not humanly possible, or probably mathematically possible.
But the steps that they're taking, they're going in the right direction. And a lot of it starts with awareness. >> Exactly. >> Of understanding there is bias in this data, regardless. All the people that are interacting with it, and touching it, and transforming it, and cleaning it, for example, that's all influencing the veracity of it. >> Oh, for sure. Exactly. You know, and I think that there are, for sure, solutions being discussed here, papers written by some of the speakers here, that are driving the solutions to the mitigation of this bias-in-data problem. So I agree a hundred percent with you that awareness is, you know, half the battle, if not more. And then, you know, that drives creation of solutions. >> And that's what we need, the creation of solutions. Nandi, thank you so much for joining me today. It was a pleasure talking with you about what you're doing with Raytheon, what you've done, and your path with mathematics, and what excites you about data science going forward. We appreciate your insights. >> Thank you so much. It was my pleasure. >> Good. For Nandi Leslie, I'm Lisa Martin. You're watching theCUBE's coverage of Women in Data Science 2022. Stick around, I'll be right back with my next guest. (upbeat flowing music)
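The efficient, streaming-capable anomaly detection Leslie describes in the interview can be illustrated with a minimal unsupervised baseline: an online z-score detector that keeps running mean and variance with Welford's algorithm, so each new observation is scored in constant time and memory. This is a toy sketch, not any Raytheon or production intrusion-detection method; the bytes-per-flow feature and the 3-sigma threshold are invented for illustration.

```python
import math

class StreamingAnomalyDetector:
    """Online z-score anomaly detector using Welford's algorithm.

    Keeps a running mean and variance so each observation is scored
    in O(1) time and O(1) memory, which is what makes it usable on
    streaming packet or flow data.
    """

    def __init__(self, threshold=3.0):
        self.threshold = threshold  # flag points more than k std devs out
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def update(self, x):
        """Fold one observation into the running statistics."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomaly(self, x):
        """Score x against the statistics so far, then absorb it."""
        if self.n < 2:
            self.update(x)
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        anomalous = std > 0 and abs(x - self.mean) / std > self.threshold
        self.update(x)
        return anomalous

# Illustrative only: bytes-per-flow values with one obvious outlier.
detector = StreamingAnomalyDetector(threshold=3.0)
flows = [500, 510, 495, 505, 502, 498, 507, 503, 499, 50000]
flags = [detector.is_anomaly(b) for b in flows]  # only the last flow is flagged
```

A real deployment would track many mixed features per flow and use richer models, but the constant-cost update per observation is what makes the streaming setting Leslie describes tractable.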

Published Date : Mar 7 2022


Analyst Predictions 2022: The Future of Data Management


 

[Music] >> In the 2010s, organizations became keenly aware that data would become the key ingredient in driving competitive advantage, differentiation, and growth. But to this day, putting data to work remains a difficult challenge for many, if not most, organizations. Now, as the cloud matures, it has become a game changer for data practitioners by making cheap storage and massive processing power readily accessible. We've also seen better tooling in the form of data workflows, streaming, machine intelligence, AI, developer tools, security, observability, automation, new databases, and the like. These innovations accelerate data proficiency, but at the same time they add complexity for practitioners. Data lakes, data hubs, data warehouses, data marts, data fabrics, data meshes, data catalogs, data oceans are forming, evolving, and exploding onto the scene. So in an effort to bring perspective to the sea of optionality, we've brought together the brightest minds in the data analyst community to discuss how data management is morphing and what practitioners should expect in 2022 and beyond. Hello everyone, my name is Dave Vellante with theCUBE, and I'd like to welcome you to a special Cube presentation: Analyst Predictions 2022, the Future of Data Management. We've gathered six of the best analysts in data and data management, who are going to present and discuss their top predictions and trends for 2022 and the first half of this decade. Let me introduce our six power panelists. Sanjeev Mohan is former Gartner analyst and principal at SanjMo. Tony Baer is principal at dbInsight. Carl Olofson is well-known research vice president with IDC. Dave Menninger is senior vice president and research director at Ventana Research. Brad Shimmin is chief analyst for AI platforms, analytics, and data management at Omdia. And Doug Henschen is vice president and principal analyst at Constellation Research. Gentlemen, welcome to the program, and thanks for coming on theCUBE today. >> Great to be here. >> Thank you. >> All right, here's the
format we're going to use. I, as moderator, am going to call on each analyst separately, who then will deliver their prediction or megatrend, and then, in the interest of time management and pace, two analysts will have the opportunity to comment. If we have more time, we'll elongate it, but let's get started right away. Sanjeev Mohan, please kick it off. You want to talk about governance, go ahead, sir. >> Thank you, Dave. I believe that data governance, which we've been talking about for many years, is now not only going to be mainstream, it's going to be table stakes. And all the things that you mentioned, you know, with data oceans, data lakes, lakehouses, data fabrics, meshes, the common glue is metadata. If we don't understand what data we have and how we are governing it, there is no way we can manage it. So we saw Informatica go public last year after a hiatus of six years. I'm predicting that this year we see some more companies go public. My bet is on Collibra most likely, and maybe Alation we'll see go public this year. I'm also predicting that the scope of data governance is going to expand beyond just data. It's not just data and reports. We are going to see more transformations, like Spark jobs, Python, even Airflow. We're going to see more of streaming data, so from Kafka, Schema Registry, for example. We will see AI models become part of this whole governance suite. So the governance suite is going to be very comprehensive, very detailed lineage, impact analysis, and then even expand into data quality. We've already seen that happen with some of the tools, where they are buying these smaller companies and bringing in data quality monitoring and integrating it with metadata management, data catalogs, also data access governance. So what we are going to see is that, once the data governance platforms become the key entry point into these modern architectures, I'm predicting that the usage, the number of users, of a data catalog is going to exceed that of a BI tool. That will take time, and we
have already seen that trajectory. Right now, if you look at BI tools, I would say there are 100 users to a BI tool to one data catalog, and I see that evening out over a period of time. At some point, data catalogs will really become, you know, the main way for us to access data. The data catalog will help us visualize data, but if we want to do more in-depth analysis, it'll be the jumping-off point into the BI tool, the data science tool, and that is the journey I see for the data governance products. >> Excellent, thank you. Some comments? Maybe Doug, a lot of things to weigh in on there, maybe you could comment. >> Yeah, Sanjeev, I think you're spot on, a lot of the trends. The one disagreement, I think it's really still far from mainstream. As you say, we've been talking about this for years; it's like God, motherhood, apple pie. Everyone agrees it's important, but too few organizations are really practicing good governance, because it's hard and because the incentives have been lacking. I think one thing that deserves mention in this context is ESG mandates and guidelines. These are environmental, social, and governance regs and guidelines. We've seen the environmental regs and guidelines imposed in industries, particularly the carbon-intensive industries. We've seen the social mandates, particularly diversity, imposed on suppliers by companies that are leading on this topic. We've seen governance guidelines now being imposed by banks and investors. So these ESGs are presenting new carrots and sticks, and it's going to demand more solid data, it's going to demand more detailed reporting and solid reporting, tighter governance. But we're still far from mainstream adoption. We have a lot of, you know, best-of-breed niche players in the space. I think the signs that it's going to be more mainstream are starting with things like Azure Purview, Google Dataplex. The big cloud platform players seem to be upping the ante and starting to address governance.
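The lineage and impact analysis Sanjeev calls out as core catalog capabilities can be sketched in miniature as a directed graph over data assets, with impact analysis as a breadth-first walk downstream. This is a hypothetical toy, not the data model of Collibra, Alation, Purview, or any real catalog; all asset names are invented for illustration.

```python
from collections import defaultdict, deque

class LineageCatalog:
    """Toy metadata catalog: assets plus directed lineage edges.

    Impact analysis is then just a breadth-first walk over the
    downstream edges: 'if this asset changes, what is affected?'
    """

    def __init__(self):
        self.assets = {}                    # name -> description
        self.downstream = defaultdict(set)  # source -> derived assets

    def register(self, name, description):
        self.assets[name] = description

    def add_lineage(self, source, target):
        self.downstream[source].add(target)

    def impact(self, name):
        """Return all assets transitively derived from `name`."""
        seen, queue = set(), deque([name])
        while queue:
            for child in self.downstream[queue.popleft()]:
                if child not in seen:
                    seen.add(child)
                    queue.append(child)
        return seen

# Invented example: raw stream -> cleaned table -> BI report.
catalog = LineageCatalog()
catalog.register("raw.orders", "Raw order events from a Kafka topic")
catalog.register("dw.orders_clean", "Deduplicated orders table")
catalog.register("bi.revenue_report", "Daily revenue dashboard")
catalog.add_lineage("raw.orders", "dw.orders_clean")
catalog.add_lineage("dw.orders_clean", "bi.revenue_report")

affected = catalog.impact("raw.orders")
```

Real catalogs track far more (owners, quality metrics, access policies), but the lineage graph is the piece that turns a catalog from a passive inventory into the jumping-off point Sanjeev describes.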
>> Excellent, thank you, Doug. Brad, I wonder if you could chime in as well. >> Yeah, I would love to be a believer in data catalogs, but to Doug's point, I think that it's going to take some more pressure for that to happen. I recall metadata being something every enterprise thought they were going to get under control when we were working on service-oriented architecture back in the '90s, and that didn't happen quite the way we anticipated. And to Sanjeev's point, it's because it is really complex and really difficult to do. My hope is that, you know, we won't sort of, how do we put this, fade out into this nebulous nebula of domain catalogs that are specific to individual use cases, like Purview for getting data quality right, or like data governance and cybersecurity, and instead we have some tooling that can actually be adaptive, to gather metadata to create something I know is important to you, Sanjeev, and that is this idea of observability. If you can get enough metadata, without moving your data around, but understanding the entirety of a system that's running on this data, you can do a lot to help with the governance that Doug is talking about. >> So I just want to add that, you know, data governance, like many other initiatives, did not succeed; even AI went into an AI winter, but that's a different topic. A lot of these things did not succeed because, to your point, the incentives were not there. I remember when Sarbanes-Oxley had come onto the scene: if a bank did not do Sarbanes-Oxley, they were very happy to pay a million-dollar fine; that was, like, you know, pocket change for them, instead of doing the right thing. But I think the stakes are much higher now. With GDPR, the floodgates opened. Now, you know, California has CCPA, but even CCPA is being outdated by CPRA, which is much more GDPR-like. So we are very rapidly entering a space where pretty much every major country in the world is coming up with its own compliance regulatory requirements.
Data residency is becoming really important, and I think we are going to reach a stage where it won't be optional anymore, whether we like it or not. And I think the reason data catalogs were not successful in the past is because we did not have the right focus on adoption. We were focused on features, and these features were disconnected, very hard for the business to adopt. These were built by IT people for IT departments to take a look at technical metadata, not business metadata. Today the tables have turned: CDOs are driving this initiative, regulatory compliances are beating down hard, so I think the time might be right. >> Yeah, so guys, we have to move on here, but there's some real meat on the bone. Sanjeev, I like the fact that you called out Collibra and Alation, so we can look back a year from now and say, okay, he made the call, he stuck it. And then the ratio of BI tools to data catalogs, that's another sort of measurement that we can take, even though there's some skepticism there; that's something that we can watch. And I wonder if someday we'll have more metadata than data. But I want to move to Tony Baer. You want to talk about data mesh, and speaking, you know, coming off of governance, I mean, wow, the whole concept of data mesh is decentralized data, and then governance becomes, you know, a nightmare there. But take it away, Tony. >> We'll put it this way: data mesh, you know, the idea, at least as proposed by Thoughtworks, basically was unleashed a couple years ago, and the press has been almost uniformly uncritical. A good reason for that is all the problems that Sanjeev and Doug and Brad were just speaking about, which is that we have all this data out there and we don't know what to do about it. Now, that's not a new problem; that was a problem when we had enterprise data warehouses, it was a problem when we had our Hadoop data clusters. It's even more of a problem now that the data's
out in the cloud, where the data is not only your data, like, it's not only S3, it's all over the place, and it's also including streaming, which I know we'll be talking about later. So the data mesh was a response to that, the idea that we need to debate, you know, who are the folks that really know best about governance, and that's the domain experts. So data mesh was basically an architectural pattern and a process. My prediction for this year is that data mesh is going to hit cold, hard reality, because if you do a Google search, basically the published work, the articles, have been largely, you know, pretty uncritical so far, basically lauding it as a very revolutionary new idea. I don't think it's that revolutionary, because we've talked about ideas like this before. Brad and I met years ago, when we were talking about SOA, and decentralizing all of this was at the application level; now we're talking about it at the data level, and now we have microservices. So there's this thought of, oh, if we manage our apps in cloud native through microservices, why don't we think of data in the same way? My sense this year, and this has been a very active search if you look at Google search trends, is that now enterprises are going to look at this seriously, and as they look at it seriously, it's going to attract its first real hard scrutiny, it's going to attract its first backlash. That's not necessarily a bad thing; it means that it's being taken seriously. The reason why I think you'll start to see the cold, hard light of day shine on data mesh is that it's still a work in progress. You know, this idea is basically a couple years old, and there's still some pretty major gaps. The biggest gap is in the area of federated governance. Now, federated governance itself is not a new issue. With federated governance, we're trying to figure out
how we can basically strike the balance between, let's say, consistent enterprise policy, consistent enterprise governance, and yet the groups that understand the data, that know how to work with it; how do we basically sort of balance the two? There's a huge gap there in practice and knowledge. Also, to a lesser extent, there's a technology gap, which is basically in the self-service technologies that will help teams essentially govern data through the full life cycle: from selecting the data, from building the pipelines, from determining your access control, looking at quality, looking at basically whether data is fresh or whether or not it's trending off course. So my prediction is that it will really receive the first harsh scrutiny this year. You are going to see some enterprises declare premature victory when they build some federated query implementations. You're going to see vendors start to data-mesh-wash their products; anybody in the data management space, whether it's basically a pipelining tool, whether it's ELT, whether it's a catalog or a federated query tool, they're all going to be, you know, basically promoting the fact of how they support this. Hopefully nobody is going to call themselves a data mesh tool, because data mesh is not a technology. We're going to see one other thing come out of this, and this harks back to the metadata that Sanjeev was talking about and the catalogs that he was talking about, which is that there's going to be a renewed focus on metadata, and I think that's going to spur interest in data fabrics. Now, data fabrics are pretty vaguely defined, but if we just take the most elemental definition, which is a common metadata backplane, I think that if anybody is going to get serious about data mesh, they need to look at a data fabric,
because we all, at the end of the day, need to read from the same sheet of music. >> So thank you, Tony. Dave Menninger, I mean, one of the things that people like about data mesh is it pretty crisply articulates some of the flaws in today's organizational approaches to data. What are your thoughts on this? >> Well, I think we have to start by defining data mesh, right? The term is already getting corrupted. Tony said it's going to see the cold, hard light of day, and there's a problem right now that there are a number of overlapping terms that are similar but not identical. So we've got data virtualization, data fabric, excuse me for a second, sorry about that, data virtualization, data fabric, data federation, right? So I think that it's not really clear what each vendor means by these terms. I see data mesh and data fabric becoming quite popular. I've interpreted data mesh as referring primarily to the governance aspects, as originally, you know, intended and specified, but that's not the way I see vendors using it; I see vendors using it much more to mean data fabric and data virtualization. So I'm going to comment on the group of those things. I think the group of those things is going to happen; they're going to become more robust. Our research suggests that a quarter of organizations are already using virtualized access to their data lakes, and another half, so a total of three quarters, will eventually be accessing their data lakes using some sort of virtualized access. Again, whether you define it as mesh or fabric or virtualization isn't really the point here, but this notion that there are different elements of data, metadata, and governance within an organization that all need to be managed collectively. The interesting thing is, when you look at the satisfaction rates of those organizations using virtualization versus those that are not, it's almost double: 68%, I'm sorry, 79% of organizations that were using
virtualized access expressed satisfaction with their access to the data lake; only 39% expressed satisfaction if they weren't using virtualized access. >> So thank you, Dave. Sanjeev, we've just got about a couple minutes on this topic, but I know you're speaking, or maybe you've spoken already, on a panel with Zhamak Dehghani, who sort of invented the concept. Governance obviously is a big sticking point, but what are your thoughts on this? You are muted. >> So my message to Zhamak and to the community is, as opposed to what Dave said, let's not define it. We spent the whole year defining it. There are four principles: domain, product, data infrastructure, and governance. Let's take it to the next level. I get a lot of questions on what is the difference between data fabric and data mesh, and I'm like, I can't compare the two, because data mesh is a business concept and data fabric is a data integration pattern. How do you compare the two? You have to bring data mesh a level down. So to Tony's point, I'm on a warpath in 2022 to take it down to: what does a data product look like, how do we handle shared data across domains and govern it? And I think what we are going to see more of in 2022 is operationalization of data mesh. >> I think we could have a whole hour on this topic, couldn't we? Maybe we should do that. But let's move to Carl. Carl, you're a database guy, you've been around that block for a while now. You want to talk about graph databases? Bring it on. >> Oh yeah, okay, thanks. So I regard graph database as basically the next truly revolutionary database management technology. I'm looking forward, for the graph database market, which of course we haven't defined yet, so obviously I have a little wiggle room in what I'm about to say, to this market growing by about 600 percent over the next 10 years. Now, 10 years is a long time, but over the next five years we expect to see gradual growth as people start to learn how to use it. The problem isn't that it's used; the
problem is not that it's not useful; it's that people don't know how to use it. So let me explain, before I go any further, what a graph database is, because some of the folks on the call may not know what it is. A graph database organizes data according to a mathematical structure called a graph. A graph has elements called nodes and edges. So a data element drops into a node, the nodes are connected by edges, and the edges connect one node to another node. Combinations of edges create structures that you can analyze to determine how things are related. In some cases the nodes and edges can have properties attached to them, which add additional informative material that makes it richer; that's called a property graph. Okay, there are two principal use cases for graph databases. There are semantic graphs, which are used to break down human language text into semantic structures; then you can search it, organize it, and answer complicated questions. A lot of AI is aimed at semantic graphs. Another kind is the property graph that I just mentioned, which has a dazzling number of use cases. I want to just point out, as I talk about this, people are probably wondering, well, we have relational databases, isn't that good enough? Okay, so a relational database supports what I call definitional relationships. That means you define the relationships in a fixed structure, the database drops into that structure, there's a foreign key value that relates one table to another, and that value is fixed. You don't change it; if you change it, the database becomes unstable, and it's not clear what you're looking at. In a graph database, the system is designed to handle change, so that it can reflect the true state of the things that it's being used to track. So let me just give you some examples of use cases for this. They include entity resolution, data lineage, social media analysis, customer 360, fraud prevention; there's cybersecurity, there's strong supply
chain, which is a big one actually. There's explainable AI, and this is going to become important, too, because a lot of people are adopting AI, but they want a system, after the fact, to say, how did the AI system come to that conclusion, how did it make that recommendation? Right now we don't have really good ways of tracking that. Okay, machine learning in general; social network, I already mentioned that. And then we've got, oh gosh, we've got data governance, data compliance, risk management; we've got recommendation, we've got personalization, anti-money laundering, that's another big one; identity and access management; network and IT operations is already becoming a key one, where you actually have mapped out your operation, your, you know, whatever it is, your data center, and you can track what's going on as things happen there; root cause analysis; fraud detection is a huge one, a number of major credit card companies use graph databases for fraud detection; risk analysis; tracking and tracing; churn analysis; next best action; what-if analysis; impact analysis; entity resolution. And I would add one other thing, or just a few other things, to this list: metadata management. So, Sanjeev, here you go, this is your engine, okay, because I was in metadata management for quite a while in my past life, and one of the things I found was that none of the data management technologies that were available to us could efficiently handle metadata, because of the kinds of structures that result from it, but graphs can. Okay, graphs can do things like say, this term in this context means this, but in that context it means that, okay, things like that. And in fact, logistics management, supply chain, also, because it handles recursive relationships. By recursive relationships I mean objects that own other objects that are of the same type. You can do things like bill of materials, you know, like parts explosion; you can do an HR analysis, who reports to whom, how many levels up the chain, and that kind of thing. You
can do that with relational databases, but yes, it takes a lot of programming. In fact, you can do almost any of these things with relational databases, but the problem is you have to program it; it's not supported in the database. And whenever you have to program something, that means you can't trace it, you can't define it, you can't publish it in terms of its functionality, and it's really, really hard to maintain over time. >> So, Carl, thank you. I wonder if we could bring Brad in. I mean, Brad, I'm sitting there wondering, okay, is this incremental to the market, is it disruptive and a replacement? What are your thoughts on this space? >> It's already disrupted the market. I mean, like Carl said, go to any bank and ask them, are you using graph databases to get fraud detection under control, and they'll say, absolutely, that's the only way to solve this problem. And it is, frankly. And it's the only way to solve a lot of the problems that Carl mentioned. And that is, I think, its Achilles' heel in some ways, because, you know, it's like finding the best way to cross the seven bridges of Königsberg. It's always going to kind of be tied to those use cases, because it's really special and it's really unique, and because it's special and it's unique, it still, unfortunately, kind of stands apart from the rest of the community that's building, let's say, AI outcomes, as the great example here. Graph databases and AI, as Carl mentioned, are like chocolate and peanut butter, but technologically they don't know how to talk to one another; they're completely different. And, you know, you can't just stand up SQL and query them; you've got to learn, um, what is that, Cypher or, uh, SPARQL, yeah, thank you, to actually get to the data in there. And if you're going to scale that data, that graph database, especially a property graph, if you're going to do something really complex, like try to understand, you know, all of the metadata in your organization, you might just end
up with you know a graph database winter like we had the ai winter simply because you run out of performance to make the thing happen so i i think it's already disrupted but we we need to like treat it like a first-class citizen in in the data analytics and ai community we need to bring it into the fold we need to equip it with the tools it needs to do that the magic it does and to do it not just for specialized use cases but for everything because i i'm with carl i i think it's absolutely revolutionary so i had also identified the principal achilles heel of the technology which is scaling now when these when these things get large and complex enough that they spill over what a single server can handle you start to have difficulties because the relationships span things that have to be resolved over a network and then you get network latency and that slows the system down so that's still a problem to be solved sanjeev any quick thoughts on this i mean i think metadata on the on the on the word cloud is going to be the the largest font uh but what are your thoughts here i want to like step away so people don't you know associate me with only meta data so i want to talk about something a little bit slightly different uh dbengines.com has done an amazing job i think almost everyone knows that they chronicle all the major databases that are in use today in january of 2022 there are 381 databases on its list of ranked list of databases the largest category is rdbms the second largest category is actually divided into two property graphs and rdf graphs these two together make up the second largest number of data databases so talking about accolades here this is a problem the problem is that there's so many graph databases to choose from they come in different shapes and forms uh to bright's point there's so many query languages in rdbms is sql end of the story here we've got sci-fi we've got gremlin we've got gql and then your proprietary languages so i think there's a 
lot of disparity in this space but excellent all excellent points sanji i must say and that is a problem the languages need to be sorted and standardized and it needs people need to have a road map as to what they can do with it because as you say you can do so many things and so many of those things are unrelated that you sort of say well what do we use this for i'm reminded of the saying i learned a bunch of years ago when somebody said that the digital computer is the only tool man has ever devised that has no particular purpose all right guys we gotta we gotta move on to dave uh meninger uh we've heard about streaming uh your prediction is in that realm so please take it away sure so i like to say that historical databases are to become a thing of the past but i don't mean that they're going to go away that's not my point i mean we need historical databases but streaming data is going to become the default way in which we operate with data so in the next say three to five years i would expect the data platforms and and we're using the term data platforms to represent the evolution of databases and data lakes that the data platforms will incorporate these streaming capabilities we're going to process data as it streams into an organization and then it's going to roll off into historical databases so historical databases don't go away but they become a thing of the past they store the data that occurred previously and as data is occurring we're going to be processing it we're going to be analyzing we're going to be acting on it i mean we we only ever ended up with historical databases because we were limited by the technology that was available to us data doesn't occur in batches but we processed it in batches because that was the best we could do and it wasn't bad and we've continued to improve and we've improved and we've improved but streaming data today is still the exception it's not the rule right there's there are projects within organizations that deal 
with streaming data but it's not the default way in which we deal with data yet and so that that's my prediction is that this is going to change we're going to have um streaming data be the default way in which we deal with data and and how you label it what you call it you know maybe these databases and data platforms just evolve to be able to handle it but we're going to deal with data in a different way and our research shows that already about half of the participants in our analytics and data benchmark research are using streaming data you know another third are planning to use streaming technologies so that gets us to about eight out of ten organizations need to use this technology that doesn't mean they have to use it throughout the whole organization but but it's pretty widespread in its use today and has continued to grow if you think about the consumerization of i.t we've all been conditioned to expect immediate access to information immediate responsiveness you know we want to know if an uh item is on the shelf at our local retail store and we can go in and pick it up right now you know that's the world we live in and that's spilling over into the enterprise i.t world where we have to provide those same types of capabilities um so that's my prediction historical database has become a thing of the past streaming data becomes the default way in which we we operate with data all right thank you david well so what what say you uh carl a guy who's followed historical databases for a long time well one thing actually every database is historical because as soon as you put data in it it's now history it's no longer it no longer reflects the present state of things but even if that history is only a millisecond old it's still history but um i would say i mean i know you're trying to be a little bit provocative in saying this dave because you know as well as i do that people still need to do their taxes they still need to do accounting they still need to run 
general ledger programs and things like that that all involves historical data that's not going to go away unless you want to go to jail so you're going to have to deal with that but as far as the leading edge functionality i'm totally with you on that and i'm just you know i'm just kind of wondering um if this chain if this requires a change in the way that we perceive applications in order to truly be manifested and rethinking the way m applications work um saying that uh an application should respond instantly as soon as the state of things changes what do you say about that i i think that's true i think we do have to think about things differently that's you know it's not the way we design systems in the past uh we're seeing more and more systems designed that way but again it's not the default and and agree 100 with you that we do need historical databases you know that that's clear and even some of those historical databases will be used in conjunction with the streaming data right so absolutely i mean you know let's take the data warehouse example where you're using the data warehouse as context and the streaming data as the present you're saying here's a sequence of things that's happening right now have we seen that sequence before and where what what does that pattern look like in past situations and can we learn from that so tony bear i wonder if you could comment i mean if you when you think about you know real-time inferencing at the edge for instance which is something that a lot of people talk about um a lot of what we're discussing here in this segment looks like it's got great potential what are your thoughts yeah well i mean i think you nailed it right you know you hit it right on the head there which is that i think a key what i'm seeing is that essentially and basically i'm going to split this one down the middle is i don't see that basically streaming is the default what i see is streaming and basically and transaction databases um and 
analytics data you know data warehouses data lakes whatever are converging and what allows us technically to converge is cloud native architecture where you can basically distribute things so you could have you can have a note here that's doing the real-time processing that's also doing it and this is what your leads in we're maybe doing some of that real-time predictive analytics to take a look at well look we're looking at this customer journey what's happening with you know you know with with what the customer is doing right now and this is correlated with what other customers are doing so what i so the thing is that in the cloud you can basically partition this and because of basically you know the speed of the infrastructure um that you can basically bring these together and or and so and kind of orchestrate them sort of loosely coupled manner the other part is that the use cases are demanding and this is part that goes back to what dave is saying is that you know when you look at customer 360 when you look at let's say smart you know smart utility grids when you look at any type of operational problem it has a real-time component and it has a historical component and having predictives and so like you know you know my sense here is that there that technically we can bring this together through the cloud and i think the use case is that is that we we can apply some some real-time sort of you know predictive analytics on these streams and feed this into the transactions so that when we make a decision in terms of what to do as a result of a transaction we have this real time you know input sanjeev did you have a comment yeah i was just going to say that to this point you know we have to think of streaming very different because in the historical databases we used to bring the data and store the data and then we used to run rules on top uh aggregations and all but in case of streaming the mindset changes because the rules normally the inference all of that is 
fixed but the data is constantly changing so it's a completely reverse way of thinking of uh and building applications on top of that so dave menninger there seemed to be some disagreement about the default or now what kind of time frame are you are you thinking about is this end of decade it becomes the default what would you pin i i think around you know between between five to ten years i think this becomes the reality um i think you know it'll be more and more common between now and then but it becomes the default and i also want sanjeev at some point maybe in one of our subsequent conversations we need to talk about governing streaming data because that's a whole other set of challenges we've also talked about it rather in a two dimensions historical and streaming and there's lots of low latency micro batch sub second that's not quite streaming but in many cases it's fast enough and we're seeing a lot of adoption of near real time not quite real time as uh good enough for most for many applications because nobody's really taking the hardware dimension of this information like how do we that'll just happen carl so near real time maybe before you lose the customer however you define that right okay um let's move on to brad brad you want to talk about automation ai uh the the the pipeline people feel like hey we can just automate everything what's your prediction yeah uh i'm i'm an ai fiction auto so apologies in advance for that but uh you know um i i think that um we've been seeing automation at play within ai for some time now and it's helped us do do a lot of things for especially for practitioners that are building ai outcomes in the enterprise uh it's it's helped them to fill skills gaps it's helped them to speed development and it's helped them to to actually make ai better uh because it you know in some ways provides some swim lanes and and for example with technologies like ottawa milk and can auto document and create that sort of transparency that that 
we talked about a little bit earlier um but i i think it's there's an interesting kind of conversion happening with this idea of automation um and and that is that uh we've had the automation that started happening for practitioners it's it's trying to move outside of the traditional bounds of things like i'm just trying to get my features i'm just trying to pick the right algorithm i'm just trying to build the right model uh and it's expanding across that full life cycle of building an ai outcome to start at the very beginning of data and to then continue on to the end which is this continuous delivery and continuous uh automation of of that outcome to make sure it's right and it hasn't drifted and stuff like that and because of that because it's become kind of powerful we're starting to to actually see this weird thing happen where the practitioners are starting to converge with the users and that is to say that okay if i'm in tableau right now i can stand up salesforce einstein discovery and it will automatically create a nice predictive algorithm for me um given the data that i that i pull in um but what's starting to happen and we're seeing this from the the the companies that create business software so salesforce oracle sap and others is that they're starting to actually use these same ideals and a lot of deep learning to to basically stand up these out of the box flip a switch and you've got an ai outcome at the ready for business users and um i i'm very much you know i think that that's that's the way that it's going to go and what it means is that ai is is slowly disappearing uh and i don't think that's a bad thing i think if anything what we're going to see in 2022 and maybe into 2023 is this sort of rush to to put this idea of disappearing ai into practice and have as many of these solutions in the enterprise as possible you can see like for example sap is going to roll out this quarter this thing called adaptive recommendation services which which 
basically is a cold start ai outcome that can work across a whole bunch of different vertical markets and use cases it's just a recommendation engine for whatever you need it to do in the line of business so basically you're you're an sap user you look up to turn on your software one day and you're a sales professional let's say and suddenly you have a recommendation for customer churn it's going that's great well i i don't know i i think that's terrifying in some ways i think it is the future that ai is going to disappear like that but i am absolutely terrified of it because um i i think that what it what it really does is it calls attention to a lot of the issues that we already see around ai um specific to this idea of what what we like to call it omdia responsible ai which is you know how do you build an ai outcome that is free of bias that is inclusive that is fair that is safe that is secure that it's audible etc etc etc etc that takes some a lot of work to do and so if you imagine a customer that that's just a sales force customer let's say and they're turning on einstein discovery within their sales software you need some guidance to make sure that when you flip that switch that the outcome you're going to get is correct and that's that's going to take some work and so i think we're going to see this let's roll this out and suddenly there's going to be a lot of a lot of problems a lot of pushback uh that we're going to see and some of that's going to come from gdpr and others that sam jeeve was mentioning earlier a lot of it's going to come from internal csr requirements within companies that are saying hey hey whoa hold up we can't do this all at once let's take the slow route let's make ai automated in a smart way and that's going to take time yeah so a couple predictions there that i heard i mean ai essentially you disappear it becomes invisible maybe if i can restate that and then if if i understand it correctly brad you're saying there's a backlash in 
the near term people can say oh slow down let's automate what we can those attributes that you talked about are non trivial to achieve is that why you're a bit of a skeptic yeah i think that we don't have any sort of standards that companies can look to and understand and we certainly within these companies especially those that haven't already stood up in internal data science team they don't have the knowledge to understand what that when they flip that switch for an automated ai outcome that it's it's gonna do what they think it's gonna do and so we need some sort of standard standard methodology and practice best practices that every company that's going to consume this invisible ai can make use of and one of the things that you know is sort of started that google kicked off a few years back that's picking up some momentum and the companies i just mentioned are starting to use it is this idea of model cards where at least you have some transparency about what these things are doing you know so like for the sap example we know for example that it's convolutional neural network with a long short-term memory model that it's using we know that it only works on roman english uh and therefore me as a consumer can say oh well i know that i need to do this internationally so i should not just turn this on today great thank you carl can you add anything any context here yeah we've talked about some of the things brad mentioned here at idc in the our future of intelligence group regarding in particular the moral and legal implications of having a fully automated you know ai uh driven system uh because we already know and we've seen that ai systems are biased by the data that they get right so if if they get data that pushes them in a certain direction i think there was a story last week about an hr system that was uh that was recommending promotions for white people over black people because in the past um you know white people were promoted and and more productive than 
black people but not it had no context as to why which is you know because they were being historically discriminated black people being historically discriminated against but the system doesn't know that so you know you have to be aware of that and i think that at the very least there should be controls when a decision has either a moral or a legal implication when when you want when you really need a human judgment it could lay out the options for you but a person actually needs to authorize that that action and i also think that we always will have to be vigilant regarding the kind of data we use to train our systems to make sure that it doesn't introduce unintended biases and to some extent they always will so we'll always be chasing after them that's that's absolutely carl yeah i think that what you have to bear in mind as a as a consumer of ai is that it is a reflection of us and we are a very flawed species uh and so if you look at all the really fantastic magical looking supermodels we see like gpt three and four that's coming out z they're xenophobic and hateful uh because the people the data that's built upon them and the algorithms and the people that build them are us so ai is a reflection of us we need to keep that in mind yeah we're the ai's by us because humans are biased all right great okay let's move on doug henson you know a lot of people that said that data lake that term's not not going to not going to live on but it appears to be have some legs here uh you want to talk about lake house bring it on yes i do my prediction is that lake house and this idea of a combined data warehouse and data lake platform is going to emerge as the dominant data management offering i say offering that doesn't mean it's going to be the dominant thing that organizations have out there but it's going to be the predominant vendor offering in 2022. 
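The lakehouse idea Doug describes (one store serving both warehouse-style SQL and lake-style access to raw, semi-structured payloads) can be sketched in a few lines. This is purely an illustration of the concept, not any vendor's implementation; the stdlib `sqlite3` module stands in for the converged platform, and all table names and data are invented.

```python
import sqlite3
import json

# One store holding both structured columns (warehouse side) and
# semi-structured JSON payloads (lake side).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user TEXT, amount REAL, payload TEXT)")
rows = [
    ("alice", 10.0, json.dumps({"device": "mobile", "clicks": 3})),
    ("bob",    5.0, json.dumps({"device": "web"})),
    ("alice", 20.0, json.dumps({"device": "web", "clicks": 7})),
]
con.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# Warehouse-style access: a tight SQL aggregate over structured columns.
total = con.execute(
    "SELECT user, SUM(amount) FROM events GROUP BY user ORDER BY user"
).fetchall()
print(total)  # [('alice', 30.0), ('bob', 5.0)]

# Lake-style access: pull the raw payloads out for data science work.
devices = [json.loads(p)["device"]
           for (p,) in con.execute("SELECT payload FROM events")]
print(devices)  # ['mobile', 'web', 'web']
```

The point of the sketch is Doug's "both sides" test: the same engine has to satisfy the warehouse query pattern and hand the raw records to data scientists, which is exactly what the lakehouse vendors are each proving from their own starting point.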
Now, heading into 2021, we already had Cloudera, Databricks, Microsoft, and Snowflake as proponents. In 2021, SAP, Oracle, and several of the fabric, virtualization, and mesh vendors joined the bandwagon. The promise is that you have one platform that manages your structured, unstructured, and semi-structured information, and that it addresses both the BI and analytics needs and the data science needs. The real promise there is simplicity and lower cost, but I think end users have to answer a few questions. The first is: does your organization really have a center of data gravity, or is the data highly distributed: multiple data warehouses, multiple data lakes, on premises and in the cloud? If it's very distributed, and you have difficulty consolidating and that's not really a goal for you, then maybe that single platform is unrealistic and not likely to add value. The fabric and virtualization vendors, the mesh idea: if you have that highly distributed situation, that might be a better path forward. The second question, if you are looking at one of these lakehouse offerings, consolidating and simplifying onto a single platform: you have to make sure that it meets both the warehouse need and the data lake need. You have vendors like Databricks, and Microsoft with Azure Synapse, that are new to the data warehouse space, and they're having to prove that the data warehouse capabilities on their platforms can meet the scaling requirements, the user and query concurrency requirements, and those tight SLAs. On the other hand, you have Oracle, SAP, and Snowflake, the data warehouse folks, coming into the data science world, and they have to prove that they can manage unstructured information and meet the needs of data scientists. I'm seeing a lot of the lakehouse offerings from the warehouse crowd managing that unstructured information in columns and rows, and some of these vendors, Snowflake in particular, are really relying on partners for the data science needs. So you really have to look at a lakehouse offering and make sure that it meets both the warehouse and the data lake requirement.

Well, thank you, Doug. Tony, if those two worlds are going to come together, as Doug was saying, the analytics world and the data science world, does there need to be some kind of semantic layer in between? Weigh in on this topic, if you would.

Didn't we talk about data fabrics before, a common metadata layer? Actually, I'm almost tempted to say let's declare victory and go home, in that this has been going on for a while. I agree with much of what Doug is saying. I remember as far back as 2014, I was doing a study (I was still at Ovum, Omdia's predecessor) looking at all these specialized databases that were coming up, and seeing that there was overlap at the edges, but there was still going to be a reason at the time that you would have, say, a document database for JSON, a relational database for transactions and for data warehousing, and something that at that time resembled what we now consider a data lake. What I was saying then, about five or six years ago, was that you were seeing blending at the edges, and the lakehouse is essentially the current manifestation of that idea. There is a dichotomy; it's the old argument: do we centralize it all in a single place, or do we virtualize? I think it's always going to be a yin and yang; there's never going to be a single silver bullet. I do see that there are also going to be questions, and these are points that Doug raised: what do you need for your performance characteristics? Do you need, for instance, high concurrency? Do you need the ability to do some very sophisticated joins? Or is your requirement more to distribute your processing as far as possible, to essentially take a brute-force approach? All of these approaches are valid based on the use case. I just see the lakehouse (a relatively new term introduced by Databricks a couple of years ago) as the culmination of a long-time trend. And what we see in the cloud is data warehouses treating this as a checkbox item: "Hey, we can source data from cloud storage, S3, Azure Blob Storage, whatever, as long as it's in certain formats, like Parquet or CSV." I see that becoming a checkbox item, so to that extent I think the lakehouse, depending on how you define it, is already reality. In some cases it's new terminology, but not a whole heck of a lot new under the sun.

Yeah. And, Dave Menninger, thank you, Tony, but a lot of this is going to come down to vendor marketing, right? Some people try to co-opt the term; we talked about data mesh washing. What are your thoughts on this?

Yeah, so I used the term "data platform" earlier, and part of the reason I use that term is that it's more vendor-neutral. We've tried to stay out of the vendor terminology-patenting world. Whether the term "lakehouse" is what sticks or not, the concept is certainly going to stick, and we have some data to back that up. About a quarter of organizations that are using data lakes today already incorporate data warehouse functionality into them, so they consider their data lake and data warehouse one and the same. About a quarter of organizations (a little less, but about a quarter) feed the data lake from the data warehouse, and about a quarter of organizations feed the data warehouse from the data lake. So it's pretty obvious that three quarters of organizations need to bring this stuff together. The need is there, the need is apparent, and the technology is going to continue to converge. You've got data lake people over here at one end (and I'm not going to talk about why people thought data lakes were a bad idea, because they thought you just throw stuff on a server and ignore it; that's not what a data lake is), and you've got database and data warehouse people over here. Database vendors are adding data lake capabilities, and data lake vendors are adding data warehouse capabilities, so it's obvious that they're going to meet in the middle. Like Tony says, I think we should declare victory and go home.

So, just to follow up on that: are you saying the specialized lake and the specialized warehouse go away? I mean, Zhamak Dehghani and the data mesh advocates would say they could all live as just nodes on the mesh, but based on what Dave just said, are we going to see those all morph together?

Well, number one, as I was saying before, there's always going to be this centrifugal force, this tug of war between "do we centralize the data?" and "do we virtualize?" The fact is, I don't think there's ever going to be any single answer. In terms of data mesh: data mesh has nothing to do with how you physically implement the data. You could have a data mesh on a data warehouse; the difference is that we're using the same physical data store but everybody's logically governing it differently. A data mesh is not a technology; it's a process, a governance process. So essentially, as I was saying before, this is the culmination of a long-time trend. We're seeing a lot of blurring, but there will be cases where, for instance, if I need high concurrency, there are certain things I'm not going to be able to get efficiently out of a data lake, where I'm building a system that's just brute-forcing very fast file scanning. So I think there will always be some delineations, but I would agree with Dave and with Doug that we are seeing a confluence of requirements: we need the abilities of both a data lake and a data warehouse, and these need to come together.

So I think what we're likely to see is organizations looking for a converged platform that can handle both sides for their center of data gravity. The mesh and fabric vendors, the virtualization vendors, are all on board with the idea of this converged platform, and they're saying, "We'll handle all the edge cases, the stuff that isn't in that center of data gravity, that's distributed off in a cloud or at a remote location." So you can have that single platform for the center of your data, and then bring in virtualization or mesh for reaching out to the distributed data.

Bingo. As they basically said, people are happy when they virtualize data. I think yes, at this point, but to Dave Menninger's point, they have converged: Snowflake has introduced support for unstructured data, so now we are literally splitting hairs here. What Databricks is saying is, "Aha, but it's easier to go from data lake to data warehouse than from data warehouse to data lake." So I think we're getting into semantics, but we've already seen these two converge.

So if it takes something like AWS, which has got what, 15 data stores, are they going to have 15 converged data stores? That's going to be interesting to watch. All right, guys, I'm going to go down the list and do one word each, and if each of you analysts would just add a very brief course correction for me. So, Sanjeev: governance. Maybe it's the dog that wags the tail now; it's coming to the fore with all this ransomware stuff, and we really didn't talk much about security. What's the one word in your prediction that you would leave us with on governance?

It's going to be mainstream.

Mainstream. Okay. Tony Baer: "mesh washing" is what I wrote down; that's what we're going to see in 2022. A little reality check; you want to add to that?

The reality check is, I hope no vendor jumps the shark and calls their offering a data mesh.

Yeah, let's hope that doesn't happen. If they do, we're going to call them out. Carl: graph databases. Thank you for sharing some high-growth metrics; I know it's early days, but "magic" is what I took away from that. The magic database.

Yeah, and I've said this to people too: I kind of look at it as the Swiss Army knife of data, because you can pretty much do anything you want with it. That doesn't mean you should. If you're managing things that are in a fixed schematic relationship, a relational database is probably a better choice, and there are times when a document database is a better choice. A graph database can handle those things, but it may not be the best choice for those use cases. But for a great many, especially the new emerging use cases I listed, it's the best choice. Thank you.
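The multi-hop "relationship walk" that makes graphs the panel's tool of choice for fraud detection can be sketched with a plain adjacency list. This is a stdlib-only illustration of the idea, not any graph database's API; the accounts and shared attributes are invented, and a real property graph engine would express this traversal declaratively (in Cypher, Gremlin, or GQL) rather than as hand-written BFS.

```python
from collections import defaultdict, deque

# Toy fraud-ring signal: accounts linked through shared attributes.
edges = [
    ("account_a", "phone_555-0100"),
    ("account_b", "phone_555-0100"),
    ("account_b", "address_12_main_st"),
    ("account_c", "address_12_main_st"),
]

# Build an undirected adjacency list: the graph model stores the
# relationship itself, rather than reconstructing it with joins.
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def connected(start):
    """Breadth-first traversal: every node reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

# account_a and account_c never interact directly, yet two hops of
# shared attributes place them in the same ring.
ring = sorted(n for n in connected("account_a") if n.startswith("account"))
print(ring)  # ['account_a', 'account_b', 'account_c']
```

In SQL the same question becomes a chain of self-joins whose depth must be fixed in advance, which is the "you have to program it" problem Carl raises: the traversal lives in application code instead of the database.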
And, Dave Menninger, thank you, by the way, for bringing the data in; I like how you supported all your comments with data points. Streaming data becomes the default paradigm, if you will. What would you add?

Yeah, I would say think fast. That's the world we live in; you've got to think fast.

Fast. Love it. And Brad Shimmin. I love it: on the one hand, I was saying, okay, great, I'm afraid I might get disrupted by one of these internet giants who are AI experts, so I'm going to be able to buy instead of build AI. But then again, there are some real issues; there's a potential backlash there. So give us your bumper sticker.

Yeah, I would say, going with Dave: think fast, and also think slow, to reference the book that everyone talks about. I would say, really, that this is all about trust: trust in the idea of automation and of a transparent, invisible AI across the enterprise. But verify. Verify before you do anything.

And then Doug Henschen. Look, I think the trend is your friend here on this prediction, with lakehouse really becoming dominant. I liked the way you set up that notion of the data warehouse folks coming at it from the analytics perspective and the data science worlds coming together. I still feel as though there's a piece in the middle that we're missing, but we'll give you the last word.

Well, I think the idea of consolidation and simplification always prevails. That's why the appeal of a single platform is going to be there. We've already seen that with Hadoop platforms moving toward the cloud, moving toward object storage, and object storage becoming the common storage point, whether it's for a lake or a warehouse. And a second point: I think ESG mandates are going to come in alongside GDPR and the like to up the ante for good governance.

Yeah, thank you for calling that out. Okay, folks, that's all the time we have here. Your experience and depth of understanding on these key issues in data and data management are really on point, and they were on display today. I want to thank you for your contributions; I really appreciate your time.

Enjoyed it. Thank you.

Now, in addition to this video, we're going to make transcripts of the discussion available. We're going to do clips of this as well and put them out on social media. I'll write this up and publish the discussion on wikibon.com and siliconangle.com, and no doubt several of the analysts on the panel will take the opportunity to publish written content, social commentary, or both. I want to thank the power panelists, and thanks for watching this special Cube presentation. This is Dave Vellante. Be well, and we'll see you next time.

Published Date : Jan 8 2022

Predictions 2022: Top Analysts See the Future of Data


 

(bright music) >> In the 2010s, organizations became keenly aware that data would become the key ingredient to driving competitive advantage, differentiation, and growth. But to this day, putting data to work remains a difficult challenge for many, if not most organizations. Now, as the cloud matures, it has become a game changer for data practitioners by making cheap storage and massive processing power readily accessible. We've also seen better tooling in the form of data workflows, streaming, machine intelligence, AI, developer tools, security, observability, automation, new databases and the like. These innovations accelerate data proficiency, but at the same time, they add complexity for practitioners. Data lakes, data hubs, data warehouses, data marts, data fabrics, data meshes, data catalogs, data oceans are forming, they're evolving and exploding onto the scene. So in an effort to bring perspective to the sea of optionality, we've brought together the brightest minds in the data analyst community to discuss how data management is morphing and what practitioners should expect in 2022 and beyond. Hello everyone, my name is Dave Vellante with theCUBE, and I'd like to welcome you to a special Cube presentation, Analyst Predictions 2022: The Future of Data Management. We've gathered six of the best analysts in data and data management who are going to present and discuss their top predictions and trends for 2022 and the first half of this decade. Let me introduce our six power panelists. Sanjeev Mohan is a former Gartner Analyst and Principal at SanjMo. Tony Baer is Principal at dbInsight. Carl Olofson is a well-known Research Vice President with IDC. Dave Menninger is Senior Vice President and Research Director at Ventana Research. Brad Shimmin is Chief Analyst, AI Platforms, Analytics and Data Management at Omdia. And Doug Henschen is Vice President and Principal Analyst at Constellation Research.
Gentlemen, welcome to the program and thanks for coming on theCUBE today. >> Great to be here. >> Thank you. >> All right, here's the format we're going to use. As moderator, I'm going to call on each analyst separately, who then will deliver their prediction or mega trend, and then in the interest of time management and pace, two analysts will have the opportunity to comment. If we have more time, we'll elongate it, but let's get started right away. Sanjeev Mohan, please kick it off. You want to talk about governance, go ahead sir. >> Thank you Dave. I believe that data governance, which we've been talking about for many years, is now not only going to be mainstream, it's going to be table stakes. And all the things that you mentioned, you know, the data ocean, data lake, lakehouses, data fabric, meshes, the common glue is metadata. If we don't understand what data we have and we are not governing it, there is no way we can manage it. So we saw Informatica went public last year after a hiatus of six years. I'm predicting that this year we see some more companies go public. My bet is on Collibra most likely, and maybe Alation, we'll see go public this year. I'm also predicting that the scope of data governance is going to expand beyond just data. It's not just data and reports. We are going to see more transformations, like Spark jobs, Python, even Airflow. We're going to see more streaming data, so from Kafka Schema Registry, for example. We will see AI models become part of this whole governance suite. So the governance suite is going to be very comprehensive, very detailed lineage, impact analysis, and then even expand into data quality. We've already seen that happen with some of the tools, where they are buying these smaller companies and bringing in data quality monitoring and integrating it with metadata management, data catalogs, also data access governance.
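The governance suite Sanjeev describes, with lineage and impact analysis spanning not just tables but pipelines, streams, and AI models, can be sketched in a few lines. This is a hypothetical toy for illustration, not any vendor's product; the `CatalogEntry` and `impact_analysis` names are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One asset in a toy data catalog. As in the prediction above, the
    scope is not just tables: pipelines, streams, and AI models too."""
    name: str
    asset_type: str   # e.g. "table", "spark_job", "stream", "ai_model"
    owner: str
    upstream: list = field(default_factory=list)  # lineage: source asset names

def impact_analysis(catalog, asset_name):
    """Walk lineage downstream: everything affected if `asset_name` changes."""
    impacted, frontier = set(), {asset_name}
    while frontier:
        frontier = {e.name for e in catalog
                    if set(e.upstream) & (frontier | impacted)
                    and e.name not in impacted}
        impacted |= frontier
    return impacted

catalog = [
    CatalogEntry("raw_orders", "table", "it"),
    CatalogEntry("clean_orders", "spark_job", "data-eng", upstream=["raw_orders"]),
    CatalogEntry("churn_model", "ai_model", "data-science", upstream=["clean_orders"]),
]
print(sorted(impact_analysis(catalog, "raw_orders")))  # ['churn_model', 'clean_orders']
```

The point of the sketch is the shape of the metadata, one back plane over heterogeneous asset types, rather than any particular API.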
So what we are going to see is that once the data governance platforms become the key entry point into these modern architectures, I'm predicting that the usage, the number of users of a data catalog, is going to exceed that of a BI tool. That will take time, and we've already seen that trajectory. Right now if you look at BI tools, I would say there are a hundred users of a BI tool to one user of a data catalog. And I see that evening out over a period of time, and at some point data catalogs will really become the main way for us to access data. The data catalog will help us visualize data, but if we want to do more in-depth analysis, it'll be the jumping off point into the BI tool, the data science tool, and that is the journey I see for the data governance products. >> Excellent, thank you. Some comments? Maybe Doug, a lot of things to weigh in on there, maybe you can comment. >> Yeah, Sanjeev, I think you're spot on, a lot of the trends. The one disagreement: I think it's really still far from mainstream. As you say, we've been talking about this for years; it's like God, motherhood, apple pie, everyone agrees it's important, but too few organizations are really practicing good governance, because it's hard and because the incentives have been lacking. I think one thing that deserves mention in this context is ESG mandates and guidelines. These are environmental, social and governance regs and guidelines. We've seen the environmental regs and guidelines imposed in industries, particularly the carbon-intensive industries. We've seen the social mandates, particularly diversity, imposed on suppliers by companies that are leading on this topic. We've seen governance guidelines now being imposed by banks on investors. So these ESGs are presenting new carrots and sticks, and it's going to demand more solid data. It's going to demand more detailed reporting and solid reporting, tighter governance. But we're still far from mainstream adoption.
We have a lot of, you know, best of breed niche players in the space. I think the signs that it's going to be more mainstream are starting with things like Azure Purview, Google Dataplex; the big cloud platform players seem to be upping the ante and starting to address governance. >> Excellent, thank you Doug. Brad, I wonder if you could chime in as well. >> Yeah, I would love to be a believer in data catalogs. But to Doug's point, I think that it's going to take some more pressure for that to happen. I recall metadata being something every enterprise thought they were going to get under control when we were working on service oriented architecture back in the nineties, and that didn't happen quite the way we anticipated. And so to Sanjeev's point, it's because it is really complex and really difficult to do. My hope is that, you know, we won't sort of, how do I put this? Fade out into this nebula of domain catalogs that are specific to individual use cases, like Purview for getting data quality right, or like data governance and cybersecurity. And instead we have some tooling that can actually be adaptive to gather metadata to create something. And I know it's important to you, Sanjeev, and that is this idea of observability. If you can get enough metadata without moving your data around, but understanding the entirety of a system that's running on this data, you can do a lot. So to help with the governance that Doug is talking about. >> So I just want to add that data governance, like many other initiatives, did not succeed. Even AI went through an AI winter, but that's a different topic. But a lot of these things did not succeed because, to your point, the incentives were not there. I remember when Sarbanes-Oxley had come onto the scene, if a bank did not do Sarbanes-Oxley, they were very happy to pay a million dollar fine. That was like, you know, pocket change for them instead of doing the right thing. But I think the stakes are much higher now.
With GDPR, the flood gates opened. Now, you know, California, you know, has CCPA, but even CCPA is being outdated with CPRA, which is much more GDPR-like. So we are very rapidly entering a space where pretty much every major country in the world is coming up with its own compliance regulatory requirements; data residency is becoming really important. And I think we are going to reach a stage where it won't be optional anymore, so whether we like it or not. And I think the reason data catalogs were not successful in the past is because we did not have the right focus on adoption. We were focused on features, and these features were disconnected, very hard for business to adopt. These were built by IT people for IT departments to take a look at technical metadata, not business metadata. Today the tables have turned. CDOs are driving this initiative, regulatory compliances are beating down hard, so I think the time might be right. >> Yeah so guys, we have to move on here. But there's some real meat on the bone here, Sanjeev. I like the fact that you called out Collibra and Alation, so we can look back a year from now and say, okay, he made the call, he stuck with it. And then the ratio of BI tools to data catalogs, that's another sort of measurement that we can take, even though with some skepticism there, that's something that we can watch. And I wonder if someday we'll have more metadata than data. But I want to move to Tony Baer. You want to talk about data mesh, and speaking, you know, coming off of governance, I mean, wow, you know the whole concept of data mesh is decentralized data, and then governance becomes, you know, a nightmare there, but take it away, Tony. >> Well, put it this way. Data mesh, you know, the idea at least as proposed by ThoughtWorks, you know, basically it was at least a couple of years ago, and the press has been almost uniformly uncritical.
A good reason for that is, for all the problems that basically Sanjeev and Doug and Brad were just speaking about, which is that we have all this data out there and we don't know what to do about it. Now, that's not a new problem. That was a problem we had in enterprise data warehouses, it was a problem when we had Hadoop data clusters, it's even more of a problem now that data is out in the cloud, where the data is not only in your data lake, it's not only in S3, it's all over the place. And it's also including streaming, which I know we'll be talking about later. So the data mesh was a response to that, the idea that we need to, you know, ask who are the folks that really know best about governance? It's the domain experts. So basically data mesh was an architectural pattern and a process. My prediction for this year is that data mesh is going to hit cold hard reality. Because if you do a Google search, basically the published work, the articles on data mesh have been largely, you know, pretty uncritical so far, basically lauding it as being a very revolutionary new idea. I don't think it's that revolutionary, because we've talked about ideas like this. Brad, now you and I met years ago when we were talking about SOA and decentralizing all of this, but it was at the application level. Now we're talking about it at the data level. And now we have microservices. So there's this thought of, if we're deconstructing apps in cloud native to microservices, why don't we think of data in the same way? My sense this year, and you know, this has been a very active search if you look at Google search trends, is that now companies, like enterprises, are going to look at this seriously. And as they look at it seriously, it's going to attract its first real hard scrutiny, it's going to attract its first backlash. That's not necessarily a bad thing. It means that it's being taken seriously.
The reason why I think that you'll start to see basically the cold hearted light of day shine on data mesh is that it's still a work in progress. You know, this idea is basically a couple of years old and there are still some pretty major gaps. The biggest gap is in the area of federated governance. Now, federated governance itself is not a new issue. With federated governance, the decision is, we need to figure out how we can basically strike the balance between, let's say, consistent enterprise policy and consistent enterprise governance, and yet the groups that understand the data and know it best, you know, how do we basically sort of balance the two? There's a huge gap there in practice and knowledge. Also, to a lesser extent, there's a technology gap, which is basically in the self-service technologies that will help teams essentially govern data, you know, basically through the full life cycle: from selecting the data, from, you know, building the pipelines, from, you know, determining your access control, looking at quality, looking at basically whether the data is fresh or whether it's trending off course. So my prediction is that it will receive the first harsh scrutiny this year. You are going to see some organizations and enterprises declare premature victory when they build some federated query implementations. You're going to see vendors start to data mesh wash their products. Anybody in the data management space, whether it's basically a pipelining tool, whether it's basically ELT, whether it's a catalog or a federated query tool, they're all going to be, like, you know, basically promoting the fact of how they support this. Hopefully nobody's going to call themselves a data mesh tool, because data mesh is not a technology. We're going to see one other thing come out of this.
And this harks back to the metadata that Sanjeev was talking about, and the catalog that he was just talking about. Which is that there's going to be a new, renewed focus on metadata. And I think that's going to spur interest in data fabrics. Now, data fabrics are pretty vaguely defined, but if we just take the most elemental definition, which is a common metadata back plane, I think that if anybody is going to get serious about data mesh, they need to look at the data fabric, because we all, at the end of the day, need to, you know, read from the same sheet of music. >> So thank you Tony. Dave Menninger, I mean, one of the things that people like about data mesh is that it pretty crisply articulates some of the flaws in today's organizational approaches to data. What are your thoughts on this? >> Well, I think we have to start by defining data mesh, right? The term is already getting corrupted, right? Tony said it's going to see the cold hard light of day. And there's a problem right now, that there are a number of overlapping terms that are similar but not identical. So we've got data virtualization, data fabric, excuse me for a second. (clears throat) Sorry about that. Data virtualization, data fabric, data federation, right? So I think that it's not really clear what each vendor means by these terms. I see data mesh and data fabric becoming quite popular. I've interpreted data mesh as referring primarily to the governance aspects, as originally intended and specified. But that's not the way I see vendors using it. I see vendors using it much more to mean data fabric and data virtualization. So I'm going to comment on the group of those things. I think the group of those things is going to happen. They're going to happen, they're going to become more robust.
Our research suggests that a quarter of organizations are already using virtualized access to their data lakes, and another half, so a total of three quarters, will eventually be accessing their data lakes using some sort of virtualized access. Again, whether you define it as mesh or fabric or virtualization isn't really the point here. But this notion that there are different elements of data, metadata and governance within an organization that all need to be managed collectively. The interesting thing is when you look at the satisfaction rates of those organizations using virtualization versus those that are not, it's almost double: 68% of organizations, I'm sorry, 79% of organizations that were using virtualized access express satisfaction with their access to the data lake. Only 39% express satisfaction if they weren't using virtualized access. >> Oh thank you Dave. Sanjeev, we just got about a couple of minutes on this topic, but I know you're speaking, or maybe you've already spoken, on a panel with (indistinct), who sort of invented the concept. Governance obviously is a big sticking point, but what are your thoughts on this? You're on mute. (panelist chuckling) >> So my message to (indistinct) and to the community is, as opposed to what they said, let's not define it. We spent a whole year defining it. There are four principles: domain, product, data infrastructure, and governance. Let's take it to the next level. I get a lot of questions on what is the difference between data fabric and data mesh, and I'm like, I can't compare the two, because data mesh is a business concept, data fabric is a data integration pattern. How do you compare the two? You have to bring data mesh a level down. So to Tony's point, I'm on a warpath in 2022 to take it down to, what does a data product look like? How do we handle shared data across domains and governance? And I think we are going to see more of that in 2022, that is, "operationalization" of data mesh.
>> I think we could have a whole hour on this topic, couldn't we? Maybe we should do that. But let's move on. Let's go to Carl. So Carl, you're a database guy, you've been around that block for a while now, you want to talk about graph databases, bring it on. >> Oh yeah. Okay thanks. So I regard graph database as basically the next truly revolutionary database management technology. I'm looking forward for the graph database market, which of course we haven't defined yet, so obviously I have a little wiggle room in what I'm about to say. But this market will grow by about 600% over the next 10 years. Now, 10 years is a long time. But over the next five years, we expect to see gradual growth as people start to learn how to use it. The problem is not that it's not useful, it's that people don't know how to use it. So let me explain before I go any further what a graph database is, because some of the folks on the call may not know what it is. A graph database organizes data according to a mathematical structure called a graph. The graph has elements called nodes and edges. So a data element drops into a node, the nodes are connected by edges, the edges connect one node to another node. Combinations of edges create structures that you can analyze to determine how things are related. In some cases, the nodes and edges can have properties attached to them, which add additional informative material that makes it richer; that's called a property graph. There are two principal use cases for graph databases. There's semantic property graphs, which are used to break down human language texts into semantic structures. Then you can search it, organize it and answer complicated questions. A lot of AI is aimed at semantic graphs. Another kind is the property graph that I just mentioned, which has a dazzling number of use cases. I want to just point out, as I talk about this, people are probably wondering, well, we have relational databases, isn't that good enough?
So a relational database defines... It supports what I call definitional relationships. That means you define the relationships in a fixed structure. The database drops into that structure; there's a value, a foreign key value, that relates one table to another, and that value is fixed. You don't change it. If you change it, the database becomes unstable; it's not clear what you're looking at. In a graph database, the system is designed to handle change, so that it can reflect the true state of the things that it's being used to track. So let me just give you some examples of use cases for this. They include entity resolution, data lineage, social media analysis, Customer 360, fraud prevention. There's cybersecurity; supply chain is a big one, actually. There is explainable AI, and this is going to become important too, because a lot of people are adopting AI. But they want a system, after the fact, to say, how did the AI system come to that conclusion? How did it make that recommendation? Right now we don't have really good ways of tracking that. Machine learning in general, social network, I already mentioned that. And then we've got, oh gosh, we've got data governance, data compliance, risk management. We've got recommendation, we've got personalization, anti-money laundering, that's another big one, identity and access management, network and IT operations is already becoming a key one, where you actually have mapped out your operation, you know, whatever it is, your data center, and you can track what's going on as things happen there. Root cause analysis, fraud detection is a huge one. A number of major credit card companies use graph databases for fraud detection, risk analysis, track and trace, churn analysis, next best action, what-if analysis, impact analysis, entity resolution. And I would add one other thing, or just a few other things, to this list: metadata management. So Sanjeev, here you go, this is your engine.
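Carl's description, data elements in nodes, edges connecting them, properties on both, and relationships that can change without destabilizing a fixed schema, can be made concrete with a toy property graph. This is a hypothetical sketch with invented account data; real graph databases expose this through query languages rather than Python dicts:

```python
# Toy property graph: nodes and edges both carry properties, and edges can
# be added freely rather than being fixed foreign keys in a schema.
nodes = {
    "acct1": {"kind": "account", "flagged": True},
    "acct2": {"kind": "account", "flagged": False},
    "acct3": {"kind": "account", "flagged": False},
    "dev9":  {"kind": "device"},
}
edges = [
    ("acct1", "USED_DEVICE", "dev9",  {"last_seen": "2022-01-03"}),
    ("acct2", "USED_DEVICE", "dev9",  {"last_seen": "2022-01-05"}),
    ("acct2", "PAID",        "acct3", {"amount": 950}),
]

def connected(start):
    """Traverse edges in either direction from `start` -- the kind of
    fraud-ring question Carl lists as a graph use case."""
    seen, stack = {start}, [start]
    while stack:
        node = stack.pop()
        for src, _label, dst, _props in edges:
            for nxt in (dst,) if src == node else (src,) if dst == node else ():
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
    return seen - {start}

# Everything reachable from the flagged account, however indirectly:
print(sorted(connected("acct1")))  # ['acct2', 'acct3', 'dev9']
```

In a relational schema the same question needs recursive, hand-programmed joins; here the traversal is the natural operation.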
Because I was in metadata management for quite a while in my past life, and one of the things I found was that none of the data management technologies that were available to us could efficiently handle metadata, because of the kinds of structures that result from it, but graphs can, okay? Graphs can do things like say, this term in this context means this, but in that context it means that, okay? Things like that. And in fact, logistics management, supply chain. And also because it handles recursive relationships. By recursive relationships I mean objects that own other objects that are of the same type. You can do things like bill of materials, you know, so like a parts explosion. Or you can do an HR analysis, who reports to whom, how many levels up the chain and that kind of thing. You can do that with relational databases, but it takes a lot of programming. In fact, you can do almost any of these things with relational databases, but the problem is, you have to program it. It's not supported in the database. And whenever you have to program something, that means you can't trace it, you can't define it. You can't publish it in terms of its functionality, and it's really, really hard to maintain over time. >> Carl, thank you. I wonder if we could bring Brad in, I mean, Brad, I'm sitting here wondering, okay, is this incremental to the market? Is it disruptive and replacement? What are your thoughts on this phase? >> It's already disrupted the market. I mean, like Carl said, go to any bank and ask them, are you using graph databases to get fraud detection under control? And they'll say, absolutely, that's the only way to solve this problem. And it is, frankly. And it's the only way to solve a lot of the problems that Carl mentioned. And that is, I think, its Achilles' heel in some ways. Because, you know, it's like finding the best way to cross the seven bridges of Königsberg.
You know, it's always going to kind of be tied to those use cases, because it's really special and it's really unique, and because it's special and it's unique, it still unfortunately kind of stands apart from the rest of the community that's building, let's say, AI outcomes, as a great example here. Graph databases and AI, as Carl mentioned, are like chocolate and peanut butter. But technologically, they don't know how to talk to one another; they're completely different. And you know, you can't just stand up SQL and query them. You've got to learn, what is it, Carl? Cypher? Yeah, thank you, to actually get to the data in there. And if you're going to scale that data, that graph database, especially a property graph, if you're going to do something really complex, like try to understand, you know, all of the metadata in your organization, you might just end up with, you know, a graph database winter, like we had the AI winter, simply because you run out of performance to make the thing happen. So, I think it's already disrupted, but we need to treat it like a first-class citizen in the data analytics and AI community. We need to bring it into the fold. We need to equip it with the tools it needs to do the magic it does, and to do it not just for specialized use cases, but for everything. 'Cause I'm with Carl. I think it's absolutely revolutionary. >> Brad identified the principal Achilles' heel of the technology, which is scaling. When these things get large and complex enough that they spill over what a single server can handle, you start to have difficulties, because the relationships span things that have to be resolved over a network, and then you get network latency, and that slows the system down. So that's still a problem to be solved. >> Sanjeev, any quick thoughts on this? I mean, I think metadata on the word cloud is going to be the largest font, but what are your thoughts here?
>> I want to (indistinct) So people don't associate me with only metadata, so I want to talk about something slightly different. db-engines.com has done an amazing job. I think almost everyone knows that they chronicle all the major databases that are in use today. In January of 2022, there are 381 databases on their ranked list of databases. The largest category is RDBMS. The second largest category is actually divided into two: property graphs and RDF graphs. These two together make up the second largest number of databases. So talking about Achilles' heel, this is the problem. The problem is that there are so many graph databases to choose from. They come in different shapes and forms. To Brad's point, there are so many query languages. In RDBMS, it's SQL, and I know the story, but here we've got Cypher, we've got Gremlin, we've got GQL, and then there are proprietary languages. So I think there's a lot of disparity in this space. >> Well, excellent. All excellent points, Sanjeev, if I must say. And that is a problem, that the languages need to be sorted and standardized. People need to have a roadmap as to what they can do with it. Because as you say, you can do so many things. And so many of those things are unrelated that you sort of say, well, what do we use this for? And I'm reminded of a saying I learned a bunch of years ago. Somebody said that the digital computer is the only tool man has ever devised that has no particular purpose. (panelists chuckle) >> All right guys, we've got to move on to Dave Menninger. We've heard about streaming. Your prediction is in that realm, so please take it away. >> Sure. So I like to say that historical databases are going to become a thing of the past. By that I don't mean that they're going to go away; that's not my point. I mean, we need historical databases, but streaming data is going to become the default way in which we operate with data.
So in the next, say, three to five years, I would expect that data platforms, and we're using the term data platforms to represent the evolution of databases and data lakes, that the data platforms will incorporate these streaming capabilities. We're going to process data as it streams into an organization, and then it's going to roll off into a historical database. So historical databases don't go away, but they become a thing of the past. They store the data that occurred previously. And as data is occurring, we're going to be processing it, we're going to be analyzing it, we're going to be acting on it. I mean, we only ever ended up with historical databases because we were limited by the technology that was available to us. Data doesn't occur in batches. But we processed it in batches because that was the best we could do. And it wasn't bad, and we've continued to improve and we've improved and we've improved. But streaming data today is still the exception. It's not the rule, right? There are projects within organizations that deal with streaming data. But it's not the default way in which we deal with data yet. And so that's my prediction, is that this is going to change. We're going to have streaming data be the default way in which we deal with data, and how you label it and what you call it, you know, maybe these databases and data platforms just evolve to be able to handle it. But we're going to deal with data in a different way. And our research shows that already: about half of the participants in our analytics and data benchmark research are using streaming data. You know, another third are planning to use streaming technologies. So that gets us to about eight out of 10 organizations that need to use this technology. And that doesn't mean they have to use it throughout the whole organization, but it's pretty widespread in its use today, and it has continued to grow.
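The pattern being predicted, acting on each event as it arrives and then rolling it off into a historical store rather than collecting a batch first, can be sketched in a few lines. A hypothetical toy, with an in-memory list standing in for the historical database and a fixed window for the hot, streaming side:

```python
import statistics
from collections import deque

HISTORY = []              # stand-in for the historical database
WINDOW = deque(maxlen=5)  # recent readings kept hot for streaming analysis

def on_event(reading):
    """Process one reading as it arrives: analyze, act, then roll off."""
    WINDOW.append(reading)
    if reading > 2 * statistics.mean(WINDOW):   # act while the data is live
        print(f"alert: spike {reading}")
    HISTORY.append(reading)  # roll off into history for later batch queries

for value in [10, 11, 9, 12, 60, 10]:  # events arriving one at a time
    on_event(value)
# fires one alert, on the spike of 60; HISTORY ends up with all six readings
```

Nothing here waits for a batch boundary; the historical store is simply where events land after they have already been acted on.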
If you think about the consumerization of IT, we've all been conditioned to expect immediate access to information, immediate responsiveness. You know, we want to know if an item is on the shelf at our local retail store, and we can go in and pick it up right now. You know, that's the world we live in, and that's spilling over into the enterprise IT world. We have to provide those same types of capabilities. So that's my prediction: historical databases become a thing of the past, streaming data becomes the default way in which we operate with data. >> All right, thank you David. Well, so what say you, Carl, the guy who has followed historical databases for a long time? >> Well, one thing actually, every database is historical, because as soon as you put data in it, it's now history. It no longer reflects the present state of things. But even if that history is only a millisecond old, it's still history. But I would say, I mean, I know you're trying to be a little bit provocative in saying this, Dave, 'cause you know as well as I do that people still need to do their taxes, they still need to do accounting, they still need to run general ledger programs and things like that. That all involves historical data. That's not going to go away unless you want to go to jail. So you're going to have to deal with that. But as far as the leading edge functionality, I'm totally with you on that. And I'm just, you know, I'm just kind of wondering if this requires a change in the way that we perceive applications in order to truly be manifested, a rethinking of the way applications work. Saying that an application should respond instantly, as soon as the state of things changes. What do you say about that? >> I think that's true. I think we do have to think about things differently. It's not the way we designed systems in the past. We're seeing more and more systems designed that way. But again, it's not the default.
And I agree 100% with you that we do need historical databases, you know, that's clear. And even some of those historical databases will be used in conjunction with the streaming data, right? >> Absolutely. I mean, you know, let's take the data warehouse example, where you're using the data warehouse as the context and the streaming data as the present, and you're saying, here's the sequence of things that's happening right now. Have we seen that sequence before? And where? What does that pattern look like in past situations? And can we learn from that? >> So Tony Baer, I wonder if you could comment? I mean, when you think about, you know, real-time inferencing at the edge, for instance, which is something that a lot of people talk about, a lot of what we're discussing here in this segment looks like it's got great potential. What are your thoughts? >> Yeah, I mean, I think you nailed it, you hit it right on the head there. I'm going to split this one down the middle: I don't see that streaming becomes the default. What I see is that streaming and transaction databases and analytic data stores, you know, data warehouses, data lakes, whatever, are converging. And what allows us technically to converge is cloud native architecture, where you can basically distribute things. So you can have a node here that's doing the real-time processing, and this is where it leads in, maybe doing some of that real-time predictive analytics to take a look at, well look, we're looking at this customer journey, what's happening with what the customer is doing right now, and this is correlated with what other customers are doing. So the thing is that in the cloud you can basically partition this, and because of the speed of the infrastructure, you can bring these together and orchestrate them in a sort of loosely coupled manner. 
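The warehouse-as-context idea raised here, "here's the sequence happening right now, have we seen it before?", amounts to matching a live event sequence against historical ones. A minimal sketch, with session data made up for the example:

```python
def find_matches(live_sequence, historical_sequences):
    """Return historical sequences that contain the live sequence as a
    contiguous run -- i.e. 'have we seen this pattern before?'"""
    n = len(live_sequence)
    matches = []
    for hist in historical_sequences:
        for i in range(len(hist) - n + 1):
            if hist[i:i + n] == live_sequence:
                matches.append(hist)
                break
    return matches

# Streaming side: the sequence of customer actions happening right now.
live = ["view_item", "add_to_cart", "abandon"]

# Warehouse side: sequences observed in past sessions (the context).
past_sessions = [
    ["view_item", "add_to_cart", "abandon", "return_next_day"],
    ["view_item", "purchase"],
    ["search", "view_item", "add_to_cart", "abandon"],
]

seen_before = find_matches(live, past_sessions)
```

In a real system the historical side would be a warehouse query and the live side a stream, but the shape of the question is the same: correlate the present against stored context and learn from the matches.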
The other part is that the use cases are demanding it, and this goes back to what Dave is saying. You know, when you look at Customer 360, when you look at, let's say, smart utility products, when you look at any type of operational problem, it has a real-time component and it has a historical component, and a predictive one. So, you know, my sense here is that technically we can bring this together through the cloud. And I think the use case is that we can apply some real-time predictive analytics on these streams and feed this into the transactions, so that when we make a decision in terms of what to do as a result of a transaction, we have this real-time input. >> Sanjeev, did you have a comment? >> Yeah, I was just going to say that, to Dave's point, you know, we have to think of streaming very differently, because with historical databases we used to bring the data in and store it and then run rules on top, aggregations and all. But in the case of streaming, the mindset changes, because the rules, the inference, all of that is fixed, but the data is constantly changing. So it's a completely reversed way of thinking and of building applications on top of that. >> So Dave Menninger, there seems to be some disagreement about the default. What kind of timeframe are you thinking about? Is it the end of the decade when it becomes the default? What would you pin it at? >> I think around, you know, five to 10 years, I think this becomes the reality. >> I think it's... >> It'll be more and more common between now and then, but then it becomes the default. And I also want, Sanjeev, at some point, maybe in one of our subsequent conversations, we need to talk about governing streaming data, 'cause that's a whole other set of challenges. 
>> We've also talked about it rather in two dimensions, historical and streaming, and there's lots of low latency, micro-batch, sub-second processing that's not quite streaming, but in many cases it's fast enough, and we're seeing a lot of adoption of near real-time, not quite real-time, as good enough for many applications. (indistinct cross talk from panelists) >> Because nobody's really taking the hardware dimension (mumbles). >> That'll just happen, Carl. (panelists laughing) >> So near real-time, but maybe before you lose the customer, however we define that, right? Okay, let's move on to Brad. Brad, you want to talk about automation, AI, the pipeline. People feel like, hey, we can just automate everything. What's your prediction? >> Yeah, I'm an AI aficionado, so apologies in advance for that. But, you know, I think that we've been seeing automation play within AI for some time now. And it's helped us do a lot of things, especially for practitioners that are building AI outcomes in the enterprise. It's helped them to fill skills gaps, it's helped them to speed development, and it's helped them to actually make AI better, 'cause it, you know, in some ways provides some swim lanes, and for example, with technologies like AutoML, can auto-document and create that sort of transparency that we talked about a little bit earlier. But I think there's an interesting kind of convergence happening with this idea of automation. And that is that the automation that started happening for practitioners is trying to move outside the traditional bounds of things like, I'm just trying to get my features, I'm just trying to pick the right algorithm, I'm just trying to build the right model. It's expanding across the full life cycle of building an AI outcome, to start at the very beginning with data and to then continue on to the end, which is this continuous delivery and continuous automation of that outcome to make sure it's right and it hasn't drifted and stuff like that. 
And because of that, because it's become kind of powerful, we're starting to actually see this weird thing happen where the practitioners are starting to converge with the users. And that is to say that, okay, if I'm in Tableau right now, I can stand up Salesforce Einstein Discovery, and it will automatically create a nice predictive algorithm for me given the data that I pull in. But what's starting to happen, and we're seeing this from the companies that create business software, so Salesforce, Oracle, SAP, and others, is that they're starting to actually use these same ideas and a lot of deep learning (chuckles) to basically stand up these out-of-the-box, flip-a-switch solutions, and you've got an AI outcome at the ready for business users. And I am very much, you know, I think that's the way it's going to go, and what it means is that AI is slowly disappearing. And I don't think that's a bad thing. I think if anything, what we're going to see in 2022 and maybe into 2023 is this sort of rush to put this idea of disappearing AI into practice and have as many of these solutions in the enterprise as possible. You can see, for example, SAP is going to roll out this quarter this thing called adaptive recommendation services, which basically is a cold-start AI outcome that can work across a whole bunch of different vertical markets and use cases. It's just a recommendation engine for whatever you need to do in the line of business. So basically, you're an SAP user, you turn on your software one day, you're a sales professional let's say, and suddenly you have a recommendation for customer churn. Boom! That's great. Well, I don't know, I think that's terrifying. 
In some ways I think it is the future, that AI is going to disappear like that, but I'm absolutely terrified of it, because I think what it really does is call attention to a lot of the issues that we already see around AI, specific to this idea of what we like to call at Omdia responsible AI. Which is, you know, how do you build an AI outcome that is free of bias, that is inclusive, that is fair, that is safe, that is secure, that is auditable, et cetera, et cetera. It takes a lot of work to do. And so if you imagine a customer that's just a Salesforce customer, let's say, and they're turning on Einstein Discovery within their sales software, you need some guidance to make sure that when you flip that switch, the outcome you're going to get is correct. And that's going to take some work. And so I think we're going to see this move to roll this out, and suddenly there's going to be a lot of problems, a lot of pushback that we're going to see. And some of that's going to come from GDPR and the others that Sanjeev was mentioning earlier. A lot of it is going to come from internal CSR requirements within companies that are saying, "Hey, whoa, hold up, we can't do this all at once. "Let's take the slow route, "let's make AI automated in a smart way." And that's going to take time. >> Yeah, so a couple of predictions there that I heard. AI simply disappears, it becomes invisible, maybe if I can restate it that way. And then if I understand it correctly, Brad, you're saying there's a backlash in the near term. You'd be able to say, oh, slow down, let's automate what we can. Those attributes that you talked about are nontrivial to achieve. Is that why you're a bit of a skeptic? >> Yeah, I think that we don't have any sort of standards that companies can look to and understand. 
And certainly, within these companies, especially those that haven't already stood up an internal data science team, they don't have the knowledge to understand, when they flip that switch for an automated AI outcome, that it's going to do what they think it's going to do. And so we need some sort of standard methodology and practice, best practices, that every company that's going to consume this invisible AI can make use of. And one of the things, you know, that Google kicked off a few years back, that's picking up some momentum and that the companies I just mentioned are starting to use, is this idea of model cards, where at least you have some transparency about what these things are doing. You know, so for the SAP example, if we know, for example, that it's a convolutional neural network with a long short-term memory model that it's using, and we know that it only works on Roman English, then me as a consumer can say, "Oh, well I know that I need to do this internationally, "so I should not just turn this on today." >> Thank you. Carl, could you add anything, any context here? >> Yeah, we've talked about some of the things Brad mentioned here at IDC, in our Future of Intelligence group, regarding in particular the moral and legal implications of having a fully automated, you know, AI-driven system. Because we already know, and we've seen, that AI systems are biased by the data that they get, right? So if they get data that pushes them in a certain direction, they go in that direction. I think there was a story last week about an HR system that was recommending promotions for White people over Black people, because in the past, you know, White people were promoted more and rated more productive than Black people. But it had no context as to why, which is that Black people were being historically discriminated against. The system doesn't know that. So, you know, you have to be aware of that. 
And I think that, at the very least, there should be controls when a decision has either a moral or legal implication. When you really need a human judgment, the system could lay out the options for you, but a person actually needs to authorize that action. And I also think that we will always have to be vigilant regarding the kind of data we use to train our systems, to make sure that it doesn't introduce unintended biases. To some extent, it always will. So we'll always be chasing after them. But that's (indistinct). >> Absolutely, Carl, yeah. I think that what you have to bear in mind as a consumer of AI is that it is a reflection of us, and we are a very flawed species. And so if you look at all of the really fantastic, magical-looking super models we see, like GPT-3 and the 4 that's coming out, they're xenophobic and hateful, because the data they're built upon, the algorithms, and the people that build them are us. So AI is a reflection of us. We need to keep that in mind. >> Yeah, the AI is biased 'cause humans are biased. All right, great. All right, let's move on. Doug, you mentioned, you know, a lot of people said that the data lake, that term, is not going to live on, but it seems to be. You want to talk about lake house? Bring it on. >> Yes, I do. My prediction is that lake house, this idea of a combined data warehouse and data lake platform, is going to emerge as the dominant data management offering. I say offering; that doesn't mean it's going to be the dominant thing that organizations have out there, but it's going to be the predominant vendor offering in 2022. Now, heading into 2021 we already had Cloudera, Databricks, Microsoft, and Snowflake as proponents; in 2021, SAP, Oracle, and several of these fabric/virtualization/mesh vendors joined the bandwagon. The promise is that you have one platform that manages your structured, unstructured, and semi-structured information. 
And it addresses both the BI and analytics needs and the data science needs. The real promise there is simplicity and lower cost. But I think end users have to answer a few questions. The first is: does your organization really have a center of data gravity, or is the data highly distributed? Multiple data warehouses, multiple data lakes, on premises, cloud. If it's very distributed and you'd have difficulty consolidating, and that's not really a goal for you, then maybe that single platform is unrealistic and not likely to add value for you. You know, also, the fabric and virtualization vendors, the mesh idea, that's where, if you have this highly distributed situation, that might be a better path forward. The second question: if you are looking at one of these lake house offerings, you are looking at consolidating, simplifying, bringing it together on a single platform, and you have to make sure that it meets both the warehouse need and the data lake need. So you have vendors like Databricks and Microsoft with Azure Synapse, new, really, to the data warehouse space, and they're having to prove that the data warehouse capabilities on their platforms can meet the scaling requirements, can meet the user and query concurrency requirements, meet those tight SLAs. And then on the other hand, you have Oracle, SAP, Snowflake, the data warehouse folks, coming into the data science world, and they have to prove that they can manage the unstructured information and meet the needs of the data scientists. I'm seeing a lot of the lake house offerings from the warehouse crowd managing that unstructured information in columns and rows. And some of these vendors, Snowflake in particular, are really relying on partners for the data science needs. So you really have to look at a lake house offering and make sure that it meets both the warehouse and the data lake requirements. >> Thank you, Doug. 
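Conceptually, the lake house Henschen describes is one platform that answers both the BI-style aggregate query and the data-science request for raw, semi-structured payloads. A toy sketch using SQLite as a stand-in for that single platform (the schema and data are invented; real offerings are systems like Databricks or Azure Synapse):

```python
import json
import sqlite3

# One store holding structured columns alongside semi-structured payloads.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL, payload TEXT)")
rows = [
    ("east", 100.0, json.dumps({"clicks": [1, 2, 3]})),
    ("east", 50.0, json.dumps({"clicks": [4]})),
    ("west", 75.0, json.dumps({"clicks": []})),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# Warehouse need: an aggregated BI query over the structured columns.
bi = dict(conn.execute(
    "SELECT region, SUM(amount) FROM events GROUP BY region"))

# Lake need: raw semi-structured payloads handed off to data science.
raw = [json.loads(p) for (p,) in conn.execute(
    "SELECT payload FROM events ORDER BY rowid")]
```

The warehouse test is whether the aggregate side meets concurrency and SLA requirements; the lake test is whether the raw payloads are genuinely usable by data scientists. That is the dual proof the vendors above have to make.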
Well, Tony, if those two worlds are going to come together, as Doug was saying, the analytics and the data science world, does there need to be some kind of semantic layer in between? I don't know. Where are you on this topic? >> (chuckles) Oh, didn't we talk about data fabrics before? A common metadata layer (chuckles). Actually, I'm almost tempted to say let's declare victory and go home, in that this has actually been going on for a while. I actually agree with, you know, much of what Doug is saying there. I remember as far back as, I think it was 2014, I was doing a study, I was still at Ovum, now Omdia, looking at all these specialized databases that were coming up and seeing that, you know, there's overlap at the edges. But yet, there was still going to be a reason at the time that you would have, let's say, a document database for JSON, a relational database for transactions and for data warehousing, and something at that time that resembled Hadoop for what we'd consider your data lake. Fast forward, and the thing is, what I was seeing at the time is that they were sort of blending at the edges. That was, say, about five to six years ago. And the lake house is essentially the current manifestation of that idea. There is a dichotomy in terms of, you know, it's the old argument: do we centralize this all, you know, in a single place, or do we virtualize? And I think it's always going to be a union, and there's never going to be a single silver bullet. I do see that there are also going to be questions, and these are points that Doug raised. You know, what do you need for your performance characteristics? Do you need, for instance, high concurrency? 
Do you need the ability to do some very sophisticated joins, or is your requirement more to be able to distribute the processing, you know, as far as possible, to essentially take a kind of brute force approach? All these approaches are valid based on the use case. I just see that essentially the lake house is the culmination of this. It's a relatively new term, introduced by Databricks a couple of years ago, but it's the culmination of what's been a long-time trend. And what we see in the cloud is that data warehouses are starting to treat this as a checkbox item, saying, "Hey, we can basically source data in cloud storage, in S3, "Azure Blob Storage, you know, whatever, "as long as it's in certain formats, "like, you know, Parquet or CSV or something like that." I see that as becoming kind of a checkbox item. So to that extent, I think that the lake house, depending on how you define it, is already reality. And in some cases, maybe it's new terminology, but not a whole heck of a lot new under the sun. >> Yeah. And Dave Menninger, I mean, a lot of this, thank you Tony, but a lot of this is going to come down to, you know, vendor marketing, right? Some people just kind of co-opt the term. We talked about, you know, data mesh washing. What are your thoughts on this? (laughing) >> Yeah, so I used the term data platform earlier, and part of the reason I use that term is that it's more vendor neutral. We've tried to sort of stay out of the vendor terminology patenting world, right? Whether the term lake house is what sticks or not, the concept is certainly going to stick. And we have some data to back it up. About a quarter of organizations that are using data lakes today already incorporate data warehouse functionality into them. 
So they consider their data lake house and data warehouse one and the same. About a quarter of organizations, a little less, but about a quarter, feed the data lake from the data warehouse, and about a quarter of organizations feed the data warehouse from the data lake. So it's pretty obvious that three quarters of organizations need to bring this stuff together, right? The need is there, the need is apparent. The technology is going to continue to converge. I like to talk about it, you know, you've got data lakes over here at one end, and I'm not going to talk about why people thought data lakes were a bad idea, because they thought you just throw stuff in a server and ignore it, right? That's not what a data lake is. So you've got data lake people over here and you've got database people, data warehouse people, over here. Database vendors are adding data lake capabilities, and data lake vendors are adding data warehouse capabilities. So it's obvious that they're going to meet in the middle. I mean, I think it's like Tony says, I think we should declare victory and go home. >> So just a follow-up on that: are you saying the specialized lake and the specialized warehouse go away? I mean, Tony, data mesh practitioners, or advocates, would say, well, they could all live; it's just a node on the mesh. But based on what Dave just said, are we going to see those all morph together? >> Well, number one, as I was saying before, there's always going to be this sort of, you know, centrifugal force, or this tug of war, between do we centralize the data and do we virtualize. And the fact is, I don't think that there's ever going to be any single answer. I think in terms of data mesh, data mesh has nothing to do with how you physically implement the data. You could have a data mesh basically on a data warehouse. 
It's just that, you know, the difference being that even if you use the same physical data store, everybody's logically, you know, basically governing it differently. Data mesh, in essence, is not a technology; it's processes, it's governance process. So essentially, you know, as I was saying before, this is basically the culmination of a long-time trend. We're seeing a lot of blurring, but there are going to be cases where, for instance, if I need, let's say, upserts, or I need high concurrency or something like that, there are certain things that I'm not going to be able to get efficiently out of a data lake, where I'm doing a system that's really just brute forcing very fast file scanning and that type of thing. So I think there always will be some delineations, but I would agree with Dave and with Doug that we are seeing a confluence of requirements: we need to essentially have the abilities of a data lake and a data warehouse, and these need to come together, so I think. >> I think what we're likely to see is organizations look for a converged platform that can handle both sides for their center of data gravity. The mesh and the fabric virtualization vendors, they're all on board with the idea of this converged platform, and they're saying, "Hey, we'll handle all the edge cases, "the stuff that isn't in that center of data gravity "but that is distributed off in a cloud "or at a remote location." So you can have that single platform for the center of your data and then bring in virtualization, mesh, what have you, for reaching out to the distributed data. >> As Dave basically said, people are happy when they virtualize data. >> I think we have at this point, but to Dave Menninger's point, they are converging. Snowflake has introduced support for unstructured data, so obviously the lines are blurring here. 
Now what Databricks is saying is, "Aha, but it's easier to go from a data lake to a data warehouse "than it is from a database to a data lake." So I think we're getting into semantics, but we're already seeing these two converge. >> So take somebody like AWS, which has got what, 15 data stores? Are they going to be 15 converged data stores? This is going to be interesting to watch. All right, guys, I'm going to go down the list and do like one word each, and you guys, each of the analysts, if you would just add a very brief sort of course correction for me. So Sanjeev, I mean, governance is going to be... maybe it's the dog that wags the tail now. I mean, it's coming to the fore, all this ransomware stuff, which, you really didn't talk much about security, but what's the one word in your prediction that you would leave us with on governance? >> It's going to be mainstream. >> Mainstream. Okay. Tony Baer, mesh washing is what I wrote down. That's what we're going to see in 2022, a little reality check. You want to add to that? >> Reality check, 'cause I hope that no vendor jumps the shark and calls their offering a data mesh product. >> Yeah, let's hope that doesn't happen. If they do, we're going to call them out. Carl, I mean, graph databases, thank you for sharing some high growth metrics. I know it's early days, but magic is what I took away from that, so magic database. >> Yeah, and I've said this to people too. I kind of look at it as a Swiss Army knife of data, because you can pretty much do anything you want with it. That doesn't mean you should. I mean, there's definitely the case that if you're managing things that are in a fixed schematic relationship, probably a relational database is a better choice. There are times when a document database is a better choice. A graph database can handle those things, but it may not be the best choice for that use case. 
But for a great many use cases, especially the new emerging use cases I listed, it's the best choice. >> Thank you. And Dave Menninger, thank you, by the way, for bringing the data in; I like how you supported all your comments with some data points. But streaming data becomes the sort of default paradigm, if you will. What would you add? >> Yeah, I would say think fast, right? That's the world we live in, you've got to think fast. >> Think fast, love it. And Brad Shimmin, love it. I mean, on the one hand I was saying, okay, great, I'm afraid I might get disrupted by one of these internet giants who are AI experts; I'm going to be able to buy instead of build AI. But then again, you know, I've got some real issues; there's a potential backlash there. So give us your bumper sticker. >> I would say, going with Dave, think fast and also think slow, to reference the book that everyone talks about. I would say really that this is all about trust: trust in the idea of automation and a transparent and visible AI across the enterprise. And verify; verify before you do anything. >> And then Doug Henschen, I mean, I think the trend is your friend here on this prediction, with lake house really becoming dominant. I liked the way you set up that notion of, you know, the data warehouse folks coming at it from the analytics perspective and the data science worlds coming together. I still feel as though there's this piece in the middle that we're missing, but your final thoughts will give you the (indistinct). >> I think the idea of consolidation and simplification always prevails. That's why the appeal of a single platform is going to be there. We've already seen that with, you know, Hadoop platforms, moving toward cloud, moving toward object storage, and object storage becoming really the common storage point, whether it's a lake or a warehouse. 
And the second point: I think ESG mandates are going to come in alongside GDPR and things like that to up the ante for good governance. >> Yeah, thank you for calling that out. Okay folks, hey, that's all the time that we have here. Your experience and depth of understanding on these key issues in data and data management were really on point, and they were on display today. I want to thank you for your contributions. Really appreciate your time. >> Enjoyed it. >> Thank you. >> Thanks for having me. >> In addition to this video, we're going to be making available transcripts of the discussion. We're going to do clips of this as well, and we're going to put them out on social media. I'll write this up and publish the discussion on wikibon.com and siliconangle.com. No doubt several of the analysts on the panel will take the opportunity to publish written content, social commentary, or both. I want to thank the power panelists, and thanks for watching this special CUBE presentation. This is Dave Vellante, be well, and we'll see you next time. (bright music)

Published Date : Jan 7 2022



Ajay Patel, VMware | VMworld 2021


 

(upbeat music) >> Welcome to theCUBE's coverage of VMworld 2021. I'm Lisa Martin. I've got a CUBE alum with me next. Ajay Patel is here, the SVP and GM of Modern Apps and Management at VMware. Ajay, welcome back to the program, it's great to see you. >> Well, thank you for having me. It's always great to be here. >> Glad that you're doing well. I want to dig into your role as SVP and GM of Modern Apps and Management. Talk to me about some of the dynamics of your role, and then we'll get into the vision and the strategy that VMware has. >> Makes sense. VMware has created a business group called Modern Apps and Management with the single mission of helping our customers accelerate their digital transformation through software, and we're finding them leveraging both the edge and the multiple clouds they deploy on. So our mission here is helping them, being the cloud agnostic partner for application development and management through our portfolio of Tanzu and vRealize solutions, allowing customers to both build and operate applications at speed across these edge, data center, and cloud deployments. And the big thing we hear is all the day-two challenges, right, of managing costs, risks, security, performance. That's really the essence of what the business group is about: how do we speed idea to production and allow you to operate at scale? >> When we think of speed, we can't help but think of the acceleration that we've seen in the last 18 months, businesses transforming digitally to first survive the dynamics of the market. But talk to me about how the pandemic has influenced and catalyzed VMware's vision here. >> You can see in every industry this need for speed has really accelerated. What used to be weeks and months of planning and execution has materialized into getting something out in production in days. 
One great example I can remember is a financial services customer that was responsible for getting all the COVID payments out to the small businesses, and being able to get that application from idea to production in a matter of 10 days. It was just truly impressive to see the teams come together, come up with the idea, put the software together, and get it into production so that we could start delivering the funds the companies needed to keep them viable. So great social impact and great results in a matter of days. >> And again, that acceleration that we've seen, there have been a lot of silver linings, I think. But I want to get next into some of the industry trends that are influencing app modernization. What are you seeing in the customer environment? What are some of those key trends that are driving adoption? >> I mean, this move to cloud is here to stay, and most customers have a cloud first strategy, and we've rebranded this at VMware as the cloud smart strategy. But it's not just about one particular flavor of cloud. We're putting the best workload on the best cloud. But the reality is, when I speak to many of the customers, they're way behind on digital transformation. And that's because it's not the simple choice of, you know, lift and shift or completely rewrite. There's no one size fits all, and they're struggling with how their capabilities, their development teams, their IT assets, their applications are modernized across these three things. So we see modernization kind of fall into three categories: infrastructure modernization, the practice of development or DevOps modernization, and the application transformation itself. And we are starting to find that customers are struggling with all three. While they want to leverage the best of cloud, they just don't have the skills or the expertise to do that effectively. >> And how does VMware help address that skills gap?
>> Yeah, so the way we've looked at it is we've put a lot of effort around education. So, you know, everyone knows containers and Kubernetes is the future. They're looking to build these modern microservices architectures and applications. A lot of investment in just kind of putting in the effort to help customers learn these new tools and techniques and create best practices. So KubeAcademy, and the effort and the investment we're putting into just enabling the ecosystem with the skills and capabilities, is one big effort that VMware is making. But more importantly, on the product side, we're delivering solutions that help customers both design, build, deliver, and operate these applications on Kubernetes across the cloud of choice. I'm most excited about our announcement around this product we're just launching, called Tanzu Application Platform. It is what we call an application-aware platform. It's about making it easy for developers to take their ideas and get them into production. It kind of bridges that gap that exists between development and operations. We hear a lot about DevOps, as you know; how do you bring that to life? How do you make that real? That's what Tanzu Application Platform is about. >> I'm curious how your customer conversations have changed in the last year or so in terms of app modernization, with things like security being board-level conversations. Are you noticing that rising up the chain, that app modernization is now a business-critical initiative for businesses? >> So what I'm finding is it's the means. Think about the board-level conversations about digital transformation: you know, I'm a financial services company, I need to provide mobile FinTech, I'm competing with this new-age application, and you're delivering the same service that they offered, digitally now, right. Like from a retail bank: I can't go to the store, the retail branch, anymore, right.
I need to provide the same capability for payments processing, all online, through my mobile phone. So it's really the digitalization of the traditional processes that we're finding most exciting. In order to do that, we're finding that these applications weren't born in the cloud, right? They had to take the existing financial applications and put a mobile front end on them, or put some new business logic in, or drive some transformation there. So it's really a transformation around existing applications to deliver a business outcome. And we're focusing on it through our Tanzu Labs services, our capabilities in Tanzu Application Platform, all the way to the operations and management of getting these applications into production. So it's the full life cycle from idea to production that customers are looking for. They're looking to compress the cycle time, as you and I spoke about, to get the agility they're looking for. >> Right, definitely a compressed cycle time. Talk to me about some of the other announcements being made at VMworld with respect to Tanzu and helping customers on the app modernization front, and how they align to the vision and mission that you talked about. >> Wonderful. I would put them in three buckets. One is what we're doing to help developers get access to the new technology, back to the skills and learning part of it. I'm most excited about Tanzu Community Edition and the Tanzu Mission Control Starter pack. This is really about getting Kubernetes stood up in your deployment of choice and getting started building your application very quickly. We're also announcing Tanzu Application Platform, which I spoke about; we're going to beta 2 for that platform, which makes it really easy for developers to get access to Kubernetes capability. It makes development easy.
We're also announcing marketplace enhancements, allowing us to take best-of-breed ISV solutions and make them available to help you build applications faster. So one set of announcements around building applications, delivering value, getting them to market very quickly. On the management side, we're really excited about the broad portfolio of management we've assembled. We're providing the customers a way to build a cloud operating model. And in the cloud operating model, it's about how do I do VMs and containers? How do I provide a consistent management control plane so I can deliver applications on the cloud of my choice? How do I provide intrinsic observability and intrinsic security so I can operate at scale? So this combination of development tooling, platform operations, and day-two operations, along with enhancements in our cost management solution with CloudHealth, and being able to take our universal capabilities for consumption, driving insight and observability, really makes it a powerful story for customers, on either the build or develop or deploy side of the equation. >> You mentioned a couple of interesting things: consistency being key from a management perspective, especially given this accelerated time in which we're living, but also security. We've seen so much movement on the security front in the last year and a half, with the massive rise in ransomware attacks, ransomware now becoming a household word. Talk to me about the security factor and how you're helping customers from a risk mitigation perspective, because now it's not if we get attacked, it's when. >> And I think it really starts with this notion we have of a secure software supply chain. We think of software as a production factory from idea to production. And if you don't start with known good artifacts, trying to wire in security after the fact is just too difficult.
So we start with secure content: curated images, content catalogs that customers are setting up as best practices. We start with application accelerators; these are best practices codified with the right guardrails in place. And then we automate that supply chain so that you have checks at every step of the way, whether it's in the build process, the deploy process, or in runtime production. And you have to do this at the application layer, because there is no kind of firewall or edge you can protect; the application is highly distributed. So things like application security and API security. Another area where we announced a new offering at VMworld is API security; everything starts with an API endpoint when you think about security. So security is woven into the design, build, deploy, and runtime operation. And we've wired this in intrinsically to the platform, with best-of-breed security partners now extending and evolving their solutions on top of us. >> What's been some of the customer feedback on the new technologies that you've announced? I imagine, knowing how customer-centric VMware is, customers were essential in the development and iteration of the technologies, but give me some idea of the customer feedback on this direction that you're going. >> Yeah, there's a great, exciting example where we're working with the army to create a software factory. You would've never imagined, right, the US Army being a digital software enterprise. We're partnering with what we call the US Army Futures Command in a joint effort to help them build the first-ever software development factory, where army personnel are actually becoming true cloud native developers. We're putting the soldiers through cloud native development, everything in terms of the practice of building software, but also using the Tanzu portfolio in delivering best-in-class capability.
This is going to rival some of the top tech companies in Silicon Valley. This is a five-year prototype project in which we're picking cohorts of soldiers, making them software developers, and helping them build great capability through a combination of classroom-based training and the strong technical foundation and expertise provided by our labs. So this is an example where, you know, the industry is working with the customer to co-innovate how we build software, but also driving the expertise of these personnel. As a soldier, you know what you need; what if you could start delivering solutions for the rest of your members in a productive way? So it's very exciting. It's an example where we're leapfrogging, delivering the kind of Silicon Valley-type innovation into standard practice. It's traditionally been a procurement-driven model; we're trying to speed that up and drive it into a more agile delivery factory concept as well. So it's one of the most exciting projects I've run into in the last six months. >> The army software factory, I love that. My dad was an army medic, a combat medic, in Vietnam, and I'm sure he probably wouldn't have been apt to become a software developer. But tell me a little bit about it; it's a very cool project and so essential. Talk to me a little bit about the impetus of the army software factory. How did that come about? >> You know, this came about with strong sponsorship from the top. I had an opportunity to be at the opening of the campus, in partnership with the local Austin college. And as General Milley and team spoke about it, they just said the next battleground is going to be digital. We're going to have to put our troops in place and modernize, not just the army, but the way we deliver, through software. It speaks so much to the digital transformation we're talking about, right.
At the very heart of it is using software to enable, whether it's medics, whether it's supplies, or real-time intelligence on the battlefield, knowing what's happening. And we're starting to see that technology is going to drive this dramatically; hopefully the next war we don't have to fight, it's more of a defensive mode, but that capability alone is going to be significant. So it's really exciting to see how technology has become pervasive in all aspects, in every domain, including the US Army. And this partnership is a great example of thought leadership from the army command to deliver software as the innovation factory for the army itself. >> Right, and for the army to rival Silicon Valley tech companies, that's pretty impressive. >> Pretty ambitious, right, in partnership with one of the local colleges. So that's also starting to show in terms of how to bring new talent out, that shortage of skills we talked about. It's a critical way to kind of invest in the future and in our people, right, as we build out this capability. >> That's excellent. That investment in the future and helping fill those skills gaps across industries is so needed. Talk to me about some of the things that you're excited about. This year's VMworld is again virtual, but what are some of the things that you think are really fantastic for customers and prospects to learn? >> I think, as Raghu said, we're in the third act of VMware, but more interestingly, the third act of where the cloud is. The cloud has matured. Cloud 2.0 was really about lifting and shifting, using a public cloud for the IaaS capabilities. Cloud 3.0 is about using the cloud of choice for the best application. We are going to increasingly see this distributed nature of applications. I ask most customers, where does your application run? It's hard to answer that, right? It's on your mobile device, it's in your storefront, it's in your data center, it's in a particular cloud.
And so an application is a collection of services. So what I'm most excited about is all business capabilities being published as APIs. I had an opportunity to be part of a company called Sonoa, and then Apigee, and we talked about API management years ago. I see increasingly this need for being able to expose a business capability as an API, being able to compose these new applications rapidly, being able to secure them, being able to observe what's going on in production, and then adjust and automate; you can scale up, scale down, or deploy the application where it's most needed, in minutes. That's the dynamic future that we see, and we're excited that VMware is right at the heart of it as the cloud agnostic software player that can help you, whether it's your development challenges, your deployment challenges, or your management challenges, in the future of multi-cloud. That's what I'm most excited about: we're set up to help our customers on this cloud journey, regardless of where they're going and what solution they're looking to build. >> Ajay, what are some of the key business outcomes that the cloud is going to deliver across industries as things progress forward? >> I think the consistent message I hear from our customers is, leverage the power of cloud to transform my business. So it's about business outcomes; it's less about technology. It's what outcomes we're driving. Second, it's about speed and agility: how do I respond and adjust, kind of dynamically and continuously? How do I innovate continuously? How do I adjust to what the business needs? And the third thing we're seeing more and more is, I need to be able to manage costs, get some predictability, and be able to optimize how I run my business. What they're finding with the cloud is the costs are running out of control; they need a better way of knowing the value that they're getting, and of using the best cloud for the right technology.
Whether that may be a private cloud in some cases, a public cloud, or an edge cloud, they want to be able to select and move and have that portability. Being able to make those choices and optimize is something they're demanding from us. And so we're most excited about this need for a flexible, cloud agnostic infrastructure that helps them deliver these kinds of business outcomes. >> You mentioned a couple of customer examples in financial services, and you mentioned the army software factory. In terms of looking at where we are in 2021, are there any industries in particular, maybe essential services, that you think are really prime targets for the technologies, the new announcements that you're making at VMworld? >> You know, what we're seeing is a broad change that's happening. If you're in retail, you know, you're kind of running a hybrid world of digital and physical. So we're seeing this blending of physical and digital reality coming together. You know, FedEx is a great customer of ours, and you've seen them spoken of as an example of it. They continue to drive operational change in terms of delivering the packages to you on time at a lower cost, but on the other side, they're also competing with their primary partners and retailers, in some cases, right, from a distribution perspective, with Amazon, with Amazon Prime. So in every industry, you're starting to see the lines blurring between traditional partners and competitors. And in doing so, they're looking for a way to innovate, innovate at speed, and leverage technology. So I don't think there is a specific industry that's not being disrupted, whether it's FinTech, whether it's retail, whether it's transportation and logistics, or healthcare and telemedicine, right? The way you do pharmaceuticals, how you deliver medicine, it's all changing. It's all being driven by data.
And so we see a broad application of our technology, but financial services, healthcare, telco, and government tend to be the kind of traditional industries that are with us. I think the reach is pretty broad, though. >> Yeah, it is all changing. Everything is becoming more and more data-driven, and many businesses are becoming data companies, or if they're not, they need to; otherwise their competition, as you mentioned, is going to be right in the rear view mirror, ready to take their place. But that's something that I don't think is being talked about enough: some of the great innovations coming as a result of the situation that we're in. We're seeing big transformations in industries where we're all benefiting. I think we need to get that word out there a little bit more, so we can start showing more of those silver linings. >> Sure. And I think what's happening here is it's about connecting the people to the services. At the end of the day, these applications are a means for delivering value. And so how do we connect us as consumers, or us as employees, or us as partners, to the business, to the operator, both digitally and in a physical way, and bring that into a seamless experience? So we're seeing more and more that experience matters, you know; service quality and delivery matter. It's less about the technologies; back again to the outcomes. And so we're very much focused on building the platform that our customers can use to leverage the best of the cloud, the best of their people, the best of the innovation they have within the organization. >> You're right, it's all about outcomes. Ajay, thank you for joining me today, talking about the mission of your organization, the vision, and some of the new products and technologies being announced at VMworld. We appreciate your time, and hopefully next year we'll see you in person. >> Thank you again, and I look forward to the next VMworld in person. >> Likewise.
You're very welcome. For Ajay Patel, I'm Lisa Martin, and you're watching theCUBE's coverage of VMworld 2021. (soft music)
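The "known good artifacts" idea at the heart of the secure software supply chain Patel describes can be sketched in a few lines: gate each pipeline step on a content digest drawn from a curated catalog. This is an illustrative sketch only, not Tanzu's implementation; the artifact name and the in-memory catalog here are hypothetical stand-ins for a signed content catalog.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Content-address an artifact by its SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()

def is_known_good(data: bytes, curated: dict, name: str) -> bool:
    """Gate a pipeline step: the artifact must match its curated digest exactly."""
    expected = curated.get(name)
    return expected is not None and sha256_digest(data) == expected

# Build a tiny curated catalog from a trusted copy of the artifact.
# (A real supply chain would use a signed catalog, not an in-memory dict.)
trusted_build = b"payments-service v1.2 binary contents"
catalog = {"payments-service": sha256_digest(trusted_build)}

print(is_known_good(trusted_build, catalog, "payments-service"))         # True
print(is_known_good(b"tampered contents", catalog, "payments-service"))  # False
```

The same check would run at build, deploy, and admission time, which is why starting from known good content is cheaper than trying to bolt security on after the fact.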

Published Date : Oct 6 2021


SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Lisa Martin | PERSON | 0.99+
Ajay Patel | PERSON | 0.99+
VMware | ORGANIZATION | 0.99+
Sonos | ORGANIZATION | 0.99+
Silicon valley | LOCATION | 0.99+
FedEx | ORGANIZATION | 0.99+
Vietnam | LOCATION | 0.99+
Apogee | ORGANIZATION | 0.99+
Amazon | ORGANIZATION | 0.99+
Tanzu | ORGANIZATION | 0.99+
10 days | QUANTITY | 0.99+
2021 | DATE | 0.99+
Ajay | PERSON | 0.99+
third | QUANTITY | 0.99+
Second | QUANTITY | 0.99+
One | QUANTITY | 0.99+
Cloud 3.0 | TITLE | 0.99+
one | QUANTITY | 0.99+
first | QUANTITY | 0.99+
next year | DATE | 0.99+
two challenges | QUANTITY | 0.99+
today | DATE | 0.98+
third act | QUANTITY | 0.98+
Raghu | PERSON | 0.98+
both | QUANTITY | 0.98+
last year | DATE | 0.98+
Tazu | ORGANIZATION | 0.97+
VMworld 2021 | EVENT | 0.97+
Austin | LOCATION | 0.97+
VMWorld | EVENT | 0.97+
Kubernetes | TITLE | 0.97+
first strategy | QUANTITY | 0.96+
three | QUANTITY | 0.95+
US | ORGANIZATION | 0.95+
VMworld | ORGANIZATION | 0.95+
this year | DATE | 0.95+
VRealize | ORGANIZATION | 0.95+
single mission | QUANTITY | 0.95+
five-year prototype | QUANTITY | 0.95+
Modern Apps and Management | ORGANIZATION | 0.94+
beta 2 | OTHER | 0.93+
prime | COMMERCIAL_ITEM | 0.93+
three buckets | QUANTITY | 0.91+
last six months | DATE | 0.89+
SVP | PERSON | 0.87+
Modern Apps | ORGANIZATION | 0.86+
three things | QUANTITY | 0.84+
two | QUANTITY | 0.84+
three categories | QUANTITY | 0.83+
cloud 2.0 | TITLE | 0.83+
last year and a half | DATE | 0.8+
VMWorld of 2021 | EVENT | 0.78+
pandemic | EVENT | 0.78+
day | QUANTITY | 0.77+
one set | QUANTITY | 0.76+
theCUBE | ORGANIZATION | 0.76+
US army | ORGANIZATION | 0.75+
theCUBE academy | ORGANIZATION | 0.73+
COVID | OTHER | 0.73+
last 18 months | DATE | 0.72+
CUBE | ORGANIZATION | 0.71+
telco | ORGANIZATION | 0.67+
VM world | EVENT | 0.66+

Show Wrap with DR


 

(upbeat music) >> Hey, we're back here on theCube. This is day three of our coverage, right here in the middle of all the action of Cloud City at Mobile World Congress. This is the hit of the entire show in Barcelona, not only in person, but out on the interwebs virtually. This is a hybrid event. This is back to real life, and theCube is here. I'm John Furrier with Dave Vellante, and D. R. is here, Danielle Royston. >> Totally. >> Welcome back to theCube for the fourth time, now at the anchor desk. >> I don't know. It's been a busy day. It's been a busy week. It's been an awesome week. >> Dave: Feeling good? >> Oh, my god. >> You made the call. >> I made the call. You finished your podcast, what, months ago? >> Yeah. >> Made the call. >> Made the call. You're on the right side of history. >> Right? And people were like, "It's going to be canceled. COVID won't be handled." Blahbity blah. >> She's crazy. >> And I'm like, nope. She's crazy. I'm okay with that. Right? But I'm like... >> Crazy good. >> Right, I'm like, I'm forward-looking in a lot of ways. And we were looking towards June, and we're like, "I think this is going to be the first event back. We're going to be able to do it." >> You know, the crazy ones commercial that Apple ran, probably one of the best commercials of all time. You can't ignore the crazy ones, in a good way. You can't ignore what you're doing. And I think, to me, what I'm so excited about is, 'cause we've been covering cloud. We're cloud bigots. We love the cloud, public cloud. We've been on that train from day one. But when you hear the interviews we did here on theCube, the interviews with the top people at Google, at Amazon Web Services. We're talking about the top people, both technology leaders like Bill Vass and the people who run the telecom verticals, like Alfonso. >> Adolfo, I mean, Hernandez. >> We had Google's top networking executive.
We had their industry leader in telecom, Microsoft, and the silicon players. All are validating, and it's like surround sound to what you're saying here. And it cannot be ignored. >> I mean, we are coming to a big moment in telco, right? And I mean, I've been saying that it's coming. I called 2021 the year of public cloud and telco. It helped that Ericsson bailed. So thank you, Ericsson people. >> Dave: It was a gift. >> It was a gift. >> John: It really was. >> It really was a gift. And it was not just for me, but I think also for the vendors in the booth. I mean, we have a Cloud City army, right? Here we go. Let's start marching. And it's awesome. >> He reminds me of that baseball player that took a break 'cause he had a hangover, and Cal Ripken. >> Cal Ripken, right, yeah, yeah. What was that guy's name? >> Did it really happen? >> Yeah, he took a break and... >> The new guy stepped in? >> Yeah, and so we'll go to Cal Ripken. >> No, no, so before, who was it? Lou Gehrig. >> Lou Gehrig, yeah. >> Right, so Lou Gehrig was nobody. And we can't remember the guy's name. Nobody knows the guy's name. >> Danielle: Yeah, yeah. >> What was that guy's name? Nobody knows. Oh, 'cause Lou Garrett, he got hurt. >> Danielle: And Lou Gehrig stepped in. >> He sat out, and Lou Gehrig replaced him. >> Danielle: Love it. >> And never heard of him again. >> Danielle: I'll take that. >> Never missed a game. Never missed a game for his entire career. So again, this is what Ericsson did. They just, okay, take a break and...
I mean, that is, that's big, and that's not me. That's not my organization. That's other organizations that are benefiting from this energy. Oh, that's awesome. >> The post-isolation economy has become a living metaphor for transformation. And I've been trying to sort of grok and put the pieces together as to how this thing progresses. And my interview with Portaone, in particular, >> Danielle: Yeah. >> really brought it into focus for me, anyway. I'd love to get your thoughts. One of the things we haven't talked much about is public policy. And I think about all the time, all the discussion in the United States about infrastructure, this is critical infrastructure, right? >> Danielle: Yeah. >> And the spectrum is a country like South Africa saying, "Come on in. We want to open up." >> Danielle: Yeah. >> "We want to innovate." And to me that's to me, that's the model for these tier two and tier three telcos that are just going to disrupt the big guys. Whereas, you know, China, may be using the other end of the spectrum, very controlling, but it's the former that is going to adopt the cloud sooner. It's going to completely transform the next decade. >> Yeah, I think this is a great technology for a smaller challenger CSP that still is a large successful company to challenge the incumbents that are, they are dinosaurs too. They move a little bit slow. And maybe if you're a little bit faster, quicker dinosaur you'll survive longer. Maybe it will be able to transform and a public cloud enables that. And I think, you know, I'm playing the long game here, right? >> Dave: Yeah. >> Is public cloud ready for every telco in every corner of the world? No. And there's a couple of things that are barriers to that. We don't really talk about the downsides, and so maybe we sort of wrap up with, there are challenges, and I acknowledge there are challenges. You know, in some cases there are data regulations and issues, right? And you can't, right? 
There's not a hyperscaler in your country, right? And so you're having a little bit of a challenge. But you trend this out over 10 years, and then pace it with how the hyperscalers are building new data centers. They're each at 25-plus, plus or minus a few, right? They're marching along, and you trend this out over 10 years, and I think one of two things happens: your data regulations are eased, or a hyperscaler appears in a place you can use. And those points converge, and hopefully the software's there, and that's my effort. And, yeah. >> You know what's an interesting trend, D. R., John, that is maybe a harbinger to this? You just mentioned something. Where the hyperscalers might not have a presence in a country, you know what they're doing? And our data shows this. I do that weekly series "Breaking Analysis," and in the data, OpenStack was popping up. >> Danielle: Yeah. >> Like, where does OpenStack come from? Well, guess what. When you cut the data, it was telcos using open source to build clouds in regions where there was no hyperscaler. >> Where it didn't exist, yeah. >> So it's a-- >> Gap-filler. >> Yeah, it's a gap-filler. It's a Band-aid. >> But I think this is where something like Outposts is such a great idea, right? Getting Outposts in, and I think Microsoft has the ability to do this as well, Google less so, right? They're not providing the stack. They're doing Anthos, so you're still managing this, the rack, but they're giving you the ability to tap into those services. But I was talking to a CTO in Bolivia. He was like, "We have data privacy issues in our country. There's no hyperscaler." Not sure Bolivia is like next on the list for AWS, right? But he's like, "I'm going to build my own public cloud." And I'm like, "Why would you do that when you can just use Outposts?" And then when your data regulations ease, or they get to Bolivia, you can switch, and you're on the stack and you're ready to go. I think that's what you should do.
You should totally do that. >> Yeah, and one of the things that's come up in the interviews on theCube and here at the show is that there are risk takers and innovators, and there's operators. And this has been the consistent theme around, yeah, the on-premises world. You mentioned the regulation reasons, and/or some workloads just have to be on premises for security reasons, whatever. That's the corner case. >> Danielle: Yeah. >> But the operating model of the technology architecture has shifted. >> Danielle: Yep. >> And that reality, I don't think, is debatable. So I've got to ask you this, because I'm really curious. I know you get a lot of people needling you, oh, the public cloud's just hosting, but why aren't people getting this architectural shift? I mean, you mentioned Outposts, and Wavelength, which Amazon has, is a game changer. It's Amazon's cloud at the hub. >> Yeah, at the edge, yeah. >> Okay, that's low latency again: low-hanging-fruit applications, robotics, whatnot. I mean, that's an architectural dot that's been connected. >> Yeah. >> Why aren't people getting it? >> In our industry, I think it is a lot of not-invented-here syndrome, right? And that's a very sort of nineties thought, and I have been advocating: stand on the shoulders of the greatest technologists in the world, right? And you know, there is a geopolitical US thing. I think we lived through a presidency that had a sort of nationalistic approach, and a lot of those conversations pop up. But I've also looked at these guys and I'm like, you still have your Huawei kit installed, and there's concerns with that, too. And you picked it because of cost, and it's really hard to switch off of. >> John: Yeah. >> So give me a break with your public cloud USA stuff, right? You can use it. You're just making excuses. You're just afraid. What are you afraid of? The HR implications? Let's talk about that, right? And the minute I take it there, the conversation changes.
>> I talked to Teresa Carlson when she was running the public sector at AWS. She's now president of Splunk. I call her a Renaissance woman. She's been a great leader. In the public sector there's been this weird little pocket of AWS where it's, I guess, a sales division, but it's still its own company. >> Danielle: Yeah. >> And she just did the CIA deal. The DOD and the public sector partnerships are now a lot more private relationships. So it's not just governments; you mentioned government and national security and these things. You start to see the ecosystem not just be about companies; it's government and the private sector. So this whole vibe of the telecom being regulated, unregulated, unbundled is an interesting kind of theory. What are your thoughts and reactions to this kind of melting pot of ecosystem change and evolution? >> Yeah, I mean, I think there's a very nationalistic approach by the telcos, right? They sort of think about the countries that they operate in. There's a couple of groups that go across multiple countries, but can there be a global telco? Can that happen, right? Just like, as you were saying earlier, Netflix, right? You didn't say Netflix UK, right? So can we have a global telco, right? That is challenging on a lot of different levels. But think about it: the public cloud starts to enable that idea, right? Elon Musk is going to get to Mars. >> Dave: Yep. >> John: Yeah. >> You need a planetary-level telco, and I think that day is, I mean, I don't think it's tomorrow, but I think that's like 10, 20 years away. >> We're going to see it start this decade. It's already starting. >> Danielle: Yeah. >> And we're going to see the fruits of that dividend. >> Danielle: Right, yeah. >> I've got to ask you. You're a student of the industry, and you've got so much experience. It's great to have you on theCube to chat about, riff about, these things, but the classic "Who's ready for disruption?" question comes up. And I think there's no doubt that the telcos, as an industry, have been slow moving, and their role and importance have changed. People need to have internet access. They need that access.
And I think there's no doubt that the telcos, as an industry, have been slow moving, and their role and importance have changed. People need to have internet access. They need that access. >> Danielle: Yeah. >> So you've got the edge. Applications are now running on it, and since the iPhone 14 years ago, as you pointed out, people now are interested in how packets move. >> Danielle: Yeah. >> That's fast, whether it's a doctor or an emergency worker or someone. >> What would we have done in 2020 without the internet and broadband and our mobile phones? I mean. >> Dave: We would have been miserable. >> You know, I think about 1920 when the Spanish flu pandemic hit a hundred years ago. Those guys did not have mobile phones, and they must have been bored, right? I mean, what are you going to do? Right? And so, yeah, I think, I think last year really moved a lot of thinking forward in this respect, so. >> Yeah, it's always like that animal out in the Serengeti that gets taken down, you know, by the cheetah or the lion. How do you know when someone is going to be disrupted? What's the, what's the tell sign in your mind? You look at the telco landscape; what does someone waiting to be disrupted or replaced look like? >> Know what? They're ostriches. Ostriches, how do you say that word right? They stick their head in the sand. Like they don't want to talk about it. La, la, la, I don't want to. I don't want to think about it. You know, they bring up all these like roadblocks, and I'm like, okay, I'm going to come visit you in another six months to a year, and let's see what happens versus the guys that are moving fast, that are open-minded to this. And it's, I mean, when you start to use the public cloud, you don't like turn it on overnight. You start experimenting, right? You start. You take an application that is non-threatening. You have, I mean, these guys are running thousands of apps inside their data centers. Pick some boring ones.
Pick some old ones that no one likes. Move that to the public cloud. Play with it, right? I'm not talking about moving your whole network overnight tomorrow. You got to learn. You have, I mean, very little talent in the telco that knows how to program against the AWS stack. Start hiring. Start doing it. And you're going to start to learn about the compensation. And I used to do compensation, right? I spent a lot of time in HR, right? The compensation points and structures, and how they compare at AWS and Google versus a telco. You want telco stock? Do you want Google stock? >> John: Right, where do you want to go? >> Right? Right? And so you need to start. Like that's going to challenge the HR organization in terms of compensation. How do we compensate our people when they're learning these new, valuable skills? >> When you think about disruption, you know, the master or the professor of disruption, Clay Christensen, one of the best lectures he ever gave was when we were at Cambridge, and he gave a lecture on the steel industry and he was describing it. There were like four layers of value in the steel industry, the value chain. It started with rebar, like the lowest end. Right?
You're there by yourself, the Game of Thrones, and they're coming at you. >> John: You need a dragon. >> What are you doing about it? >> I need a dragon. I need a dragon to compete in this market. Riding on the dragon would be a good strategy. >> I know. I was just watching. 'Cause I have a podcast. I have a podcast called "Telco in 20," and we always put like little nuggets in the show notes. I personally review them. I was just reviewing the one for the keynote that we're putting out. And I had a dragon in my keynote, right? It was a really great moment. It was really fun to do. But there's, I don't know if you guys are Game of Thrones fans. >> Dave: Oh, yeah. >> John: For sure. >> Right? But there's a great moment when Daenerys gets her dragons, the baby dragons, and she takes over the Unsullied Army. Right? And it's just this, right? Like all of a sudden, the tables turn in an instant where she has nothing, and she's like on her quest, right? I'm on a quest. >> John: Comes out of the fire. >> Right, comes out of the fire. The unburnt, right? She has her dragons, right? She has them hatch. She takes over the Unsullied Army, right? Slays and starts her march, right? And I'm like, we're putting that clip into the show notes because I think that's where we are. I think I've hatched some dragons, right? The Cloud City Army, let's go, let's go take on Telco. >> John: Well, I mean to me... >> Easy. >> You definitely have made it happen because I heard many people talking about cloud. This is turning into a cloud show. The question is, when does this become a cloud show? You know, Cloud City is a big section of the show. I mean, all the big players are behind it. >> Danielle: Yeah, yeah. >> Amazon Web Services, Google, Azure, Ecosystem, startups thinking differently, but everyone's agreeing, "Why aren't we doing this?" >> I think, like I said, I mean, people are like, you're such a visionary. And how did, why do you think this will work?
I'm like, it's worked in every other industry. Am I really that visionary? And like, these are the three best tech companies in the world. Like, are you kidding me? And so I think we've shown the momentum here. I think we're looking forward to 2022, you know? And as we see 2022, you get to start planning this the minute we get back. Right? >> John: Yeah. >> Like I wouldn't recommend doing this in a hundred days again. That was very painful, but you know, February, I was, there's a sign inside MWC, February 28th, right? We're talking seven months. You got to get going now. >> John: Let's get on the phone. (John and Dave talking at the same time) >> I mean, I think you're right on. I mean, you know, remember Skype in the early days? >> Danielle: Yeah, yeah, yeah, yeah. >> It wasn't regional. >> Danielle: Yeah. >> It was just plug into the internet, right? >> Danielle: It was just Skype. It was just WhatsApp. >> Well, this great location, and if you can get a shot, guys, of the people behind us. I don't know if you can. If you're watching, check out the scene here. It's winding down. A lot of people having happy hour now. This is a social construct here at Cloud City. Not only is it chock full of information, reporting that we're doing and getting all the data, and with the presentations on the main stage with Adam and the studio and the team. This is a place where people are meeting and there's deals being done face to face, intimate relationships. The best of the best are here. They make the trek, so there's been a successful formula. Of course theCube is in the middle of all the action, which we love. We're excited to be back. I want to thank you personally while we have you on stage here.
We talked about this a hundred days ago, 90 days ago. Could we get theCube there and do the show, but also theCube. >> You are a visionary. And you said, a made-for-TV hybrid event with your team, produced television shows, theCube. We're digital. We love you guys. Great alignment, but it's magical because the content doesn't end here. The show might end. They might break down the beautiful plants and the exhibits, but the community is going to continue. The content and the conversations. >> Yeah. >> So. >> We are looking forward to it and. >> Yeah, super-glad, super-glad we did this. >> Awesome. Well, any final moments that you would like to share? In the last two minutes we have: favorite moments, observations, funny things that have happened to you, weird things that have happened to you. Share something that people might not know or a favorite moment. >> I think, I mean I don't know that people know we have a 3D printer in the coffee shops, and so you can upload any picture, and they are 3D printing coffee art, right? So I've been seeing lots of social posts around people uploading their, their logos and things like that. I think Jon Bon Jovi, he was super-thankful to be back. He thanked me personally two different times, like, I'm just glad to be out in front of people. And I think just even just the people walking around, thank you for being brave, thank you for coming back. You've helped Barcelona, and we're happy to be together even if it is with masks. It's hard to do business with masks on. Everyone's happy and psyched. >> The one thing that people cannot do relative to you is they cannot ignore you. You are making great big waves. >> Danielle: I shout pretty loud. It's kind of hard to ignore me. >> Okay, you're making a great big wave. You're on the right side, we believe, of history. Public cloud is driving the bus down main street of Cloud City, and if people don't get out of the way, they will be under the bus.
>> And like I said, in my keynote, it's go time. Let's do it. >> Okay, thank you so much for all your passion and mission behind the cloud and the success of... >> Danielle: We'll do it again. We're going to do it again soon. >> Totogi's hundred million dollar investment. She's the CEO of Totogi, and we'll follow that progress. And of course, Telco DR, Danielle Royston, the digital revolution. Thanks for coming on theCube. >> Thank you, guys. It was super-fun. Thank you so much. >> This is theCube. I'm John Furrier with Dave Vellante. We're going to send it back to Adam in the studio. Thanks to the team here. (Danielle clapping and cheering) I want to thank the team, everyone here. Adam is great. Chloe, great working with you guys. Awesome. And what a great crew. >> So great. >> Thank you everybody. That's it for theCube here on the last day, Wednesday, of the show. Stay tuned for tomorrow, more action on the main stage here in Cloud City. Thanks for watching.

Published Date : Jul 1 2021



General Keith Alexander, IronNet Cybersecurity & Gil Quiniones, NY Power Authority | AWS PS Awards


 

(bright music) >> Hello and welcome to today's session of the 2021 AWS Global Public Sector Partner Awards for the award for Best Partner Transformation, Best Cybersecurity Solution. I'm now honored to welcome our next guests, General Keith Alexander, Founder and Co-CEO of IronNet Cybersecurity, as well as Gil Quiniones, President and CEO of the New York Power Authority. Welcome to the program, gentlemen, delighted to have you here. >> Good to be here. >> Terrific. Well, General Alexander, I'd like to start with you. Tell us about the collective defense program or platform and why it is winning awards. >> Well, great question, and it's great to have Gil here because it actually started with the energy sector. And the issue that we had is how do we protect the grid? The energy sector CEOs came together with me and several others and said, how do we protect this grid together? Because we can't defend it each by ourselves. We've got to defend it together. And so the strategy that IronNet is using is to go beyond the conventional way of sharing information, known as signature-based solutions, to behavioral-based ones, so that we can see the events that are happening, the unknown unknowns, and share those among companies, both small and large, in a way that helps us defend, because we can anonymize that data. We can also share it with the government. The government can see attacks on our country. That's the future, we believe, of cybersecurity, and that collective defense is critical for our energy sector and for all the companies within it. >> Terrific. Well, Gil, I'd like to shift to you. As the CEO of the largest state public power utility in the United States, why do you think it's so important now to have a collective defense approach for utility companies? >> Well, the utility sector is tied with the financial sector as the number one target of our adversaries, and you can't really solve cybersecurity in silos.
We at NYPA, my company, the New York Power Authority, cannot be the only one doing this; other companies can't do it in silos either. What's really going to be effective is if all of the utilities, and even other sectors, the financial sector, the telecom sector, cooperate in this collective defense. And as we transform the grid, the grid is getting transformed and decentralized. We'll have more electric cars, smart appliances. The grid is going to be more distributed with solar, batteries, and charging stations. So the threat surface and the threat points will be expanding significantly, and it is critical that we address that issue collectively. >> Terrific. Well, General Alexander, with collective defense, what industries and business models are you now disrupting? >> Well, we're doing the energy sector, obviously. Now the defense industrial base, the healthcare sector, as well as international partners along the way. And we have a group of what we call technical and other companies that we also deal with and a series of partner companies, because no company alone can solve this problem, no cybersecurity company alone. So partners like Amazon and others partner with us to help bring this vision to life. >> Terrific. Well, staying with you, what role does data and cloud scale now play in solving these security threats that face the businesses, but also nations? >> That's a great question. Because without the cloud, bringing collective security together is very difficult. But with the cloud, we can move all this information into the cloud. We can correlate and show attacks that are going on against different companies. They can see that company A, B, C or D, it's anonymized, is being hit with the same thing. And the government, we can share that with the government. They can see attacks on critical infrastructure, energy, finance, healthcare, the defense industrial base or the government. In doing that, what we quickly see is a radar picture for cyber.
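The anonymized, cross-company correlation described here can be sketched roughly as follows. This is an illustrative assumption of how such a "radar picture" might aggregate behavioral events, not IronNet's actual implementation; all names, indicators, and thresholds are hypothetical.

```python
# Hypothetical sketch: anonymize which company reported an event, then
# correlate events so the same suspicious behavior seen at two or more
# distinct companies surfaces as one shared alert. Illustrative only.
import hashlib
from collections import defaultdict

def anonymize(company: str, salt: str = "demo-salt") -> str:
    """Replace a company name with an opaque but stable identifier."""
    return hashlib.sha256((salt + company).encode()).hexdigest()[:8]

def correlate(events):
    """Group anonymized events by behavioral indicator; an indicator
    seen at 2+ distinct companies becomes a shared alert."""
    seen = defaultdict(set)
    for company, indicator in events:
        seen[indicator].add(anonymize(company))
    return {ind: ids for ind, ids in seen.items() if len(ids) >= 2}

events = [
    ("company_a", "beacon-to-rare-domain"),
    ("company_b", "beacon-to-rare-domain"),
    ("company_c", "unusual-smb-scan"),
]
alerts = correlate(events)
print(alerts)  # only the indicator seen at two companies is surfaced
```

The salting step stands in for the anonymization mentioned above: participants can see that the same behavior hit multiple peers without learning who those peers are.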
That's what we're trying to build. That's where everybody's coming together. Imagine a future where attacks coming against our country can be seen at network speed, and the same for our allies, and sharing that between our nation and our allies begins to broaden that picture, broaden our defensive base and provide insights for companies like NYPA and others. >> Terrific. Well, now Gil, I'd like to move it back to you. Could you describe the utility landscape and the unique threats that both large and small utilities are facing in terms of cybersecurity, and the risks to the populace that lives there? >> Well, the power grid is an amazing machine, but it is controlled electronically and more and more digitally. So as I mentioned before, as we transform this grid to be a cleaner grid, to be more of an integrated energy network with solar panels and electric vehicle charging stations and wind farms, the threats are going to be multiple from a cyber perspective. Now we have many smaller utilities. There are towns and cities and villages that own their poles and wires. They're called municipal utilities, rural cooperative systems, and they are not as sophisticated and well-resourced as a company like the New York Power Authority or our investor-owned utilities across the nation. But as the saying goes, we're only as strong as our weakest link. And so we need- >> Terrific. >> we need to address the issues of our smaller utilities as well. >> Yeah, terrific. Do you see a potential for more collaboration between the larger utilities and the smaller ones? What do you see as the next phase of defense? >> Well, in fact, General Alexander's company, IronNet, and NYPA are working together to help bring in the 51 smaller utilities here in New York into their collective defense tool, the IronDefense, or the IronDome as we call it here in New York. We had a meeting the other day where we're even thinking about bringing in critical state agencies and authorities.
The Metropolitan Transportation Authority, Port Authority of New York and New Jersey, and other relevant critical infrastructure state agencies to be in this cloud and to be in this radar of cybersecurity. And the beauty of what IronNet is bringing to this arrangement is they're trying to develop a product that can be scalable and affordable for those smaller utilities. I think that's important because if we can achieve that, then we can replicate this across the country, where you have a lot of smaller utilities and rural cooperative systems. >> Yeah. Terrific. Well, Gil, staying with you. I'd love to learn more. What was the solution that worked so well for you? >> In cybersecurity, you need public-private partnerships. So we have private companies like IronNet that we're partnering with, and others, but also partnering with state and federal government, because they have a lot of resources. So the key to all of this is bringing all of that information together and being able to react at, as the General mentioned, network speed, or what we call machine speed. It has to be quick, and we need to protect and/or isolate and be able to recover and be resilient. So that's the beauty of this solution that we're currently developing here in New York. >> Terrific. Well, thank you for those points. Shifting back to General Alexander. With your depth of experience in the defense sector, in your view, how can we stay in front of the attacks, mitigate them, and then respond to them before any damage is done? >> So, having run our nation's offense, I know that the offense has the upper hand almost entirely, because every company and every agency defends itself as an isolated entity. Think about 50 mid-sized companies, each with 10 people; they're all defending themselves, and they depend on that defense individually, and they're being attacked individually.
Now take those 50 companies and their 10 people each and put them together in a collective defense where they share information, they share knowledge. This is the way to get out in front of the offense, the attackers that you just asked about. And when people start working together, that knowledge sharing and crowdsourcing is a solution for the future, because it allows us to work together where now you have a unified approach between the public and private sectors that can share information and defend each of the sectors together. That is the future of cybersecurity. What makes it possible is the cloud, by being able to share this information into the cloud and move it around the cloud. So what Amazon has done with AWS is exactly that. It gives us the platform that allows us to now share that information, to go at network speed, and to share it with the government in an anonymized way. I believe that will change radically how we think about cybersecurity. >> Yeah. Terrific. Well, you mention data sharing, but how is it now a common tactic to get the best out of the data? And how has sharing data among companies accelerated or changed over the past year? And what does it look like going forward when we think about moving out of the pandemic? >> So first, on this issue of sharing data, there are two types of data. One is about the known threats. Everybody shares those, because they use a signature-based system and a set of rules. That's shared, and that's the common approach to it. We need to go beyond that and share the unknown. And the way to share the unknown is with behavioral analytics. Detect behaviors out there that are anomalous, suspicious or malicious, share those, and get an understanding for what's going on in company A, and see if there are correlations in B, C and D that give you insights into suspicious activity. Like SolarWinds. Recognize that SolarWinds hit 18,000 companies, each defending themselves.
None of them were able to recognize that. Using our tools, we did recognize it in three of our companies. So what you can begin to see is a platform that can now expand and work at network speed to defend against these types of attacks. But you have to be able to see that information, the unknown unknowns, and quickly bring people together to understand what that means. Is this bad? Is this suspicious? What do I need to know about this? And if I can share that information anonymized with the government, they can reach in and say, this is bad. You need to do something about it. And we'll take the responsibility from here to block that from hitting our nation or hitting our allies. I think that's the key part about cybersecurity for the future. >> Terrific. General Alexander, ransomware, of course, is the hottest topic at the moment. What do you see as the solution to that growing threat? >> So I think there are a couple things on ransomware. First, doing what we're talking about here to detect the phishing and the other ways they get in is an advanced way to protect yourself. But I think we have to go beyond that; we have to attribute who's doing it and where they're doing it from, and hold them accountable. So helping provide that information to our government as it's going on, and going after these guys, making them pay a price, is part of the future. It's too easy today. Look at what happened with DarkSide and others. They hit Colonial Pipeline and they said, oh, we're not going to do that anymore. Then they hit a company in Japan, and prior to that, they hit a company in Norway. So they're attacking, and they pretty much operate at will. Now, let's indict some of them, hold them accountable, get other governments to come in on this. That's the way we stop it. And that requires us to work together, both the public and private sector. It means having these advanced tools, but also that public and private partnership. And I think we have to change the rhetoric.
The first approach everybody takes is, Colonial, why did you let this happen? They're a victim. If they were hit with missiles, we wouldn't be asking that, but these were nation-state-like actors going after them. So now our government and the private sector have to work together, and we need to change that to say, they're a victim, and we're going to go after the guys that did this as a nation and with our allies. I think that's the way to solve it. >> Yeah. Well, terrific. Thank you so much for those insights. Gil, I'd also like to ask you some key questions, and of course, certainly people today have a lot of concerns about security, but also about data sharing. How are you addressing those concerns? >> Well, data governance is critical for a utility like the New York Power Authority. A few years ago, we declared that we aspire to be the first end-to-end digital utility. And so by definition, protecting the data of our system, our industrial controls, and the data of our customers is paramount to us. So data governance, considering or treating data as an asset, like a physical asset, is very, very important. So in our cybersecurity plans, that is a top priority for us. >> Yeah. And Gil, thinking about industry 4.0, how has the surface area changed with cloud and IoT? >> Well, it's grown significantly. At the Power Authority, we're installing sensors and smart meters at our power plants, at our substations and transmission lines, so that we can monitor them real time, all the time, know their health, know their status. For our customers, we're monitoring about 15,000 to 20,000 state and local government buildings across our state. So just imagine the amount of data that we're streaming real time, all the time, into our integrated smart operations center. So it's increasing, and it will only increase with 5G, with quantum computing.
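The kind of real-time health monitoring described here, streaming readings from sensors and smart meters into an operations center, can be sketched in a few lines. The sensor name, window size, and tolerance below are illustrative assumptions for demonstration, not NYPA's actual telemetry logic.

```python
# Illustrative sketch only: keep a rolling baseline per sensor and flag
# readings that deviate sharply from it. All parameters are assumptions.
from collections import deque

class SensorMonitor:
    """Maintain a rolling window per sensor and flag out-of-band readings."""
    def __init__(self, window=5, tolerance=0.20):
        self.window = window
        self.tolerance = tolerance  # allowed fraction of the rolling average
        self.history = {}

    def ingest(self, sensor_id, value):
        hist = self.history.setdefault(sensor_id, deque(maxlen=self.window))
        alert = False
        if len(hist) == hist.maxlen:  # only judge once a baseline exists
            avg = sum(hist) / len(hist)
            alert = abs(value - avg) > self.tolerance * avg
        hist.append(value)
        return alert

mon = SensorMonitor()
readings = [60.0, 60.1, 59.9, 60.0, 60.2, 95.0]  # last reading is anomalous
flags = [mon.ingest("substation-7/freq", r) for r in readings]
print(flags)  # only the final reading is flagged
```

A production system would of course stream millions of points and use far richer models; the point is simply that each reading is judged against a continuously updated baseline rather than a fixed rule.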
This is just going to increase, and we need to be prepared and integrate cyber into every part of what we do, from beginning to end of our processes. >> Yeah. And to both of you actually, as we see industry 4.0 develop even further, are you more concerned about malign actors developing more sophistication? What steps can we take to really be ahead of them? Let's start with General Alexander. >> So, I think the key differentiator in what the energy sector is doing is that the approach to cybersecurity is led by CEOs. So you bring CEOs like Gil Quiniones in, and you've got other CEOs that are actually bringing together forums to talk about cybersecurity. It is CEO led. That's the first part. And then the second part is how do we train and work together, that collective defense. How do we actually do this? I think that's another one that NYPA is leading with the Army Cyber Institute at West Point. How can we start to bring this training together and train to defend ourselves? This is an area where we can uplift our people that are working in this process, our cyber analysts, if you will, at the security operations center level. By training them, giving them hard tests, and continuing to go. That approach will uplift our cybersecurity and our cyber defense to the point where we can now stop these types of attacks. So I think: CEO led, bring in companies that give us the good and bad about our products, we'd like to hear the good, we need to hear the bad, and we need to improve on that, and then how do we train and work together. I think that's part of that solution to the future. >> And Gil, what are your thoughts as we embrace industry 4.0? Are you worried that these malign actors are going to build up their own sophistication and strategy in terms of data breaches and cyber attacks against our utility systems? What can we do to really step up our game?
>> Well, as the General said, the good thing with the energy sector is that on the foundational level, we're the only sector with mandatory regulatory requirements that we need to meet. We are regulated by the Federal Energy Regulatory Commission and the North American Electric Reliability Corporation to meet certain standards in cyber and critical infrastructure. But as the General said, the good thing with the utility is, by design, just like with storms, we're used to working with each other. So this is just an extension of that storm restoration and other areas where we work all the time together. So we are naturally working together when it comes to cyber. We work very closely with our federal government partners, the Department of Homeland Security, the Department of Energy and the National Labs. The National Labs have a lot of expertise. And with the private sector, with great companies like IronNet, we at NYPA stood up a center of excellence with private partners like IronNet and Siemens and others to start really advancing the art of the possible and the technology innovation in this area. And as the General mentioned, we partnered with West Point because, just like in any sport, actual exercises, red team, green team, doing that constantly, tabletop exercises, and having others try and breach your walls, those are good exercises to really be ready against the adversaries. >> Yeah. Terrific. Thank you so much for those insights. General Alexander, now I'd like to ask you this question. Can you share the innovation strategy as the world moves out of the pandemic? Are we seeing new threats, new realities? >> Well, I think it's not just coming out of the pandemic, but the pandemic actually brought a lot of people into video teleconferences like we are right here. So more people are working from home. You add in the 5G that Gil talked about, and that gives you a huge attack surface.
You're thinking now about, instead of a hundred devices per square kilometer, up to a million devices. And so you're increasing the attack surface. Everything is changing. So as we come out of the pandemic, people are going to work more from home. You're going to have this attack surface that's going on; it's growing, it's changing, it's challenging. We have to be really good now about how we train together, how we think about this new area, and we have to continue to innovate, not only on the cyber tools that we need for the IT side, the internet, but also the OT side, operational technology. So those kinds of issues are facing all of us, and it's a constantly changing environment. So that's where that education, that training, that communication, working between companies, the customers, the NYPAs and the IronNets and others, and then working with the government to make sure that we're all in sync, comes in. It's going to grow, and it's growing at an exponential rate. >> Terrific. Thank you for that. Now, Gil, same question for you. As a result of this pandemic, do you see any kind of new realities emerging? What is your position? >> Well, as the General said, most likely many companies will be having this hybrid setup. And for a company like mine, I'm thinking about, okay, how many employees do I have that can access our industrial controls in our power plants, in our substations, and transmission system remotely? And what will that mean from a risk perspective, but even on the IT side, our business information technology? You mentioned the Colonial Pipeline type situation. How do we now really make sure that the cyber hygiene of our employees is always up-to-date, and that we're always vigilant against potential entry, whether it's through phishing or other techniques that our adversaries are using? Those are the kinds of things that keep a CEO of a utility like myself up at night.
How come supply chain is such an issue? >> Well, the supply chain, of course, for a company like NYPA, you have hundreds or thousands of companies that you work with. Each of them has different ways of communicating with your company. And in those communications, you now get threats. If they get infected and they reach out to you, they're normally considered okay to talk to, but at the same time that threat could come in. So you have both suppliers that help you do your job, and the smaller companies that Gil has, he's got the 47 munis and four co-ops out there, 51, that he's got to deal with, and then all the state agencies. So his ecosystem has all these different companies that are part of his larger network. And when you think about that larger network, the issue becomes, how am I going to defend that? And I think, as Gil mentioned earlier, if we put them all together and we operate and train together and we defend together, then we know that we're doing the best we can, especially for those smaller companies, the munis and co-ops that don't have the people and security ops centers and other things to defend them. But working together, we can help defend them collectively. >> Terrific. And I'd also like to ask you a bit more on IronDefense. You spoke about its behavioral capabilities, its behavioral detection techniques, excuse me. How is it really different from the rest of the competitive landscape? What sets it apart from traditional cybersecurity tools? >> So traditional cybersecurity tools use what we call a signature-based system. Think of that as a barcode for the threat. It's a specific barcode. We use that barcode to identify the threat at the firewall or at the endpoint. Those are known threats. We can stop those and we do a really good job. We share those indicators of compromise in those barcodes, in the rules that we have, Suricata rules and others, and those go out. The issue becomes, what about the things we don't know about?
And to detect those, you need behavioral analytics. Behavioral analytics are a little bit noisier. So you want to collect all the data and anomalies with behavioral analytics, use an expert system to sort them out, and then use collective defense to share knowledge and actually look across those. And the great thing about behavioral analytics is you can detect all of the anomalies. You can share very quickly and you can operate at network speed. So that's going to be the future, where you start to share that, and that becomes the engine, if you will, for the future radar picture for cybersecurity. You add in, as we already have, machine learning and AI, artificial intelligence. People talk about that, but in this case, it's clustering algorithms across all those events and the ways of looking at them that allow you to up that speed, up your confidence in whether it's malicious, suspicious or benign, and share that. I think that is part of that future that we're talking about. You've got to have that, and the government can come in and say, you missed something. Here's something you should be concerned about. And upping the call from suspicious to malicious gives everybody in the nation and our allies the insight, okay, that's bad. Let's defend against it. >> Yeah. Terrific. Well, how does this type of technology address the President's May 2021 executive order on cybersecurity, since you mentioned the government? >> So there's two parts of that. And I think one of the things that I liked about the executive order is it talked about, in the first page, the public-private partnership. That's the key. We've got to partner together. And the other thing it went into that was really key is how do we now bring in the IT infrastructure, what our company does, with the OT companies like Dragos. How do we work together for the collective defense of the energy sector and other key parts? So I think it hit two key parts.
It also goes on about what you do about the supply chain for software, which we all need, but that's a little bit outside what we're talking about here today. The real key is how we work together between the public and private sector. And I think it did a good job in that area. >> Terrific. Well, thank you so much for your insights, and to you as well, Gil, really lovely to have you both on this program. That was General Keith Alexander, Founder and Co-CEO of IronNet Cybersecurity, as well as Gil Quiniones, the President and CEO of the New York Power Authority. That's all for this session of the 2021 AWS Global Public Sector Partner Awards. I'm your host for theCUBE, Natalie Erlich. Stay with us for more coverage. (bright music)
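General Alexander's signature-versus-behavioral distinction can be sketched in a few lines of code. This is a toy illustration with invented names and thresholds, not IronNet's implementation: a signature check only matches known indicators, while a behavioral check scores how far an observation deviates from a baseline and triages it as benign, suspicious, or malicious.

```python
from statistics import mean, stdev

# Hypothetical set of known-bad indicators -- the "barcodes" described above.
KNOWN_SIGNATURES = {"e99a18c428cb38d5", "ab4f63f9ac651525"}

def signature_match(indicator: str) -> bool:
    """Signature-based detection: catches only threats we have already seen."""
    return indicator in KNOWN_SIGNATURES

def behavioral_score(baseline: list, observed: float) -> float:
    """Behavioral detection: z-score of an observation against normal behavior.
    High scores flag anomalies even when no signature exists for them."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(observed - mu) / sigma if sigma else 0.0

def triage(score: float) -> str:
    """Sort noisy behavioral events into malicious / suspicious / benign."""
    if score > 6.0:
        return "malicious"
    if score > 3.0:
        return "suspicious"
    return "benign"

# A novel threat: its hash matches no signature, but its outbound traffic
# (in MB/hour) is wildly anomalous against this host's baseline.
baseline = [10.2, 9.8, 11.0, 10.5, 9.9]
print(signature_match("deadbeef00000000"))       # False -- unknown barcode
print(triage(behavioral_score(baseline, 95.0)))  # malicious -- huge anomaly
```

The behavioral path is noisier, as he notes: the thresholds separating benign from suspicious from malicious have to be tuned, which is exactly where the expert system and the shared "up the call" feedback come in.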

Published Date : Jun 30 2021


Maria Colgan & Gerald Venzl, Oracle | June CUBEconversation


 

(upbeat music) Developers have become the new kingmakers in the world of digital and cloud. The rise of containers and microservices has accelerated the transition to cloud native applications. A lot of people will talk about application architecture and the related paradigms and the benefits they bring for the process of writing and delivering new apps. But a major challenge continues to be the how and the what when it comes to accessing, processing and getting insights from the massive amounts of data that we have to deal with in today's world. And with me are two experts from the data management world who will share with us how they think about the best techniques and practices based on what they see at large organizations who are working with data and developing so-called data-driven apps. Please welcome Maria Colgan and Gerald Venzl, two distinguished product managers from Oracle. Folks, welcome, thanks so much for coming on. >> Thanks for having us Dave. >> Thank you very much for having us. >> Okay, Maria let's start with you. So, we throw around this term data-driven, data-driven applications. What are we really talking about there? >> So data-driven applications are applications that work on a diverse set of data. So anything from spatial to sensor data, document data, as well as your usual transaction processing data. And what they're going to do is they'll generate value from that data in very different ways than a traditional application. So for example, they may use machine learning to be able to do product recommendations in the middle of a transaction. Or we could use graph to be able to identify an influencer within the community so we can target them with a specific promotion. They could also use spatial data to be able to help find the nearest stores to a particular customer.
And because these apps are deployed on multiple platforms, everything from mobile devices as well as standard browsers, they need a data platform that's going to be secure, reliable and scalable. >> Well, so when you think about how the workloads are shifting, I mean, we're not talking about, you know, it's not anymore a world of just your ERP or your HCM or your CRM, you know, kind of the traditional operational systems. You really are seeing an explosion of these new data-oriented apps. You're seeing, you know, modeling in the cloud, you are going to see more and more inferencing, inferencing at the edge. But Maria, maybe you could talk a little bit about sort of the benefits that customers are seeing from developing these types of applications. I mean, why should people care about data-driven apps? >> Oh, for sure, there's massive benefits to them. I mean, probably the most obvious one for any business, regardless of the industry, is that they not only allow you to understand what your customers are up to, but they allow you to be able to anticipate those customers' needs. So that helps businesses maintain that competitive edge and retain their customers. But it also helps them make data-driven decisions in real time based on actual data, rather than on somebody's gut feeling or basing those decisions on historical data. So for example, you can do real-time price adjustments on products based on demand and so forth, that kind of thing. So it really changes the way people do business today.
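The real-time price adjustment Maria mentions can be sketched as a tiny rule. This is a hypothetical illustration; the formula, the caps, and the numbers are invented, not any customer's actual pricing model:

```python
def adjusted_price(base_price: float, demand: float, capacity: float) -> float:
    """Scale price with demand pressure, capped to a sane band.
    demand/capacity > 1.0 means more buyers than stock -- raise the price;
    well under 1.0 means weak demand -- discount it."""
    pressure = demand / capacity
    factor = min(1.5, max(0.8, pressure))  # never swing more than -20%/+50%
    return round(base_price * factor, 2)

print(adjusted_price(100.0, demand=30, capacity=20))  # high demand -> 150.0
print(adjusted_price(100.0, demand=10, capacity=20))  # weak demand -> 80.0
print(adjusted_price(100.0, demand=20, capacity=20))  # balanced    -> 100.0
```

In a real data-driven app the `demand` and `capacity` inputs would come from live transactional data rather than hardcoded values, which is the "actual data, not gut feeling" point being made above.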
>> Yeah, absolutely. I mean, naturally the whole industry is data-driven, right? It's like we all have information technologies for processing data and deriving information out of it. But when it comes to app development, I think there is a big push of, like, we have to do machine learning in our applications, we have to get insights from data. And when you actually take a step back, you see that there are of course many different kinds of applications out there as well, and that's not to be forgotten, right? So there are the usual front-end user interfaces where really all the application does is enter some piece of information that's stored somewhere, or perhaps a microservice that's not attached to a data tier at all but just receives or makes calls (indistinct). So I think it's not necessarily so important for every developer to kind of go on a bandwagon that they have to be data-driven. But I think it's equally important for those developers that build applications that drive the business, that make business critical decisions, as Maria mentioned before. Those guys should take a really close look into what data-driven apps mean and what the data tier can actually give to them. Because what we also see happening a lot is that a lot of the things that are well known and out there, just ready to use, are being reimplemented in the applications. And for those applications, they essentially just end up spending more time writing code that is already there, and then have to maintain and debug that code as well, rather than just going to market faster. >> Gerald, can you talk to the prevailing approaches that developers take to build data-driven applications? What are the ones that you see? Let's dig into that a little bit more and maybe differentiate the different approaches and talk about that? >> Yeah, absolutely.
I think right now the industry is like in two camps, it's like sort of a religious war going on, as you'll often see happening with different architectures and so forth. So we have single-purpose databases or data management technologies, which are technologies that are, as the name suggests, built around a single purpose. So, you know, a typical example would be your ordinary key-value store. And a key-value store, all it does is it allows you to store and retrieve a piece of data, whatever that may be, really, really fast, but it doesn't really go beyond that. And then the other side of the house, or the other camp, would be multimodal databases, multimodal data management technologies. Those are technologies that allow you to store different types of data, different formats of data, in the same technology, in the same system, alongside each other. And, you know, when you look at the technology landscape out there, pretty much any relational database, or any database really, has evolved into such a multimodal database. Whether that's MySQL, which allows you to store JSON alongside relational, or even a MongoDB, which gives you native graph support since (mumbles) as well, alongside the JSON support.
Why do you feel like that's a better approach? >> Yeah, I think on a high level it comes down to complexity. You are actually avoiding additional complexity, right? So not every use case that you have necessarily warrants yet another data management technology, or yet another specially built technology for managing that data, right? It's like many use cases that we see out there just want to store a JSON document, a piece of JSON, in a database and then perhaps retrieve it again afterwards, or write some simple queries over it. And you really don't have to get a new database technology or a NoSQL database into the mix if you already have one that can fulfill that exact use case. You could just happily store that information as well in the database you already have. And what it really comes down to is the learning curve for developers, right? So it's like, as you use the same technology to store other types of data, you don't have to learn a new technology, you don't have to familiarize yourself with and learn new drivers. You don't have to find new frameworks, and you don't have to know how to operate or best model your data for that database. You can essentially just reuse your knowledge of the technology, as well as the libraries and code you have already built in house, perhaps in another application, perhaps, you know, a framework that you used against the same technology, because it is still the same technology. So it kind of all comes down again to avoiding complexity rather than fragmenting across, you know, the many different technologies we have. If you were to look at the different data formats that are out there today, it's like, you know, you would end up with many different databases just to store them if you were to fully, religiously follow the single-purpose, best-built-technology-for-every-use-case paradigm, right?
And then you would just end up having to manage many different databases rather than actually focusing on your app and getting value to your business or to your user. >> Okay, so I get that, and I buy that by the way. I mean, especially if you're a larger organization and you've got all these projects going on. But before we go back to Maria, Gerald, I want to just, I want to push on that a little bit. Because the counter to that argument would be an analogy. And I wonder if you, I'd love for you to, you know, knock this analogy off the blocks. The counter would be, okay, Oracle is the Swiss Army knife and it's got, you know, all in one. But sometimes I need that specialized long screwdriver and I go into my toolbox and I grab that. It's better than the screwdriver in my Swiss Army knife. Why, are you the Swiss Army knife of databases? Or are you the all-in-one that has that best-of-breed screwdriver for me? How do you think about that? >> Yeah, that's a fantastic question, right? And I think, first of all, you have to separate between Oracle the company, which actually has multiple data management technologies and databases out there, as you said before, right? And Oracle Database. And I think Oracle Database is definitely a Swiss Army knife; it has gained many capabilities over the last 40 years. You know, we've seen object support coming, that's still in the Oracle Database today. We have seen XML coming, it's still in the Oracle Database, graph, spatial, et cetera. And so you have many different ways of managing your data, and then on top of that, going into the converged database, not only do we allow you to store the different data models in there, but we actually also allow you to apply all the security policies and so forth on top of it, and Maria can talk more about the mission around the converged database. I would also argue, though, that for some aspects we do actually have that screwdriver that you talked about as well.
So especially in the relational world, people get very quickly hung up on this idea that, oh, if you only do rows and columns, well, that's kind of what you put down on disk. And that was never true; the relational model is actually a logical model. What's actually being put down on disk is blocks that align themselves nicely with block storage, and it always has been. So that allows you to actually model and process the data sort of differently. And one good example that we have, that we introduced a couple of years ago, was when columnar databases were very strong and, you know, the competition came, it's like, yeah, we have In-Memory column stores now, they're so much better. And we were like, well, orienting the data row-based or column-based really doesn't matter in the sense that we store them as blocks on disk. And so we introduced the In-Memory technology, which gives you an In-Memory columnar representation of your data as well, alongside your relational. So there is an example where you go like, well, actually, you know, if you have this use case of columnar analytics all In-Memory, I would argue Oracle Database is also that screwdriver you want to go down to, and it gives you that capability. Because it not only gives you a columnar representation, but also, which many people then forget, all the analytic power on top of SQL. It's one thing to store your data columnar; it's a completely different story to actually be able to run analytics on top of that and have all the built-in functionality and the stuff that you want to do with the data on top of it as you analyze it. >> You know, that's a great example, the columnar, 'cause I remember there was like a lot of hype around it. Oh, it's the Oracle killer, you know, at Vertica. Vertica is still around but, you know, it never really hit escape velocity. But you know, good product, good company, whatever. Netezza, it kind of got buried inside of IBM.
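The row-versus-columnar point Gerald makes can be illustrated outside of any product. This toy sketch (not Oracle's In-Memory implementation) keeps one logical table in two physical layouts: rows for record-at-a-time access, and a columnar mirror that analytic scans can run over.

```python
# One logical table, two physical layouts (toy dual-format sketch).
rows = [  # row layout: good for fetching or updating a single record
    {"id": 1, "region": "east", "amount": 120.0},
    {"id": 2, "region": "west", "amount": 75.0},
    {"id": 3, "region": "east", "amount": 210.0},
]

# Columnar mirror of the same data: an analytic scan touches only the
# columns it needs, which is what makes column stores fast for analytics.
columns = {key: [row[key] for row in rows] for key in rows[0]}

def total_for(region: str) -> float:
    """Aggregate over the columnar mirror -- reads two columns, nothing else."""
    return sum(amount
               for reg, amount in zip(columns["region"], columns["amount"])
               if reg == region)

print(total_for("east"))  # 330.0
```

The design point is that both layouts represent the same logical relation, so which one the engine scans is an optimizer decision, not something the application has to care about.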
ParAccel kind of became, you know, Redshift with that deal, so that kind of went away. Teradata bought a company, I forget which company it bought but. So that hype kind of dissipated and now it's like, oh yeah, columnar. It's kind of like In-Memory, we've had In-Memory databases ever since we've had databases, you know, it's kind of a feature, not a sector. But anyway, Maria, let's come back to you. You've got a lot of customer experience. And you speak with a lot of companies, you know, during your time at Oracle. What else are you seeing in terms of the benefits of this approach that might not be so intuitive and obvious right away? >> I think one of the biggest benefits to having a multimodel, multiworkload or, as we call it, a converged database, is the fact that you can get greater data synergy from it. In other words, you can utilize all these different techniques and data models to get better value out of that data. So things like being able to do real-time machine learning, fraud detection inside a transaction, or being able to do a product recommendation by accessing three different data models. So for example, if I'm trying to recommend a product for you Dave, I might use graph analytics to be able to figure out your community. Not just your friends, but other people on our system who look and behave just like you. Once I know that community, then I can go over and see what products they bought by looking up our product catalog, which may be stored as JSON. And then on top of that, I can then see, using the key-value, what products inside that catalog those community members gave a five star rating to. So that way I can really pinpoint the right product for you. And I can do all of that in one transaction inside the database without having to transform that data into different models or, God forbid, access different systems to be able to get all of that information. So it really simplifies how we can generate that value from the data.
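Maria's three-model recommendation can be sketched in one system. SQLite stands in here for any converged database, and every table, name, and rating is invented for illustration: community edges play the graph role, the catalog holds JSON documents, and the ratings table acts as simple key-value data, all reachable from a single query.

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE community (member TEXT, peer TEXT);           -- graph-style edges
    CREATE TABLE catalog   (sku TEXT PRIMARY KEY, doc TEXT);   -- JSON documents
    CREATE TABLE ratings   (member TEXT, sku TEXT, stars INT); -- key-value style
""")
db.executemany("INSERT INTO community VALUES (?, ?)",
               [("dave", "ann"), ("dave", "bob")])
db.executemany("INSERT INTO catalog VALUES (?, ?)",
               [("A1", json.dumps({"name": "Headphones"})),
                ("B2", json.dumps({"name": "Keyboard"}))])
db.executemany("INSERT INTO ratings VALUES (?, ?, ?)",
               [("ann", "A1", 5), ("bob", "A1", 5), ("bob", "B2", 3)])

# One query over all three shapes of data: find products that Dave's
# community rated five stars, and pull the name out of the JSON catalog.
rec = db.execute("""
    SELECT DISTINCT json_extract(c.doc, '$.name')
    FROM community g
    JOIN ratings r ON r.member = g.peer AND r.stars = 5
    JOIN catalog c ON c.sku = r.sku
    WHERE g.member = 'dave'
""").fetchall()
print(rec)  # [('Headphones',)]
```

No transformation step and no second system is needed: the graph-shaped, document-shaped, and key-value-shaped data all live behind the same query engine, which is the data synergy being described.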
And of course, the other thing our customers love when it comes to deploying data-driven apps is that when you do it on a converged database it's much simpler, because it is that standard data platform. So you're not having to manage multiple independent single-purpose databases. You're not having to implement the security and the high availability policies, you know, across a bunch of different diverse platforms. All of that can be done much simpler with a converged database, 'cause the DBA team of course is going to just use that standard set of tools to manage, monitor and secure those systems. >> Thank you for that. And you know, it's interesting, you talk about simplification and you are in Juan's organization, so you have a big focus on mission critical. And so one of the things that I think is often overlooked, well, we talk about all the time, is recovery. And if things are simpler, recovery is faster and easier. And so it's kind of the hallmark of Oracle, is like the gold standard of the toughest apps, the most mission critical apps. But I wanted to get to the cloud, Maria. So because everything is going to the cloud, right? Not all workloads are going to the cloud but everybody is talking about the cloud. Everybody has a cloud-first mentality, and so yes, it's a hybrid world. But the natural next question is how do you think the cloud fits into this world of data-driven apps? >> I think just like any app that you're developing, the cloud helps to accelerate that development. And of course the deployment of these data-driven applications. 'Cause if you think about it, the developer is instantly able to provision a converged database that Oracle will automatically manage and look after for them. And what's great, if you use something like our autonomous database service, is that it comes in different flavors.
So you can get autonomous transaction processing, data warehousing or autonomous JSON, so that the developer is going to get a database that's been optimized for their specific use case, whatever they are trying to solve. And it's also going to contain all of that great functionality and those capabilities that we've been talking about. So what that really means to the developer, though, is that as the project evolves and inevitably the business needs change a little, there's no need to panic when one of those changes comes in, because your converged database or your autonomous database has all of those additional capabilities. So you can simply utilize those to be able to address those evolving changes in the project. 'Cause let's face it, none of us normally know exactly what we need to build right at the very beginning. And on top of that, they also kind of get a built-in buddy in the cloud, especially in the autonomous database. And that buddy comes in the form of built-in workload optimizations. So with the autonomous database we do things like automatic indexing, where we're using machine learning to be that buddy for the developer. So what it'll do is it'll monitor the workload and see what kind of queries are being run on that system. And then it will actually determine if there are indexes that should be built to help improve the performance of that application. And not only does it build those indexes, but it verifies that they help improve the performance before publishing them to the application. So by the time the developer is finished with that app and it's ready to be deployed, it's actually also been optimized by the developer's buddy, the Oracle autonomous database. So, you know, it's a really nice helping hand for developers when they're building any app, especially data-driven apps. >> I like how you sort of gave us, you know, the truth here, which is you don't always know where you're going when you're building an app.
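The automatic-indexing "buddy" Maria describes follows a loop: monitor the workload, build a candidate index, and keep it only if it verifiably helps. That loop can be sketched in miniature. This is an illustrative toy using SQLite's plan output as the verification step, not how Oracle's automatic indexing is actually implemented:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, kind TEXT)")
db.executemany("INSERT INTO events (kind) VALUES (?)",
               [("purchase" if i % 100 == 0 else "click",) for i in range(5000)])

# Step 1, "monitor the workload": this query keeps filtering on `kind`.
QUERY = "SELECT count(*) FROM events WHERE kind = 'purchase'"

def plan_uses(index_name: str) -> bool:
    """Ask the optimizer for its plan -- the stand-in for 'verify it helps'."""
    plan = db.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()
    return any(index_name in step[-1] for step in plan)

# Step 2: build a candidate index for the hot predicate.
db.execute("CREATE INDEX idx_events_kind ON events (kind)")

# Step 3: keep the index only if the optimizer actually picks it up.
if not plan_uses("idx_events_kind"):
    db.execute("DROP INDEX idx_events_kind")

print(plan_uses("idx_events_kind"))  # True -- the candidate pays off, so it stays
```

A production system would of course verify with measured performance against real workload history rather than a single plan check; the point is the monitor-build-verify loop itself.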
It goes from 'build it and they will come' to 'start building it and we'll figure out where it's going to go.' With Agile, that's kind of how it works. But so I wonder, can you give some examples of maybe customers, or maybe genericize them if you need to? Data-driven apps in the cloud where customers were able to drive more efficiency, where the cloud buddy allowed the customers to do more with less? >> No, we have tons of these, but I'll try and keep it to just a couple. One that comes to mind straight away is retrace. These folks built a blockchain app in the Oracle Cloud that allows manufacturers to actually share the supply chain with the consumer. So the consumer can see exactly: who made their product? Using what raw materials? Where were they sourced from? How was it done? All of that is visible to the consumer. And in order to be able to share that, they had to work on a very diverse set of data. So they had everything from JSON documents to images, as well as your traditional transactions in there. And they store all of that information inside the Oracle autonomous database, where they were able to build their app and deploy it on the cloud. And they were able to do all of that very, very quickly. So, you know, that ability to work on multiple different data types in a single database really helped them build that product and get it to market in a very short amount of time. Another customer that's doing something really, really interesting is MindSense. So these guys operate the largest mines in Canada, Chile, and Peru. What they do is they put these x-ray devices on the massive mechanical shovels that are at the cove or at the mine face. And what that does is it senses the contents of the buckets inside these mining machines, and it looks at that content to see how it can optimize the processing of the ore inside that bucket.
So they're looking to minimize the amount of power and water that it's going to take to process that, and also, of course, minimize the amount of waste that's going to come out of that project. So all of that sensor data is sent into an autonomous database where it's going to be processed by a whole host of different users. So everything from the mine engineers to the geoscientists, to even their own data scientists, utilize that data to drive their business forward. And what I love about these guys is they're not happy with building just one app. MindSense actually uses our built-in low-code development environment, APEX, that comes as part of the autonomous database, and they actually produce applications constantly for different aspects of their business using that technology. And they're actually able to accelerate those new apps to the business. It takes them now just a couple of days or weeks to produce an app, instead of months or years to build those new apps. >> Great, thank you for that Maria. Gerald, I'm going to push you again. So, I said upfront and talked about microservices and the cloud and containers, and you know, anybody in the developer space follows that very closely. But some of the things that we've been talking about here, people might look at that and say, well, they're kind of antithetical to microservices. This is, you know, Oracle's monolithic approach. But when you think about the benefits of microservices, people want freedom of choice, technology choice, seen as a big advantage of microservices and containers. How do you address such an argument? >> Yeah, that's an excellent question, and I get that quite often. The microservices architecture in general, as with other architectures, Linux distributions, et cetera, it's kind of always a bit like there's an academic approach and there's a pragmatic approach. And when you look at microservices, the original definitions came out in the early 2010s.
They actually never said that each microservice has to have a database. And they also never said that if a microservice has a database, you have to use a different technology for each microservice. Just like they never said you have to write each microservice in a different programming language, right? So where I'm going with this is, yes, you know, sometimes when you look at some vendors out there, some niche players, they push this message, or they jump on this academic approach, of like, each microservice should use the best tool at hand, or use a different database for each purpose, et cetera. Which often comes across as, you know, wanting to stay part of the conversation. Nothing stops a developer from, you know, using a multimodal database for the microservice and just using that as a document store, right? Or just using that as a relational database. And, you know, sometimes, I mean, there was actually something really interesting that happened yesterday, I don't know whether you follow it Dave or not. But Facebook had an outage yesterday, right? And Facebook is one of those companies that are seen as the Silicon Valley, you know, know-how-to-do-microservices companies. And when you read through the outage, well, what happened, right? Some unfortunate logical error with a configuration change took a database cluster down. So, you know, there you have it, where you go like, well, maybe not every microservice is actually in fact talking to its own database or its own special-purpose database. I think what the industry should be focusing on, much more than this argument of which technology to use, what's the right tool for a job, is to ask themselves, what business problem actually are we trying to solve? And therefore, what's the right approach and the right technology for this? And so therefore, just as I said before, you know, multimodal databases, they do have strong benefits.
They have many built-in functionalities that are already there, and they allow you to reduce this complexity of having to know many different technologies, right? And it's not only about storing different data models, you know, treating a multimodal database as a JSON document store or a relational database, and most databases have been multimodal for 20-plus years. It's also that if you store that data together, you can perhaps derive additional value from it for somebody else, though perhaps not for your application. For example, if you were to use Oracle Database, you can actually write queries on top of all of that data. It doesn't really matter to our query engine whether the data is formatted in JSON or the data is formatted in rows and columns, you can just query over it. And that's actually very powerful for those guys that have to, you know, get the reporting done at the end of the day or the end of the week. And for those guys that are the data scientists, that want to figure out, you know, which product performed really well, or can we tweak something here and there. When you look into that space you still see a huge divergence between the guys that put data in, in kind of the OLTP style, and the guys that try to derive new insights. And there's still a lot of ETL going around, and, you know, we have big data technologies, some of them came and went and some that came in are still around, like Apache Spark, which is still like a SQL engine on top of any of your data, kind of going back to the same concept. And so I would say that, you know, for developers, when we look at microservices, it's like: first of all, is the argument you were making because the vendor or the technology you want to use tells you this argument, or, you know, you kind of want to have an argument to use a specific technology?
Or is it really more because it is the best technology to use for this given use case, for this given application that you have? And if so, there's of course also nothing wrong with using a single-purpose technology, right? >> Yeah, I mean, whenever I talk about Oracle I always come back to the most important applications, the mission critical. It's very difficult to architect databases with microservices and containers. You have to be really, really careful. And again, it comes back to what we were talking about before with Maria, the complexity and the recovery. But Gerald, I want to stay with you for a minute. So there's other data management technologies popping up out there. I mean, I've seen some people saying, okay, just leave the data in an S3 bucket, we can query that, then we've got some magic sauce to do that. And so why are you optimistic about, you know, traditional database technology going forward? >> I would say because of the history of databases. So one thing that struck me when I came to Oracle, and then got to meet great people like Juan Loaiza and Andy Mendelsohn who have been here for a long, long time, is I came to the realization that relational databases have been around for about 45 years now. And, you know, I was like, I'm too young to have been around then, right? So I was like, what else has been around 45 years? Like the tech stack that we have today, what does that look like? Well, Linux only came out in '93. Databases pre-date Linux by a lot. And as I started digging I saw a lot of technologies come and go, right? And you mentioned before the data management systems that we had that came and went, like the columnar databases, or XML databases, object databases. And even before relational databases, before Codd gave us the relational model, there were these network stores, network databases, which to some extent look very similar to JSON documents.
They were a way of storing data in a hierarchical format. And, you know, when you then start actually reading the Codd paper and diving a little bit more into the relational model, there's I think one important crux in there that most of the industry keeps forgetting, or hasn't been around to even know. And that is that when Codd created the relational model, he actually focused not so much on the application putting the data in, but on future users and applications still being able to make sense of the data, right? And like I said before, we had those network models, we had XML databases, you have JSON document stores. And the one thing that they all have in common is that the application that puts the data in decides the structure of the data. And that's all well and good while you have the application and the developer writing the application around. It can become really tricky when, 10 years later, you still want to look at that data and the application and the developer are no longer around. Then you go like, what does this all mean? Where is the structure defined? What is this attribute? What does it mean? How does it correlate to others? And the one thing that people tend to forget is that it's actually the data that's here to stay, not the applications. Ideally, every company wants to store every single byte of data that they have, because there might be future value in it. Economically it may not always make sense, but it's now much more feasible than just years ago. And if you could, why wouldn't you want to store all your data, right? And sometimes you actually have to store the data for seven years or whatever because the laws require you to. And so coming back then, you know, 10 years from now, looking at the data and making sense of that data can actually become a lot more difficult and a lot more challenging than having first figured out how to store this data for general use.
And that was what the relational model was all about: we decompose the data structures into tables and columns with relationships between each other. So that if somebody wants to, you know, a typical example would be, well, you store some purchases from your web store, right? There's a customer attribute in it, there's some credit card payment information in it, and some product information on what the customer bought. Well, in the relational model, if you just want to figure out which products were sold on a given day or week, you just query the payment and products tables to get that answer. You don't need to touch the customer data at all. And with the hierarchical model, you have to first sit down and understand the structure: what is the customer, where is the payment? You know, does the document start with the payment or does it start with the customer? Where do I find this information? And in the very early days those databases even struggled to avoid scanning all the documents to get the data out. So coming back to your question a bit, and I apologize for going on here, but, you know, relational databases have been around for 45 years. I actually argue it's one of the most successful software technologies that we have out there when you look at the overall industry, right? In IT terms, 45 years is like going from a star being born to one going supernova. As I said before, many technologies came and went, right? And a really interesting additional example, by the way, is Hadoop and HDFS, right? They kind of gave us this promise in the 2010s, like 2012, 2013, the hype of Hadoop and so forth and (mumbles) and HDFS. And people were just like, just put everything into HDFS and worry about the data later, right? We can query it and map-reduce it and whatever.
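The web-store decomposition Gerald describes can be made concrete. The schema below is a hypothetical sketch of his example, again with SQLite standing in for any relational database: the "which products were sold on a given day" report joins only the payments and products tables and never touches customer data:

```python
import sqlite3

# Codd-style decomposition of the web-store purchase example: customer,
# payment, and product details live in separate, related tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE products  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE payments  (id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(id),
                            product_id  INTEGER REFERENCES products(id),
                            paid_on TEXT);
""")
conn.execute("INSERT INTO customers VALUES (1, 'Ada')")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
conn.executemany("INSERT INTO payments VALUES (?, ?, ?, ?)",
                 [(1, 1, 1, "2021-06-24"), (2, 1, 2, "2021-06-25")])

# "Which products were sold on a given day?" touches only payments/products;
# the customer table is irrelevant to this future, unanticipated query.
sold = [row[0] for row in conn.execute(
    "SELECT p.name FROM payments pay "
    "JOIN products p ON p.id = pay.product_id "
    "WHERE pay.paid_on = '2021-06-24'")]
print(sold)
```

In a hierarchical or document layout, the same question would require knowing how each purchase document nests its payment and product fields before any query could be written.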
And we had customers actually coming to us, and they were like, great, we have half a petabyte of data on an HDFS cluster and we have no clue what's stored in there. How do we figure this out? What are we going to do now? Now you had a big data cleansing problem. And so I think that is why databases, and also data modeling, are something that will not go away anytime soon. And I think databases and database technologies are here to stay for quite a while. Because many of those people don't think about what's happening to the data five years from now. And many of the niche players, and frankly even Amazon, you know, following this single-purpose thing, it's like, just use the right tool for the job for your application, right? Just put the data in there the way you want. And it's like, okay, so you use technologies all over the place, and then five years from now you have your data fragmented everywhere, in different formats, with, you know, inconsistencies, and so on. And for those data-driven, business-critical decision applications, that's the worst-case scenario you can have, right? Because now you need an army of people to actually do data cleansing. And it's not a coincidence that data science has become very, very popular in recent years as we went on with this proliferation of different database or data management technologies, some of which are not even databases. But I'll leave it at that. >> It's an interesting talk track because you're right. I mean, no schema on write was alluring, but it definitely created some problems. It also created, you know, you referenced the hyper-specialized roles, and the data cleansing component. I mean, maybe technology will eventually solve that problem, but it hasn't, at least up to now. Okay, last question. Maria, maybe you could start off, and Gerald if you want to chime in as well, that'd be great.
I mean, it's interesting to watch this industry. Oracle sort of won the top database mantle. I mean, I watched it, I saw it. Remember, it was Informix and it was (indistinct) too, and of course Microsoft, you've got to give them credit with SQL Server, but Oracle won the database wars. And then everything got kind of quiet for a while, database was sort of boring. And then it exploded, you know, all the, you know, NoSQL and the key-value stores and the cloud databases, and this is really a hot area now. And when we looked at Oracle we said, okay, Oracle, it's all about Oracle Database. But we've seen the kind of resurgence in MySQL, which everybody thought, you know, once Oracle bought Sun they were going to kill MySQL. But now we see you investing in HeatWave, TimesTen, we talked about In-Memory databases before. So where do those fit in, Maria, in the grand scheme? How should we think about Oracle's database portfolio? >> So there's lots of places where you'd use those different things. 'Cause just like any other industry, there are going to be new and boutique use cases that are going to benefit from a more specialized product or single-purpose product. So good examples off the top of my head of the kind of systems that would benefit from that would be things like a stock exchange system or a telephone exchange system. Both of those are latency-critical transaction processing applications where they need microsecond response times. And that's going to exceed perhaps what you might normally get or deploy with a converged database. And so Oracle's TimesTen database, our In-Memory database, is perfect for those kinds of applications. But there's also a host of MySQL applications out there today, and you said it yourself there, Dave, HeatWave is a great place to provision and deploy those kinds of applications because it's going to run 100 times faster than AWS (mumbles).
So, you know, there really is a place in the market, and in our customers' systems and the needs they have, for all of these different members of our database family here at Oracle. >> Yeah, well, the internet is basically running on the LAMP stack, so I don't see MySQL going away. All right Gerald, we'll give you the final word, bring us home. >> Oh, thank you very much. Yeah, I mean, as Maria said, I think it comes back to what we discussed before. There are obviously still needs for special technologies, or different technologies than a relational database or multimodal database. Oracle actually has many more databases than people may first think of, not only the three that we have already mentioned, but also, for example, Oracle's NoSQL database. And, you know, on a high level, Oracle is a data management company, right? And we want to give our customers the best tools and the best technology to manage all of their data. And therefore there has to be, or there should be, a part of the business that also focuses on these highly specialized systems and these highly specialized technologies that address those use cases. And I think it makes perfect sense. It's like, you know, when a customer comes to Oracle, they're not only getting this "take this one product, and if you don't like it, your problem," but they actually have choice, right? And choice allows you to make a decision based on what's best for you, and not necessarily best for the vendor you're talking to. >> Well guys, really appreciate your time today and your insights. Maria, Gerald, thanks so much for coming on The Cube. >> Thank you very much for having us. >> And thanks for watching this Cube conversation. This is Dave Vellante, and we'll see you next time. (upbeat music)

Published Date : Jun 24 2021

Maria Colgan & Gerald Venzl, Oracle | June CUBEconversation


 

(upbeat music) >> It'll be five, four, three, and then silent two, one, and then you guys just follow my lead. We're just making some last minute adjustments. Like I said, we're down two hands today. So, you good Alex? Okay, are you guys ready? >> I'm ready. >> Ready. >> I've got to get one note here. >> So I noticed, Maria, you stopped anyway, so I have time. Just so they know, Dave and the Boston Studio, are they both kind of concurrently on film even when they're not speaking, or will only the speaker be on film, like if Gerald's drawing while Maria is talking? >> Sorry, but then I missed one part of my onboarding spiel. If you go into gallery there should be a label, there should be something labeled Boston live switch feed. If you pin that gallery view you'll see what our program currently being recorded is. So any time you don't see yourself on that feed is an excellent time to take a drink of water, scratch your nose, check your notes. Do whatever you've got to do off screen. >> Can you give us a three shot, Alex? >> Yes, there it is. >> And then go to me, just give me a one-shot to Dave. So when I'm here you guys can take a drink or whatever. >> That makes sense? >> Yeah. >> Excellent, I will get my recordings restarted and we'll open up when Dave's ready. >> All right, you guys ready? >> Ready. >> All right Steve, you go on mute. >> Okay, on me in 5, 4, 3. Developers have become the new kingmakers in the world of digital and cloud. The rise of containers and microservices has accelerated the transition to cloud native applications. A lot of people will talk about application architecture and the related paradigms and the benefits they bring for the process of writing and delivering new apps. But a major challenge continues to be the how and the what when it comes to accessing, processing and getting insights from the massive amounts of data that we have to deal with in today's world.
And with me are two experts from the data management world who will share with us how they think about the best techniques and practices, based on what they see at large organizations who are working with data and developing so-called data-driven apps. Please welcome Maria Colgan and Gerald Venzl, two distinguished product managers from Oracle. Folks, welcome, thanks so much for coming on. >> Thanks for having us Dave. >> Thank you very much for having us. >> Okay, Maria let's start with you. So, we throw around this term data-driven, data-driven applications. What are we really talking about there? >> So data-driven applications are applications that work on a diverse set of data. So anything from spatial to sensor data, document data, as well as your usual transaction processing data. And what they're going to do is generate value from that data in very different ways to a traditional application. So for example, they may use machine learning to be able to do product recommendations in the middle of a transaction. Or we could use graph to be able to identify an influencer within the community so we can target them with a specific promotion. It could also use spatial data to be able to help find the nearest stores to a particular customer. And because these apps are deployed on multiple platforms, everything from mobile devices as well as standard browsers, they need a data platform that's going to be secure, reliable and scalable. >> Well, so when you think about how the workloads are shifting, I mean, it's no longer a world of just your ERP or your HCM or your CRM, you know, kind of the traditional operational systems. You really are seeing an explosion of these new data-oriented apps. You're seeing, you know, modeling in the cloud, you are going to see more and more inferencing, inferencing at the edge.
But Maria, maybe you could talk a little bit about sort of the benefits that customers are seeing from developing these types of applications. I mean, why should people care about data-driven apps? >> Oh, for sure, there's massive benefits to them. I mean, probably the most obvious one for any business, regardless of the industry, is that they not only allow you to understand what your customers are up to, but they allow you to anticipate those customers' needs. So that helps businesses maintain that competitive edge and retain their customers. But it also helps them make data-driven decisions in real time based on actual data, rather than on somebody's gut feeling or basing those decisions on historical data. So for example, you can do real-time price adjustments on products based on demand and so forth, that kind of thing. So it really changes the way people do business today. >> So Gerald, you think about the narrative in the industry: everybody wants to be a platform player, all your customers are becoming software companies, they are becoming platform players. Everybody wants to be like, you know, name a company with a huge trillion dollar market cap or whatever, and those are data-driven companies. And so it would seem to me that, with data-driven applications, no company really shouldn't be data-driven. Do you buy that? >> Yeah, absolutely. I mean, naturally the whole industry is data-driven, right? It's like, we all have information technologies for processing data and deriving information out of it. But when it comes to app development I think there is a big push to, kind of like, we have to do machine learning in our applications, we have to get insights from data. And when you actually look back a bit and take a step back, you see that there are of course many different kinds of applications out there as well, and that's not to be forgotten, right?
So there are the usual front-end user interfaces, where really all the application does is enter some piece of information that's stored somewhere, or perhaps a microservice that's not attached to a database at all but just receives or asks calls (indistinct). So I think it's not necessarily so important for every developer to kind of jump on a bandwagon that they have to be data-driven. But I think it's equally important for those applications and those developers that build applications that drive the business, that make business-critical decisions, as Maria mentioned before. Those guys should take a really close look into what data-driven apps mean and what the database can actually give to them. Because what we also see happening a lot is that many of the things that are well known and out there, just ready to use, are being reimplemented in the applications. And for those applications, they essentially just end up spending more time writing code that is already there, and then have to maintain and debug that code as well, rather than just going to market faster. >> Gerald, can you talk to the prevailing approaches that developers take to build data-driven applications? What are the ones that you see? Let's dig into that a little bit more and maybe differentiate the different approaches and talk about that. >> Yeah, absolutely. I think right now the industry is in two camps, it's like sort of a religious war going on, as you'll often see happening with different architectures and so forth. So we have single-purpose databases or data management technologies, which are technologies that are, as the name suggests, built around a single purpose. So, you know, a typical example would be your ordinary key-value store. And a key-value store, all it does is allow you to store and retrieve a piece of data, whatever that may be, really, really fast, but it doesn't really go beyond that.
And then the other side of the house, or the other camp, would be multimodal databases, multimodal data management technologies. Those are technologies that allow you to store different types of data, different formats of data, in the same technology, in the same system, alongside each other. And, you know, when you look at the landscape of technologies out there, pretty much any relational database, or any database really, has evolved into such a multimodal database. Whether that's MySQL, which allows you to store JSON alongside relational, or even a MongoDB, which gives you native graph support since (mumbles), alongside the JSON support. >> Well, it's clearly a trend in the industry. We've talked about this a lot in The Cube. We know where Oracle stands on this. I mean, you just mentioned MySQL, but I mean, Oracle Database you've been extending, you've mentioned JSON, we've got blockchain now in there, you're infusing, you know, ML and AI into the database, graph database capabilities, you know, on and on and on. We've talked a lot about that and compared it to Amazon, which is kind of the right tool for the right job approach. So maybe you could talk about, you know, your point of view, the benefits for developers of using that converged database, if I can use that word, approach, being able to store multiple data formats? Why do you feel like that's a better approach? >> Yeah, I think on a high level it comes down to complexity. You are actually avoiding additional complexity, right? So not every use case that you have necessarily warrants yet another data management technology, or yet another specially built technology for managing that data, right? Many use cases that we see out there just want to store a piece of JSON, a document, in a database, and then perhaps retrieve it again afterwards, or write some simple queries over it.
And you really don't have to get a new database technology or a NoSQL database into the mix if you already have one that can fulfill that exact use case. You could just happily store that information in the database you already have. And what it really comes down to is the learning curve for developers, right? As you use the same technology to store other types of data, you don't have to learn a new technology, you don't have to learn new drivers. You don't have to find new frameworks, and you don't have to know how to operate, or how to best model your data for, that new database. You can essentially just reuse your knowledge of the technology, as well as the libraries and code you have already built in house, perhaps in another application, perhaps, you know, a framework that you used against the same technology, because it is still the same technology. So it all comes down again to avoiding complexity and not fragmenting across, you know, the many different technologies we have. If you were to look at the different data formats that are out there today, you would end up with many different databases just to store them if you were to fully, religiously follow the single-purpose, best-built technology for every use case paradigm, right? And then you would just end up having to manage many different databases, rather than actually focusing on your app and getting value to your business or to your user. >> Okay, so I get that, and I buy that by the way. I mean, especially if you're a larger organization and you've got all these projects going on. But before we go back to Maria, Gerald, I want to push on that a little bit. Because the counter to that argument would be an analogy. And I'd love for you to, you know, knock this analogy off the blocks. The counter would be, okay, Oracle is the Swiss Army knife and it's got, you know, all in one.
But sometimes I need that specialized long screwdriver, and I go into my toolbox and I grab that. It's better than the screwdriver in my Swiss Army knife. Are you the Swiss Army knife of databases? Or do you have that best-of-breed screwdriver for me? How do you think about that? >> Yeah, that's a fantastic question, right? And I think, first of all, you have to separate between Oracle the company, which actually has multiple data management technologies and databases out there, as you said before, right? And Oracle Database. And I think Oracle Database is definitely a Swiss Army knife; it has gained many capabilities over the last 40 years. You know, we've seen object support come along, that's still in the Oracle Database today. We have seen XML coming, it's still in the Oracle Database, graph, spatial, et cetera. So you have many different ways of managing your data, and then on top of that, going into the converged database, not only do we allow you to store the different data models in there, but we also allow you to apply all the security policies and so forth on top of it. Maria can talk more about the mission around the converged database. I would also argue, though, that for some aspects we actually do have that screwdriver you talked about as well. So especially in the relational world, people very quickly get hung up on this idea that, oh, if you only do rows and columns, well, that's kind of what you put down on disk. And that was never true; the relational model is actually a logical model. What's being put down on disk is blocks that align themselves nicely with block storage, and it always has been. So that allows you to actually model and process the data sort of differently.
And one good example of that, which we introduced a couple of years ago, was when columnar databases were very strong, and, you know, the competition came and it was like, yeah, we have In-Memory column stores now, they're so much better. And we were like, well, orienting the data row-based or column-based really doesn't matter, in the sense that we store them as blocks on disk. And so we introduced the In-Memory technology, which gives you an In-Memory columnar representation of your data alongside your relational one. So there is an example where you go, well, actually, you know, if you have this use case of columnar analytics all In-Memory, I would argue Oracle Database is also that screwdriver you want to go down to, and gives you that capability. Because it not only gives you a columnar representation, but also, which many people then forget, all the analytic power on top of SQL. It's one thing to store your data columnar; it's a completely different story to actually be able to run analytics on top of that, and to have all the built-in functionality and stuff that you want to do with the data on top of it as you analyze it. >> You know, that's a great example, the columnar, 'cause I remember there was like a lot of hype around it. Oh, it's the Oracle killer, you know, Vertica. Vertica is still around but, you know, it never really hit escape velocity. But, you know, good product, good company, whatever. Netezza, it kind of got buried inside of IBM. ParAccel kind of became, you know, Redshift with that deal, so that kind of went away. Teradata bought a company, I forget which company it bought but. So that hype kind of dissipated and now it's like, oh yeah, columnar. It's kind of like In-Memory, we've had In-Memory databases ever since we've had databases; you know, it's kind of a feature, not a sector. But anyway, Maria, let's come back to you. You've got a lot of customer experience.
And you speak with a lot of companies, you know, during your time at Oracle. What else are you seeing in terms of the benefits to this approach that might not be so intuitive and obvious right away? >> I think one of the biggest benefits to having a multimodel, multiworkload, or as we call it, converged database, is the fact that you can get greater data synergy from it. In other words, you can utilize all these different techniques and data models to get better value out of that data. So things like being able to do real-time machine learning fraud detection inside a transaction, or being able to do a product recommendation by accessing three different data models. So for example, if I'm trying to recommend a product for you, Dave, I might use graph analytics to be able to figure out your community. Not just your friends, but other people on our system who look and behave just like you. Once I know that community, then I can go over and see what products they bought by looking up our product catalog, which may be stored as JSON. And then on top of that, I can then see, using the key-value data, what products inside that catalog those community members gave a five-star rating to. So that way I can really pinpoint the right product for you. And I can do all of that in one transaction inside the database, without having to transform that data into different models or, God forbid, access different systems to be able to get all of that information. So it really simplifies how we can generate that value from the data. And of course, the other thing our customers love is that when it comes to deploying data-driven apps, doing it on a converged database is much simpler, because it is that standard data platform. So you're not having to manage multiple independent single-purpose databases. You're not having to implement the security and the high availability policies, you know, across a bunch of different diverse platforms.
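The three-step recommendation flow Maria walks through can be sketched with plain Python stand-ins for each data model. All names and data here are hypothetical; in a converged database the graph, document, and key-value lookups would run as queries inside one transaction rather than over separate in-memory structures:

```python
# Stand-ins for the three data models in the recommendation example.

# 1. Graph: people who look and behave like the target user (their community).
community = {"dave": {"alice", "bob"}}

# 2. JSON document store: the product catalog.
catalog = [
    {"sku": "A1", "name": "noise-cancelling headphones"},
    {"sku": "B2", "name": "usb hub"},
]

# 3. Key-value store: (user, sku) -> star rating.
ratings = {("alice", "A1"): 5, ("bob", "A1"): 5, ("alice", "B2"): 3}

def recommend(user: str) -> list[str]:
    """Products that the user's community gave a five-star rating to."""
    peers = community.get(user, set())
    five_star_skus = {sku for (who, sku), stars in ratings.items()
                      if who in peers and stars == 5}
    return [p["name"] for p in catalog if p["sku"] in five_star_skus]

print(recommend("dave"))
```

The point of the converged approach is that these three lookups need no ETL between systems: the community query, the catalog lookup, and the ratings scan all see the same consistent data.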
All of that can be done much simpler with a converged database, 'cause the DBA team, of course, is going to just use that standard set of tools to manage, monitor and secure those systems. >> Thank you for that. And you know, it's interesting, you talk about simplification, and you are in Juan's organization, so you've got a big focus on mission critical. And so one of the things that I think is often overlooked, but we talk about all the time, is recovery. And if things are simpler, recovery is faster and easier. And so it's kind of the hallmark of Oracle, it's like the gold standard for the toughest apps, the most mission critical apps. But I wanted to get to the cloud, Maria. Because everything is going to the cloud, right? Not all workloads are going to the cloud, but everybody is talking about the cloud. Everybody has a cloud first mentality, and so yes, it's a hybrid world. But the natural next question is, how do you think the cloud fits into this world of data-driven apps? >> I think, just like any app that you're developing, the cloud helps to accelerate that development, and of course the deployment, of these data-driven applications. 'Cause if you think about it, the developer is instantly able to provision a converged database that Oracle will automatically manage and look after for them. But what's great about doing something like that, if you use, like, our autonomous database service, is that it comes in different flavors. So you can get autonomous transaction processing, data warehousing or autonomous JSON, so that the developer is going to get a database that's been optimized for their specific use case, whatever they are trying to solve. And it's also going to contain all of that great functionality and capabilities that we've been talking about.
So what that really means to the developer, though, is as the project evolves and inevitably the business needs change a little, there's no need to panic when one of those changes comes in, because your converged database or your autonomous database has all of those additional capabilities. So you can simply utilize those to be able to address those evolving changes in the project. 'Cause let's face it, none of us normally know exactly what we need to build right at the very beginning. And on top of that, they also kind of get a built-in buddy in the cloud, especially in the autonomous database. And that buddy comes in the form of built-in workload optimizations. So with the autonomous database we do things like automatic indexing, where we're using machine learning to be that buddy for the developer. So what it'll do is it'll monitor the workload and see what kind of queries are being run on that system. And then it will actually determine if there are indexes that should be built to help improve the performance of that application. And not only does it build those indexes, but it verifies that they help improve the performance before publishing them to the application. So by the time the developer is finished with that app and it's ready to be deployed, it's actually also been optimized by the developer's buddy, the Oracle autonomous database. So, you know, it's a really nice helping hand for developers when they're building any app, especially data-driven apps. >> I like how you sort of gave us, you know, the truth here, which is you don't always know where you're going when you're building an app. It's like it goes from "build it and they will come" to "start building it and we'll figure out where it's going to go." With Agile, that's kind of how it works. But so I wonder, can you give some examples of maybe customers, or maybe genericize them if you need to.
Data-driven apps in the cloud where customers were able to drive more efficiency, where the cloud buddy allowed the customers to do more with less? >> No, we have tons of these, but I'll try and keep it to just a couple. One that comes to mind straight away is Retraced. These folks built a blockchain app in the Oracle Cloud that allows manufacturers to actually share the supply chain with the consumer. So the consumer can see exactly: who made their product? Using what raw materials? Where were they sourced from? How was it done? All of that is visible to the consumer. And in order to be able to share that, they had to work on a very diverse set of data. So they had everything from JSON documents to images, as well as your traditional transactions in there. And they stored all of that information inside the Oracle autonomous database, they were able to build their app and deploy it on the cloud, and they were able to do all of that very, very quickly. So, you know, that ability to work on multiple different data types in a single database really helped them build that product and get it to market in a very short amount of time. Another customer that's doing something really, really interesting is MineSense. So these guys operate in the largest mines in Canada, Chile, and Peru. What they do is they put these x-ray devices on the massive mechanical shovels that are at the mine face. And what that does is it senses the contents of the buckets inside these mining machines, and it's looking at that content to see how it can optimize the processing of the ore inside that bucket. So they're looking to minimize the amount of power and water that it's going to take to process that, and also, of course, minimize the amount of waste that's going to come out of that project. So all of that sensor data is sent into an autonomous database where it's going to be processed by a whole host of different users.
So everything from the mine engineers to the geoscientists, to even their own data scientists, utilize that data to drive their business forward. And what I love about these guys is they're not happy with building just one app. MineSense actually use our built-in low-code development environment, APEX, that comes as part of the autonomous database, and they actually produce applications constantly for different aspects of their business using that technology. And it's actually able to accelerate those new apps to the business. It takes them now just a couple of days or weeks to produce an app, instead of months or years to build those new apps. >> Great, thank you for that, Maria. Gerald, I'm going to push you again. So, I said upfront and talked about microservices and the cloud and containers, and you know, anybody in the developer space follows that very closely. But some of the things that we've been talking about here, people might look at that and say, well, they're kind of antithetical to microservices. This is Oracle's monolithic approach. But when you think about the benefits of microservices, people want freedom of choice, technology choice, seen as a big advantage of microservices and containers. How do you address such an argument? >> Yeah, that's an excellent question, and I get that quite often. The microservices architecture in general, as I said before with architectures, Linux distributions, et cetera, it's kind of always a bit like there's an academic approach and there's a pragmatic approach. And when you look at microservices, the original definitions that came out in the early 2010s, they actually never said that each microservice has to have a database. And they also never said that if a microservice has a database, you have to use a different technology for each microservice. Just like they never said you have to write a microservice in a different programming language, right?
So where I'm going with this is like, yes, you know, sometimes when you look at some vendors out there, some niche players, they push this message, or they jump on this academic approach, of like each microservice has the best tool at hand, or use a different database for your purpose, et cetera. Which almost always comes across like, you know, we want to stay part of the conversation. Nothing stops a developer from, you know, using a multi-model database for the microservice and just using that as a document store, right? Or just using that as a relational database. And, you know, sometimes, I mean, there was actually something that happened that was really interesting yesterday, I don't know whether you followed it, Dave, or not. But Facebook had an outage yesterday, right? And Facebook is one of those companies that are seen as the Silicon Valley, you-know-how-to-do-microservices companies. And when you read through the outage, well, what happened, right? Some unfortunate logical error with a configuration, of course, that took a database cluster down. So, you know, there you have it, where you go like, well, maybe not every microservice is actually in fact talking to its own database or its own special purpose database. I think there, you know, rather than focusing so much on this argument of which technology to use, what's the right tool for a job, the industry should ask themselves, what business problem actually are we trying to solve? And therefore, what's the right approach and the right technology for this? And so therefore, just as I said before, you know, multi-model databases, they do have strong benefits. They have many built-in functionalities that are already there, and they allow you to reduce this complexity of having to know many different technologies, right?
And so it's not only being able to store different data models either, you know, treat a multi-model database as a JSON document store or a relational database, and most databases have been multi-model for 20-plus years. But it's also actually being able to, perhaps, if you store that data together, you can perhaps actually derive additional value for somebody else, but perhaps not for your application. But like, for example, if you were to use Oracle Database, you can actually write queries on top of all of that data. It doesn't really matter for our query engine whether the data is formatted as JSON or the data is formatted in rows and columns; you can just query over it. And that's actually very powerful for those guys that have to, you know, get the reporting done at the end of the day, at the end of the week. And for those guys that are the data scientists, that want to figure out, you know, which product performed really well, or can we tweak something here and there. When you look into that space, you still see a huge divergence between the guys that put data in, kind of the OLTP side, and the guys that try to derive new insights. And there's still a lot of ETL going around, and, you know, we have big data technologies, some of them came and went, and some of them came in that are still around, like Apache Spark, which is still like a SQL engine on top of any of your data, kind of going back to the same concept. And so I will say that, you know, for developers, when we look at microservices, it's like, first of all, is the argument you were making because the vendor or the technology you want to use tells you this argument, or, you know, you kind of want to have an argument to use a specific technology? Or is it really more because it is the best technology to use for this given use case, for this given application that you have? And if so, there's of course also nothing wrong with using a single purpose technology either, right?
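Gerald's point, that nothing stops each microservice from using the same multi-model store in its own way, can be sketched like this, with SQLite standing in for a converged database and all service and table names invented:

```python
import json
import sqlite3

# Two toy "microservices" sharing one multi-model store: the cart service
# treats its table as a document store, while the reporting service runs
# plain SQL analytics over the same data. One engine, no ETL between them.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE carts (user_id TEXT PRIMARY KEY, doc TEXT)")

def cart_service_put(user_id, cart_doc):
    """Document-style access: store the cart as a JSON document."""
    db.execute("INSERT OR REPLACE INTO carts VALUES (?, ?)",
               (user_id, json.dumps(cart_doc)))

def reporting_service_item_total():
    """Relational/analytic access: aggregate over the same documents."""
    return db.execute(
        "SELECT SUM(json_extract(doc, '$.items')) FROM carts").fetchone()[0]

cart_service_put("u1", {"items": 3})
cart_service_put("u2", {"items": 2})
print(reporting_service_item_total())  # 5
```

Whether this beats one specialized store per service is exactly the trade-off discussed above: fewer technologies to operate, at the cost of sharing one engine.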
>> Yeah, I mean, whenever I talk about Oracle I always come back to the most important applications, the mission critical. It's very difficult to architect databases with microservices and containers. You have to be really, really careful. And so, again, it comes back to what we were talking about before with Maria, the complexity and the recovery. But Gerald, I want to stay with you for a minute. So there's other data management technologies popping up out there. I mean, I've seen some people saying, okay, just leave the data in an S3 bucket. We can query that, then we've got some magic sauce to do that. And so why are you optimistic about, you know, traditional database technology going forward? >> I would say because of the history of databases. So one thing that once struck me when I came to Oracle, and then got to meet great people like Juan Loaiza and Andy Mendelsohn who have been here for a long, long time, is I came to the realization that relational databases have been around for about 45 years now. And, you know, I was like, I'm too young to have been around then, right? So I was like, what else has been around 45 years? It's like, just the tech stack that we have today, what does that look like? Well, Linux only came out in '93. Well, databases pre-date Linux a lot. Rather, as I started digging, I saw a lot of technologies come and go, right? And you mentioned before, like, the technologies, the data management systems that we had that came and went, like the columnar databases or XML databases, object databases. And even before relational databases, before Codd gave us the relational model, there were apparently these network stores, network databases, which to some extent look very similar to JSON documents. They were a way of storing data in a hierarchical format.
And, you know, when you then start actually reading the Codd paper and diving a little bit more into the relational model, there's, I think, one important crux in there that most of the industry keeps forgetting, or hasn't been around to even know. And that is that when Codd created the relational model, he actually focused not so much on the application putting the data in, but on future users and applications still being able to make sense out of the data, right? And that's, kind of like I said before, we had those network models, we had XML databases, you have JSON document stores. And the one thing that they all have in common is that the application that puts the data in decides the structure of the data. And that's all well and good while you have the developer writing the application. It can become really tricky when, 10 years later, you still want to look at that data and the application and the developer are no longer around. Then you go like, what does this all mean? Where is the structure defined? What is this attribute? What does it mean? How does it correlate to others? And the one thing that people tend to forget is that it's actually the data that's here to stay, not so much the applications around it. Ideally, every company wants to store every single byte of data that they have, because there might be future value in it. Economically it may not always make sense, though that's now much more feasible than just years ago. But if you could, why wouldn't you want to store all your data, right? And sometimes you actually have to store the data for seven years or whatever, because the laws require you to. And so coming back, then, you know, like 10 years from now, and looking at the data, making sense of that data can actually become a lot more difficult and a lot more challenging than having first figured out how we store this data for general use. And that kind of was what the relational model was all about.
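That design goal, decomposing data so a future consumer can query it without knowing the original application, can be sketched as follows (SQLite again as a stand-in; the web-store schema is invented):

```python
import sqlite3

# Codd-style decomposition of a web-store purchase: instead of one nested
# document whose shape only the original app understands, the facts live in
# separate tables with relationships. A later consumer can answer
# "what sold on a given day?" without ever touching customer data.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE products  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE payments  (id INTEGER PRIMARY KEY, customer_id INTEGER,
                            product_id INTEGER, day TEXT);
    INSERT INTO customers VALUES (1, 'Ada');
    INSERT INTO products  VALUES (10, 'keyboard'), (11, 'monitor');
    INSERT INTO payments  VALUES (100, 1, 10, '2021-06-21'),
                                 (101, 1, 11, '2021-06-21');
""")

# Products sold on a given day: only payments and products are queried.
rows = db.execute("""
    SELECT pr.name
    FROM payments pa JOIN products pr ON pr.id = pa.product_id
    WHERE pa.day = '2021-06-21' ORDER BY pr.name
""").fetchall()
print([r[0] for r in rows])  # ['keyboard', 'monitor']
```

With a hierarchical document you would first have to learn whether the document starts with the customer or the payment; here the structure is declared once, up front, for every future reader.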
We decompose the data structures into tables and columns with relationships between each other. So that, therefore, if somebody wants to, you know, a typical example would be, well, you store some purchases from your web store, right? There's a customer attribute in it. There's some credit card payment information in it, and some product information on what the customer bought. Well, in the relational model, if you just want to figure out which products were sold on a given day or week, you would just query the payments and products tables to get the sense out of it. You don't need to touch the customer and so forth. And with the hierarchical model, you have to first sit down and understand how the structure is: what is the customer? Where is the payment? You know, does the document start with the payment or does it start with the customer? Where do I find this information? And then, in the very early days, those databases even struggled to not have to scan all the documents to get the data out. So coming back to your question a bit, and I apologize for going on here, but you know, it's like, relational databases have been around for 45 years. I actually argue it's one of the most successful software technologies that we have out there when you look at the overall industry, right? 45 years in IT terms is like from a star being born to one that's going supernova. We have said it before that many technologies came and went, right? And I just want to add one more really interesting example, by the way, which is Hadoop and HDFS, right? They kind of gave us this additional promise of, like, you know, in the 2010s, like 2012, 2013, the hype of Hadoop and so forth and (mumbles) and HDFS. And people were just like, just put everything into HDFS and worry about the data later, right? And we can query it and MapReduce it and whatever.
And we had customers actually coming to us, and they were like, great, we have half a petabyte of data on an HDFS cluster and we have no clue what's stored in there. How do we figure this out? What are we going to do now? Now you had a big data cleansing problem. And so I think that is why databases, and also data modeling, are something that will not go away anytime soon. And I think databases and database technologies are here to stay for quite a while. Because many of those people, they don't think about what's happening to the data five years from now. And many of the niche players, and also frankly even Amazon, you know, following with this single purpose thing, it's like, just use the right tool for the job for your application, right? Just pull in the data there the way you want it. And it's like, okay, so you use technologies all over the place, and then five years from now you have your data fragmented everywhere in different formats, and, you know, inconsistencies, and, and, and. And those are usually, when you come back to these data-driven, business critical, business decision applications, the worst case scenario you can have, right? Because now you need an army of people to actually do data cleansing. And it's not a coincidence that data science has become very, very popular in recent years, as we kind of went on with this proliferation of different database, or data management, technologies, some of which are not even databases. But I think I'll leave it at that. >> It's an interesting talk track because you're right. I mean, no schema on write was alluring, but it definitely created some problems. It also created an entire, you know, you referenced the hyper specialized roles, and the data cleansing component. I mean, maybe technology will eventually solve that problem, but it hasn't, at least up to tonight. Okay, last question. Maria, maybe you could start off, and Gerald, if you want to chime in as well, it'd be great.
I mean, it's interesting to watch this industry, when Oracle sort of won the top database mantle. I mean, I watched it, I saw it. Remember, it was Informix and it was (indistinct) too, and of course Microsoft, you've got to give them credit with SQL Server, but Oracle won the database wars. And then everything got kind of quiet for a while, database was sort of boring. And then it exploded, you know, all the, you know, NoSQL and the key-value stores and the cloud databases, and this is really a hot area now. And when we looked at Oracle we said, okay, Oracle, it's all about Oracle Database, but we've seen the kind of resurgence in MySQL, which everybody thought, you know, once Oracle bought Sun they were going to kill MySQL. But now we see you investing in HeatWave, TimesTen, we talked about in-memory databases before. So where do those fit in, Maria, in the grand scheme? How should we think about Oracle's database portfolio? >> So there's lots of places where you'd use those different things. 'Cause just like in any other industry, there are going to be new and boutique use cases that are going to benefit from a more specialized product, or single purpose product. So good examples, off the top of my head, of the kind of systems that would benefit from that would be things like a stock exchange system or a telephone exchange system. Both of those are latency critical transaction processing applications where they need microsecond response times. And that's going to exceed perhaps what you might normally get or deploy with a converged database. And so Oracle's TimesTen database, our in-memory database, is perfect for those kinds of applications. But there's also a host of MySQL applications out there today, and you said it yourself there, Dave, HeatWave is a great place to provision and deploy those kinds of applications, because it's going to run 100 times faster than AWS (mumbles).
So, you know, there really is a place in the market, and in our customers' systems and the needs they have, for all of these different members of our database family here at Oracle. >> Yeah, well, the internet is basically running on the LAMP stack, so I don't see MySQL going away. All right, Gerald, we'll give you the final word, bring us home. >> Oh, thank you very much. Yeah, I mean, as Maria said, I think it comes back to what we discussed before. There are obviously still needs for special technologies, or different technologies than a relational database or multi-model database. Oracle actually has many more databases than people may first think of. Not only the three that we have already mentioned, but there's even, for example, Oracle's NoSQL database. And, you know, on a high level, Oracle is a data management company, right? And we want to give our customers the best tools and the best technology to manage all of their data. And therefore there has to be a need, or there should be a part of the business, that also focuses on these highly specialized systems and these highly specialized technologies that address those use cases. And I think it makes perfect sense. It's like, you know, when the customer comes to Oracle, they're not only getting this, take this one product, you know, and if you don't like it, your problem, but actually, you have choice, right? And choice allows you to make a decision based on what's best for you, and not necessarily best for the vendor you're talking to. >> Well guys, really appreciate your time today and your insights. Maria, Gerald, thanks so much for coming on The Cube. >> Thank you very much for having us. >> And thanks for watching this Cube conversation. This is Dave Vellante and we'll see you next time. (upbeat music)

Published Date : Jun 24 2021



Aileen Black, Collibra and Marco Temaner, U.S. Army | AWS PS Partner Awards 2021


 

>>Mhm. Yes one. >>Hello and welcome. Today's session of the 2021 AWS Global Public Sector Partner Awards. I am pleased to introduce our very next guests. Their names are a lean black S. V. P. Public sector at culebra and Marco Timon are Chief Enterprise Architect at the HQ. D. A. Office of business transformation at the U. S. Army. I'm your host Natalie ehrlich, we're going to be discussing the award for best partner transformation. Best data led migration. Thank you both for joining the program. >>Thank you for having us. >>Thank you. Glad to be here. >>Well, a lien, why is it important to have a data driven migration? >>You know, migrations to the cloud that are simply just a lift and ship does take advantage of the elasticity of the cloud but not really about how to innovate and leverage what truly the AWS cloud has to offer. Um so a data led migration allows agencies to truly innovate and really kind of almost reimagine how they make their mission objectives and how they leverage the cloud, you know, the government has, let's face it mountains of data, right? I mean every single day there's more and more data and you you can't pick up a trade magazine that doesn't talk about how data is the new currency or data is the new oil. Um, so you know, data to have value has to be usable, right? So you to turn your data into knowledge. You really need to have a robust data intelligence platform which allows agencies to find understand and trust or data data intelligence platform like culebra is the system of record for their data no matter where it may reside. Um no strategy is complete without a strong data, governments platform and security and privacy baked in from the very start, data has to be accessible to the average data. Citizen people need to be able to better collaborate to make data driven decisions. Organizations need to be united by data. This is how a technology and platform like cal Ibra really allows agencies to leverage the data as a strategic asset. 
>>Terrific. Well, why is it more important than ever to do this than ever before? >>Well, you know, there's just the innovation of technology like Ai and Ml truly to be truly leveraged. Um you know, they need to be able to have trust the data that they're using it. If it if the model is trained with only a small set of data, um it's not going to really produce the trusted results they want. ML models deliver faster results at scale, but the results can be only precise when data feeding them is of high quality. And let's say Gardner just came out with a study that said data quality is the number one obstacle for adoption of A. I. Um when good data and good models find a unified scalable platform with superior collaboration capabilities, you're A I. M. L. Opportunities to truly be leveraged and you can truly leverage data as a strategic asset. >>Terrific. Well marco what does the future look like for the army and data >>so and let me play off. Do you think that Allen said so in terms of the future um obviously data's uh as you mentioned the data volumes are growing enormously so. Part of the future has to do with dealing with those data volumes just from a straight >>technological >>perspective. But as the data volumes grow and as we have to react to things that we need to react to the military, we're not just trying to understand the quantity of data but what it is and not just the quality but the nature of it. So understanding authoritative nous. Being able to identify what data we need to solve certain problems or answer certain questions. I mean a major theme in terms of what we're doing with data governance and having a data governance platform and a data catalog is having immediate knowledge of what data is, where what quality and confidence we have in the data. Sometimes it's more important to have data that's approximately correct than truly correct as quickly as possible, you know. 
So not all data needs to be of perfect quality at all times you need to understand what's authoritative, what the quality is, how current the information is. So as the data volumes grow and grow and grow. Keeping up with that. Not just from the standpoint of can we scale we know how to scale pretty well in terms of containing data volume but keeping up what it is, the knowledge of the data itself, understand authoritative nous quality, providence etcetera, uh that's a whole enterprise to keep keeping up with and that's what we're doing right now with this, with this project. >>Yeah. And I'd like to also follow up with that, how has leveraging palabras data intelligence platform enabled the army to accelerate its overall mission. >>So there's uh there's sort of interplay between, you know, just having a technology does something doesn't mean you're going to use it to do that something, but often having a place to do work of governance, work of knowledge management can be the precipitating functions or the stimulus to do so. So it's not and if you build it they will come. But if you don't have a place to play ball, you're not going to play ball to kind of run with that metaphor. So having technology that can do these things is a precursor to being able to. But then of course we, as an organization have to do it. So the interplay between making a selection of technology and doing the implementation from a technical perspective that plays off of an urgency, we've made the decision to use a technology, so then that helped accelerate getting roles, responsibilities of our ceo of our missionary data. Officers of data Stewart's the folks that have to be doing the work. Um, when you educate system owners in cataloging and giving a central environment, the information is needed. If you say here's a place to put it, then it's very tangible, especially in the military where work is done in a very uh, concrete task based way. 
If you have a place to do things, then it's easier to tell people to do things. So the technology is great and works for us. But the choice to to move with the technology has then been a productive interplay with with the doing of the things that need to be done to take advantage of the technology, if that makes >>sense? Well, >>yeah, that's really great to hear. I mean, speaking of taking advantage of the technology, a lien can collaborate, help your other public sector customers take advantage of A. I and machine learning. >>Well, people need to be able to collaborate and take advantage of their most strategic asset data to make those data driven decisions. It gives them the agility to be able to act 2020 was a great lesson around the importance of having your data house in order. Let's face it, the pandemic, we watched organizations that, you know, had a strong data governance framework who had looked at and understood where their data were and they were very able to very quickly assess the situation in react and others were not in such a good situation. So, you know, being able to have that data governance framework, being able to have that data quality, being able to have the right information and being able to trust it allows people to be effective and quickly to react to situations >>fascinating. Um do you have any insight on that marco, would you like to weigh in? >>Well, definitely concur. Um I think our strategy, like I said has been to um use the technology to highlight the need to put governance into place and to focus on increasing data quality the data sources. And I would say this has also helped us uh I mean things that we weren't doing before that have to do with just educating the populace, you know all the way from the folks operators of systems to the most senior executives. Being conversant in the principles that we're talking about this whole discipline is a bit arcane and kind of back office and kind of I. T. But it's actually not. 
If you don't know where to get the data to make a decision, then you're going to make a decision based on incorrect data, and that's pretty important in the military to not get wrong. So I definitely concur, and we're taking that approach as well. >>I'd like to take it one step further. If you're speaking the same language, if you have an understanding of what the data governance framework is, you can understand what the data is and where it is. Sometimes there's duplicate data, and there's duplicate data for a reason, but understanding where it came from and what the lineage associated with it is really gives you the power of being able to shop for data, get the right information at the right time, and give it the right perspective. And I think that's the power of what has laid the foundation for the work that the Army and Marco have done, to really set the stage for what they can do in the future. >>Terrific. And Marco, if you could comment a little bit about data stewardship and how it can positively drive future outcomes. >>Yeah. So data stewardship for us has a lot to do with the functionals. The people we're assigning as senior data stewards are the senior functionals in the respective organizations: logistics, financial management, training, readiness, et cetera. The idea is that the folks who really know everything about those functional domains are looking at things from the perspective of the data that's needed to support those functions, logistics, human resources, et cetera, and being, call it, the most authoritative subject matter experts. So the governance we're doing is coming much more from a functional perspective than a technical perspective, so that when a system is being built, when we're talking about data migration, or when we're talking about somebody driving analytics, the knowledge associated with the data comes from the functionals.
So our data stewardship is less about the technical side and more about making sure that the understanding, from a functional perspective, of what the data is for and what the provenance is (not in a technical sense, but in terms of sources of information, sources of personnel, sources of munitions, et cetera) is available to the folks using it, so they basically know what it is. The emphasis is on that functional infusion of knowledge into the metadata, so that people who are trying to use that data have a way of understanding what it really is and what the meaning is. That's really what data stewardship means here. We're actually very good at stewarding data from a technical perspective: we know how to run systems very well, we know how to scale, we're good at that. But making sure that people know what the data is, and why and when to use it, that's where maybe we have some catching up to do, which is what this effort is about. >>Terrific. Well, fantastic insights from you both. I really appreciate you taking the time to tell all our viewers about this. That was Aileen Black and Marco Temaner, and that, of course, was our session for the AWS Global Public Sector Partner Awards. Thanks for watching. I'm your host, Natalie Erlich. Thank you. >>Yeah. Mm.
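The thread running through these answers, authoritativeness, quality, currency, and functionally infused provenance attached to each dataset, can be sketched as a minimal catalog record. This is a hypothetical shape for illustration only, not the Army's or Collibra's actual schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CatalogEntry:
    """Governance metadata for one dataset: what it is, who vouches for it,
    how good it is, and how fresh it is."""
    name: str
    steward: str                 # functional owner, e.g. "Logistics"
    authoritative: bool          # is this the system of record?
    quality_score: float         # 0.0 to 1.0, from profiling checks
    as_of: date                  # currency of the data
    provenance: list = field(default_factory=list)  # upstream sources

    def fit_for_use(self, min_quality: float, max_age_days: int, today: date) -> bool:
        """A consumer's quick check: authoritative, good enough, fresh enough."""
        age_days = (today - self.as_of).days
        return (self.authoritative
                and self.quality_score >= min_quality
                and age_days <= max_age_days)

entry = CatalogEntry(
    name="unit_readiness",
    steward="Training & Readiness",
    authoritative=True,
    quality_score=0.92,
    as_of=date(2021, 6, 1),
    provenance=["readiness_reporting_system"],
)
print(entry.fit_for_use(min_quality=0.9, max_age_days=90, today=date(2021, 6, 22)))  # True
```

An analyst "shopping for data" filters on records like this instead of interrogating system owners one by one; not all data has to be perfect, it just has to be knowable.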

Published Date : Jun 22 2021


Andy Mendelsohn, Oracle | CUBE Conversation, March 2021


 

>>The cloud has dramatically changed the way providers think about delivering database technologies. Not only has cloud first become a mandate for many, if not most, but customers are demanding more capabilities from their technology vendors. Examples include a substantially similar experience for cloud and on-prem workloads, increased automation, and a never-ending quest for more secure platforms. Broadly, there are two prevailing models that have emerged. One is to provide highly specialized database products that focus on optimizing for a specific workload signature. The other end of the spectrum combines technologies in a converged platform to satisfy the needs of a much broader set of use cases. And with me to get a perspective on these and other issues is Andy Mendelsohn, executive vice president of Oracle, the world's leading database company, where he leads database server technologies. Hello Andy, thanks for coming on. >>Hey Dave, glad to be here. >>Okay, so we saw the recent announcements, and this is kind of your baby, around next generation Autonomous Data Warehouse. Maybe you could take us through the path you took from the original cloud data warehouses to where we are today. >>When we first brought Autonomous Database out, we were basically a second generation technology at that point. We decided that what customers wanted was, at the push of a button, to provision the really powerful Oracle Database technology that they've been using for years. We did that with Autonomous Database, and beyond that we provided a very unique capability around self-tuning, self-driving of the database, which is something the first generation vendors didn't provide. And this is really important, because customers today, developers and data analysts, can at the push of a button build out their data warehouses, but they're not experts in tuning. So what we thought was really important is that customers get great performance out of the box, and that's one of the really unique things about Autonomous Data Warehouse, Autonomous Database in particular. And then this latest generation that we just came out with also answers the questions we got from the data analysts and developers. They said, it's really great that I can press a button and provision this very powerful data warehouse infrastructure, or database infrastructure, from Oracle. But if I'm an analyst, I want data, and it's still hard for me to go and get data from various data sources, transform it, clean it up, and get it to a place where I can start querying it. I still need data engineers to help me do that. So in the new release we said, okay, we want to give data analysts, data engineers, data scientists, and developers a true self-service experience, where they can do their job completely without bringing in any engineers from their IT organization. And that's what this new version is all about. >>Awesome. I mean, look, years ago you guys identified the IT labor problem, and you've been focused on R&D, putting in your R&D to solve that problem for customers, so we're really starting to see that hit now. Now, Gartner recently did some analysis; they ranked and rated some of the more popular cloud databases, and Oracle did very well, particularly in operational categories. I mean, on the operational side and the mission critical stuff, you smoked everybody. We had Marc Staimer and David Floyer on, and our big takeaways were that you're again dominating in mission critical workloads, that dominance continues, but your approach of converging functionality really differs from some others that we saw. Obviously, when you get high ratings from Gartner you're pretty stoked about that, but what do you think contributed to those rankings, and what are you finding specifically in customer interactions? >>Gartner does a lot of its analysis based on talking to customers, finding out how these products that sound great on paper actually work in practice, and I think that's one of the places where Oracle Database technology really shines: it solves real-world problems. It's been doing that for a long time, and as we've moved the technology into the cloud, that continues; the differentiation we've built up over the years really stands out. You look at, say, Amazon's databases: they generally take some open source technology that isn't that new, it could be 30 years old, 25 years old, and they put it up on the cloud and say, oh, it's cloud native, it's great. But in fact it's the same old technology, a decade behind Oracle's database technology, that doesn't really compete. So I think the Gartner analysis really showed that sort of thing quite clearly. >>So let's talk about that a little bit, because one of the things I've learned over the last many years of following this business is that there are a lot of ways to skin a cat. Cloud database vendors, you mentioned AWS, or look at Snowflake, take kind of a right-tool-for-the-right-job approach. They're going to say that their specialty databases, where they're focused, are better than your converged approach, which they would have you think of as a Swiss Army knife. What's your take on that? >>Well, the converged approach is something we've been working on for a long time, and the idea is pretty simple. Think about your smartphone. If you think back over 10 years ago, you used to have a camcorder and a camera and a messaging device and a dumb phone device, and all those different devices got converged into what we now call the smartphone. Why did the smartphone win? It's simply much more productive for you to carry around one device that is actually best of breed in all the different categories, instead of lots of separate devices. And that's what we're doing with the converged database. Over the years we've been able to build out technologies that are really good at transaction processing, at analytics for data warehousing; now we're working on JSON technologies, graph technologies. The other vendors basically can't do this. It's much easier to build a specialty database that does one thing than to build out a converged database that does n things really well, and that's what we've been doing for years. Again, it's based on technology we've invested in for quite a long time, and it's something I think customers, developers, and analysts find to be a much more productive way of doing their jobs. >>It's very unique, and not common at all, to see a technology that's been around as long as Oracle Database sort of morph into a more modern platform. I mean, you mentioned AWS leverages open source a lot; Snowflake would say, hey, we were born in the cloud, and they were; I think Google BigQuery would be another good example. But I want to get your take on this born-in-the-cloud notion. Those folks would say, well, we're superior to Oracle, because they started decades ago with services that weren't necessarily cloud native. How have you been able to address that? I know cloud first is kind of the buzzword, but how have you made that sort of transparent, or irrelevant, to users? Maybe you could talk about how you've been able to achieve that, and convince us that you actually really are cloud native now. >>One of the things we like pointing out is that Oracle, very uniquely, has had this scale-out technology for running all kinds of workloads, not just the analytic workloads you see out in the cloud there; we can also scale out transaction processing workloads. That was another one of the reasons we do so well in, for example, the Gartner analysis for operational workloads. And that technology is really valuable as we went to cloud; it lets us do some really unique things, and the most obvious unique thing we have is something we like to call cloud native instant elasticity. With our technology, if you want to provision some amount of compute to run your workloads, you can provision exactly what you need. If you need 17 CPUs to get your job done, you do 17 CPUs when you provision your Autonomous Database. Our competitors who claim to be born in the cloud, like Snowflake and Amazon, still use an archaic way of provisioning servers based on shapes. Snowflake says, which shape cluster do you want: do you want 16, do you want 32, do you want 64? It goes up by a power of two, which means, if you compare that to what Oracle does, you have to provision up to twice as much CPU as you really need. So if you really need 17, they make you provision 32; if you really need 33, they make you provision 64.
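Mendelsohn's numbers are easy to check: power-of-two shapes round a request up to the next shape size, so the over-provisioning overhead approaches 2x. A quick sketch of the arithmetic (generic math for illustration, not any vendor's actual provisioning logic):

```python
def shape_provisioned(needed: int) -> int:
    """Smallest power-of-two shape that covers the requested CPUs."""
    size = 1
    while size < needed:
        size *= 2
    return size

def granular_provisioned(needed: int) -> int:
    """Per-CPU provisioning: you get exactly what you ask for."""
    return needed

for needed in (17, 33):
    shape = shape_provisioned(needed)
    print(f"need {needed}: shape model allocates {shape}, "
          f"granular allocates {granular_provisioned(needed)}, "
          f"overhead {shape / needed:.2f}x")
# need 17: shape model allocates 32, granular allocates 17, overhead 1.88x
# need 33: shape model allocates 64, granular allocates 33, overhead 1.94x
```

At 17 CPUs requested, the shape model allocates 32, nearly twice the need, which is exactly the gap the per-CPU model closes.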
>>So this is not a cloud native experience at all; it's an archaic way of doing things. And we like to point out that with our instant elasticity we can go from 17 to 18 to 19, whatever you want. Plus we have something called auto scale, so you can set your baseline to be 17, let's say, but we will automatically, based on your workload, scale you up to three times that, in this case 51. And because of that true elasticity, we are really the only ones that can deliver a true pay-as-you-go, pay-for-just-what-you-need kind of capability, which is certainly what Amazon was talking about when they first called their cloud elastic. But it turns out that for database services these guys still do this archaic thing with shapes. So that's a really good example of where we're quite a bit better than the other guys, and much more cloud native. >>I want to follow up on that, just stay here for a second, because you're basically saying you have better granularity than the so-called cloud native guys. Now, you mentioned Snowflake: you've got the shapes, you've got to choose which shape you want, and it sounds like Redshift is the same. And of course I know the way in which Amazon separates compute from storage is largely a tiering exercise, so it's not as smooth as you might expect, but nonetheless it's good. How is it that you were able to achieve this with a database that was born many decades ago? What is it, from a technical standpoint, an R&D standpoint, that you were able to do? I mean, did you design that in the 1980s? How did you get here? >>Well, it's a combination of interesting technologies. Autonomous Database has the Oracle Database software, and that software is running on a very powerful infrastructure optimized for database, based on the Exadata technology that we've had on-prem for many years. We brought that to the cloud, and that technology is a scale-out infrastructure that supports thousands of CPUs. Then we use our multitenant technology, which is a way of sharing large infrastructure amongst separate clients, and we divide it up dynamically, on the fly. So if there are thousands of CPUs and this client wants 20 and that one wants 30, we divide it up and give them exactly what they need, and if they want to grow, we just take some extra CPUs that are in reserve and give them those instantly. That's a very different way of doing things from a shape-based approach, where what Snowflake and Amazon do under the covers is give you a real physical server, or a cluster, and that's how they provision. If you want to grow, they give you another big physical cluster, which takes a long time to populate with data and get working. We just have that one infrastructure that we're sharing among lots of users, and we just give you a little extra capacity; it's done instantly, with no need for data to be moved to populate the new clusters that Snowflake or Amazon are provisioning for you. So it's a very different way of doing things. >>And you're able to do that because of the tight integration, you mentioned Exadata, between the hardware and software. David Floyer calls it the iPhone of enterprise; sometimes we get some grief for that, but it's not a bad metaphor. Is that really the sort of secret? >>Well, the big secret under the covers is this Exadata technology, our Real Application Clusters scale-out technology, our multitenant technology. These are things we've been working on for a long time, and they are very mature, very powerful technologies that provide very unique benefits in a cloud world where people want things to happen instantly and want things to work well for any kind of workload. That's why we talk about being converged: we can do mixed workloads, transactions and analytics all on the same data. The other guys can't do that. They're really good at, like you said, a narrow workload: I can do analytics, or I can do graph, or I can do JSON, but they can't really do the combination, which is what real-world applications are like; they're not purely one thing or another. >>Right, thank you for that. So one of the questions people want to know is, can Oracle attract new customers that aren't existing Oracle customers? Maybe you could talk about that: why should somebody who's not an existing Oracle customer think about using Autonomous Database? >>That's a really good question. Oracle, if you look at our customer base, has a lot of really large enterprises, the biggest banks and the biggest telcos; they run their businesses on Oracle. These guys are sort of the most conservative of the bunch out there, and they are moving to cloud at a somewhat slower rate than the smaller companies. So if you look at who's using Autonomous Database now, it's actually the smaller companies, the same type of people that first decided Amazon was an interesting cloud 10 years ago. They're using our technologies for the same reason: they don't have large IT organizations, they don't have large numbers of engineers to engineer their infrastructure, and that's why cloud is so attractive to them. And Autonomous Database on top of cloud is really attractive as well, because information is the lifeblood of every organization, and if they can empower their analysts to get their job done without lots of help from IT, they're going to do it. That's really what's made Autonomous Database interesting: the whole self-driving nature is very attractive to the smaller shops that don't have a lot of sophisticated IT expertise. >>All right, let's talk about developers. You guys are the stewards of the Java community, so obviously probably the biggest, most popular programming language out there. But when I think of developers, I think of guys in hoodies pounding away, and when I think of Oracle developers, I might think of an app dev team inside some of those large customers you talked about. Why would developers or analysts be interested in using Oracle, as opposed to some of those more focused, narrow-use databases we were talking about earlier? >>So if you're a developer, you want to get your job done as fast as possible, and having a database that gives you the most productive application development experience is important to you. We've been talking about the converged database off and on. If I'm a developer with a given job to do, a converged database that lets me do a combination of analytics and transactions, and do a little JSON and a little graph, all in one, is a much more productive place to go. Because if I don't have something like that, I'm stuck taking my application and breaking it up into pieces: this piece I'm going to run on, say, Aurora on Amazon, this piece I have to run on the graph database, and here's some JSON, I've got to run that on some document database. The data gets fragmented between these databases, and I have to do all this data integration. With a converged database I have a much simpler world where I can just use one technology stack, get my job done, and I'm future-proofed against change. Requirements change all the time: you build the initial version of the application, and your users say, this is not what I want, I want something else, and that something else often is, I want analytics. If you used something like a document store technology that has really poor analytic capabilities, then you have to take that data and move it to another database. With our converged approach you don't have to do that; you're already in a place where everything works, and everything you could possibly need in the future is going to be there as well. So for developers, I think converged is the right way to go. Plus, for people who are what we call citizen developers, like the data analysts who write a little code occasionally but are really after getting value out of the data, we have this really fabulous no-code/low-code tool called APEX. APEX is, again, a very mature technology; it's been around for years, and it lets somebody who's just a data analyst, who knows a little SQL but doesn't want to write code, get their job done really fast. We've published some benchmarks on our website showing that you can get the job done 20 to 40 times faster using a no-code/low-code tool like APEX versus writing lots of traditional code. >>I'm glad you brought up APEX. We recently interviewed one of your former colleagues, Amit Zavery, and all he would talk about is low code, no code. And then in the APEX announcement you said something to the effect of, coding should be the exception, not the rule. Did you mean that? What do you mean by that? >>So APEX is a tool that people use with our database technology for building what we call data-driven applications. If you've got a bunch of data and you want to get some value out of it, maybe build dashboards or more sophisticated reports, APEX is an incredible tool for doing that. And it's modern: it builds applications that look great on your smartphone, and it automatically renders that same user interface on a bigger device, like a laptop or desktop, as well. It's one of these things that the people who use it just go bonkers with; it's a viral technology. They get really excited about how productive they've been with it, and they tell all their friends. And I think we decided, about a year ago when we came up with this APEX service, that we really wanted to go bigger on the marketing around it, because it's very unique; nobody else has anything quite like it, and it just adds value to the whole developer-productivity story around the Oracle database. So that's why we have the APEX service now, and we also have APEX available with every Oracle database on the cloud. >>I want to ask you about some of the features around 21c. There are a lot of them you announced earlier this year; maybe you could tease out some of the top things we should be paying attention to in 21c. >>Sure. One way to look at 21c is that we're continuing down this path of the converged database, and one of the marquee features in 21c is something we call blockchain tables. So what is blockchain? Blockchain was the technology under the covers behind Bitcoin; it's a way of creating a tamper-proof data store, used by the original Bitcoin algorithms. Well, developers actually like having tamper-proof data objects in databases, too. So what we decided to do was say: if I create a SQL table in an Oracle database, what if there's a new option that just says, I want that table implemented using blockchain technology, to make the table tamper-proof and fully audited, et cetera? And so we just did that. In 21c you can now get basically another feature of the converged database that says, give me a SQL table; I can do everything, I can query it, I can insert rows into it, but it's tamper-proof: I can't ever update it, and I can't delete rows from it. Amazon did their usual thing: they took, again, some open source technology and said, hey, we've got this great thing called Quantum Ledger Database, and it does blockchain tables. But if you want to do blockchain tables in any of their other databases, you're out of luck; they don't have it, and you have to go move the data into this new thing. It again shows the problem with their proprietary approach of having specialty databases, versus just having one converged database that does it all. So that's the blockchain table feature. We did a bunch of other things, and the one I think is worth mentioning most is support for persistent memory. A lot of people out there haven't noticed this very interesting technology that Intel shipped a couple of years ago, called Optane data center memory. It's basically a hybrid of flash memory, which is persistent, and standard DRAM, which is not, meaning you can't store a database in DRAM. With this persistent memory, you can have a database stored persistently in memory all the time. It's a very innovative technology from a database standpoint, and a very disruptive one for the database market, because now you can have an in-memory database, period, all the time, 24/7.
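The tamper-proof property behind blockchain tables comes from hash-chaining: each row's hash covers the previous row's hash, so any later edit to history breaks the chain. A toy illustration of that idea in plain Python (the mechanism only, not Oracle's 21c implementation or its SQL syntax):

```python
import hashlib

class LedgerTable:
    """Append-only table where each row's hash covers the previous row's
    hash, so any later edit to history breaks the chain."""
    def __init__(self):
        self.rows = []  # list of (data, chain_hash) tuples

    def insert(self, data: str) -> None:
        prev = self.rows[-1][1] if self.rows else "genesis"
        chain_hash = hashlib.sha256((prev + data).encode()).hexdigest()
        self.rows.append((data, chain_hash))

    def verify(self) -> bool:
        """Recompute the chain; False if any row was tampered with."""
        prev = "genesis"
        for data, stored in self.rows:
            if hashlib.sha256((prev + data).encode()).hexdigest() != stored:
                return False
            prev = stored
        return True

ledger = LedgerTable()
ledger.insert("credit 100")
ledger.insert("debit 40")
print(ledger.verify())                              # True
ledger.rows[0] = ("credit 900", ledger.rows[0][1])  # tamper with history
print(ledger.verify())                              # False
```

In the real feature the database engine computes and verifies these hashes itself, which is what makes the table append-only and auditable from the application's point of view.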
>>So 21c is the first database out there that has native support for this new kind of persistent memory technology, and we think it's really important, so we're actually making it available to our 19c customers as well. That's another technology I'd call out that we think is very unique; we're way ahead of the game there, and we're going to continue investing in that space moving forward. >>Yeah, that layer in between DRAM and persistent flash is a great innovation, game changing from a performance standpoint and actually for the way you write applications. But I've got to ask you: I and all the analysts were on with Juan recently, Juan Loaiza, and listening to that introduction of blockchain, everybody wants to know, is Safra going to start putting Bitcoin on the Oracle balance sheet? >>Yeah, that's a good question. Who knows? I can't comment on speculation. >>Ah, that would be interesting. Okay, last question, then we've got to go. Look, the narrative on Oracle is that you're expensive and you're mean, that you're hard to do business with. Do you care? Are you doing things to maybe change that perception in the cloud? >>Yeah, I think we've made a very conscious decision that as we move to the cloud, we're offering a totally new business model on the cloud that is a cloud-native model. You pay for what you use, you have everyday low prices, and you don't have to negotiate with some salesman for months to get a good price. So yeah, we'd really like the message to get out there: those of you who think you know what Oracle's all about, and how it might be to work with Oracle from your on-premises days, you should really check out how Oracle is now on the cloud. We have this Autonomous Database technology, really easy to use, really simple; any analyst can get value out of the data without any help from other engineers. It's very unique. It's the same technology you're used to, but now it's delivered in a way that's much easier to consume and much lower cost. So yeah, you should definitely take a look at what we've got out there on the cloud, and it's all free to try out. We've got this free tier: you can provision free VMs, free databases, free APEX, whatever you want, and try it out and see what you think. >>Well, thanks for that. I was kidding about the mean part; I have a lot of friends at Oracle, some relatives as well. Thanks, Andy, for coming on theCUBE today; it's really great to talk to you. >>Yeah, it's my pleasure. >>And thanks for watching. This is Dave Vellante; we'll see you next time.

Published Date : Mar 29 2021

Breaking Analysis: Unpacking Oracle’s Autonomous Data Warehouse Announcement


 

(upbeat music) >> On February 19th of this year, Barron's dropped an article declaring Oracle a cloud giant, and the article explained why the stock was a buy. Investors took notice, and the stock ran up 18% over the next nine trading days, peaking on March 9th, the day before Oracle announced its latest earnings. The company beat consensus on both the top line and EPS last quarter, but investors did not like Oracle's tepid guidance, and the stock pulled back. But it's still, as you can see, well above its pre-Barron's-article price. What does all this mean? Is Oracle a cloud giant? What are its growth prospects? Now, many parts of Oracle's business are growing, including Fusion ERP, Fusion HCM, and NetSuite; we're talking deep into the double digits, 20-plus percent growth. Its on-prem legacy license business, however, continues to decline, and that moderates the overall company growth because that on-prem business is so large. So overall, Oracle is growing in the low single digits. Now, what stands out about Oracle is its recurring revenue model. That figure, the company says, now represents 73% of its revenue, and it's going to continue to grow. Two other things stood out to us on the earnings call. First, Oracle plans on increasing its CapEx by 50% in the coming quarter; that's a lot. Now, it's still far less than what AWS, Google, or Microsoft spend on capital, but it's a meaningful data point. Second, Oracle's consumption revenue for Autonomous Database and Cloud Infrastructure, OCI or Oracle Cloud Infrastructure, grew at 64% and 139% respectively, and these two factors, combined with the CapEx spend, suggest the company has real momentum. I mean, look, it's possible the CapEx announcements are just optics and they're front-loading some spend to show the street that Oracle is a player in cloud, but I don't think so; Oracle's Safra Catz is usually pretty disciplined when it comes to spending.
Now, today, on March 17th, Oracle announced updates to its Autonomous Data Warehouse, and with me is David Floyer, who has extensively researched Oracle over the years, and today we're going to unpack the Oracle Autonomous Data Warehouse, ADW, announcement, what it means to customers, but we also want to dig into Oracle's strategy. We want to compare it to some other prominent database vendors, specifically AWS and Snowflake. David Floyer, welcome back to The Cube, thanks for making some time for me. >> Thank you, Vellante, great pleasure to be here. >> All right, I want to get into the news, but I want to start with this idea of the autonomous database, which Oracle's announcement today is building on. Oracle uses the analogy of a self-driving car. It's obviously a powerful metaphor as they call it the self-driving database, and my takeaway is that this means the system automatically provisions, it upgrades, it does all the patching for you, it tunes itself. Oracle claims all that reduces labor costs or admin costs by 90%. So I ask you, is this the right interpretation of what Oracle means by autonomous database? And is it real? >> Is that the right interpretation? It's a nice analogy. It's a test of that analogy, isn't it? I would put it this way: the first stage of the Autonomous Data Warehouse was to do the things that you talked about, which was the tuning, the provisioning, all of that sort of thing. The second stage is actually, I think, more interesting, in that what they're focusing on is making it easy to use for the end user. Eliminating the requirement for IT staff to be there to help in the actual using of it, and that is a very big step for them, but an absolutely vital step, because all of the competition are focusing on ease of use, ease of use, ease of use, and cheapness of being able to manage and deploy. So I think that is the really important area that Oracle has focused on, and it seems to have done so very well.
>> So in your view, is this, I mean, you don't really hear a lot of other companies talking about this analogy of the self-driving database, is this unique? Is it differentiable for Oracle? If so, why? Or maybe you could help us understand that a little bit better. >> Well, the whole strategy is unique in its breadth. It has really brought a whole number of things together and made it, of its type, the best. So it has a whole range of data sources and database types, so it's got a very broad range of different ways that you can look at the data, and the second thing that is also excellent is that it's a platform. It is fully self-provisioned, and its functionality is very, very broad indeed. The quality of the original SQL and the query languages, etc., is very, very good indeed, and its ability to do joins, for example, is excellent. So all of the building blocks are there, together with its sharing of the same data with OLTP and inference and in-memory databases as well. All together, the breadth of what they have is unique and very, very powerful. >> I want to come back to this, but let's get into the news a little bit and the announcement. I mean, it seems like what's new in the Autonomous Data Warehouse piece for Oracle is new tooling around four areas. So Andy Mendelsohn, the head of this group and the guy whose baby this is, he talked about four things. My takeaway: faster, simpler loads; simplified transforms; autonomous machine learning models, which are facilitating, what do you call it, citizen data science; and then faster time to insights. So, tooling to make those four things happen. What's your take and takeaways on the news? >> I think those are all correct. I would add the ease of use in terms of being able to drag and drop; the user interface has been dramatically improved.
Again, I think those, strategically, are actually more important. The others are all useful and good components of it, but strategically, I think the ease of use, the use of APEX, for example, are more important. >> Why are they more important strategically? >> Because they focus on the end user's capability. For example, one of the other things that they've started to introduce is Python together with their spatial databases. That is really important, that you reach out to developers as they are and the tools they want to use. So those types of ease-of-use things are respecting what the end users use. For example, they haven't come out with anything like Qlik or Tableau. They've left that marketplace there for the end user to use what they like best. >> You mean they're not trying to compete with those two tools. They indeed had a laundry list of stuff that they supported: Talend, Tableau, Looker, Qlik, Informatica, IBM, I had IBM there. So their claim was, hey, we're open. But so that's smart. That's just, hey, they realized that people use these tools. >> They're trying not to exclude other people, to be a platform and be an ecosystem for the end users. >> Okay, so Mendelsohn, who made the announcement, said that Oracle's the smartphone of databases, and I actually think Ellison kind of used that, or maybe that was us, but I thought he likened it to the iPhone when he announced Exadata way back when, the integrated hardware and software. But is that how you see it, is Oracle the smartphone of databases? >> It is. I mean, they are trying to own the complete stack, the hardware with Exadata, all the way up to the databases, the data warehouses and the OLTP databases, the inference databases. They're trying to own the complete stack from top to bottom, and that's what makes the autonomous processes possible. You can make it autonomous when you control all of that.
Take away all of the requirements for IT in the business itself. So it's democratizing the use of data warehouses. It is pushing it out to the lines of business, and it's simplifying it and making it possible to push out so that they can own their own data. They can manage their own data, and they do not need an IT person from headquarters to help them. >> Let's stay on this a little bit more, and then I want to go into some of the competitive stuff, because Mendelsohn mentioned AWS several times. One of the things that struck me, he said, hey, we're basically one API, because we're doing analytics in the cloud, we're doing data in the cloud, we're doing integration in the cloud, and that's sort of a big part of the value proposition. He made some comparisons to Redshift. Of course, I would say if you can't find a workload where you beat your big competitor, then you shouldn't be in this business, so I take those things with a grain of salt. But one of the other things that caught me is that migrating from on-prem to Oracle Cloud was very simple, and I think he might've made some comparisons to other platforms. And this to me is important, because he also brought in that Gartner data. We looked at that Gartner data when they came out with it; in the operational database class, Oracle smoked everybody. They were way ahead, and the reason why I think that's important is because, let's face it, when you look at what's moving into AWS, the mission-critical workloads, the high-performance, high-criticality OLTP stuff, that's not moving in droves. And you've made the point often that with companies that have their own cloud, particularly Oracle, and you've mentioned this about IBM for certain, DB2 for instance, there should be a lower-risk environment moving from on-prem to their cloud, because, you know, I don't think you could get Oracle RAC on AWS.
For example, I don't think Exadata is running in AWS data centers, and so that like-for-like compatibility is going to facilitate migration. What's your take on all that spiel? >> I think that's absolutely right. Your crown jewels, the most expensive and the most valuable applications, the mission-critical applications, the ones that have got to take a licking and keep on ticking. Those types of applications are where Oracle really shines. They own a very high percentage of those mission-critical workloads, and you have the choice, if you're going to AWS, for example, of either migrating to Oracle on AWS, and that is frankly not a good fit at all. There are a lot of constraints to running large systems on AWS, large mission-critical systems. So that's not an option, and then the option, of course, that AWS will push is: move to Aurora, change your way of writing applications, make them tiny little pieces and stitch them all together with microservices. And that's okay if you're a small organization, but that has got a lot of problems in its own right, because then you, the user, have to stitch all those pieces together, and you're responsible for testing it, and you're responsible for looking after it. And that, as you grow, becomes a bigger and bigger overhead. So AWS, in my opinion, needs to move towards a tier-one database of its own, and it's not in that position at the moment. >> Interesting, okay. So let's talk about the competitive landscape and the choices that customers have. As I said, Mendelsohn mentioned AWS many times. Larry on the calls often takes shots; to me, it's a compliment. When Larry Ellison calls you out, that means you've made it, you're doing well.
We've seen it over the years, whether it's IBM or Workday or Salesforce, even though Salesforce is a big Oracle customer, because AWS, as we know, is an Oracle customer as well, even though AWS tells us they're off Oracle, when you peel the onion. >> For five years they've been saying that, for some of the workloads. >> Well, as I said, I believe they're still using Oracle in certain workloads. Anyway, we digress. So AWS, though, they take a different approach, and I want to push on this a little bit with database. It's got more than a dozen, I think, purpose-built databases. They take this kind of right-tool-for-the-right-job approach, whereas Oracle, they're converging all this function into a single database: SQL, JSON, graph databases, machine learning, blockchain. I'd love to talk more about blockchain if we have time, but it seems to me that the right-tool-for-the-right-job, purpose-built approach, very granular down to the primitives and APIs, seems to be a pretty viable approach versus kind of a Swiss Army knife approach. How do you compare the two? >> Yes, and it is attractive to many individual programmers who are very interested, for example, in graph databases or in time-series databases. They are looking for a cheap database that will do the job for a particular project, and that makes, for the programmer or for that individual piece of work, a very sensible way of doing it, and they pay for it as they go, with clear cloud economics. The challenge, as you have more and more data and as you're building up your data warehouse and your data lakes, is that you do not want to have to move data from one place to another place. So for example, if you've got Aurora, you have to move the database, and it's a pretty complicated thing to move it to Redshift. It's five or six steps to do that, and each of those costs money, and each of those takes time. More importantly, they take time.
The Oracle approach is a single database, in terms of all the pieces. Obviously you have multiple databases, you have different OLTP databases and data warehouse databases, but it's a single architecture and a single design, which means that all of the work in terms of moving stuff from one place to another place is within Oracle itself. It's Oracle that's doing that work for you, and as you grow, that becomes, to me, a very, very important cost saving, versus the overhead of all those different ones. And the databases themselves at AWS originated as open source, and they've done very well with them, and there's a large revenue stream behind the... >> At AWS, you mean? >> Yes, the original databases at AWS, and they've done a lot of work in terms of making them fit their cloud, etc. But for a larger organization, especially very large ones, and certainly if they want to combine, for example, data warehouse with OLTP and inference, which is, in my opinion, a very good thing that they should be trying to do, then that is incredibly difficult to do with AWS, and in my opinion, AWS has to invest enormously to make the whole ecosystem much better. >> Okay, so innovation required there, maybe as part of the TAM expansion strategy, but just to digress for a second. It seems like, and by the way, there are others that are taking this converged approach; it seems like that is a trend. I mean, you certainly see it with SingleStore, the name sort of implies it, formerly MemSQL. I think Monte Zweben of Splice Machine is probably headed in a similar direction. Embedding AI, in Microsoft's case, is kind of interesting; it seems like Microsoft is willing to build this abstraction layer that hides the complexity of the different tooling. AWS thus far has not taken that approach. And then sort of looking at Snowflake, Snowflake's got a completely different... I think Snowflake's trying to do something completely different.
I don't think they're necessarily trying to take Oracle head-on. I mean, they're certainly trying to, well, I guess, let's talk about this. Snowflake simplified the EDW, that's clear. Zero to Snowflake in 90 minutes. It's got this data cloud vision. So you sign on to Snowflake, and speaking of layers, they're abstracting the complexity of the underlying cloud. That's what the data cloud vision is all about. They talk about this global mesh, but they've not done a good job of explaining what the heck it is. We've been pushing them on that, but... >> It's aspirational at the moment. >> Well, I guess, yeah, it seems that way. And so, but conceptually, I think it's very powerful, but in reality, what Snowflake is doing with data sharing is probably mostly read-only, and I say mostly read-only... oh, there you go, it'll get better. But it's mostly read, and so you're able to share the data, and it's governed. I mean, it's actually quite genius how they've implemented this, with its simplicity. It is a caching architecture; we've talked about that, we can geek out about that. There's good, there's bad, there's ugly, but generally speaking, I guess my premise here, and I would love your thoughts: is Snowflake trying to do something different? It's trying to be not just another data warehouse. It's not just trying to compete with data lakes. It's trying to create this data cloud to facilitate data sharing, to put data in the hands of business owners, in terms of data product builders. That's a different vision than anything I've seen thus far. Your thoughts? >> I agree, and even more, going further, being a place where people can sell data, put it up and make it available to whoever needs it, and making it so simple that it can be shared across the country and across the world. I think it's a very powerful vision indeed.
The challenge they have is that the pieces at the moment are very, very easy to use, but the quality in terms of, for example, the joins I mentioned, the joins are very powerful in Oracle. They don't try and do joins. They say... >> They being Snowflake. >> Snowflake, yeah. They don't even try it. They would say use another Postgres... >> Yeah. >> ...database to do that. >> Yeah, so then they have a long way to go. >> Complex joins anyway, maybe simple joins, yeah. >> Complex joins. So they have a long way to go in terms of the functionality of their product, and also, in my opinion, they surely are going to have more types of databases inside it, including OLTP, and they can do that. They have obviously got a great market cap, and they can do that by acquisition as well. >> They've started. I think they support JSON, right? And graph, I think there's a graph database that's either coming or it's there, I can't keep all that stuff in my head, but there's no reason they can't go in that direction. I mean, in speaking to the founders of Snowflake, they were like, look, we're kind of new, we're focused on simple. A lot of them came from Oracle, so they know the Oracle database, and they know how hard it is to do things like facilitate complex joins and do complex workload management, and so they said, let's just simplify, we'll put it in the cloud, and it will spin up a separate data warehouse, a virtual data warehouse, every time you want one. So that's how they handle those things. So, different philosophy. But again, coming back to some of the mission-critical work and some of the larger Oracle customers: they said they have a thousand Autonomous Database customers, I think it was Autonomous Database, not ADW, but anyway, a few stood out: Aon, Lyft, I think Deloitte stood out, and obviously hundreds more. So people misunderstand Oracle, I think. They've got a big install base.
They invest in R&D, and they talk about lock-in, sure, but the CIOs that I talk to, and that you talk to, David, they're looking for business value. I would say that 75 to 80% of them will gravitate toward business value over the fear of lock-in, and I think at the end of the day, they feel like, you know what, if our business is performing, it's a better business decision, it's a better business case. >> I fully agree. They've been very difficult to do business with in the past. Everybody's in dread of the... >> The audit. >> The knock on the door from the auditor. >> Right. >> And that, from a purchasing point of view, has been a really bad experience for many, many customers. The users of the database itself are very happy indeed. I mean, you talk to them and they understand what they're paying for. They understand the value, in terms of availability and all of the tools, for complex, multi-dimensional types of applications. It's pretty well the only game in town. It's only DB2 and SQL Server that have any hope of competing. >> Microsoft SQL Server, right. >> Okay, SQL Server. >> Which, okay, yeah, is definitely competitive for sure. DB2, no. IBM, look, IBM lost its dominant position in database. They kind of ceded that. Oracle had to fight hard to win it. It wasn't obvious in the '80s who was going to be the database king; they all had to fight. And to me, I always tell people, the difference is that the chairman of Oracle is also the CTO. They spend money on R&D, and they throw off a ton of cash. I want to say something about... >> I was just going to make one extra point. The simplicity and the capability of their cloud versions of all of this is incredibly good. They are better in terms of paying for what you need or what you use, much better than AWS, for example, or anybody else. So they have really come full circle in terms of attractiveness in a cloud environment. >> You mean charging you for what you consume. Yeah, Mendelsohn talked about that.
He made a big point about the granularity, that you pay for only what you need. If you need 33 CPUs, fine; with the other databases you've got to buy a shape, so if you need 33, you've got to go to 64. I don't know if that's true for everyone. I'm not sure if that's true for Snowflake. It may be, I've got to dig into that a little bit. >> Yes, Snowflake has got a front end to hide behind. >> Right, and I did want to push on that a little bit, because I want to go look at their pricing strategies, because I still think they make you buy, I may be wrong, I thought they still make you do a one-year or two-year or three-year term. I don't know if you can just turn it off at any time. They might allow it, I should hold off, I'll do some more research on that. But I wanted to make a point about the audits; you mentioned audits before. A big mistake that a lot of Oracle customers have made many times, and we've written about this: when negotiating with Oracle, you've got to bring your best and your brightest. One of the things that people didn't pay attention to, and I think they've sort of caught onto this, is that Oracle's SOW adjudicates over the MSA. A lot of legal departments and procurement departments say, oh, do we have an MSA? Well, yes, you do. Okay, great. And because they have an MSA, they think they can rubber-stamp it. But the SOW really dictates, and Oracle's got you there, and they're really smart about that. So you've got to bring your best and brightest, and you've got to really negotiate hard with Oracle, or you get in trouble. >> Sure. >> So it is what it is, but coming back to Oracle, let's sort of wrap on this. Dominant position in mission-critical, we saw that from the Gartner research, especially for operational; giant customer base; there's a cloud-first notion; they're investing in R&D; open, we'll put a question mark around that, but hey, they're doing some cool stuff with MySQL.
>> Ecosystem, I'd put that in; they're promoting their ecosystem. >> Yeah, and look, for a lot of their customers, and we've talked to many, they say, look, at the end of the day this saves us money, and we don't have to migrate. >> Yeah. >> So interesting. So I'll give you the last word. We started sort of focusing on the announcement, so what do you want to leave us with? >> My last word is that there are platforms with a certain key application or key parts of the infrastructure which I think can differentiate themselves from the Azures or the AWSes, and Oracle owns one of those. SAP might be another one. But there are certain platforms which are big enough and important enough that they will, in my opinion, succeed with that cloud strategy. >> Great, David, thanks so much, appreciate your insights. >> Good to be here. >> Thank you for watching, everybody. This is Dave Vellante for The Cube. We'll see you next time. (upbeat music)

Published Date : Mar 17 2021



Rahul Pathak, AWS | AWS re:Invent 2020


 

>> From around the globe, it's the Cube, with digital coverage of AWS re:Invent 2020, sponsored by Intel and AWS. >> Yeah, welcome back to the Cube's ongoing coverage of AWS re:Invent virtual. The Cube has gone virtual, along with most events these days, and continues to bring our digital coverage of re:Invent. With me is Rahul Pathak, who is the vice president of analytics at AWS. Rahul, it's great to see you again. Welcome, and thanks for joining the program. >> Hey, Dave, great to see you too, and always a pleasure. Thanks for having me on. >> You're very welcome. Before we get into your leadership discussion, I want to talk about some of the things that AWS has announced in the early parts of re:Invent. I want to start with AWS Glue Elastic Views, a very notable announcement allowing people to, you know, essentially share data across different data stores. Maybe tell us a little bit more about Glue Elastic Views, kind of where the name came from and what the implication is. >> Sure. So, yeah, we're really excited about Glue Elastic Views, and, you know, as you mentioned, the idea is to make it easy for customers to combine and use data from a variety of different sources and pull them together into one or many targets. And the reason for it is that we're really seeing customers adopt what we're calling a lakehouse architecture, which is, at its core, a data lake for making sense of data and integrating it across different silos, typically integrated with a data warehouse, and not just that, but also a range of other purpose-built stores like Aurora for relational workloads or DynamoDB for non-relational ones. And while customers typically get a lot of benefit from using purpose-built stores, because you get the best possible functionality, performance, and scale for a given use case, you often want to combine data across them to get a holistic view of what's happening in your business or with your customers.
And before Glue Elastic Views, customers would have to either use ETL or data integration software, or they'd have to write custom code that could be complex to manage and could be error-prone and tough to change. And so with Elastic Views, you can now use SQL to define a view across multiple data sources, pick one or many targets, and then the system will actually monitor the sources for changes and propagate them into the targets in near real time. And it manages the end-to-end pipeline and can notify operators if anything changes. And so the components of the name are pretty straightforward: Glue is our serverless ETL and data integration service, and Glue Elastic Views is about data integration. They're views because you can define these virtual tables using SQL, and then elastic because it's serverless and will scale up and down to deal with the propagation of changes. So we're really excited about it, and customers are as well. >> Okay, great. So my understanding is I'm going to be able to take what's called, in the parlance, materialized views, which in my layperson's terms means I'm going to run a query on the database and take that subset, and then I'm going to be able to, you know, copy that and move it to another data store. And then you're going to automatically keep track of the changes and keep everything up to date. Is that right? >> Yes, that's exactly right. So you can imagine you had a product catalog, for example, that's being updated in DynamoDB, and you can create a view that will move that to Amazon Elasticsearch Service. You could search through a current version of your catalog, and we will monitor your DynamoDB tables for any changes and make sure those are all propagated in near real time. And all of that is taken care of for our customers as soon as they define the view, and the data will just be kept in sync as long as the view is in effect.
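The catalog-sync pattern Rahul describes, a view over a source store whose changes are propagated into a target, can be sketched in miniature. This is a toy illustration only: the class, field names, and error handling here are invented for the example, and the real service is defined with SQL over stores like DynamoDB and Elasticsearch rather than hand-written code.

```python
# Toy sketch of the idea behind Glue Elastic Views: a view defined over a
# source table whose inserts and updates are propagated into a target store.
# All names here are hypothetical; the actual service manages this for you.

class MaterializedView:
    """Keeps a target copy of selected columns in sync with a source."""

    def __init__(self, columns):
        self.columns = columns  # the "SELECT list" of the view
        self.target = {}        # simulated target store (e.g. a search index)

    def apply_change(self, key, row):
        """Propagate one source change (insert or update) into the target."""
        missing = [c for c in self.columns if c not in row]
        if missing:
            # The real service would alert operators rather than let bad
            # data through; here we just raise.
            raise ValueError(f"schema drift, missing columns: {missing}")
        self.target[key] = {c: row[c] for c in self.columns}

# A product catalog being updated in the source store...
view = MaterializedView(columns=["name", "price"])
view.apply_change("sku-1", {"name": "widget", "price": 9.99, "internal_cost": 4.0})
view.apply_change("sku-1", {"name": "widget", "price": 8.99, "internal_cost": 4.0})

# The target sees only the view's columns, at their latest values.
print(view.target["sku-1"])  # → {'name': 'widget', 'price': 8.99}
```

Note that the target never sees `internal_cost`: the view's column list acts as the projection, which is the same reason a SQL-defined view is a natural fit for this kind of replication.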
>> I see this as being really valuable for a person who's building, I like to think in terms of, data services or data products that are going to help me, you know, monetize my business. Maybe it's as simple as a dashboard, but maybe it's actually a product, you know, it might be some content that I want to develop, and I've got transaction systems, I've got unstructured data, maybe in a NoSQL database, and I want to actually combine those, build new products, and I want to do that quickly. So take me through what I would have to do. You sort of alluded to it with, you know, a lot of ETL, but take me through in a little bit more detail how I would do that before this innovation. And maybe you could give us a sense as to what the possibilities are with Glue Elastic Views. >> Sure. So, you know, before we announced Elastic Views, a customer would typically have to think about using ETL software. They'd have to write an ETL pipeline that would extract data periodically from a range of sources. They'd then have to write transformation code that would do things like match up types and make sure you didn't have any invalid values, and then you would combine it and periodically write that into a target. And once you've got that pipeline set up, you've got to monitor it. If you see an unusual spike in data volume, you might have to add more resources to the pipeline to make it complete on time. And then, if anything changed in either the source or the destination that prevented that data from flowing in the way you would expect, you'd have to manually figure that out, and have data quality checks and all of that in place to make sure everything kept working. But with Elastic Views, this gets much simpler. So instead of having to write custom transformation code, you write a view using SQL, and SQL is, you know, widely popular with data analysts and folks that work with data, as you well know.
And so you can define that view in SQL. The view will look across multiple sources, and then you pick your destination, and then Glue Elastic Views essentially monitors both the source for changes and the source and the destination for any issues: for example, did the schema change, did the shape of the data change, is something briefly unavailable. It can monitor all of that, and it can handle any errors that it can recover from automatically. Or if it can't, say someone dropped an important table in the source that was part of your view, you can actually get alerted and notified to take some action, to prevent bad data from getting through your system or to prevent your pipeline from breaking without your knowledge. And then the final piece is the elasticity of it. It will automatically deal with adding more resources if, for example, you had a spiky day in the markets, maybe you're building a financial services application and you needed to add more resources to process those changes into your targets more quickly; the system would handle that for you. And then, if you're monetizing data services on the back end, you've got a range of options for folks subscribing to those targets. So we've got capabilities like AWS Data Exchange, where people can exchange and monetize data sets. So it allows this end-to-end flow in a much more straightforward way than was possible before. >> Awesome. So a lot of automation, especially if something goes wrong. So something goes wrong, you can automatically recover, and if, for whatever reason, you can't, what happens? You notify the system and let the operator know: hey, there's an issue, you've got to go fix it. How does that work? >> Yes, exactly right. So if we can recover, say, for example, for a short period of time you can't read the target database, the system will keep trying until it can get through. But say someone dropped a column from your source.
That was a key part of your ultimate view and destination; you just can't proceed at that point. So the pipeline stops, and then we notify using an API or an SNS alert, so that programmatic action can be taken. So this effectively provides a really great way to enforce the integrity of data that's going between the sources and the targets. >> All right, making it kindergarten-proof. So let's talk about another innovation you guys announced, QuickSight Q, kind of speaking to the machine in my natural language. But give us some more detail there. What is QuickSight Q, and how do I interact with it? What kind of questions can I ask it? >> So QuickSight Q is essentially a deep-learning-based semantic model of your data that allows you to ask natural language questions in your dashboard. So you'll get a search bar in your QuickSight dashboard, and QuickSight is our serverless BI service that makes it really easy to provide rich dashboards to whoever needs them in the organization. And what Q does is it's automatically developing relationships between the entities in your data, and it's able to actually reason about the questions you ask. So unlike earlier natural language systems, where you had to predefine your models and predefine all the calculations that you might ask the system to do on your behalf, Q can actually figure it out. So you can say, show me the top five categories for sales in California, and it'll look in your data and figure out what that is. It will present you with how it parsed that question, and will, inline, in seconds, pop up a dashboard of what you asked, and automatically pick a chart or visualization for that data that makes sense. And you could then start to refine it further and say, how does this compare to what happened in New York? And it'll be able to figure out that you're trying to overlay those two data sets, and it'll add them.
And unlike other systems, it doesn't need to have all of those things pre-defined. It's able to reason about it because it's building a model of what your data means on the fly, and we've pre-trained it across a variety of different domains, so you can ask a question about sales or HR or any of that. Another great part of Q is that when it presents what it's parsed, you're actually able to correct it if needed and provide feedback to the system. So, for example, if it got something slightly off, you could actually select from a drop-down, and then it will remember your selection for the next time, and it will get better as you use it. >>I saw a demo in Swami's keynote on December 8 where you were able to ask QuickSight Q the same question but in different ways, you know, like compare California and New York, and then the data comes up; or give me the top five, and then for California and New York, the same exact data. So is that how I can check and see if the answer I'm getting back is correct: ask different questions? I don't have to know the schema, is what you're saying; as the user, I can triangulate from different angles and then look and see if that's correct. Is that how you verify, or are there other ways? >>So that's one way to verify: you can definitely ask the same question a couple of different ways and ensure you're seeing the same results. Another option would be to potentially click and drill and filter down into that data through the dashboard. And then, you know, the other step would be at data ingestion time; typically, data pipelines will have some quality controls. But when you're interacting with Q, I think the ability to ask the question multiple ways and make sure that you're getting the same result is a perfectly reasonable way to validate.
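To make Rahul's example concrete: a question like "show me the top five categories for sales in California" ultimately resolves to a filter, a group-by, and a top-N over the data. A toy sketch in plain Python follows; the data and field names are invented, and this is of course not how Q is implemented internally.

```python
from collections import defaultdict

# Made-up sales records standing in for a customer's dataset.
sales = [
    {"state": "CA", "category": "Electronics", "amount": 1200},
    {"state": "CA", "category": "Books", "amount": 300},
    {"state": "CA", "category": "Electronics", "amount": 800},
    {"state": "NY", "category": "Books", "amount": 500},
    {"state": "CA", "category": "Toys", "amount": 450},
]

def top_categories(rows, state, n=5):
    """The parsed question: filter by state, sum sales per category, take top N."""
    totals = defaultdict(int)
    for row in rows:
        if row["state"] == state:              # the parsed filter: "in California"
            totals[row["category"]] += row["amount"]
    # the parsed aggregation: "top five categories for sales"
    return sorted(totals.items(), key=lambda kv: -kv[1])[:n]

print(top_categories(sales, "CA"))
# [('Electronics', 2000), ('Toys', 450), ('Books', 300)]
```

Asking the "same question in different ways," as discussed above, amounts to checking that different phrasings resolve to this same underlying query.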
>>You know what I like about that answer you just gave, and I wonder if I could get your opinion on this, because you've been in this business for a while and you work with a lot of customers. If you think about our operational systems, things like sales or ERP systems, we've contextualized them. In other words, the business lines have injected context into the system. They kind of own it, if you will. They own the data, and I put that in quotes, but they do; they feel like they're responsible for it. There's not this constant argument, because it's their data. It seems to me that if you look back over the last 10 years, a lot of the data architecture has been sort of genericized. In other words, the experts, whether it's the data engineer or the quality engineer, don't really have the business context. But the example that you just gave, the drill-down to verify that the answer is correct: it seems to me, listening again to Swami's keynote the other day, that you're really trying to put data in the hands of business users who have the context and the domain knowledge. And that seems to me to be a change in mindset that we're gonna see evolve over the next decade. I wonder if you could give me your thoughts on that change in the data architecture mindset. >>David, I think you're absolutely right. I mean, we see this across all the customers that we speak with. There's an increasing desire to get data broadly distributed into the hands of the organization in a well-governed and controlled way. But customers want to give data to the folks that know what it means and know how they can take action on it to do something for the business, whether that's finding a new opportunity or looking for efficiencies.
And I think, you know, we're seeing that increasingly. Especially given the unpredictability that we've all gone through in 2020, customers are realizing that they need to get a lot more agile, and they need a lot more data about their business and their customers, because you've got to find ways to adapt quickly. And, you know, that's not gonna change anytime in the future. >>And I've said many times on theCUBE, you know, the technology industry used to be all about the products, and in the last decade it was really platforms, whether it's SaaS platforms or the AWS cloud platform. And it seems like innovation in the coming years, in many respects, is gonna come from the ecosystem and the ability to share data; we've had some examples today. But you hit on, you know, one of the key challenges, of course: security and governance. Can you automate that, if you will, and protect the users from doing things that violate, whether it's data access rules or corporate edicts for governance and compliance? How are you handling that challenge? >>That's a great question, and it's something that I really emphasized in my leadership session. You know, the notion of what customers are doing, and what we're seeing, is the Lake House architectural concept. So you've got a data lake and purpose-built stores, and customers are looking for easy data movement across those. And so we have things like Glue Elastic Views, or some of the other Glue features we announced. But they're also looking for unified governance, and that's why we built AWS Lake Formation. And the idea here is that it can quickly discover and catalog customer data assets and then allow customers to define granular access policies centrally around that data. And once you have defined that, it sets customers free to give broader access to the data, because they've put the guardrails in place; they've put the protections in place.
So, you know, you can tag columns as being private so nobody can see them, and we announced a couple of new capabilities where you can provide row-based control. So only a certain set of users can see certain rows in the data, whereas a different set of users might only be able to see, you know, a different set. And so, by creating this fine-grained but unified governance model, this actually sets customers free to give broader access to the data, because they know that their policies and compliance requirements are being met, and it gets them out of the way of the analyst, or someone who can actually use the data to drive some value for the business. >>Right, they can really focus on driving value. And I always talk about monetization; however, monetization can be, you know, a generic term, for it could be saving lives, the mission of the business or the organization. I meant to ask you about Q: it looks like customers can embed it into their own apps? >>Yes, absolutely. So one of QuickSight's key strengths is its embeddability, and then it's also serverless, so you can embed it at really massive scale. And so we see customers, for example, like Blackboard, that are embedding QuickSight dashboards into the information they're providing to thousands of educators, with data on the effectiveness of online learning, for example. And you could embed Q into that capability. So it's a really cool way to give a broad set of people the ability to ask questions of data without requiring them to be fluent in things like SQL. >>If I can ask you a question: we've talked a little bit about data movement. I think at last year's re:Invent you guys announced RA3, and I think it made general availability this year. And I remember Andy speaking about it, talking about, you know, the importance of having big enough pipes when you're moving data around, and of course you're doing tiering.
You also announced AQUA, the Advanced Query Accelerator, which brings the compute to the data, I guess is how I would think about it, reducing that movement. But then we're talking about, you know, Glue Elastic Views, where you're copying and moving data. How are you ensuring, you know, that you maintain maximum performance for your customers? I mean, I know it's an architectural question, but as an analytics professional, you have to be comfortable that that infrastructure is there. So what's AWS's general philosophy in that regard? >>So there's a few ways that we think about this, and you're absolutely right. Data volumes are going up, and we're seeing customers going from terabytes to petabytes, and even people heading into the exabyte range; there's really a need to deliver performance at scale. And, you know, the reality of customer architectures is that customers will use purpose-built systems for different best-in-class use cases, and if you're trying to do a one-size-fits-all thing, you're inevitably going to end up compromising somewhere. And so the reality is that customers will have more data, they're gonna want to get it to more people, and they're gonna want their analytics to be fast and cost-effective. And so we look at strategies to enable all of this. So, for example, Glue Elastic Views is about moving data, but it's about moving data efficiently. What we do is we allow customers to define a view that represents the subset of their data they care about, and then we only look to move changes, as efficiently as possible. So you're reducing the amount of data that needs to get moved and making sure it's focused on the essential. Similarly, with AQUA, what we've done, as you mentioned, is we've taken the compute down to the storage layer, and we're using our Nitro chips to help with things like compression and encryption, and then we have FPGAs in line to allow filtering and aggregation operations. So again, you're trying to quickly and effectively get through as much data as you can, so that you're only sending back what's relevant to the query that's being processed, and that again leads to more performance: if you can avoid reading a byte, you're going to speed up your queries, and that's what AQUA is trying to do. It's trying to push those operations down so that you're really reducing data as close to its origin as possible and focusing on what's essential. And that's what we're applying across our analytics portfolio. I would say one other piece we're focused on with performance is really about innovating across the stack. So you mentioned network performance: we've got 100 gigabits per second of throughput now with the newest instances, and then with things like Graviton2 we're able to drive better price performance for customers for general-purpose workloads. So it's really innovating at all layers. >>It's amazing to watch. I mean, you guys have an incredible engineering challenge as you build this hyper-distributed system that's now, of course, going to the edge. I wanna come back to something you mentioned, and I do wanna hit on your leadership session as well. But you mentioned the one-size-fits-all system, and I've asked Andy Jassy about this, and I've had discussions with many folks. Of course, you mentioned the challenges: you're gonna have to make trade-offs if it's one size fits all. The flip side of that is, okay, it's simpler, you know, one Swiss Army knife of a database, for example. But your philosophy at Amazon is you wanna have fine-grained access to the primitives in case the market changes, so you can move quickly. So that puts more pressure on you to then simplify. You're not gonna build this big hairball abstraction layer; that's not what you're gonna do.
You know, I think about layers and layers of paint; I live in a very old house. That's not your approach. So it puts greater pressure on you to constantly listen to your customers, and they're always saying, hey, I want to simplify, simplify, simplify. We certainly heard that again in Swami's presentation the other day, all about, you know, minimizing complexity. So that really is your trade-off: it puts pressure on Amazon engineering to continue to raise the bar on simplification. Is that a fair statement? >>Yeah, I think so. I mean, you know, any time we can do work so our customers don't have to, I think that's a win for both of us, because I think we're delivering more value, and it makes it easier for our customers to get value from their data. We absolutely believe in using the right tool for the right job. And you know, you talked about an old house: you're not gonna build or renovate a house with a Swiss Army knife. It's just the wrong tool. It might work for small projects, but you're going to need something more specialized to handle the things that matter. And that is really what we see with that set of capabilities. So we want to provide customers with the best of both worlds: we want to give them purpose-built tools so they don't have to compromise on performance, scale, or functionality, and then we want to make it easy to use these together, whether it's about data movement or things like federated queries, where you can reach into each of them through a single query, and through a unified governance model. So it's all about stitching those together. >>Yeah, so far you've been on the right side of history; I think it serves you and your customers well. I wanna come back to your leadership discussion, your leadership session. What else can you tell us about, you know, what you covered there?
So we've actually had a bunch of innovations in the analytics stack. Some of the highlights are in EMR, which is our managed Spark and Hadoop service. We've been able to achieve 1.7x better performance than open source with our Spark runtime, so we've invested heavily in performance. And now EMR is also available for customers who are running in a containerized environment, so we announced EMR on EKS, and then an integrated development environment and studio for EMR called EMR Studio. So we're making it easier both for people at the infrastructure layer to run EMR in their EKS environments and make it available within their organizations, but also simplifying life for data analysts and folks working with data, so they can operate in that studio and not have to mess with the details of the clusters underneath. And then a bunch of innovation in Redshift. We talked about AQUA already, but we also announced data sharing for Redshift. So this makes it easy for Redshift clusters to share data with other clusters without putting any load on the central producer cluster. And this also speaks to the theme of simplifying getting data from point A to point B: you can have central producer environments publishing data, which represents the source of truth, into other departments within the organization, and they can query the data and use it. It's always up to date, but it doesn't put any load on the producers, and that enables these really powerful data sharing and downstream data monetization capabilities, like you've mentioned. In addition, as Swami mentioned in his keynote, there's Redshift ML, so you can now essentially train and run models that were built and optimized in SageMaker from within your Redshift clusters. And then we've also automated all of the performance tuning that's possible in Redshift.
So we really invested heavily in price performance, and now we've automated all of the things that make Redshift the best-in-class data warehouse service from a price performance perspective, up to three times better than others. Customers can just set Redshift to auto, and it'll handle workload management, data compression, and data distribution, making it easier to access all of that performance. And then the other big one was in Lake Formation. We announced three new capabilities. One is transactions: enabling consistent ACID transactions on data lakes, so you can do things like inserts and updates and deletes. We announced row-based filtering, for fine-grained access control in that unified governance model. And then automated storage optimization for data lakes: where customers are dealing with unoptimized small files that are coming off streaming systems, for example, Lake Formation can auto-compact those under the covers, and you can get a 78x performance boost. It's been a busy year for analytics. >>I'll say that. Great job. Thanks so much for coming back on theCUBE and, you know, sharing the innovations. Great to see you again, and good luck in the coming year. >>Well, thank you very much. Great to be here. Great to see you, and I hope we get to see each other in person again soon. >>I hope so. All right. And thank you for watching, everybody. This is Dave Vellante for theCUBE; we'll be right back after this short break.
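To recap one of the governance ideas from the conversation: tagging columns as private and applying row-based control amounts to filtering both dimensions of a table per user before results are returned. A minimal sketch in plain Python; the policy structure, user names, and fields are hypothetical, and this is not the Lake Formation API.

```python
# Hypothetical unified governance policy, in the spirit of the Lake Formation
# capabilities described above. Column tags and row policies are invented.
PRIVATE_COLUMNS = {"ssn"}  # columns tagged private: nobody sees them
ROW_POLICIES = {           # per-user row filters (row-based control)
    "east_analyst": lambda row: row["region"] == "east",
    "west_analyst": lambda row: row["region"] == "west",
}

def query(user, rows):
    """Return only the rows and columns this user is allowed to see."""
    allowed = ROW_POLICIES.get(user, lambda row: False)  # default deny
    return [
        {k: v for k, v in row.items() if k not in PRIVATE_COLUMNS}
        for row in rows
        if allowed(row)
    ]

sales_rows = [
    {"region": "east", "amount": 100, "ssn": "123-45-6789"},
    {"region": "west", "amount": 200, "ssn": "987-65-4321"},
]
print(query("east_analyst", sales_rows))  # [{'region': 'east', 'amount': 100}]
```

The point Rahul makes holds in the sketch: once the guardrails are defined centrally, broad access can be granted safely, because every query passes through the same filter.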

Published Date : Dec 10 2020

Troy Massey, Iron Bow Technologies & Jon Siegal, Dell Technologies | Dell Technologies World 2020


 

>>From around the globe, it's theCUBE, with digital coverage of Dell Technologies World, a digital experience brought to you by Dell Technologies. >>Hi, welcome to theCUBE's coverage of Dell Technologies World 2020, the virtual experience. I'm Lisa Martin, and I've got a couple of guests joining me. One of them is a longtime CUBE alum: Jon Siegal is back, the VP of product marketing for Dell Technologies. Jon, it's great to see you. >>Great to be back. Thank you. >>And also joining us is Troy Massey, the director of enterprise engagements at Iron Bow Technologies. Troy, welcome to theCUBE. >>Hi, thank you. Glad to be here. >>So we're going to be talking about VxRail and how it's driving the future of HCI to the edge, but first let's get Troy's perspective. I would like the audience to understand who Iron Bow Technologies is and what you do, and then we'll look at what you're doing with VxRail, as well as your channel partner business with Dell Technologies. So Troy, take it away. >>Yeah, so Iron Bow is a global company. We're a value-added reseller partnered with Dell. We have people physically living from Europe all the way through to Korea, based across the globe, primarily wherever there are DOD or federal government agencies. >>And tell me, from a channel partner perspective, what you guys are doing together. >>Yeah, so we have a lot of channel partner efforts going on together. Specifically, Iron Target is a huge effort we're doing together. It's an on-prem cloud whose basis is VxRail, with VMware Cloud Foundation on top and Intel all throughout, so Intel Xeon processors and Optane drives. It's just the perfect, elegant on-prem, hybrid cloud solution that Dell and Iron Bow are driving together. >>So let's talk about the edge, because a big focus of Dell Technologies World this year is the edge.
How do you see Iron Bow extending services to the edge, Troy, and what do you anticipate from your customers in terms of how their needs are changing? >>Great question. So, for one, I've gotta talk a little bit about what the edge is, because the edge is different things to different people. So I'm going to explain a little bit of the edge and what we're seeing in the federal government. I'll give you one example, and that's the Air Force Reserve. They have an entire squadron that does all of the firefighting for the large fires you see across California, or whatever state is engulfed in fires that year, where they take an entire squadron of airplanes out and drop water over the whole fire. They don't just bring planes; an entire squadron of military personnel goes to help communicate with the police and with the local fire departments, and all of that takes information. So they need to bring information, data, with them. Is there a building over there? Do people live over there? Where do we actually need to concentrate, on site, on fighting that fire, priority-wise? So it doesn't make a lot of sense to try to do that remotely over satellite; it's large, large chunks of data that need to be local to the customer. VxRail is the power beast of the HCI world, and VxRail at that edge provides them with the performance they need to get that job done. >>I think that's going to be a new segment here in Silicon Valley: thinking about all the fires we've had, it's really VxRail at the edge that's helping fight the fires. That's not something I knew, so thanks for sharing that. >>There's all kinds of work in that area. Same deal: they need to know where to go to rescue those people, and it's all data. >>Exactly.
And it's gotta be data that, as you said, is not delayed being sent over the wire, but is able to be transmitted in real time so that action can be taken, which is one of the things we talk about with data all the time: you have to be able to get the insight and act on it quickly. So, John, the edge is a big part of the theme of this year's virtual Dell Technologies World. Talk to us about driving the future of HCI at the edge with VxRail; there's been a lot of growth, I think 9,600-plus customers so far. So talk to us about the future of HCI at the edge, with VxRail as a driver. >>Absolutely. So first of all, I want to thank Iron Bow for being one of our nearly 10,000 customers for VxRail. And, you know, overall, the edge is going to be a major theme for Dell Tech World this week, and specifically for VxRail. We of course continue to play a key role with VxRail in modernizing data centers, as well as in hybrid cloud, and this week we really wanted to highlight some of the recent innovations we have around extending the simplified operations of VxRail, which many, like Iron Bow and others, are experiencing today in the core or in the cloud, and extending that automation to the edge. And you heard a lot about what the edge can do and the implications and the value of the edge. While we have lots of customers today, including Iron Bow, that are using VxRail at their edge locations, we have others like large retail home improvement chains and financial institutions, and we expect the edge to soon explode. We like to think that we are at the edge of the edge opportunity. In fact, IDC recently stated that by 2023, over 50% of new enterprise data generated is going to be generated outside the core data center and outside the cloud. That's up from 10% today. So this is massive.
Edge locations, of course, come with their own challenges, whether it's sometimes less-than-ideal conditions around power and cooling, or the fact that they typically may not have skilled IT staff on site, right? So they need new, special configurations, and they need operational efficiencies, and I think VxRail is uniquely positioned to help address that. >>Let's dig into those operational challenges, because in the last seven months so much of what we all do has become remote, and a good amount of that is probably going to be permanent. So when you think about the volume of remote devices that VxRail could potentially manage, John, how do you see VxRail being able to help in this sort of very distributed environment that may very well be permanent? >>Yeah, and like you said, the distributed environment is just going to grow and grow, and what that means for each company might be slightly different, but regardless, they do need seamless operations across all of those different edge locations, and again, that's a big focus for us. So we're really doing three things to extend the automated operations of VxRail to the edge, and doing so at scale. The first thing I want to talk about is that we've unveiled two VxRail platforms designed specifically for the edge: the new VxRail E Series, which is ideal for remote office locations where space is limited, and the VxRail D Series, think of D as in durable, which is our ruggedized platform, built from the ground up for harsh environments, such as DOD environments, like in the desert.
And what we're doing first and foremost is we are introducing a new software as a service multi cluster management. Uh, this is part of the VxRail HCI system software that we deliver today as part of the VxRail. Uh, this not only provides a global view of the infrastructure performance, um, and capacity analysis across all the locations, but even more importantly, it actively ensures that all the clusters and the remote locate locations always stay in a continuously validated state. This means that it can automatically determine which software components need to be upgraded. Um, you know, and also automatically execute the full stack upgrades, right? >>Without any technical expertise at the site, it can be done centrally, further automating the lifecycle management process and process that we do, uh, at the core and the cloud, and now extending to the edge. So, yeah, imagine the operational efficiencies for customers with tens or hundreds or even thousands of edge sites. So this is we think truly a game changer from that perspective. And then in addition to that, we're also adding, uh, the support for BCS on VxRail. So, uh, just at VM world just a couple of weeks ago, uh, VMware announced, uh, remote edge cluster support for VCF. Uh, so those customers that run run BCF on VxRail now can get the, the, they can enjoy a consistent cloud operating model, um, you know, for those edge locations. So, you know, in summary, you're getting consistency, you're getting automation regardless of where your VxRail is located. >>And this is something that I saw in the notes. John is described as a curated experience. Can you describe what that is if I think of reference architectures and things of that, what is a curated experience and how is it different? >>Yeah,  a curated experience for VxRail...  really what it is it's about seamless. Uh, it it's about we, we have taken the burden if you will, of integrating infrastructure off of the customer's shoulders and onto ours, right? 
So what we do is we ensure VxRail is in fact, the only, um, jointly engineered HCI system in the market, that's doing the engineering with VMware, for VMware to enhance VMware environments. Uh, and so what we've done is we, uh, we have a pre-integrated, uh, full stack experience that we're providing the customer from deployment, uh, to, uh, again, to everyday operations, to making changes, et cetera. Uh, we've essentially what we've done here, um, is that we've, we've taken again, that, that burden off of customers, uh, and allowed them to spend more time innovating, uh, and less, you know, less time integrating >>That sounds good to everyone, right? Simplifying less time to troubleshoot more time to be able to be strategic and innovative, especially in such a rapidly changing world toy overview now, Oh, go ahead, John, >>To add to that, you know, we've seen a real acceleration this year to digital transformation, to your point earlier, just with remote everything. And I think a lot of the projects, and so including a shift that we've seen to consuming infrastructure overall, whether, you know, and that's, that's the, the onset of the cloud and wherever that cloud might be, right. It could be on prem, could it be on premises, could be off premises. Um, and so, you know, that focused on consuming infrastructure versus in that preference for consuming infrastructure versus building and maintaining it, that's something that we're going to continue to see accelerate over time. >>You're right. That digital transformation acceleration has been one of the biggest topics in the last seven months and looking at which businesses really are set up and have the foundation and the culture to be able to make those changes quickly, to not just survive in this environment, but win tomorrow. So talk over to you for a second, in terms of, of the edge. What are your thoughts on as a partner, with VxRail, you've got a solution built on it. 
What are your thoughts about what VxRail is going to be able to deliver, and enable you to deliver, at the edge? You know, you gave us that great example of the Air Force Reserve, but what are Iron Bow's thoughts there? What do you envision going forward? >>John talks about tens, hundreds, thousands of different sites that all need their data; they all need processing and compute. But those types of sites don't necessarily need to have IT staff on site. A great example is the Army Corps of Engineers. They have to have one or two people out at every dam to monitor the dam, but that doesn't mean it justifies an IT staffer out there with them. So the idea is to remotely manage that VxRail. They're just industry leaders in the ability to deploy this somewhere where there's not an IT person and be able to manage it, and not just manage it: predictive analysis on when they're starting to run out of storage, alerts so that we can start the upgrade. >>John, talk to us about the engagement that you're expecting your customers to have with Dell Technologies during this virtual event. >>Absolutely. First of all, yeah, virtual is different, but there are a lot of advantages to that. One of them is that we can have an ongoing dialogue during a number of the sessions that we have. While some of the sessions might be prerecorded, there are live chats all the way through, including a number of breakouts on VxRail specifically, as well as the edge, as well as a number of different topics, as you can imagine. We've also just launched a new game, a fun mobile game called Data Center Sim, where customers can have some fun learning about VxRail and the experience of balancing the budget, staffing, and capacity to address the needs of the business. So we're always looking for fun and engaging ways to experience the real-life benefits of our HCI platforms, such as VxRail.
And so customers can check that out as well by searching their app store of choice for Dell Technologies Data Center Sim, and have at it and have some fun. But again, whether it's playing the game, trying the augmented reality experience, or connecting directly with any of our subject matter experts, there's going to be a lot of opportunity to learn more about how VxRail and HCI can help our customers thrive. >>Excellent. I like that game idea. Well, Troy, John, thank you for joining me today and letting me know what you guys are doing with VxRail, what's coming at the edge, and the fact that the use cases are just going to proliferate. We appreciate your time. Thank you as well. For my guests, I'm Lisa Martin. You're watching theCUBE's coverage of Dell Technologies World 2020.

Published Date : Oct 21 2020



Troy Massey & John Siegal V1


 

>> Instructor: From around the globe, it's theCUBE, with digital coverage of Dell Technologies World, the digital experience, brought to you by Dell Technologies. >> Hi, welcome to theCUBE's coverage of Dell Technologies World 2020, the virtual experience. I am Lisa Martin, and I've got a couple of guests joining me. One of them is a longtime CUBE alumni: Jon Siegel is back, the VP of product marketing for Dell Technologies. Jon, it's great to see you. >> Great to be back (indistinct), thank you. >> And also joining us is Troy Massey, the director of enterprise engagements from Iron Bow Technologies. Troy, welcome to theCUBE. >> Hi, thank you for having me. >> So we're going to be talking about VxRail and how it's driving the future of HCI to the edge, but first let's get Troy's perspective. I would like the audience to understand who Iron Bow Technologies is and what you do, and then we'll look at what you're doing with VxRail, as well as your channel partner business with Dell Technologies. So Troy, take it away. >> Yeah, so Iron Bow is a global company; we're a value-added reseller, having partnered with Dell. We have people physically living from Europe all the way through Korea, spread across the globe, primarily wherever there are DOD or federal government agencies. >> And tell me, from a channel partner perspective, what are you guys doing together? >> Yeah, so we have a lot of efforts going on as channel partners together. Specifically, Iron (murmurs) is a huge part of what we're doing together. It's an on-premises cloud whose basis is VxRail, with VMware Cloud Foundation on top and Intel all throughout, so there are Intel Xeon processors with Optane drives. So just the perfect, elegant on-prem cloud and hybrid cloud solution that Dell and Iron Bow are driving together. >> So let's talk about the edge, because a big focus of Dell Technologies World this year is the edge. How do you see Iron Bow extending services to the edge?
And what do you anticipate from your customers in terms of what their needs are as they're changing? >> Great question. So for one, I've got to talk a little bit about what the edge is, because the edge is different things to different people. So I'm going to explain a little bit of the edge and what we're seeing in the federal government. I'll give you one example, and that's, you know, (indistinct). So they have an entire squadron that does all of the firefighting for the large fires you see across California or whatever state is engulfed in fires that year, where they take the entire squadron of airplanes out. When they go to water the whole fire, they don't just bring planes; they bring entire squadrons of military personnel to help communicate with the police and with the local fire department, and all of that takes information. So they need to bring information, data, with them. Is there a building over there? Do people live over there? Where do we actually need to concentrate on site, and what is higher priority? So it doesn't make a lot of sense to try to do that remotely over satellite; it's large, large chunks of data that need to be local to the customer. So VxRail is the power in the HCI world, so a VxRail at that edge provides the performance I need to get that job (indistinct). >> I think that's going to be a new segment here in Silicon Valley, thinking about all the fires we've had, that it's really VxRail at the edge that's helping fight the fires. (murmurs) Thanks for sharing that. (indistinct chatting) >> So there's all kinds of workings in that same deal. They need to know where to go rescue those people, and it's all data.
>> Exactly. It's going to be data that, as you said, is not delayed being sent over the wire, but can be transmitted in real time so that actions can be taken, which is one of the things we talk about with data all the time: you have to be able to get the insight and act on it quickly. So Jon, the edge is a big part of the theme of this year's virtual Dell Technologies World. So talk to us about driving the future of HCI at the edge with VxRail; there's been a lot of growth, I think 9,600-plus customers so far. So talk to us about the future of HCI at the edge with VxRail as a driver of that. >> Absolutely. So first of all, I want to thank Iron Bow for being one of our nearly 10,000 customers for VxRail. Overall, the edge is going to be a major theme for Dell Tech World this week, and specifically for VxRail. We of course continue to play a key role with VxRail in modernizing data centers as well as hybrid cloud, and this week we really want to highlight some of the recent innovations around extending the simplified operations of VxRail, which many like Iron Bow and others are experiencing today in the core or in the cloud, by extending that automation to the edge. And you heard a lot about what the edge can do and the implications and the value of the edge. While we have lots of customers today, including (indistinct), that are using VxRail at their edge locations, we have others like large retail and home-improvement chains and financial institutions. We expect the edge to soon explode; we like to think that we are at the edge of the edge opportunity in IT. In fact, IDC recently stated that by 2023, over 50% of new enterprise data is going to be generated outside the core data center and outside the cloud. That's up from 10% today. So this is massive. Edge locations, of course, come with their own challenges, whether it's sometimes less-than-ideal conditions around power and cooling, or they may not have typically skilled IT staff at the edge, right? So they need new special configurations, and they need operational efficiencies, and I think VxRail is uniquely positioned to help address that. >> Let's dig into these operational challenges, because in the last seven months so much of what we all do has become remote, and a good amount of that is probably going to be permanent, right? So when you think about the volume of remote devices that VxRail can potentially manage, Jon, how do you see VxRail being able to help in this sort of very distributed environment that might very well be permanent? >> Yeah, and like you said, the distributed environment is going to just grow and grow, and what that means for each company might be slightly different, but regardless, what they do need is seamless operations across all of those different edge locations, and again, that's a big focus for us. So we're really doing three things to extend the automated operations of VxRail to the edge, and doing so at scale. The first thing I want to talk about is that we did unveil two VxRail platforms designed specifically for the edge: the new VxRail E Series, which is ideal for remote office locations where space is limited, and the VxRail D Series, think of D as in durable, which is our ruggedized platform built from the ground up for harsh environments, such as DOD environments in the desert. Both of these VxRail platforms are fully automated; they automate everything from deployment to expansions to lifecycle management overall. And extending that automation to the edge from an operational perspective is the second thing we're doing. What we're doing first and foremost is introducing a new software-as-a-service multi-cluster management capability.
This is part of the VxRail HCI System Software that we deliver today as part of VxRail. This not only provides a global view of infrastructure performance and capacity analysis across all the locations, but even more importantly, it actively ensures that all the clusters in the remote locations always stay in a continuously validated state. This means that it can automatically determine which software components need to be upgraded, and also automatically execute the full-stack upgrades, right? Without any technical expertise at the site, it can be done centrally, further automating the lifecycle management process that we do at the core and the cloud, and now extending out to the edge. So imagine the operational efficiencies for customers with tens or hundreds or even thousands of edge sites. This is, we think, truly a game changer from that perspective. And then in addition to that, we're also adding support for VCF on VxRail. Just at VMworld a couple of weeks ago, VMware announced remote edge cluster support for VCF, so those customers that run VCF on VxRail can now enjoy a consistent cloud operating model for those edge locations. So, in summary, you're getting consistency and you're getting automation regardless of where your VxRail is located. >> And this is something that I saw in the notes, Jon, described as a curated experience. Can you describe what that is? If I think of reference architectures and things like that, what is a curated experience and how is it different? >> Yeah, a curated experience for VxRail, really what it is, it's about seamlessness. It's about us taking the burden, if you will, of integrating infrastructure off of the customer's shoulders and onto ours, right?
So what we do is we ensure VxRail is, in fact, the only jointly engineered HCI system in the market that's doing the engineering with VMware, for VMware, to enhance VMware environments. And so what we've done is provide a pre-integrated, full-stack experience for the customer, from deployment to everyday operations to making changes, et cetera. Essentially, we've taken that burden off our customers and allowed them to spend more time innovating and less time integrating. >> That sounds good to everyone, right? Simplifying: less time to troubleshoot, more time to be strategic and innovative, especially in such a rapidly changing world. Troy, over to you now. Oh, go ahead, Jon. >> Just to add to that, you know, we've seen a real acceleration this year of digital transformation, to your point earlier, just with remote everything. And I think that includes a shift we've seen toward consuming infrastructure overall, and that's the onset of the cloud, wherever that cloud might be, right? It could be on premises, it could be off premises. And so that preference for consuming infrastructure versus building and maintaining it is something that we're going to continue to see accelerate over time. >> You're right, that digital transformation acceleration has been one of the biggest topics in the last seven months, and looking at which businesses really are set up, and have the foundation and the culture, to be able to make those changes quickly, to not just survive in this environment, but win tomorrow. So Troy, over to you for a second, in terms of the edge. What are your thoughts as a partner? With VxRail, you've got a solution built on it. What are your thoughts about what VxRail is going to be able to deliver, and enable you to deliver, at the edge?
You know, you gave us that great example of the Air Force Reserve, but what are Iron Bow's thoughts there? What do you envision going forward? >> Yeah, absolutely, thank you. First, to expand a little bit on the picture Jon was painting: you talked about tens, hundreds, even thousands of different sites that all need their data; they all need processing and compute. But those types of sites don't necessarily need to have IT staff on site. (murmurs) The Army Corps of Engineers, for example: they have to have one or two people out at every dam to monitor the dam, but that doesn't mean it justifies an IT staffer out there with them. So the idea is to remotely manage that VxRail. They're just industry leaders in the ability to deploy this somewhere where there's not an IT person and be able to manage it, and not just manage it: predictive analysis on when they're starting to run out of storage, alerts so that we can start the upgrade. So we see that as part of our path forward with our on-premises cloud: the ability to get people back to doing their jobs and away from doing IT. >> And Jon, I'm curious what your thoughts, and Dell Tech's thoughts, are about some of these really interesting DOD use cases that Troy talked about, really compelling. What do you see in terms of influence into the enterprise space or the consumer space, as the world is so different now, as we go into 2021? >> Yeah, I mean, as I mentioned earlier, we're talking a lot about the edge today, and I think what I described earlier, the trend toward a preference to consume infrastructure versus build and maintain it, is something that we're seeing, of course, across highly distributed environments more and more now.
And I think the use cases are going to continue to expand, whether it's financial institutions with edge offices spread across the world, or manufacturing, or logistics companies, et cetera. Retail is another great example: we have a number of retail companies that are going to leverage the edge and want to have data processed and, more importantly, analyzed at the edge to make more informed decisions more quickly. And that's just the beginning. Obviously the automotive industry is another one that frequently comes up; it's something that can take full advantage of the edge, where decisions need to be made in real time at the edge. So I think the use cases are endless, and the promise is just beginning. We're really excited to help companies of all shapes and sizes really thrive in this new world. >> Speaking of excitement, I'm sure it's exciting, and obviously challenging, to not be able to gather in one place in Las Vegas with what, 14,000 or so folks, including many, many partners. Jon, talk to us about the engagement that you're expecting your customers to have with Dell Technologies during this virtual event. >> Absolutely. First of all, yeah, virtual is different, but there are a lot of advantages to that. One of them is that we can have an ongoing dialogue during a number of the sessions that we have. While some of the sessions might be prerecorded, there are live chats all the way through, including a number of breakouts on VxRail specifically, as well as the edge, as well as a number of different topics, as you can imagine. We've also just launched a new game, a fun mobile game called Data Center Sim, where customers can have some fun learning about VxRail and the experience of balancing the budget, staffing, and capacity to address the needs of the business.
So we're always looking for fun and engaging ways to experience the real-life benefits of our HCI platform, such as VxRail. And so customers can check that out as well by searching their app store of choice for Dell Technologies Data Center Sim, and have at it and have some fun. But again, whether it's playing the game, trying the augmented reality experience, or connecting directly with any of our subject matter experts, there's going to be a lot of opportunity to learn more about how VxRail and HCI can help our customers thrive. >> Excellent, I like that game idea. Well, Troy and Jon, thank you for joining me today and letting me know what you guys are doing with VxRail, what's coming at the edge, and the fact that the use cases are just going to proliferate. We appreciate your time. >> Thank you as well. >> For my guests, I'm Lisa Martin. You're watching theCUBE's coverage of Dell Technologies World 2020. (upbeat music)

Published Date : Oct 13 2020



4-video test


 

>>Okay, this is my presentation on coherent nonlinear dynamics and combinatorial optimization. This is going to be a talk to introduce an approach we're taking to the analysis of the performance of coherent Ising machines. So let me start with a brief introduction to Ising optimization. The Ising model represents a set of interacting magnetic moments, or spins, with the total energy given by the expression shown at the bottom left of this slide. Here the spin variables s_i take binary values, the matrix element J_ij represents the interaction strength and sign between any pair of spins i and j, and h_i represents a possible local magnetic field acting on each spin. The Ising ground-state problem is to find an assignment of binary spin values that achieves the lowest possible value of total energy, and an instance of the Ising problem is specified by giving numerical values for the matrix J and vector h. Although the Ising model originates in physics, we understand the ground-state problem to correspond to what would be called quadratic binary optimization in the field of operations research, and in fact, in terms of computational complexity theory, it can be established that the Ising ground-state problem is NP-complete. Qualitatively speaking, this makes the Ising problem a representative sort of hard optimization problem, for which it is expected that the run time required by any computational algorithm to find exact solutions should asymptotically scale exponentially with the number of spins n for worst-case instances at each n. Of course, there's no reason to believe that the problem instances that actually arise in practical optimization scenarios are going to be worst-case instances, and it's also not generally the case in practical optimization scenarios that we demand absolute optimum solutions.
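A minimal sketch of the problem just defined can make it concrete. The code below assumes one common sign convention, E = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i (conventions vary, and the slide's exact expression is not reproduced in the transcript); the brute-force search is included only to illustrate why exact solving scales exponentially with the number of spins.

```python
# Minimal Ising energy evaluation for spins s_i in {-1, +1}.
# Assumed convention: E = -sum_{i<j} J[i][j]*s[i]*s[j] - sum_i h[i]*s[i];
# signs vary across the literature.
from itertools import product

def ising_energy(J, h, s):
    n = len(s)
    pair = -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    field = -sum(h[i] * s[i] for i in range(n))
    return pair + field

def brute_force_ground_state(J, h):
    """Exhaustive search over 2^n configurations: fine for tiny n, exponential in general."""
    n = len(h)
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(J, h, s))

# Two ferromagnetically coupled spins (J > 0), no field: aligned spins minimize E.
J = [[0, 1], [1, 0]]
h = [0, 0]
print(brute_force_ground_state(J, h))
```

For the two-spin ferromagnet the ground state is one of the two aligned configurations, which is the threshold behavior the coupled-OPO discussion later in the talk exploits.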
Usually we're more interested in just getting the best solution we can within an affordable cost, where cost may be measured in terms of time, service fees, and/or energy required for a computation. This focuses great interest on so-called heuristic algorithms for the Ising problem and other NP-complete problems, which generally get very good but not guaranteed-optimum solutions and run much faster than algorithms that are designed to find absolute optima. To get some feeling for present-day numbers, we can consider the famous traveling salesman problem, for which extensive compilations of benchmarking data may be found online. A recent study found that the best known TSP solver required median run times, across the library of problem instances, that scaled as a very steep root exponential for n up to approximately 4,500. This gives some indication of the change in run-time scaling for generic, as opposed to worst-case, problem instances. Some of the instances considered in this study were taken from a public library of TSPs derived from real-world VLSI design data. This VLSI TSP library includes instances with n ranging from 131 to 744,710. Instances from this library with n between 6,880 and 13,584 were first solved just a few years ago, in 2017, requiring days of run time on a 48-core 2-GHz cluster, while instances with n greater than or equal to 14,233 remain unsolved exactly by any means. Approximate solutions, however, have been found by heuristic methods for all instances in the VLSI TSP library, with, for example, a solution within 0.14% of a known lower bound having been discovered for the instance with n = 19,289, requiring approximately two days of run time on a single 2.4-GHz core.
Now, if we simple-mindedly extrapolate the root-exponential scaling from the study beyond n of approximately 4,500, we might expect that an exact solver would require something more like a year of run time on the 48-core cluster used for the n = 13,584 instance, which shows how much a very small concession on the quality of the solution makes it possible to tackle much larger instances with much lower cost. At the extreme end, the largest TSP ever solved exactly has n = 85,900. This is an instance derived from a 1980s VLSI design, and it required 136 CPU-years of computation, normalized to a single 2.4-GHz core. But the much larger so-called World TSP benchmark instance, with n = 1,904,711, has been solved approximately, with an optimality gap bounded below 0.474%. Coming back to the general practical concerns of applied optimization, we may note that a recent meta-study analyzed the performance of no fewer than 37 heuristic algorithms for Max-Cut and quadratic binary optimization problems, and found that different heuristics work best for different problem instances selected from a large-scale heterogeneous test bed, with some evidence of cryptic structure in terms of which types of problem instances were best solved by any given heuristic. Indeed, there are reasons to believe that these results on Max-Cut and quadratic binary optimization reflect a general principle of performance complementarity among heuristic optimization algorithms. In the practice of solving hard optimization problems, there thus arises a critical pre-processing issue of trying to guess which of a number of available good heuristic algorithms should be chosen to tackle a given problem instance. Assuming that any one of them would incur high costs to run on a large problem instance, making an astute choice of heuristic is a crucial part of maximizing overall performance.
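The extrapolation argument above can be sketched numerically: fit t(n) = a * exp(b * sqrt(n)) through two anchor points and evaluate it at a larger n. The anchor values below are purely illustrative, not the fitted constants from the benchmarking study.

```python
# Worked sketch of root-exponential run-time extrapolation, t(n) = a*exp(b*sqrt(n)).
# Anchor values are illustrative only, not the study's fitted constants.
import math

def fit_root_exponential(n1, t1, n2, t2):
    """Solve t = a*exp(b*sqrt(n)) through two (n, t) data points."""
    b = math.log(t2 / t1) / (math.sqrt(n2) - math.sqrt(n1))
    a = t1 / math.exp(b * math.sqrt(n1))
    return a, b

# Illustrative anchors: 1 second at n = 1,000 and 1 hour at n = 4,500.
a, b = fit_root_exponential(1_000, 1.0, 4_500, 3_600.0)

# Extrapolated run time at n = 13,584 (the largest exactly solved VLSI instance).
t = a * math.exp(b * math.sqrt(13_584))
print(f"~{t / 86_400 / 365:.1f} years")
```

Even with modest anchors, the root exponential pushes the extrapolated cost to years, which is the point of the "small concession in quality, huge reduction in cost" argument.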
Unfortunately, we still have very little conceptual insight into what makes a specific problem instance good or bad for any given heuristic optimization algorithm. This has certainly been pinpointed by researchers in the field as a circumstance that must be addressed. So, adding this all up, we see that a critical frontier for cutting-edge academic research involves both the development of novel heuristic algorithms that deliver better performance with lower cost on classes of problem instances that are underserved by existing approaches, as well as fundamental research to provide deep conceptual insight into what makes a given problem instance easy or hard for such algorithms. In fact, these days, as we talk about the end of Moore's law and speculate about a so-called second quantum revolution, it's natural to talk not only about novel algorithms for conventional CPUs but also about highly customized special-purpose hardware architectures on which we may run entirely unconventional algorithms for combinatorial optimization, such as the Ising problem. So against that backdrop, I'd like to use my remaining time to introduce our work on the analysis of coherent Ising machine architectures and associated optimization algorithms. These machines, in general, are a novel class of information-processing architectures for solving combinatorial optimization problems by embedding them in the dynamics of analog, physical, or cyber-physical systems, in contrast both to more traditional engineering approaches that build Ising machines using conventional electronics and to more radical proposals that would require large-scale quantum entanglement. The emerging paradigm of coherent Ising machines leverages coherent nonlinear dynamics in photonic or optoelectronic platforms to enable near-term construction of large-scale prototypes that leverage post-CMOS information dynamics. The general structure of current CIM systems is shown in the figure on the right.
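The performance-complementarity principle discussed above suggests a simple baseline strategy: absent any insight into which heuristic suits an instance, run a small portfolio of cheap heuristics and keep the best result. Below is a toy sketch of that idea on an Ising-style objective; the two heuristics are illustrative stand-ins, not the 37 from the meta-study.

```python
# Toy "algorithm portfolio" on an Ising-style objective: run several cheap
# heuristics and keep the best answer. Heuristics here are illustrative stand-ins.
import random

def objective(J, s):
    # Ising-style energy to minimize (no field term): -sum_{i<j} J[i][j]*s[i]*s[j]
    n = len(s)
    return -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))

def random_restart(J, tries=50, seed=0):
    """Best of `tries` uniformly random spin configurations."""
    rng = random.Random(seed)
    n = len(J)
    return min(([rng.choice([-1, 1]) for _ in range(n)] for _ in range(tries)),
               key=lambda s: objective(J, s))

def greedy_flip(J):
    """Single-spin-flip descent from the all-up configuration."""
    s = [1] * len(J)
    improved = True
    while improved:
        improved = False
        for i in range(len(s)):
            before = objective(J, s)
            s[i] = -s[i]
            if objective(J, s) < before:
                improved = True
            else:
                s[i] = -s[i]  # revert non-improving flip
    return s

def portfolio(J):
    return min((random_restart(J), greedy_flip(J)), key=lambda s: objective(J, s))

J = [[0, 1, -1], [1, 0, 1], [-1, 1, 0]]  # a frustrated 3-spin triangle
best = portfolio(J)
print(best, objective(J, best))
```

Neither heuristic dominates across instances; the portfolio simply inherits the better of the two, which is the pragmatic response to complementarity when no per-instance selector is available.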
The role of the Ising spins is played by a train of optical pulses circulating around a fiber-optic storage ring. A beam splitter inserted in the ring is used to periodically sample the amplitude of every optical pulse, and the measurement results are continually read into an FPGA, which uses them to compute perturbations to be applied to each pulse by synchronized optical injections. These perturbations are engineered to implement the spin-spin coupling and local magnetic field terms of the Ising Hamiltonian, corresponding to a linear part of the CIM dynamics. A synchronously pumped parametric amplifier, denoted here as a PPLN waveguide, adds a crucial nonlinear component to the CIM dynamics as well. In the basic CIM algorithm, the pump power starts very low and is gradually increased. At low pump powers, the amplitudes of the Ising spin pulses behave as continuous complex variables, whose real parts, which can be positive or negative, play the role of soft or perhaps mean-field spins. Once the pump power crosses the threshold for parametric self-oscillation in the optical fiber ring, however, the amplitudes of the Ising spin pulses become effectively quantized into binary values. While the pump power is being ramped up, the FPGA subsystem continuously applies its measurement-based feedback implementation of the Ising Hamiltonian terms. The interplay of the linearized Ising dynamics implemented by the FPGA and the threshold quantization dynamics provided by the synchronously pumped parametric amplifier results in a final state of the optical pulse amplitudes, at the end of the pump ramp, that can be read out as a binary string, giving a proposed solution of the Ising ground-state problem.
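The pump-ramp dynamics just described are often modeled in simplified mean-field form by equations like dx_i/dt = (p - 1 - x_i^2) x_i + eps * sum_j J_ij x_j, with the pump p ramped through threshold and the final signs of x_i read out as spins. The sketch below integrates that toy model; it is a textbook-style caricature, not the actual FPGA measurement-feedback loop, and every parameter value is illustrative.

```python
# Toy mean-field CIM: soft spin amplitudes x_i evolve under
#   dx_i/dt = (p - 1 - x_i^2) * x_i + eps * sum_j J[i][j] * x_j
# while the pump p ramps from 0 to 2 (through threshold p = 1); final signs
# give the proposed Ising spins. Simplified caricature, not the real hardware.
import random

def cim_solve(J, steps=4000, dt=0.01, eps=0.2, seed=1):
    rng = random.Random(seed)
    n = len(J)
    x = [rng.gauss(0.0, 1e-3) for _ in range(n)]  # near-vacuum initial amplitudes
    for step in range(steps):
        p = 2.0 * step / steps  # linear pump ramp
        injection = [eps * sum(J[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [xi + dt * ((p - 1.0 - xi * xi) * xi + fi)
             for xi, fi in zip(x, injection)]
    return [1 if xi > 0 else -1 for xi in x]

# Two ferromagnetically coupled pulses: the machine should lase in phase.
J = [[0, 1], [1, 0]]
print(cim_solve(J))
```

Because the in-phase mode has the higher gain for ferromagnetic coupling, the two amplitudes grow with the same sign and the readout gives aligned spins, mirroring the ground state of the two-spin ferromagnet.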
This method of solving Ising problems seems quite different from a conventional algorithm that runs entirely on a digital computer, as a crucial aspect of the computation is performed physically by the analog, continuous, coherent, nonlinear dynamics of the optical degrees of freedom. In our efforts to analyze CIM performance, we have therefore turned to the tools of dynamical systems theory, namely a study of bifurcations, the evolution of critical points, and topologies of heteroclinic orbits and basins of attraction. We conjecture that such analysis can provide fundamental insight into what makes certain optimization instances hard or easy for coherent Ising machines, and hope that our approach can lead both to improvements of the core CIM algorithm and to a preprocessing rubric for rapidly assessing the CIM suitability of new instances. Okay, to provide a bit of intuition about how this all works, it may help to consider the threshold dynamics of just one or two optical parametric oscillators in the CIM architecture just described. We can think of each of the pulse time slots circulating around the fiber ring as representing an independent OPO. We can think of a single OPO degree of freedom as a single resonant optical mode that experiences linear dissipation, due to outcoupling loss, and gain in a pumped nonlinear crystal, as shown in the diagram on the upper left of this slide. As the pump power is increased from zero, as in the CIM algorithm, the nonlinear gain is initially too low to overcome linear dissipation, and the OPO field remains in a near-vacuum state. At a critical threshold value, gain equals dissipation, and the OPO undergoes a sort of lasing transition; the steady states of the OPO above this threshold are essentially coherent states.
There are actually two possible values of the OPO coherent amplitude at any given above-threshold pump power, which are equal in magnitude but opposite in phase. When the OPO crosses this transition, it basically chooses one of the two possible phases randomly, resulting in the generation of a single bit of information. If we consider two uncoupled OPOs, as shown in the upper right diagram, pumped at exactly the same power at all times, then as the pump power is increased through threshold, each OPO will independently choose a phase, and thus two random bits are generated. For any number of uncoupled OPOs, the threshold power per OPO is unchanged from the single-OPO case. Now, however, consider a scenario in which the two OPOs are coupled to each other by mutual injection of their outcoupled fields, as shown in the diagram on the lower right. One can imagine that, depending on the sign of the coupling parameter alpha, when one OPO is lasing, it will inject a perturbation into the other that may interfere either constructively or destructively with the field that it is trying to generate by its own lasing process. As a result, one can easily show that for alpha positive there is an effective ferromagnetic coupling between the two OPO fields, and their collective oscillation threshold is lowered from that of the independent-OPO case, but only for the two collective oscillation modes in which the two OPO phases are the same. For alpha negative, the collective oscillation threshold is lowered only for the configurations in which the OPO phases are opposite. So then, looking at how alpha is related to the J_ij matrix of the Ising spin-coupling Hamiltonian, it follows that we could use this simplistic two-OPO CIM to solve the ground-state problem of a ferromagnetic or antiferromagnetic n = 2 Ising model, simply by increasing the pump power from zero and observing what phase relation occurs as the two OPOs first start to lase.
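The two-OPO thought experiment above can likewise be simulated. This is a toy amplitude model under stated assumptions (simple cubic saturation and a symmetric injection coupling `alpha`); it only checks that the sign of alpha selects in-phase versus anti-phase oscillation, as described:

```python
import numpy as np

def coupled_opo_phases(alpha, p=1.5, steps=4000, dt=0.005, seed=1):
    """Two coupled degenerate OPOs above threshold (toy amplitude model).

    dx1/dt = (p - 1 - x1**2) * x1 + alpha * x2   (and symmetrically for x2)
    Returns +1 if the two oscillators end up in phase, -1 if anti-phase.
    """
    rng = np.random.default_rng(seed)
    x = 1e-3 * rng.standard_normal(2)      # start from noise
    for _ in range(steps):
        x += dt * ((p - 1.0 - x**2) * x + alpha * x[::-1])
    return int(np.sign(x[0] * x[1]))

# alpha > 0: "ferromagnetic" coupling, the in-phase mode lases first.
# alpha < 0: "antiferromagnetic" coupling, the anti-phase mode lases first.
```

The mode whose effective threshold is lowered by the coupling grows fastest from noise, so the final phase relation encodes the sign of alpha, mirroring the n = 2 Ising ground states.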
Clearly, we can imagine generalizing this story to larger n. However, the story doesn't stay as clean and simple for all larger problem instances, and to find a more complicated example we only need to go to n = 4. For some choices of J_ij at n = 4, the story remains simple, like the n = 2 case. The figure on the upper left of this slide shows the energy of various critical points for a non-frustrated n = 4 instance, in which the first bifurcated critical point, that is, the one that bifurcates at the lowest pump value, flows asymptotically into the lowest-energy Ising solution. In the figure on the upper right, however, the first bifurcated critical point flows to a very good but suboptimal minimum at large pump power. The global minimum is actually given by a distinct critical point that first appears at a higher pump power and is not adiabatically connected to the origin. The basic CIM algorithm is thus not able to find this global minimum. Such non-ideal behaviors seem to become more common at larger n. For the n = 20 instance shown in the lower plots, where the lower right plot is just a zoom into a region of the lower left plot, it can be seen that the global minimum corresponds to a critical point that first appears at a pump parameter around 0.16, at some distance from the adiabatic trajectory of the origin. It's curious to note that in both of these small-n examples, however, the critical point corresponding to the global minimum appears relatively close to the adiabatic trajectory of the origin, as compared to most of the other local minima that appear. We're currently working to characterize the phase portrait topology between the global minimum and the adiabatic trajectory of the origin, taking clues as to how the basic CIM algorithm could be generalized to search for non-adiabatic trajectories that jump to the global minimum during the pump ramp.
Of course, n = 20 is still too small to be of interest for practical optimization applications, but the advantage of beginning with the study of small instances is that we're able reliably to determine their global minima and to see how they relate to the adiabatic trajectory of the origin in the basic CIM algorithm. In the small-n limit, we can also analyze fully quantum mechanical models of CIM dynamics, but that's a topic for future talks. Existing large-scale prototypes are pushing into the range of n equal to 10^4, 10^5, or 10^6, so our ultimate objective in theoretical analysis really has to be to try to say something about CIM dynamics in the regime of much larger n. Our initial approach to characterizing CIM behavior in the large-n regime relies on the use of random matrix theory, and this connects to prior research on spin glasses, SK models, the TAP equations, et cetera. At present, we're focusing on statistical characterization of the CIM gradient-descent landscape, including the evolution of critical points and their eigenvalue spectra as the pump power is gradually increased. We're investigating, for example, whether there could be some way to exploit differences in the relative stability of the global minimum versus other local minima. We're also working to understand the deleterious, or potentially beneficial, effects of non-idealities, such as asymmetry in the implemented Ising couplings. Looking one step ahead, we plan to move next in the direction of considering more realistic classes of problem instances, such as quadratic binary optimization with constraints. So in closing, I should acknowledge the people who did the hard work on the things I've shown.
My group, including graduate students Edwin Ng, Daniel Wennberg, Tatsuya Nagamoto, and Atsushi Yamamura, has been working in close collaboration with Surya Ganguli, Marty Fejer, and Amir Safavi-Naeini, all of us within the Department of Applied Physics at Stanford University, and also in collaboration with Yoshihisa Yamamoto over at NTT PHI Research Labs. I should also acknowledge funding support from the NSF through the Coherent Ising Machines Expedition in Computing, and also from NTT PHI Research Labs, the Army Research Office, and ExxonMobil. That's it. Thanks very much. >>I'd like to thank NTT Research and the organizers for putting together this program, and also for the opportunity to speak here. My name is Alireza Marandi and I'm from Caltech, and today I'm going to tell you about the work that we have been doing on networks of optical parametric oscillators: how we have been using them for Ising machines, and how we're pushing them toward quantum photonics. I'd like to acknowledge my team at Caltech, which is now eight graduate students and five researchers and postdocs, as well as collaborators from all over the world, including NTT Research, and also the funding from different places, including NTT. So this talk is primarily about networks of resonators, and these networks are everywhere, from nature, for instance the brain, which is a network of oscillators, all the way to optics and photonics. Some of the biggest examples are metamaterials, which are arrays of small resonators, and more recently the field of topological photonics, which is trying to implement a lot of the topological behaviors of condensed matter models in photonics. And if you want to extend it even further, some of the implementations of quantum computing are technically networks of quantum oscillators.
So we started thinking about these things in the context of Ising machines, which are based on the Ising problem, which is based on the Ising model: a simple summation over the spins, where the spins can be either up or down and the couplings are given by the J_ij. The Ising problem is: if you know the J_ij, what is the spin configuration that gives you the ground state? This problem has been shown to be NP-hard. So it's computationally important because it's representative of the NP problems, and NP problems are important because, first, they're hard on standard computers if you use a brute-force algorithm, and second, they're everywhere on the application side. That's why there is this demand for making a machine that can target these problems, and hopefully provide some meaningful computational benefit compared to standard digital computers. So I've been building these Ising machines based on this building block, which is a degenerate optical parametric oscillator. What it is is a resonator with nonlinearity in it: we pump these resonators and we generate a signal at half the frequency of the pump. One photon of the pump splits into two identical photons of signal, and they have some very interesting phase- and frequency-locking behaviors. And if you look at the phase-locking behavior, you realize that you can actually have two possible phase states as the oscillation result of these OPOs, which are off by pi, and that's one of their important characteristics. So I want to emphasize that a little more, and I have this mechanical analogy, which is basically two simple pendulums. But they are parametric oscillators, because I'm going to modulate a parameter of them in this video, which is the length of the string, and that modulation acts as the pump: it makes them oscillate, producing a signal which is half the frequency of the pump.
And I have two of them, to show you that they can acquire these phase states. They're still phase- and frequency-locked to the pump, but each can land in either the zero or the pi phase state. The idea is to use this binary phase to represent the binary Ising spin, so each OPO is going to represent a spin, which can be either zero or pi, up or down. To implement the network of these resonators, we use a time-multiplexing scheme, and the idea is that we put pulses in the cavity. These pulses are separated by the repetition period that you put in, T_R, and you can think of these pulses in one resonator as temporally separated synthetic resonators. If you want to couple these resonators to each other, you can introduce delays, each of which is a multiple of T_R. If you look at the shortest delay, it couples resonator 1 to 2, 2 to 3, and so on. If you look at the second delay, which is two times the repetition period, it couples 1 to 3, and so on. And if you have n minus 1 delay lines, then you can have any potential couplings among these synthetic resonators. If I can introduce modulators in those delay lines, so that I can control the strength and the phase of these couplings at the right times, then I can have a programmable, all-to-all connected network in this time-multiplexed scheme, and the whole physical size of the system scales linearly with the number of pulses. So the idea of the OPO-based Ising machine is that we have these OPOs, each of which can be either zero or pi, and I can arbitrarily connect them to each other. I start by programming this machine to a given Ising problem, by just setting the couplings and setting the controllers in each of those delay lines. So now I have a network which represents an Ising problem. The Ising problem then maps to finding the phase state that satisfies the maximum number of coupling constraints.
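To make the delay-line picture concrete, here is a small sketch of how per-delay modulator settings assemble into a coupling matrix. The function name and the dictionary encoding of delay weights are my own illustration, not the actual hardware interface:

```python
import numpy as np

def coupling_from_delay_lines(delay_weights, n):
    """Assemble the n x n coupling matrix realized by time multiplexing.

    delay_weights maps a delay length d (in round-trip periods T_R) to a
    length-n array of modulator settings: the weight applied when pulse i
    enters that delay line, contributing a coupling from pulse i to pulse
    (i + d) mod n.  With n - 1 delay lines, any all-to-all coupling
    pattern can be programmed.
    """
    J = np.zeros((n, n))
    for d, w in delay_weights.items():
        for i in range(n):
            J[(i + d) % n, i] += w[i]
    return J

# 4 pulses: a one-period delay couples i -> i+1, a two-period delay i -> i+2.
n = 4
weights = {1: np.ones(n), 2: 0.5 * np.ones(n)}
J = coupling_from_delay_lines(weights, n)
```

Note the linear scaling the speaker mentions: the number of pulses sets the problem size, while the n - 1 delay lines (the `delay_weights` keys) provide full connectivity.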
And the way it happens is that the Ising Hamiltonian maps to the linear loss of the network. If I start adding gain by putting pump into the network, then the OPOs are expected to oscillate in the lowest-loss state. We have been doing this for the past six or seven years, and I'm just going to quickly show you the progression: what happened in the first implementation, which used a free-space optical system; then the guided-wave implementation in 2016; and the measurement-feedback idea, which led to increasing the size and doing actual computation with these machines. I just want to make the distinction here that the first implementation was an all-optical interaction. We also had an n = 16 implementation. And then we transitioned to this measurement-feedback idea, which I'll describe quickly. There's still a lot of ongoing work, especially on the NTT side, to make larger machines using measurement feedback, but I'm going to mostly focus on the all-optical networks: how we're using all-optical networks to go beyond simulation of the Ising Hamiltonian, on both the linear and the nonlinear side, and also how we're working on miniaturization of these OPO networks. So the first experiment, the four-OPO machine, was a free-space implementation, and this is an actual picture of the machine. We implemented a small n = 4 MAX-CUT problem on the machine, so one problem for one experiment, and we ran the machine 1000 times. We looked at the states, and we always saw it oscillate in one of the ground states of the Ising Hamiltonian. Then the measurement-feedback idea was to replace those couplings and the controller with a simulator. We basically simulated all those coherent interactions on an FPGA, and we replicated the coherent pulse with respect to all those measurements. Then we injected it back into the cavity, and the nonlinearity still remains.
So it still is a nonlinear dynamical system, but the linear side is all simulated. There are lots of questions about whether this system preserves the important information or not, or whether it will behave better computation-wise, and that's still a lot of ongoing study. Nevertheless, the reason this implementation was very interesting is that you don't need the n minus 1 delay lines; you can just use one. Then you can implement a large machine, run several thousands of problems on the machine, and compare the performance from the computational perspective. So I'm going to split this idea of the OPO-based Ising machine into two parts. One is the linear part: if you take the nonlinearity out of the resonator and just think about the connections, you can think of this as a simple matrix multiplication scheme, and that's basically what gives you the Ising Hamiltonian mapping. The optical loss of this network corresponds to the Ising Hamiltonian. If I just show you the example of the n = 4 experiment, with all those phase states and the histogram that we saw, you can actually calculate the loss of each of those states, because all the interferences in the beam splitters and the delay lines give each state a different loss, and then you will see that the ground states correspond to the lowest loss of the actual optical network. If you add the nonlinearity, the simple way of thinking about what the nonlinearity does is that it provides the gain. Then you start bringing up the gain so that it hits the loss, and you go through gain saturation, or threshold, which gives you this phase bifurcation, so you go to either the zero or the pi phase state. And the expectation is that the network oscillates in the lowest possible loss state.
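The loss-ranking idea for small n can be checked against exhaustive enumeration. Here is a minimal brute-force sketch of the mapping, with the O(2^n) cost being the point: it is exactly why heuristic Ising machines are interesting. The frustrated triangle instance is my own illustrative choice:

```python
import itertools
import numpy as np

def ising_energy(J, s):
    """E(s) = -1/2 * s^T J s for spins s_i in {-1, +1}."""
    s = np.asarray(s, dtype=float)
    return -0.5 * s @ J @ s

def brute_force_ground_state(J):
    """Exhaustively rank all 2**n spin configurations by energy.

    This is the digital analog of ranking the network's phase states by
    optical loss; the exponential cost of doing it this way is why
    special-purpose Ising machines are attractive.
    """
    n = J.shape[0]
    best_s, best_e = None, float("inf")
    for bits in itertools.product((-1, 1), repeat=n):
        e = ising_energy(J, bits)
        if e < best_e:
            best_s, best_e = np.array(bits), e
    return best_s, best_e

# Frustrated antiferromagnetic triangle: one bond must stay unsatisfied,
# so every ground state has two spins alike and one opposite.
J = -np.ones((3, 3)) + np.eye(3)
s, e = brute_force_ground_state(J)
```

In the optical network, the lowest-loss phase configuration that this enumeration identifies is the one expected to oscillate first as the gain is ramped.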
There are some challenges associated with this intensity-driven phase transition, which I'm going to briefly talk about, and I'm also going to tell you about other types of nonlinear dynamics that we're looking at on the nonlinear side of these networks. So if you just think about the linear network, we're actually interested in looking at some topological behaviors in these networks. The difference between looking at topological behaviors and at the Ising machine is that now, first of all, we're looking at types of Hamiltonians that are a little different from the Ising Hamiltonian. One of the biggest differences is that most of these topological Hamiltonians require breaking time-reversal symmetry, meaning that you go from one spin on one side to the other side and you get one phase, and if you go back you get a different phase. The other thing is that we're not just interested in finding the ground state; we're now interested in looking at all sorts of states, and at the dynamics and behaviors of all these states in the network. So we started with the simplest implementation, of course, which is a one-dimensional chain of these resonators, corresponding to the so-called SSH model in the topological literature. We get the similar energy-to-loss mapping, and now we can actually look at the band structure. This is an actual measurement that we get with this SSH model, and you see how well it actually follows the prediction and the theory. One of the interesting things about the time-multiplexing implementation is that you now have the flexibility of changing the network as you are running the machine, and that's something unique about this time-multiplexed implementation, so we can actually look at the dynamics. One example that we have looked at is that we can actually go through the transition from the topological to the trivial phase of the network.
You can then look at the edge states, and you can see both the trivial states and the topological edge states actually showing up in this network. We have just recently implemented a two-dimensional network with the Harper-Hofstadter model, and while I don't have the results here, one of the other important characteristics of time multiplexing is that you can go to higher and higher dimensions while keeping that flexibility and dynamics. We can also think about adding nonlinearity, in both the classical and quantum regimes, which will give us a lot of exotic nonclassical and quantum nonlinear behaviors in these networks. So I told you mostly about the linear side; let me switch gears and talk about the nonlinear side of the network. The biggest thing I've talked about so far in the Ising machine is the phase transition at threshold. Below threshold we have squeezed states in these OPOs; if you increase the pump, we go through this intensity-driven phase transition, and then we get the phase states above threshold. This is basically the mechanism of the computation in these OPOs: the phase transition from below to above threshold. So one of the characteristics of this phase transition is that below threshold you expect to see quantum states, and above threshold you expect to see more classical states, or coherent states, and that basically corresponds to the intensity of the driving pump. So it's really hard to imagine having this phase transition happen entirely in the quantum regime. There are also some challenges associated with the intensity homogeneity of the network: for example, if one OPO starts oscillating and its intensity goes really high, it will ruin the collective decision-making of the network, because of the intensity-driven nature of the phase transition.
So the question is: can we look at other phase transitions? Can we utilize them for computing? And can we bring them to the quantum regime? I'm going to specifically talk about a phase transition in the spectral domain, which is the transition from the so-called degenerate regime, which is what I've mostly talked about, to the non-degenerate regime, which happens by just tuning the phase of the cavity. What is interesting is that this phase transition corresponds to a distinct phase-noise behavior. In the degenerate regime, which we call the ordered state, the phase is locked to the phase of the pump, as I discussed. In the non-degenerate regime, however, the phase is mostly dominated by quantum diffusion of the phase, which is limited by the so-called Schawlow-Townes limit, and you can see the transition from the degenerate to the non-degenerate regime, which also has distinct symmetry differences. This transition corresponds to a symmetry breaking: in the non-degenerate case, the signal can acquire any of the phases on the circle, so it has a U(1) symmetry, and if you go to the degenerate case, that symmetry is broken and you only have the zero and pi phase states. So now the question is: can we utilize this phase transition, which is a phase-driven phase transition, for a similar computational scheme? That's one of the questions we're also thinking about. And this phase transition is not just important for computing; it's also interesting for its sensing potential. You can easily bring it below threshold and operate in the quantum regime, either Gaussian or non-Gaussian. If you make a network of OPOs, you can now see all sorts of more complicated and more interesting phase transitions in the spectral domain.
One of them is a first-order phase transition, which you get by just coupling two OPOs, and that's a very abrupt phase transition compared to the single-OPO phase transition. If you do the couplings right, you can actually get a lot of non-Hermitian dynamics and exceptional points, which are very interesting to explore in both the classical and quantum regimes. I should also mention that you can think of the couplings as being nonlinear couplings, and that's another behavior that you can see, especially in the non-degenerate regime. So with that, I've basically told you about these OPO networks: how we can think about the linear scheme and the linear behaviors, and how we can think about the rich nonlinear dynamics and nonlinear behaviors, in both the classical and quantum regimes. I want to switch gears and tell you a little bit about the miniaturization of these OPO networks. Of course, the motivation is that if you look at electronics, and at what we had 60 or 70 years ago with vacuum tubes, and how we transitioned from relatively small-scale computers with on the order of thousands of nonlinear elements to the billions of nonlinear elements we have now, then optics today is probably very similar to 70 years ago: a tabletop implementation. The question is, how can we utilize nanophotonics? I'm going to briefly show you the two directions we're working on. One is based on lithium niobate, and the other is based on even smaller resonators. So the work on nanophotonic lithium niobate was started in collaboration with Marko Loncar at Harvard, and also Marty Fejer at Stanford, and we could show that you can do periodic poling in thin-film lithium niobate and get all sorts of very highly nonlinear processes happening in this nanophotonic, periodically poled lithium niobate. And now we're working on building OPOs based on that kind of thin-film lithium niobate photonics.
And these are some examples of the devices that we have been building in the past few months, which I'm not going to tell you more about, but the OPOs and the OPO networks are in the works. And that's not the only way of making large networks. I also want to point out that the reason these nanophotonic platforms are actually exciting is not just that you can make large networks and make them compact in a small footprint; they also provide some opportunities in terms of the operation regime. One of them is about making cat states in an OPO: can we have the quantum superposition of the zero and pi states that I talked about? The nanophotonic lithium niobate provides some opportunities to actually get closer to that regime, because of the spatiotemporal confinement that you can get in these waveguides. We're doing some theory on that, and we're confident that the ratio of nonlinearity to losses that you can get with these platforms is actually much higher than what you can get with existing platforms. And to go even smaller, we have been asking the question of what is the smallest possible OPO that you can make. You can think about really wavelength-scale resonators, adding the chi(2) nonlinearity, and seeing how and when you can get the OPO to operate. Recently, in collaboration with USC and CREOL, we have demonstrated that you can use nanolasers and get some spin-Hamiltonian implementations on those networks. So if you can build the OPOs, we know that there is a path for implementing OPO networks on such a nanoscale. We have looked at the calculations and tried to estimate the threshold of such an OPO, say for a wavelength-scale resonator, and it turns out that it can actually be even lower than the type of bulk PPLN OPOs that we have been building in the past 50 years or so.
So we're working on the experiments, and we're hoping that we can actually make larger and larger scale OPO networks. So let me summarize the talk. I told you about the OPO networks and our work on Ising machines and the measurement feedback, and I told you about the ongoing work on the all-optical implementations, both on the linear side and on the nonlinear behaviors. And I also told you a little bit about the efforts on miniaturization and going to the nanoscale. With that, I would like to thank you. >>Hello, I'm from the University of Tokyo. Before I start, I would like to thank Yoshi and all the staff of NTT for the invitation and the organization of this online meeting, and I would also like to say that it has been very exciting to see the growth of this new PHI Lab. I'm happy to share with you today some of the recent work that has been done either by me or by collaborators. The title of my talk is "A neuromorphic in silico simulator for the coherent Ising machine," and here is the outline. I would like to make the case that the simulation in digital electronics of the CIM can be useful for better understanding or improving its function principles by introducing some ideas from neural networks. This is what I will discuss in the first part. Then I will show some proof of concept of the gains in performance that can be obtained using this simulation in the second part, and a projection of the performance that can be achieved using a very large-scale simulator in the third part, and finally talk about future plans. So first, let me start by comparing recently proposed Ising machines using this table, which is adapted from a recent Nature Electronics paper, and this comparison shows that there is always a trade-off between energy efficiency, speed, and scalability that depends on the physical implementation.
So in red here are the limitations of each of these hardware platforms, and interestingly, the FPGA-based systems, such as the Fujitsu Digital Annealer, the Toshiba Simulated Bifurcation Machine, or the recently proposed restricted Boltzmann machine on FPGA by a group in Berkeley, offer a good compromise between speed and scalability. This is why, despite the unique advantages that some of the other hardware have, such as the coherent superposition in flux qubits or the energy efficiency of memristors, FPGAs are still an attractive platform for building large Ising machines in the near future. The reason for the good performance of FPGAs is not so much that they operate at high frequency, nor that they are particularly energy efficient, but rather that the physical wiring of their elements can be reconfigured in a way that limits the von Neumann bottleneck, the large fan-in and fan-out, and the long propagation of information within the system. In this respect, FPGAs are interesting from the perspective of the physics of complex systems, rather than the physics of electrons and photons. To put the performance of these various hardware platforms in perspective, we can look at the computation done by the brain: the brain computes using billions of neurons, using only about 20 watts of power, and operates at very low frequency. These impressive characteristics motivate us to investigate what kind of neuro-inspired principles might be useful for designing better Ising machines. The idea of this research project, and of the future collaboration, is to try to alleviate the limitations that are intrinsic to the realization of an optical coherent Ising machine, shown in the top panel here.
The approach is to design a large-scale simulator in silicon, shown in the bottom panel here, that can be used for distilling better organization principles for the CIM. In this talk, I will discuss three neuro-inspired principles: the asymmetry of connections, neural dynamics that are often chaotic, and hierarchical connectivity. On the local structure: neural networks are not composed of the repetition of always the same types of elements; there is a local structure that is repeated, and here is a schematic of the microcolumn in the cortex. And lastly, the hierarchical organization of connectivity: connectivity is organized in a tree structure in the brain, and here you see a representation of the hierarchical organization of the monkey cerebral cortex. So how can these principles be used to improve the performance of Ising machines and their in silico simulation? First, about the two principles of asymmetry and chaotic dynamics. We know that the classical approximation of the coherent Ising machine, which is analogous to rate-based neural networks, can be obtained using, for example, the truncated Wigner approximation. The dynamics of both systems can then be described by the following ordinary differential equations, in which, in the case of the CIM, the x_i represent the in-phase component of one DOPO, the function f represents the degenerate optical parametric amplification, and the sum of ω_ij x_j terms represents the coupling, which is done, in the case of the measurement-feedback CIM, using homodyne detection, an FPGA, and then injection of the computed coupling terms. These dynamics, in both the case of the CIM and of neural networks, can be written as gradient descent of a potential function V, written here, and this potential function includes the Ising Hamiltonian.
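The gradient-descent statement can be verified numerically in a small sketch. The specific potential below is my own parameterization of the rate equations (cubic saturation, symmetric couplings W), not the speaker's exact formula:

```python
import numpy as np

def potential(x, W, p):
    """V(x) = sum_i (x_i**4 / 4 - (p - 1) * x_i**2 / 2) - 1/2 * x^T W x.

    The quadratic coupling term is the Ising Hamiltonian extended to
    continuous variables; V is only well defined if W is symmetric.
    """
    return np.sum(x**4 / 4 - (p - 1) * x**2 / 2) - 0.5 * x @ W @ x

def flow(x, W, p):
    """Rate-equation dynamics: dx/dt = (p - 1) * x - x**3 + W @ x."""
    return (p - 1) * x - x**3 + W @ x

def numerical_gradient(f, x, h=1e-6):
    """Central-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = h
        g[i] = (f(x + d) - f(x - d)) / (2 * h)
    return g

rng = np.random.default_rng(3)
W = rng.standard_normal((5, 5))
W = 0.5 * (W + W.T)                 # symmetrize, as the argument requires
x = rng.standard_normal(5)
grad_V = numerical_gradient(lambda y: potential(y, W, 1.2), x)
# The flow is exactly minus the gradient of V: dx/dt = -dV/dx.
```

This also makes the symmetry requirement concrete: with an asymmetric W, no such V exists, which is the loophole the error-variable scheme described next deliberately exploits.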
So this is why it's natural to use this type of dynamics to solve the Ising problem, in which the omega_ij are the Ising couplings and the h_i is the external field of the Ising Hamiltonian. Note that this potential function can only be defined if the omega_ij are symmetric. The well-known problem of this approach is that this potential function V that we obtain is very non-convex at low temperature, and so one strategy is to gradually deform this landscape, using an annealing process. But there is no theorem, unfortunately, that guarantees convergence to the global minimum of the Ising Hamiltonian using this approach. And so this is why we propose to introduce an asymmetric structure in the system, where one analog spin, one DOPO, is replaced by a pair of one analog spin and one error-correction variable. The addition of this structure introduces asymmetry in the system, which in turn induces chaotic dynamics, a chaotic search rather than an annealing process, for searching for the ground state of the Ising Hamiltonian. Within this structure, the role of the error variable e_i is to control the amplitude of the analog spins, to force the amplitude of the spins to become equal to a certain target amplitude a. And this is done by modulating the strength of the Ising couplings: the error variable e_i multiplies the Ising coupling term here in the dynamics of each DOPO. And then the whole dynamics described by these coupled equations, because the e_i do not necessarily take the same value for the different i, introduces asymmetry in the system, which in turn creates chaotic dynamics, which I show here for solving a certain instance of the SK problem, in which the x_i are shown here, the e_i are shown here, and the value of the Ising energy is shown in the bottom plot.
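Schematically, the coupled spin/error dynamics just described can be put into a few lines of code. This is a minimal sketch with explicit Euler integration; the parameter values (linear gain p, target amplitude a, error rate beta) and the clipping guard rails are my own illustrative choices, not the values used in the talk:

```python
import numpy as np

def simulate_cim_ec(J, a=1.0, beta=0.3, p=1.1, dt=0.01, steps=5000, seed=0):
    """Mean-field CIM dynamics with error-correction variables e_i that
    modulate the Ising coupling strength (illustrative explicit Euler)."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    x = 0.01 * rng.standard_normal(n)        # in-phase DOPO amplitudes
    e = np.ones(n)                           # error-correction variables
    best = np.inf
    for _ in range(steps):
        feedback = J @ x                     # the matrix-vector product bottleneck
        x = x + dt * ((-1.0 + p - x**2) * x + e * feedback)
        e = e + dt * (-beta * e * (x**2 - a))  # push |x_i| toward the target a
        x = np.clip(x, -3.0, 3.0)            # guard rails for the Euler step
        e = np.clip(e, 0.0, 5.0)
        s = np.where(x >= 0, 1.0, -1.0)      # rounded spins
        best = min(best, -0.5 * s @ J @ s)   # Ising energy; track the best seen
    return s, best

rng = np.random.default_rng(1)
J = rng.choice([-1.0, 1.0], size=(16, 16))
J = np.triu(J, 1) + np.triu(J, 1).T          # symmetric couplings, zero diagonal
spins, best_energy = simulate_cim_ec(J)
```

Because the e_i evolve independently, the effective coupling matrix e_i * omega_ij is no longer symmetric, which is exactly the asymmetry that turns the relaxation into a chaotic search.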
You see this chaotic search that visits various local minima of the Hamiltonian and eventually finds the global minimum. It can be shown that this modulation of the target amplitude can be used to destabilize all the local minima of the Ising Hamiltonian, so that the dynamics does not get stuck in any of them. Moreover, the other types of attractors that can eventually appear, such as limit-cycle attractors or chaotic attractors, can also be destabilized using the modulation of the target amplitude. And so we have proposed in the past two different modulations of the target amplitude. The first one is a modulation that ensures that the entropy production rate of the system becomes positive, and this forbids the creation of any nontrivial attractors. But in this work I will talk about another, heuristic modulation, which is given here, that works as well as this first modulation but is easier to implement on FPGA. So these coupled equations, which represent the classical simulation of the coherent Ising machine with some error correction, can be implemented especially efficiently on an FPGA. And here I show the time that it takes to simulate the system; in red you see the time that it takes to simulate the x_i term, the e_i term, the dot product, and the Ising Hamiltonian, for a system with 500 spins and error variables, equivalent to 500 DOPOs. So on FPGA, the nonlinear dynamics, which corresponds to the degenerate optical parametric amplification, the OPA of the CIM, can be computed in only 13 clock cycles at 300 MHz, which corresponds to about 0.1 microseconds. And this is to be compared to what can be achieved in the measurement-feedback CIM, in which, if we want to get 500 time-multiplexed DOPOs with a 1 GHz repetition rate through the optical nonlinearity, then we would require 0.5 microseconds to do this. So the simulation on FPGA can be at least as fast as a 1 GHz repetition-rate pulsed-laser CIM.
Then the dot product that appears in this differential equation can be computed in 43 clock cycles, that's to say, about 0.14 microseconds. So at least for problem sizes that are larger than 500 spins, the dot product clearly becomes the bottleneck, and this can be seen by looking at the scaling of the number of clock cycles it takes to compute either the nonlinear optical part or the dot product with respect to the problem size. And if we had an infinite amount of resources on the FPGA to simulate the dynamics, then the nonlinear optical part could be done in O(1), and the matrix-vector product could be done in O(log N), that is, it scales as the logarithm of N, because computing the dot product involves summing all the terms in the product, which is done on an FPGA by an adder tree, whose height scales logarithmically with the size of the system. But this is in the case where we have an infinite amount of resources on the FPGA. For dealing with larger problems, of more than 100 spins, usually we need to decompose the matrix into smaller blocks, with a block size that is denoted U here. And then the scaling becomes, for the nonlinear part, linear in N over U, and for the dot product, quadratic in N over U. So typically, for a low-end FPGA, the block size of this matrix is about 100. So clearly we want to make U as large as possible in order to maintain this O(log N) scaling for the number of clock cycles needed to compute the dot product, rather than the quadratic scaling that occurs if we decompose the matrix into smaller blocks. But the difficulty in having these larger blocks is that having a very large adder tree introduces a large fan-in and fan-out and long-distance data paths within the FPGA.
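The scaling argument can be made concrete with a toy cycle-count model. This is an idealized model of an adder tree with block decomposition, not a description of the actual FPGA design; it ignores pipelining and routing delays:

```python
import math

def mvp_cycles(n, block=None):
    """Idealized clock cycles for one N x N matrix-vector product.
    Unlimited resources: one adder tree of height ceil(log2 N) -> O(log N).
    Block size U < N: about (N/U)^2 block products, each a log2(U)-deep tree."""
    if block is None or block >= n:
        return max(1, math.ceil(math.log2(n)))
    passes = math.ceil(n / block)
    return passes * passes * max(1, math.ceil(math.log2(block)))

for n in (128, 512, 2048):
    print(n, mvp_cycles(n), mvp_cycles(n, block=100))
```

With U fixed at about 100, the cycle count grows quadratically in N, while a full-width adder tree would keep it logarithmic, which is why growing the tree (and taming its fan-in/fan-out) pays off.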
So the solution to get higher performance for a simulator of the coherent Ising machine is to get rid of this bottleneck for the dot product by increasing the size of this adder tree. And this can be done by organizing hierarchically the electrical components within the FPGA, in the way that is shown here in this right panel, in order to minimize the fan-in and fan-out of the system and to minimize the long-distance data paths in the FPGA. So I'm not going into the details of how this is implemented on the FPGA, but just to give you an idea of why the hierarchical organization of the system becomes extremely important to get good performance when simulating Ising machines. So instead of getting into the details of the FPGA implementation, I would like to give some benchmark results for this simulator, which was used as a proof of concept for this idea, and which can be found in this arXiv paper here. And here I show results for solving SK problems: fully connected, random plus-or-minus-one spin-glass problems. We use as a metric the number of matrix-vector products, since it's the bottleneck of the computation, needed to get the optimal solution of this SK problem with 99% success probability, against the problem size. In red here is this proposed FPGA implementation, in blue is the number of matrix-vector products that are necessary for the CIM without error correction to solve these SK problems, and in green, for noisy mean-field annealing, which is an algorithm whose behavior is similar to the coherent Ising machine. And so clearly you see that the number of matrix-vector products necessary to solve this problem scales with a better exponent than these other approaches.
So that's an interesting feature of the system. Next we can see what the real time-to-solution is for solving these SK instances. So on the y-axis is the time-to-solution in seconds to find the ground state of SK instances with 99% probability, for different state-of-the-art hardware. In red is the FPGA implementation proposed in this paper, and the other curves represent breakout local search, in orange, and simulated annealing, in purple, for example. And so you see that the scaling of this proposed simulator is rather good, and that for larger problem sizes we can get orders of magnitude faster than the state-of-the-art approaches. Moreover, the relatively good scaling of the time-to-solution with respect to problem size indicates that the FPGA implementation would be faster than other recently proposed Ising machines, such as the Hopfield neural network implemented on memristor crossbars, in blue here, which is very fast for small problem sizes but whose scaling is not good, and the same thing for the restricted Boltzmann machine implemented on FPGA proposed by a group in Berkeley recently, which again is very fast for small problem sizes but whose scaling is bad, so that it is worse than the proposed approach. So we can expect that for problem sizes larger than 1000 spins, the proposed approach would be the faster one. Let me jump to this other slide, and another confirmation that the scheme scales well: we can find maximum-cut values on the G-set benchmark that are better than those that have been previously found by any other algorithm, so they are the best known cut values, to the best of our knowledge.
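For reference, the time-to-solution metric used in such comparisons is conventionally defined as the run time multiplied by the number of repetitions needed to hit the ground state at least once with 99% confidence. A small helper (the function name and interface are mine) makes the definition explicit:

```python
import math

def tts99(t_run, p_success):
    """Time-to-solution at 99% confidence: R = ln(1 - 0.99) / ln(1 - p)
    repetitions of a run of length t_run that succeeds with probability p."""
    if p_success >= 0.99:
        return t_run
    return t_run * math.log(1.0 - 0.99) / math.log(1.0 - p_success)

print(tts99(1.0, 0.5))   # about 6.64: a 50%-successful run is repeated ~7 times
```

This is why a hardware that is fast per run but has a low per-run success probability can still lose on time-to-solution.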
And so, as is shown in this table here, in particular for instances 14 and 15 of this G-set, we can find better cuts than previously known, and we can find these cut values 100 times faster than the state-of-the-art algorithm used to do this, which is a very common benchmark. Note that getting these good results on the G-set does not require a particularly hard tuning of the parameters. The tuning used here is very simple: it just depends on the degree of connectivity within each graph. And so these good results on the G-set indicate that the proposed approach would be good not only at solving SK problems, but all types of graph Ising problems, such as the Max-Cut problems encountered in applications. So, given that the performance of the design depends on the height of this adder tree, we can try to maximize the height of this adder tree on a large FPGA by carefully routing the components within the FPGA, and we can draw some projections of what type of performance we can achieve in the near future based on the implementation that we are currently working on. So here you see projections for the time-to-solution with 99% success probability for solving SK problems with respect to the problem size, compared to different published Ising machines, in particular the Digital Annealer, which is shown in green here, the green line without dots. And we show two different hypotheses for these projections: either that the time-to-solution scales as an exponential of N, or that the time-to-solution scales as an exponential of the square root of N.
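The two scaling hypotheses can be discriminated by fitting log(TTS) against N and against sqrt(N) and comparing the fit quality. Here is a sketch on synthetic data standing in for the measured benchmark times; the data and the R-squared comparison are illustrative only:

```python
import numpy as np

def r_squared(x, y):
    """R^2 of an ordinary least-squares line y ~ slope * x + intercept."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

# Synthetic log(TTS) values that actually follow c * sqrt(N) plus noise.
n = np.array([100.0, 200.0, 400.0, 800.0, 1600.0])
log_tts = 0.2 * np.sqrt(n) + np.random.default_rng(0).normal(0.0, 0.1, n.size)

fit_sqrt = r_squared(np.sqrt(n), log_tts)   # hypothesis: TTS ~ exp(c * sqrt(N))
fit_lin = r_squared(n, log_tts)             # hypothesis: TTS ~ exp(c * N)
print(fit_sqrt > fit_lin)                   # the sqrt(N) hypothesis wins here
```

The same comparison on the real benchmark curves is what supports the exp(sqrt(N)) reading of the data in the next slide.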
So it seems, according to the data, that the time-to-solution scales more as an exponential of the square root of N, and these projections show that we could probably solve SK problems of size 2000 spins, that is, find the real ground state of such problems with 99% success probability, in about 10 seconds, which is much faster than all the other proposed approaches. So, one of the future plans for this coherent Ising machine simulator: the first thing is that we would like to make this simulation closer to the real DOPO optical system, in particular, as a first step, to get closer to the system of the measurement-feedback CIM. And to do this, what is simulatable on the FPGA is the quantum Gaussian model that is described in this paper, proposed by people in the NTT group. And so the idea of this model is that instead of having the very simple ODEs I have shown previously, it includes coupled ODEs that take into account not only the mean of the in-phase component but also its variance, so that we can take into account more quantum effects of the DOPO, such as the squeezing. And then we plan to make the simulator open-access for the members to run their instances on the system. There will be a first version in September that will be based on just a simple command-line access to the simulator, and which will have just the classical approximation of the system, with no noise term, binary weights, and no Zeeman term. But then we will propose a second version that will extend the current Ising machine to a hierarchy of FPGAs, in which we will add the more refined models, such as the truncated nonlinear Gaussian model I just talked about, and in which we will support real-valued weights for the Ising problems and support the Zeeman term.
So we will announce later when this is available and and far right is working >>hard comes from Universal down today in physics department, and I'd like to thank the organizers for their kind invitation to participate in this very interesting and promising workshop. Also like to say that I look forward to collaborations with with a file lab and Yoshi and collaborators on the topics of this world. So today I'll briefly talk about our attempt to understand the fundamental limits off another continues time computing, at least from the point off you off bullion satisfy ability, problem solving, using ordinary differential equations. But I think the issues that we raise, um, during this occasion actually apply to other other approaches on a log approaches as well and into other problems as well. I think everyone here knows what Dorien satisfy ability. Problems are, um, you have boolean variables. You have em clauses. Each of disjunction of collaterals literally is a variable, or it's, uh, negation. And the goal is to find an assignment to the variable, such that order clauses are true. This is a decision type problem from the MP class, which means you can checking polynomial time for satisfy ability off any assignment. And the three set is empty, complete with K three a larger, which means an efficient trees. That's over, uh, implies an efficient source for all the problems in the empty class, because all the problems in the empty class can be reduced in Polian on real time to reset. As a matter of fact, you can reduce the NP complete problems into each other. You can go from three set to set backing or two maximum dependent set, which is a set packing in graph theoretic notions or terms toe the icing graphs. A problem decision version. This is useful, and you're comparing different approaches, working on different kinds of problems when not all the closest can be satisfied. You're looking at the accusation version offset, uh called Max Set. 
And the goal here is to find an assignment that satisfies the maximum number of clauses; this is from the NP-hard class. In terms of applications: if we had an efficient SAT solver, or an efficient solver for NP-complete problems, it would literally, positively influence thousands of problems and applications in industry and in science. I'm not going to read this, but this of course gives a strong motivation to work on this kind of problem. Now, our approach to SAT solving involves embedding the problem in a continuous space, and we use ODEs to do that. So instead of working with zeros and ones, we work with minus one and plus one, and we allow the corresponding variables to change continuously between the two bounds. We formulate the problem with the help of a clause matrix: if a clause does not contain a variable or its negation, the corresponding matrix element is zero; if it contains the variable in positive form, it's plus one; if it contains the variable negated, it's minus one. And then we use this to formulate these products, called clause violation functions, one for every clause, which vary continuously between zero and one, and they're zero if and only if the clause itself is true. Then, in order to define a dynamics in this N-dimensional hypercube where the search happens, and where, if there exist solutions, they're sitting in some of the corners of this hypercube, we define this energy potential, or landscape function, shown here, in a way that it is zero if and only if all the clauses are satisfied, that is, all the K_m are zero, keeping these auxiliary variables a_m always positive. And therefore, what you do here is a dynamics that is essentially a gradient descent on this potential energy landscape. If you were to keep all the a_m constant, it would get stuck in some local minimum.
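In code, the clause matrix and the clause violation functions K_m look like this (integer k stands for x_k, -k for its negation; the 2^-k_m normalization keeps K_m in [0, 1]; this is a sketch, not the authors' implementation):

```python
import numpy as np

def clause_matrix(clauses, n_vars):
    """c[m, i] = +1 if clause m contains x_{i+1}, -1 if it contains its
    negation, and 0 otherwise."""
    c = np.zeros((len(clauses), n_vars))
    for m, clause in enumerate(clauses):
        for lit in clause:
            c[m, abs(lit) - 1] = 1.0 if lit > 0 else -1.0
    return c

def violation(c, s):
    """K_m(s) = 2^-k_m * product over clause-m literals of (1 - c_mi * s_i).
    K_m lies in [0, 1] at the corners and is zero iff clause m is satisfied."""
    k = np.count_nonzero(c, axis=1)
    factors = np.where(c != 0, 1.0 - c * s, 1.0)
    return factors.prod(axis=1) / 2.0**k

# (x1 OR NOT x2) AND (x2 OR x3), spins s in {-1, +1}
c = clause_matrix([[1, -2], [2, 3]], 3)
print(violation(c, np.array([1.0, 1.0, -1.0])))    # both satisfied -> [0, 0]
print(violation(c, np.array([-1.0, 1.0, -1.0])))   # first fully violated -> [1, 0]
```

Each factor (1 - c_mi s_i) vanishes exactly when the corresponding literal is true at a corner, which is what makes K_m = 0 equivalent to clause satisfaction.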
However, what we do here is couple it with a dynamics for the a_m: we couple them to the clause violation functions, as shown here. If you didn't have this a_m here, in which case you essentially have positive feedback from an increasing variable, you would still get stuck; it behaves better than the constant version, but it would still get stuck. Only when you put in this a_m, which makes the dynamics in this variable exponential-like, only then does it keep searching until it finds a solution. And there is a reason for that which I'm not going to talk about here, but it essentially boils down to performing a gradient descent on a globally time-varying landscape. And this is what works. Now I'm going to talk about the good, the bad, and maybe the ugly. What's good is that it's a hyperbolic dynamical system, which means that if you take any domain in the search space that doesn't have a solution in it, then the number of trajectories in it decays exponentially quickly, and the decay rate is an invariant characteristic of the dynamics itself; in dynamical systems it's called the escape rate. The inverse of that is the time scale on which you find solutions with this dynamical system. And you can see here some trajectories that are chaotic, because it's nonlinear, but it's transient chaos, of course, because eventually they relax to the solution. Now, in terms of performance: here, what we show, for a bunch of constraint densities, defined by M over N, the ratio between clauses and variables, for random 3-SAT problems, is, as a function of N, the monitored wall-clock time. And it behaves quite well: it behaves polynomially, until you actually reach the SAT/UNSAT transition, where the hardest problems are found.
But what's more interesting is if you monitor the continuous time t, the performance in terms of the analog continuous time t, because that seems to be polynomial. And the way we show that is, we consider random 3-SAT for a fixed constraint density, right at the threshold, where it's really hard, and we monitor the fraction of problems that we have not been able to solve. We select thousands of problems at that constraint ratio and solve them with our algorithm, and we monitor the fraction of problems that have not yet been solved by continuous time t. And this, as you see, decays exponentially, with different decay rates for different system sizes, and this plot shows that this decay rate behaves polynomially, actually as a power law. So if you combine these two, you find that the time needed to solve all problems, except maybe an epsilon fraction of them, scales polynomially with the problem size. So you have polynomial continuous-time complexity. And this is also true for other types of very hard constraint satisfaction problems, such as exact cover, because you can always transform them into 3-SAT, as we discussed before, or Ramsey numbers and graph coloring, and on these problems even algorithms like survey propagation will fail. But this doesn't mean that P equals NP, because, first of all, if you were to implement these equations in a device whose behavior is described by these ODEs, then of course t, the continuous-time variable, becomes a physical wall-clock time, and that would be polynomial scaling; but you have the other variables, the auxiliary variables, which grow in an exponential manner. So if they represent currents or voltages in your realization, then there would be an exponential cost altogether. But this is some kind of trade between time and energy: I don't know how to generate time,
but I know how to generate energy, so one could use that. But there are other issues as well, especially if you're trying to do this on a digital machine; and other problems appear in physical devices as well, as we'll discuss later. So if you implement this on GPUs, you can get an order of magnitude or two of speedup. And you can also modify this to solve MaxSAT problems quite efficiently: we are competitive with the best heuristic solvers; these are the winner problems of the 2016 MaxSAT competition. So this definitely seems like a good approach, but there are of course interesting limitations. I would say interesting, because they kind of make you think about what it means, and how you can exploit these observations in understanding better analog continuous-time complexity. If you monitor the number of discrete steps taken by the Runge-Kutta integrator when you solve this on a digital machine (you're using some kind of integrator, and you're using the same approach, but now you measure the number of problems you haven't solved after a given number of discrete steps taken by the integrator), you find that you have exponential discrete-time complexity, and of course this is a problem. And if you look closely at what happens: even though the analog mathematical trajectory, that's the red curve here, fluctuates very little, like in the third or fourth digit position, if you monitor what happens in discrete time, the integrator fluctuates like crazy. So it really is like the integrator freezes out, and this is because of the phenomenon of stiffness that I'll talk a little bit more about later. It might look like an integration issue on digital machines, one that you could improve, and you could definitely improve. But actually the issue is bigger than that.
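As an aside, the stiffness phenomenon mentioned above is easy to demonstrate on the classic linear test equation y' = -lambda*y: an explicit method is stable only while lambda times the step size stays inside its stability region, so a single fast-decaying mode forces tiny steps, while an implicit method has no such limit. This is the standard textbook example, not the solver used for the SAT system:

```python
def euler_explicit(lam, dt, steps, y0=1.0):
    """y' = -lam*y. Explicit Euler multiplies by (1 - lam*dt) each step and
    is stable only when |1 - lam*dt| <= 1, i.e. dt <= 2/lam."""
    y = y0
    for _ in range(steps):
        y *= (1.0 - lam * dt)
    return y

def euler_implicit(lam, dt, steps, y0=1.0):
    """Implicit (backward) Euler divides by (1 + lam*dt): stable for any dt."""
    y = y0
    for _ in range(steps):
        y /= (1.0 + lam * dt)
    return y

print(euler_explicit(1000.0, 0.01, 100))    # |1 - 10| = 9 per step: blows up
print(euler_implicit(1000.0, 0.01, 100))    # decays toward zero at the same dt
```

In the SAT dynamics the exponentially growing a_m play the role of the fast modes, which is why the integrator's step size collapses exactly on the hard instances.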
It's deeper than that, because on a digital machine there is no time-energy conversion: the auxiliary variables are efficiently represented on a digital machine, so there's no exponentially fluctuating current or voltage in your computer when you do this. So if P is not equal to NP, then the exponential time complexity, or exponential cost complexity, has to hit you somewhere. One would be tempted to think that maybe this wouldn't be an issue in an analog device, and to some extent that's true: analog devices can be orders of magnitude faster, but they also suffer from their own problems, because they're not going to be perfect, and that plagues those solvers as well. So indeed, if you look at other systems, like memristive Ising machines, the measurement-feedback CIM, or oscillator networks, they all hinge on some kind of ability to control your variables with arbitrarily high precision. In oscillator networks you want to read out phases or frequencies; in the case of CIMs you require identical pulses, which are hard to keep identical, and they kind of fluctuate away from one another, shift away from one another. And if you can control that, then of course you can control the performance. So one can actually ask whether or not this is a universal bottleneck, and it seems so, as I will argue next. We can recall a fundamental result by Schönhage from 1978, who showed, and it's a purely computer-science proof, that if you are able to compute the addition, multiplication, and division of real variables with infinite precision, then you could solve NP-complete problems in polynomial time. He didn't actually propose a solver; he just showed mathematically that this would be the case. Now, of course, in the real world you have limited precision. So the next question is: how does that affect the computation of these problems? This is what we're after. Loss of precision means information loss, or entropy production.
So what you're really looking at is the relationship between the hardness of a problem and the cost of computing it. And, according to Schönhage, there's this left branch, which in principle could be polynomial time. But the question is whether or not this is achievable. It is not achievable; rather, something more truthful is on the right-hand side: there's always going to be some information loss, some entropy generation, that could keep you away, possibly, from polynomial time. So this is what we'd like to understand, and the source of this information loss, I will argue, is not just noise in any physical system, but it's also of algorithmic nature, so it is a question for any approach. But Schönhage's result is purely theoretical; no actual solver is proposed. So we can ask, just theoretically, out of curiosity: would such a solver, in principle (because he's not proposing a solver with such properties), if you look mathematically, precisely, at what the solver does, have the right properties? And I argue yes. I don't have a mathematical proof, but I have some arguments that that would be the case. And this is the case for our actual SAT solver: if you could compute its trajectory losslessly, then it would solve NP-complete problems in polynomial continuous time. Now, as a matter of fact, this is a bit more difficult question, because time in ODEs can be rescaled however you want. So what you actually have to measure is the length of the trajectory, which is an invariant of the dynamical system, a property of the dynamical system, not of its parameterization. And we did that. So my student did that first, improving on the stiffness problem of the integration, using implicit solvers and some smart tricks, such that you actually stay closer to the actual trajectory, and, using the same approach, monitoring what fraction of problems you can solve.
Now as a function of the length of the trajectory, you find that it is scaling polynomially with the problem size: we have polynomial length complexity. That means that our solver is both poly-length and, as time is defined for it, also a poly-time analog solver. But if you look at it as a discrete algorithm, if you measure the discrete steps on a digital machine, it is an exponential solver. And the reason is because of all this stiffness: every integrator has to truncate (digitizing truncates the equations), and what it has to do is keep the integration within the so-called stability region for that scheme; you have to keep the product of the eigenvalues of the Jacobian and the step size within this region. If you use explicit methods, you want to stay within this region. But what happens is that some of the eigenvalues grow fast for stiff problems, and then you're forced to reduce the step size so that the product stays in this bounded domain, which means that now you're forced to take smaller and smaller time steps, so you're freezing out the integration; and what I showed you is that that's the case. Now, you can move to implicit solvers, in which case the domain you have to avoid is actually on the outside. But what happens in this case is that some of the eigenvalues of the Jacobian, for stiff systems, start to move toward zero. As they're moving toward zero, they're going to enter this instability region, so your solver is going to try to keep them out, so it's going to increase the step size. But if you increase the step size, you increase the truncation errors, so you get randomized in the large search space, so it's really not going to work out. Now, one can sort of introduce a theory, or a language, to discuss analog computational complexity using the language of dynamical systems theory. I don't have time to go into this, but basically, for hard problems you have a chaotic saddle
in the middle of the search space somewhere; that saddle dictates how the dynamics happens, and the invariant properties of that saddle are what dictate performance and many other things. So a new, important measure that we find, which is also helpful in describing this analog complexity, is the so-called Kolmogorov, or metric, entropy. Basically, what this does, intuitively, is describe the rate at which the uncertainty contained in the insignificant digits of a trajectory flows towards the significant ones, as you lose information because errors are being grown, developed into larger errors, at an exponential rate, because you have positive Lyapunov exponents. But this is an invariant property: it's a property of the dynamics itself, not of how you compute it, and it's really the intrinsic rate of accuracy loss of a dynamical system. As I said, in such a high-dimensional system you have as many positive and negative Lyapunov exponents as the dimension of the space: the number of unstable manifold dimensions and the number of stable manifold directions. And there's an interesting and, I think, important relation, an equality called the Pesin equality, that connects the information-theoretic aspect, the rate of information loss, with the geometric rate at which trajectories separate, minus kappa, which is the escape rate that I already talked about. Now, one can actually prove simple theorems, like a back-of-the-envelope calculation. The idea here is that you know the rate at which closely started trajectories separate from one another. So now you can say that that is fine, as long as my trajectory finds the solution before the trajectories separate too quickly.
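The positive Lyapunov exponents that drive this information loss can be estimated numerically by averaging log|f'(x)| along an orbit. The one-dimensional logistic map at r = 4 is a convenient toy case, since its exponent is known exactly to be ln 2; this example is mine, chosen only to illustrate the metric-entropy idea:

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n=200000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the orbit
    average of log|f'(x)| = log|r*(1 - 2x)|; for r = 4 the exact value is ln 2."""
    x, total, count = x0, 0.0, 0
    for _ in range(n):
        d = abs(r * (1.0 - 2.0 * x))
        if d > 1e-12:                  # skip the measure-zero point x = 1/2
            total += math.log(d)
            count += 1
        x = r * x * (1.0 - x)
    return total / count

print(lyapunov_logistic())   # close to ln 2, about 0.693
```

In the high-dimensional SAT dynamics one would sum the positive exponents of the chaotic saddle instead of a single one, which is exactly the quantity entering the Pesin equality alongside the escape rate kappa.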
In that case, I can have the hope that if I start several closely started trajectories from some region of the phase space, they kind of go into the same solution often. And that's this upper bound, this limit, and it is really showing that it has to be an exponentially small number. What it depends on is the N-dependence of the exponent right here, which combines the information-loss rate and the time-to-solution performance. So if this exponent has a large N-dependence, or even a linear N-dependence, then you really have to start trajectories exponentially close to one another in order to end up in the same attractor. So this is sort of the direction that one can go in, and this formulation is applicable to all deterministic dynamical systems. And I think we can expand this further, because there is a way of getting the expression for the escape rate in terms of N, the number of variables, from cycle expansions, which I don't have time to talk about; it's kind of a program that one can try to pursue. And this is it. So the conclusions, I think, are self-explanatory. I think there is a lot of future in analog continuous-time computing. These systems can be more efficient by orders of magnitude than digital ones in solving NP-hard problems, because, first of all, many of these systems avoid the von Neumann bottleneck, there's parallelism involved, and you can also have a larger spectrum of continuous-time dynamical algorithms than of discrete ones. But we also have to be mindful of what the possibilities and what the limits are. And one very important open question is: what are these limits? Is there some kind of no-go theorem that tells you that you can never perform better than this limit or that limit? And I think that's the exciting part: to derive these theorems.

Published Date : Sep 27 2020



Eileen Vidrine, US Air Force | MIT CDOIQ 2020


 

>> Announcer: From around the globe, it's theCube with digital coverage of the MIT Chief Data Officer and Information Quality Symposium brought to you by Silicon Angle Media. >> Hi, I'm Stu Miniman and this is the seventh year of theCube's coverage of the MIT Chief Data Officer and Information Quality Symposium. We love getting to talk to these chief data officers and the people in this ecosystem, the importance of data, driving data-driven cultures, and really happy to welcome to the program first-time guest Eileen Vidrine. Eileen is the Chief Data Officer for the United States Air Force. Eileen, thank you so much for joining us. >> Thank you, Stu, really excited about being here today. >> All right, so the United States Air Force, I believe, had its first CDO office in 2017, and you were put in the CDO role in June of 2018. If you could, bring us back, give us how that was formed inside the Air Force and how you came to be in that role. >> Well, Stu, I like to say that we are a startup organization inside a really mature organization, so it's really about culture change, and it began by bringing a group of amazing citizen airmen reservists back to the Air Force to bring their skills from industry into the Air Force. So, I like to say that we're a total force, because we have active and reservists working with civilians on a daily basis, and one of the first things we did in June was we stood up a data lab that's based in the Jones building on Andrews Air Force Base. And there, we actually take small use cases that have enterprise focus, and we really try to dig deep to drive data insights, to inform senior leaders across the department on really important, what I would call enterprise-focused challenges. It's pretty exciting.
>> Yeah, it's been fascinating when we've dug into this ecosystem. Of course, while the data itself is very sensitive, and I'm sure for the Air Force there are some of the very highest levels of security, in the practices around how to leverage data the line between public and private blurs, because you have people that have come from industry that go into government, and people that are from government that have leveraged their experiences there. So, if you could, give us a little bit of your background, what it is that your charter has been, and what you're looking to build out, as you mentioned that culture of change. >> Well, I like to say I began my data leadership journey as an active duty soldier in the Army, and I was originally a transportation officer. Today we would use the title condition-based maintenance, but back then, it was really about running the numbers so that I could optimize my truck fleet on the road each and every day, so that my soldiers were driving safely. Data has always been part of my leadership journey, and so I like to say that one of our challenges is really to make sure that data is part of every airman's core DNA, so that they're using the right data at the right level to drive insights, whether it's tactical, operational or strategic. And so it's really about empowering each and every airman, which I think is pretty exciting. >> There's so many pieces of that data: you talk about data quality, there's obviously the data life cycle. I know the presentation that you're giving here at the CDOIQ event talks about the data platform that your team has built. Could you explain that? What are the key tenets, and what maybe differentiates it from what other organizations might have done? >> So, when we first took the challenge to build our data lab, our goal was to come up with a cross-domain solution where we could solve data problems at the appropriate classification level.
And so we built the VAULT data platform. VAULT stands for visible, accessible, understandable, linked, and trustworthy. And if you look at the DOD data strategy, they also add the tenets of interoperable and secure. So, the first steps that we have really focused on are making data visible and accessible to airmen, to empower them to drive insights from available data to solve their problems. So, it's really about that data empowerment. We like to use the hashtag built by airmen, because it's really about each and every airman being part of the solution. And I think it's really an exciting time to be in the Air Force, because any airman can solve a really hard challenge, and it can very rapidly escalate up with great velocity to senior leadership to be an enterprise solution. >> Is there some basic training that goes on from a data standpoint? For any of those that have lived in data, oftentimes you can get lost in numbers; you have to have context, you need to understand how do I separate good from bad data, or when is data still valid? So, how does someone in the Air Force get some of that data competency? >> Well, we have taken a multi-tenant approach, because each and every airman has different needs. So, we have quite a few pathfinders across the Air Force today to help what I call upskill our total force. And so I developed a partnership with the Air Force Institute of Technology, and they now have an online graduate-level data science certificate program. So, individuals studying at AFIT or remotely have the opportunity to really focus on building up their data touchpoints. Just recently, we have been working on a pathfinder to allow our data officers to get through their ICCP Federal Data Sector Governance Certificate Program. So, we've been running what I would call short boot camps to prep data officers to be ready for that.
And I think the one that I'm most excited about is that this year, this fall, new cadets at the U.S. Air Force Academy will be able to have an undergraduate degree in data science, and so it's not about a one-prong approach; it's about having short courses as well as academic solutions to upskill our total force moving forward. >> Well, information absolutely is such an important differentiator (laughs) in general business, and absolutely the military aspects are there. You mentioned the DOD talks about interoperability in their platform; can you speak a little bit to how you make sure that data is secure? Yet, I'm sure there's opportunities for other organizations, for there to be collaboration between them. >> Well, I like to say that we don't fight alone. So, I work on a daily basis with my peers, Tom Cecila at the Department of Navy and Greg Garcia at the Department of Army, as well as Mr. David Berg at the DOD level. It's really important that we have an integrated approach moving forward, and in the DOD we partner with our security experts, so it's not about us doing security individually. In the Air Force we use a term called digital Air Force, and it's about optimizing and building a trusted partnership with our CIO colleagues, as well as our chief management colleagues, because it's really about that trusted partnership to make sure that we're working collaboratively across the enterprise, and whatever we do in the department, we also have to reach across our services so that we're all working together. >> Eileen, I'm curious if there's been much impact from the global pandemic. When I talk to enterprise companies, they had to rapidly make sure that they kept protecting data; it used to sit within their four walls and maybe over VPN, and now everyone is accessing data, with much more work from home and the like.
I have to imagine some of those security measures you've already taken, but has there been anything along those lines, or anything else where this shift in where people are, being a little bit more dispersed, has impacted your work? >> Well, the story that I like to tell is that this has given us velocity. So, prior to COVID, we built our VAULT data platform as a multi-tenancy platform that is also a cross-domain solution, so it allows people to develop and do their problem solving at an appropriate classification level. And it allows us to connect, or push up if we need to, into higher classification levels. The other thing is that it has helped us really work smart, because we do as much as we can in that unclassified environment, and then, using our cloud-based solution and our gateways, it allows us to bring people in on a very scheduled basis so that we optimize their time on site. And so I really think that it's given us great velocity, because it has really allowed people to work on the right problem set, at the right classification level, at a specific time. And plus, the other piece is that the problem set that we've had has really allowed people to become more data focused. I think that it's personal for folks moving forward, so it has increased understanding in terms of the need for data insights, as we move forward, to drive decision making. It's not that data makes the decision, but it's using the insight to make the decision. >> And one of the interesting conversations we've been having about how to get to those data insights is the use of things like machine learning and artificial intelligence. Anything you can share about how you're looking at that journey, and where you are along that discovery? >> Well, I love to say that in order to do AI and machine learning, you have to have great volumes of high quality data.
And so really step one was visible, accessible data, but we in the Department of the Air Force stood up an accelerator at MIT. And so we have a group of amazing airmen that are actually working with MIT on a daily basis to solve some of those, what I would call, opportunities for us to move forward. My office collaborates with them on a consistent basis, because they're doing additional use cases in that academic environment, which I'm pretty excited about, because I think it gives us access to some of the smartest minds. >> All right, Eileen, also I understand it's your first year doing the event. Unfortunately, we don't get to all come together in Cambridge; walking those hallways and being able to listen to some of those conversations and follow up is something we've very much enjoyed over the years. What excites you about interacting with your peers and participating in the event this year? >> Well, I really think it's about helping each other leverage the amazing lessons learned. I think that if we look collaboratively, both across industry and in the federal sector, there have been amazing lessons learned, and it gives us a great forum to really share and leverage those lessons learned as we move forward, so that we're not hitting the reboot button, but we actually are starting faster. So, it comes back to the velocity component; it all helps us go faster and at a higher quality level, and I think that's really exciting. >> So, final question I have for you. We've talked for years about digital transformation, and we've really said that having that data strategy and that culture of leveraging data is one of the most critical pieces of having gone through that transformation. For people that are maybe early on their journey, any advice that you'd give them, having worked through a couple of years of this and the experience you've had with your peers?
>> I think that the first thing is that you have to really start with a blank slate and really look at the art of the possible. Don't think about what you've always done; think about where you want to go, because there are many different paths to get there. And if you look at what the target goal is, it's really about making sure that you do that backward tracking to get to that goal. And the other piece that I tell my colleagues is: celebrate the wins. My team of airmen, they are amazing, it's an honor to serve them, and the reality is that they are doing great things, and sometimes you want more. And it's really important to celebrate the victories, because it's a very long journey, and we keep moving the goalposts because we're always striving for excellence. >> Absolutely, it is always a journey that we're on; it's not about the destination. Eileen, thank you so much for sharing all that you've learned, and glad you could participate. >> Thank you, Stu, I appreciate being included today. Have a great day. >> Thanks, and thank you for watching theCube. I'm Stu Miniman, stay tuned for more from the MIT CDOIQ event. (lively upbeat music)

Published Date : Sep 3 2020



Jennifer Chronis, AWS | AWS Public Sector Online


 

>>From around the globe, it's theCube with digital coverage of AWS Public Sector Online, brought to you by Amazon Web Services. Everyone, welcome back to the Cube's virtual coverage of the AWS Public Sector Online Summit, which is also virtual. I'm John Furrier, host of the Cube, with a great interview. Here remotely is Jennifer Chronis, who's the general manager of the DoD account for Amazon Web Services. Jennifer, welcome to the Cube, and great to have you over the phone. I know we couldn't get the remote video because of your location, but glad to have you via your voice. Thanks for joining us. >>Well, thank you very much, John. Thanks for the opportunity here. >>The Department of Defense has been a big part of the conversation over the past couple of years, one of many examples of the agencies modernizing. And here at the Public Sector Summit online, one of your customers, the Navy, with their ERP, is featured. This really kind of encapsulates this modernization of the public sector. So tell us about what they're doing and their journey. >>Sure, absolutely. So Navy ERP, which is Navy enterprise resource planning, is the Department of the Navy's financial system of record. It's built on SAP, and it provides financial and acquisition management information to Navy commands and Navy leadership, to essentially keep the Navy running and to increase the effectiveness and the efficiency of Navy support to the warfighter. It handles about $70 billion in financial transactions each year and has over 72,000 users across six Navy commands. And they expect the number of users to double over the next five years. So essentially, you know, this program was in a situation where their on-premises infrastructure was end of life. They were facing an expensive tech upgrade in 2019. They had infrastructure that was hard to scale and prone to system outages. Data analytics were too slow to enable decision making, and users actually referred to it as a fragile system.
And so the Navy made the decision last year to migrate the ERP system to the AWS Cloud, along with SAP NS2, SAP National Security Services. So it's a great use case for a government organization modernizing in the cloud, and we're really happy to have them speaking at Summit this year. >>Now, was this a new move for the Navy, moving to the cloud? A lot of organizations are hitting end of life in their data centers; we're certainly seeing it in the public sector, from education on, as they modernize. So is this a new move for them? And what kind of information does this affect? I mean, SAP is kind of like, is it just financial data, or operational data? What's the move about, was it new, and what kind of data is impacted?
Sure. Yeah, well, the Navy actually issued a Cloud First policy in November of 2017. So they've been at it for a while, moving lots of different systems of different sizes and shapes to the cloud. But this migration really marked the first significant enterprise business system for the Navy to move, and actually the largest business system to migrate to the cloud across the DoD to date. And so, essentially, what Navy ERP does is modernize and standardize Navy business operations, everything from timekeeping to ordering missile and radar components for Navy weapons systems. So it's really a comprehensive system. And, as I said, the migration to AWS GovCloud marks the Navy's largest cloud migration to date. And so this essentially puts the movement and documentation of some $70 billion worth of parts and goods into one accessible space, so the information can be shared, analyzed, and protected more uniformly. And what's really exciting about this, and you'll hear from the Navy at Summit, is that they were actually able to complete this migration in just under 10 months, which was nearly half the time it was originally expected to take given the sizing and complexity. So it's a really great story.
>>That's huge numbers. I mean, those used to be multi-year projects; that was the minicomputer era. I'm old enough to remember, like, oh, it's going to be a two-year process. Ten months, pretty spectacular. I've got to ask, what are some of the benefits that they're seeing in the cloud? Has it changed roles and responsibilities? What's some of the impact that they're seeing, or expecting to see, quickly? >>Yeah, I'd say, you know, there's been a really big impact to the Navy across probably four different areas: one is in decision making, also better customer experience, improved security, and then disaster recovery. So to dive into each of those a little bit: you know, moving the system to the cloud has really allowed the Navy to make more timely and informed decisions, as well as to conduct advanced analytics that they weren't able to do as efficiently in the past. So as an example, pulling financial reports and running advanced analytics from the system used to take them around 20 hours. Now Navy ERP is able to run these reports in less than four hours, obviously allowing them to run the reports more frequently and more efficiently. And so this has obviously led to an overall better customer experience and enhanced decision making, and they've also been able to deploy their first self-service business intelligence capabilities, putting these advanced analytics in the hands of the actual users. They've also experienced improved security. You know, we talk a lot about the security benefits of migrating to the cloud, but it's given them the opportunity to increase their data protection, because now there's only one base of data to protect instead of multiple copies across a whole host of traditional computing hardware.
And then finally, they've implemented a true disaster recovery system by implementing a dual strategy, putting data in both AWS GovCloud East and GovCloud West. They were the first in the Navy to do that, to provide them with true disaster recovery. >>So a full GovCloud piece. So that brings up the question around the edge. And I love all this tactical edge military kind of DoD thinking; the agility makes total sense, been following that for a couple of years now. Is this the business side of it, the business operations, or is there a tactical edge military component here, or both? Or is that next ahead for the Navy?
>>Yeah, you know, I think there will ultimately be both. The Navy's big challenge right now is audit readiness. So what they're focusing on next is migrating all of these financial systems into one general ledger for audit readiness, which has never been done before. I think, you know, audit readiness across the DoD has really been problematic. So the next thing that they're focusing on in their journey is not only consolidating to one financial ledger, but also bringing on new users from working capital fund commands across the Navy onto this one platform that is secure and stable, versus the more fragile system that was previously in place. So we expect over time, once all of the systems migrate, that Navy ERP is going to double in size and have more users, and the infrastructure is already going to be in place. We are seeing use of the tactical edge capabilities in other parts of the Navy; in some really exciting programs the Navy is making use of our Snowball and Snowball Edge capabilities, and Navy ERP used these as part of their migration. >>I saw the Snowcone news out there; Jassy tweeted it. You know, it's interesting to see the progression, and you mentioned the audit readiness.
The pattern of cloud is implementing the business model: infrastructure as a service, platform as a service, and SaaS. And on the business side, you've got to get that foundational infrastructure, audit readiness, and monitoring, then the platform, and then ultimately the applications. So a really good indicator that this is happening much faster. So congratulations. But I want to bring that back to the DoD generally, because this is the big surge: infrastructure, platform, SaaS. Other sessions at the Public Sector Summit here on the DoD cover the Cybersecurity Maturity Model Certification, which gets into this notion of baselining a foundation and building on top. What is this all about, the CMMC? What does it mean? >>Yeah, well, I'll tell you, you know, I think most people know that our US defense industrial base, what we call the DIB, has experienced and continues to experience an increasing number of cyber attacks. Every year, the loss of sensitive information and intellectual property across the United States costs billions, and really, it's our national security. And there are many examples where weapons systems and sensitive information have been compromised: the F-35 Joint Strike Fighter, the C-17, the MQ-9 Reaper. All of these programs have unfortunately experienced some loss of sensitive information. So to address this, the DoD has put in place the CMMC, which is the Cybersecurity Maturity Model Certification framework. It's a mouthful, but it's really designed to ensure that the defense industrial base, and all of the contractors that are part of the defense supply chain network, are protecting federal contract information and controlled unclassified information, and that they have the appropriate levels of cybersecurity in place to protect against advanced persistent threats. So in CMMC, there are essentially five levels, with various processes and practices in each level.
And this is important not only to us as a company but also to all of our partners and customers, because with new programs, defense industrial base and supply chain companies will be required to achieve a certain CMMC certification level based on the sensitivity of the program's data. So it's a really important initiative for the DoD, and it's really a great way for us to help. >>Jennifer, thanks so much for taking the time to come on the phone. I really appreciate it. I know there's so much going on with the DoD, Space Force. Final question, real quick: take a minute to share what trends within the DoD you're watching around this modernization. >>Yeah, well, it has been a really exciting time to be serving our customers in the DoD, and I would say there's a couple of things that we're really excited about. One is the move to the tactical edge that you've talked about, using cloud out at the tactical edge. We're really excited about capabilities like the AWS Snowball Edge, which helped Navy ERP get to the cloud more quickly, but also, as you mentioned, our AWS Snowcone, which is an even smaller, military-grade, edge computing and data transfer device that weighs just under five pounds and fits inside a standard mailbox or even a small backpack. It's a really cool capability for the DoD warfighters. Another thing that we're really watching closely is the DoD's adoption of artificial intelligence and machine learning. So you know, the DoD has really shown that it's pursuing deeper integration of AI and ML into mission-critical and business systems, with organizations like the Joint Artificial Intelligence Center, the JAIC, and the Army AI Task Force, to help accelerate the use of cloud-based AI and really improve warfighting abilities. And then finally, what I'd say we're really excited about is the fact that the DoD is starting to build new mission-critical systems in the cloud, born in the cloud, so to speak, systems and capabilities like ABMS in the Air Force.
The Air Force's Advanced Battle Management System is being constructed and created as a born-in-the-cloud system. So we're really excited about those things, and we think that continued adoption at scale of cloud computing across the DoD is going to ensure that our military and our nation maintain our technological advantages and really deliver on mission-critical systems. >>Jennifer, thanks so much for sharing that insight. General manager at Amazon Web Services handling the Department of Defense, super important transformation efforts going on across the government, modernization, certainly with the DoD leading the effort. Thank you for your time. This is the Cube's coverage here. I'm John Furrier, your host, for the AWS Public Sector Summit online. It's theCube, virtual. We're doing the remote interviews and getting all the content, and sharing that with you. Thank you for watching.

Published Date : Jun 30 2020



Brian Reagan, Actifio & Paul Forte, Actifio | CUBE Conversation, May 2020


 

>>From the Cube Studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a Cube Conversation. [Music]

>>Hi everybody, this is Dave Vellante, and welcome to this Cube Conversation. You know, we've been following a company called Actifio for quite some time now. They've really popularized the concept of copy data management, a really innovative, Boston-based, Waltham-based company. And with me are Brian Reagan, who's the chief marketing officer, and Paul Forte, who's the newly minted chief revenue officer of Actifio. Guys, great to see you. I wish we were face to face at your June event, but this will have to do.

>>Yeah, you bet.

>>So, you know, Brian, you've been on the Cube a bunch, so I'm going to start with Paul, if that's okay. Paul, let's talk a little bit about your background. You've done a number of stints at a variety of companies, big companies like IBM and others as well. What attracted you to Actifio?

>>In all honesty, I've been a software guy, and candidly a data-specific leader, for many, many years, and so IT infrastructure, particularly around data, has always been sort of my forte, pun intended there. And Actifio was smack dab in the middle of that. So when I was looking for my next adventure, I had an opportunity to meet with Ash, our CEO and founder, and discuss what Actifio was all about. And candidly, a number of the connections we had were the same: a lot of our OEM relationships are with people that I actually worked with and for, and some that worked for me historically. So it was almost this perfect world. And I'm a Boston guy, so it was in my old backyard. It was just a perfect match for what I was looking for, which was really a small growth company trying to get to the next level, with compelling technology, in a space I was super familiar with, where I could understand and articulate

the value proposition.

>>Well, as we say in Boston, Paulie, we've got to get you back here. I know: pahk the cah. Let's talk about the climate right now. Nobody expected this, of course. It's funny, I saw Ash at an event in Boston last fall and we were talking: hey, what do you expect for next year? A little bit of softening, but nobody expected this sort of black swan. But you guys, I just got the press release you put out: you had a good quarter, a record first quarter. What's going on in the marketplace? How are you guys doing?

>>Yeah, well, I think that today, more than ever, businesses are realizing that data is what is actually going to carry them through this crisis, whether that data is changing the nature of how companies interact with their customers, how they manage through their supply chain, or, frankly, how they take care of their employees. It's all very data-centric. And so businesses that are protecting that data, that are helping businesses get faster access to that data, and that ultimately give them choice as to where they manage that data, on-premises, in the cloud, or in a hybrid configuration, those are the businesses that are really going to be top of a CIO's mind. I think our Q1 is a demonstration that customers voted with their wallets, and with their confidence in Actifio, as an important part of their data supply chain.

>>Now, Paul, I want to come back to you. First of all, a lot of people know you're an ex-Army Ranger, so thank you for your service. That's awesome. You know, I was talking to Frank Slootman, we interviewed him the other day, and he was sharing with me sort of how he manages. He says he doesn't manage by a playbook; he's a situational manager, and that's something that he learned in the military. Well, this is weird, this is a situation, okay? And that really is kind of how you're trained. Of course, we've never seen anything like this, but you're trained to deal with things

that you've never seen before. So how are you seeing organizations generally, and Actifio specifically, going to manage through this process? What are some of the moves that you're advising and recommending? Give us some insight there.

>>Yeah, so it's really interesting, and it's funny that you mention my military background. I was just having this discussion with one of my leaders the other day: one of the things they train for in the military is the eventuality of chaos. When you do an exercise, they will literally tap the leader on the shoulder and say, okay, you're now dead, and without that person being allowed to speak, they take a knee, and the unit has to go on. And so what happens is you learn by muscle memory how to react in times of crisis. This is a classic example of leadership in crisis. So to me, you have a playbook. I think everybody needs to start with a playbook and then start with a plan. I can't remember if it was Mike Tyson, but one of my favorite quotes was: a plan is good until somebody punches you in the face. That's the reality of what just happened. Business across the globe just got punched in the face. So you've got a playbook that you rely on, and then you have to remain nimble and creative and, candidly, opportunistic. And from a leadership perspective, I think you can't lose your confidence. I've watched some of my friends and some other businesses crippled in the midst of this because they're afraid. Instead, my first commentary at our first staff meeting, Brian, if I remember, was this: okay, so what makes Actifio great in this environment? Not: why is it not great? And so we didn't get scared; we jumped right into it. We adjusted our playbook a little bit, and candidly we just had a record quarter. And honestly,

to date we took down deals in every single geography around the globe, to include Italy. I mean, it was insane. It was really fun.

>>Okay, so this wasn't just one monster deal that gave you that record quarter. This is really broad-based demand?

>>Yeah. If you dug underneath the covers, you would see that we had the largest number of transactions ever in the first quarter. We had the largest average selling price in the first quarter ever. We had the largest contribution from our channel partners and our OEM partners ever. So it was really a nice, truly balanced performance: across the globe, across deal sizes, and candidly across industries.

>>Interesting. I mean, you used the term opportunistic, and I think you're right on. Obviously you don't want to be chasing ambulances. At the same time, we've talked to a lot of CEOs, and essentially what they're doing, and I'd like to get your feedback on this, Brian, is reassessing the ideal profile of a customer and reassessing the value proposition in the context of the current pandemic. And I noticed that you guys in your press release talked about cyber resiliency, digital initiatives, data center transformations, et cetera. So maybe you could talk a little bit about that, Brian. Did you do those things? How did you do them? What kind of pace were you guys at? How did you do it remotely, with everybody working from home? Give us some color on that.

>>Sure. You know, Ash, if he were here, would probably remind us that Actifio was born in the midst of the 2008 financial crisis, so we have essentially been bookended by two black swans over the last decade, and the lessons we learned in 2008 are every bit as relevant today. Everything starts with cost containment and protection of the business. And so CIOs, in the midst of this shock to the system, are very much looking at what

are the absolutely vital, critical initiatives, and what is a nice-to-have that I'm going to pause, and investing entirely in the critical mission. The critical initiatives tended to be around getting people safely working remotely, getting people safe access to their systems, their applications, and their data, and then ultimately it also became about protecting those systems from malicious individuals and state actors. Unfortunately, as we've seen in other times of crisis, this is when crime, and cyber crime particularly, tends to spike, particularly against industries that don't have strong safeguards in place to really ensure the resiliency of their applications. So we very much went a little bit back to the 2008 playbook: helping people get control of their costs, helping people continue to do the things they need to do in a much more infrastructure-light manner, but also really emphasizing the fact that if you are under attack, or if you are concerned that you're infected but don't know when, instant access to data, and a time machine that can take you back and forth to those points in time, is something that is incredibly valuable.

>>So let's talk cyber resiliency. Specifically, what is Actifio doing for its customers from a product standpoint, capabilities, maybe as part of the 10C announcement as well? Can you give us some specifics on where you fit in? Let's take that use case, cyber resiliency.

>>Yeah, absolutely. I think there's a staircase of capabilities when it comes to cyber resiliency. At the lowest level, you need a time machine, because most people don't know when they were infected. The ability to go back in time, test the recoverability of the data, test the validity of the data, is step one. Step two is, once you've found the clean point, being able to resume operations, being able to resume the application's operation instantly or very rapidly, and that's something Actifio was founded on:

this notion of instant access to data. And then the third phase, and this is really where our partnerships shine, is you probably want to go back and mitigate that risk. You want to go back and clean that system, find the infection, and eliminate it. That's where our partnership with IBM Resiliency Services and their cyber incident recovery solution comes in, which takes the Actifio platform and wraps a complete managed service around it, so they can help the customer not only get their systems and applications back on their feet, but clean the systems and allow them to resume operations normally, on a much safer and more stable footing.

>>Okay, that's interesting. So, Paul, was it kind of new adoptions, was it increases from existing customers, kind of a combination? Can you talk to that?

>>Yeah, totally. Ironically, to really come clean, the metrics that we had in the first quarter were very similar to the metrics we see historically. The mix between our existing customer base and our new customer acquisition was very similar to our historical metrics, which candidly we were a little surprised by. We anticipated that the majority of our business would come from the safe harbor of our existing customer base, but candidly we had a really nice split, which was great. It meant the value proposition was resonating not only with our existing customer base, where you would expect it, but also with new customers who had been evaluating us, and who either accelerated or just continued down the path of adoption during the timeframe of COVID-19. Across industries, I would say there were some industries that pushed pause, and the ones that accelerated during this past period were the ones you would think of: financial institutions primarily, as well as some of the medical, so some of those
transactions, healthcare and medical, accelerated along with financial institutions. And then I would say that we did have some industries that pushed pause, and you can probably guess what some of those are. A majority were the ones dealing with small and mid-sized businesses, or consumer-facing businesses, things like retail, where we typically do have a pretty nice resonance and a really nice value proposition, but there were definitely some transactions that we saw basically just pause, like: we're going to come back. But overall, the feedback was that it felt like any other quarter. It felt pretty normal, as strange as that sounds, because I know, speaking to a lot of my friends at gear companies and software companies, they didn't have that experience. But we did pretty well.

>>That's interesting. I mean, you're right, certain industries: airlines, and I'm interviewing the CIO of a major resort next week, really interested to hear how they're dealing with this. Those are obviously depressed, and they've dialed everything down. But we were one of the first to report that the work-from-home pivot didn't buffer the decline in IT spending, which we're expecting to be down maybe as much as 5% this year, but it definitely offset it. What about cloud? We're seeing elevated levels in cloud demand, and you guys have offerings there. What are you seeing in cloud?

>>Yeah, I'll start, and then Paul, please weigh in. I think the move to the cloud that we've been witnessing, and the acceleration of that move that we've watched over the past several years, probably ramped up in intensity over the last two months. Things that had been on an 18-to-24-month roadmap have all of a sudden been accelerated into maybe this year. But in terms of the wholesale "everything moves to cloud and I abandon my on-premises

estate," I don't think we've seen that quite yet. I think the world is still hybrid when it comes to cloud, although I do think that the beneficiaries of this are probably not the number one or number two cloud providers but the rest of the hyperscalers, who are fighting for market share, because now they have an opportunity. Google, for example, a strategic partner of ours, has a huge offering when it comes to enabling work from home and remote work, so leveraging that as a platform and then extending into their enterprise offerings, I think, gives them a wedge that Amazon might not have. So it's an acceleration of interest, but I think it's just a continuation of the trend we've been seeing for years.

>>Yeah, and I would add a little bit. You know, IBM held their Think conference this past week; I don't know if you had an opportunity to participate. They're one of our OEM partners. When their CEO presented his opening remarks, it was really about digital transformation, and he put it down to two things. He said any business that's trying to transform is either talking about hybrid cloud or talking about AI and machine learning, and that's kind of it. Every digital business is talking in one of those categories. So when I look at Q1, it's interesting that we really didn't see anything other than, as Brian talked about, all the cloud business, which is some version of an acceleration. The customers in those industries that were in a position to accelerate and double down during this opportunity did so, and those that were not kind of just peeled back a little bit. But overall I would agree with IBM's assessment of the market: those are the two hot spots, hybrid cloud is hot, and the good news is we've got a nice hybrid operating model.

>>Yeah, Arvind Krishna talked about that. He said earlier, in his remarks on the earnings call and

in public statements, that IBM must win the architectural battle for hybrid cloud, and also that he wants to lead with a more technical sell, essentially, which to me means those two things are great news for you guys. Obviously Red Hat is the linchpin of that. I want to ask you about your conference, Data Driven. We were there last year; it was a really great, intimate event. Of course, you can't do the physical events anymore, so you've pushed to September and you're going all digital. Give us the update on that program.

>>We're eager to have the Cube participate in our September event, so I'm sure we'll be talking more about that in the coming weeks.

>>Awesome, we love it. You can tell Frank to put that in.

>>Exactly. So we've been participating in some of the other conferences, I think most notably last week, learning a lot and really trying to cherry-pick the best ideas and the best tactics for putting on a digital event. As we look to September, and as we look to put on a really rich digital event, one of the things that is first and foremost in our minds is that we want to produce more on-demand digital content, particularly from a technology standpoint. Our technology sessions last year were oversubscribed, and the digital format allows people to stream whenever they can, and frankly as many sessions as they might want, so I think we can be far more efficient in delivering technical content to the users of our technology. And then we're also eager to have, as we've done with Data Driven in years past, our customers tell the story of how they're using data. This year, certainly, I think we're going to hear a lot of stories about how they used data during this incredible crisis, and hopefully about renewal from crisis.

>>Well, one of my favorite interviews last year at your show

was the guys from DraftKings, so hopefully they'll be back and we'll have some football to talk about. Let's hope. I want to end with just this notion that we've been so tactical the last eight weeks, you guys too, I'm sure, just making sure you're there for customers, making sure your employees are okay. But as we start to think about coming out of this, into a post-COVID era, it looks like it's going to be with us for a while, but we're getting back to a quasi-opening. So I'm hearing hybrid is here to stay; we agree, for sure. Cyber resiliency is very interesting. One of the things we've said is that companies may sub-optimize near-term profitability to make sure they've got the flexibility and business resiliency in place, and that's obviously something that I think is good news for you guys. But I'll start with Paul, and then maybe Brian, you can bring us home: how do you see this sort of emergence from this lockdown into the post-COVID era?

>>Yeah, so this is a really interesting topic for me. In fact, I've had many discussions over the last couple of weeks with some of our investors as well as our executive staff. My personal belief is that the way buying and selling has occurred for IT, specifically at the enterprise level, is about to go through a transformation, no different than how we watched the transformation of SaaS businesses when they basically replaced the cold-calling salesperson with an inside sales and inbound marketing kind of effort, followed up with SDRs and BDRs. Because what we're finding is that our clients now are able to meet more frequently, because we don't have the friction of an airplane ride or a physical building to go through. That whole thing has been removed from the sales process. So it's interesting to me that one of the things I'm starting to see is that the amount of activity our sales organization is doing, and the

amount of calls that are going on, they happen to be online. You couple that with the cost savings of not traveling around the globe and not being in offices, and I really think that those companies that embrace this new model are going to find ways to penetrate more customers in a less expensive way. And I do believe that the professional enterprise salesperson of tomorrow is going to look different than they look today. So I'm super excited to be in a company that is smack dab in the middle of selling to enterprise clients, watching us learn together how we're going to buy, sell, and market to each other in this post-COVID way. Because the only thing I really do know is it's just not going to be the way it used to be. What is it going to look like? I think all of us are placing bets, and I don't think anybody has the answer yet, but it's going to look different, for sure.

>>Very, very thoughtful comments. And so, Brian, our thinking is that differentiation in this world, yes, gets won in digital. How is that affecting your marketing and your thinking around that?

>>We fortunately decided, coming into 2020, our fiscal 21, that we were going to overweight digital anyway. We felt it was far more effective: we were seeing far better conversion rates and way better ROI from very targeted digital campaigns versus general-purpose ABM types of efforts. So our strategy had essentially been set, and what this provided us is the opportunity to redirect all of the other funds into digital. So we have essentially a two-pronged marketing attack right now, which is digital creating inbounds, and BDRs calling on those inbounds that are created digitally. And so it's going to be a really interesting transition back, when physical events, if and when they do actually come back into form, you know, how much we decide to go back into that. I think

that, to some extent, we've talked about this in the past: the physical events, the sheer spectacle, and the audacity of having to spend a million dollars just to break through, that was an unsustainable model. So I think this is hastening perhaps the decline, or demise, of really silly marketing expense, and getting back to telling customers what they need to know to help and assist their buying journey and their investigation of a new technology.

>>I mean, the IT world is hybrid, and I think the events world is also going to be hybrid. To me, nice, intimate events are going to live on, but they're also going to have a major digital component. I'm very excited that there are a lot of learnings now in digital, especially around events, and by September a lot of the bugs are going to be worked out. We've been doing it so much it feels like 24/7. Really excited to have you guys on; thanks so much, and really looking forward to working with you in September at Data Driven. So guys, thanks a lot for coming on the Cube.

>>Oh my gosh, thank you, Dave. It's so nice to be here. Thank you.

>>All right, a pleasure. Thank you everybody, and thanks for watching. This is Dave Vellante for the Cube, and we'll see you next time. [Music]
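Brian's cyber-resiliency "staircase" above starts with a time machine of recovery points: walk backward from the newest snapshot until you find a clean one, then resume from it. The sketch below illustrates only that first step, and it is a hypothetical illustration, not Actifio's implementation; the `Snapshot` type and the `is_clean` integrity check are invented stand-ins for whatever recovery-point metadata and validation a real platform would provide.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Snapshot:
    snapshot_id: str
    taken_at: datetime

def find_last_clean_point(snapshots, is_clean):
    """Walk recovery points from newest to oldest and return the first one
    that passes the integrity check, i.e. the latest known-clean point."""
    for snap in sorted(snapshots, key=lambda s: s.taken_at, reverse=True):
        if is_clean(snap):
            return snap
    return None  # every point is suspect; fall back to manual forensics

# Hypothetical usage: hourly snapshots, with an infection partway through.
start = datetime(2020, 5, 1)
snaps = [Snapshot(f"snap-{i}", start + timedelta(hours=i)) for i in range(5)]
infected_after = start + timedelta(hours=2, minutes=30)
clean = find_last_clean_point(snaps, lambda s: s.taken_at < infected_after)
```

Once the clean point is identified, the "instant access" step Brian describes would mount that snapshot and resume the application from it, while the third phase (forensics and cleanup) proceeds separately.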

Published Date : May 20 2020

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
Brian Regan | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
Boston | LOCATION | 0.99+
Brian | PERSON | 0.99+
Dave Volante | PERSON | 0.99+
May 2020 | DATE | 0.99+
2008 | DATE | 0.99+
Brian Reagan | PERSON | 0.99+
Activia | ORGANIZATION | 0.99+
Mike Tyson | PERSON | 0.99+
September | DATE | 0.99+
Paul | PERSON | 0.99+
Palo Alto | LOCATION | 0.99+
2020 | DATE | 0.99+
Arvind Krista | PERSON | 0.99+
18 | QUANTITY | 0.99+
Paul Forte | PERSON | 0.99+
Amazon | ORGANIZATION | 0.99+
Dave | PERSON | 0.99+
last year | DATE | 0.99+
Ashley | PERSON | 0.99+
Frank | PERSON | 0.99+
next week | DATE | 0.99+
four years | QUANTITY | 0.99+
ibm | ORGANIZATION | 0.99+
Italy | LOCATION | 0.99+
next year | DATE | 0.99+
today | DATE | 0.99+
last week | DATE | 0.99+
two things | QUANTITY | 0.99+
last fall | DATE | 0.99+
first commentary | QUANTITY | 0.99+
third phase | QUANTITY | 0.99+
tomorrow | DATE | 0.98+
brian | PERSON | 0.98+
two hot spots | QUANTITY | 0.98+
24 month | QUANTITY | 0.98+
5% | QUANTITY | 0.98+
June | DATE | 0.97+
actifi | ORGANIZATION | 0.97+
Publix | ORGANIZATION | 0.97+
first staff | QUANTITY | 0.97+
two-pronged | QUANTITY | 0.97+
one | QUANTITY | 0.97+
this year | DATE | 0.96+
this year | DATE | 0.96+
fiscal 21 | DATE | 0.96+
Paul Paul | PERSON | 0.96+
first | QUANTITY | 0.95+
10 see | QUANTITY | 0.95+
last decade | DATE | 0.94+
Waltham | LOCATION | 0.94+
40 | QUANTITY | 0.93+
2008 financial crisis | EVENT | 0.9+
past week | DATE | 0.9+
nopal | ORGANIZATION | 0.89+
Actifio | ORGANIZATION | 0.88+
google | ORGANIZATION | 0.88+
last eight weeks | DATE | 0.87+
first quarter | DATE | 0.87+
one of my favorite interviews | QUANTITY | 0.86+
a million dollars | QUANTITY | 0.86+

Bill McGee, Trend Micro | AWS re Invent 2019


 

>>Live from Las Vegas, it's theCube, covering AWS re:Invent 2019. Brought to you by Amazon Web Services and Intel, along with its ecosystem partners.

>>Okay, welcome back, everyone. Cube coverage, Las Vegas, live action. It's re:Invent 2019, the third day of a massive show, and our seventh year of the eight years of re:Invent, documenting the history, the rise, and the changing landscape of the business. I'm John Furrier with Stu Miniman, my co-host. Our next guest is Bill McGee, senior vice president and general manager of the Hybrid Cloud Security group within Trend Micro, the company's lead executive for hybrid cloud security.

>>And I've been to every re:Invent, every single one.

>>Congratulations.

>>Thank you. Nice to be here.

>>So, eight years. What's changed in your mind? Real quick.

>>Wow. Certainly the amount of adoption: it's now massive, mainstream. You don't hear the question "Should I go to the cloud?" anymore; it's all about how, and how much. Probably the biggest change we've seen is how it's really being embraced all around the world. We're a global company, and we saw initially a US, Australia, and UK type of focus. Now it's all over the place, and it's really relevant everywhere.

>>You know, at least from my standpoint, and I have enough friends in the security industry, when we first started coming to the show, security was here, but now security is so front and center in the discussion of cloud that they hold a whole show for it. So give us the 2019 view of security inside the broader hybrid cloud discussion here at re:Invent.

>>Let me tell you a couple of things: what we're seeing within our customer base, and then what matters from a security perspective. We see some organizations doing cloud migration, moving workloads to the cloud in various forms. I had a couple of meetings yesterday:

one was a college evacuating their data center; the other was celebrating that two weeks ago they closed their data center, so that's a big step. Windows and Linux workloads moving to the cloud, and really changing existing security controls to work better in the cloud. But certainly what a lot of these cloud builders are here for is developing cloud-native applications. Back seven or eight years ago, that was on top of what now seem like pretty simple services, like S3 and EC2. Now there are containers and serverless and other platforms that people are using. And then the last thing: a lot of companies are establishing a cloud center of excellence and trying to optimize their use of the cloud. They still have compliance requirements they need to achieve. So that's what we see happening, and really the challenge for the customer is: how do we secure all this? How do we secure aggressive cloud-native application development? How do we help a customer achieve compliance easily from a cloud center of excellence? That's where we see ourselves fitting, and we made a big announcement a couple of weeks ago about a new platform we've created. I would love to talk about it.

>>Love that; let's dig into it. But first: we were at re:Inforce, Amazon's first security conference, and Dave Vellante and I were talking about cloud security versus on-prem security and what's happening here. And I had a conversation with someone who was close to the CIA, can't say his or her name, and they said cloud has changed the game for them, because their cost line was pretty much flat but the demand for missions kept scaling. So we're seeing that same dynamic you were referring to earlier: costs and data centers are kind of flat, but the demand for new applications keeps growing, so there's a real increase in demand for apps. Sure, this is the real driver: how people are flexing and deploying technology.
>>So security becomes really a built-in conversation. Can you comment on that dynamic? And what do you recommend?

>>Well, here are a couple of things we've seen. Again, we've been doing cloud security for about a decade, and really it was primarily focused on one service of AWS, which is EC2. Now, that's a pretty darn big service, widely used within their customer base, but there are 170 services, I think, at the most recent count, and the developers are embracing all these new services. We acquired a new capability in October, a company called Cloud Conformity, based in Sydney, Australia, very focused on AWS, which analyzes implementations against the AWS Well-Architected Framework. So the first step we see for customers is that you've got to get visibility into the use of the cloud for the security team: what services are being used? Then, can you set up a set of security guardrails that allow those services to be used in a secure manner? From there, we help our customers turn to more detailed, specialized protection of EC2, or containers, or serverless. That's what we've recognized ourselves: we had to create a very modest version of what Amazon has created, a platform that allows builders to connect and choose the security services they want.

>>How broad is your service base across all those services? What would you highlight?

>>I'll give you the ones where we provide a very large breadth of protection. In what we're calling the Cloud One Conformity service, the technology we acquired a couple of months ago, it cuts across about 70 services right now and gives you visibility into potential security configuration errors in your environment. Now, if it's in a dev environment, maybe that's not such a big deal, but if it's in production, that is a big deal. Even better, you can scan your CloudFormation templates on the way to being live. Then we have a set of specialized protection that will run on a workload and protect it, protect a containerized environment, or a library that can sit within a serverless application. That's kind of how we look at it.

>>All right.
If you're going to slow down the improvements that I've just made to my development lifecycle, I'm not interested. So the most important thing is: are you able to inject your security technology and still allow the customer to deliver at the rate they're currently delivering, and continuing to improve? That is by far the most important thing. Then it's: are your controls fitting into an environment in a way that is as easy as possible for the customer? One part that's been very critical for us: we've been a lead adopter of the AWS Marketplace, allowing customers to procure security technology easily. They don't actually have to talk to us to buy our product. That's pretty revolutionary. >>What about the number of breaches that are going on? What's changed with you guys over the years? Because new vectors are coming out; there's more surface area. Obviously, it's been discussed. What's changed most in your view? >>I'll tell you what we're worried about and what we expect to see, although I would say the evidence is early. The reality is, our traditional data centers were so porous at runtime, in terms of the infrastructure and vulnerabilities, that it was relatively easy for attackers to get in. The cloud has actually improved the level of security because of automation and fewer configuration errors. Unfortunately, what we expect is attackers moving to where the developers have moved: the dev pipeline, injecting code not at runtime but earlier in the lifecycle. We've seen evidence of container images up on Docker Hub getting infected and then developers just pulling them in without thinking about it. That's where attackers are going to move, to the dev pipeline. And we need to move some of our security technology to the dev pipeline to help customers defend themselves. >>What about international, geo issues around compliance? How is that changing the game, slowing it down, or accelerating it? Can you talk about that dynamic with regions?
Are you seeing that? >>Sure. You know, the US is the most innovative market and the most risk-taking market, and therefore people moved to the cloud quite bravely over this decade. Some of the markets... so, for example, we're a Japanese-headquartered company, and in general, Japanese companies, you know, really take a lot of considerations into account before they make that type of big bet. But now we're seeing it. We're seeing auto manufacturers embrace the cloud. So while it was a struggle for us in the early days, how regional the adoption of cloud was, that's not the case anymore. It's really a relevant conversation in every one of our markets. >>Bill, thank you for coming on the Cube and sharing your insights on hybrid cloud security. Got to ask you, to end the segment: what is going on for you this year? I see hybrid is in your title. Operating models, cloud center of gravity, cloud going to the edge or the data center, the whole operating model. What's on your mind this year? What are you trying to accomplish? What are you excited about? >>What we're really excited about is this product announcement we made, called Cloud One. And what Cloud One is, is a set of security services which customers can access through common access, common billing infrastructure, and common cloud account management, and choose what to use. You know, Andy put it pretty well in his keynote, where he talked about how he doesn't think of AWS as a Swiss Army knife. He thinks of it as a specialized set of tools that builders get to adopt. We want to create a set of security tools in a similar way, where customers can choose which of these specialized security services they want to adopt. >>Bill, great pleasure to meet you and have this conversation: a pro in the security area, an entrepreneur who sold his company to Trend Micro. This is the hybrid world. It's all about the cloud operating model, about agility and getting things done with application developers.
This is theCUBE, bringing you all the data from re:Invent. Stay with us for more coverage after this short break.
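One concrete defense against the dev-pipeline risk Bill describes in this segment (poisoned images pulled from Docker Hub) is to require that base images be pinned to immutable digests rather than mutable tags. Here is a minimal sketch of such a pipeline check over a Dockerfile's text; this is illustrative only, not a Trend Micro product feature:

```python
import re

# A FROM line is considered pinned only if it references an immutable sha256 digest.
DIGEST_RE = re.compile(r"^FROM\s+\S+@sha256:[0-9a-f]{64}\b", re.IGNORECASE)

def unpinned_base_images(dockerfile_text: str) -> list[str]:
    """Return FROM lines that use a mutable tag instead of an immutable digest."""
    return [
        line.strip()
        for line in dockerfile_text.splitlines()
        if line.strip().upper().startswith("FROM") and not DIGEST_RE.match(line.strip())
    ]

good = "FROM python@sha256:" + "a" * 64 + "\nRUN pip install flask\n"
bad = "FROM python:3.8\nRUN pip install flask\n"
print(unpinned_base_images(good))  # no findings
print(unpinned_base_images(bad))   # the mutable-tag FROM line
```

A digest pin means the build always resolves to the exact image content that was reviewed, so a later compromise of the tag on the registry cannot silently enter the pipeline.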

Published Date : Dec 6 2019



Teresa Carlson, AWS Worldwide Public Sector | AWS re:Invent 2019


 

>>Live from Las Vegas, it's theCUBE, covering AWS re:Invent 2019. Brought to you by Amazon Web Services and Intel, along with its ecosystem partners. >>Welcome back to the Cube, here live in Las Vegas for AWS re:Invent. I'm John Furrier, always extracting the signal from the noise. We're here for our seventh re:Invent of the eight years that they've had it. What a wave. One of the biggest waves is the modernization of procurement, the modernization of business, commercial business, and the rapid acceleration of public sector. We're here with the chief of public sector for AWS, Teresa Carlson, vice president, public sector, globally. Great to have you. >>So great to have theCUBE again this year. We appreciate you being here. >>So we're just seeing so much acceleration of modernization. Even on the commercial side, Andy talks about transformation; it's just hardcore on the public sector side. You have so many different areas transforming faster because they haven't transformed before. >>That's correct. This is a lot of change. >>What's changed the most for you in your business? >>Well, again, I'll be here 10 years at AWS, and this is my eighth re:Invent. What really changed, which was very exciting this year, is on Monday we had 550 international government executives here from 40 countries who were talking about their modernization efforts at every level. Now again, think about that: 40 different governments, 550 executives. We had a fantastic day for them planned. It was really phenomenal, because of the way these international governments think about their budget: how much are they going to use for maintaining? They want to get that less and less, and instead have a bucket for modernization and then, John, a bucket for innovation, so that they not only modernize but are really looking at innovation cycles. So that's a big one. And then you heard from some of our customers at the breakfast this morning, from ATF, the Bureau of Alcohol, Tobacco, Firearms and Explosives, part of the Department of Justice, on what they're doing: they completely moved to the cloud and got rid of 20 years of technical debt; to the Veterans Administration, on what they're doing for VA benefits; to educational institutions. >>And you had on stage at the keynote Cerner, the healthcare company, and what struck me about that, and I think it relates to you, because I want to get your reaction: healthcare is such an acute example that everyone can relate to, with rising costs. So cloud helping reduce costs, increase efficiencies, and improve patient care is a triple win. The same thing happens in public sector. There's no place to hide anymore. You have bona fide efficiencies that can come right out of the gate with cloud, plus innovation. And it's happening in all the sectors within the public sector. >>So true. Well, Cerner is a great example, because they won the award at the Veterans Administration to do the entire medical records modernization. So you have a company on stage that is as commercial as they are public sector, going into these large modernization efforts. And as you said, these are not easy. This takes focus and leadership and a real culture change to make these things happen. >>You know, the international expansion is impressive. We saw each other in London; we did the healthcare drill-down at your offices with, of course, national health. And then you guys were in Bahrain. And what I observed is, it's not like these organizations are way behind; I mean, especially the ones that have moved to the cloud are moving really fast. >>Well, they don't have as much technical debt internationally as what we see here in the U.S. Like, I was just in Africa, and you know what? We talked about digitizing paper. Well, there's no technology on that end. It's kind of exciting, because they can literally start from square one and get going.
And there's a real hunger and a need to make that happen. So it's different for every country in terms of where they are in their cloud journey. >>So I want to ask you about some of the big deals. I see JEDI is in the news, and you can't talk about it because it's in protest and in legal issues. But you have a lot of big deals that you've done. Can you share some color commentary on the big deals and what they really mean? >>Yeah, well, first of all, let me just say, with the Department of Defense, JEDI or no JEDI, we have a very significant business, you know, doing work at every part of defense: Army, Navy, Air Force, and the intelligence community, all with missions for the DoD. And we are not slowing down in DoD. We had, like, 250 people at a breakfast yesterday giving ideas on what they're doing and sharing best practices around defense. So we're not slowing down in DoD; we're really excited, and we have amazing partners doing mission work with us. But in terms of some really significant things that have happened: we did a press announcement today with FINRA, the financial regulatory authority here in the U.S. that regulates markets, and this is some of the largest financial transaction processing you'll ever see being run on the cloud. The program is called CAT, the Consolidated Audit Trail. If you remember the flash crash and the markets kind of going crazy, back in 2008 when it started, FINRA started on a journey to try to understand why these market events were happening, and now they have what's called CAT, which will process more than 100 billion market points a day on the cloud. And this is what we know of right now; they'll be looking for indicators of nefarious behavior within the markets, and they'll look for those indicators on a continuous basis. And what we've talked about is:
We don't even know what we don't know yet, because we're going to start processing and crunching so much data coming out of all kinds of groups that they're working with. And this is an important point: even FINRA will be retiring technical debt that they have. As they roll out CAT, they'll be retiring other systems, like OATS and other programs that they
So technical debt then has been built up years of on years of not modernizing, just kind of maintaining a status quo with no new insides or analytics. You couldn't add any new tooling. So that is where you see agencies like a T F. That has said, Wow, if I'm gonna if I'm gonna have a modern agency that tracks things like forensics understands the machine learning of what's happening in justice and public safety, I need to have the most modern tools. And I can't do that on an outdated system. So that's what we kind of call technical death that just maintains that system without having anything new that you're adding to >>their capabilities lag. Everything's products bad. Okay, great. Thanks for definite. I gotta ask you about something that's near and dear to our heart collaboration. If you look at the big successes in the world and within Amazon Quantum Caltex partnering on the quantum side, you've done a lot of collaboration with Cal Cal Poly for ground station Amazon Educate. You've been very collaborative in your business, and that's a continuing to be a best practice you have now new things like the cloud innovation centers. Talk about that dynamic and how collaboration has become an important part of your business model. >>What we use their own principles from Amazon. We got building things in our plan. Innovation centers. We start out piloting those two to see, Could they work? And it's really a public private partnership between eight MPs and universities, but its universities that really want to do something. And Cal Poly's a great example. Arizona State University A great example. The number one most innovative university in the US for like, four years in a row. And what we do is we go in and we do these public sector challenges. So the collaboration happens. 
John, between the public sector Entity, university with students and us, and what we bring to the table is technical talent, air technology and our mechanisms and processes, like they're working backwards processes, and they were like, We want you to bring your best and brightest students. Let's bring public sector in the bowl. They bring challenges there, riel that we can take on, and then they can go back and absorb, and they're pretty exciting. I today I talked about we have over 44 today that we've documented were working at Cal Poly. The one in Arizona State University is about smart cities. And then you heard We're announcing new ones. We've got two in France, one in Germany now, one that we're doing on cybersecurity with our mighty in Australia to be sitting bata rain. So you're going to see us Add a lot more of these and we're getting the results out of them. So you know we won't do if we don't like him. But right now we really like these partnerships. >>Results are looking good. What's going on with >>you? All right. And I'll tell you why. That why they're different, where we are taking on riel public sector issues and challenges that are happening, they're not kind of pie in the sky. We might get there because those are good things to do. But what we want to do is let's tackle things that are really homelessness, opioid crisis, human sex trafficking, that we're seeing things that are really in these communities and those air kind of grand. But then we're taking on areas like farming where we talked about Can we get strawberries rotting on the vine out of the field into the market before you lose billions of dollars in California. So it's things like that that were so its challenges that are quick and riel. And the thing about Cloud is you can create an application and solution and test it out very rapidly without high cost of doing that. No technical Dan, >>you mentioned Smart Cities. I just attended a session. 
Marty Walsh, the mayor of Boston, has got this 50-year smart city plan, and it's pretty impressive, but it's a heavy lift. So what do you see going on in smart cities? You really can't do it without the cloud, which was kind of my big point: cloud, where's the data? What do you say? >>Cloud and IoT are a big part of these, all the centers that Andy talked about yesterday in his keynote, and it's why the 5G partnerships are so important. These centers are going to be everywhere, and you won't even know they exist, because they can be everywhere. And if you have the 5G capabilities to move those communications really fast, and encrypt them so you have all the security you need, this is game-changing. But I'll give you an example; I'll go back to the kids for a minute. At Arizona State University, they put IoT sensors everywhere. They know traffic patterns, how many parking slots are filled, what utilities and water are being used, whether their trash bins are being filled, the number of seats being taken up in stadiums. It's things like that that they're really working to understand: what are the dynamics of their city, and the traffic flow around that smart city? And then they're adding things on for the students, like Alexa skills: where's all the activity? So you're adding things like Alexa apps, which go into a smart-city kind of dynamic: where are the best deals on books, on clothes, what pizza is on sale tonight, and so on. And then there are things like you saw today on Singapore, where they're taking data from all different agencies and presenting it back to the citizen. Take their child as an example, from day one of a birth, or even before: where are all the services, what do I do, how do I track these things, how do I navigate my city to get all those services? So anyone can find these things. They're not pie in the sky; they're real, and they're actually happening.
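The sensor rollups Teresa describes (parking, trash bins, stadium seats) reduce to simple stream aggregation over IoT events. A toy sketch, with invented lot names and event shapes, not anything from ASU's actual system:

```python
from collections import Counter

def occupancy_by_lot(events):
    """Compute net occupancy per parking lot from arrive/depart sensor events."""
    tally = Counter()
    for lot, kind in events:
        tally[lot] += 1 if kind == "arrive" else -1
    return dict(tally)

# Hypothetical event stream from parking sensors.
events = [("LotA", "arrive"), ("LotA", "arrive"), ("LotB", "arrive"), ("LotA", "depart")]
print(occupancy_by_lot(events))
```

A real deployment would do this aggregation continuously at the edge or in a streaming service, but the core idea, turning raw sensor events into a live view of the city, is exactly this rollup.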
>>It seems like they've instrumented a lot of the components of the city, learned from that, and then decided: OK, where do we double down, where do we place resources? >>You're making a resilient government, a resilient town. I mean, these are things that citizens can really get involved in and have a voice in doing. >>Teresa, I want to say congratulations on your success. I know it's not for the faint of heart in the public sector these days: a lot of blockers, a lot of politics, a lot of government blockers, and the old procurement system's technical debt. I mean, Windows 95 is probably still in a bunch of PCs and F-15 fighters. But you've got a great job, you've been doing a great job, and you're riding that wave. So congratulations. >>Well, I'll just say it's worth it. It is worth it. We are committed to public sector, and we really want to see everyone, from our warfighters to our citizens, have the capabilities they need. >>So
He is actually still here with us today, and now he's a young adult taking care of other young Children with cancer, using gaming technologies with their partner, twitch and eight MPs and helping analyze and understand what these young affected Children with cancer need, both that personally and academically and the tools he has He's helping really permit office and get back and it's really hard, Warren says. I was happy. My partner, Mike Level, who is my Gran's commercial sales in business, and I ran public Sector Day. We're honored to give them at a small token of our gift from A to B s to help support their efforts. >>Congratulates, We appreciate you coming on the Cube sharing the update on good luck into 2020. Great to see you 10 years at AWS day one. Still, >>it's day one. I feel like I started >>it like still, like 10 o'clock in the morning or like still a day it wasn't like >>I still wake up every day with the jump in my staff and excited about what I'm gonna do. And so I am. You know, I am really excited that we're doing and like Andy and I say we're just scratching the surface. >>You're a fighter. You are charging We love you, Great executive. You're the chief of public. Get a great job. Great, too. Follow you and ride the wave with Amazon and cover. You guys were documenting history. >>Yeah, exactly. We're in happy holidays to you all and help seeing our seventh and 20 >>so much. Okay, Cube coverage here live in Las Vegas. This is the cube coverage. Extracting the signals. Wanna shout out to eight of us? An intel for putting on the two sets without sponsorship, we wouldn't be able to support the mission of the Cube. I want to thank them. And thank you for watching with more after this short break.

Published Date : Dec 5 2019

