Image Title

Search Results for 42 analysis:

Breaking Analysis: Databricks faces critical strategic decisions…here’s why


 

>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> Spark became a top level Apache project in 2014, and then shortly thereafter, burst onto the big data scene. Spark, along with the cloud, transformed and in many ways, disrupted the big data market. Databricks optimized its tech stack for Spark and took advantage of the cloud to really cleverly deliver a managed service that has become a leading AI and data platform among data scientists and data engineers. However, emerging customer data requirements are shifting into a direction that will cause modern data platform players generally and Databricks, specifically, we think, to make some key directional decisions and perhaps even reinvent themselves. Hello and welcome to this week's wikibon theCUBE Insights, powered by ETR. In this Breaking Analysis, we're going to do a deep dive into Databricks. We'll explore its current impressive market momentum. We're going to use some ETR survey data to show that, and then we'll lay out how customer data requirements are changing and what the ideal data platform will look like in the midterm future. We'll then evaluate core elements of the Databricks portfolio against that vision, and then we'll close with some strategic decisions that we think the company faces. And to do so, we welcome in our good friend, George Gilbert, former equities analyst, market analyst, and current Principal at TechAlpha Partners. George, good to see you. Thanks for coming on. >> Good to see you, Dave. >> All right, let me set this up. We're going to start by taking a look at where Databricks sits in the market in terms of how customers perceive the company and what it's momentum looks like. And this chart that we're showing here is data from ETS, the emerging technology survey of private companies. The N is 1,421. What we did is we cut the data on three sectors, analytics, database-data warehouse, and AI/ML. The vertical axis is a measure of customer sentiment, which evaluates an IT decision maker's awareness of the firm and the likelihood of engaging and/or purchase intent. The horizontal axis shows mindshare in the dataset, and we've highlighted Databricks, which has been a consistent high performer in this survey over the last several quarters. And as we, by the way, just as aside as we previously reported, OpenAI, which burst onto the scene this past quarter, leads all names, but Databricks is still prominent. You can see that the ETR shows some open source tools for reference, but as far as firms go, Databricks is very impressively positioned. Now, let's see how they stack up to some mainstream cohorts in the data space, against some bigger companies and sometimes public companies. This chart shows net score on the vertical axis, which is a measure of spending momentum and pervasiveness in the data set is on the horizontal axis. You can see that chart insert in the upper right, that informs how the dots are plotted, and net score against shared N. And that red dotted line at 40% indicates a highly elevated net score, anything above that we think is really, really impressive. And here we're just comparing Databricks with Snowflake, Cloudera, and Oracle. And that squiggly line leading to Databricks shows their path since 2021 by quarter. And you can see it's performing extremely well, maintaining an elevated net score and net range. Now it's comparable in the vertical axis to Snowflake, and it consistently is moving to the right and gaining share. Now, why did we choose to show Cloudera and Oracle? The reason is that Cloudera got the whole big data era started and was disrupted by Spark. And of course the cloud, Spark and Databricks and Oracle in many ways, was the target of early big data players like Cloudera. Take a listen to Cloudera CEO at the time, Mike Olson. This is back in 2010, first year of theCUBE, play the clip. >> Look, back in the day, if you had a data problem, if you needed to run business analytics, you wrote the biggest check you could to Sun Microsystems, and you bought a great big, single box, central server, and any money that was left over, you handed to Oracle for a database licenses and you installed that database on that box, and that was where you went for data. That was your temple of information. >> Okay? So Mike Olson implied that monolithic model was too expensive and inflexible, and Cloudera set out to fix that. But the best laid plans, as they say, George, what do you make of the data that we just shared? >> So where Databricks has really come up out of sort of Cloudera's tailpipe was they took big data processing, made it coherent, made it a managed service so it could run in the cloud. So it relieved customers of the operational burden. Where they're really strong and where their traditional meat and potatoes or bread and butter is the predictive and prescriptive analytics that building and training and serving machine learning models. They've tried to move into traditional business intelligence, the more traditional descriptive and diagnostic analytics, but they're less mature there. So what that means is, the reason you see Databricks and Snowflake kind of side by side is there are many, many accounts that have both Snowflake for business intelligence, Databricks for AI machine learning, where Snowflake, I'm sorry, where Databricks also did really well was in core data engineering, refining the data, the old ETL process, which kind of turned into ELT, where you loaded into the analytic repository in raw form and refine it. And so people have really used both, and each is trying to get into the other. >> Yeah, absolutely. We've reported on this quite a bit. Snowflake, kind of moving into the domain of Databricks and vice versa. And the last bit of ETR evidence that we want to share in terms of the company's momentum comes from ETR's Round Tables. They're run by Erik Bradley, and now former Gartner analyst and George, your colleague back at Gartner, Daren Brabham. And what we're going to show here is some direct quotes of IT pros in those Round Tables. There's a data science head and a CIO as well. Just make a few call outs here, we won't spend too much time on it, but starting at the top, like all of us, we can't talk about Databricks without mentioning Snowflake. Those two get us excited. Second comment zeros in on the flexibility and the robustness of Databricks from a data warehouse perspective. And then the last point is, despite competition from cloud players, Databricks has reinvented itself a couple of times over the year. And George, we're going to lay out today a scenario that perhaps calls for Databricks to do that once again. >> Their big opportunity and their big challenge for every tech company, it's managing a technology transition. The transition that we're talking about is something that's been bubbling up, but it's really epical. First time in 60 years, we're moving from an application-centric view of the world to a data-centric view, because decisions are becoming more important than automating processes. So let me let you sort of develop. >> Yeah, so let's talk about that here. We going to put up some bullets on precisely that point and the changing sort of customer environment. So you got IT stacks are shifting is George just said, from application centric silos to data centric stacks where the priority is shifting from automating processes to automating decision. You know how look at RPA and there's still a lot of automation going on, but from the focus of that application centricity and the data locked into those apps, that's changing. Data has historically been on the outskirts in silos, but organizations, you think of Amazon, think Uber, Airbnb, they're putting data at the core, and logic is increasingly being embedded in the data instead of the reverse. In other words, today, the data's locked inside the app, which is why you need to extract that data is sticking it to a data warehouse. The point, George, is we're putting forth this new vision for how data is going to be used. And you've used this Uber example to underscore the future state. Please explain? >> Okay, so this is hopefully an example everyone can relate to. The idea is first, you're automating things that are happening in the real world and decisions that make those things happen autonomously without humans in the loop all the time. So to use the Uber example on your phone, you call a car, you call a driver. Automatically, the Uber app then looks at what drivers are in the vicinity, what drivers are free, matches one, calculates an ETA to you, calculates a price, calculates an ETA to your destination, and then directs the driver once they're there. The point of this is that that cannot happen in an application-centric world very easily because all these little apps, the drivers, the riders, the routes, the fares, those call on data locked up in many different apps, but they have to sit on a layer that makes it all coherent. >> But George, so if Uber's doing this, doesn't this tech already exist? Isn't there a tech platform that does this already? >> Yes, and the mission of the entire tech industry is to build services that make it possible to compose and operate similar platforms and tools, but with the skills of mainstream developers in mainstream corporations, not the rocket scientists at Uber and Amazon. >> Okay, so we're talking about horizontally scaling across the industry, and actually giving a lot more organizations access to this technology. So by way of review, let's summarize the trend that's going on today in terms of the modern data stack that is propelling the likes of Databricks and Snowflake, which we just showed you in the ETR data and is really is a tailwind form. So the trend is toward this common repository for analytic data, that could be multiple virtual data warehouses inside of Snowflake, but you're in that Snowflake environment or Lakehouses from Databricks or multiple data lakes. And we've talked about what JP Morgan Chase is doing with the data mesh and gluing data lakes together, you've got various public clouds playing in this game, and then the data is annotated to have a common meaning. In other words, there's a semantic layer that enables applications to talk to the data elements and know that they have common and coherent meaning. So George, the good news is this approach is more effective than the legacy monolithic models that Mike Olson was talking about, so what's the problem with this in your view? >> So today's data platforms added immense value 'cause they connected the data that was previously locked up in these monolithic apps or on all these different microservices, and that supported traditional BI and AI/ML use cases. But now if we want to build apps like Uber or Amazon.com, where they've got essentially an autonomously running supply chain and e-commerce app where humans only care and feed it. But the thing is figuring out what to buy, when to buy, where to deploy it, when to ship it. We needed a semantic layer on top of the data. So that, as you were saying, the data that's coming from all those apps, the different apps that's integrated, not just connected, but it means the same. And the issue is whenever you add a new layer to a stack to support new applications, there are implications for the already existing layers, like can they support the new layer and its use cases? So for instance, if you add a semantic layer that embeds app logic with the data rather than vice versa, which we been talking about and that's been the case for 60 years, then the new data layer faces challenges that the way you manage that data, the way you analyze that data, is not supported by today's tools. >> Okay, so actually Alex, bring me up that last slide if you would, I mean, you're basically saying at the bottom here, today's repositories don't really do joins at scale. The future is you're talking about hundreds or thousands or millions of data connections, and today's systems, we're talking about, I don't know, 6, 8, 10 joins and that is the fundamental problem you're saying, is a new data error coming and existing systems won't be able to handle it? >> Yeah, one way of thinking about it is that even though we call them relational databases, when we actually want to do lots of joins or when we want to analyze data from lots of different tables, we created a whole new industry for analytic databases where you sort of mung the data together into fewer tables. So you didn't have to do as many joins because the joins are difficult and slow. And when you're going to arbitrarily join thousands, hundreds of thousands or across millions of elements, you need a new type of database. We have them, they're called graph databases, but to query them, you go back to the prerelational era in terms of their usability. >> Okay, so we're going to come back to that and talk about how you get around that problem. But let's first lay out what the ideal data platform of the future we think looks like. And again, we're going to come back to use this Uber example. In this graphic that George put together, awesome. We got three layers. The application layer is where the data products reside. The example here is drivers, rides, maps, routes, ETA, et cetera. The digital version of what we were talking about in the previous slide, people, places and things. The next layer is the data layer, that breaks down the silos and connects the data elements through semantics and everything is coherent. And then the bottom layers, the legacy operational systems feed that data layer. George, explain what's different here, the graph database element, you talk about the relational query capabilities, and why can't I just throw memory at solving this problem? >> Some of the graph databases do throw memory at the problem and maybe without naming names, some of them live entirely in memory. And what you're dealing with is a prerelational in-memory database system where you navigate between elements, and the issue with that is we've had SQL for 50 years, so we don't have to navigate, we can say what we want without how to get it. That's the core of the problem. >> Okay. So if I may, I just want to drill into this a little bit. So you're talking about the expressiveness of a graph. Alex, if you'd bring that back out, the fourth bullet, expressiveness of a graph database with the relational ease of query. Can you explain what you mean by that? >> Yeah, so graphs are great because when you can describe anything with a graph, that's why they're becoming so popular. Expressive means you can represent anything easily. They're conducive to, you might say, in a world where we now want like the metaverse, like with a 3D world, and I don't mean the Facebook metaverse, I mean like the business metaverse when we want to capture data about everything, but we want it in context, we want to build a set of digital twins that represent everything going on in the world. And Uber is a tiny example of that. Uber built a graph to represent all the drivers and riders and maps and routes. But what you need out of a database isn't just a way to store stuff and update stuff. You need to be able to ask questions of it, you need to be able to query it. And if you go back to prerelational days, you had to know how to find your way to the data. It's sort of like when you give directions to someone and they didn't have a GPS system and a mapping system, you had to give them turn by turn directions. Whereas when you have a GPS and a mapping system, which is like the relational thing, you just say where you want to go, and it spits out the turn by turn directions, which let's say, the car might follow or whoever you're directing would follow. But the point is, it's much easier in a relational database to say, "I just want to get these results. You figure out how to get it." The graph database, they have not taken over the world because in some ways, it's taking a 50 year leap backwards. >> Alright, got it. Okay. Let's take a look at how the current Databricks offerings map to that ideal state that we just laid out. So to do that, we put together this chart that looks at the key elements of the Databricks portfolio, the core capability, the weakness, and the threat that may loom. Start with the Delta Lake, that's the storage layer, which is great for files and tables. It's got true separation of compute and storage, I want you to double click on that George, as independent elements, but it's weaker for the type of low latency ingest that we see coming in the future. And some of the threats highlighted here. AWS could add transactional tables to S3, Iceberg adoption is picking up and could accelerate, that could disrupt Databricks. George, add some color here please? >> Okay, so this is the sort of a classic competitive forces where you want to look at, so what are customers demanding? What's competitive pressure? What are substitutes? Even what your suppliers might be pushing. Here, Delta Lake is at its core, a set of transactional tables that sit on an object store. So think of it in a database system, this is the storage engine. So since S3 has been getting stronger for 15 years, you could see a scenario where they add transactional tables. We have an open source alternative in Iceberg, which Snowflake and others support. But at the same time, Databricks has built an ecosystem out of tools, their own and others, that read and write to Delta tables, that's what makes the Delta Lake and ecosystem. So they have a catalog, the whole machine learning tool chain talks directly to the data here. That was their great advantage because in the past with Snowflake, you had to pull all the data out of the database before the machine learning tools could work with it, that was a major shortcoming. They fixed that. But the point here is that even before we get to the semantic layer, the core foundation is under threat. >> Yep. Got it. Okay. We got a lot of ground to cover. So we're going to take a look at the Spark Execution Engine next. Think of that as the refinery that runs really efficient batch processing. That's kind of what disrupted the DOOp in a large way, but it's not Python friendly and that's an issue because the data science and the data engineering crowd are moving in that direction, and/or they're using DBT. George, we had Tristan Handy on at Supercloud, really interesting discussion that you and I did. Explain why this is an issue for Databricks? >> So once the data lake was in place, what people did was they refined their data batch, and Spark has always had streaming support and it's gotten better. The underlying storage as we've talked about is an issue. But basically they took raw data, then they refined it into tables that were like customers and products and partners. And then they refined that again into what was like gold artifacts, which might be business intelligence metrics or dashboards, which were collections of metrics. But they were running it on the Spark Execution Engine, which it's a Java-based engine or it's running on a Java-based virtual machine, which means all the data scientists and the data engineers who want to work with Python are really working in sort of oil and water. Like if you get an error in Python, you can't tell whether the problems in Python or where it's in Spark. There's just an impedance mismatch between the two. And then at the same time, the whole world is now gravitating towards DBT because it's a very nice and simple way to compose these data processing pipelines, and people are using either SQL in DBT or Python in DBT, and that kind of is a substitute for doing it all in Spark. So it's under threat even before we get to that semantic layer, it so happens that DBT itself is becoming the authoring environment for the semantic layer with business intelligent metrics. But that's again, this is the second element that's under direct substitution and competitive threat. >> Okay, let's now move down to the third element, which is the Photon. Photon is Databricks' BI Lakehouse, which has integration with the Databricks tooling, which is very rich, it's newer. And it's also not well suited for high concurrency and low latency use cases, which we think are going to increasingly become the norm over time. George, the call out threat here is customers want to connect everything to a semantic layer. Explain your thinking here and why this is a potential threat to Databricks? >> Okay, so two issues here. What you were touching on, which is the high concurrency, low latency, when people are running like thousands of dashboards and data is streaming in, that's a problem because SQL data warehouse, the query engine, something like that matures over five to 10 years. It's one of these things, the joke that Andy Jassy makes just in general, he's really talking about Azure, but there's no compression algorithm for experience. The Snowflake guy started more than five years earlier, and for a bunch of reasons, that lead is not something that Databricks can shrink. They'll always be behind. So that's why Snowflake has transactional tables now and we can get into that in another show. But the key point is, so near term, it's struggling to keep up with the use cases that are core to business intelligence, which is highly concurrent, lots of users doing interactive query. But then when you get to a semantic layer, that's when you need to be able to query data that might have thousands or tens of thousands or hundreds of thousands of joins. And that's a SQL query engine, traditional SQL query engine is just not built for that. That's the core problem of traditional relational databases. >> Now this is a quick aside. We always talk about Snowflake and Databricks in sort of the same context. We're not necessarily saying that Snowflake is in a position to tackle all these problems. We'll deal with that separately. So we don't mean to imply that, but we're just sort of laying out some of the things that Snowflake or rather Databricks customers we think, need to be thinking about and having conversations with Databricks about and we hope to have them as well. We'll come back to that in terms of sort of strategic options. But finally, when come back to the table, we have Databricks' AI/ML Tool Chain, which has been an awesome capability for the data science crowd. It's comprehensive, it's a one-stop shop solution, but the kicker here is that it's optimized for supervised model building. And the concern is that foundational models like GPT could cannibalize the current Databricks tooling, but George, can't Databricks, like other software companies, integrate foundation model capabilities into its platform? >> Okay, so the sound bite answer to that is sure, IBM 3270 terminals could call out to a graphical user interface when they're running on the XT terminal, but they're not exactly good citizens in that world. The core issue is Databricks has this wonderful end-to-end tool chain for training, deploying, monitoring, running inference on supervised models. But the paradigm there is the customer builds and trains and deploys each model for each feature or application. In a world of foundation models which are pre-trained and unsupervised, the entire tool chain is different. So it's not like Databricks can junk everything they've done and start over with all their engineers. They have to keep maintaining what they've done in the old world, but they have to build something new that's optimized for the new world. It's a classic technology transition and their mentality appears to be, "Oh, we'll support the new stuff from our old stuff." Which is suboptimal, and as we'll talk about, their biggest patron and the company that put them on the map, Microsoft, really stopped working on their old stuff three years ago so that they could build a new tool chain optimized for this new world. >> Yeah, and so let's sort of close with what we think the options are and decisions that Databricks has for its future architecture. They're smart people. I mean we've had Ali Ghodsi on many times, super impressive. I think they've got to be keenly aware of the limitations, what's going on with foundation models. But at any rate, here in this chart, we lay out sort of three scenarios. One is re-architect the platform by incrementally adopting new technologies. And example might be to layer a graph query engine on top of its stack. They could license key technologies like graph database, they could get aggressive on M&A and buy-in, relational knowledge graphs, semantic technologies, vector database technologies. George, as David Floyer always says, "A lot of ways to skin a cat." We've seen companies like, even think about EMC maintained its relevance through M&A for many, many years. George, give us your thought on each of these strategic options? >> Okay, I find this question the most challenging 'cause remember, I used to be an equity research analyst. I worked for Frank Quattrone, we were one of the top tech shops in the banking industry, although this is 20 years ago. But the M&A team was the top team in the industry and everyone wanted them on their side. And I remember going to meetings with these CEOs, where Frank and the bankers would say, "You want us for your M&A work because we can do better." And they really could do better. But in software, it's not like with EMC in hardware because with hardware, it's easier to connect different boxes. With software, the whole point of a software company is to integrate and architect the components so they fit together and reinforce each other, and that makes M&A harder. You can do it, but it takes a long time to fit the pieces together. Let me give you examples. If they put a graph query engine, let's say something like TinkerPop, on top of, I don't even know if it's possible, but let's say they put it on top of Delta Lake, then you have this graph query engine talking to their storage layer, Delta Lake. But if you want to do analysis, you got to put the data in Photon, which is not really ideal for highly connected data. If you license a graph database, then most of your data is in the Delta Lake and how do you sync it with the graph database? If you do sync it, you've got data in two places, which kind of defeats the purpose of having a unified repository. I find this semantic layer option in number three actually more promising, because that's something that you can layer on top of the storage layer that you have already. You just have to figure out then how to have your query engines talk to that. What I'm trying to highlight is, it's easy as an analyst to say, "You can buy this company or license that technology." But the really hard work is making it all work together and that is where the challenge is. >> Yeah, and well look, I thank you for laying that out. We've seen it, certainly Microsoft and Oracle. I guess you might argue that well, Microsoft had a monopoly in its desktop software and was able to throw off cash for a decade plus while it's stock was going sideways. Oracle had won the database wars and had amazing margins and cash flow to be able to do that. Databricks isn't even gone public yet, but I want to close with some of the players to watch. Alex, if you'd bring that back up, number four here. AWS, we talked about some of their options with S3 and it's not just AWS, it's blob storage, object storage. Microsoft, as you sort of alluded to, was an early go-to market channel for Databricks. We didn't address that really. So maybe in the closing comments we can. Google obviously, Snowflake of course, we're going to dissect their options in future Breaking Analysis. Dbt labs, where do they fit? Bob Muglia's company, Relational.ai, why are these players to watch George, in your opinion? >> So everyone is trying to assemble and integrate the pieces that would make building data applications, data products easy. And the critical part isn't just assembling a bunch of pieces, which is traditionally what AWS did. It's a Unix ethos, which is we give you the tools, you put 'em together, 'cause you then have the maximum choice and maximum power. So what the hyperscalers are doing is they're taking their key value stores, in the case of ASW it's DynamoDB, in the case of Azure it's Cosmos DB, and each are putting a graph query engine on top of those. So they have a unified storage and graph database engine, like all the data would be collected in the key value store. Then you have a graph database, that's how they're going to be presenting a foundation for building these data apps. Dbt labs is putting a semantic layer on top of data lakes and data warehouses and as we'll talk about, I'm sure in the future, that makes it easier to swap out the underlying data platform or swap in new ones for specialized use cases. Snowflake, what they're doing, they're so strong in data management and with their transactional tables, what they're trying to do is take in the operational data that used to be in the province of many state stores like MongoDB and say, "If you manage that data with us, it'll be connected to your analytic data without having to send it through a pipeline." And that's hugely valuable. Relational.ai is the wildcard, 'cause what they're trying to do, it's almost like a holy grail where you're trying to take the expressiveness of connecting all your data in a graph but making it as easy to query as you've always had it in a SQL database or I should say, in a relational database. And if they do that, it's sort of like, it'll be as easy to program these data apps as a spreadsheet was compared to procedural languages, like BASIC or Pascal. That's the implications of Relational.ai. >> Yeah, and again, we talked before, why can't you just throw this all in memory? We're talking in that example of really getting down to differences in how you lay the data out on disk in really, new database architecture, correct? >> Yes. And that's why it's not clear that you could take a data lake or even a Snowflake and why you can't put a relational knowledge graph on those. You could potentially put a graph database, but it'll be compromised because to really do what Relational.ai has done, which is the ease of Relational on top of the power of graph, you actually need to change how you're storing your data on disk or even in memory. So you can't, in other words, it's not like, oh we can add graph support to Snowflake, 'cause if you did that, you'd have to change, or in your data lake, you'd have to change how the data is physically laid out. And then that would break all the tools that talk to that currently. >> What in your estimation, is the timeframe where this becomes critical for a Databricks and potentially Snowflake and others? I mentioned earlier midterm, are we talking three to five years here? Are we talking end of decade? What's your radar say? >> I think something surprising is going on that's going to sort of come up the tailpipe and take everyone by storm. All the hype around business intelligence metrics, which is what we used to put in our dashboards where bookings, billings, revenue, customer, those things, those were the key artifacts that used to live in definitions in your BI tools, and DBT has basically created a standard for defining those so they live in your data pipeline or they're defined in their data pipeline and executed in the data warehouse or data lake in a shared way, so that all tools can use them. This sounds like a digression, it's not. All this stuff about data mesh, data fabric, all that's going on is we need a semantic layer and the business intelligence metrics are defining common semantics for your data. And I think we're going to find by the end of this year, that metrics are how we annotate all our analytic data to start adding common semantics to it. And we're going to find this semantic layer, it's not three to five years off, it's going to be staring us in the face by the end of this year. >> Interesting. And of course SVB today was shut down. We're seeing serious tech headwinds, and oftentimes in these sort of downturns or flat turns, which feels like this could be going on for a while, we emerge with a lot of new players and a lot of new technology. George, we got to leave it there. Thank you to George Gilbert for excellent insights and input for today's episode. I want to thank Alex Myerson who's on production and manages the podcast, of course Ken Schiffman as well. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our EIC over at Siliconangle.com, he does some great editing. Remember all these episodes, they're available as podcasts. Wherever you listen, all you got to do is search Breaking Analysis Podcast, we publish each week on wikibon.com and siliconangle.com, or you can email me at David.Vellante@siliconangle.com, or DM me @DVellante. Comment on our LinkedIn post, and please do check out ETR.ai, great survey data, enterprise tech focus, phenomenal. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis.

Published Date : Mar 10 2023

SUMMARY :

bringing you data-driven core elements of the Databricks portfolio and pervasiveness in the data and that was where you went for data. and Cloudera set out to fix that. the reason you see and the robustness of Databricks and their big challenge and the data locked into in the real world and decisions Yes, and the mission of that is propelling the likes that the way you manage that data, is the fundamental problem because the joins are difficult and slow. and connects the data and the issue with that is the fourth bullet, expressiveness and it spits out the and the threat that may loom. because in the past with Snowflake, Think of that as the refinery So once the data lake was in place, George, the call out threat here But the key point is, in sort of the same context. and the company that put One is re-architect the platform and architect the components some of the players to watch. in the case of ASW it's DynamoDB, and why you can't put a relational and executed in the data and manages the podcast, of

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

David FloyerPERSON

0.99+

Mike OlsonPERSON

0.99+

2014DATE

0.99+

George GilbertPERSON

0.99+

Dave VellantePERSON

0.99+

GeorgePERSON

0.99+

Cheryl KnightPERSON

0.99+

Ken SchiffmanPERSON

0.99+

Andy JassyPERSON

0.99+

OracleORGANIZATION

0.99+

AmazonORGANIZATION

0.99+

Erik BradleyPERSON

0.99+

DavePERSON

0.99+

UberORGANIZATION

0.99+

thousandsQUANTITY

0.99+

Sun MicrosystemsORGANIZATION

0.99+

50 yearsQUANTITY

0.99+

AWSORGANIZATION

0.99+

Bob MugliaPERSON

0.99+

GartnerORGANIZATION

0.99+

AirbnbORGANIZATION

0.99+

60 yearsQUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

Ali GhodsiPERSON

0.99+

2010DATE

0.99+

DatabricksORGANIZATION

0.99+

Kristin MartinPERSON

0.99+

Rob HofPERSON

0.99+

threeQUANTITY

0.99+

15 yearsQUANTITY

0.99+

Databricks'ORGANIZATION

0.99+

two placesQUANTITY

0.99+

BostonLOCATION

0.99+

Tristan HandyPERSON

0.99+

M&AORGANIZATION

0.99+

Frank QuattronePERSON

0.99+

second elementQUANTITY

0.99+

Daren BrabhamPERSON

0.99+

TechAlpha PartnersORGANIZATION

0.99+

third elementQUANTITY

0.99+

SnowflakeORGANIZATION

0.99+

50 yearQUANTITY

0.99+

40%QUANTITY

0.99+

ClouderaORGANIZATION

0.99+

Palo AltoLOCATION

0.99+

five yearsQUANTITY

0.99+

Breaking Analysis: MWC 2023 highlights telco transformation & the future of business


 

>> From the Cube Studios in Palo Alto in Boston, bringing you data-driven insights from The Cube and ETR. This is "Breaking Analysis" with Dave Vellante. >> The world's leading telcos are trying to shed the stigma of being monopolies lacking innovation. Telcos have been great at operational efficiency and connectivity and living off of transmission, and the costs and expenses or revenue associated with that transmission. But in a world beyond telephone poles and basic wireless and mobile services, how will telcos modernize and become more agile and monetize new opportunities brought about by 5G and private wireless and a spate of new innovations and infrastructure, cloud data and apps? Hello, and welcome to this week's Wikibon CUBE Insights powered by ETR. In this breaking analysis and ahead of Mobile World Congress or now, MWC23, we explore the evolution of the telco business and how the industry is in many ways, mimicking transformations that took place decades ago in enterprise IT. We'll model some of the traditional enterprise vendors using ETR data and investigate how they're faring in the telecommunications sector, and we'll pose some of the key issues facing the industry this decade. First, let's take a look at what the GSMA has in store for MWC23. GSMA is the host of what used to be called Mobile World Congress. They've set the theme for this year's event as "Velocity" and they've rebranded MWC to reflect the fact that mobile technology is only one part of the story. MWC has become one of the world's premier events highlighting innovations not only in Telco, mobile and 5G, but the collision between cloud, infrastructure, apps, private networks, smart industries, machine intelligence, and AI, and more. MWC comprises an enormous ecosystem of service providers, technology companies, and firms from virtually every industry including sports and entertainment. And as well, GSMA, along with its venue partner at the Fira Barcelona, have placed a major emphasis on sustainability and public and private partnerships. Virtually every industry will be represented at the event because every industry is impacted by the trends and opportunities in this space. GSMA has said it expects 80,000 attendees at MWC this year, not quite back to 2019 levels, but trending in that direction. Of course, attendance from Chinese participants has historically been very high at the show, and obviously the continued travel issues from that region are affecting the overall attendance, but still very strong. And despite these concerns, Huawei, the giant Chinese technology company. has the largest physical presence of any exhibitor at the show. And finally, GSMA estimates that more than $300 million in economic benefit will result from the event which takes place at the end of February and early March. And The Cube will be back at MWC this year with a major presence thanks to our anchor sponsor, Dell Technologies and other supporters of our content program, including Enterprise Web, ArcaOS, VMware, Snowflake, Cisco, AWS, and others. And one of the areas we're interested in exploring is the evolution of the telco stack. It's a topic that's often talked about and one that we've observed taking place in the 1990s when the vertically integrated IBM mainframe monopoly gave way to a disintegrated and horizontal industry structure. And in many ways, the same thing is happening today in telecommunications, which is shown on the left-hand side of this diagram. Historically, telcos have relied on a hardened, integrated, and incredibly reliable, and secure set of hardware and software services that have been fully vetted and tested, and certified, and relied upon for decades. And at the top of that stack on the left are the crown jewels of the telco stack, the operational support systems and the business support systems. For the OSS, we're talking about things like network management, network operations, service delivery, quality of service, fulfillment assurance, and things like that. For the BSS systems, these refer to customer-facing elements of the stack, like revenue, order management, what products they sell, billing, and customer service. And what we're seeing is telcos have been really good at operational efficiency and making money off of transport and connectivity, but they've lacked the innovation in services and applications. They own the pipes and that works well, but others, be the over-the-top content companies, or private network providers and increasingly, cloud providers have been able to bypass the telcos, reach around them, if you will, and drive innovation. And so, the right-most diagram speaks to the need to disaggregate pieces of the stack. And while the similarities to the 1990s in enterprise IT are greater than the differences, there are things that are different. For example, the granularity of hardware infrastructure will not likely be as high where competition occurred back in the 90s at every layer of the value chain with very little infrastructure integration. That of course changed in the 2010s with converged infrastructure and hyper-converged and also software defined. So, that's one difference. And the advent of cloud, containers, microservices, and AI, none of that was really a major factor in the disintegration of legacy IT. And that probably means that disruptors can move even faster than did the likes of Intel and Microsoft, Oracle, Cisco, and the Seagates of the 1990s. As well, while many of the products and services will come from traditional enterprise IT names like Dell, HPE, Cisco, Red Hat, VMware, AWS, Microsoft, Google, et cetera, many of the names are going to be different and come from traditional network equipment providers. These are names like Ericsson and Huawei, and Nokia, and other names, like Wind River, and Rakuten, and Dish Networks. And there are enormous opportunities in data to help telecom companies and their competitors go beyond telemetry data into more advanced analytics and data monetization. There's also going to be an entirely new set of apps based on the workloads and use cases ranging from hospitals, sports arenas, race tracks, shipping ports, you name it. Virtually every vertical will participate in this transformation as the industry evolves its focus toward innovation, agility, and open ecosystems. Now remember, this is not a binary state. There are going to be greenfield companies disrupting the apple cart, but the incumbent telcos are going to have to continue to ensure newer systems work with their legacy infrastructure, in their OSS and BSS existing systems. And as we know, this is not going to be an overnight task. Integration is a difficult thing, transformations, migrations. So that's what makes this all so interesting because others can come in with Greenfield and potentially disrupt. There'll be interesting partnerships and ecosystems will form and coalitions will also form. Now, we mentioned that several traditional enterprise companies are or will be playing in this space. Now, ETR doesn't have a ton of data on specific telecom equipment and software providers, but it does have some interesting data that we cut for this breaking analysis. What we're showing here in this graphic is some of the names that we've followed over the years and how they're faring. Specifically, we did the cut within the telco sector. So the Y-axis here shows net score or spending velocity. And the horizontal axis, that shows the presence or pervasiveness in the data set. And that table insert in the upper left, that informs as to how the dots are plotted. You know, the two columns there, net score and the ends. And that red-dotted line, that horizontal line at 40%, that is an indicator of a highly elevated level. Anything above that, we consider quite outstanding. And what we'll do now is we'll comment on some of the cohorts and share with you how they're doing in telecommunications, and that sector, that vertical relative to their position overall in the data set. Let's start with the public cloud players. They're prominent in every industry. Telcos, telecommunications is no exception and it's quite an interesting cohort here. On the one hand, they can help telecommunication firms modernize and become more agile by eliminating the heavy lifting and you know, all the cloud, you know, value prop, data center costs, and the cloud benefits. At the same time, public cloud players are bringing their services to the edge, building out their own global networks and are a disruptive force to traditional telcos. All right, let's talk about Azure first. Their net score is basically identical to telco relative to its overall average. AWS's net score is higher in telco by just a few percentage points. Google Cloud platform is eight percentage points higher in telco with a 53% net score. So all three hyperscalers have an equal or stronger presence in telco than their average overall. Okay, let's look at the traditional enterprise hardware and software infrastructure cohort. Dell, Cisco, HPE, Red Hat, VMware, and Oracle. We've highlighted in this chart just as sort of indicators or proxies. Dell's net score's 10 percentage points higher in telco than its overall average. Interesting. Cisco's is a bit higher. HPE's is actually lower by about nine percentage points in the ETR survey, and VMware's is lower by about four percentage points. Now, Red Hat is really interesting. OpenStack, as we've previously reported is popular with telcos who want to build out their own private cloud. And the data shows that Red Hat OpenStack's net score is 15 percentage points higher in the telco sector than its overall average. OpenShift, on the other hand, has a net score that's four percentage points lower in telco than its overall average. So this to us talks to the pace of adoption of microservices and containers. You know, it's going to happen, but it's going to happen more slowly. Finally, Oracle's spending momentum is somewhat lower in the sector than its average, despite the firm having a decent telco business. IBM and Accenture, heavy services companies are both lower in this sector than their average. And real quickly, snowflake's net score is much lower by about 12 percentage points relative to its very high average net score of 62%. But we look for them to be a player in this space as telcos need to modernize their analytics stack and share data in a governed manner. Databricks' net score is also much lower than its average by about 13 points. And same, I would expect them to be a player as open architectures and cloud gains steam in telco. All right, let's close out now on what we're going to be talking about at MWC23 and some of the key issues that we'll be unpacking. We've talked about stack disaggregation in this breaking analysis, but the key here will be the pace at which it will reach the operational efficiency and reliability of closed stacks. Telcos, you know, in a large part, they're engineering heavy firms and much of their work takes place, kind of in the basement, in the dark. It's not really a big public hype machine, and they tend to move slowly and cautiously. While they understand the importance of agility, they're going to be careful because, you know, it's in their DNA. And so at the same time, if they don't move fast enough, they're going to get hurt and disrupted by competitors. So that's going to be a topic of conversation, and we'll be looking for proof points. And the other comment I'll make is around integration. Telcos because of their conservatism will benefit from better testing and those firms that can innovate on the testing front and have labs and certifications and innovate at that level, with an ecosystem are going to be in a better position. Because open sometimes means wild west. So the more players like Dell, HPE, Cisco, Red Hat, et cetera, that do that and align with their ecosystems and provide those resources, the faster adoption is going to go. So we'll be looking for, you know, who's actually doing that, Open RAN or Radio Access Networks. That fits in this discussion because O-RAN is an emerging network architecture. It essentially enables the use of open technologies from an ecosystem and over time, look at O-RAN is going to be open, but the questions, you know, a lot of questions remain as to when it will be able to deliver the operational efficiency of traditional RAN. Got some interesting dynamics going on. Rakuten is a company that's working hard on this problem, really focusing on operational efficiency. Then you got Dish Networks. They're also embracing O-RAN. They're coming at it more from service innovation. So that's something that we'll be monitoring and unpacking. We're going to look at cloud as a disruptor. On the one hand, cloud can help drive agility, as we said earlier and optionality, and innovation for incumbent telcos. But the flip side is going to also do the same for startups trying to disrupt and cloud attracts startups. While some of the telcos are actually embracing the cloud, many are being cautious. So that's going to be an interesting topic of discussion. And there's private wireless networks and 5G, and hyperlocal private networks, they're being deployed, you know, at the edge. This idea of open edge is also a really hot topic and this trend is going to accelerate. You know, the importance here is that the use cases are going to be widely varied. The needs of a hospital are going to be different than those of a sports venue are different from a remote drilling location, and energy or a concert venue. Things like real-time AI inference and data flows are going to bring new services and monetization opportunities. And many firms are going to be bypassing traditional telecommunications networks to build these out. Satellites as well, we're going to see, you know, in this decade, you're going to have, you're going to look down at Google Earth and you're going to see real-time. You know, today you see snapshots and so, lots of innovations going in that space. So how is this going to disrupt industries and traditional industry structures? Now, as always, we'll be looking at data angles, right? 'Cause it's in The Cube's DNA to follow the data and what opportunities and risks data brings. The Cube is going to be on location at MWC23 at the end of the month. We got a great set. We're in the walkway between halls four and five, right in Congress Square, it's booths CS60. So we'll have a full, they're called Stan CS60. We have a full schedule. I'm going to be there with Lisa Martin, Dave Nicholson and the entire Cube crew, so don't forget to stop by. All right, that's a wrap. I want to thank Alex Myerson, who's on production and manages the podcast, Ken Schiffman as well. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at Silicon Angle, does some great stuff for us. Thank you all. Remember, all these episodes are available as podcasts. Wherever you listen, just search "Breaking Analysis" podcasts I publish each week on wikibon.com and silicon angle.com. And all the video content is available on demand at thecube.net. You can email me directly at david.vellante@silicon angle.com. You can DM me at dvellante or comment on my LinkedIn post. Please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for The Cube Insights powered by ETR. Thanks for watching and we'll see you at Mobile World Congress, and/or at next time on "Breaking Analysis." (bright music) (bright music fades)

Published Date : Feb 18 2023

SUMMARY :

From the Cube Studios and some of the key issues

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

Lisa MartinPERSON

0.99+

Dave NicholsonPERSON

0.99+

IBMORGANIZATION

0.99+

CiscoORGANIZATION

0.99+

EricssonORGANIZATION

0.99+

Dave VellantePERSON

0.99+

DellORGANIZATION

0.99+

HuaweiORGANIZATION

0.99+

Ken SchiffmanPERSON

0.99+

Kristin MartinPERSON

0.99+

Cheryl KnightPERSON

0.99+

AWSORGANIZATION

0.99+

NokiaORGANIZATION

0.99+

RakutenORGANIZATION

0.99+

Rob HofPERSON

0.99+

OracleORGANIZATION

0.99+

Red HatORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

GSMAORGANIZATION

0.99+

AccentureORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

2019DATE

0.99+

53%QUANTITY

0.99+

Palo AltoLOCATION

0.99+

Wind RiverORGANIZATION

0.99+

HPEORGANIZATION

0.99+

Dell TechnologiesORGANIZATION

0.99+

more than $300 millionQUANTITY

0.99+

40%QUANTITY

0.99+

TelcosORGANIZATION

0.99+

Congress SquareLOCATION

0.99+

FirstQUANTITY

0.99+

VMwareORGANIZATION

0.99+

TelcoORGANIZATION

0.99+

Dish NetworksORGANIZATION

0.99+

telcoORGANIZATION

0.99+

2010sDATE

0.99+

IntelORGANIZATION

0.99+

david.vellante@silicon angle.comOTHER

0.99+

MWC23EVENT

0.99+

1990sDATE

0.99+

62%QUANTITY

0.99+

Mobile World CongressEVENT

0.99+

two columnsQUANTITY

0.99+

each weekQUANTITY

0.99+

SeagatesORGANIZATION

0.99+

Red HatORGANIZATION

0.99+

todayDATE

0.99+

early MarchDATE

0.99+

bothQUANTITY

0.99+

thecube.netOTHER

0.99+

MWCEVENT

0.99+

ETRORGANIZATION

0.98+

this yearDATE

0.98+

Cube StudiosORGANIZATION

0.98+

one partQUANTITY

0.98+

ChineseOTHER

0.98+

BostonLOCATION

0.98+

decades agoDATE

0.97+

threeQUANTITY

0.97+

90sDATE

0.97+

about 13 pointsQUANTITY

0.97+

Chat w/ Arctic Wolf exec re: budget restraints could lead to lax cloud security


 

>> Now we're recording. >> All right. >> Appreciate that, Hannah. >> Yeah, so I mean, I think in general we continue to do very, very well as a company. I think like everybody, there's economic headwinds today that are unavoidable, but I think we have a couple things going for us. One, we're in the cyberspace, which I think is, for the most part, recession proof as an industry. I think the impact of a recession will impact some vendors and some categories, but in general, I think the industry is pretty resilient. It's like the power industry, no? Recession or not, you still need electricity to your house. Cybersecurity is almost becoming a utility like that as far as the needs of companies go. I think for us, we also have the ability to do the security, the security operations, for a lot of companies, and if you look at the value proposition, the ROI for the cost of less than one to maybe two or three, depending on how big you are as a customer, what you'd have to pay for half to three security operations people, we can give you a full security operations. And so the ROI is is almost kind of brain dead simple, and so that keeps us going pretty well. And I think the other areas, we remove all that complexity for people. So in a world where you got other problems to worry about, handling all the security complexity is something that adds to that ROI. So for us, I think what we're seeing is mostly is some of the larger deals are taking a little bit longer than they have, some of the large enterprise deals, 'cause I think they are being a little more cautious about how they spend it, but in general, business is still kind of cranking along. >> Anything you can share with me that you guys have talked about publicly in terms of any metrics, or what can you tell me other than cranking? >> Yeah, I mean, I would just say we're still very, very high growth, so I think our financial profile would kind of still put us clearly in the cyber unicorn position, but I think other than that, we don't really share business metrics as a private- >> Okay, so how about headcount? >> Still growing. So we're not growing as fast as we've been growing, but I don't think we were anyway. I think we kind of, we're getting to the point of critical mass. We'll start to grow in a more kind of normal course and speed. I don't think we overhired like a lot of companies did in the past, even though we added, almost doubled the size of the company in the last 18 months. So we're still hiring, but very kind of targeted to certain roles going forward 'cause I do think we're kind of at critical mass in some of the other functions. >> You disclose headcount or no? >> We do not. >> You don't, okay. And never have? >> Not that I'm aware of, no. >> Okay, on the macro, I don't know if security's recession proof, but it's less susceptible, let's say. I've had Nikesh Arora on recently, we're at Palo Alto's Ignite, and he was saying, "Look," it's just like you were saying, "Larger deal's a little harder." A lot of times customers, he was saying customers are breaking larger deals into smaller deals, more POCs, more approvals, more people to get through the approval, not whole, blah, blah, blah. Now they're a different animal, I understand, but are you seeing similar trends, and how are you dealing with that? >> Yeah, I think the exact same trends, and I think it's just in a world where spending a dollar matters, I think a lot more oversight comes into play, a lot more reviewers, and can you shave it down here? Can you reduce the scope of the project to save money there? And I think it just caused a lot of those things. I think, in the large enterprise, I think most of those deals for companies like us and Palo and CrowdStrike and kind of the upper tier companies, they'll still go through. I think they'll just going to take a lot longer, and, yeah, maybe they're 80% of what they would've been otherwise, but there's still a lot of business to be had out there. >> So how are you dealing with that? I mean, you're talking about you double the size of the company. Is it kind of more focused on go-to-market, more sort of, maybe not overlay, but sort of SE types that are going to be doing more handholding. How have you dealt with that? Or have you just sort of said, "Hey, it is what it is, and we're not going to, we're not going to tactically respond to. We got long-term direction"? >> Yeah, I think it's more the latter. I think for us, it's we've gone through all these things before. It just takes longer now. So a lot of the steps we're taking are the same steps. We're still involved in a lot of POCs, we're involved in a lot of demos, and I don't think that changed. It's just the time between your POC and when someone sends you the PO, there's five more people now got to review things and go through a budget committee and all sorts of stuff like that. I think where we're probably focused more now is adding more and more capabilities just so we continue to be on the front foot of innovation and being relevant to the market, and trying to create more differentiators for us and the competitors. That's something that's just built into our culture, and we don't want to slow that down. And so even though the business is still doing extremely, extremely well, we want to keep investing in kind of technology. >> So the deal size, is it fair to say the initial deal size for new accounts, while it may be smaller, you're adding more capabilities, and so over time, your average contract values will go up? Are you seeing that trend? Or am I- >> Well, I would say I don't even necessarily see our average deal size has gotten smaller. I think in total, it's probably gotten a little bigger. I think what happens is when something like this happens, the old cream rises to the top thing, I think, comes into play, and you'll see some organizations instead of doing a deal with three or four vendors, they may want to pick one or two and really kind of put a lot of energy behind that. For them, they're maybe spending a little less money, but for those vendors who are amongst those getting chosen, I think they're doing pretty good. So our average deal size is pretty stable. For us, it's just a temporal thing. It's just the larger deals take a little bit longer. I don't think we're seeing much of a deal velocity difference in our mid-market commercial spaces, but in the large enterprise it's a little bit slower. But for us, we have ambitious plans in our strategy or on how we want to execute and what we want to build, and so I think we want to just continue to make sure we go down that path technically. >> So I have some questions on sort of the target markets and the cohorts you're going after, and I have some product questions. I know we're somewhat limited on time, but the historical focus has been on SMB, and I know you guys have gone in into enterprise. I'm curious as to how that's going. Any guidance you can give me on mix? Or when I talk to the big guys, right, you know who they are, the big managed service providers, MSSPs, and they're like, "Poo poo on Arctic Wolf," like, "Oh, they're (groans)." I said, "Yeah, that's what they used to say about the PC. It's just a toy. Or Microsoft SQL Server." But so I kind of love that narrative for you guys, but I'm curious from your words as to, what is that enterprise? How's the historical business doing, and how's the entrance into the enterprise going? What kind of hurdles are you having, blockers are you having to remove? Any color you can give me there would be super helpful. >> Yeah, so I think our commercial S&B business continues to do really good. Our mid-market is a very strong market for us. And I think while a lot of companies like to focus purely on large enterprise, there's a lot more mid-market companies, and a much larger piece of the IT puzzle collectively is in mid-market than it is large enterprise. That being said, we started to get pulled into the large enterprise not because we're a toy but because we're quite a comprehensive service. And so I think what we're trying to do from a roadmap perspective is catch up with some of the kind of capabilities that a large enterprise would want from us that a potential mid-market customer wouldn't. In some case, it's not doing more. It's just doing it different. Like, so we have a very kind of hands-on engagement with some of our smaller customers, something we call our concierge. Some of the large enterprises want more of a hybrid where they do some stuff and you do some stuff. And so kind of building that capability into the platform is something that's really important for us. Just how we engage with them as far as giving 'em access to their data, the certain APIs they want, things of that nature, what we're building out for large enterprise, but the demand by large enterprise on our business is enormous. And so it's really just us kind of catching up with some of the kind of the features that they want that we lack today, but many of 'em are still signing up with us, obviously, and in lieu of that, knowing that it's coming soon. And so I think if you look at the growth of our large enterprise, it's one of our fastest growing segments, and I think it shows anything but we're a toy. I would be shocked, frankly, if there's an MSSP, and, of course, we don't see ourself as an MSSP, but I'd be shocked if any of them operate a platform at the scale that ours operates. >> Okay, so wow. A lot I want to unpack there. So just to follow up on that last question, you don't see yourself as an MSSP because why, you see yourselves as a technology platform? >> Yes, I mean, the vast, vast, vast majority of what we deliver is our own technology. So we integrate with third-party solutions mostly to bring in that telemetry. So we've built our own platform from the ground up. We have our own threat intelligence, our own detection logic. We do have our own agents and network sensors. MSSP is typically cobbling together other tools, third party off-the-shelf tools to run their SOC. Ours is all homegrown technology. So I have a whole group called Arctic Wolf Labs, is building, just cranking out ML-based detections, building out infrastructure to take feeds in from a variety of different sources. We have a full integration kind of effort where we integrate into other third parties. So when we go into a customer, we can leverage whatever they have, but at the same time, we produce some tech that if they're lacking in a certain area, we can provide that tech, particularly around things like endpoint agents and network sensors and the like. >> What about like identity, doing your own identity? >> So we don't do our own identity, but we take feeds in from things like Okta and Active Directory and the like, and we have detection logic built on top of that. So part of our value add is we were XDR before XDR was the cool thing to talk about, meaning we can look across multiple attack surfaces and come to a security conclusion where most EDR vendors started with looking just at the endpoint, right? And then they called themselves XDR because now they took in a network feed, but they still looked at it as a separate network detection. We actually look at the things across multiple attack surfaces and stitch 'em together to look at that from a security perspective. In some cases we have automatic detections that will fire. In other cases, we can surface some to a security professional who can go start pulling on that thread. >> So you don't need to purchase CrowdStrike software and integrate it. You have your own equivalent essentially. >> Well, we'll take a feed from the CrowdStrike endpoint into our platform. We don't have to rely on their detections and their alerts, and things of that nature. Now obviously anything they discover we pull in as well, it's just additional context, but we have all our own tech behind it. So we operate kind of at an MSSP scale. We have a similar value proposition in the sense that we'll use whatever the customer has, but once that data kind of comes into our pipeline, it's all our own homegrown tech from there. >> But I mean, what I like about the MSSP piece of your business is it's very high touch. It's very intimate. What I like about what you're saying is that it's software-like economics, so software, software-like part of it. >> That's what makes us the unicorn, right? Is we do have, our concierges is very hands-on. We continue to drive automation that makes our concierge security professionals more efficient, but we always want that customer to have that concierge person as, is almost an extension to their security team, or in some cases, for companies that don't even have a security team, as their security team. As we go down the path, as I mentioned, one of the things we want to be able to do is start to have a more flexible model where we can have that high touch if you want it. We can have the high touch on certain occasions, and you can do stuff. We can have low touch, like we can span the spectrum, but we never want to lose our kind of unique value proposition around the concierge, but we also want to make sure that we're providing an interface that any customer would want to use. >> So given that sort of software-like economics, I mean, services companies need this too, but especially in software, things like net revenue retention and churn are super important. How are those metrics looking? What can you share with me there? >> Yeah, I mean, again, we don't share those metrics publicly, but all's I can continue to repeat is, if you looked at all of our financial metrics, I think you would clearly put us in the unicorn category. I think very few companies are going to have the level of growth that we have on the amount of ARR that we have with the net revenue retention and the churn and upsell. All those aspects continue to be very, very strong for us. >> I want to go back to the sort of enterprise conversation. So large enterprises would engage with you as a complement to their existing SOC, correct? Is that a fair statement or not necessarily? >> It's in some cases. In some cases, they're looking to not have a SOC. So we run into a lot of cases where they want to replace their SIEM, and they want a solution like Arctic Wolf to do that. And so there's a poll, I can't remember, I think it was Forrester, IDC, one of them did it a couple years ago, and they found out that 70% of large enterprises do not want to build the SOC, and it's not 'cause they don't need one, it's 'cause they can't afford it, they can't staff it, they don't have the expertise. And you think about if you're a tech company or a bank, or something like that, of course you can do it, but if you're an international plumbing distributor, you're not going to (chuckles), someone's not going to graduate from Stanford with a cybersecurity degree and go, "Cool, I want to go work for a plumbing distributor in their SOC," right? So they're going to have trouble kind of bringing in the right talent, and as a result, it's difficult to go make a multimillion-dollar investment into a SOC if you're not going to get the quality people to operate it, so they turn to companies like us. >> Got it, so, okay, so you're talking earlier about capabilities that large enterprises require that there might be some gaps, you might lack some features. A couple questions there. One is, when you do some of those, I inferred some of that is integrations. Are those integrations sort of one-off snowflakes or are you finding that you're able to scale those across the large enterprises? That's my first question. >> Yeah, so most of the integrations are pretty straightforward. I think where we run into things that are kind of enterprise-centric, they definitely want open APIs, they want access to our platform, which we don't do today, which we are going to be doing, but we don't do that yet today. They want to do more of a SIEM replacement. So we're really kind of what we call an open XDR platform, so there's things that we would need to build to kind of do raw log ingestion. I mean, we do this today. We have raw log ingestion, we have log storage, we have log searching, but there's like some of the compliance scenarios that they need out of their SIEM. We don't do those today. And so that's kind of holding them back from getting off their SIEM and going fully onto a solution like ours. Then the other one is kind of the level of customization, so the ability to create a whole bunch of custom rules, and that ties back to, "I want to get off my SIEM. I've built all these custom rules in my SIEM, and it's great that you guys do all this automatic AI stuff in the background, but I need these very specific things to be executed on." And so trying to build an interface for them to be able to do that and then also simulate it, again, because, no matter how big they are running their SIEM and their SOC... Like, we talked to one of the largest financial institutions in the world. As far as we were told, they have the largest individual company SOC in the world, and we operate almost 15 times their size. So we always have to be careful because this is a cloud-based native platform, but someone creates some rule that then just craters the performance of the whole platform, so we have to build kind of those guardrails around it. So those are the things primarily that the large enterprises are asking for. Most of those issues are not holding them back from coming. They want to know they're coming, and we're working on all of those. >> Cool, and see, just aside, I was talking to CISO the other day, said, "If it weren't for my compliance and audit group, I would chuck my SIEM." I mean, everybody wants to get rid of their SIEM. >> I've never met anyone who likes their SIEM. >> Do you feel like you've achieved product market fit in the larger enterprise or is that still something that you're sorting out? >> So I think we know, like, we're on a path to do that. We're on a provable path to do that, so I don't think there's any surprises left. I think everything that we know we need to do for that is someone's writing code for it today. It's just a matter of getting it through the system and getting into production. So I feel pretty good about it. I think that's why we are seeing such a high growth rate in our large enterprise business, 'cause we share that feedback with some of those key customers. We have a Customer Advisory Board that we share a lot of this information with. So yeah, I mean, I feel pretty good about what we need to do. We're certainly operate at large enterprise scales, so taking in the amount of the volume of data they're going to have and the types of integrations they need. We're comfortable with that. It's just more or less the interfaces that a large enterprise would want that some of the smaller companies don't ask for. >> Do you have enough tenure in the market to get a sense as to stickiness or even indicators that will lead toward retention? Have you been at it long enough in the enterprise or you still, again, figuring that out? >> Yeah, no, I think we've been at it long enough, and our retention rates are extremely high. If anything, kind of our net retention rates, well over 100% 'cause we have opportunities to upsell into new modules and expanding the coverage of what they have today. I think the areas that if you cornered enterprise that use us and things they would complain about are things I just told you about, right? There's still some things I want to do in my Splunk, and I need an API to pull my data out and put it in my Splunk and stuff like that, and those are the things we want to enable. >> Yeah, so I can't wait till you guys go public because you got Snowflake up here, and you got Veritas down here, and I'm very curious as to where you guys go. When's the IPO? You want to tell me that? (chuckling) >> Unfortunately, it's not up to us right now. You got to get the markets- >> Yeah, I hear you. Right, if the market were better. Well, if the market were better, you think you'd be out? >> Yeah, I mean, we'd certainly be a viable candidate to go. >> Yeah, there you go. I have a question for you because I don't have a SOC. I run a small business with my co-CEO. We're like 30, 40 people W-2s, we got another 50 or so contractors, and I'm always like have one eye, sleep with one eye open 'cause of security. What is your ideal SMB customer? Think S. >> Yeah. >> Would I fit? >> Yeah, I mean you're you're right in the sweet spot. I think where the company started and where we still have a lot of value proposition, which is companies like, like you said it, you sleep with one eye open, but you don't have necessarily the technical acumen to be able to do that security for yourself, and that's where we fit in. We bring kind of this whole security, we call it Security Operations Cloud, to bear, and we have some of the best professionals in the world who can basically be your SOC for less than it would cost you to hire somebody right out of college to do IT stuff. And so the value proposition's there. You're going to get the best of the best, providing you a kind of a security service that you couldn't possibly build on your own, and that way you can go to bed at night and close both eyes. >> So (chuckling) I'm sure something else would keep me up. But so in thinking about that, our Amazon bill keeps growing and growing and growing. What would it, and I presume I can engage with you on a monthly basis, right? As a consumption model, or how's the pricing work? >> Yeah, so there's two models that we have. So typically the kind of the monthly billing type of models would be through one of our MSP partners, where they have monthly billing capabilities. Usually direct with us is more of a longer term deal, could be one, two, or three, or it's up to the customer. And so we have both of those engagement models. Were doing more and more and more through MSPs today because of that model you just described, and they do kind of target the very S in the SMB as well. >> I mean, rough numbers, even ranges. If I wanted to go with the MSP monthly, I mean, what would a small company like mine be looking at a month? >> Honestly, I do not even know the answer to that. >> We're not talking hundreds of thousands of dollars a month? >> No. God, no. God, no. No, no, no. >> I mean, order of magnitude, we're talking thousands, tens of thousands? >> Thousands, on a monthly basis. Yeah. >> Yeah, yeah. Thousands per month. So if I were to budget between 20 and $50,000 a year, I'm definitely within the envelope. Is that fair? I mean, I'm giving a wide range >> That's fair. just to try to make- >> No, that's fair. >> And if I wanted to go direct with you, I would be signing up for a longer term agreement, correct, like I do with Salesforce? >> Yeah, yeah, a year. A year would, I think, be the minimum for that, and, yeah, I think the budget you set aside is kind of right in the sweet spot there. >> Yeah, I'm interested, I'm going to... Have a sales guy call me (chuckles) somehow. >> All right, will do. >> No, I'm serious. I want to start >> I will. >> investigating these things because we sell to very large organizations. I mean, name a tech company. That's our client base, except for Arctic Wolf. We should talk about that. And increasingly they're paranoid about data protection agreements, how you're protecting your data, our data. We write a lot of software and deliver it as part of our services, so it's something that's increasingly important. It's certainly a board level discussion and beyond, and most large organizations and small companies oftentimes don't think about it or try not to. They just put their head in the sand and, "We don't want to be doing that," so. >> Yeah, I will definitely have someone get in touch with you. >> Cool. Let's see. Anything else you can tell me on the product side? Are there things that you're doing that we talked about, the gaps at the high end that you're, some of the features that you're building in, which was super helpful. Anything in the SMB space that you want to share? >> Yeah, I think the biggest thing that we're doing technically now is really trying to drive more and more automation and efficiency through our operations, and that comes through really kind of a generous use of AI. So building models around more efficient detections based upon signal, but also automating the actions of our operators so we can start to learn through the interface. When they do A and B, they always do C. Well, let's just do C for them, stuff like that. Then also building more automation as far as the response back to third-party solutions as well so we can remediate more directly on third-party products without having to get into the consoles or having our customers do it. So that's really just trying to drive efficiency in the system, and that helps provide better security outcomes but also has a big impact on our margins as well. >> I know you got to go, but I want to show you something real quick. I have data. I do a weekly program called "Breaking Analysis," and I have a partner called ETR, Enterprise Technology Research, and they have a platform. I don't know if you can see this. They have a survey platform, and each quarter, they do a survey of about 1,500 IT decision makers. They also have a survey on, they call ETS, Emerging Technology Survey. So it's private companies. And I don't want to go into it too much, but this is a sentiment graph. This is net sentiment. >> Just so you know, all I see is a white- >> Yeah, just a white bar. >> Oh, that's weird. Oh, whiteboard. Oh, here we go. How about that? >> There you go. >> Yeah, so this is a sentiment graph. So this is net sentiment and this is mindshare. And if I go to Arctic Wolf... So it's typical security, right? The 8,000 companies. And when I go here, what impresses me about this is you got a decent mindshare, that's this axis, but you've also got an N in the survey. It's about 1,500 in the survey, It's 479 Arctic Wolf customers responded to this. 57% don't know you. Oh, sorry, they're aware of you, but no plan to evaluate; 19% plan to evaluate, 7% are evaluating; 11%, no plan to utilize even though they've evaluated you; and 1% say they've evaluated you and plan to utilize. It's a small percentage, but actually it's not bad in the random sample of the world about that. And so obviously you want to get that number up, but this is a really impressive position right here that I wanted to just share with you. I do a lot of analysis weekly, and this is a really, it's completely independent survey, and you're sort of separating from the pack, as you can see. So kind of- >> Well, it's good to see that. And I think that just is a further indicator of what I was telling you. We continue to have a strong financial performance. >> Yeah, in a good market. Okay, well, thanks you guys. And hey, if I can get this recording, Hannah, I may even figure out how to write it up. (chuckles) That would be super helpful. >> Yes. We'll get that up. >> And David or Hannah, if you can send me David's contact info so I can get a salesperson in touch with him. (Hannah chuckling) >> Yeah, great. >> Yeah, we'll work on that as well. Thanks so much for both your time. >> Thanks a lot. It was great talking with you. >> Thanks, you guys. Great to meet you. >> Thank you. >> Bye. >> Bye.

Published Date : Feb 15 2023

SUMMARY :

I think for us, we also have the ability I don't think we overhired And never have? and how are you dealing with that? I think they'll just going to that are going to be So a lot of the steps we're and so I think we want to just continue and the cohorts you're going after, And so I think if you look at the growth So just to follow up but at the same time, we produce some tech and Active Directory and the like, So you don't need to but we have all our own tech behind it. like about the MSSP piece one of the things we want So given that sort of of growth that we have on the So large enterprises would engage with you kind of bringing in the right I inferred some of that is integrations. and it's great that you guys do to get rid of their SIEM. I've never met anyone I think everything that we and expanding the coverage to where you guys go. You got to get the markets- Well, if the market were Yeah, I mean, we'd certainly I have a question for you and that way you can go to bed I can engage with you because of that model you just described, the MSP monthly, I mean, know the answer to that. No. God, no. Thousands, on a monthly basis. I mean, I'm giving just to try to make- is kind of right in the sweet spot there. Yeah, I'm interested, I'm going to... I want to start because we sell to very get in touch with you. doing that we talked about, of our operators so we can start to learn I don't know if you can see this. Oh, here we go. from the pack, as you can see. And I think that just I may even figure out how to write it up. if you can send me David's contact info Thanks so much for both your time. great talking with you. Great to meet you.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavidPERSON

0.99+

HannahPERSON

0.99+

two modelsQUANTITY

0.99+

threeQUANTITY

0.99+

Arctic Wolf LabsORGANIZATION

0.99+

oneQUANTITY

0.99+

80%QUANTITY

0.99+

70%QUANTITY

0.99+

Arctic WolfORGANIZATION

0.99+

twoQUANTITY

0.99+

AmazonORGANIZATION

0.99+

30QUANTITY

0.99+

PaloORGANIZATION

0.99+

479QUANTITY

0.99+

halfQUANTITY

0.99+

19%QUANTITY

0.99+

first questionQUANTITY

0.99+

ForresterORGANIZATION

0.99+

50QUANTITY

0.99+

8,000 companiesQUANTITY

0.99+

ThousandsQUANTITY

0.99+

1%QUANTITY

0.99+

7%QUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

57%QUANTITY

0.99+

IDCORGANIZATION

0.99+

CrowdStrikeORGANIZATION

0.99+

todayDATE

0.99+

A yearQUANTITY

0.99+

one eyeQUANTITY

0.99+

bothQUANTITY

0.99+

both eyesQUANTITY

0.99+

each quarterQUANTITY

0.99+

less than oneQUANTITY

0.98+

11%QUANTITY

0.98+

OneQUANTITY

0.98+

five more peopleQUANTITY

0.98+

axisORGANIZATION

0.98+

thousandsQUANTITY

0.98+

tens of thousandsQUANTITY

0.97+

VeritasORGANIZATION

0.97+

about 1,500 IT decision makersQUANTITY

0.97+

20QUANTITY

0.97+

a yearQUANTITY

0.96+

SalesforceORGANIZATION

0.96+

ETSORGANIZATION

0.96+

StanfordORGANIZATION

0.96+

40 peopleQUANTITY

0.95+

over 100%QUANTITY

0.95+

couple years agoDATE

0.95+

CISOORGANIZATION

0.94+

four vendorsQUANTITY

0.94+

$50,000 a yearQUANTITY

0.93+

about 1,500QUANTITY

0.92+

Enterprise Technology ResearchORGANIZATION

0.92+

almost 15 timesQUANTITY

0.91+

couple questionsQUANTITY

0.91+

CrowdStrikeTITLE

0.9+

hundreds of thousands of dollars a monthQUANTITY

0.9+

ETRORGANIZATION

0.88+

last 18 monthsDATE

0.87+

SQL ServerTITLE

0.84+

three securityQUANTITY

0.84+

Breaking AnalysisTITLE

0.82+

Thousands per monthQUANTITY

0.8+

XDRTITLE

0.79+

a monthQUANTITY

0.74+

SIEMTITLE

0.74+

ArcticORGANIZATION

0.74+

Breaking Analysis: Google's Point of View on Confidential Computing


 

>> From theCUBE studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> Confidential computing is a technology that aims to enhance data privacy and security by providing encrypted computation on sensitive data and isolating data from apps in a fenced off enclave during processing. The concept of confidential computing is gaining popularity, especially in the cloud computing space where sensitive data is often stored and of course processed. However, there are some who view confidential computing as an unnecessary technology in a marketing ploy by cloud providers aimed at calming customers who are cloud phobic. Hello and welcome to this week's Wikibon CUBE Insights powered by ETR. In this Breaking Analysis, we revisit the notion of confidential computing, and to do so, we'll invite two Google experts to the show, but before we get there, let's summarize briefly. There's not a ton of ETR data on the topic of confidential computing. I mean, it's a technology that's deeply embedded into silicon and computing architectures. But at the highest level, security remains the number one priority being addressed by IT decision makers in the coming year as shown here. And this data is pretty much across the board by industry, by region, by size of company. I mean we dug into it and the only slight deviation from the mean is in financial services. The second and third most cited priorities, cloud migration and analytics, are noticeably closer to cybersecurity in financial services than in other sectors, likely because financial services has always been hyper security conscious, but security is still a clear number one priority in that sector. The idea behind confidential computing is to better address threat models for data in execution. Protecting data at rest and data and transit have long been a focus of security approaches, but more recently, silicon manufacturers have introduced architectures that separate data and applications from the host system. Arm, Intel, AMD, Nvidia and other suppliers are all on board, as are the big cloud players. Now the argument against confidential computing is that it narrowly focuses on memory encryption and it doesn't solve the biggest problems in security. Multiple system images updates different services and the entire code flow aren't directly addressed by memory encryption, rather to truly attack these problems, many believe that OSs need to be re-engineered with the attacker and hacker in mind. There are so many variables and at the end of the day, critics say the emphasis on confidential computing made by cloud providers is overstated and largely hype. This tweet from security researcher Rodrigo Branco sums up the sentiment of many skeptics. He says, "Confidential computing is mostly a marketing campaign for memory encryption. It's not driving the industry towards the hard open problems. It is selling an illusion." Okay. Nonetheless, encrypting data in use and fencing off key components of the system isn't a bad thing, especially if it comes with the package essentially for free. There has been a lack of standardization and interoperability between different confidential computing approaches. But the confidential computing consortium was established in 2019 ostensibly to accelerate the market and influence standards. Notably, AWS is not part of the consortium, likely because the politics of the consortium were probably a conundrum for AWS because the base technology defined by the the consortium is seen as limiting by AWS. This is my guess, not AWS's words, and but I think joining the consortium would validate a definition which AWS isn't aligned with. And two, it's got a lead with this Annapurna acquisition. This was way ahead with Arm integration and so it probably doesn't feel the need to validate its competitors. Anyway, one of the premier members of the confidential computing consortium is Google, along with many high profile names including Arm, Intel, Meta, Red Hat, Microsoft, and others. And we're pleased to welcome two experts on confidential computing from Google to unpack the topic, Nelly Porter is head of product for GCP confidential computing and encryption, and Dr. Patricia Florissi is the technical director for the office of the CTO at Google Cloud. Welcome Nelly and Patricia, great to have you. >> Great to be here. >> Thank you so much for having us. >> You're very welcome. Nelly, why don't you start and then Patricia, you can weigh in. Just tell the audience a little bit about each of your roles at Google Cloud. >> So I'll start, I'm owning a lot of interesting activities in Google and again security or infrastructure securities that I usually own. And we are talking about encryption and when encryption and confidential computing is a part of portfolio in additional areas that I contribute together with my team to Google and our customers is secure software supply chain. Because you need to trust your software. Is it operate in your confidential environment to have end-to-end story about if you believe that your software and your environment doing what you expect, it's my role. >> Got it. Okay. Patricia? >> Well, I am a technical director in the office of the CTO, OCTO for short, in Google Cloud. And we are a global team. We include former CTOs like myself and senior technologists from large corporations, institutions and a lot of success, we're startups as well. And we have two main goals. First, we walk side by side with some of our largest, more strategic or most strategical customers and we help them solve complex engineering technical problems. And second, we are devise Google and Google Cloud engineering and product management and tech on there, on emerging trends and technologies to guide the trajectory of our business. We are unique group, I think, because we have created this collaborative culture with our customers. And within OCTO, I spend a lot of time collaborating with customers and the industry at large on technologies that can address privacy, security, and sovereignty of data in general. >> Excellent. Thank you for that both of you. Let's get into it. So Nelly, what is confidential computing? From Google's perspective, how do you define it? >> Confidential computing is a tool and it's still one of the tools in our toolbox. And confidential computing is a way how we would help our customers to complete this very interesting end-to-end lifecycle of the data. And when customers bring in the data to cloud and want to protect it as they ingest it to the cloud, they protect it at rest when they store data in the cloud. But what was missing for many, many years is ability for us to continue protecting data and workloads of our customers when they running them. And again, because data is not brought to cloud to have huge graveyard, we need to ensure that this data is actually indexed. Again, there is some insights driven and drawn from this data. You have to process this data and confidential computing here to help. Now we have end to end protection of our customer's data when they bring the workloads and data to cloud, thanks to confidential computing. >> Thank you for that. Okay, we're going to get into the architecture a bit, but before we do, Patricia, why do you think this topic of confidential computing is such an important technology? Can you explain, do you think it's transformative for customers and if so, why? >> Yeah, I would maybe like to use one thought, one way, one intuition behind why confidential commuting matters, because at the end of the day, it reduces more and more the customer's thresh boundaries and the attack surface. That's about reducing that periphery, the boundary in which the customer needs to mind about trust and safety. And in a way, is a natural progression that you're using encryption to secure and protect the data. In the same way that we are encrypting data in transit and at rest, now we are also encrypting data while in use. And among other beneficials, I would say one of the most transformative ones is that organizations will be able to collaborate with each other and retain the confidentiality of the data. And that is across industry, even though it's highly focused on, I wouldn't say highly focused, but very beneficial for highly regulated industries. It applies to all of industries. And if you look at financing for example, where bankers are trying to detect fraud, and specifically double finance where you are, a customer is actually trying to get a finance on an asset, let's say a boat or a house, and then it goes to another bank and gets another finance on that asset. Now bankers would be able to collaborate and detect fraud while preserving confidentiality and privacy of the data. >> Interesting. And I want to understand that a little bit more but I'm going to push you a little bit on this, Nelly, if I can because there's a narrative out there that says confidential computing is a marketing ploy, I talked about this upfront, by cloud providers that are just trying to placate people that are scared of the cloud. And I'm presuming you don't agree with that, but I'd like you to weigh in here. The argument is confidential computing is just memory encryption and it doesn't address many other problems. It is over hyped by cloud providers. What do you say to that line of thinking? >> I absolutely disagree, as you can imagine, with this statement, but the most importantly is we mixing multiple concepts, I guess. And exactly as Patricia said, we need to look at the end-to-end story, not again the mechanism how confidential computing trying to again, execute and protect a customer's data and why it's so critically important because what confidential computing was able to do, it's in addition to isolate our tenants in multi-tenant environments the cloud covering to offer additional stronger isolation. They called it cryptographic isolation. It's why customers will have more trust to customers and to other customers, the tenant that's running on the same host but also us because they don't need to worry about against threats and more malicious attempts to penetrate the environment. So what confidential computing is helping us to offer our customers, stronger isolation between tenants in this multi-tenant environment, but also incredibly important, stronger isolation of our customers, so tenants from us. We also writing code, we also software providers will also make mistakes or have some zero days. Sometimes again us introduced, sometimes introduced by our adversaries. But what I'm trying to say by creating this cryptographic layer of isolation between us and our tenants and amongst those tenants, we're really providing meaningful security to our customers and eliminate some of the worries that they have running on multi-tenant spaces or even collaborating to gather this very sensitive data knowing that this particular protection is available to them. >> Okay, thank you. Appreciate that. And I think malicious code is often a threat model missed in these narratives. Operator access, yeah, maybe I trust my clouds provider, but if I can fence off your access even better, I'll sleep better at night. Separating a code from the data, everybody's, Arm, Intel, AMD, Nvidia, others, they're all doing it. I wonder if, Nelly, if we could stay with you and bring up the slide on the architecture. What's architecturally different with confidential computing versus how operating systems and VMs have worked traditionally. We're showing a slide here with some VMs, maybe you could take us through that. >> Absolutely. And Dave, the whole idea for Google and now industry way of dealing with confidential computing is to ensure that three main property is actually preserved. Customers don't need to change the code. They can operate on those VMs exactly as they would with normal non-confidential VMs, but to give them this opportunity of lift and shift or no changing their apps and performing and having very, very, very low latency and scale as any cloud can, something that Google actually pioneer in confidential computing. I think we need to open and explain how this magic was actually done. And as I said, it's again the whole entire system have to change to be able to provide this magic. And I would start with we have this concept of root of trust and root of trust where we will ensure that this machine, when the whole entire post has integrity guarantee, means nobody changing my code on the most low level of system. And we introduce this in 2017 called Titan. It was our specific ASIC, specific, again, inch by inch system on every single motherboard that we have that ensures that your low level former, your actually system code, your kernel, the most powerful system is actually proper configured and not changed, not tampered. We do it for everybody, confidential computing included. But for confidential computing, what we have to change, we bring in AMD, or again, future silicon vendors and we have to trust their former, their way to deal with our confidential environments. And that's why we have obligation to validate integrity, not only our software and our former but also former and software of our vendors, silicon vendors. So we actually, when we booting this machine, as you can see, we validate that integrity of all of the system is in place. It means nobody touching, nobody changing, nobody modifying it. But then we have this concept of AMD secure processor, it's special ASICs, best specific things that generate a key for every single VM that our customers will run or every single node in Kubernetes or every single worker thread in our Hadoop or Spark capability. We offer all of that. And those keys are not available to us. It's the best keys ever in encryption space because when we are talking about encryption, the first question that I'm receiving all the time, where's the key, who will have access to the key? Because if you have access to the key then it doesn't matter if you encrypted or not. So, but the case in confidential computing provides so revolutionary technology, us cloud providers, who don't have access to the keys. They sitting in the hardware and they head to memory controller. And it means when hypervisors that also know about these wonderful things saying I need to get access to the memories that this particular VM trying to get access to, they do not decrypt the data, they don't have access to the key because those keys are random, ephemeral and per VM, but the most importantly, in hardware not exportable. And it means now you would be able to have this very interesting role that customers or cloud providers will not be able to get access to your memory. And what we do, again, as you can see our customers don't need to change their applications, their VMs are running exactly as it should run and what you're running in VM, you actually see your memory in clear, it's not encrypted, but God forbid is trying somebody to do it outside of my confidential box. No, no, no, no, no, they would not be able to do it. Now you'll see cyber and it's exactly what combination of these multiple hardware pieces and software pieces have to do. So OS is also modified. And OS is modified such way to provide integrity. It means even OS that you're running in your VM box is not modifiable and you, as customer, can verify. But the most interesting thing, I guess, how to ensure the super performance of this environment because you can imagine, Dave, that encrypting and it's additional performance, additional time, additional latency. So we were able to mitigate all of that by providing incredibly interesting capability in the OS itself. So our customers will get no changes needed, fantastic performance and scales as they would expect from cloud providers like Google. >> Okay, thank you. Excellent. Appreciate that explanation. So, again, the narrative on this as well, you've already given me guarantees as a cloud provider that you don't have access to my data, but this gives another level of assurance, key management as they say is key. Now humans aren't managing the keys, the machines are managing them. So Patricia, my question to you is, in addition to, let's go pre confidential computing days, what are the sort of new guarantees that these hardware-based technologies are going to provide to customers? >> So if I am a customer, I am saying I now have full guarantee of confidentiality and integrity of the data and of the code. So if you look at code and data confidentiality, the customer cares and they want to know whether their systems are protected from outside or unauthorized access, and that recovered with Nelly, that it is. Confidential computing actually ensures that the applications and data internals remain secret, right? The code is actually looking at the data, the only the memory is decrypting the data with a key that is ephemeral and per VM and generated on demand. Then you have the second point where you have code and data integrity, and now customers want to know whether their data was corrupted, tampered with or impacted by outside actors. And what confidential computing ensures is that application internals are not tampered with. So the application, the workload as we call it, that is processing the data, it's also, it has not been tampered and preserves integrity. I would also say that this is all verifiable. So you have attestation and these attestation actually generates a log trail and the log trail guarantees that, provides a proof that it was preserved. And I think that the offer's also a guarantee of what we call ceiling, this idea that the secrets have been preserved and not tampered with, confidentiality and integrity of code and data. >> Got it. Okay, thank you. Nelly, you mentioned, I think I heard you say that the applications, it's transparent, you don't have to change the application, it just comes for free essentially. And we showed some various parts of the stack before. I'm curious as to what's affected, but really more importantly, what is specifically Google's value add? How do partners participate in this, the ecosystem, or maybe said another way, how does Google ensure the compatibility of confidential computing with existing systems and applications? >> And a fantastic question by the way. And it's very difficult and definitely complicated world because to be able to provide these guarantees, actually a lot of work was done by community. Google is very much operate in open, so again, our operating system, we working with operating system repository OSs, OS vendors to ensure that all capabilities that we need is part of the kernels, are part of the releases and it's available for customers to understand and even explore if they have fun to explore a lot of code. We have also modified together with our silicon vendors a kernel, host kernel to support this capability and it means working this community to ensure that all of those patches are there. We also worked with every single silicon vendor as you've seen, and that's what I probably feel that Google contributed quite a bit in this whole, we moved our industry, our community, our vendors to understand the value of easy to use confidential computing or removing barriers. And now I don't know if you noticed, Intel is pulling the lead and also announcing their trusted domain extension, very similar architecture. And no surprise, it's, again, a lot of work done with our partners to, again, convince, work with them and make this capability available. The same with Arm this year, actually last year, Arm announced their future design for confidential computing. It's called Confidential Computing Architecture. And it's also influenced very heavily with similar ideas by Google and industry overall. So it's a lot of work in confidential computing consortiums that we are doing, for example, simply to mention, to ensure interop, as you mentioned, between different confidential environments of cloud providers. They want to ensure that they can attest to each other because when you're communicating with different environments, you need to trust them. And if it's running on different cloud providers, you need to ensure that you can trust your receiver when you are sharing your sensitive data workloads or secret with them. So we coming as a community and we have this attestation sig, the, again, the community based systems that we want to build and influence and work with Arm and every other cloud providers to ensure that we can interrupt and it means it doesn't matter where confidential workloads will be hosted, but they can exchange the data in secure, verifiable and controlled by customers way. And to do it, we need to continue what we are doing, working open, again, and contribute with our ideas and ideas of our partners to this role to become what we see confidential computing has to become, it has to become utility. It doesn't need to be so special, but it's what we want it to become. >> Let's talk about, thank you for that explanation. Let's talk about data sovereignty because when you think about data sharing, you think about data sharing across the ecosystem and different regions and then of course data sovereignty comes up. Typically public policy lags, the technology industry and sometimes is problematic. I know there's a lot of discussions about exceptions, but Patricia, we have a graphic on data sovereignty. I'm interested in how confidential computing ensures that data sovereignty and privacy edicts are adhered to, even if they're out of alignment maybe with the pace of technology. One of the frequent examples is when you delete data, can you actually prove that data is deleted with a hundred percent certainty? You got to prove that and a lot of other issues. So looking at this slide, maybe you could take us through your thinking on data sovereignty. >> Perfect. So for us, data sovereignty is only one of the three pillars of digital sovereignty. And I don't want to give the impression that confidential computing addresses it all. That's why we want to step back and say, hey, digital sovereignty includes data sovereignty where we are giving you full control and ownership of the location, encryption and access to your data. Operational sovereignty where the goal is to give our Google Cloud customers full visibility and control over the provider operations, right? So if there are any updates on hardware, software stack, any operations, there is full transparency, full visibility. And then the third pillar is around software sovereignty where the customer wants to ensure that they can run their workloads without dependency on the provider's software. So they have sometimes is often referred as survivability, that you can actually survive if you are untethered to the cloud and that you can use open source. Now let's take a deep dive on data sovereignty, which by the way is one of my favorite topics. And we typically focus on saying, hey, we need to care about data residency. We care where the data resides because where the data is at rest or in processing, it typically abides to the jurisdiction, the regulations of the jurisdiction where the data resides. And others say, hey, let's focus on data protection. We want to ensure the confidentiality and integrity and availability of the data, which confidential computing is at the heart of that data protection. But it is yet another element that people typically don't talk about when talking about data sovereignty, which is the element of user control. And here, Dave, is about what happens to the data when I give you access to my data. And this reminds me of security two decades ago, even a decade ago, where we started the security movement by putting firewall protections and login accesses. But once you were in, you were able to do everything you wanted with the data. An insider had access to all the infrastructure, the data and the code. And that's similar because with data sovereignty we care about whether it resides, where, who is operating on the data. But the moment that the data is being processed, I need to trust that the processing of the data will abide by user control, by the policies that I put in place of how my data is going to be used. And if you look at a lot of the regulation today and a lot of the initiatives around the International Data Space Association, IDSA, and Gaia-X, there is a movement of saying the two parties, the provider of the data and the receiver of the data are going to agree on a contract that describes what my data can be used for. The challenge is to ensure that once the data crosses boundaries, that the data will be used for the purposes that it was intended and specified in the contract. And if you actually bring together, and this is the exciting part, confidential computing together with policy enforcement, now the policy enforcement can guarantee that the data is only processed within the confines of a confidential computing environment, that the workload is cryptographically verified that there is the workload that was meant to process the data and that the data will be only used when abiding to the confidentiality and integrity safety of the confidential computing environment. And that's why we believe confidential computing is one necessary and essential technology that will allow us to ensure data sovereignty, especially when it comes to user control. >> Thank you for that. I mean it was a deep dive, I mean brief, but really detailed. So I appreciate that, especially the verification of the enforcement. Last question, I met you two because as part of my year end prediction post, you guys sent in some predictions and I wasn't able to get to them in the predictions post. So I'm thrilled that you were able to make the time to come on the program. How widespread do you think the adoption of confidential computing will be in 23 and what's the maturity curve look like, this decade in your opinion? Maybe each of you could give us a brief answer. >> So my prediction in five, seven years, as I started, it'll become utility. It'll become TLS as of, again, 10 years ago we couldn't believe that websites will have certificates and we will support encrypted traffic. Now we do and it's become ubiquity. It's exactly where confidential computing is getting and heading, I don't know we deserve yet. It'll take a few years of maturity for us, but we will be there. >> Thank you. And Patricia, what's your prediction? >> I will double that and say, hey, in the future, in the very near future, you will not be able to afford not having it. I believe as digital sovereignty becomes evermore top of mind with sovereign states and also for multi national organizations and for organizations that want to collaborate with each other, confidential computing will become the norm. It'll become the default, if I say, mode of operation. I like to compare that today is inconceivable. If we talk to the young technologists, it's inconceivable to think that at some point in history, and I happen to be alive that we had data at rest that was not encrypted, data in transit that was not encrypted, and I think that will be inconceivable at some point in the near future that to have unencrypted data while in use. >> And plus I think the beauty of the this industry is because there's so much competition, this essentially comes for free. I want to thank you both for spending some time on Breaking Analysis. There's so much more we could cover. I hope you'll come back to share the progress that you're making in this area and we can double click on some of these topics. Really appreciate your time. >> Anytime. >> Thank you so much. >> In summary, while confidential computing is being touted by the cloud players as a promising technology for enhancing data privacy and security, there are also those, as we said, who remain skeptical. The truth probably lies somewhere in between and it will depend on the specific implementation and the use case as to how effective confidential computing will be. Look, as with any new tech, it's important to carefully evaluate the potential benefits, the drawbacks, and make informed decisions based on the specific requirements in the situation and the constraints of each individual customer. But the bottom line is silicon manufacturers are working with cloud providers and other system companies to include confidential computing into their architectures. Competition, in our view, will moderate price hikes. And at the end of the day, this is under the covers technology that essentially will come for free. So we'll take it. I want to thank our guests today, Nelly and Patricia from Google, and thanks to Alex Myerson who's on production and manages the podcast. Ken Schiffman as well out of our Boston studio, Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at siliconangle.com. Does some great editing for us, thank you all. Remember all these episodes are available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com where you can get all the news. If you want to get in touch, you can email me at david.vellante@siliconangle.com or dm me @DVellante. And you can also comment on my LinkedIn post. Definitely you want to check out etr.ai for the best survey data in the enterprise tech business. I know we didn't hit on a lot today, but there's some amazing data and it's always being updated, so check that out. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching and we'll see you next time on Breaking Analysis. (upbeat music)

Published Date : Feb 11 2023

SUMMARY :

bringing you data-driven and at the end of the day, Just tell the audience a little and confidential computing Got it. and the industry at large for that both of you. in the data to cloud into the architecture a bit, and privacy of the data. people that are scared of the cloud. and eliminate some of the we could stay with you and they head to memory controller. So, again, the narrative on this as well, and integrity of the data and of the code. how does Google ensure the compatibility and ideas of our partners to this role One of the frequent examples and that the data will be only used of the enforcement. and we will support encrypted traffic. And Patricia, and I happen to be alive beauty of the this industry and the constraints of

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
NellyPERSON

0.99+

PatriciaPERSON

0.99+

International Data Space AssociationORGANIZATION

0.99+

Alex MyersonPERSON

0.99+

AWSORGANIZATION

0.99+

IDSAORGANIZATION

0.99+

Rodrigo BrancoPERSON

0.99+

Dave VellantePERSON

0.99+

DavePERSON

0.99+

MicrosoftORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

NvidiaORGANIZATION

0.99+

2019DATE

0.99+

2017DATE

0.99+

Kristin MartinPERSON

0.99+

Nelly PorterPERSON

0.99+

Ken SchiffmanPERSON

0.99+

Rob HofPERSON

0.99+

Cheryl KnightPERSON

0.99+

last yearDATE

0.99+

Palo AltoLOCATION

0.99+

Red HatORGANIZATION

0.99+

two partiesQUANTITY

0.99+

AMDORGANIZATION

0.99+

Patricia FlorissiPERSON

0.99+

IntelORGANIZATION

0.99+

oneQUANTITY

0.99+

fiveQUANTITY

0.99+

second pointQUANTITY

0.99+

david.vellante@siliconangle.comOTHER

0.99+

MetaORGANIZATION

0.99+

secondQUANTITY

0.99+

thirdQUANTITY

0.99+

OneQUANTITY

0.99+

twoQUANTITY

0.99+

ArmORGANIZATION

0.99+

eachQUANTITY

0.99+

two expertsQUANTITY

0.99+

FirstQUANTITY

0.99+

first questionQUANTITY

0.99+

Gaia-XORGANIZATION

0.99+

two decades agoDATE

0.99+

bothQUANTITY

0.99+

this yearDATE

0.99+

seven yearsQUANTITY

0.99+

OCTOORGANIZATION

0.99+

zero daysQUANTITY

0.98+

10 years agoDATE

0.98+

each weekQUANTITY

0.98+

todayDATE

0.97+

Breaking Analysis: Google's PoV on Confidential Computing


 

>> From theCUBE Studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> Confidential computing is a technology that aims to enhance data privacy and security, by providing encrypted computation on sensitive data and isolating data, and apps that are fenced off enclave during processing. The concept of, I got to start over. I fucked that up, I'm sorry. That's not right, what I said was not right. On Dave in five, four, three. Confidential computing is a technology that aims to enhance data privacy and security by providing encrypted computation on sensitive data, isolating data from apps and a fenced off enclave during processing. The concept of confidential computing is gaining popularity, especially in the cloud computing space, where sensitive data is often stored and of course processed. However, there are some who view confidential computing as an unnecessary technology in a marketing ploy by cloud providers aimed at calming customers who are cloud phobic. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we revisit the notion of confidential computing, and to do so, we'll invite two Google experts to the show. But before we get there, let's summarize briefly. There's not a ton of ETR data on the topic of confidential computing, I mean, it's a technology that's deeply embedded into silicon and computing architectures. But at the highest level, security remains the number one priority being addressed by IT decision makers in the coming year as shown here. And this data is pretty much across the board by industry, by region, by size of company. I mean we dug into it and the only slight deviation from the mean is in financial services. The second and third most cited priorities, cloud migration and analytics are noticeably closer to cybersecurity in financial services than in other sectors, likely because financial services has always been hyper security conscious, but security is still a clear number one priority in that sector. The idea behind confidential computing is to better address threat models for data in execution. Protecting data at rest and data in transit have long been a focus of security approaches, but more recently, silicon manufacturers have introduced architectures that separate data and applications from the host system, ARM, Intel, AMD, Nvidia and other suppliers are all on board, as are the big cloud players. Now, the argument against confidential computing is that it narrowly focuses on memory encryption and it doesn't solve the biggest problems in security. Multiple system images, updates, different services and the entire code flow aren't directly addressed by memory encryption. Rather to truly attack these problems, many believe that OSs need to be re-engineered with the attacker and hacker in mind. There are so many variables and at the end of the day, critics say the emphasis on confidential computing made by cloud providers is overstated and largely hype. This tweet from security researcher Rodrigo Bronco, sums up the sentiment of many skeptics. He says, "Confidential computing is mostly a marketing campaign from memory encryption. It's not driving the industry towards the hard open problems. It is selling an illusion." Okay. Nonetheless, encrypting data in use and fencing off key components of the system isn't a bad thing, especially if it comes with the package essentially for free. There has been a lack of standardization and interoperability between different confidential computing approaches. But the confidential computing consortium was established in 2019 ostensibly to accelerate the market and influence standards. Notably, AWS is not part of the consortium, likely because the politics of the consortium were probably a conundrum for AWS because the base technology defined by the consortium is seen as limiting by AWS. This is my guess, not AWS' words. But I think joining the consortium would validate a definition which AWS isn't aligned with. And two, it's got to lead with this Annapurna acquisition. It was way ahead with ARM integration, and so it's probably doesn't feel the need to validate its competitors. Anyway, one of the premier members of the confidential computing consortium is Google, along with many high profile names, including Aem, Intel, Meta, Red Hat, Microsoft, and others. And we're pleased to welcome two experts on confidential computing from Google to unpack the topic. Nelly Porter is Head of Product for GCP Confidential Computing and Encryption and Dr. Patricia Florissi is the Technical Director for the Office of the CTO at Google Cloud. Welcome Nelly and Patricia, great to have you. >> Great to be here. >> Thank you so much for having us. >> You're very welcome. Nelly, why don't you start and then Patricia, you can weigh in. Just tell the audience a little bit about each of your roles at Google Cloud. >> So I'll start, I'm owning a lot of interesting activities in Google and again, security or infrastructure securities that I usually own. And we are talking about encryption, end-to-end encryption, and confidential computing is a part of portfolio. Additional areas that I contribute to get with my team to Google and our customers is secure software supply chain because you need to trust your software. Is it operate in your confidential environment to have end-to-end security, about if you believe that your software and your environment doing what you expect, it's my role. >> Got it. Okay, Patricia? >> Well, I am a Technical Director in the Office of the CTO, OCTO for short in Google Cloud. And we are a global team, we include former CTOs like myself and senior technologies from large corporations, institutions and a lot of success for startups as well. And we have two main goals, first, we walk side by side with some of our largest, more strategic or most strategical customers and we help them solve complex engineering technical problems. And second, we advice Google and Google Cloud Engineering, product management on emerging trends and technologies to guide the trajectory of our business. We are unique group, I think, because we have created this collaborative culture with our customers. And within OCTO I spend a lot of time collaborating with customers in the industry at large on technologies that can address privacy, security, and sovereignty of data in general. >> Excellent. Thank you for that both of you. Let's get into it. So Nelly, what is confidential computing from Google's perspective? How do you define it? >> Confidential computing is a tool and one of the tools in our toolbox. And confidential computing is a way how we would help our customers to complete this very interesting end-to-end lifecycle of the data. And when customers bring in the data to cloud and want to protect it as they ingest it to the cloud, they protect it at rest when they store data in the cloud. But what was missing for many, many years is ability for us to continue protecting data and workloads of our customers when they run them. And again, because data is not brought to cloud to have huge graveyard, we need to ensure that this data is actually indexed. Again, there is some insights driven and drawn from this data. You have to process this data and confidential computing here to help. Now we have end-to-end protection of our customer's data when they bring the workloads and data to cloud thanks to confidential computing. >> Thank you for that. Okay, we're going to get into the architecture a bit, but before we do Patricia, why do you think this topic of confidential computing is such an important technology? Can you explain? Do you think it's transformative for customers and if so, why? >> Yeah, I would maybe like to use one thought, one way, one intuition behind why confidential computing matters because at the end of the day, it reduces more and more the customer's thrush boundaries and the attack surface. That's about reducing that periphery, the boundary in which the customer needs to mind about trust and safety. And in a way is a natural progression that you're using encryption to secure and protect data in the same way that we are encrypting data in transit and at rest. Now, we are also encrypting data while in the use. And among other beneficials, I would say one of the most transformative ones is that organizations will be able to collaborate with each other and retain the confidentiality of the data. And that is across industry, even though it's highly focused on, I wouldn't say highly focused but very beneficial for highly regulated industries, it applies to all of industries. And if you look at financing for example, where bankers are trying to detect fraud and specifically double finance where a customer is actually trying to get a finance on an asset, let's say a boat or a house, and then it goes to another bank and gets another finance on that asset. Now bankers would be able to collaborate and detect fraud while preserving confidentiality and privacy of the data. >> Interesting and I want to understand that a little bit more but I got to push you a little bit on this, Nellie if I can, because there's a narrative out there that says confidential computing is a marketing ploy I talked about this up front, by cloud providers that are just trying to placate people that are scared of the cloud. And I'm presuming you don't agree with that, but I'd like you to weigh in here. The argument is confidential computing is just memory encryption, it doesn't address many other problems. It is over hyped by cloud providers. What do you say to that line of thinking? >> I absolutely disagree as you can imagine Dave, with this statement. But the most importantly is we mixing a multiple concepts I guess, and exactly as Patricia said, we need to look at the end-to-end story, not again, is a mechanism. How confidential computing trying to execute and protect customer's data and why it's so critically important. Because what confidential computing was able to do, it's in addition to isolate our tenants in multi-tenant environments the cloud offering to offer additional stronger isolation, they called it cryptographic isolation. It's why customers will have more trust to customers and to other customers, the tenants running on the same host but also us because they don't need to worry about against rats and more malicious attempts to penetrate the environment. So what confidential computing is helping us to offer our customers stronger isolation between tenants in this multi-tenant environment, but also incredibly important, stronger isolation of our customers to tenants from us. We also writing code, we also software providers, we also make mistakes or have some zero days. Sometimes again us introduce, sometimes introduced by our adversaries. But what I'm trying to say by creating this cryptographic layer of isolation between us and our tenants and among those tenants, we really providing meaningful security to our customers and eliminate some of the worries that they have running on multi-tenant spaces or even collaborating together with very sensitive data knowing that this particular protection is available to them. >> Okay, thank you. Appreciate that. And I think malicious code is often a threat model missed in these narratives. You know, operator access. Yeah, maybe I trust my cloud's provider, but if I can fence off your access even better, I'll sleep better at night separating a code from the data. Everybody's ARM, Intel, AMD, Nvidia and others, they're all doing it. I wonder if Nell, if we could stay with you and bring up the slide on the architecture. What's architecturally different with confidential computing versus how operating systems and VMs have worked traditionally? We're showing a slide here with some VMs, maybe you could take us through that. >> Absolutely, and Dave, the whole idea for Google and now industry way of dealing with confidential computing is to ensure that three main property is actually preserved. Customers don't need to change the code. They can operate in those VMs exactly as they would with normal non-confidential VMs. But to give them this opportunity of lift and shift though, no changing the apps and performing and having very, very, very low latency and scale as any cloud can, some things that Google actually pioneer in confidential computing. I think we need to open and explain how this magic was actually done, and as I said, it's again the whole entire system have to change to be able to provide this magic. And I would start with we have this concept of root of trust and root of trust where we will ensure that this machine within the whole entire host has integrity guarantee, means nobody changing my code on the most low level of system, and we introduce this in 2017 called Titan. So our specific ASIC, specific inch by inch system on every single motherboard that we have that ensures that your low level former, your actually system code, your kernel, the most powerful system is actually proper configured and not changed, not tempered. We do it for everybody, confidential computing included, but for confidential computing is what we have to change, we bring in AMD or future silicon vendors and we have to trust their former, their way to deal with our confidential environments. And that's why we have obligation to validate intelligent not only our software and our former but also former and software of our vendors, silicon vendors. So we actually, when we booting this machine as you can see, we validate that integrity of all of this system is in place. It means nobody touching, nobody changing, nobody modifying it. But then we have this concept of AMD Secure Processor, it's special ASIC best specific things that generate a key for every single VM that our customers will run or every single node in Kubernetes or every single worker thread in our Hadoop spark capability. We offer all of that and those keys are not available to us. It's the best case ever in encryption space because when we are talking about encryption, the first question that I'm receiving all the time, "Where's the key? Who will have access to the key?" because if you have access to the key then it doesn't matter if you encrypted or not. So, but the case in confidential computing why it's so revolutionary technology, us cloud providers who don't have access to the keys, they're sitting in the hardware and they fed to memory controller. And it means when hypervisors that also know about this wonderful things saying I need to get access to the memories, that this particular VM I'm trying to get access to. They do not decrypt the data, they don't have access to the key because those keys are random, ephemeral and per VM, but most importantly in hardware not exportable. And it means now you will be able to have this very interesting world that customers or cloud providers will not be able to get access to your memory. And what we do, again as you can see, our customers don't need to change their applications. Their VMs are running exactly as it should run. And what you've running in VM, you actually see your memory clear, it's not encrypted. But God forbid is trying somebody to do it outside of my confidential box, no, no, no, no, no, you will now be able to do it. Now, you'll see cyber test and it's exactly what combination of these multiple hardware pieces and software pieces have to do. So OS is also modified and OS is modified such way to provide integrity. It means even OS that you're running in your VM box is not modifiable and you as customer can verify. But the most interesting thing I guess how to ensure the super performance of this environment because you can imagine Dave, that's increasing and it's additional performance, additional time, additional latency. So we're able to mitigate all of that by providing incredibly interesting capability in the OS itself. So our customers will get no changes needed, fantastic performance and scales as they would expect from cloud providers like Google. >> Okay, thank you. Excellent, appreciate that explanation. So you know again, the narrative on this is, well, you've already given me guarantees as a cloud provider that you don't have access to my data, but this gives another level of assurance, key management as they say is key. Now humans aren't managing the keys, the machines are managing them. So Patricia, my question to you is in addition to, let's go pre-confidential computing days, what are the sort of new guarantees that these hardware based technologies are going to provide to customers? >> So if I am a customer, I am saying I now have full guarantee of confidentiality and integrity of the data and of the code. So if you look at code and data confidentiality, the customer cares and they want to know whether their systems are protected from outside or unauthorized access, and that we covered with Nelly that it is. Confidential computing actually ensures that the applications and data antennas remain secret. The code is actually looking at the data, only the memory is decrypting the data with a key that is ephemeral, and per VM, and generated on demand. Then you have the second point where you have code and data integrity and now customers want to know whether their data was corrupted, tempered with or impacted by outside actors. And what confidential computing ensures is that application internals are not tempered with. So the application, the workload as we call it, that is processing the data is also has not been tempered and preserves integrity. I would also say that this is all verifiable, so you have attestation and this attestation actually generates a log trail and the log trail guarantees that provides a proof that it was preserved. And I think that the offers also a guarantee of what we call sealing, this idea that the secrets have been preserved and not tempered with, confidentiality and integrity of code and data. >> Got it. Okay, thank you. Nelly, you mentioned, I think I heard you say that the applications is transparent, you don't have to change the application, it just comes for free essentially. And we showed some various parts of the stack before, I'm curious as to what's affected, but really more importantly, what is specifically Google's value add? How do partners participate in this, the ecosystem or maybe said another way, how does Google ensure the compatibility of confidential computing with existing systems and applications? >> And a fantastic question by the way, and it's very difficult and definitely complicated world because to be able to provide these guarantees, actually a lot of work was done by community. Google is very much operate and open. So again our operating system, we working this operating system repository OS is OS vendors to ensure that all capabilities that we need is part of the kernels are part of the releases and it's available for customers to understand and even explore if they have fun to explore a lot of code. We have also modified together with our silicon vendors kernel, host kernel to support this capability and it means working this community to ensure that all of those pages are there. We also worked with every single silicon vendor as you've seen, and it's what I probably feel that Google contributed quite a bit in this world. We moved our industry, our community, our vendors to understand the value of easy to use confidential computing or removing barriers. And now I don't know if you noticed Intel is following the lead and also announcing a trusted domain extension, very similar architecture and no surprise, it's a lot of work done with our partners to convince work with them and make this capability available. The same with ARM this year, actually last year, ARM announced future design for confidential computing, it's called confidential computing architecture. And it's also influenced very heavily with similar ideas by Google and industry overall. So it's a lot of work in confidential computing consortiums that we are doing, for example, simply to mention, to ensure interop as you mentioned, between different confidential environments of cloud providers. They want to ensure that they can attest to each other because when you're communicating with different environments, you need to trust them. And if it's running on different cloud providers, you need to ensure that you can trust your receiver when you sharing your sensitive data workloads or secret with them. So we coming as a community and we have this at Station Sig, the community-based systems that we want to build, and influence, and work with ARM and every other cloud providers to ensure that they can interop. And it means it doesn't matter where confidential workloads will be hosted, but they can exchange the data in secure, verifiable and controlled by customers really. And to do it, we need to continue what we are doing, working open and contribute with our ideas and ideas of our partners to this role to become what we see confidential computing has to become, it has to become utility. It doesn't need to be so special, but it's what what we've wanted to become. >> Let's talk about, thank you for that explanation. Let's talk about data sovereignty because when you think about data sharing, you think about data sharing across the ecosystem in different regions and then of course data sovereignty comes up, typically public policy, lags, the technology industry and sometimes it's problematic. I know there's a lot of discussions about exceptions but Patricia, we have a graphic on data sovereignty. I'm interested in how confidential computing ensures that data sovereignty and privacy edicts are adhered to, even if they're out of alignment maybe with the pace of technology. One of the frequent examples is when you delete data, can you actually prove the data is deleted with a hundred percent certainty, you got to prove that and a lot of other issues. So looking at this slide, maybe you could take us through your thinking on data sovereignty. >> Perfect. So for us, data sovereignty is only one of the three pillars of digital sovereignty. And I don't want to give the impression that confidential computing addresses it at all, that's why we want to step back and say, hey, digital sovereignty includes data sovereignty where we are giving you full control and ownership of the location, encryption and access to your data. Operational sovereignty where the goal is to give our Google Cloud customers full visibility and control over the provider operations, right? So if there are any updates on hardware, software stack, any operations, there is full transparency, full visibility. And then the third pillar is around software sovereignty, where the customer wants to ensure that they can run their workloads without dependency on the provider's software. So they have sometimes is often referred as survivability that you can actually survive if you are untethered to the cloud and that you can use open source. Now, let's take a deep dive on data sovereignty, which by the way is one of my favorite topics. And we typically focus on saying, hey, we need to care about data residency. We care where the data resides because where the data is at rest or in processing need to typically abides to the jurisdiction, the regulations of the jurisdiction where the data resides. And others say, hey, let's focus on data protection, we want to ensure the confidentiality, and integrity, and availability of the data, which confidential computing is at the heart of that data protection. But it is yet another element that people typically don't talk about when talking about data sovereignty, which is the element of user control. And here Dave, is about what happens to the data when I give you access to my data, and this reminds me of security two decades ago, even a decade ago, where we started the security movement by putting firewall protections and logging accesses. But once you were in, you were able to do everything you wanted with the data. An insider had access to all the infrastructure, the data, and the code. And that's similar because with data sovereignty, we care about whether it resides, who is operating on the data, but the moment that the data is being processed, I need to trust that the processing of the data we abide by user's control, by the policies that I put in place of how my data is going to be used. And if you look at a lot of the regulation today and a lot of the initiatives around the International Data Space Association, IDSA and Gaia-X, there is a movement of saying the two parties, the provider of the data and the receiver of the data going to agree on a contract that describes what my data can be used for. The challenge is to ensure that once the data crosses boundaries, that the data will be used for the purposes that it was intended and specified in the contract. And if you actually bring together, and this is the exciting part, confidential computing together with policy enforcement. Now, the policy enforcement can guarantee that the data is only processed within the confines of a confidential computing environment, that the workload is in cryptographically verified that there is the workload that was meant to process the data and that the data will be only used when abiding to the confidentiality and integrity safety of the confidential computing environment. And that's why we believe confidential computing is one necessary and essential technology that will allow us to ensure data sovereignty, especially when it comes to user's control. >> Thank you for that. I mean it was a deep dive, I mean brief, but really detailed. So I appreciate that, especially the verification of the enforcement. Last question, I met you two because as part of my year-end prediction post, you guys sent in some predictions and I wasn't able to get to them in the predictions post, so I'm thrilled that you were able to make the time to come on the program. How widespread do you think the adoption of confidential computing will be in '23 and what's the maturity curve look like this decade in your opinion? Maybe each of you could give us a brief answer. >> So my prediction in five, seven years as I started, it will become utility, it will become TLS. As of freakin' 10 years ago, we couldn't believe that websites will have certificates and we will support encrypted traffic. Now we do, and it's become ubiquity. It's exactly where our confidential computing is heeding and heading, I don't know we deserve yet. It'll take a few years of maturity for us, but we'll do that. >> Thank you. And Patricia, what's your prediction? >> I would double that and say, hey, in the very near future, you will not be able to afford not having it. I believe as digital sovereignty becomes ever more top of mind with sovereign states and also for multinational organizations, and for organizations that want to collaborate with each other, confidential computing will become the norm, it will become the default, if I say mode of operation. I like to compare that today is inconceivable if we talk to the young technologists, it's inconceivable to think that at some point in history and I happen to be alive, that we had data at rest that was non-encrypted, data in transit that was not encrypted. And I think that we'll be inconceivable at some point in the near future that to have unencrypted data while we use. >> You know, and plus I think the beauty of the this industry is because there's so much competition, this essentially comes for free. I want to thank you both for spending some time on Breaking Analysis, there's so much more we could cover. I hope you'll come back to share the progress that you're making in this area and we can double click on some of these topics. Really appreciate your time. >> Anytime. >> Thank you so much, yeah. >> In summary, while confidential computing is being touted by the cloud players as a promising technology for enhancing data privacy and security, there are also those as we said, who remain skeptical. The truth probably lies somewhere in between and it will depend on the specific implementation and the use case as to how effective confidential computing will be. Look as with any new tech, it's important to carefully evaluate the potential benefits, the drawbacks, and make informed decisions based on the specific requirements in the situation and the constraints of each individual customer. But the bottom line is silicon manufacturers are working with cloud providers and other system companies to include confidential computing into their architectures. Competition in our view will moderate price hikes and at the end of the day, this is under-the-covers technology that essentially will come for free, so we'll take it. I want to thank our guests today, Nelly and Patricia from Google. And thanks to Alex Myerson who's on production and manages the podcast. Ken Schiffman as well out of our Boston studio. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters, and Rob Hoof is our editor-in-chief over at siliconangle.com, does some great editing for us. Thank you all. Remember all these episodes are available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com where you can get all the news. If you want to get in touch, you can email me at david.vellante@siliconangle.com or DM me at D Vellante, and you can also comment on my LinkedIn post. Definitely you want to check out etr.ai for the best survey data in the enterprise tech business. I know we didn't hit on a lot today, but there's some amazing data and it's always being updated, so check that out. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching and we'll see you next time on Breaking Analysis. (subtle music)

Published Date : Feb 10 2023

SUMMARY :

bringing you data-driven and at the end of the day, and then Patricia, you can weigh in. contribute to get with my team Okay, Patricia? Director in the Office of the CTO, for that both of you. in the data to cloud into the architecture a bit, and privacy of the data. that are scared of the cloud. and eliminate some of the we could stay with you and they fed to memory controller. to you is in addition to, and integrity of the data and of the code. that the applications is transparent, and ideas of our partners to this role One of the frequent examples and a lot of the initiatives of the enforcement. and we will support encrypted traffic. And Patricia, and I happen to be alive, the beauty of the this industry and at the end of the day,

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
NellyPERSON

0.99+

PatriciaPERSON

0.99+

Alex MyersonPERSON

0.99+

AWSORGANIZATION

0.99+

International Data Space AssociationORGANIZATION

0.99+

DavePERSON

0.99+

AWS'ORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

Dave VellantePERSON

0.99+

Rob HoofPERSON

0.99+

Cheryl KnightPERSON

0.99+

Nelly PorterPERSON

0.99+

GoogleORGANIZATION

0.99+

NvidiaORGANIZATION

0.99+

IDSAORGANIZATION

0.99+

Rodrigo BroncoPERSON

0.99+

2019DATE

0.99+

Ken SchiffmanPERSON

0.99+

IntelORGANIZATION

0.99+

AMDORGANIZATION

0.99+

2017DATE

0.99+

ARMORGANIZATION

0.99+

AemORGANIZATION

0.99+

NelliePERSON

0.99+

Kristin MartinPERSON

0.99+

Red HatORGANIZATION

0.99+

two partiesQUANTITY

0.99+

Palo AltoLOCATION

0.99+

last yearDATE

0.99+

Patricia FlorissiPERSON

0.99+

oneQUANTITY

0.99+

MetaORGANIZATION

0.99+

twoQUANTITY

0.99+

thirdQUANTITY

0.99+

Gaia-XORGANIZATION

0.99+

second pointQUANTITY

0.99+

two expertsQUANTITY

0.99+

david.vellante@siliconangle.comOTHER

0.99+

secondQUANTITY

0.99+

bothQUANTITY

0.99+

first questionQUANTITY

0.99+

fiveQUANTITY

0.99+

OneQUANTITY

0.99+

theCUBE StudiosORGANIZATION

0.99+

two decades agoDATE

0.99+

'23DATE

0.99+

eachQUANTITY

0.99+

a decade agoDATE

0.99+

threeQUANTITY

0.99+

zero daysQUANTITY

0.98+

fourQUANTITY

0.98+

OCTOORGANIZATION

0.98+

todayDATE

0.98+

Breaking Analysis: Cloud players sound a cautious tone for 2023


 

>> From the Cube Studios in Palo Alto in Boston bringing you data-driven insights from the Cube and ETR. This is Breaking Analysis with Dave Vellante. >> The unraveling of market enthusiasm continued in Q4 of 2022 with the earnings reports from the US hyperscalers, the big three now all in. As we said earlier this year, even the cloud is an immune from the macro headwinds and the cracks in the armor that we saw from the data that we shared last summer, they're playing out into 2023. For the most part actuals are disappointing beyond expectations including our own. It turns out that our estimates for the big three hyperscaler's revenue missed by 1.2 billion or 2.7% lower than we had forecast from even our most recent November estimates. And we expect continued decelerating growth rates for the hyperscalers through the summer of 2023 and we don't think that's going to abate until comparisons get easier. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we share our view of what's happening in cloud markets not just for the hyperscalers but other firms that have hitched a ride on the cloud. And we'll share new ETR data that shows why these trends are playing out tactics that customers are employing to deal with their cost challenges and how long the pain is likely to last. You know, riding the cloud wave, it's a two-edged sword. Let's look at the players that have gone all in on or are exposed to both the positive and negative trends of cloud. Look the cloud has been a huge tailwind for so many companies like Snowflake and Databricks, Workday, Salesforce, Mongo's move with Atlas, Red Hats Cloud strategy with OpenShift and so forth. And you know, the flip side is because cloud is elastic what comes up can also go down very easily. Here's an XY graphic from ETR that shows spending momentum or net score on the vertical axis and market presence in the dataset on the horizontal axis provision or called overlap. This is data from the January 2023 survey and that the red dotted lines show the positions of several companies that we've highlighted going back to January 2021. So let's unpack this for a bit starting with the big three hyperscalers. The first point is AWS and Azure continue to solidify their moat relative to Google Cloud platform. And we're going to get into this in a moment, but Azure and AWS revenues are five to six times that of GCP for IaaS. And at those deltas, Google should be gaining ground much faster than the big two. The second point on Google is notice the red line on GCP relative to its starting point. While it appears to be gaining ground on the horizontal axis, its net score is now below that of AWS and Azure in the survey. So despite its significantly smaller size it's just not keeping pace with the leaders in terms of market momentum. Now looking at AWS and Microsoft, what we see is basically AWS is holding serve. As we know both Google and Microsoft benefit from including SaaS in their cloud numbers. So the fact that AWS hasn't seen a huge downward momentum relative to a January 2021 position is one positive in the data. And both companies are well above that magic 40% line on the Y-axis, anything above 40% we consider to be highly elevated. But the fact remains that they're down as are most of the names on this chart. So let's take a closer look. I want to start with Snowflake and Databricks. Snowflake, as we reported from several quarters back came down to Earth, it was up in the 80% range in the Y-axis here. And it's still highly elevated in the 60% range and it continues to move to the right, which is positive but as we'll address in a moment it's customers can dial down consumption just as in any cloud. Now, Databricks is really interesting. It's not a public company, it never made it to IPO during the sort of tech bubble. So we don't have the same level of transparency that we do with other companies that did make it through. But look at how much more prominent it is on the X-axis relative to January 2021. And it's net score is basically held up over that period of time. So that's a real positive for Databricks. Next, look at Workday and Salesforce. They've held up relatively well, both inching to the right and generally holding their net scores. Same from Mongo, which is the brown dot above its name that says Elastic, it says a little gets a little crowded which Elastic's actually the blue dot above it. But generally, SaaS is harder to dial down, Workday, Salesforce, Oracles, SaaS and others. So it's harder to dial down because commitments have been made in advance, they're kind of locked in. Now, one of the discussions from last summer was as Mongo, less discretionary than analytics i.e. Snowflake. And it's an interesting debate but maybe Snowflake customers, you know, they're also generally committed to a dollar amount. So over time the spending is going to be there. But in the short term, yeah maybe Snowflake customers can dial down. Now that highlighted dotted red line, that bolded one is Datadog and you can see it's made major strides on the X-axis but its net score has decelerated quite dramatically. Openshift's momentum in the survey has dropped although IBM just announced that OpenShift has a a billion dollar ARR and I suspect what's happening there is IBM consulting is bundling OpenShift into its modernization projects. It's got a, that sort of captive base if you will. And as such it's probably not as top of mind to the respondents but I'll bet you the developers are certainly aware of it. Now the other really notable call out here is CloudFlare, We've reported on them earlier. Cloudflare's net score has held up really well since January of 2021. It really hasn't seen the downdraft of some of these others, but it's making major major moves to the right gaining market presence. We really like how CloudFlare is performing. And the last comment is on Oracle which as you can see, despite its much, much lower net score continues to gain ground in the market and thrive from a profitability standpoint. But the data pretty clearly shows that there's a downdraft in the market. Okay, so what's happening here? Let's dig deeper into this data. Here's a graphic from the most recent ETR drill down asking customers that said they were going to cut spending what technique they're using to do so. Now, as we've previously reported, consolidating redundant vendors is by far the most cited approach but there's two key points we want to make here. One is reducing excess cloud resources. As you can see in the bars is the second most cited technique and it's up from the previous polling period. The second we're not showing, you know directly but we've got some red call outs there. Reducing cloud costs jumps to 29% and 28% respectively in financial services and tech telco. And it's much closer to second. It's basically neck and neck with consolidating redundant vendors in those two industries. So they're being really aggressive about optimizing cloud cost. Okay, so as we said, cloud is great 'cause you can dial it up but it's just as easy to dial down. We've identified six factors that customers tell us are affecting their cloud consumption and there are probably more, if you got more we'd love to hear them but these are the ones that are fairly prominent that have hit our radar. First, rising mortgage rates mean banks are processing fewer loans means less cloud. The crypto crash means less trading activity and that means less cloud resources. Third lower ad spend has led companies to reduce not only you know, their ad buying but also their frequency of running their analytics and their calculations. And they're also often using less data, maybe compressing the timeframe of the corpus down to a shorter time period. Also very prominent is down to the bottom left, using lower cost compute instances. For example, Graviton from AWS or AMD chips and tiering storage to cheaper S3 or deep archived tiers. And finally, optimizing based on better pricing plans. So customers are moving from, you know, smaller companies in particular moving maybe from on demand or other larger companies that are experimenting using on demand or they're moving to spot pricing or reserved instances or optimized savings plans. That all lowers cost and that means less cloud resource consumption and less cloud revenue. Now in the days when everything was on prem CFOs, what would they do? They would freeze CapEx and IT Pros would have to try to do more with less and often that meant a lot of manual tasks. With the cloud it's much easier to move things around. It still takes some thinking and some effort but it's dramatically simpler to do so. So you can get those savings a lot faster. Now of course the other huge factor is you can cut or you can freeze. And this graphic shows data from a recent ETR survey with 159 respondents and you can see the meaningful uptick in hiring freezes, freezing new IT deployments and layoffs. And as we've been reporting, this has been trending up since earlier last year. And note the call out, this is especially prominent in retail sectors, all three of these techniques jump up in retail and that's a bit of a concern because oftentimes consumer spending helps the economy make a softer landing out of a pullback. But this is a potential canary in the coal mine. If retail firms are pulling back it's because consumers aren't spending as much. And so we're keeping a close eye on that. So let's boil this down to the market data and what this all means. So in this graphic we show our estimates for Q4 IaaS revenues compared to the "actual" IaaS revenues. And we say quote because AWS is the only one that reports, you know clean revenue and IaaS, Azure and GCP don't report actuals. Why would they? Because it would make them look even, you know smaller relative to AWS. Rather, they bury the figures in overall cloud which includes their, you know G-Suite for Google and all the Microsoft SaaS. And then they give us little tidbits about in Microsoft's case, Azure, they give growth rates. Google gives kind of relative growth of GCP. So, and we use survey data and you know, other data to try to really pinpoint and we've been covering this for, I don't know, five or six years ever since the cloud really became a thing. But looking at the data, we had AWS growing at 25% this quarter and it came in at 20%. So a significant decline relative to our expectations. AWS announced that it exited December, actually, sorry it's January data showed about a 15% mid-teens growth rate. So that's, you know, something we're watching. Azure was two points off our forecast coming in at 38% growth. It said it exited December in the 35% growth range and it said that it's expecting five points of deceleration off of that. So think 30% for Azure. GCP came in three points off our expectation coming in 35% and Alibaba has yet to report but we've shaved a bid off that forecast based on some survey data and you know what maybe 9% is even still not enough. Now for the year, the big four hyperscalers generated almost 160 billion of revenue, but that was 7 billion lower than what what we expected coming into 2022. For 2023, we're expecting 21% growth for a total of 193.3 billion. And while it's, you know, lower, you know, significantly lower than historical expectations it's still four to five times the overall spending forecast that we just shared with you in our predictions post of between 4 and 5% for the overall market. We think AWS is going to come in in around 93 billion this year with Azure closing in at over 71 billion. This is, again, we're talking IaaS here. Now, despite Amazon focusing investors on the fact that AWS's absolute dollar growth is still larger than its competitors. By our estimates Azure will come in at more than 75% of AWS's forecasted revenue. That's a significant milestone. AWS is operating margins by the way declined significantly this past quarter, dropping from 30% of revenue to 24%, 30% the year earlier to 24%. Now that's still extremely healthy and we've seen wild fluctuations like this before so I don't get too freaked out about that. But I'll say this, Microsoft has a marginal cost advantage relative to AWS because one, it has a captive cloud on which to run its massive software estate. So it can just throw software at its own cloud and two software marginal costs. Marginal economics despite AWS's awesomeness in high degrees of automation, software is just a better business. Now the upshot for AWS is the ecosystem. AWS is essentially in our view positioning very smartly as a platform for data partners like Snowflake and Databricks, security partners like CrowdStrike and Okta and Palo Alto and many others and SaaS companies. You know, Microsoft is more competitive even though AWS does have competitive products. Now of course Amazon's competitive to retail companies so that's another factor but generally speaking for tech players, Amazon is a really thriving ecosystem that is a secret weapon in our view. AWS happy to spin the meter with its partners even though it sells competitive products, you know, more so in our view than other cloud players. Microsoft, of course is, don't forget is hyping now, we're hearing a lot OpenAI and ChatGPT we reported last week in our predictions post. How OpenAI is shot up in terms of market sentiment in ETR's emerging technology company surveys and people are moving to Azure to get OpenAI and get ChatGPT that is a an interesting lever. Amazon in our view has to have a response. They have lots of AI and they're going to have to make some moves there. Meanwhile, Google is emphasizing itself as an AI first company. In fact, Google spent at least five minutes of continuous dialogue, nonstop on its AI chops during its latest earnings call. So that's an area that we're watching very closely as the buzz around large language models continues. All right, let's wrap up with some assumptions for 2023. We think SaaS players are going to continue to be sticky. They're going to be somewhat insulated from all these downdrafts because they're so tied in and customers, you know they make the commitment up front, you've got the lock in. Now having said that, we do expect some backlash over time on the onerous and generally customer unfriendly pricing models of most large SaaS companies. But that's going to play out over a longer period of time. Now for cloud generally and the hyperscalers specifically we do expect accelerating growth rates into Q3 but the amplitude of the demand swings from this rubber band economy, we expect to continue to compress and become more predictable throughout the year. Estimates are coming down, CEOs we think are going to be more cautious when the market snaps back more cautious about hiring and spending and as such a perhaps we expect a more orderly return to growth which we think will slightly accelerate in Q4 as comps get easier. Now of course the big risk to these scenarios is of course the economy, the FED, consumer spending, inflation, supply chain, energy prices, wars, geopolitics, China relations, you know, all the usual stuff. But as always with our partners at ETR and the Cube community, we're here for you. We have the data and we'll be the first to report when we see a change at the margin. Okay, that's a wrap for today. I want to thank Alex Morrison who's on production and manages the podcast, Ken Schiffman as well out of our Boston studio getting this up on LinkedIn Live. Thank you for that. Kristen Martin also and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our Editor-in-Chief over at siliconangle.com. He does some great editing for us. Thank you all. Remember all these episodes are available as podcast. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com, at siliconangle.com where you can see all the data and you want to get in touch. Just all you can do is email me david.vellante@siliconangle.com or DM me @dvellante if you if you got something interesting, I'll respond. If you don't, it's either 'cause I'm swamped or it's just not tickling me. You can comment on our LinkedIn post as well. And please check out ETR.ai for the best survey data in the enterprise tech business. This is Dave Vellante for the Cube Insights powered by ETR. Thanks for watching and we'll see you next time on Breaking Analysis. (gentle upbeat music)

Published Date : Feb 4 2023

SUMMARY :

From the Cube Studios and how long the pain is likely to last.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MorrisonPERSON

0.99+

AWSORGANIZATION

0.99+

AlibabaORGANIZATION

0.99+

Cheryl KnightPERSON

0.99+

Kristen MartinPERSON

0.99+

Dave VellantePERSON

0.99+

Ken SchiffmanPERSON

0.99+

January 2021DATE

0.99+

MicrosoftORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

Rob HofPERSON

0.99+

2.7%QUANTITY

0.99+

JanuaryDATE

0.99+

AmazonORGANIZATION

0.99+

DecemberDATE

0.99+

January of 2021DATE

0.99+

fiveQUANTITY

0.99+

January 2023DATE

0.99+

SnowflakeORGANIZATION

0.99+

Palo AltoLOCATION

0.99+

1.2 billionQUANTITY

0.99+

20%QUANTITY

0.99+

IBMORGANIZATION

0.99+

DatabricksORGANIZATION

0.99+

29%QUANTITY

0.99+

30%QUANTITY

0.99+

six factorsQUANTITY

0.99+

second pointQUANTITY

0.99+

24%QUANTITY

0.99+

2022DATE

0.99+

david.vellante@siliconangle.comOTHER

0.99+

X-axisORGANIZATION

0.99+

2023DATE

0.99+

28%QUANTITY

0.99+

193.3 billionQUANTITY

0.99+

ETRORGANIZATION

0.99+

38%QUANTITY

0.99+

7 billionQUANTITY

0.99+

21%QUANTITY

0.99+

EarthLOCATION

0.99+

25%QUANTITY

0.99+

MongoORGANIZATION

0.99+

OracleORGANIZATION

0.99+

AtlasORGANIZATION

0.99+

two industriesQUANTITY

0.99+

last weekDATE

0.99+

six yearsQUANTITY

0.99+

first pointQUANTITY

0.99+

Red HatsORGANIZATION

0.99+

35%QUANTITY

0.99+

fourQUANTITY

0.99+

159 respondentsQUANTITY

0.99+

OktaORGANIZATION

0.99+

Breaking Analysis: ChatGPT Won't Give OpenAI First Mover Advantage


 

>> From theCUBE Studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> OpenAI The company, and ChatGPT have taken the world by storm. Microsoft reportedly is investing an additional 10 billion dollars into the company. But in our view, while the hype around ChatGPT is justified, we don't believe OpenAI will lock up the market with its first mover advantage. Rather, we believe that success in this market will be directly proportional to the quality and quantity of data that a technology company has at its disposal, and the compute power that it could deploy to run its system. Hello and welcome to this week's Wikibon CUBE insights, powered by ETR. In this Breaking Analysis, we unpack the excitement around ChatGPT, and debate the premise that the company's early entry into the space may not confer winner take all advantage to OpenAI. And to do so, we welcome CUBE collaborator, alum, Sarbjeet Johal, (chuckles) and John Furrier, co-host of the Cube. Great to see you Sarbjeet, John. Really appreciate you guys coming to the program. >> Great to be on. >> Okay, so what is ChatGPT? Well, actually we asked ChatGPT, what is ChatGPT? So here's what it said. ChatGPT is a state-of-the-art language model developed by OpenAI that can generate human-like text. It could be fine tuned for a variety of language tasks, such as conversation, summarization, and language translation. So I asked it, give it to me in 50 words or less. How did it do? Anything to add? >> Yeah, think it did good. It's large language model, like previous models, but it started applying the transformers sort of mechanism to focus on what prompt you have given it to itself. And then also the what answer it gave you in the first, sort of, one sentence or two sentences, and then introspect on itself, like what I have already said to you. And so just work on that. So it it's self sort of focus if you will. It does, the transformers help the large language models to do that. >> So to your point, it's a large language model, and GPT stands for generative pre-trained transformer. >> And if you put the definition back up there again, if you put it back up on the screen, let's see it back up. Okay, it actually missed the large, word large. So one of the problems with ChatGPT, it's not always accurate. It's actually a large language model, and it says state of the art language model. And if you look at Google, Google has dominated AI for many times and they're well known as being the best at this. And apparently Google has their own large language model, LLM, in play and have been holding it back to release because of backlash on the accuracy. Like just in that example you showed is a great point. They got almost right, but they missed the key word. >> You know what's funny about that John, is I had previously asked it in my prompt to give me it in less than a hundred words, and it was too long, I said I was too long for Breaking Analysis, and there it went into the fact that it's a large language model. So it largely, it gave me a really different answer the, for both times. So, but it's still pretty amazing for those of you who haven't played with it yet. And one of the best examples that I saw was Ben Charrington from This Week In ML AI podcast. And I stumbled on this thanks to Brian Gracely, who was listening to one of his Cloudcasts. Basically what Ben did is he took, he prompted ChatGPT to interview ChatGPT, and he simply gave the system the prompts, and then he ran the questions and answers into this avatar builder and sped it up 2X so it didn't sound like a machine. And voila, it was amazing. So John is ChatGPT going to take over as a cube host? >> Well, I was thinking, we get the questions in advance sometimes from PR people. We should actually just plug it in ChatGPT, add it to our notes, and saying, "Is this good enough for you? Let's ask the real question." So I think, you know, I think there's a lot of heavy lifting that gets done. I think the ChatGPT is a phenomenal revolution. I think it highlights the use case. Like that example we showed earlier. It gets most of it right. So it's directionally correct and it feels like it's an answer, but it's not a hundred percent accurate. And I think that's where people are seeing value in it. Writing marketing, copy, brainstorming, guest list, gift list for somebody. Write me some lyrics to a song. Give me a thesis about healthcare policy in the United States. It'll do a bang up job, and then you got to go in and you can massage it. So we're going to do three quarters of the work. That's why plagiarism and schools are kind of freaking out. And that's why Microsoft put 10 billion in, because why wouldn't this be a feature of Word, or the OS to help it do stuff on behalf of the user. So linguistically it's a beautiful thing. You can input a string and get a good answer. It's not a search result. >> And we're going to get your take on on Microsoft and, but it kind of levels the playing- but ChatGPT writes better than I do, Sarbjeet, and I know you have some good examples too. You mentioned the Reed Hastings example. >> Yeah, I was listening to Reed Hastings fireside chat with ChatGPT, and the answers were coming as sort of voice, in the voice format. And it was amazing what, he was having very sort of philosophy kind of talk with the ChatGPT, the longer sentences, like he was going on, like, just like we are talking, he was talking for like almost two minutes and then ChatGPT was answering. It was not one sentence question, and then a lot of answers from ChatGPT and yeah, you're right. I, this is our ability. I've been thinking deep about this since yesterday, we talked about, like, we want to do this segment. The data is fed into the data model. It can be the current data as well, but I think that, like, models like ChatGPT, other companies will have those too. They can, they're democratizing the intelligence, but they're not creating intelligence yet, definitely yet I can say that. They will give you all the finite answers. Like, okay, how do you do this for loop in Java, versus, you know, C sharp, and as a programmer you can do that, in, but they can't tell you that, how to write a new algorithm or write a new search algorithm for you. They cannot create a secretive code for you to- >> Not yet. >> Have competitive advantage. >> Not yet, not yet. >> but you- >> Can Google do that today? >> No one really can. The reasoning side of the data is, we talked about at our Supercloud event, with Zhamak Dehghani who's was CEO of, now of Nextdata. This next wave of data intelligence is going to come from entrepreneurs that are probably cross discipline, computer science and some other discipline. But they're going to be new things, for example, data, metadata, and data. It's hard to do reasoning like a human being, so that needs more data to train itself. So I think the first gen of this training module for the large language model they have is a corpus of text. Lot of that's why blog posts are, but the facts are wrong and sometimes out of context, because that contextual reasoning takes time, it takes intelligence. So machines need to become intelligent, and so therefore they need to be trained. So you're going to start to see, I think, a lot of acceleration on training the data sets. And again, it's only as good as the data you can get. And again, proprietary data sets will be a huge winner. Anyone who's got a large corpus of content, proprietary content like theCUBE or SiliconANGLE as a publisher will benefit from this. Large FinTech companies, anyone with large proprietary data will probably be a big winner on this generative AI wave, because it just, it will eat that up, and turn that back into something better. So I think there's going to be a lot of interesting things to look at here. And certainly productivity's going to be off the charts for vanilla and the internet is going to get swarmed with vanilla content. So if you're in the content business, and you're an original content producer of any kind, you're going to be not vanilla, so you're going to be better. So I think there's so much at play Dave (indistinct). >> I think the playing field has been risen, so we- >> Risen and leveled? >> Yeah, and leveled to certain extent. So it's now like that few people as consumers, as consumers of AI, we will have a advantage and others cannot have that advantage. So it will be democratized. That's, I'm sure about that. But if you take the example of calculator, when the calculator came in, and a lot of people are, "Oh, people can't do math anymore because calculator is there." right? So it's a similar sort of moment, just like a calculator for the next level. But, again- >> I see it more like open source, Sarbjeet, because like if you think about what ChatGPT's doing, you do a query and it comes from somewhere the value of a post from ChatGPT is just a reuse of AI. The original content accent will be come from a human. So if I lay out a paragraph from ChatGPT, did some heavy lifting on some facts, I check the facts, save me about maybe- >> Yeah, it's productive. >> An hour writing, and then I write a killer two, three sentences of, like, sharp original thinking or critical analysis. I then took that body of work, open source content, and then laid something on top of it. >> And Sarbjeet's example is a good one, because like if the calculator kids don't do math as well anymore, the slide rule, remember we had slide rules as kids, remember we first started using Waze, you know, we were this minority and you had an advantage over other drivers. Now Waze is like, you know, social traffic, you know, navigation, everybody had, you know- >> All the back roads are crowded. >> They're car crowded. (group laughs) Exactly. All right, let's, let's move on. What about this notion that futurist Ray Amara put forth and really Amara's Law that we're showing here, it's, the law is we, you know, "We tend to overestimate the effect of technology in the short run and underestimate it in the long run." Is that the case, do you think, with ChatGPT? What do you think Sarbjeet? >> I think that's true actually. There's a lot of, >> We don't debate this. >> There's a lot of awe, like when people see the results from ChatGPT, they say what, what the heck? Like, it can do this? But then if you use it more and more and more, and I ask the set of similar question, not the same question, and it gives you like same answer. It's like reading from the same bucket of text in, the interior read (indistinct) where the ChatGPT, you will see that in some couple of segments. It's very, it sounds so boring that the ChatGPT is coming out the same two sentences every time. So it is kind of good, but it's not as good as people think it is right now. But we will have, go through this, you know, hype sort of cycle and get realistic with it. And then in the long term, I think it's a great thing in the short term, it's not something which will (indistinct) >> What's your counter point? You're saying it's not. >> I, no I think the question was, it's hyped up in the short term and not it's underestimated long term. That's what I think what he said, quote. >> Yes, yeah. That's what he said. >> Okay, I think that's wrong with this, because this is a unique, ChatGPT is a unique kind of impact and it's very generational. People have been comparing it, I have been comparing to the internet, like the web, web browser Mosaic and Netscape, right, Navigator. I mean, I clearly still remember the days seeing Navigator for the first time, wow. And there weren't not many sites you could go to, everyone typed in, you know, cars.com, you know. >> That (indistinct) wasn't that overestimated, the overhyped at the beginning and underestimated. >> No, it was, it was underestimated long run, people thought. >> But that Amara's law. >> That's what is. >> No, they said overestimated? >> Overestimated near term underestimated- overhyped near term, underestimated long term. I got, right I mean? >> Well, I, yeah okay, so I would then agree, okay then- >> We were off the charts about the internet in the early days, and it actually exceeded our expectations. >> Well there were people who were, like, poo-pooing it early on. So when the browser came out, people were like, "Oh, the web's a toy for kids." I mean, in 1995 the web was a joke, right? So '96, you had online populations growing, so you had structural changes going on around the browser, internet population. And then that replaced other things, direct mail, other business activities that were once analog then went to the web, kind of read only as you, as we always talk about. So I think that's a moment where the hype long term, the smart money, and the smart industry experts all get the long term. And in this case, there's more poo-pooing in the short term. "Ah, it's not a big deal, it's just AI." I've heard many people poo-pooing ChatGPT, and a lot of smart people saying, "No this is next gen, this is different and it's only going to get better." So I think people are estimating a big long game on this one. >> So you're saying it's bifurcated. There's those who say- >> Yes. >> Okay, all right, let's get to the heart of the premise, and possibly the debate for today's episode. Will OpenAI's early entry into the market confer sustainable competitive advantage for the company. And if you look at the history of tech, the technology industry, it's kind of littered with first mover failures. Altair, IBM, Tandy, Commodore, they and Apple even, they were really early in the PC game. They took a backseat to Dell who came in the scene years later with a better business model. Netscape, you were just talking about, was all the rage in Silicon Valley, with the first browser, drove up all the housing prices out here. AltaVista was the first search engine to really, you know, index full text. >> Owned by Dell, I mean DEC. >> Owned by Digital. >> Yeah, Digital Equipment >> Compaq bought it. And of course as an aside, Digital, they wanted to showcase their hardware, right? Their super computer stuff. And then so Friendster and MySpace, they came before Facebook. The iPhone certainly wasn't the first mobile device. So lots of failed examples, but there are some recent successes like AWS and cloud. >> You could say smartphone. So I mean. >> Well I know, and you can, we can parse this so we'll debate it. Now Twitter, you could argue, had first mover advantage. You kind of gave me that one John. Bitcoin and crypto clearly had first mover advantage, and sustaining that. Guys, will OpenAI make it to the list on the right with ChatGPT, what do you think? >> I think categorically as a company, it probably won't, but as a category, I think what they're doing will, so OpenAI as a company, they get funding, there's power dynamics involved. Microsoft put a billion dollars in early on, then they just pony it up. Now they're reporting 10 billion more. So, like, if the browsers, Microsoft had competitive advantage over Netscape, and used monopoly power, and convicted by the Department of Justice for killing Netscape with their monopoly, Netscape should have had won that battle, but Microsoft killed it. In this case, Microsoft's not killing it, they're buying into it. So I think the embrace extend Microsoft power here makes OpenAI vulnerable for that one vendor solution. So the AI as a company might not make the list, but the category of what this is, large language model AI, is probably will be on the right hand side. >> Okay, we're going to come back to the government intervention and maybe do some comparisons, but what are your thoughts on this premise here? That, it will basically set- put forth the premise that it, that ChatGPT, its early entry into the market will not confer competitive advantage to >> For OpenAI. >> To Open- Yeah, do you agree with that? >> I agree with that actually. It, because Google has been at it, and they have been holding back, as John said because of the scrutiny from the Fed, right, so- >> And privacy too. >> And the privacy and the accuracy as well. But I think Sam Altman and the company on those guys, right? They have put this in a hasty way out there, you know, because it makes mistakes, and there are a lot of questions around the, sort of, where the content is coming from. You saw that as your example, it just stole the content, and without your permission, you know? >> Yeah. So as quick this aside- >> And it codes on people's behalf and the, those codes are wrong. So there's a lot of, sort of, false information it's putting out there. So it's a very vulnerable thing to do what Sam Altman- >> So even though it'll get better, others will compete. >> So look, just side note, a term which Reid Hoffman used a little bit. Like he said, it's experimental launch, like, you know, it's- >> It's pretty damn good. >> It is clever because according to Sam- >> It's more than clever. It's good. >> It's awesome, if you haven't used it. I mean you write- you read what it writes and you go, "This thing writes so well, it writes so much better than you." >> The human emotion drives that too. I think that's a big thing. But- >> I Want to add one more- >> Make your last point. >> Last one. Okay. So, but he's still holding back. He's conducting quite a few interviews. If you want to get the gist of it, there's an interview with StrictlyVC interview from yesterday with Sam Altman. Listen to that one it's an eye opening what they want- where they want to take it. But my last one I want to make it on this point is that Satya Nadella yesterday did an interview with Wall Street Journal. I think he was doing- >> You were not impressed. >> I was not impressed because he was pushing it too much. So Sam Altman's holding back so there's less backlash. >> Got 10 billion reasons to push. >> I think he's almost- >> Microsoft just laid off 10000 people. Hey ChatGPT, find me a job. You know like. (group laughs) >> He's overselling it to an extent that I think it will backfire on Microsoft. And he's over promising a lot of stuff right now, I think. I don't know why he's very jittery about all these things. And he did the same thing during Ignite as well. So he said, "Oh, this AI will write code for you and this and that." Like you called him out- >> The hyperbole- >> During your- >> from Satya Nadella, he's got a lot of hyperbole. (group talks over each other) >> All right, Let's, go ahead. >> Well, can I weigh in on the whole- >> Yeah, sure. >> Microsoft thing on whether OpenAI, here's the take on this. I think it's more like the browser moment to me, because I could relate to that experience with ChatG, personally, emotionally, when I saw that, and I remember vividly- >> You mean that aha moment (indistinct). >> Like this is obviously the future. Anything else in the old world is dead, website's going to be everywhere. It was just instant dot connection for me. And a lot of other smart people who saw this. Lot of people by the way, didn't see it. Someone said the web's a toy. At the company I was worked for at the time, Hewlett Packard, they like, they could have been in, they had invented HTML, and so like all this stuff was, like, they just passed, the web was just being passed over. But at that time, the browser got better, more websites came on board. So the structural advantage there was online web usage was growing, online user population. So that was growing exponentially with the rise of the Netscape browser. So OpenAI could stay on the right side of your list as durable, if they leverage the category that they're creating, can get the scale. And if they can get the scale, just like Twitter, that failed so many times that they still hung around. So it was a product that was always successful, right? So I mean, it should have- >> You're right, it was terrible, we kept coming back. >> The fail whale, but it still grew. So OpenAI has that moment. They could do it if Microsoft doesn't meddle too much with too much power as a vendor. They could be the Netscape Navigator, without the anti-competitive behavior of somebody else. So to me, they have the pole position. So they have an opportunity. So if not, if they don't execute, then there's opportunity. There's not a lot of barriers to entry, vis-a-vis say the CapEx of say a cloud company like AWS. You can't replicate that, Many have tried, but I think you can replicate OpenAI. >> And we're going to talk about that. Okay, so real quick, I want to bring in some ETR data. This isn't an ETR heavy segment, only because this so new, you know, they haven't coverage yet, but they do cover AI. So basically what we're seeing here is a slide on the vertical axis's net score, which is a measure of spending momentum, and in the horizontal axis's is presence in the dataset. Think of it as, like, market presence. And in the insert right there, you can see how the dots are plotted, the two columns. And so, but the key point here that we want to make, there's a bunch of companies on the left, is he like, you know, DataRobot and C3 AI and some others, but the big whales, Google, AWS, Microsoft, are really dominant in this market. So that's really the key takeaway that, can we- >> I notice IBM is way low. >> Yeah, IBM's low, and actually bring that back up and you, but then you see Oracle who actually is injecting. So I guess that's the other point is, you're not necessarily going to go buy AI, and you know, build your own AI, you're going to, it's going to be there and, it, Salesforce is going to embed it into its platform, the SaaS companies, and you're going to purchase AI. You're not necessarily going to build it. But some companies obviously are. >> I mean to quote IBM's general manager Rob Thomas, "You can't have AI with IA." information architecture and David Flynn- >> You can't Have AI without IA >> without, you can't have AI without IA. You can't have, if you have an Information Architecture, you then can power AI. Yesterday David Flynn, with Hammersmith, was on our Supercloud. He was pointing out that the relationship of storage, where you store things, also impacts the data and stressablity, and Zhamak from Nextdata, she was pointing out that same thing. So the data problem factors into all this too, Dave. >> So you got the big cloud and internet giants, they're all poised to go after this opportunity. Microsoft is investing up to 10 billion. Google's code red, which was, you know, the headline in the New York Times. Of course Apple is there and several alternatives in the market today. Guys like Chinchilla, Bloom, and there's a company Jasper and several others, and then Lena Khan looms large and the government's around the world, EU, US, China, all taking notice before the market really is coalesced around a single player. You know, John, you mentioned Netscape, they kind of really, the US government was way late to that game. It was kind of game over. And Netscape, I remember Barksdale was like, "Eh, we're going to be selling software in the enterprise anyway." and then, pshew, the company just dissipated. So, but it looks like the US government, especially with Lena Khan, they're changing the definition of antitrust and what the cause is to go after people, and they're really much more aggressive. It's only what, two years ago that (indistinct). >> Yeah, the problem I have with the federal oversight is this, they're always like late to the game, and they're slow to catch up. So in other words, they're working on stuff that should have been solved a year and a half, two years ago around some of the social networks hiding behind some of the rules around open web back in the days, and I think- >> But they're like 15 years late to that. >> Yeah, and now they got this new thing on top of it. So like, I just worry about them getting their fingers. >> But there's only two years, you know, OpenAI. >> No, but the thing (indistinct). >> No, they're still fighting other battles. But the problem with government is that they're going to label Big Tech as like a evil thing like Pharma, it's like smoke- >> You know Lena Khan wants to kill Big Tech, there's no question. >> So I think Big Tech is getting a very seriously bad rap. And I think anything that the government does that shades darkness on tech, is politically motivated in most cases. You can almost look at everything, and my 80 20 rule is in play here. 80% of the government activity around tech is bullshit, it's politically motivated, and the 20% is probably relevant, but off the mark and not organized. >> Well market forces have always been the determining factor of success. The governments, you know, have been pretty much failed. I mean you look at IBM's antitrust, that, what did that do? The market ultimately beat them. You look at Microsoft back in the day, right? Windows 95 was peaking, the government came in. But you know, like you said, they missed the web, right, and >> so they were hanging on- >> There's nobody in government >> to Windows. >> that actually knows- >> And so, you, I think you're right. It's market forces that are going to determine this. But Sarbjeet, what do you make of Microsoft's big bet here, you weren't impressed with with Nadella. How do you think, where are they going to apply it? Is this going to be a Hail Mary for Bing, or is it going to be applied elsewhere? What do you think. >> They are saying that they will, sort of, weave this into their products, office products, productivity and also to write code as well, developer productivity as well. That's a big play for them. But coming back to your antitrust sort of comments, right? I believe the, your comment was like, oh, fed was late 10 years or 15 years earlier, but now they're two years. But things are moving very fast now as compared to they used to move. >> So two years is like 10 Years. >> Yeah, two years is like 10 years. Just want to make that point. (Dave laughs) This thing is going like wildfire. Any new tech which comes in that I think they're going against distribution channels. Lina Khan has commented time and again that the marketplace model is that she wants to have some grip on. Cloud marketplaces are a kind of monopolistic kind of way. >> I don't, I don't see this, I don't see a Chat AI. >> You told me it's not Bing, you had an interesting comment. >> No, no. First of all, this is great from Microsoft. If you're Microsoft- >> Why? >> Because Microsoft doesn't have the AI chops that Google has, right? Google is got so much core competency on how they run their search, how they run their backends, their cloud, even though they don't get a lot of cloud market share in the enterprise, they got a kick ass cloud cause they needed one. >> Totally. >> They've invented SRE. I mean Google's development and engineering chops are off the scales, right? Amazon's got some good chops, but Google's got like 10 times more chops than AWS in my opinion. Cloud's a whole different story. Microsoft gets AI, they get a playbook, they get a product they can render into, the not only Bing, productivity software, helping people write papers, PowerPoint, also don't forget the cloud AI can super help. We had this conversation on our Supercloud event, where AI's going to do a lot of the heavy lifting around understanding observability and managing service meshes, to managing microservices, to turning on and off applications, and or maybe writing code in real time. So there's a plethora of use cases for Microsoft to deploy this. combined with their R and D budgets, they can then turbocharge more research, build on it. So I think this gives them a car in the game, Google may have pole position with AI, but this puts Microsoft right in the game, and they already have a lot of stuff going on. But this just, I mean everything gets lifted up. Security, cloud, productivity suite, everything. >> What's under the hood at Google, and why aren't they talking about it? I mean they got to be freaked out about this. No? Or do they have kind of a magic bullet? >> I think they have the, they have the chops definitely. Magic bullet, I don't know where they are, as compared to the ChatGPT 3 or 4 models. Like they, but if you look at the online sort of activity and the videos put out there from Google folks, Google technology folks, that's account you should look at if you are looking there, they have put all these distinctions what ChatGPT 3 has used, they have been talking about for a while as well. So it's not like it's a secret thing that you cannot replicate. As you said earlier, like in the beginning of this segment, that anybody who has more data and the capacity to process that data, which Google has both, I think they will win this. >> Obviously living in Palo Alto where the Google founders are, and Google's headquarters next town over we have- >> We're so close to them. We have inside information on some of the thinking and that hasn't been reported by any outlet yet. And that is, is that, from what I'm hearing from my sources, is Google has it, they don't want to release it for many reasons. One is it might screw up their search monopoly, one, two, they're worried about the accuracy, 'cause Google will get sued. 'Cause a lot of people are jamming on this ChatGPT as, "Oh it does everything for me." when it's clearly not a hundred percent accurate all the time. >> So Lina Kahn is looming, and so Google's like be careful. >> Yeah so Google's just like, this is the third, could be a third rail. >> But the first thing you said is a concern. >> Well no. >> The disruptive (indistinct) >> What they will do is do a Waymo kind of thing, where they spin out a separate company. >> They're doing that. >> The discussions happening, they're going to spin out the separate company and put it over there, and saying, "This is AI, got search over there, don't touch that search, 'cause that's where all the revenue is." (chuckles) >> So, okay, so that's how they deal with the Clay Christensen dilemma. What's the business model here? I mean it's not advertising, right? Is it to charge you for a query? What, how do you make money at this? >> It's a good question, I mean my thinking is, first of all, it's cool to type stuff in and see a paper get written, or write a blog post, or gimme a marketing slogan for this or that or write some code. I think the API side of the business will be critical. And I think Howie Xu, I know you're going to reference some of his comments yesterday on Supercloud, I think this brings a whole 'nother user interface into technology consumption. I think the business model, not yet clear, but it will probably be some sort of either API and developer environment or just a straight up free consumer product, with some sort of freemium backend thing for business. >> And he was saying too, it's natural language is the way in which you're going to interact with these systems. >> I think it's APIs, it's APIs, APIs, APIs, because these people who are cooking up these models, and it takes a lot of compute power to train these and to, for inference as well. Somebody did the analysis on the how many cents a Google search costs to Google, and how many cents the ChatGPT query costs. It's, you know, 100x or something on that. You can take a look at that. >> A 100x on which side? >> You're saying two orders of magnitude more expensive for ChatGPT >> Much more, yeah. >> Than for Google. >> It's very expensive. >> So Google's got the data, they got the infrastructure and they got, you're saying they got the cost (indistinct) >> No actually it's a simple query as well, but they are trying to put together the answers, and they're going through a lot more data versus index data already, you know. >> Let me clarify, you're saying that Google's version of ChatGPT is more efficient? >> No, I'm, I'm saying Google search results. >> Ah, search results. >> What are used to today, but cheaper. >> But that, does that, is that going to confer advantage to Google's large language (indistinct)? >> It will, because there were deep science (indistinct). >> Google, I don't think Google search is doing a large language model on their search, it's keyword search. You know, what's the weather in Santa Cruz? Or how, what's the weather going to be? Or you know, how do I find this? Now they have done a smart job of doing some things with those queries, auto complete, re direct navigation. But it's, it's not entity. It's not like, "Hey, what's Dave Vellante thinking this week in Breaking Analysis?" ChatGPT might get that, because it'll get your Breaking Analysis, it'll synthesize it. There'll be some, maybe some clips. It'll be like, you know, I mean. >> Well I got to tell you, I asked ChatGPT to, like, I said, I'm going to enter a transcript of a discussion I had with Nir Zuk, the CTO of Palo Alto Networks, And I want you to write a 750 word blog. I never input the transcript. It wrote a 750 word blog. It attributed quotes to him, and it just pulled a bunch of stuff that, and said, okay, here it is. It talked about Supercloud, it defined Supercloud. >> It's made, it makes you- >> Wow, But it was a big lie. It was fraudulent, but still, blew me away. >> Again, vanilla content and non accurate content. So we are going to see a surge of misinformation on steroids, but I call it the vanilla content. Wow, that's just so boring, (indistinct). >> There's so many dangers. >> Make your point, cause we got to, almost out of time. >> Okay, so the consumption, like how do you consume this thing. As humans, we are consuming it and we are, like, getting a nicely, like, surprisingly shocked, you know, wow, that's cool. It's going to increase productivity and all that stuff, right? And on the danger side as well, the bad actors can take hold of it and create fake content and we have the fake sort of intelligence, if you go out there. So that's one thing. The second thing is, we are as humans are consuming this as language. Like we read that, we listen to it, whatever format we consume that is, but the ultimate usage of that will be when the machines can take that output from likes of ChatGPT, and do actions based on that. The robots can work, the robot can paint your house, we were talking about, right? Right now we can't do that. >> Data apps. >> So the data has to be ingested by the machines. It has to be digestible by the machines. And the machines cannot digest unorganized data right now, we will get better on the ingestion side as well. So we are getting better. >> Data, reasoning, insights, and action. >> I like that mall, paint my house. >> So, okay- >> By the way, that means drones that'll come in. Spray painting your house. >> Hey, it wasn't too long ago that robots couldn't climb stairs, as I like to point out. Okay, and of course it's no surprise the venture capitalists are lining up to eat at the trough, as I'd like to say. Let's hear, you'd referenced this earlier, John, let's hear what AI expert Howie Xu said at the Supercloud event, about what it takes to clone ChatGPT. Please, play the clip. >> So one of the VCs actually asked me the other day, right? "Hey, how much money do I need to spend, invest to get a, you know, another shot to the openAI sort of the level." You know, I did a (indistinct) >> Line up. >> A hundred million dollar is the order of magnitude that I came up with, right? You know, not a billion, not 10 million, right? So a hundred- >> Guys a hundred million dollars, that's an astoundingly low figure. What do you make of it? >> I was in an interview with, I was interviewing, I think he said hundred million or so, but in the hundreds of millions, not a billion right? >> You were trying to get him up, you were like "Hundreds of millions." >> Well I think, I- >> He's like, eh, not 10, not a billion. >> Well first of all, Howie Xu's an expert machine learning. He's at Zscaler, he's a machine learning AI guy. But he comes from VMware, he's got his technology pedigrees really off the chart. Great friend of theCUBE and kind of like a CUBE analyst for us. And he's smart. He's right. I think the barriers to entry from a dollar standpoint are lower than say the CapEx required to compete with AWS. Clearly, the CapEx spending to build all the tech for the run a cloud. >> And you don't need a huge sales force. >> And in some case apps too, it's the same thing. But I think it's not that hard. >> But am I right about that? You don't need a huge sales force either. It's, what, you know >> If the product's good, it will sell, this is a new era. The better mouse trap will win. This is the new economics in software, right? So- >> Because you look at the amount of money Lacework, and Snyk, Snowflake, Databrooks. Look at the amount of money they've raised. I mean it's like a billion dollars before they get to IPO or more. 'Cause they need promotion, they need go to market. You don't need (indistinct) >> OpenAI's been working on this for multiple five years plus it's, hasn't, wasn't born yesterday. Took a lot of years to get going. And Sam is depositioning all the success, because he's trying to manage expectations, To your point Sarbjeet, earlier. It's like, yeah, he's trying to "Whoa, whoa, settle down everybody, (Dave laughs) it's not that great." because he doesn't want to fall into that, you know, hero and then get taken down, so. >> It may take a 100 million or 150 or 200 million to train the model. But to, for the inference to, yeah to for the inference machine, It will take a lot more, I believe. >> Give it, so imagine, >> Because- >> Go ahead, sorry. >> Go ahead. But because it consumes a lot more compute cycles and it's certain level of storage and everything, right, which they already have. So I think to compute is different. To frame the model is a different cost. But to run the business is different, because I think 100 million can go into just fighting the Fed. >> Well there's a flywheel too. >> Oh that's (indistinct) >> (indistinct) >> We are running the business, right? >> It's an interesting number, but it's also kind of, like, context to it. So here, a hundred million spend it, you get there, but you got to factor in the fact that the ways companies win these days is critical mass scale, hitting a flywheel. If they can keep that flywheel of the value that they got going on and get better, you can almost imagine a marketplace where, hey, we have proprietary data, we're SiliconANGLE in theCUBE. We have proprietary content, CUBE videos, transcripts. Well wouldn't it be great if someone in a marketplace could sell a module for us, right? We buy that, Amazon's thing and things like that. So if they can get a marketplace going where you can apply to data sets that may be proprietary, you can start to see this become bigger. And so I think the key barriers to entry is going to be success. I'll give you an example, Reddit. Reddit is successful and it's hard to copy, not because of the software. >> They built the moat. >> Because you can, buy Reddit open source software and try To compete. >> They built the moat with their community. >> Their community, their scale, their user expectation. Twitter, we referenced earlier, that thing should have gone under the first two years, but there was such a great emotional product. People would tolerate the fail whale. And then, you know, well that was a whole 'nother thing. >> Then a plane landed in (John laughs) the Hudson and it was over. >> I think verticals, a lot of verticals will build applications using these models like for lawyers, for doctors, for scientists, for content creators, for- >> So you'll have many hundreds of millions of dollars investments that are going to be seeping out. If, all right, we got to wrap, if you had to put odds on it that that OpenAI is going to be the leader, maybe not a winner take all leader, but like you look at like Amazon and cloud, they're not winner take all, these aren't necessarily winner take all markets. It's not necessarily a zero sum game, but let's call it winner take most. What odds would you give that open AI 10 years from now will be in that position. >> If I'm 0 to 10 kind of thing? >> Yeah, it's like horse race, 3 to 1, 2 to 1, even money, 10 to 1, 50 to 1. >> Maybe 2 to 1, >> 2 to 1, that's pretty low odds. That's basically saying they're the favorite, they're the front runner. Would you agree with that? >> I'd say 4 to 1. >> Yeah, I was going to say I'm like a 5 to 1, 7 to 1 type of person, 'cause I'm a skeptic with, you know, there's so much competition, but- >> I think they're definitely the leader. I mean you got to say, I mean. >> Oh there's no question. There's no question about it. >> The question is can they execute? >> They're not Friendster, is what you're saying. >> They're not Friendster and they're more like Twitter and Reddit where they have momentum. If they can execute on the product side, and if they don't stumble on that, they will continue to have the lead. >> If they say stay neutral, as Sam is, has been saying, that, hey, Microsoft is one of our partners, if you look at their company model, how they have structured the company, then they're going to pay back to the investors, like Microsoft is the biggest one, up to certain, like by certain number of years, they're going to pay back from all the money they make, and after that, they're going to give the money back to the public, to the, I don't know who they give it to, like non-profit or something. (indistinct) >> Okay, the odds are dropping. (group talks over each other) That's a good point though >> Actually they might have done that to fend off the criticism of this. But it's really interesting to see the model they have adopted. >> The wildcard in all this, My last word on this is that, if there's a developer shift in how developers and data can come together again, we have conferences around the future of data, Supercloud and meshs versus, you know, how the data world, coding with data, how that evolves will also dictate, 'cause a wild card could be a shift in the landscape around how developers are using either machine learning or AI like techniques to code into their apps, so. >> That's fantastic insight. I can't thank you enough for your time, on the heels of Supercloud 2, really appreciate it. All right, thanks to John and Sarbjeet for the outstanding conversation today. Special thanks to the Palo Alto studio team. My goodness, Anderson, this great backdrop. You guys got it all out here, I'm jealous. And Noah, really appreciate it, Chuck, Andrew Frick and Cameron, Andrew Frick switching, Cameron on the video lake, great job. And Alex Myerson, he's on production, manages the podcast for us, Ken Schiffman as well. Kristen Martin and Cheryl Knight help get the word out on social media and our newsletters. Rob Hof is our editor-in-chief over at SiliconANGLE, does some great editing, thanks to all. Remember, all these episodes are available as podcasts. All you got to do is search Breaking Analysis podcast, wherever you listen. Publish each week on wikibon.com and siliconangle.com. Want to get in touch, email me directly, david.vellante@siliconangle.com or DM me at dvellante, or comment on our LinkedIn post. And by all means, check out etr.ai. They got really great survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, We'll see you next time on Breaking Analysis. (electronic music)

Published Date : Jan 20 2023

SUMMARY :

bringing you data-driven and ChatGPT have taken the world by storm. So I asked it, give it to the large language models to do that. So to your point, it's So one of the problems with ChatGPT, and he simply gave the system the prompts, or the OS to help it do but it kind of levels the playing- and the answers were coming as the data you can get. Yeah, and leveled to certain extent. I check the facts, save me about maybe- and then I write a killer because like if the it's, the law is we, you know, I think that's true and I ask the set of similar question, What's your counter point? and not it's underestimated long term. That's what he said. for the first time, wow. the overhyped at the No, it was, it was I got, right I mean? the internet in the early days, and it's only going to get better." So you're saying it's bifurcated. and possibly the debate the first mobile device. So I mean. on the right with ChatGPT, and convicted by the Department of Justice the scrutiny from the Fed, right, so- And the privacy and thing to do what Sam Altman- So even though it'll get like, you know, it's- It's more than clever. I mean you write- I think that's a big thing. I think he was doing- I was not impressed because You know like. And he did the same thing he's got a lot of hyperbole. the browser moment to me, So OpenAI could stay on the right side You're right, it was terrible, They could be the Netscape Navigator, and in the horizontal axis's So I guess that's the other point is, I mean to quote IBM's So the data problem factors and the government's around the world, and they're slow to catch up. Yeah, and now they got years, you know, OpenAI. But the problem with government to kill Big Tech, and the 20% is probably relevant, back in the day, right? are they going to apply it? and also to write code as well, that the marketplace I don't, I don't see you had an interesting comment. No, no. First of all, the AI chops that Google has, right? are off the scales, right? I mean they got to be and the capacity to process that data, on some of the thinking So Lina Kahn is looming, and this is the third, could be a third rail. But the first thing What they will do out the separate company Is it to charge you for a query? it's cool to type stuff in natural language is the way and how many cents the and they're going through Google search results. It will, because there were It'll be like, you know, I mean. I never input the transcript. Wow, But it was a big lie. but I call it the vanilla content. Make your point, cause we And on the danger side as well, So the data By the way, that means at the Supercloud event, So one of the VCs actually What do you make of it? you were like "Hundreds of millions." not 10, not a billion. Clearly, the CapEx spending to build all But I think it's not that hard. It's, what, you know This is the new economics Look at the amount of And Sam is depositioning all the success, or 150 or 200 million to train the model. So I think to compute is different. not because of the software. Because you can, buy They built the moat And then, you know, well that the Hudson and it was over. that are going to be seeping out. Yeah, it's like horse race, 3 to 1, 2 to 1, that's pretty low odds. I mean you got to say, I mean. Oh there's no question. is what you're saying. and if they don't stumble on that, the money back to the public, to the, Okay, the odds are dropping. the model they have adopted. Supercloud and meshs versus, you know, on the heels of Supercloud

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
JohnPERSON

0.99+

SarbjeetPERSON

0.99+

Brian GracelyPERSON

0.99+

Lina KhanPERSON

0.99+

Dave VellantePERSON

0.99+

IBMORGANIZATION

0.99+

Reid HoffmanPERSON

0.99+

Alex MyersonPERSON

0.99+

Lena KhanPERSON

0.99+

Sam AltmanPERSON

0.99+

AppleORGANIZATION

0.99+

AWSORGANIZATION

0.99+

AmazonORGANIZATION

0.99+

Rob ThomasPERSON

0.99+

MicrosoftORGANIZATION

0.99+

Ken SchiffmanPERSON

0.99+

GoogleORGANIZATION

0.99+

David FlynnPERSON

0.99+

SamPERSON

0.99+

NoahPERSON

0.99+

Ray AmaraPERSON

0.99+

10 billionQUANTITY

0.99+

150QUANTITY

0.99+

Rob HofPERSON

0.99+

ChuckPERSON

0.99+

Palo AltoLOCATION

0.99+

Howie XuPERSON

0.99+

AndersonPERSON

0.99+

Cheryl KnightPERSON

0.99+

John FurrierPERSON

0.99+

Hewlett PackardORGANIZATION

0.99+

Santa CruzLOCATION

0.99+

1995DATE

0.99+

Lina KahnPERSON

0.99+

Zhamak DehghaniPERSON

0.99+

50 wordsQUANTITY

0.99+

Hundreds of millionsQUANTITY

0.99+

CompaqORGANIZATION

0.99+

10QUANTITY

0.99+

Kristen MartinPERSON

0.99+

two sentencesQUANTITY

0.99+

DavePERSON

0.99+

hundreds of millionsQUANTITY

0.99+

Satya NadellaPERSON

0.99+

CameronPERSON

0.99+

100 millionQUANTITY

0.99+

Silicon ValleyLOCATION

0.99+

one sentenceQUANTITY

0.99+

10 millionQUANTITY

0.99+

yesterdayDATE

0.99+

Clay ChristensenPERSON

0.99+

Sarbjeet JohalPERSON

0.99+

NetscapeORGANIZATION

0.99+

Breaking Analysis: Supercloud2 Explores Cloud Practitioner Realities & the Future of Data Apps


 

>> Narrator: From theCUBE Studios in Palo Alto and Boston bringing you data-driven insights from theCUBE and ETR. This is breaking analysis with Dave Vellante >> Enterprise tech practitioners, like most of us they want to make their lives easier so they can focus on delivering more value to their businesses. And to do so, they want to tap best of breed services in the public cloud, but at the same time connect their on-prem intellectual property to emerging applications which drive top line revenue and bottom line profits. But creating a consistent experience across clouds and on-prem estates has been an elusive capability for most organizations, forcing trade-offs and injecting friction into the system. The need to create seamless experiences is clear and the technology industry is starting to respond with platforms, architectures, and visions of what we've called the Supercloud. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this breaking analysis we give you a preview of Supercloud 2, the second event of its kind that we've had on the topic. Yes, folks that's right Supercloud 2 is here. As of this recording, it's just about four days away 33 guests, 21 sessions, combining live discussions and fireside chats from theCUBE's Palo Alto Studio with prerecorded conversations on the future of cloud and data. You can register for free at supercloud.world. And we are super excited about the Supercloud 2 lineup of guests whereas Supercloud 22 in August, was all about refining the definition of Supercloud testing its technical feasibility and understanding various deployment models. Supercloud 2 features practitioners, technologists and analysts discussing what customers need with real-world examples of Supercloud and will expose thinking around a new breed of cross-cloud apps, data apps, if you will that change the way machines and humans interact with each other. Now the example we'd use if you think about applications today, say a CRM system, sales reps, what are they doing? They're entering data into opportunities they're choosing products they're importing contacts, et cetera. And sure the machine can then take all that data and spit out a forecast by rep, by region, by product, et cetera. But today's applications are largely about filling in forms and or codifying processes. In the future, the Supercloud community sees a new breed of applications emerging where data resides on different clouds, in different data storages, databases, Lakehouse, et cetera. And the machine uses AI to inspect the e-commerce system the inventory data, supply chain information and other systems, and puts together a plan without any human intervention whatsoever. Think about a system that orchestrates people, places and things like an Uber for business. So at Supercloud 2, you'll hear about this vision along with some of today's challenges facing practitioners. Zhamak Dehghani, the founder of Data Mesh is a headliner. Kit Colbert also is headlining. He laid out at the first Supercloud an initial architecture for what that's going to look like. That was last August. And he's going to present his most current thinking on the topic. Veronika Durgin of Sachs will be featured and talk about data sharing across clouds and you know what she needs in the future. One of the main highlights of Supercloud 2 is a dive into Walmart's Supercloud. Other featured practitioners include Western Union Ionis Pharmaceuticals, Warner Media. We've got deep, deep technology dives with folks like Bob Muglia, David Flynn Tristan Handy of DBT Labs, Nir Zuk, the founder of Palo Alto Networks focused on security. Thomas Hazel, who's going to talk about a new type of database for Supercloud. It's several analysts including Keith Townsend Maribel Lopez, George Gilbert, Sanjeev Mohan and so many more guests, we don't have time to list them all. They're all up on supercloud.world with a full agenda, so you can check that out. Now let's take a look at some of the things that we're exploring in more detail starting with the Walmart Cloud native platform, they call it WCNP. We definitely see this as a Supercloud and we dig into it with Jack Greenfield. He's the head of architecture at Walmart. Here's a quote from Jack. "WCNP is an implementation of Kubernetes for the Walmart ecosystem. We've taken Kubernetes off the shelf as open source." By the way, they do the same thing with OpenStack. "And we have integrated it with a number of foundational services that provide other aspects of our computational environment. Kubernetes off the shelf doesn't do everything." And so what Walmart chose to do, they took a do-it-yourself approach to build a Supercloud for a variety of reasons that Jack will explain, along with Walmart's so-called triplet architecture connecting on-prem, Azure and GCP. No surprise, there's no Amazon at Walmart for obvious reasons. And what they do is they create a common experience for devs across clouds. Jack is going to talk about how Walmart is evolving its Supercloud in the future. You don't want to miss that. Now, next, let's take a look at how Veronica Durgin of SAKS thinks about data sharing across clouds. Data sharing we think is a potential killer use case for Supercloud. In fact, let's hear it in Veronica's own words. Please play the clip. >> How do we talk to each other? And more importantly, how do we data share? You know, I work with data, you know this is what I do. So if you know I want to get data from a company that's using, say Google, how do we share it in a smooth way where it doesn't have to be this crazy I don't know, SFTP file moving? So that's where I think Supercloud comes to me in my mind, is like practical applications. How do we create that mesh, that network that we can easily share data with each other? >> Now data mesh is a possible architectural approach that will enable more facile data sharing and the monetization of data products. You'll hear Zhamak Dehghani live in studio talking about what standards are missing to make this vision a reality across the Supercloud. Now one of the other things that we're really excited about is digging deeper into the right approach for Supercloud adoption. And we're going to share a preview of a debate that's going on right now in the community. Bob Muglia, former CEO of Snowflake and Microsoft Exec was kind enough to spend some time looking at the community's supercloud definition and he felt that it needed to be simplified. So in near real time he came up with the following definition that we're showing here. I'll read it. "A Supercloud is a platform that provides programmatically consistent services hosted on heterogeneous cloud providers." So not only did Bob simplify the initial definition he's stressed that the Supercloud is a platform versus an architecture implying that the platform provider eg Snowflake, VMware, Databricks, Cohesity, et cetera is responsible for determining the architecture. Now interestingly in the shared Google doc that the working group uses to collaborate on the supercloud de definition, Dr. Nelu Mihai who is actually building a Supercloud responded as follows to Bob's assertion "We need to avoid creating many Supercloud platforms with their own architectures. If we do that, then we create other proprietary clouds on top of existing ones. We need to define an architecture of how Supercloud interfaces with all other clouds. What is the information model? What is the execution model and how users will interact with Supercloud?" What does this seemingly nuanced point tell us and why does it matter? Well, history suggests that de facto standards will emerge more quickly to resolve real world practitioner problems and catch on more quickly than consensus-based architectures and standards-based architectures. But in the long run, the ladder may serve customers better. So we'll be exploring this topic in more detail in Supercloud 2, and of course we'd love to hear what you think platform, architecture, both? Now one of the real technical gurus that we'll have in studio at Supercloud two is David Flynn. He's one of the people behind the the movement that enabled enterprise flash adoption, that craze. And he did that with Fusion IO and he is now working on a system to enable read write data access to any user in any application in any data center or on any cloud anywhere. So think of this company as a Supercloud enabler. Allow me to share an excerpt from a conversation David Flore and I had with David Flynn last year. He as well gave a lot of thought to the Supercloud definition and was really helpful with an opinionated point of view. He said something to us that was, we thought relevant. "What is the operating system for a decentralized cloud? The main two functions of an operating system or an operating environment are one the process scheduler and two, the file system. The strongest argument for supercloud is made when you go down to the platform layer and talk about it as an operating environment on which you can run all forms of applications." So a couple of implications here that will be exploring with David Flynn in studio. First we're inferring from his comment that he's in the platform camp where the platform owner is responsible for the architecture and there are obviously trade-offs there and benefits but we'll have to clarify that with him. And second, he's basically saying, you kill the concept the further you move up the stack. So the weak, the further you move the stack the weaker the supercloud argument becomes because it's just becoming SaaS. Now this is something we're going to explore to better understand is thinking on this, but also whether the existing notion of SaaS is changing and whether or not a new breed of Supercloud apps will emerge. Which brings us to this really interesting fellow that George Gilbert and I RIFed with ahead of Supercloud two. Tristan Handy, he's the founder and CEO of DBT Labs and he has a highly opinionated and technical mind. Here's what he said, "One of the things that we still don't know how to API-ify is concepts that live inside of your data warehouse inside of your data lake. These are core concepts that the business should be able to create applications around very easily. In fact, that's not the case because it involves a lot of data engineering pipeline and other work to make these available. So if you really want to make it easy to create these data experiences for users you need to have an ability to describe these metrics and then to turn them into APIs to make them accessible to application developers who have literally no idea how they're calculated behind the scenes and they don't need to." A lot of implications to this statement that will explore at Supercloud two versus Jamma Dani's data mesh comes into play here with her critique of hyper specialized data pipeline experts with little or no domain knowledge. Also the need for simplified self-service infrastructure which Kit Colbert is likely going to touch upon. Veronica Durgin of SAKS and her ideal state for data shearing along with Harveer Singh of Western Union. They got to deal with 200 locations around the world in data privacy issues, data sovereignty how do you share data safely? Same with Nick Taylor of Ionis Pharmaceutical. And not to blow your mind but Thomas Hazel and Bob Muglia deposit that to make data apps a reality across the Supercloud you have to rethink everything. You can't just let in memory databases and caching architectures take care of everything in a brute force manner. Rather you have to get down to really detailed levels even things like how data is laid out on disk, ie flash and think about rewriting applications for the Supercloud and the MLAI era. All of this and more at Supercloud two which wouldn't be complete without some data. So we pinged our friends from ETR Eric Bradley and Darren Bramberm to see if they had any data on Supercloud that we could tap. And so we're going to be analyzing a number of the players as well at Supercloud two. Now, many of you are familiar with this graphic here we show some of the players involved in delivering or enabling Supercloud-like capabilities. On the Y axis is spending momentum and on the horizontal accesses market presence or pervasiveness in the data. So netscore versus what they call overlap or end in the data. And the table insert shows how the dots are plotted now not to steal ETR's thunder but the first point is you really can't have supercloud without the hyperscale cloud platforms which is shown on this graphic. But the exciting aspect of Supercloud is the opportunity to build value on top of that hyperscale infrastructure. Snowflake here continues to show strong spending velocity as those Databricks, Hashi, Rubrik. VMware Tanzu, which we all put under the magnifying glass after the Broadcom announcements, is also showing momentum. Unfortunately due to a scheduling conflict we weren't able to get Red Hat on the program but they're clearly a player here. And we've put Cohesity and Veeam on the chart as well because backup is a likely use case across clouds and on-premises. And now one other call out that we drill down on at Supercloud two is CloudFlare, which actually uses the term supercloud maybe in a different way. They look at Supercloud really as you know, serverless on steroids. And so the data brains at ETR will have more to say on this topic at Supercloud two along with many others. Okay, so why should you attend Supercloud two? What's in it for me kind of thing? So first of all, if you're a practitioner and you want to understand what the possibilities are for doing cross-cloud services for monetizing data how your peers are doing data sharing, how some of your peers are actually building out a Supercloud you're going to get real world input from practitioners. If you're a technologist, you're trying to figure out various ways to solve problems around data, data sharing, cross-cloud service deployment there's going to be a number of deep technology experts that are going to share how they're doing it. We're also going to drill down with Walmart into a practical example of Supercloud with some other examples of how practitioners are dealing with cross-cloud complexity. Some of them, by the way, are kind of thrown up their hands and saying, Hey, we're going mono cloud. And we'll talk about the potential implications and dangers and risks of doing that. And also some of the benefits. You know, there's a question, right? Is Supercloud the same wine new bottle or is it truly something different that can drive substantive business value? So look, go to Supercloud.world it's January 17th at 9:00 AM Pacific. You can register for free and participate directly in the program. Okay, that's a wrap. I want to give a shout out to the Supercloud supporters. VMware has been a great partner as our anchor sponsor Chaos Search Proximo, and Alura as well. For contributing to the effort I want to thank Alex Myerson who's on production and manages the podcast. Ken Schiffman is his supporting cast as well. Kristen Martin and Cheryl Knight to help get the word out on social media and at our newsletters. And Rob Ho is our editor-in-chief over at Silicon Angle. Thank you all. Remember, these episodes are all available as podcast. Wherever you listen we really appreciate the support that you've given. We just saw some stats from from Buzz Sprout, we hit the top 25% we're almost at 400,000 downloads last year. So really appreciate your participation. All you got to do is search Breaking Analysis podcast and you'll find those I publish each week on wikibon.com and siliconangle.com. Or if you want to get ahold of me you can email me directly at David.Vellante@siliconangle.com or dm me DVellante or comment on our LinkedIn post. I want you to check out etr.ai. They've got the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching. We'll see you next week at Supercloud two or next time on breaking analysis. (light music)

Published Date : Jan 14 2023

SUMMARY :

with Dave Vellante of the things that we're So if you know I want to get data and on the horizontal

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Bob MugliaPERSON

0.99+

Alex MyersonPERSON

0.99+

Cheryl KnightPERSON

0.99+

David FlynnPERSON

0.99+

VeronicaPERSON

0.99+

JackPERSON

0.99+

Nelu MihaiPERSON

0.99+

Zhamak DehghaniPERSON

0.99+

Thomas HazelPERSON

0.99+

Nick TaylorPERSON

0.99+

Dave VellantePERSON

0.99+

Jack GreenfieldPERSON

0.99+

Kristen MartinPERSON

0.99+

Ken SchiffmanPERSON

0.99+

Veronica DurginPERSON

0.99+

WalmartORGANIZATION

0.99+

Rob HoPERSON

0.99+

Warner MediaORGANIZATION

0.99+

Tristan HandyPERSON

0.99+

Veronika DurginPERSON

0.99+

George GilbertPERSON

0.99+

Ionis PharmaceuticalORGANIZATION

0.99+

George GilbertPERSON

0.99+

Bob MugliaPERSON

0.99+

David FlorePERSON

0.99+

DBT LabsORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

BobPERSON

0.99+

Palo AltoLOCATION

0.99+

21 sessionsQUANTITY

0.99+

Darren BrambermPERSON

0.99+

33 guestsQUANTITY

0.99+

Nir ZukPERSON

0.99+

BostonLOCATION

0.99+

AmazonORGANIZATION

0.99+

Harveer SinghPERSON

0.99+

Kit ColbertPERSON

0.99+

DatabricksORGANIZATION

0.99+

Sanjeev MohanPERSON

0.99+

Supercloud 2TITLE

0.99+

SnowflakeORGANIZATION

0.99+

last yearDATE

0.99+

Western UnionORGANIZATION

0.99+

CohesityORGANIZATION

0.99+

SupercloudORGANIZATION

0.99+

200 locationsQUANTITY

0.99+

AugustDATE

0.99+

Keith TownsendPERSON

0.99+

Data MeshORGANIZATION

0.99+

Palo Alto NetworksORGANIZATION

0.99+

David.Vellante@siliconangle.comOTHER

0.99+

next weekDATE

0.99+

bothQUANTITY

0.99+

oneQUANTITY

0.99+

secondQUANTITY

0.99+

first pointQUANTITY

0.99+

OneQUANTITY

0.99+

FirstQUANTITY

0.99+

VMwareORGANIZATION

0.98+

Silicon AngleORGANIZATION

0.98+

ETRORGANIZATION

0.98+

Eric BradleyPERSON

0.98+

twoQUANTITY

0.98+

todayDATE

0.98+

SachsORGANIZATION

0.98+

SAKSORGANIZATION

0.98+

SupercloudEVENT

0.98+

last AugustDATE

0.98+

each weekQUANTITY

0.98+

Breaking Analysis: AI Goes Mainstream But ROI Remains Elusive


 

>> From theCUBE Studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR, this is "Breaking Analysis" with Dave Vellante. >> A decade of big data investments combined with cloud scale, the rise of much more cost effective processing power. And the introduction of advanced tooling has catapulted machine intelligence to the forefront of technology investments. No matter what job you have, your operation will be AI powered within five years and machines may actually even be doing your job. Artificial intelligence is being infused into applications, infrastructure, equipment, and virtually every aspect of our lives. AI is proving to be extremely helpful at things like controlling vehicles, speeding up medical diagnoses, processing language, advancing science, and generally raising the stakes on what it means to apply technology for business advantage. But business value realization has been a challenge for most organizations due to lack of skills, complexity of programming models, immature technology integration, sizable upfront investments, ethical concerns, and lack of business alignment. Mastering AI technology will not be a requirement for success in our view. However, figuring out how and where to apply AI to your business will be crucial. That means understanding the business case, picking the right technology partner, experimenting in bite-sized chunks, and quickly identifying winners to double down on from an investment standpoint. Hello and welcome to this week's Wiki-bond CUBE Insights powered by ETR. In this breaking analysis, we update you on the state of AI and what it means for the competition. And to do so, we invite into our studios Andy Thurai of Constellation Research. Andy covers AI deeply. He knows the players, he knows the pitfalls of AI investment, and he's a collaborator. Andy, great to have you on the program. Thanks for coming into our CUBE studios. >> Thanks for having me on. >> You're very welcome. Okay, let's set the table with a premise and a series of assertions we want to test with Andy. I'm going to lay 'em out. And then Andy, I'd love for you to comment. So, first of all, according to McKinsey, AI adoption has more than doubled since 2017, but only 10% of organizations report seeing significant ROI. That's a BCG and MIT study. And part of that challenge of AI is it requires data, is requires good data, data proficiency, which is not trivial, as you know. Firms that can master both data and AI, we believe are going to have a competitive advantage this decade. Hyperscalers, as we show you dominate AI and ML. We'll show you some data on that. And having said that, there's plenty of room for specialists. They need to partner with the cloud vendors for go to market productivity. And finally, organizations increasingly have to put data and AI at the center of their enterprises. And to do that, most are going to rely on vendor R&D to leverage AI and ML. In other words, Andy, they're going to buy it and apply it as opposed to build it. What are your thoughts on that setup and that premise? >> Yeah, I see that a lot happening in the field, right? So first of all, the only 10% of realizing a return on investment. That's so true because we talked about this earlier, the most companies are still in the innovation cycle. So they're trying to innovate and see what they can do to apply. A lot of these times when you look at the solutions, what they come up with or the models they create, the experimentation they do, most times they don't even have a good business case to solve, right? So they just experiment and then they figure it out, "Oh my God, this model is working. Can we do something to solve it?" So it's like you found a hammer and then you're trying to find the needle kind of thing, right? That never works. >> 'Cause it's cool or whatever it is. >> It is, right? So that's why, I always advise, when they come to me and ask me things like, "Hey, what's the right way to do it? What is the secret sauce?" And, we talked about this. The first thing I tell them is, "Find out what is the business case that's having the most amount of problems, that that can be solved using some of the AI use cases," right? Not all of them can be solved. Even after you experiment, do the whole nine yards, spend millions of dollars on that, right? And later on you make it efficient only by saving maybe $50,000 for the company or a $100,000 for the company, is it really even worth the experiment, right? So you got to start with the saying that, you know, where's the base for this happening? Where's the need? What's a business use case? It doesn't have to be about cost efficient and saving money in the existing processes. It could be a new thing. You want to bring in a new revenue stream, but figure out what is a business use case, how much money potentially I can make off of that. The same way that start-ups go after. Right? >> Yeah. Pretty straightforward. All right, let's take a look at where ML and AI fit relative to the other hot sectors of the ETR dataset. This XY graph shows net score spending velocity in the vertical axis and presence in the survey, they call it sector perversion for the October survey, the January survey's in the field. Then that squiggly line on ML/AI represents the progression. Since the January 21 survey, you can see the downward trajectory. And we position ML and AI relative to the other big four hot sectors or big three, including, ML/AI is four. Containers, cloud and RPA. These have consistently performed above that magic 40% red dotted line for most of the past two years. Anything above 40%, we think is highly elevated. And we've just included analytics and big data for context and relevant adjacentness, if you will. Now note that green arrow moving toward, you know, the 40% mark on ML/AI. I got a glimpse of the January survey, which is in the field. It's got more than a thousand responses already, and it's trending up for the current survey. So Andy, what do you make of this downward trajectory over the past seven quarters and the presumed uptick in the coming months? >> So one of the things you have to keep in mind is when the pandemic happened, it's about survival mode, right? So when somebody's in a survival mode, what happens, the luxury and the innovations get cut. That's what happens. And this is exactly what happened in the situation. So as you can see in the last seven quarters, which is almost dating back close to pandemic, everybody was trying to keep their operations alive, especially digital operations. How do I keep the lights on? That's the most important thing for them. So while the numbers spent on AI, ML is less overall, I still think the AI ML to spend to sort of like a employee experience or the IT ops, AI ops, ML ops, as we talked about, some of those areas actually went up. There are companies, we talked about it, Atlassian had a lot of platform issues till the amount of money people are spending on that is exorbitant and simply because they are offering the solution that was not available other way. So there are companies out there, you can take AoPS or incident management for that matter, right? A lot of companies have a digital insurance, they don't know how to properly manage it. How do you find an intern solve it immediately? That's all using AI ML and some of those areas actually growing unbelievable, the companies in that area. >> So this is a really good point. If you can you bring up that chart again, what Andy's saying is a lot of the companies in the ETR taxonomy that are doing things with AI might not necessarily show up in a granular fashion. And I think the other point I would make is, these are still highly elevated numbers. If you put on like storage and servers, they would read way, way down the list. And, look in the pandemic, we had to deal with work from home, we had to re-architect the network, we had to worry about security. So those are really good points that you made there. Let's, unpack this a little bit and look at the ML AI sector and the ETR data and specifically at the players and get Andy to comment on this. This chart here shows the same x y dimensions, and it just notes some of the players that are specifically have services and products that people spend money on, that CIOs and IT buyers can comment on. So the table insert shows how the companies are plotted, it's net score, and then the ends in the survey. And Andy, the hyperscalers are dominant, as you can see. You see Databricks there showing strong as a specialist, and then you got to pack a six or seven in there. And then Oracle and IBM, kind of the big whales of yester year are in the mix. And to your point, companies like Salesforce that you mentioned to me offline aren't in that mix, but they do a lot in AI. But what are your takeaways from that data? >> If you could put the slide back on please. I want to make quick comments on a couple of those. So the first one is, it's surprising other hyperscalers, right? As you and I talked about this earlier, AWS is more about logo blocks. We discussed that, right? >> Like what? Like a SageMaker as an example. >> We'll give you all the components what do you need. Whether it's MLOps component or whether it's, CodeWhisperer that we talked about, or a oral platform or data or data, whatever you want. They'll give you the blocks and then you'll build things on top of it, right? But Google took a different way. Matter of fact, if we did those numbers a few years ago, Google would've been number one because they did a lot of work with their acquisition of DeepMind and other things. They're way ahead of the pack when it comes to AI for longest time. Now, I think Microsoft's move of partnering and taking a huge competitor out would open the eyes is unbelievable. You saw that everybody is talking about chat GPI, right? And the open AI tool and ChatGPT rather. Remember as Warren Buffet is saying that, when my laundry lady comes and talk to me about stock market, it's heated up. So that's how it's heated up. Everybody's using ChatGPT. What that means is at the end of the day is they're creating, it's still in beta, keep in mind. It's not fully... >> Can you play with it a little bit? >> I have a little bit. >> I have, but it's good and it's not good. You know what I mean? >> Look, so at the end of the day, you take the massive text of all the available text in the world today, mass them all together. And then you ask a question, it's going to basically search through that and figure it out and answer that back. Yes, it's good. But again, as we discussed, if there's no business use case of what problem you're going to solve. This is building hype. But then eventually they'll figure out, for example, all your chats, online chats, could be aided by your AI chat bots, which is already there, which is not there at that level. This could build help that, right? Or the other thing we talked about is one of the areas where I'm more concerned about is that it is able to produce equal enough original text at the level that humans can produce, for example, ChatGPT or the equal enough, the large language transformer can help you write stories as of Shakespeare wrote it. Pretty close to it. It'll learn from that. So when it comes down to it, talk about creating messages, articles, blogs, especially during political seasons, not necessarily just in US, but anywhere for that matter. If people are able to produce at the emission speed and throw it at the consumers and confuse them, the elections can be won, the governments can be toppled. >> Because to your point about chatbots is chatbots have obviously, reduced the number of bodies that you need to support chat. But they haven't solved the problem of serving consumers. Most of the chat bots are conditioned response, which of the following best describes your problem? >> The current chatbot. >> Yeah. Hey, did we solve your problem? No. Is the answer. So that has some real potential. But if you could bring up that slide again, Ken, I mean you've got the hyperscalers that are dominant. You talked about Google and Microsoft is ubiquitous, they seem to be dominant in every ETR category. But then you have these other specialists. How do those guys compete? And maybe you could even, cite some of the guys that you know, how do they compete with the hyperscalers? What's the key there for like a C3 ai or some of the others that are on there? >> So I've spoken with at least two of the CEOs of the smaller companies that you have on the list. One of the things they're worried about is that if they continue to operate independently without being part of hyperscaler, either the hyperscalers will develop something to compete against them full scale, or they'll become irrelevant. Because at the end of the day, look, cloud is dominant. Not many companies are going to do like AI modeling and training and deployment the whole nine yards by independent by themselves. They're going to depend on one of the clouds, right? So if they're already going to be in the cloud, by taking them out to come to you, it's going to be extremely difficult issue to solve. So all these companies are going and saying, "You know what? We need to be in hyperscalers." For example, you could have looked at DataRobot recently, they made announcements, Google and AWS, and they are all over the place. So you need to go where the customers are. Right? >> All right, before we go on, I want to share some other data from ETR and why people adopt AI and get your feedback. So the data historically shows that feature breadth and technical capabilities were the main decision points for AI adoption, historically. What says to me that it's too much focus on technology. In your view, is that changing? Does it have to change? Will it change? >> Yes. Simple answer is yes. So here's the thing. The data you're speaking from is from previous years. >> Yes >> I can guarantee you, if you look at the latest data that's coming in now, those two will be a secondary and tertiary points. The number one would be about ROI. And how do I achieve? I've spent ton of money on all of my experiments. This is the same thing theme I'm seeing across when talking to everybody who's spending money on AI. I've spent so much money on it. When can I get it live in production? How much, how can I quickly get it? Because you know, the board is breathing down their neck. You already spend this much money. Show me something that's valuable. So the ROI is going to become, take it from me, I'm predicting this for 2023, that's going to become number one. >> Yeah, and if people focus on it, they'll figure it out. Okay. Let's take a look at some of the top players that won, some of the names we just looked at and double click on that and break down their spending profile. So the chart here shows the net score, how net score is calculated. So pay attention to the second set of bars that Databricks, who was pretty prominent on the previous chart. And we've annotated the colors. The lime green is, we're bringing the platform in new. The forest green is, we're going to spend 6% or more relative to last year. And the gray is flat spending. The pinkish is our spending's going to be down on AI and ML, 6% or worse. And the red is churn. So you don't want big red. You subtract the reds from the greens and you get net score, which is shown by those blue dots that you see there. So AWS has the highest net score and very little churn. I mean, single low single digit churn. But notably, you see Databricks and DataRobot are next in line within Microsoft and Google also, they've got very low churn. Andy, what are your thoughts on this data? >> So a couple of things that stands out to me. Most of them are in line with my conversation with customers. Couple of them stood out to me on how bad IBM Watson is doing. >> Yeah, bring that back up if you would. Let's take a look at that. IBM Watson is the far right and the red, that bright red is churning and again, you want low red here. Why do you think that is? >> Well, so look, IBM has been in the forefront of innovating things for many, many years now, right? And over the course of years we talked about this, they moved from a product innovation centric company into more of a services company. And over the years they were making, as at one point, you know that they were making about majority of that money from services. Now things have changed Arvind has taken over, he came from research. So he's doing a great job of trying to reinvent themselves as a company. But it's going to have a long way to catch up. IBM Watson, if you think about it, that played what, jeopardy and chess years ago, like 15 years ago? >> It was jaw dropping when you first saw it. And then they weren't able to commercialize that. >> Yeah. >> And you're making a good point. When Gerstner took over IBM at the time, John Akers wanted to split the company up. He wanted to have a database company, he wanted to have a storage company. Because that's where the industry trend was, Gerstner said no, he came from AMEX, right? He came from American Express. He said, "No, we're going to have a single throat to choke for the customer." They bought PWC for relatively short money. I think it was $15 billion, completely transformed and I would argue saved IBM. But the trade off was, it sort of took them out of product leadership. And so from Gerstner to Palmisano to Remedi, it was really a services led company. And I think Arvind is really bringing it back to a product company with strong consulting. I mean, that's one of the pillars. And so I think that's, they've got a strong story in data and AI. They just got to sort of bring it together and better. Bring that chart up one more time. I want to, the other point is Oracle, Oracle sort of has the dominant lock-in for mission critical database and they're sort of applying AI there. But to your point, they're really not an AI company in the sense that they're taking unstructured data and doing sort of new things. It's really about how to make Oracle better, right? >> Well, you got to remember, Oracle is about database for the structure data. So in yesterday's world, they were dominant database. But you know, if you are to start storing like videos and texts and audio and other things, and then start doing search of vector search and all that, Oracle is not necessarily the database company of choice. And they're strongest thing being apps and building AI into the apps? They are kind of surviving in that area. But again, I wouldn't name them as an AI company, right? But the other thing that that surprised me in that list, what you showed me is yes, AWS is number one. >> Bring that back up if you would, Ken. >> AWS is number one as you, it should be. But what what actually caught me by surprise is how DataRobot is holding, you know? I mean, look at that. The either net new addition and or expansion, DataRobot seem to be doing equally well, even better than Microsoft and Google. That surprises me. >> DataRobot's, and again, this is a function of spending momentum. So remember from the previous chart that Microsoft and Google, much, much larger than DataRobot. DataRobot more niche. But with spending velocity and has always had strong spending velocity, despite some of the recent challenges, organizational challenges. And then you see these other specialists, H2O.ai, Anaconda, dataiku, little bit of red showing there C3.ai. But these again, to stress are the sort of specialists other than obviously the hyperscalers. These are the specialists in AI. All right, so we hit the bigger names in the sector. Now let's take a look at the emerging technology companies. And one of the gems of the ETR dataset is the emerging technology survey. It's called ETS. They used to just do it like twice a year. It's now run four times a year. I just discovered it kind of mid-2022. And it's exclusively focused on private companies that are potential disruptors, they might be M&A candidates and if they've raised enough money, they could be acquirers of companies as well. So Databricks would be an example. They've made a number of investments in companies. SNEAK would be another good example. Companies that are private, but they're buyers, they hope to go IPO at some point in time. So this chart here, shows the emerging companies in the ML AI sector of the ETR dataset. So the dimensions of this are similar, they're net sentiment on the Y axis and mind share on the X axis. Basically, the ETS study measures awareness on the x axis and intent to do something with, evaluate or implement or not, on that vertical axis. So it's like net score on the vertical where negatives are subtracted from the positives. And again, mind share is vendor awareness. That's the horizontal axis. Now that inserted table shows net sentiment and the ends in the survey, which informs the position of the dots. And you'll notice we're plotting TensorFlow as well. We know that's not a company, but it's there for reference as open source tooling is an option for customers. And ETR sometimes like to show that as a reference point. Now we've also drawn a line for Databricks to show how relatively dominant they've become in the past 10 ETS surveys and sort of mind share going back to late 2018. And you can see a dozen or so other emerging tech vendors. So Andy, I want you to share your thoughts on these players, who were the ones to watch, name some names. We'll bring that data back up as you as you comment. >> So Databricks, as you said, remember we talked about how Oracle is not necessarily the database of the choice, you know? So Databricks is kind of trying to solve some of the issue for AI/ML workloads, right? And the problem is also there is no one company that could solve all of the problems. For example, if you look at the names in here, some of them are database names, some of them are platform names, some of them are like MLOps companies like, DataRobot (indistinct) and others. And some of them are like future based companies like, you know, the Techton and stuff. >> So it's a mix of those sub sectors? >> It's a mix of those companies. >> We'll talk to ETR about that. They'd be interested in your input on how to make this more granular and these sub-sectors. You got Hugging Face in here, >> Which is NLP, yeah. >> Okay. So your take, are these companies going to get acquired? Are they going to go IPO? Are they going to merge? >> Well, most of them going to get acquired. My prediction would be most of them will get acquired because look, at the end of the day, hyperscalers need these capabilities, right? So they're going to either create their own, AWS is very good at doing that. They have done a lot of those things. But the other ones, like for particularly Azure, they're going to look at it and saying that, "You know what, it's going to take time for me to build this. Why don't I just go and buy you?" Right? Or or even the smaller players like Oracle or IBM Cloud, this will exist. They might even take a look at them, right? So at the end of the day, a lot of these companies are going to get acquired or merged with others. >> Yeah. All right, let's wrap with some final thoughts. I'm going to make some comments Andy, and then ask you to dig in here. Look, despite the challenge of leveraging AI, you know, Ken, if you could bring up the next chart. We're not repeating, we're not predicting the AI winter of the 1990s. Machine intelligence. It's a superpower that's going to permeate every aspect of the technology industry. AI and data strategies have to be connected. Leveraging first party data is going to increase AI competitiveness and shorten time to value. Andy, I'd love your thoughts on that. I know you've got some thoughts on governance and AI ethics. You know, we talked about ChatGBT, Deepfakes, help us unpack all these trends. >> So there's so much information packed up there, right? The AI and data strategy, that's very, very, very important. If you don't have a proper data, people don't realize that AI is, your AI is the morals that you built on, it's predominantly based on the data what you have. It's not, AI cannot predict something that's going to happen without knowing what it is. It need to be trained, it need to understand what is it you're talking about. So 99% of the time you got to have a good data for you to train. So this where I mentioned to you, the problem is a lot of these companies can't afford to collect the real world data because it takes too long, it's too expensive. So a lot of these companies are trying to do the synthetic data way. It has its own set of issues because you can't use all... >> What's that synthetic data? Explain that. >> Synthetic data is basically not a real world data, but it's a created or simulated data equal and based on real data. It looks, feels, smells, taste like a real data, but it's not exactly real data, right? This is particularly useful in the financial and healthcare industry for world. So you don't have to, at the end of the day, if you have real data about your and my medical history data, if you redact it, you can still reverse this. It's fairly easy, right? >> Yeah, yeah. >> So by creating a synthetic data, there is no correlation between the real data and the synthetic data. >> So that's part of AI ethics and privacy and, okay. >> So the synthetic data, the issue with that is that when you're trying to commingle that with that, you can't create models based on just on synthetic data because synthetic data, as I said is artificial data. So basically you're creating artificial models, so you got to blend in properly that that blend is the problem. And you know how much of real data, how much of synthetic data you could use. You got to use judgment between efficiency cost and the time duration stuff. So that's one-- >> And risk >> And the risk involved with that. And the secondary issues which we talked about is that when you're creating, okay, you take a business use case, okay, you think about investing things, you build the whole thing out and you're trying to put it out into the market. Most companies that I talk to don't have a proper governance in place. They don't have ethics standards in place. They don't worry about the biases in data, they just go on trying to solve a business case >> It's wild west. >> 'Cause that's what they start. It's a wild west! And then at the end of the day when they are close to some legal litigation action or something or something else happens and that's when the Oh Shit! moments happens, right? And then they come in and say, "You know what, how do I fix this?" The governance, security and all of those things, ethics bias, data bias, de-biasing, none of them can be an afterthought. It got to start with the, from the get-go. So you got to start at the beginning saying that, "You know what, I'm going to do all of those AI programs, but before we get into this, we got to set some framework for doing all these things properly." Right? And then the-- >> Yeah. So let's go back to the key points. I want to bring up the cloud again. Because you got to get cloud right. Getting that right matters in AI to the points that you were making earlier. You can't just be out on an island and hyperscalers, they're going to obviously continue to do well. They get more and more data's going into the cloud and they have the native tools. To your point, in the case of AWS, Microsoft's obviously ubiquitous. Google's got great capabilities here. They've got integrated ecosystems partners that are going to continue to strengthen through the decade. What are your thoughts here? >> So a couple of things. One is the last mile ML or last mile AI that nobody's talking about. So that need to be attended to. There are lot of players in the market that coming up, when I talk about last mile, I'm talking about after you're done with the experimentation of the model, how fast and quickly and efficiently can you get it to production? So that's production being-- >> Compressing that time is going to put dollars in your pocket. >> Exactly. Right. >> So once, >> If you got it right. >> If you get it right, of course. So there are, there are a couple of issues with that. Once you figure out that model is working, that's perfect. People don't realize, the moment you decide that moment when the decision is made, it's like a new car. After you purchase the value decreases on a minute basis. Same thing with the models. Once the model is created, you need to be in production right away because it starts losing it value on a seconds minute basis. So issue number one, how fast can I get it over there? So your deployment, you are inferencing efficiently at the edge locations, your optimization, your security, all of this is at issue. But you know what is more important than that in the last mile? You keep the model up, you continue to work on, again, going back to the car analogy, at one point you got to figure out your car is costing more than to operate. So you got to get a new car, right? And that's the same thing with the models as well. If your model has reached a stage, it is actually a potential risk for your operation. To give you an idea, if Uber has a model, the first time when you get a car from going from point A to B cost you $60. If the model decayed the next time I might give you a $40 rate, I would take it definitely. But it's lost for the company. The business risk associated with operating on a bad model, you should realize it immediately, pull the model out, retrain it, redeploy it. That's is key. >> And that's got to be huge in security model recency and security to the extent that you can get real time is big. I mean you, you see Palo Alto, CrowdStrike, a lot of other security companies are injecting AI. Again, they won't show up in the ETR ML/AI taxonomy per se as a pure play. But ServiceNow is another company that you have have mentioned to me, offline. AI is just getting embedded everywhere. >> Yep. >> And then I'm glad you brought up, kind of real-time inferencing 'cause a lot of the modeling, if we can go back to the last point that we're going to make, a lot of the AI today is modeling done in the cloud. The last point we wanted to make here, I'd love to get your thoughts on this, is real-time AI inferencing for instance at the edge is going to become increasingly important for us. It's going to usher in new economics, new types of silicon, particularly arm-based. We've covered that a lot on "Breaking Analysis", new tooling, new companies and that could disrupt the sort of cloud model if new economics emerge. 'Cause cloud obviously very centralized, they're trying to decentralize it. But over the course of this decade we could see some real disruption there. Andy, give us your final thoughts on that. >> Yes and no. I mean at the end of the day, cloud is kind of centralized now, but a lot of this companies including, AWS is kind of trying to decentralize that by putting their own sub-centers and edge locations. >> Local zones, outposts. >> Yeah, exactly. Particularly the outpost concept. And if it can even become like a micro center and stuff, it won't go to the localized level of, I go to a single IOT level. But again, the cloud extends itself to that level. So if there is an opportunity need for it, the hyperscalers will figure out a way to fit that model. So I wouldn't too much worry about that, about deployment and where to have it and what to do with that. But you know, figure out the right business use case, get the right data, get the ethics and governance place and make sure they get it to production and make sure you pull the model out when it's not operating well. >> Excellent advice. Andy, I got to thank you for coming into the studio today, helping us with this "Breaking Analysis" segment. Outstanding collaboration and insights and input in today's episode. Hope we can do more. >> Thank you. Thanks for having me. I appreciate it. >> You're very welcome. All right. I want to thank Alex Marson who's on production and manages the podcast. Ken Schiffman as well. Kristen Martin and Cheryl Knight helped get the word out on social media and our newsletters. And Rob Hoof is our editor-in-chief over at Silicon Angle. He does some great editing for us. Thank you all. Remember all these episodes are available as podcast. Wherever you listen, all you got to do is search "Breaking Analysis" podcast. I publish each week on wikibon.com and silicon angle.com or you can email me at david.vellante@siliconangle.com to get in touch, or DM me at dvellante or comment on our LinkedIn posts. Please check out ETR.AI for the best survey data and the enterprise tech business, Constellation Research. Andy publishes there some awesome information on AI and data. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching everybody and we'll see you next time on "Breaking Analysis". (gentle closing tune plays)

Published Date : Dec 29 2022

SUMMARY :

bringing you data-driven Andy, great to have you on the program. and AI at the center of their enterprises. So it's like you found a of the AI use cases," right? I got a glimpse of the January survey, So one of the things and it just notes some of the players So the first one is, Like a And the open AI tool and ChatGPT rather. I have, but it's of all the available text of bodies that you need or some of the others that are on there? One of the things they're So the data historically So here's the thing. So the ROI is going to So the chart here shows the net score, Couple of them stood out to me IBM Watson is the far right and the red, And over the course of when you first saw it. I mean, that's one of the pillars. Oracle is not necessarily the how DataRobot is holding, you know? So it's like net score on the vertical database of the choice, you know? on how to make this more Are they going to go IPO? So at the end of the day, of the technology industry. So 99% of the time you What's that synthetic at the end of the day, and the synthetic data. So that's part of AI that blend is the problem. And the risk involved with that. So you got to start at data's going into the cloud So that need to be attended to. is going to put dollars the first time when you that you can get real time is big. a lot of the AI today is I mean at the end of the day, and make sure they get it to production Andy, I got to thank you for Thanks for having me. and manages the podcast.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
DavePERSON

0.99+

Alex MarsonPERSON

0.99+

AndyPERSON

0.99+

Andy ThuraiPERSON

0.99+

Dave VellantePERSON

0.99+

AWSORGANIZATION

0.99+

IBMORGANIZATION

0.99+

Ken SchiffmanPERSON

0.99+

Tom DavenportPERSON

0.99+

AMEXORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

Cheryl KnightPERSON

0.99+

Rashmi KumarPERSON

0.99+

Rob HoofPERSON

0.99+

GoogleORGANIZATION

0.99+

UberORGANIZATION

0.99+

KenPERSON

0.99+

OracleORGANIZATION

0.99+

OctoberDATE

0.99+

6%QUANTITY

0.99+

$40QUANTITY

0.99+

January 21DATE

0.99+

ChipotleORGANIZATION

0.99+

$15 billionQUANTITY

0.99+

fiveQUANTITY

0.99+

RashmiPERSON

0.99+

$50,000QUANTITY

0.99+

$60QUANTITY

0.99+

USLOCATION

0.99+

JanuaryDATE

0.99+

AntonioPERSON

0.99+

John AkersPERSON

0.99+

Warren BuffetPERSON

0.99+

late 2018DATE

0.99+

IkeaORGANIZATION

0.99+

American ExpressORGANIZATION

0.99+

MITORGANIZATION

0.99+

PWCORGANIZATION

0.99+

99%QUANTITY

0.99+

HPEORGANIZATION

0.99+

DominoORGANIZATION

0.99+

ArvindPERSON

0.99+

Palo AltoLOCATION

0.99+

30 billionQUANTITY

0.99+

last yearDATE

0.99+

Constellation ResearchORGANIZATION

0.99+

GerstnerPERSON

0.99+

120 billionQUANTITY

0.99+

$100,000QUANTITY

0.99+

Breaking Analysis: Grading our 2022 Enterprise Technology Predictions


 

>>From the Cube Studios in Palo Alto in Boston, bringing you data-driven insights from the cube and E T R. This is breaking analysis with Dave Valante. >>Making technology predictions in 2022 was tricky business, especially if you were projecting the performance of markets or identifying I P O prospects and making binary forecast on data AI and the macro spending climate and other related topics in enterprise tech 2022, of course was characterized by a seesaw economy where central banks were restructuring their balance sheets. The war on Ukraine fueled inflation supply chains were a mess. And the unintended consequences of of forced march to digital and the acceleration still being sorted out. Hello and welcome to this week's weekly on Cube Insights powered by E T R. In this breaking analysis, we continue our annual tradition of transparently grading last year's enterprise tech predictions. And you may or may not agree with our self grading system, but look, we're gonna give you the data and you can draw your own conclusions and tell you what, tell us what you think. >>All right, let's get right to it. So our first prediction was tech spending increases by 8% in 2022. And as we exited 2021 CIOs, they were optimistic about their digital transformation plans. You know, they rushed to make changes to their business and were eager to sharpen their focus and continue to iterate on their digital business models and plug the holes that they, the, in the learnings that they had. And so we predicted that 8% rise in enterprise tech spending, which looked pretty good until Ukraine and the Fed decided that, you know, had to rush and make up for lost time. We kind of nailed the momentum in the energy sector, but we can't give ourselves too much credit for that layup. And as of October, Gartner had it spending growing at just over 5%. I think it was 5.1%. So we're gonna take a C plus on this one and, and move on. >>Our next prediction was basically kind of a slow ground ball. The second base, if I have to be honest, but we felt it was important to highlight that security would remain front and center as the number one priority for organizations in 2022. As is our tradition, you know, we try to up the degree of difficulty by specifically identifying companies that are gonna benefit from these trends. So we highlighted some possible I P O candidates, which of course didn't pan out. S NQ was on our radar. The company had just had to do another raise and they recently took a valuation hit and it was a down round. They raised 196 million. So good chunk of cash, but, but not the i p O that we had predicted Aqua Securities focus on containers and cloud native. That was a trendy call and we thought maybe an M SS P or multiple managed security service providers like Arctic Wolf would I p o, but no way that was happening in the crummy market. >>Nonetheless, we think these types of companies, they're still faring well as the talent shortage in security remains really acute, particularly in the sort of mid-size and small businesses that often don't have a sock Lacework laid off 20% of its workforce in 2022. And CO C e o Dave Hatfield left the company. So that I p o didn't, didn't happen. It was probably too early for Lacework. Anyway, meanwhile you got Netscope, which we've cited as strong in the E T R data as particularly in the emerging technology survey. And then, you know, I lumia holding its own, you know, we never liked that 7 billion price tag that Okta paid for auth zero, but we loved the TAM expansion strategy to target developers beyond sort of Okta's enterprise strength. But we gotta take some points off of the failure thus far of, of Okta to really nail the integration and the go to market model with azero and build, you know, bring that into the, the, the core Okta. >>So the focus on endpoint security that was a winner in 2022 is CrowdStrike led that charge with others holding their own, not the least of which was Palo Alto Networks as it continued to expand beyond its core network security and firewall business, you know, through acquisition. So overall we're gonna give ourselves an A minus for this relatively easy call, but again, we had some specifics associated with it to make it a little tougher. And of course we're watching ve very closely this this coming year in 2023. The vendor consolidation trend. You know, according to a recent Palo Alto network survey with 1300 SecOps pros on average organizations have more than 30 tools to manage security tools. So this is a logical way to optimize cost consolidating vendors and consolidating redundant vendors. The E T R data shows that's clearly a trend that's on the upswing. >>Now moving on, a big theme of 2020 and 2021 of course was remote work and hybrid work and new ways to work and return to work. So we predicted in 2022 that hybrid work models would become the dominant protocol, which clearly is the case. We predicted that about 33% of the workforce would come back to the office in 2022 in September. The E T R data showed that figure was at 29%, but organizations expected that 32% would be in the office, you know, pretty much full-time by year end. That hasn't quite happened, but we were pretty close with the projection, so we're gonna take an A minus on this one. Now, supply chain disruption was another big theme that we felt would carry through 2022. And sure that sounds like another easy one, but as is our tradition, again we try to put some binary metrics around our predictions to put some meat in the bone, so to speak, and and allow us than you to say, okay, did it come true or not? >>So we had some data that we presented last year and supply chain issues impacting hardware spend. We said at the time, you can see this on the left hand side of this chart, the PC laptop demand would remain above pre covid levels, which would reverse a decade of year on year declines, which I think started in around 2011, 2012. Now, while demand is down this year pretty substantially relative to 2021, I D C has worldwide unit shipments for PCs at just over 300 million for 22. If you go back to 2019 and you're looking at around let's say 260 million units shipped globally, you know, roughly, so, you know, pretty good call there. Definitely much higher than pre covid levels. But so what you might be asking why the B, well, we projected that 30% of customers would replace security appliances with cloud-based services and that more than a third would replace their internal data center server and storage hardware with cloud services like 30 and 40% respectively. >>And we don't have explicit survey data on exactly these metrics, but anecdotally we see this happening in earnest. And we do have some data that we're showing here on cloud adoption from ET R'S October survey where the midpoint of workloads running in the cloud is around 34% and forecast, as you can see, to grow steadily over the next three years. So this, well look, this is not, we understand it's not a one-to-one correlation with our prediction, but it's a pretty good bet that we were right, but we gotta take some points off, we think for the lack of unequivocal proof. Cause again, we always strive to make our predictions in ways that can be measured as accurate or not. Is it binary? Did it happen, did it not? Kind of like an O K R and you know, we strive to provide data as proof and in this case it's a bit fuzzy. >>We have to admit that although we're pretty comfortable that the prediction was accurate. And look, when you make an hard forecast, sometimes you gotta pay the price. All right, next, we said in 2022 that the big four cloud players would generate 167 billion in IS and PaaS revenue combining for 38% market growth. And our current forecasts are shown here with a comparison to our January, 2022 figures. So coming into this year now where we are today, so currently we expect 162 billion in total revenue and a 33% growth rate. Still very healthy, but not on our mark. So we think a w s is gonna miss our predictions by about a billion dollars, not, you know, not bad for an 80 billion company. So they're not gonna hit that expectation though of getting really close to a hundred billion run rate. We thought they'd exit the year, you know, closer to, you know, 25 billion a quarter and we don't think they're gonna get there. >>Look, we pretty much nailed Azure even though our prediction W was was correct about g Google Cloud platform surpassing Alibaba, Alibaba, we way overestimated the performance of both of those companies. So we're gonna give ourselves a C plus here and we think, yeah, you might think it's a little bit harsh, we could argue for a B minus to the professor, but the misses on GCP and Alibaba we think warrant a a self penalty on this one. All right, let's move on to our prediction about Supercloud. We said it becomes a thing in 2022 and we think by many accounts it has, despite the naysayers, we're seeing clear evidence that the concept of a layer of value add that sits above and across clouds is taking shape. And on this slide we showed just some of the pickup in the industry. I mean one of the most interesting is CloudFlare, the biggest supercloud antagonist. >>Charles Fitzgerald even predicted that no vendor would ever use the term in their marketing. And that would be proof if that happened that Supercloud was a thing and he said it would never happen. Well CloudFlare has, and they launched their version of Supercloud at their developer week. Chris Miller of the register put out a Supercloud block diagram, something else that Charles Fitzgerald was, it was was pushing us for, which is rightly so, it was a good call on his part. And Chris Miller actually came up with one that's pretty good at David Linthicum also has produced a a a A block diagram, kind of similar, David uses the term metacloud and he uses the term supercloud kind of interchangeably to describe that trend. And so we we're aligned on that front. Brian Gracely has covered the concept on the popular cloud podcast. Berkeley launched the Sky computing initiative. >>You read through that white paper and many of the concepts highlighted in the Supercloud 3.0 community developed definition align with that. Walmart launched a platform with many of the supercloud salient attributes. So did Goldman Sachs, so did Capital One, so did nasdaq. So you know, sorry you can hate the term, but very clearly the evidence is gathering for the super cloud storm. We're gonna take an a plus on this one. Sorry, haters. Alright, let's talk about data mesh in our 21 predictions posts. We said that in the 2020s, 75% of large organizations are gonna re-architect their big data platforms. So kind of a decade long prediction. We don't like to do that always, but sometimes it's warranted. And because it was a longer term prediction, we, at the time in, in coming into 22 when we were evaluating our 21 predictions, we took a grade of incomplete because the sort of decade long or majority of the decade better part of the decade prediction. >>So last year, earlier this year, we said our number seven prediction was data mesh gains momentum in 22. But it's largely confined and narrow data problems with limited scope as you can see here with some of the key bullets. So there's a lot of discussion in the data community about data mesh and while there are an increasing number of examples, JP Morgan Chase, Intuit, H S P C, HelloFresh, and others that are completely rearchitecting parts of their data platform completely rearchitecting entire data platforms is non-trivial. There are organizational challenges, there're data, data ownership, debates, technical considerations, and in particular two of the four fundamental data mesh principles that the, the need for a self-service infrastructure and federated computational governance are challenging. Look, democratizing data and facilitating data sharing creates conflicts with regulatory requirements around data privacy. As such many organizations are being really selective with their data mesh implementations and hence our prediction of narrowing the scope of data mesh initiatives. >>I think that was right on J P M C is a good example of this, where you got a single group within a, within a division narrowly implementing the data mesh architecture. They're using a w s, they're using data lakes, they're using Amazon Glue, creating a catalog and a variety of other techniques to meet their objectives. They kind of automating data quality and it was pretty well thought out and interesting approach and I think it's gonna be made easier by some of the announcements that Amazon made at the recent, you know, reinvent, particularly trying to eliminate ET t l, better connections between Aurora and Redshift and, and, and better data sharing the data clean room. So a lot of that is gonna help. Of course, snowflake has been on this for a while now. Many other companies are facing, you know, limitations as we said here and this slide with their Hadoop data platforms. They need to do new, some new thinking around that to scale. HelloFresh is a really good example of this. Look, the bottom line is that organizations want to get more value from data and having a centralized, highly specialized teams that own the data problem, it's been a barrier and a blocker to success. The data mesh starts with organizational considerations as described in great detail by Ash Nair of Warner Brothers. So take a listen to this clip. >>Yeah, so when people think of Warner Brothers, you always think of like the movie studio, but we're more than that, right? I mean, you think of H B O, you think of t n t, you think of C N N. We have 30 plus brands in our portfolio and each have their own needs. So the, the idea of a data mesh really helps us because what we can do is we can federate access across the company so that, you know, CNN can work at their own pace. You know, when there's election season, they can ingest their own data and they don't have to, you know, bump up against, as an example, HBO if Game of Thrones is going on. >>So it's often the case that data mesh is in the eyes of the implementer. And while a company's implementation may not strictly adhere to Jamma Dani's vision of data mesh, and that's okay, the goal is to use data more effectively. And despite Gartner's attempts to deposition data mesh in favor of the somewhat confusing or frankly far more confusing data fabric concept that they stole from NetApp data mesh is taking hold in organizations globally today. So we're gonna take a B on this one. The prediction is shaping up the way we envision, but as we previously reported, it's gonna take some time. The better part of a decade in our view, new standards have to emerge to make this vision become reality and they'll come in the form of both open and de facto approaches. Okay, our eighth prediction last year focused on the face off between Snowflake and Databricks. >>And we realized this popular topic, and maybe one that's getting a little overplayed, but these are two companies that initially, you know, looked like they were shaping up as partners and they, by the way, they are still partnering in the field. But you go back a couple years ago, the idea of using an AW w s infrastructure, Databricks machine intelligence and applying that on top of Snowflake as a facile data warehouse, still very viable. But both of these companies, they have much larger ambitions. They got big total available markets to chase and large valuations that they have to justify. So what's happening is, as we've previously reported, each of these companies is moving toward the other firm's core domain and they're building out an ecosystem that'll be critical for their future. So as part of that effort, we said each is gonna become aggressive investors and maybe start doing some m and a and they have in various companies. >>And on this chart that we produced last year, we studied some of the companies that were targets and we've added some recent investments of both Snowflake and Databricks. As you can see, they've both, for example, invested in elation snowflake's, put money into Lacework, the Secur security firm, ThoughtSpot, which is trying to democratize data with ai. Collibra is a governance platform and you can see Databricks investments in data transformation with D B T labs, Matillion doing simplified business intelligence hunters. So that's, you know, they're security investment and so forth. So other than our thought that we'd see Databricks I p o last year, this prediction been pretty spot on. So we'll give ourselves an A on that one. Now observability has been a hot topic and we've been covering it for a while with our friends at E T R, particularly Eric Bradley. Our number nine prediction last year was basically that if you're not cloud native and observability, you are gonna be in big trouble. >>So everything guys gotta go cloud native. And that's clearly been the case. Splunk, the big player in the space has been transitioning to the cloud, hasn't always been pretty, as we reported, Datadog real momentum, the elk stack, that's open source model. You got new entrants that we've cited before, like observe, honeycomb, chaos search and others that we've, we've reported on, they're all born in the cloud. So we're gonna take another a on this one, admittedly, yeah, it's a re reasonably easy call, but you gotta have a few of those in the mix. Okay, our last prediction, our number 10 was around events. Something the cube knows a little bit about. We said that a new category of events would emerge as hybrid and that for the most part is happened. So that's gonna be the mainstay is what we said. That pure play virtual events are gonna give way to hi hybrid. >>And the narrative is that virtual only events are, you know, they're good for quick hits, but lousy replacements for in-person events. And you know that said, organizations of all shapes and sizes, they learn how to create better virtual content and support remote audiences during the pandemic. So when we set at pure play is gonna give way to hybrid, we said we, we i we implied or specific or specified that the physical event that v i p experience is going defined. That overall experience and those v i p events would create a little fomo, fear of, of missing out in a virtual component would overlay that serves an audience 10 x the size of the physical. We saw that really two really good examples. Red Hat Summit in Boston, small event, couple thousand people served tens of thousands, you know, online. Second was Google Cloud next v i p event in, in New York City. >>Everything else was, was, was, was virtual. You know, even examples of our prediction of metaverse like immersion have popped up and, and and, and you know, other companies are doing roadshow as we predicted like a lot of companies are doing it. You're seeing that as a major trend where organizations are going with their sales teams out into the regions and doing a little belly to belly action as opposed to the big giant event. That's a definitely a, a trend that we're seeing. So in reviewing this prediction, the grade we gave ourselves is, you know, maybe a bit unfair, it should be, you could argue for a higher grade, but the, but the organization still haven't figured it out. They have hybrid experiences but they generally do a really poor job of leveraging the afterglow and of event of an event. It still tends to be one and done, let's move on to the next event or the next city. >>Let the sales team pick up the pieces if they were paying attention. So because of that, we're only taking a B plus on this one. Okay, so that's the review of last year's predictions. You know, overall if you average out our grade on the 10 predictions that come out to a b plus, I dunno why we can't seem to get that elusive a, but we're gonna keep trying our friends at E T R and we are starting to look at the data for 2023 from the surveys and all the work that we've done on the cube and our, our analysis and we're gonna put together our predictions. We've had literally hundreds of inbounds from PR pros pitching us. We've got this huge thick folder that we've started to review with our yellow highlighter. And our plan is to review it this month, take a look at all the data, get some ideas from the inbounds and then the e t R of January surveys in the field. >>It's probably got a little over a thousand responses right now. You know, they'll get up to, you know, 1400 or so. And once we've digested all that, we're gonna go back and publish our predictions for 2023 sometime in January. So stay tuned for that. All right, we're gonna leave it there for today. You wanna thank Alex Myerson who's on production and he manages the podcast, Ken Schiffman as well out of our, our Boston studio. I gotta really heartfelt thank you to Kristen Martin and Cheryl Knight and their team. They helped get the word out on social and in our newsletters. Rob Ho is our editor in chief over at Silicon Angle who does some great editing for us. Thank you all. Remember all these podcasts are available or all these episodes are available is podcasts. Wherever you listen, just all you do Search Breaking analysis podcast, really getting some great traction there. Appreciate you guys subscribing. I published each week on wikibon.com, silicon angle.com or you can email me directly at david dot valante silicon angle.com or dm me Dante, or you can comment on my LinkedIn post. And please check out ETR AI for the very best survey data in the enterprise tech business. Some awesome stuff in there. This is Dante for the Cube Insights powered by etr. Thanks for watching and we'll see you next time on breaking analysis.

Published Date : Dec 18 2022

SUMMARY :

From the Cube Studios in Palo Alto in Boston, bringing you data-driven insights from self grading system, but look, we're gonna give you the data and you can draw your own conclusions and tell you what, We kind of nailed the momentum in the energy but not the i p O that we had predicted Aqua Securities focus on And then, you know, I lumia holding its own, you So the focus on endpoint security that was a winner in 2022 is CrowdStrike led that charge put some meat in the bone, so to speak, and and allow us than you to say, okay, We said at the time, you can see this on the left hand side of this chart, the PC laptop demand would remain Kind of like an O K R and you know, we strive to provide data We thought they'd exit the year, you know, closer to, you know, 25 billion a quarter and we don't think they're we think, yeah, you might think it's a little bit harsh, we could argue for a B minus to the professor, Chris Miller of the register put out a Supercloud block diagram, something else that So you know, sorry you can hate the term, but very clearly the evidence is gathering for the super cloud But it's largely confined and narrow data problems with limited scope as you can see here with some of the announcements that Amazon made at the recent, you know, reinvent, particularly trying to the company so that, you know, CNN can work at their own pace. So it's often the case that data mesh is in the eyes of the implementer. but these are two companies that initially, you know, looked like they were shaping up as partners and they, So that's, you know, they're security investment and so forth. So that's gonna be the mainstay is what we And the narrative is that virtual only events are, you know, they're good for quick hits, the grade we gave ourselves is, you know, maybe a bit unfair, it should be, you could argue for a higher grade, You know, overall if you average out our grade on the 10 predictions that come out to a b plus, You know, they'll get up to, you know,

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

Cheryl KnightPERSON

0.99+

Ken SchiffmanPERSON

0.99+

Chris MillerPERSON

0.99+

CNNORGANIZATION

0.99+

Rob HoPERSON

0.99+

AlibabaORGANIZATION

0.99+

Dave ValantePERSON

0.99+

AmazonORGANIZATION

0.99+

5.1%QUANTITY

0.99+

2022DATE

0.99+

Charles FitzgeraldPERSON

0.99+

Dave HatfieldPERSON

0.99+

Brian GracelyPERSON

0.99+

2019DATE

0.99+

LaceworkORGANIZATION

0.99+

twoQUANTITY

0.99+

GCPORGANIZATION

0.99+

33%QUANTITY

0.99+

WalmartORGANIZATION

0.99+

DavidPERSON

0.99+

2021DATE

0.99+

20%QUANTITY

0.99+

Kristen MartinPERSON

0.99+

Palo AltoLOCATION

0.99+

2020DATE

0.99+

Ash NairPERSON

0.99+

Goldman SachsORGANIZATION

0.99+

162 billionQUANTITY

0.99+

New York CityLOCATION

0.99+

DatabricksORGANIZATION

0.99+

OctoberDATE

0.99+

last yearDATE

0.99+

Arctic WolfORGANIZATION

0.99+

two companiesQUANTITY

0.99+

38%QUANTITY

0.99+

SeptemberDATE

0.99+

FedORGANIZATION

0.99+

JP Morgan ChaseORGANIZATION

0.99+

80 billionQUANTITY

0.99+

29%QUANTITY

0.99+

32%QUANTITY

0.99+

21 predictionsQUANTITY

0.99+

30%QUANTITY

0.99+

HBOORGANIZATION

0.99+

75%QUANTITY

0.99+

Game of ThronesTITLE

0.99+

JanuaryDATE

0.99+

2023DATE

0.99+

10 predictionsQUANTITY

0.99+

bothQUANTITY

0.99+

22QUANTITY

0.99+

ThoughtSpotORGANIZATION

0.99+

196 millionQUANTITY

0.99+

30QUANTITY

0.99+

eachQUANTITY

0.99+

last yearDATE

0.99+

Palo Alto NetworksORGANIZATION

0.99+

2020sDATE

0.99+

167 billionQUANTITY

0.99+

OktaORGANIZATION

0.99+

SecondQUANTITY

0.99+

GartnerORGANIZATION

0.99+

Eric BradleyPERSON

0.99+

Aqua SecuritiesORGANIZATION

0.99+

DantePERSON

0.99+

8%QUANTITY

0.99+

Warner BrothersORGANIZATION

0.99+

IntuitORGANIZATION

0.99+

Cube StudiosORGANIZATION

0.99+

each weekQUANTITY

0.99+

7 billionQUANTITY

0.99+

40%QUANTITY

0.99+

SnowflakeORGANIZATION

0.99+

Breaking Analysis: How Palo Alto Networks Became the Gold Standard of Cybersecurity


 

>> From "theCube" Studios in Palo Alto in Boston bringing you data-driven insights from "theCube" and ETR. This is "Breaking Analysis" with Dave Vellante. >> As an independent pure play company, Palo Alto Networks has earned its status as the leader in security. You can measure this in a variety of ways. Revenue, market cap, execution, ethos, and most importantly, conversations with customers generally. In CISO specifically, who consistently affirm this position. The company's on track to double its revenues in fiscal year 23 relative to fiscal year 2020. Despite macro headwinds, which are likely to carry through next year, Palo Alto owes its position to a clarity of vision and strong execution on a TAM expansion strategy through acquisitions and integration into its cloud and SaaS offerings. Hello and welcome to this week's "Wikibon Cube Insights" powered by ETR and this breaking analysis and ahead of Palo Alto Ignite the company's user conference, we bring you the next chapter on top of the last week's cybersecurity update. We're going to dig into the ETR data on Palo Alto Networks as we promised and provide a glimpse of what we're going to look for at "Ignite" and posit what Palo Alto needs to do to stay on top of the hill. Now, the challenges for cybersecurity professionals. Dead simple to understand. Solving it, not so much. This is a taxonomic eye test, if you will, from Optiv. It's one of our favorite artifacts to make the point the cybersecurity landscape is a mosaic of stovepipes. Security professionals have to work with dozens of tools many legacy combined with shiny new toys to try and keep up with the relentless pace of innovation catalyzed by the incredibly capable well-funded and motivated adversaries. Cybersecurity is an anomalous market in that the leaders have low single digit market shares. Think about that. Cisco at one point held 60% market share in the networking business and it's still deep into the 40s. Oracle captures around 30% of database market revenue. EMC and storage at its peak had more than 30% of that market. Even Dell's PC market shares, you know, in the mid 20s or even over that from a revenue standpoint. So cybersecurity from a market share standpoint is even more fragmented perhaps than the software industry. Okay, you get the point. So despite its position as the number one player Palo Alto might have maybe three maybe 4% of the total market, depending on what you use as your denominator, but just a tiny slice. So how is it that we can sit here and declare Palo Alto as the undisputed leader? Well, we probably wouldn't go that far. They probably have quite a bit of competition. But this CISO from a recent ETR round table discussion with our friend Eric Bradley, summed up Palo Alto's allure. We thought pretty well. The question was why Palo Alto Networks? Here's the answer. Because of its completeness as a platform, its ability to integrate with its own products or they acquire, integrate then rebrand them as their own. We've looked at other vendors we just didn't think they were as mature and we already had implemented some of the Palo Alto tools like the firewalls and stuff and we thought why not go holistically with the vendor a single throat to choke, if you will, if stuff goes wrong. And I think that was probably the primary driver and familiarity with the tools and the resources that they provided. Now here's another stat from ETR's Eric Bradley. He gave us a glimpse of the January survey that's in the field now. The percent of IT buyers stating that they plan to consolidate redundant vendors, it went from 34% in the October survey and now stands at 44%. So we fo we feel this bodes well for consolidators like Palo Alto networks. And the same is true from Microsoft's kind of good enough approach. It should also be true for CrowdStrike although last quarter we saw softness reported on in their SMB market, whereas interestingly MongoDB actually saw consistent strength from its SMB and its self-serve. So that's something that we're watching very closely. Now, Palo Alto Networks has held up better than most of its peers in the stock market. So let's take a look at that real quick. This chart gives you a sense of how well. It's a one year comparison of Palo Alto with the bug ETF. That's the cyber basket that we like to compare often CrowdStrike, Zscaler, and Okta. Now remember Palo Alto, they didn't run up as much as CrowdStrike, ZS and Okta during the pandemic but you can see it's now down unquote only 9% for the year. Whereas the cyber basket ETF is off 27% roughly in line with the NASDAQ. We're not showing that CrowdStrike down 44%, Zscaler down 61% and Okta off a whopping 72% in the past 12 months. Now as we've indicated, Palo Alto is making a strong case for consolidating point tools and we think it will have a much harder time getting customers to switch off of big platforms like Cisco who's another leader in network security. But based on the fragmentation in the market there's plenty of room to grow in our view. We asked breaking analysis contributor Chip Simington for his take on the technicals of the stock and he said that despite Palo Alto's leadership position it doesn't seem to make much difference these days. It's all about interest rates. And even though this name has performed better than its peers, it looks like the stock wants to keep testing its 52 week lows, but he thinks Palo Alto got oversold during the last big selloff. And the fact that the company's free cash flow is so strong probably keeps it at the one 50 level or above maybe bouncing around there for a while. If it breaks through that under to the downside it's ne next test is at that low of around one 40 level. So thanks for that, Chip. Now having get that out of the way as we said on the previous chart Palo Alto has strong opinions, it's founder and CTO, Nir Zuk, is extremely clear on that point of view. So let's take a look at how Palo Alto got to where it is today and how we think you should think about his future. The company was founded around 18 years ago as a network security company focused on what they called NextGen firewalls. Now, what Palo Alto did was different. They didn't try to stuff a bunch of functionality inside of a hardware box. Rather they layered network security functions on top of its firewalls and delivered value as a service through software running at the time in its own cloud. So pretty obvious today, but forward thinking for the time and now they've moved to a more true cloud native platform and much more activity in the public cloud. In February, 2020, right before the pandemic we reported on the divergence in market values between Palo Alto and Fort Net and we cited some challenges that Palo Alto was happening having transitioning to a cloud native model. And at the time we said we were confident that Palo Alto would make it through the knot hole. And you could see from the previous chart that it has. So the company's architectural approach was to do the heavy lifting in the cloud. And this eliminates the need for customers to deploy sensors on prem or proxies on prem or sandboxes on prem sandboxes, you know for instance are vulnerable to overwhelming attacks. Think about it, if you're a sandbox is on prem you're not going to be updating that every day. No way. You're probably not going to updated even every week or every month. And if the capacity of your sandbox is let's say 20,000 files an hour you know a hacker's just going to turn up the volume, it'll overwhelm you. They'll send a hundred thousand emails attachments into your sandbox and they'll choke you out and then they'll have the run of the house while you're trying to recover. Now the cloud doesn't completely prevent that but what it does, it definitely increases the hacker's cost. So they're going to probably hit some easier targets and that's kind of the objective of security firms. You know, increase the denominator on the ROI. All right, the next thing that Palo Alto did is start acquiring aggressively, I think we counted 17 or 18 acquisitions to expand the TAM beyond network security into endpoint CASB, PaaS security, IaaS security, container security, serverless security, incident response, SD WAN, CICD pipeline security, attack service management, supply chain security. Just recently with the acquisition of Cider Security and Palo Alto by all accounts takes the time to integrate into its cloud and SaaS platform called Prisma. Unlike many acquisitive companies in the past EMC was a really good example where you ended up with a kind of a Franken portfolio. Now all this leads us to believe that Palo Alto wants to be the consolidator and is in a good position to do so. But beyond that, as multi-cloud becomes more prevalent and more of a strategy customers tell us they want a consistent experience across clouds. And is going to be the same by the way with IoT. So of the next wave here. Customers don't want another stove pipe. So we think Palo Alto is in a good position to build what we call the security super cloud that layer above the clouds that brings a common experience for devs and operational teams. So of course the obvious question is this, can Palo Alto networks continue on this path of acquire and integrate and still maintain best of breed status? Can it? Will it? Does it even have to? As Holger Mueller of Constellation Research and I talk about all the time integrated suites seem to always beat best of breed in the long run. We'll come back to that. Now, this next graphic that we're going to show you underscores this question about portfolio. Here's a picture and I don't expect you to digest it all but it's a screen grab of Palo Alto's product and solutions portfolios, network cloud, network security rather, cloud security, Sassy, CNAP, endpoint unit 42 which is their threat intelligence platform and every imaginable security service and solution for customers. Well, maybe not every, I'm sure there's more to come like supply chain with the recent Cider acquisition and maybe more IoT beyond ZingBox and earlier acquisition but we're sure there will be more in the future both organic and inorganic. Okay, let's bring in more of the ETR survey data. For those of you who don't know ETR, they are the number one enterprise data platform surveying thousands of end customers every quarter with additional drill down surveys and customer round tables just an awesome SaaS enabled platform. And here's a view that shows net score or spending momentum on the vertical axis in provision or presence within the ETR data set on the horizontal axis. You see that red dotted line at 40%. Anything at or over that indicates a highly elevated net score. And as you can see Palo Alto is right on that line just under. And I'll give you another glimpse it looks like Palo Alto despite the macro may even just edge up a bit in the next survey based on the glimpse that Eric gave us. Now those colored bars in the bottom right corner they show the breakdown of Palo Alto's net score and underscore the methodology that ETR uses. The lime green is new customer adoptions, that's 7%. The forest green at 38% represents the percent of customers that are spending 6% or more on Palo Alto solutions. The gray is at that 40 or 8% that's flat spending plus or minus 5%. The pinkish at 5% is spending is down on Palo Alto network products by 6% or worse. And the bright red at only 2% is churn or defections. Very low single digit numbers for Palo Alto, that's a real positive. What you do is you subtract the red from the green and you get a net score of 38% which is very good for a company of Palo Alto size. And we'll note this is based on just under 400 responses in the ETR survey that are Palo Alto customers out of around 1300 in the total survey. It's a really good representation of Palo Alto. And you can see the other leading companies like CrowdStrike, Okta, Zscaler, Forte, Cisco they loom large with similar aspirations. Well maybe not so much Okta. They don't necessarily rule want to rule the world. They want to rule identity and of course the ever ubiquitous Microsoft in the upper right. Now drilling deeper into the ETR data, let's look at how Palo Alto has progressed over the last three surveys in terms of market presence in the survey. This view of the data shows provision in the data going back to October, 2021, that's the gray bars. The blue is July 22 and the yellow is the latest survey from October, 2022. Remember, the January survey is currently in the field. Now the leftmost set of data there show size a company. The middle set of data shows the industry for a select number of industries in the right most shows, geographic region. Notice anything, yes, Palo Alto up across the board relative to both this past summer and last fall. So that's pretty impressive. Palo Alto network CEO, Nikesh Aurora, stressed on the last earnings call that the company is seeing somewhat elongated deal approvals and sometimes splitting up size of deals. He's stressed that certain industries like energy, government and financial services continue to spend. But we would expect even a pullback there as companies get more conservative. But the point is that Nikesh talked about how they're hiring more sales pros to work the pipeline because they understand that they have to work harder to pull deals forward 'cause they got to get more approvals and they got to increase the volume that's coming through the pipeline to account for the possibility that certain companies are going to split up the deals, you know, large deals they want to split into to smaller bite size chunks. So they're really going hard after they go to market expansion to account for that. All right, so we're going to wrap by sharing what we expect and what we're going to probe for at Palo Alto Ignite next week, Lisa Martin and I will be hosting "theCube" and here's what we'll be looking for. First, it's a four day event at the MGM with the meat of the program on days two and three. That's day two was the big keynote. That's when we'll start our broadcasting, we're going for two days. Now our understanding is we've never done Palo Alto Ignite before but our understanding it's a pretty technically oriented crowd that's going to be eager to hear what CTO and founder Nir Zuk has to say. And as well CEO Nikesh Aurora and as in addition to longtime friend of "theCube" and current president, BJ Jenkins, he's going to be speaking. Wendy Whitmore runs Unit 42 and is going to be several other high profile Palo Alto execs, as well, Thomas Kurian from Google is a featured speaker. Lee Claridge, who is Palo Alto's, chief product officer we think is going to be giving the audience heavy doses of Prisma Cloud and Cortex enhancements. Now, Cortex, you might remember, came from an acquisition and does threat detection and attack surface management. And we're going to hear a lot about we think about security automation. So we'll be listening for how Cortex has been integrated and what kind of uptake that it's getting. We've done some, you know, modeling in from the ETR. Guys have done some modeling of cortex, you know looks like it's got a lot of upside and through the Palo Alto go to market machine, you know could really pick up momentum. That's something that we'll be probing for. Now, one of the other things that we'll be watching is pricing. We want to talk to customers about their spend optimization, their spending patterns, their vendor consolidation strategies. Look, Palo Alto is a premium offering. It charges for value. It's expensive. So we also want to understand what kind of switching costs are customers willing to absorb and how onerous they are and what's the business case look like? How are they thinking about that business case. We also want to understand and really probe on how will Palo Alto maintain best of breed as it continues to acquire and integrate to expand its TAM and appeal as that one-stop shop. You know, can it do that as we talked about before. And will it do that? There's also an interesting tension going on sort of changing subjects here in security. There's a guy named Edward Hellekey who's been in "theCube" before. He hasn't been in "theCube" in a while but he's a security pro who has educated us on the nuances of protecting data privacy, public policy, how it varies by region and how complicated it is relative to security. Because securities you technically you have to show a chain of custody that proves unequivocally, for example that data has been deleted or scrubbed or that metadata does. It doesn't include any residual private data that violates the laws, the local laws. And the tension is this, you need good data and lots of it to have good security, really the more the better. But government policy is often at odds in a major blocker to sharing data and it's getting more so. So we want to understand this tension and how companies like Palo Alto are dealing with it. Our customers testing public policy in courts we think not quite yet, our government's making exceptions and policies like GDPR that favor security over data privacy. What are the trade-offs there? And finally, one theme of this breaking analysis is what does Palo Alto have to do to stay on top? And we would sum it up with three words. Ecosystem, ecosystem, ecosystem. And we said this at CrowdStrike Falcon in September that the one concern we had was the pace of ecosystem development for CrowdStrike. Is collaboration possible with competitors? Is being adopted aggressively? Is Palo Alto being adopted aggressively by global system integrators? What's the uptake there? What about developers? Look, the hallmark of a cloud company which Palo Alto is a cloud security company is a thriving ecosystem that has entries into and exits from its platform. So we'll be looking at what that ecosystem looks like how vibrant and inclusive it is where the public clouds fit and whether Palo Alto Networks can really become the security super cloud. Okay, that's a wrap stop by next week. If you're in Vegas, say hello to "theCube" team. We have an unbelievable lineup on the program. Now if you're not there, check out our coverage on theCube.net. I want to thank Eric Bradley for sharing a glimpse on short notice of the upcoming survey from ETR and his thoughts. And as always, thanks to Chip Symington for his sharp comments. Want to thank Alex Morrison, who's on production and manages the podcast Ken Schiffman as well in our Boston studio, Kristen Martin and Cheryl Knight they help get the word out on social and of course in our newsletters, Rob Hoof, is our editor in chief over at Silicon Angle who does some awesome editing, thank you to all. Remember all these episodes they're available as podcasts. Wherever you listen, all you got to do is search "Breaking Analysis" podcasts. I publish each week on wikibon.com and silicon angle.com where you can email me at david.valante@siliconangle.com or dm me at D Valante or comment on our LinkedIn post. And please do check out etr.ai. They've got the best survey data in the enterprise tech business. This is Dave Valante for "theCube" Insights powered by ETR. Thanks for watching. We'll see you next week on "Ignite" or next time on "Breaking Analysis". (upbeat music)

Published Date : Dec 11 2022

SUMMARY :

bringing you data-driven and of course the ever

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MorrisonPERSON

0.99+

Edward HellekeyPERSON

0.99+

Eric BradleyPERSON

0.99+

Lisa MartinPERSON

0.99+

CiscoORGANIZATION

0.99+

Thomas KurianPERSON

0.99+

Dave VellantePERSON

0.99+

Lee ClaridgePERSON

0.99+

Rob HoofPERSON

0.99+

17QUANTITY

0.99+

October, 2021DATE

0.99+

Palo AltoORGANIZATION

0.99+

February, 2020DATE

0.99+

October, 2022DATE

0.99+

40QUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

Dave ValantePERSON

0.99+

Wendy WhitmorePERSON

0.99+

SeptemberDATE

0.99+

OctoberDATE

0.99+

JanuaryDATE

0.99+

ZscalerORGANIZATION

0.99+

OktaORGANIZATION

0.99+

ForteORGANIZATION

0.99+

CrowdStrikeORGANIZATION

0.99+

Chip SimingtonPERSON

0.99+

52 weekQUANTITY

0.99+

Palo AltoORGANIZATION

0.99+

Cheryl KnightPERSON

0.99+

BJ JenkinsPERSON

0.99+

DellORGANIZATION

0.99+

July 22DATE

0.99+

6%QUANTITY

0.99+

EricPERSON

0.99+

VegasLOCATION

0.99+

Palo AltoLOCATION

0.99+

two daysQUANTITY

0.99+

one yearQUANTITY

0.99+

34%QUANTITY

0.99+

Chip SymingtonPERSON

0.99+

Kristen MartinPERSON

0.99+

7%QUANTITY

0.99+

40%QUANTITY

0.99+

27%QUANTITY

0.99+

44%QUANTITY

0.99+

61%QUANTITY

0.99+

38%QUANTITY

0.99+

Palo Alto NetworksORGANIZATION

0.99+

Nir ZukPERSON

0.99+

72%QUANTITY

0.99+

5%QUANTITY

0.99+

4%QUANTITY

0.99+

next weekDATE

0.99+

Constellation ResearchORGANIZATION

0.99+

Cider SecurityORGANIZATION

0.99+

four dayQUANTITY

0.99+

fiscal year 23DATE

0.99+

8%QUANTITY

0.99+

last quarterDATE

0.99+

david.valante@siliconangle.comOTHER

0.99+

Fort NetORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

FirstQUANTITY

0.99+

Ken SchiffmanPERSON

0.99+

GDPRTITLE

0.99+

last fallDATE

0.99+

NASDAQORGANIZATION

0.99+

fiscal year 2020DATE

0.99+

threeQUANTITY

0.99+

more than 30%QUANTITY

0.99+

three wordsQUANTITY

0.99+

todayDATE

0.99+

OracleORGANIZATION

0.99+

FrankenORGANIZATION

0.99+

Nicole Johnson, Head of Social Impact & Sustainability | The Path To Sustainable IT


 

>>Hi everyone. Welcome to this special event, pure Storage, the Path to Sustainable it. I'm your host, Lisa Martin. Very pleased to be joined by Nicole Johnson, the head of Social Impact and Sustainability at Pure Storage. Nicole, welcome to the >>Cube. Thanks for having me, Lisa. >>Sustainability is such an important topic to talk about, and I understand that Pure just announced a report today about sustainability. What can you tell me what nuggets are in this report? >>Well, actually quite a few really interesting nuggets, at least for us. And I, I think probably for you and your viewers as well. So we actually commissioned about a thousand sustainability leaders across the globe to understand, you know, what are their sustainability goals, what are they working on, and what are the impacts of buying decisions, particularly around infrastructure when it comes to sustainable goals. I think one of the things that was really interesting for us was the fact that around the world we did not see a significant variation in terms of sustainability being a top priority. You've, I'm sure you've heard about the energy crisis that's happening across Europe. And so, you know, there was some thought that perhaps that might play into AMEA being a larger, you know, having sustainability goals that were more significant. But we actually did not find that we found sustainability to be really important no matter where the respondents were located. >>So, very interesting at pure sustainability is really at the heart of what we do and has been since our founding. It's interesting because we set out to make storage really simple, but it turns out really simple, is also really sustainable and the products and services that we bring to our customers have really powerful outcomes when it comes to decreasing their, their own carbon footprints. And so, you know, we often hear from customers that we've actually really helped them to significantly improve their storage performance, but also allow them to save on space power and cooling costs and, and their footprint. So really significant findings. One example of that is a company called Cengage, which is a global education technology company. They recently shared with us that they have actually been able to reduce their overall storage footprint by 80% while doubling to tripling the performance of their storage systems. So it's really critical for, for companies who are thinking about their sustainability goals, to consider the dynamic between their sustainability program and their IT teams who are making these buying decisions. >>Right? Those two teams need to be really inextricably linked these days. You talked about the fact that there was really consistency across the regions in terms of sustainability being of high priority for organizations. You had a great customer story that you shared that showed significant impact can be made there by bringing the sustainability both together with it. But I'm wondering why are we seeing that so much of the vendor selection process still isn't revolving around sustainability or it's overlooked? What are some of the things that you see despite so many people saying sustainability huge priority? >>Well, in this survey, the most commonly cited challenge was really around the fact that there was a lack of management buy-in. 40% of respondents told us this was the top roadblock. So getting, I think getting that out of the way. And then we also just heard that sustainability teams were not brought into tech purchasing processes until after it's already rolling, right? So they're not even looped in. And that, that being said, you know, we know that it has been identified as one of the key departments to supporting a company's sustainability goals. So we, we really want to ensure that these two teams are talking more to each other. When we look even closer at the data from the respondents, we see some really positive correlations. We see that 65% of respondents reported that they're on track to meet their sustainability goals, and that it, of those 65%, it is significantly engaged with reporting data for those sustainability initiatives. We saw that, that for those who did report, the sustainability is a top priority for vendor selection. They were twice as likely to be on track with their goals and their sustainability directors said that they were getting involved at the beginning of the tech purchasing program. Our process, I'm sorry, rather than towards the end. And so, you know, we know that to curb the impact of climate crisis, we really need to embrace sustainability from a cross-functional viewpoint. >>Definitely has to be cross-functional. So, so strong correlations there in the report that organizations that had closer alignment between the sustainability folks and the IT folks were farther along in their sustainability program development, execution, et cetera, those CO was correlations, were they a surprise? >>Not entirely. You know, when we look at some of the statistics that come from the, you know, places like the World Economic Forum, they say that digitization generated 4% of greenhouse gas emissions in 2020. So, and that, you know, that's now almost three years ago, digital data only accelerates and by 2025, we expect that number could be almost double. And so we know that that communication and that correlation is gonna be really important because data centers are taking up such a huge footprint of when companies are looking at their emissions. And it's, I mean, quite frankly, a really interesting opportunity for it to be a trailblazer in the sustainability journey. And, you know, perhaps people that are in IT haven't thought about how they can make an impact in this area, but there really is some incredible ways to help us work on cutting carbon emissions, both from your company's perspective and from the world's perspective, right? >>Like we are, we're all doing this because it's something that we know we have to do to drive down climate change. So I think when you, when you think about how to be a trailblazer, how to do things differently, how to differentiate your own department, it's a really interesting connection that IT and sustainability work together. I would also say, you know, I'll just note that of the respondents to the survey we were discussing, we do over half of those respondents expect to see closer alignment between the organization's IT and sustainability teams as they move forward. >>And that's really a tip the, to those organizations embracing cultural change. That's always hard to do, but for those two, for sustainability and IT to come together as part of really the overall ethos of an organization, that's huge. And it's great to see the data demonstrating that, that those, that alignment, that close alignment is really on its way to helping organizations across industries make a big impact. And wanna dig in a little bit to peers, ESG goals. What can you share with us about >>That? Absolutely. So as I mentioned, peers kind of at the beginning of our formal ESG journey, but really has been working on the, on the sustainability front for a long time. I would, I, it's funny as we're, as we're doing a lot of this work and, and kind of building our own profile around this, we're coming back to some of the things that we have done in the past that consumers weren't necessarily interested in then, but are now because the world has changed, becoming more and more invested in. So that's exciting. So we did a baseline scope one, two, and three analysis and discovered, interestingly enough that 70% of our emissions comes from use of sold products. So our customers work running our products in their data centers. So we know that we, we've made some ambitious goals around our Scope one and two emissions, which is our own office, our utilities, you know, those, they only account for 6% of our emissions. So we know that to really address the issue of climate change, we need to work on the use of sold products. So we've also made a, a really ambitious commitment to decrease our carbon emissions by 66% per bed per petabyte by 2030 in our products. So decreasing our own carbon footprint, but also affecting our customers as well. And we've also committed to a science based target initiative and our road mapping how to achieve the ambitious goals set out in the Paris agreement. >>That's fantastic. It sounds like you really dialed in on where is the biggest opportunity for us as peer storage to make the biggest impact across our organization, across our customers' organizations. There lofty goals that pure set, but knowing what I know about Pure, you guys are probably well on track to, to accomplish those goals in record time. >>I hope So. >>Talk a little bit about advice that you would give to viewers who might be at the very beginning of their sustainability journey and really wondering what are the core elements besides it, sustainability, team alignment that I need to bring into this program to make it actually successful? >>Yeah, so I think, you know, understanding that you don't have to pick between really powerful technology and sustainable technology. There are opportunities to get both and not just in storage, right in, in your entire IT port portfolio. We know that, you know, we're in a place in the world where we have to look at things from the bigger picture. We have to solve new challenges and we have to approach business a little bit differently. So adopting solutions and services that are environmentally efficient can actually help to scale and deliver more effective and efficient IT solutions over time. So I think that that's something that we need to, to really remind ourselves, right? We have to go about business a little bit differently and that's okay. We also know that data centers utilize an incredible amount of, of energy and, and carbon. And so everything that we can do to drive that down is going to address the sustainability goals for us individually as well as, again, drive down that climate change. So we, we need to get out of the mindset that data centers are, are about reliability, your cost, et cetera. And really think about efficiency and carbon footprint when you're making those business decisions. I'll also say that, you know, the earlier that we can get sustainability teams into the conversation, the more impactful your business decisions are going to be and helping you to guide sustainable decision making. >>So shifting sustainability and it left almost together really shows that the correlation between those folks getting together in the beginning with intention, the report shows and the successes that peers had, demonstrate that that's very impactful for organizations to actually be able to implement even the cultural change that's needed for sustainability programs to be successful. My last question for you goes back to that report. You mentioned in there that the data show a lot of organizations are hampered by management buy-in, where sustainability is concerned. How can pure help its customers navigate around those barriers so that they get that management buy and they understand that the value in it for >>Them? Yeah, so I mean, I think that for me, my advice is always to speak to hearts and minds, right? And help the management to understand, first of all, the impact right on climate change. So I think that's the kind of hearts piece on the mind piece. I think it's addressing the sustainability goals that these companies have set for themselves and helping management understand how to, you know, how their IT buying decisions can actually really help them to reach these goals. We also, you know, we always run kind of TCOs for customers to understand what is the actual cost of, of the equipment. And so, you know, especially if you're in a, in a location in which energy costs are rising, I mean, I think we're seeing that around the world right now with inflation. Better understanding your energy costs can really help your management to understand the, again, the bigger picture and what that total cost is gonna be. Often we see, you know, that maybe the, the person who's buying the IT equipment isn't the same person who's purchasing, who's paying the, the electricity bills, right? And so sometimes even those two teams aren't talking. And there's a great opportunity there, I think, to just to just, you know, look at it from a more high level lens to better understand what total cost of ownership is. >>That's a great point. Great advice. Nicole. Thank you so much for joining me on the program today, talking about the new report that on sustainability that Pure put out some really compelling nuggets in there, but really also some great successes that you've already achieved internally on your own ESG goals and what you're helping customers to achieve in terms of driving down their carbon footprint and emissions. We so appreciate your insights and your thoughts. >>Thank you, Lisa. It's been great speaking with you. >>Pleasure speaking with you as well. We wanna thank you so much for watching. This is Pure Storage, the path to sustainable it. I'm Lisa Martin, we'll see you next time.

Published Date : Dec 7 2022

SUMMARY :

Very pleased to be joined by Nicole Johnson, the head of Social What can you tell me what nuggets are in this report? And so, you know, there was some thought that perhaps that might play into AMEA And so, you know, we often hear from customers that What are some of the things that you see despite so many people saying sustainability And so, you know, we know that to curb the that had closer alignment between the sustainability folks and the IT folks were farther along So, and that, you know, that's now almost three years ago, digital data only you know, I'll just note that of the respondents to the And it's great to see the data demonstrating that, our Scope one and two emissions, which is our own office, our utilities, you know, those, but knowing what I know about Pure, you guys are probably well on track to, to accomplish those goals And so everything that we can do to actually be able to implement even the cultural change that's needed for sustainability programs to I think, to just to just, you know, look at it from a more high level lens to Thank you so much for joining me on the program today, This is Pure Storage, the path to sustainable

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
NicolePERSON

0.99+

Nicole JohnsonPERSON

0.99+

Lisa MartinPERSON

0.99+

EuropeLOCATION

0.99+

6%QUANTITY

0.99+

4%QUANTITY

0.99+

2020DATE

0.99+

LisaPERSON

0.99+

two teamsQUANTITY

0.99+

2025DATE

0.99+

2030DATE

0.99+

70%QUANTITY

0.99+

65%QUANTITY

0.99+

twiceQUANTITY

0.99+

CengageORGANIZATION

0.99+

twoQUANTITY

0.99+

80%QUANTITY

0.99+

bothQUANTITY

0.99+

oneQUANTITY

0.99+

PureORGANIZATION

0.99+

Pure StorageORGANIZATION

0.98+

todayDATE

0.98+

66% per bed per petabyteQUANTITY

0.96+

AMEAORGANIZATION

0.95+

two emissionsQUANTITY

0.95+

World Economic ForumORGANIZATION

0.94+

three years agoDATE

0.93+

Paris agreementTITLE

0.93+

One exampleQUANTITY

0.93+

40%QUANTITY

0.9+

pure StorageORGANIZATION

0.79+

Social ImpactORGANIZATION

0.79+

three analysisQUANTITY

0.78+

gasQUANTITY

0.76+

doubleQUANTITY

0.74+

Scope oneOTHER

0.71+

about a thousand sustainabilityQUANTITY

0.65+

our emissionsQUANTITY

0.57+

climateEVENT

0.55+

ESGTITLE

0.42+

Pure Storage The Path to Sustainable IT


 

>>In the early part of this century, we're talking about the 2005 to 2007 timeframe. There was a lot of talk about so-called green it. And at that time there was some organizational friction. Like for example, the line was that the CIO never saw the power bill, so he or she didn't care, or that the facilities folks, they rarely talked to the IT department. So it was kind of that split brain. And, and then the oh 7 0 8 financial crisis really created an inflection point in a couple of ways. First, it caused organizations to kind of pump the brakes on it spending, and then they took their eye off the sustainability ball. And the second big trend, of course, was the cloud model, you know, kind of became a benchmark for it. Simplicity and automation and efficiency, the ability to dial down and dial up capacity as needed. >>And the third was by the end of the first decade of the, the two thousands, the technology of virtualization was really hitting its best stride. And then you had innovations like flash storage, which largely eliminated the need for these massive farms of spinning mechanical devices that sucked up a lot of power. And so really these technologies began their march to mainstream adoption. And as we progressed through the 2020s, the effect of climate change really come into focus as a critical component of esg. Environmental, social, and governance. Shareholders have come to demand metrics around sustainability. Employees are often choosing employers based on their ESG posture. And most importantly, companies are finding that savings on power cooling and footprint, it has a bottom line impact on the income statement. Now you add to that the energy challenges around the world, particularly facing Europe right now, the effects of global inflation and even more advanced technologies like machine intelligence. >>And you've got a perfect storm where technology can really provide some relief to organizations. Hello and welcome to the Path to Sustainable It Made Possible by Pure Storage and Collaboration with the Cube. My name is Dave Valante and I'm one of the host of the program, along with my colleague Lisa Martin. Now, today we're gonna hear from three leaders on the sustainability topic. First up, Lisa will talk to Nicole Johnson. She's the head of Social Impact and Sustainability at Pure Storage. Nicole will talk about the results from a study of around a thousand sustainability leaders worldwide, and she'll share some metrics from that study. And then next, Lisa will speak to AJ Singh. He's the Chief Product Officer at Pure Storage. We've had had him on the cube before, and not only will he share some useful stats in the market, I'll also talk about some of the technology innovations that customers can tap to address their energy consumption, not the least of which is ai, which is is entering every aspect of our lives, including how we deal with energy consumption. And then we'll bring it back to our Boston studio and go north of Italy with Mattia Ballero of Elec Informatica, a services provider with deep expertise on the topic of sustainability. We hope you enjoyed the program today. Thanks for watching. Let's get started >>At Pure Storage, the opportunity for change and our commitment to a sustainable future are a direct reflection of the way we've always operated and the values we live by every day. We are making significant and immediate impact worldwide through our environmental sustainability efforts. The milestones of change can be seen everywhere in everything we do. Pure's Evergreen Storage architecture delivers two key environmental benefits to customers, the reduction of wasted energy and the reduction of e-waste. Additionally, Pure's implemented a series of product packaging redesigns, promoting recycled and reuse in order to reduce waste that will not only benefit our customers, but also the environment. Pure is committed to doing what is right and leading the way with innovation. That has always been the pure difference, making a difference by enabling our customers to drive out energy usage and their data storage systems by up to 80%. Today, more than 97% of pure arrays purchased six years ago are still in service. And tomorrow our goal for the future is to reduce Scope three. Emissions Pure is committing to further reducing our sold products emissions by 66% per petabyte by 2030. All of this means what we said at the beginning, change that is simple and that is what it has always been about. Pure has a vision for the future today, tomorrow, forever. >>Hi everyone, welcome to this special event, pure Storage, the Path to Sustainable it. I'm your host, Lisa Martin. Very pleased to be joined by Nicole Johnson, the head of Social Impact and Sustainability at Pure Storage. Nicole, welcome to the Cube. Thanks >>For having me, Lisa. >>Sustainability is such an important topic to talk about and I understand that Pure just announced a report today about sustainability. What can you tell me what nuggets are in this report? >>Well, actually quite a few really interesting nuggets, at least for us. And I, I think probably for you and your viewers as well. So we actually commissioned about a thousand sustainability leaders across the globe to understand, you know, what are their sustainability goals, what are they working on, and what are the impacts of buying decisions, particularly around infrastructure when it comes to sustainable goals. I think one of the things that was really interesting for us was the fact that around the world we did not see a significant variation in terms of sustainability being a top priority. You've, I'm sure you've heard about the energy crisis that's happening across Europe. And so, you know, there was some thought that perhaps that might play into AMEA being a larger, you know, having sustainability goals that were more significant. But we actually did not find that we found sustainability to be really important no matter where the respondents were located. >>So very interesting at Pure sustainability is really at the heart of what we do and has been since our founding. It's interesting because we set out to make storage really simple, but it turns out really simple is also really sustainable. And the products and services that we bring to our customers have really powerful outcomes when it comes to decreasing their, their own carbon footprints. And so, you know, we often hear from customers that we've actually really helped them to significantly improve their storage performance, but also allow them to save on space power and cooling costs and, and their footprint. So really significant findings. One example of that is a company called Cengage, which is a global education technology company. They recently shared with us that they have actually been able to reduce their overall storage footprint by 80% while doubling to tripling the performance of their storage systems. So it's really critical for, for companies who are thinking about their sustainability goals, to consider the dynamic between their sustainability program and their IT teams who are making these buying decisions, >>Right? Those two teams need to be really inextricably linked these days. You talked about the fact that there was really consistency across the regions in terms of sustainability being of high priority for organizations. You had a great customer story that you shared that showed significant impact can be made there by bringing the sustainability both together with it. But I'm wondering why are we seeing that so much of the vendor selection process still isn't revolving around sustainability or it's overlooked? What are some of the things that you received despite so many people saying sustainability, huge priority? >>Well, in this survey, the most commonly cited challenge was really around the fact that there was a lack of management buy-in. 40% of respondents told us this was the top roadblock. So getting, I think getting that out of the way. And then we also just heard that sustainability teams were not brought into tech purchasing processes until after it's already rolling, right? So they're not even looped in. And that being said, you know, we know that it has been identified as one of the key departments to supporting a company sustainability goals. So we, we really want to ensure that these two teams are talking more to each other. When we look even closer at the data from the respondents, we see some really positive correlations. We see that 65% of respondents reported that they're on track to meet their sustainability goals. And the IT of those 65%, it is significantly engaged with reporting data for those sustainability initiatives. We saw that, that for those who did report, the sustainability is a top priority for vendor selection. They were twice as likely to be on track with their goals and their sustainability directors said that they were getting involved at the beginning of the tech purchasing program. Our process, I'm sorry, rather than towards the end. And so, you know, we know that to curb the impact of climate crisis, we really need to embrace sustainability from a cross-functional viewpoint. >>Definitely has to be cross-functional. So, so strong correlations there in the report that organizations that had closer alignment between the sustainability folks and the IT folks were farther along in their sustainability program development, execution, et cetera, those co was correlations, were they a surprise? >>Not entirely. You know, when we look at some of the statistics that come from the, you know, places like the World Economic Forum, they say that digitization generated 4% of greenhouse gas emissions in 2020. So, and that, you know, that's now almost three years ago, digital data only accelerates, and by 2025, we expect that number could be almost double. And so we know that that communication and that correlation is gonna be really important because data centers are taking up such a huge footprint of when companies are looking at their emissions. And it's, I mean, quite frankly, a really interesting opportunity for it to be a trailblazer in the sustainability journey. And, you know, perhaps people that are in IT haven't thought about how they can make an impact in this area, but there really is some incredible ways to help us work on cutting carbon emissions, both from your company's perspective and from the world's perspective, right? >>Like we are, we're all doing this because it's something that we know we have to do to drive down climate change. So I think when you, when you think about how to be a trailblazer, how to do things differently, how to differentiate your own department, it's a really interesting connection that IT and sustainability work together. I would also say, you know, I'll just note that of the respondents to the survey we were discussing, we do over half of those respondents expect to see closer alignment between the organization's IT and sustainability teams as they move forward. >>And that's really a, a tip a hat to those organizations embracing cultural change. That's always hard to do, but for those two, for sustainability in IT to come together as part of really the overall ethos of an organization, that's huge. And it's great to see the data demonstrating that, that those, that alignment, that close alignment is really on its way to helping organizations across industries make a big impact. I wanna dig in a little bit to here's ESG goals. What can you share with us about >>That? Absolutely. So as I mentioned peers kind of at the beginning of our formal ESG journey, but really has been working on the, on the sustainability front for a long time. I would, it's funny as we're, as we're doing a lot of this work and, and kind of building our own profile around this, we're coming back to some of the things that we have done in the past that consumers weren't necessarily interested in then but are now because the world has changed, becoming more and more invested in. So that's exciting. So we did a baseline scope one, two, and three analysis and discovered, interestingly enough that 70% of our emissions comes from use of sold products. So our customers work running our products in their data centers. So we know that we, we've made some ambitious goals around our Scope one and two emissions, which is our own office, our utilities, you know, those, they only account for 6% of our emissions. So we know that to really address the issue of climate change, we need to work on the use of sold products. So we've also made a, a really ambitious commitment to decrease our carbon emissions by 66% per bed per petabyte by 2030 in our product. So decreasing our own carbon footprint, but also affecting our customers as well. And we've also committed to a science-based target initiative and our road mapping how to achieve the ambitious goals set out in the Paris agreement. >>That's fantastic. It sounds like you really dialed in on where is the biggest opportunity for us as Pure Storage to make the biggest impact across our organization, across our customers organizations. There lofty goals that pure set, but knowing what I know about Pure, you guys are probably well on track to, to accomplish those goals in record time, >>I hope So. >>Talk a little bit about advice that you would give to viewers who might be at the very beginning of their sustainability journey and really wondering what are the core elements besides it, sustainability, team alignment that I need to bring into this program to make it actually successful? >>Yeah, so I think, you know, understanding that you don't have to pick between really powerful technology and sustainable technology. There are opportunities to get both and not just in storage right in, in your entire IT portfolio. We know that, you know, we're in a place in the world where we have to look at things from the bigger picture. We have to solve new challenges and we have to approach business a little bit differently. So adopting solutions and services that are environmentally efficient can actually help to scale and deliver more effective and efficient IT solutions over time. So I think that that's something that we need to, to really remind ourselves, right? We have to go about business a little bit differently and that's okay. We also know that data centers utilize an incredible amount of, of energy and, and carbon. And so everything that we can do to drive that down is going to address the sustainability goals for us individually as well as, again, drive down that climate change. So we, we need to get out of the mindset that data centers are, are about reliability or cost, et cetera, and really think about efficiency and carbon footprint when you're making those business decisions. I'll also say that, you know, the earlier that we can get sustainability teams into the conversation, the more impactful your business decisions are going to be and helping you to guide sustainable decision making. >>So shifting sustainability and IT left almost together really shows that the correlation between those folks getting together in the beginning with intention, the report shows and the successes that peers had demonstrate that that's very impactful for organizations to actually be able to implement even the cultural change that's needed for sustainability programs to be successful. My last question for you goes back to that report. You mentioned in there that the data show a lot of organizations are hampered by management buy-in, where sustainability is concerned. How can pure help its customers navigate around those barriers so that they get that management buy-in and they understand that the value in it for >>Them? Yeah, so I mean, I think that for me, my advice is always to speak to hearts and minds, right? And help the management to understand, first of all, the impact right on climate change. So I think that's the kind of hearts piece on the mind piece. I think it's addressing the sustainability goals that these companies have set for themselves and helping management understand how to, you know, how their IT buying decisions can actually really help them to reach these goals. We also, you know, we always run kind of TCOs for customers to understand what is the actual cost of, of the equipment. And so, you know, especially if you're in a, in a location in which energy costs are rising, I mean, I think we're seeing that around the world right now with inflation. Better understanding your energy costs can really help your management to understand the, again, the bigger picture and what that total cost is gonna be. Often we see, you know, that maybe the I the person who's buying the IT equipment isn't the same person who's purchasing, who's paying the, the electricity bills, right? And so sometimes even those two teams aren't talking. And there's a great opportunity there, I think, to just to just, you know, look at it from a more high level lens to better understand what total cost of ownership is. >>That's a great point. Great advice. Nicole, thank you so much for joining me on the program today, talking about the new report that on sustainability that Pure put out some really compelling nuggets in there, but really also some great successes that you've already achieved internally on your own ESG goals and what you're helping customers to achieve in terms of driving down their carbon footprint and emissions. We so appreciate your insights and your thoughts. >>Thank you, Lisa. It's been great speaking with you. >>AJ Singh joins me, the Chief Product Officer at Peer Storage. Aj, it's great to have you back on the program. >>Great to be back on, Lisa, good morning. >>Good morning. And sustainability is such an important topic to talk about. So we're gonna really unpack what PEER is doing, we're gonna get your viewpoints on what you're seeing and you're gonna leave the audience with some recommendations on how they can get started on their ESG journey. First question, we've been hearing a lot from pure AJ about the role that technology plays in organizations achieving sustainability goals. What's been the biggest environmental impact associated with, with customers achieving that given the massive volumes of data that keep being generated? >>Absolutely, Lisa, you can imagine that the data is only growing and exploding and, and, and, and there's a good reason for it. You know, data is the new currency. Some people call it the new oil. And the opportunity to go process this data gain insights is really helping customers drive an edge in the digital transformation. It's gonna make a difference between them being on the leaderboard a decade from now when the digital transformation kind of pans out versus, you know, being kind of somebody that, you know, quite missed the boat. So data is super critical and and obviously as part of that we see all these big benefits, but it has to be stored and, and, and that means it's gonna consume a lot of resources and, and the, and therefore data center usage has only accelerated, right? You can imagine the amount of data being generated, you know, recent study pointed to roughly by twenty twenty five, a hundred and seventy five zetabytes, which where each zettabyte is a billion terabytes. So just think of that size and scale of data. That's huge. And, and they also say that, you know, pretty soon, today, in fact in the developed world, every person is having an interaction with the data center literally every 18 seconds. So whether it's on Facebook or Twitter or you know, your email, people are constantly interacting with data. So you can imagine this data is only exploding. It has to be stored and it consumes a lot of energy. In fact, >>It, oh, go ahead. Sorry. >>No, I was saying in fact, you know, there's some studies have shown that data center usage literally consumes one to 2% of global energy consumption. So if there's one place we could really help climate change and, and all those aspects, if you can kind of really, you know, tamp down the data center, energy consumption, sorry, you were saying, >>I was just gonna say, it's, it's an incredibly important topic and the, the, the stats on data that you provided and also I, I like how you talked about, you know, every 18 seconds we're interacting with a data center, whether we know it or not, we think about the long term implications, the fact that data is growing massively. As you shared with the stats that you mentioned. If we think about though the responsibility that companies have, every company in today's world needs to be a data company, right? And we consumers expect it. We expect that you are gonna deliver these relevant, personalized experiences whether we're doing a transaction in our personal lives or in business. But what is the, what requirements do technology companies have to really start billing down their carbon footprints? >>No, absolutely. If you can think about it, just to kind of finish up the data story a little bit, the explosion is to the point where, in fact, if you just recently was in the news that Ireland went up and said, sorry, we can't have any more data centers here. We just don't have the power to supply them. That was big in the news and you know, all the hyperscale that was crashing the head. I know they've come around that and figured out a way around it, but it's getting there. Some, some organizations and and areas jurisdictions are saying pretty much no data center the law, you know, we're, we just can't do it. And so as you said, so companies like Pure, I mean, our view is that it has an opportunity here to really do our bit for climate change and be able to, you know, drive a sustainable environment. >>And, and at Pure we believe that, you know, today's data success really ultimately hinges on energy efficiency, you know, so to to really be energy efficient means you are gonna be successful long term with data. Because if you think of classic data infrastructures, the legacy infrastructures, you know, we've got disk infrastructures, hybrid infrastructures, flash infrastructures, low end systems, medium end systems, high end systems. So a lot of silos, you know, a lot of inefficiency across the silos. Cause the data doesn't get used across that. In fact, you know, today a lot of data centers are not really built with kind of the efficiency and environmental mindset. So there's a big opportunity there. >>So aj, talk to me about some of the steps that Pure is implementing as its chief product officer. Would love to get your your thoughts, what steps is it implementing to help Pures customers become more sustainable? >>No, absolutely. So essentially we are all inherently motivated, like pure and, and, and, and everybody else to solve problems for customers and really forward the status quo, right? You know, innovation, you know, that's what we are all about. And while we are doing that, the challenge is to how do you make technology and the data we feed into it faster, smarter, scalable obviously, but more importantly sustainable. And you can do all of that, but if you miss the sustainability bit, you're kind of missing the boat. And I also feel from an ethical perspective, that's really important for us. Not only you do all the other things, but also kind of make it sustainable. In fact, today 80% of the companies, the companies are realizing this, 80% today are in fact report out on sustainability, which is great. In fact, 80% of leadership at companies, you know, CEOs and senior executives say they've been impacted by some climate change event, you know, where it's a fire in the place they had to evacuate or floods or storms or hurricanes, you, you name it, right? >>So mitigating the carbon impact can in fact today be a competitive advantage for companies because that's where the puck is going and everybody's, you know, it's skating, wanting to skate towards the, and it's good, it's good business too to be sustainable and, and, and meet these, you know, customer requirements. In fact, the the recent survey that we released today is saying that more and more organizations are kickstarting, their sustainability initiatives and many take are aiming to make a significant progress against that over the next decade. So that's, that's really, you know, part of the big, the really, so our view is that that IT infrastructure, you know, can really make a big push towards greener it and not just kind of greenwash it, but actually, you know, you know, make things more greener and, and, and really take the, the lead in, in esg. And so it's important that organizations can reach alignment with their IT teams and challenge their IT teams to continue to lead, you know, for the organization, the sustainability aspects. >>I'm curious, aj, when you're in customer conversations, are you seeing that it's really the C-suite plus it coming together and, and how does peer help facilitate that? To your point, it needs to be able to deliver this, but it's, it's a board level objective these days. >>Absolutely. We're seeing increasingly, especially in Europe with the, you know, the war in Ukraine and the energy crisis that, you know, that's, that's, you know, unleashed. We definitely see it's becoming a bigger and bigger board level objective for, for a lot of companies. And we definitely see customers in starting to do that. So, so in particular, I do want to touch briefly on what steps we are taking as a company, you know, to to to make it sustainable. And obviously customers are doing all the things we talked about and, and we're also helping them become smarter with data. But the key difference is, you know, we have a big focus on efficiency, which is really optimizing performance per wat with unmatched storage density. So you can reduce the footprint and dramatically lower the power required. And and how efficient is that? You know, compared to other old flash systems, we tend to be one fifth, we tend to take one fifth the power compared to other flash systems and substantially lower compared to spinning this. >>So you can imagine, you know, cutting your, if data center consumption is a 2% of global consumption, roughly 40% of that tends to be storage cause of all the spinning disc. So you add about, you know, 0.8% to global consumption and if you can cut that by four fifths, you know, you can already start to make an impact. So, so we feel we can do that. And also we're quite a bit more denser, 10 times more denser. So imagine one fifth the power, one 10th the density, but then we take it a step further because okay, you've got the storage system in the data center, but what about the end of life aspect? What about the waste and reclamation? So we also have something called non-disruptive upgrades. We, using our AI technology in pure one, we can start to sense when a particular part is going to fail and just before it goes to failure, we actually replace it in a non-disruptive fashion. So customer's data is not impacted and then we recycle that so you get a full end to end life cycle, you know, from all the way from the time you deploy much lower power, much lower density, but then also at the back end, you know, reduction in e-waste and those kind of things. >>That's a great point you, that you bring up in terms of the reclamation process. It sounds like Pure does that on its own, the customer doesn't have to be involved in that. >>That's right. And we do that, it's a part of our evergreen, you know, service that we offer. A lot of customers sign up for that service and in fact they don't even, we tell them, Hey, you know, that part's about to go, we're gonna come in, we're gonna swap it out and, and then we actually recycle that part, >>The power of ai. Love that. What are some of the, the things that companies can do if they're, if they're early in this journey on sustainability, what are some of the specific steps companies can take to get started and maybe accelerate that journey as it's becoming climate change and things are becoming just more and more of a, of a daily topic on the news? >>No, absolutely. There's a lot of things companies can do. In fact, the four four item that we're gonna highlight, the first one is, you know, they can just start by doing a materiality assessment and a materiality assessment essentially engages all the stakeholders to find out which specific issues are important for the business, right? So you identify your key priorities that intersect with what the stakeholders want, you know, your different groups from sales, customers, partners, you know, different departments in the organization. And for example, for us, when we conducted our materiality assessment, for us, our product we felt was the biggest area of focus that could contribute a lot towards, you know, making an impact in, in, in from a sustainability standpoint. That's number one. I think number two companies can also think about taking an Azure service approach. The beauty of the Azure service approach is that you are buying a, your customer, they're buying outcomes with SLAs and, and when you are starting to buy outcomes with SLAs, you can start small and then grow as you consume more. >>So that way you don't have systems sitting idle waiting for you to consume more, right? And that's the beauty of the as service approach. And so for example, for us, you know, we have something called Evergreen one, which is our as service offer, where essentially customers are able to only use and have systems turned onto as much as they're consuming. So, so that reduces the waste associated with underutilized systems, right? That's number two. Number three is also you can optimize your supply chains end to end, right? Basically by making sure you're moving, recycling, packaging and eliminating waste in that thing so you can recycle it back to your suppliers. And you can also choose a sustainable supplier network that following sort of good practices, you know, you know, across the globe and such supply chains that are responsive and diverse can really help you. Also, the big business benefit benefited. >>You can also handle surges and demand, for example, for us during the pandemic with this global supply chain shortages, you know, whereas most of our competitors, you know, lead times went to 40, 50 weeks, our lead times went from three to six weeks cuz you know, we had this sustainable, you know, supply chain. And so all of these things, you know, the three things important, but the fourth thing I say more cultural and, and the cultural thing is how do you actually begin to have sustainability become a core part of your ethos at the company, you know, across all the departments, you know, and we've at Pure, definitely it's big for us, you know, you know, around sustainability starting with a product design, but all of the areas as well, if you follow those four items, they'll do the great place to start. >>That's great advice, great recommendations. You talk about the, the, the supply chain, sustainable supply chain optimization. We've been having a lot of conversations with businesses and vendors alike about that and how important it is. You bring up a great point too on supplier diversity, if we could have a whole conversation on that. Yes. But I'm also glad that you brought up culture that's huge to, for organizations to adopt an ESG strategy and really drive sustainability in their business. It has to become, to your point, part of their ethos. Yes. It's challenging. Cultural change management is challenging. Although I think with climate change and the things that are so public, it's, it's more on, on the top mindset folks. But it's a great point that the organization really as a whole needs to embrace the sustainability mindset so that it as a, as an organization lives and breathes that. Yes. And last question for you is advice. So you, you outlined the Four Steps organizations can take. I look how you made that quite simple. What advice would you give organizations who are on that journey to adopting those, those actions, as you said, as they look to really build and deploy and execute an ESG strategy? >>No, absolutely. And so obviously, you know, the advice is gonna come from, you know, a company like Pure, you know, our background kind of being a supplier of products. And so, you know, our advice is for companies that have products, usually they tend to be the biggest generator, the products that you sell to your, your customers, especially if they've got hardware components in it. But, you know, the biggest generator of e-waste and, and and, and, and, and kind of from a sustainability standpoint. So it's really important to have an intentional design approach towards your products with sustainability in mind. So it's not something that's, that you can handle at the very back end. You design it front in the product and so that sustainable design becomes very intentional. So for us, for example, doing these non-disruptive upgrades had to be designed up front so that, you know, a, you know, one of our repair person could go into a customer shop and be able to pull out a card and put in a new card without any change in the customer system. >>That non-receptive approach, it has to be designed into the hardware software systems to be able to pull that on. And that intentional design enables you to recover pieces just when they're about to fail and then putting them through a recovery, you know, waste recovery process. So that, that's kind of the one thing I would say that philosophy, again, it comes down to if that is, you know, seeping into the culture, into your core ethos, you will start to do, you know, you know, that type of work. So, so I mean it's important thing, you know, look, this year, you know, with the spike in energy prices, you know, you know, gas prices going up, it's super important that all of us, you know, do our bit in there and start to drive products that are fundamentally sustainable, not just at the initial, you know, install point, but from an end to end full life cycle standpoint. >>Absolutely. And I love that you brought up intention that is everything that peers doing is with, with such thought and intention and really for organizations and any industry to become more sustainable, to develop an ESG strategy. To your point, it all needs to start with intention. And of course that that cultural adoption, aj, it's been so great to have you on the program talking about what PEER is doing to help organizations really navigate that path to sustainable it. We appreciate your insights on your time. >>Thank you, Lisa. Pleasure being on board >>At Pure Storage. The opportunity for change and our commitment to a sustainable future are a direct reflection of the way we've always operated and the values we live by every day. We are making significant and immediate impact worldwide through our environmental sustainability efforts. The milestones of change can be seen everywhere in everything we do. Pures Evergreen storage architecture delivers two key environmental benefits to customers, the reduction of wasted energy and the reduction of e-waste. Additionally, pures implemented a series of product packaging redesigns, promoting recycle and reuse in order to reduce waste that will not only benefit our customers, but also the environment. Pure is committed to doing what is right and leading the way with innovation. That has always been the pure difference, making a difference by enabling our customers to drive out energy usage and their data storage systems by up to 80% today, more than 97% of Pure Array purchased six years ago are still in service. And tomorrow our goal for the future is to reduce Scope three emissions Pure is committing to further reducing our sold products emissions by 66% per petabyte by 2030. All of this means what we said at the beginning, change that is simple and that is what it has always been about. Pure has a vision for the future today, tomorrow, forever. >>We're back talking about the path to sustainable it and now we're gonna get the perspective from Mattia Valerio, who is with Elec Informatica and IT services firm and the beautiful Lombardi region of Italy north of Milano. Mattia, welcome to the Cube. Thanks so much for coming on. >>Thank you very much, Dave. Thank you. >>All right, before we jump in, tell us a little bit more about Elec Informatica. What's your focus, talk about your unique value add to customers. >>Yeah, so basically Alma Informatica is middle company from the north part of Italy and is managed service provider in the IT area. Okay. So the, the main focus area of Al Meca is reach digital transformation innovation to our clients with focus on infrastructure services, workplace services, and also cybersecurity services. Okay. And we try to follow the path of our clients to the digital transformation and the innovation through technology and sustainability. >>Yeah. Obviously very hot topics right now. Sustainability, environmental impact, they're growing areas of focus among leaders across all industries. A particularly acute right now in, in Europe with the, you know, the energy challenges you've talked about things like sustainable business. What does that mean? What does that term Yeah. You know, speak to and, and what can others learn from it? >>Yeah. At at, at our approach to sustainability is grounded in science and, and values and also in customer territory, but also employee centered. I mean, we conduct regular assessments to understand the most significant environment and social issues for our business with, with the goal of prioritizing what we do for a sustainability future. Our service delivery methodology, employee care relationship with the local supplier and local area and institution are a major factor for us to, to build a such a responsibility strategy. Specifically during the past year, we have been particularly focused on define sustainability governance in the company based on stakeholder engagement, defining material issues, establishing quantitative indicators to monitor and setting medium to long-term goals. >>Okay, so you have a lot of data. You can go into a customer, you can do an assessment, you can set a baseline, and then you have other data by which you can compare that and, and understand what's achievable. So what's your vision for sustainable business? You know, that strategy, you know, how has it affected your business in terms of the evolution? Cuz this wasn't, hasn't always been as hot a topic as it is today. And and is it a competitive advantage for you? >>Yeah, yeah. For, for, for all intense and proposed sustainability is a competitive advantage for elec. I mean, it's so, because at the time of profound transformation in the work, in the world of work, CSR issues make a company more attractive when searching for new talent to enter in the workforce of our company. In addition, efforts to ensure people's proper work life balance are a strong retention factor. And regarding our business proposition, ELEX attempts is to meet high standard of sustainability and reliability. Our green data center, you said is a prime example of this approach as at the same time, is there a conditioning activity that is done to give a second life to technology devices that come from back from rental? I mean, our customer inquiries with respect to sustainability are increasingly frequent and in depth and which is why we monitor our performance and invest in certification such as EcoVadis or ISO 14,001. Okay, >>Got it. So in a previous life I actually did some work with, with, with power companies and there were two big factors in it that affected the power consumption. Obviously virtualization was a big one, if you could consolidate servers, you know, that was huge. But the other was the advent of flash storage and that was, we used to actually go in with the, the engineers and the power company put in alligator clips to measure of, of, of an all flash array versus, you know, the spinning disc and it was a big impact. So you, I wanna talk about your, your experience with Pure Storage. You use Flash Array and the Evergreen architecture. Can you talk about what your experience there, why did you make that decision to select Pure Storage? How does that help you meet sustainability and operational requirements? Do those benefits scale as your customers grow? What's your experience been? >>Yeah, it was basically an easy and easy answer to our, to our business needs. Okay. Because you said before that in Elec we, we manage a lot of data, okay? And in the past we, we, we see it, we see that the constraints of managing so many, many data was very, very difficult to manage in terms of power consumption or simply for the, the space of storing the data. And when, when Pure came to us and share our products, their vision to the data management journey for Element Informatica, it was very easy to choose pure why with values and numbers. We, we create a business case and we said that we, we see that our power consumption usage was much less, more than 90% of previous technology that we used in the past. Okay. And so of course you have to manage a grade oil deploy of flash technology storage, but it was a good target. >>So we have tried to monitoring the adoption of flash technology and monitor monitoring also the power consumption and the efficiency that the pure technology bring to our, to our IT systems and of course the IT systems of our clients. And so this is one, the first part, the first good part of our trip with, with Pure. And after that we approach also the sustainability in long term of choosing pure technology storage. You mentioned the Evergreen models of Pure, and of course this was, again, challenge for us because it allows, it allow us to extend the life cycle management of our data centers, but also the, IT allows us to improve the facility of the facilities of using technology from our technical side. Okay. So we are much more efficient than in the past with the choose of Pure storage technologies. Okay. Of course, this easy users, easy usage mode, let me say it, allow us to bring this value to our, to all our clients that put their data in our data centers. >>So you talked about how you've seen a 90% improvement relative to previous technologies. I always, I haven't put you in the spot. Yeah, because I, I, I was on Pure's website and I saw in their ESG report some com, you know, it was a comparison with a generic competitor presuming that competitor was not, you know, a 2010 spinning disc system. But, but, so I'm curious as to the results that you're seeing with Pure in terms of footprint and power usage. You, you're referencing some of that. We heard some metrics from Nicole and AJ earlier in the program. Do you think, again, I'm gonna put you in the spot, do you think that Pure's architecture and the way they've applied, whether it's machine intelligence or the Evergreen model, et cetera, is more competitive than other platforms that you've seen? >>Yeah, of course. Is more competitor improve competitive because basically it allows to service provider to do much more efficient value proposition and offer services that are more, that brings more values to, to the customers. Okay. So the customer is always at the center of a proposition of a service provider and trying to adopt the methodology and also the, the value that pure as inside by design in the technology is, is for us very, very, very important and very, very strategic because, because with like a glass, we can, our self transfer try to transfer the values of pure, pure technologies to our service provider client. >>Okay. Matta, let's wrap and talk about sort of near term 2023 and then longer term it looks like sustainability is a topic that's here to stay. Unlike when we were putting alligator clips on storage arrays, trying to help customers get rebates that just didn't have legs. It was too complicated. Now it's a, a topic that everybody's measuring. What's next for elec in its sustainability journey? What advice would you might have? Sustainability leaders that wanna make a meaningful impact on the environment, but also on the bottom line. >>Okay, so sustainability is fortunately a widely spread concept. And our role in, in this great game is to define a strategy, align with the common and fundamentals goals for the future of planet and capable of expressing our inclination and the, and the particularities and accessibility goals in the near future. I, I say, I can say that are will be basically free one define sustainability plan. Okay? It's fundamentals to define a sustainability plan. Then it's very important to monitor the its emissions and we will calculate our carbon footprint. Okay? And least button list produces certifiable and comprehensive sustainability report with respect to the demands of customers, suppliers, and also partners. Okay. So I can say that this three target will be our direction in the, in the future. Okay. >>Yeah. So I mean, pretty straightforward. Make a plan. You gotta monitor and measure, you can't improve what you can't measure. So you gonna set a baseline, you're gonna report on that. Yep. You're gonna analyze the data and you're gonna make continuous improvement. >>Yep. >>Matea, thanks so much for joining us today in sharing your perspectives from the, the northern part of Italy. Really appreciate it. >>Yeah, thank you for having aboard. Thank you very >>Much. It was really our pleasure. Okay, in a moment, I'm gonna be back to wrap up the program and share some resources that could be valuable in your sustainability journey. Keep it right there. >>Sustainability is becoming increasingly important and is hitting more RFPs than ever before as a critical decision point for customers. Environmental benefits are not the only impetus. Rather bottom line cost savings are proving that sustainability actually means better business. You can make a strong business case around sustainability and you should, many more organizations are setting mid and long-term goals for sustainability and putting forth published metrics for shareholders and customers. Whereas early green IT initiatives at the beginning of this century, were met with skepticism and somewhat disappointing results. Today, vendor r and d is driving innovation in system design, semiconductor advancements, automation in machine intelligence that's really beginning to show tangible results. Thankfully. Now remember, all these videos are available on demand@thecube.net. So check them out at your convenience and don't forget to go to silicon angle.com for all the enterprise tech news of the day. You also want to check out pure storage.com. >>There are a ton of resources there. As an aside, pure is the only company I can recall to allow you to access resources like a Gartner Magic Quadrant without forcing you to fill out a lead gen form. So thank you for that. Pure storage, I love that. There's no squeeze page on that. No friction. It's kind of on brand there for pure well done. But to the topic today, sustainability, there's some really good information on the site around esg, Pure's Environmental, social and Governance mission. So there's more in there than just sustainability. You'll see some transparent statistics on things like gender and ethnic diversity, and of course you'll see that Pure has some work to do there. But kudos for publishing those stats transparently and setting goals so we can track your progress. And there's plenty on the sustainability topic as well, including some competitive benchmarks, which are interesting to look at and may give you some other things to think about. We hope you've enjoyed the path to Sustainable it made possible by Pure Storage produced with the Cube, your leader in enterprise and emerging tech, tech coverage.

Published Date : Dec 5 2022

SUMMARY :

trend, of course, was the cloud model, you know, kind of became a benchmark for it. And then you had innovations like flash storage, which largely eliminated the We hope you enjoyed the program today. At Pure Storage, the opportunity for change and our commitment to a sustainable future Very pleased to be joined by Nicole Johnson, the head of Social What can you tell me what nuggets are in this report? And so, you know, there was some thought that perhaps that might play into AMEA And so, you know, we often hear from customers that What are some of the things that you received despite so many people saying sustainability, And so, you know, we know that to curb the that had closer alignment between the sustainability folks and the IT folks were farther along So, and that, you know, that's now almost three years ago, digital data the respondents to the survey we were discussing, we do And it's great to see the data demonstrating our Scope one and two emissions, which is our own office, our utilities, you know, those, It sounds like you really dialed in on where is the biggest decisions are going to be and helping you to guide sustainable decision My last question for you goes back to that report. And so, you know, especially if you're in a, in a location Nicole, thank you so much for joining me on the program today, it's great to have you back on the program. pure AJ about the role that technology plays in organizations achieving sustainability it's on Facebook or Twitter or you know, your email, people are constantly interacting with you know, tamp down the data center, energy consumption, sorry, you were saying, We expect that you are gonna deliver these relevant, the explosion is to the point where, in fact, if you just recently was in the news that Ireland went So a lot of silos, you know, a lot of inefficiency across the silos. So aj, talk to me about some of the steps that Pure is implementing as its chief product officer. In fact, 80% of leadership at companies, you know, CEOs and senior executives say they've teams and challenge their IT teams to continue to lead, you know, To your point, it needs to be able to deliver this, but it's, it's a board level objective We're seeing increasingly, especially in Europe with the, you know, the war in Ukraine and the the back end, you know, reduction in e-waste and those kind of things. that on its own, the customer doesn't have to be involved in that. they don't even, we tell them, Hey, you know, that part's about to go, we're gonna come in, we're gonna swap it out and, companies can take to get started and maybe accelerate that journey as it's becoming climate the biggest area of focus that could contribute a lot towards, you know, making an impact in, So that way you don't have systems sitting idle waiting for you to consume more, and the cultural thing is how do you actually begin to have sustainability become But I'm also glad that you brought up culture that's And so obviously, you know, the advice is gonna come from, you know, it comes down to if that is, you know, seeping into the culture, into your core ethos, it's been so great to have you on the program talking about what PEER is doing to help organizations really are a direct reflection of the way we've always operated and the values we live by every We're back talking about the path to sustainable it and now we're gonna get the perspective from All right, before we jump in, tell us a little bit more about Elec Informatica. in the IT area. right now in, in Europe with the, you know, the energy challenges you've talked about things sustainability governance in the company based on stakeholder engagement, You know, that strategy, you know, how has it affected your business in terms of the evolution? Our green data center, you of, of, of an all flash array versus, you know, the spinning disc and it was a big impact. And so of course you have to manage a grade oil deploy of the facilities of using technology from our that competitor was not, you know, a 2010 spinning disc system. So the customer is always at the center of a proposition What advice would you might have? monitor the its emissions and we will calculate our So you gonna set a baseline, you're gonna report on that. the northern part of Italy. Yeah, thank you for having aboard. Okay, in a moment, I'm gonna be back to wrap up the program and share some resources case around sustainability and you should, many more organizations are setting mid can recall to allow you to access resources like a Gartner Magic Quadrant without forcing

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
NicolePERSON

0.99+

Lisa MartinPERSON

0.99+

Nicole JohnsonPERSON

0.99+

Dave ValantePERSON

0.99+

Mattia BalleroPERSON

0.99+

Elec InformaticaORGANIZATION

0.99+

MattiaPERSON

0.99+

AJ SinghPERSON

0.99+

AJ SinghPERSON

0.99+

40QUANTITY

0.99+

Mattia ValerioPERSON

0.99+

EuropeLOCATION

0.99+

DavePERSON

0.99+

LisaPERSON

0.99+

0.8%QUANTITY

0.99+

Al MecaORGANIZATION

0.99+

2020DATE

0.99+

threeQUANTITY

0.99+

90%QUANTITY

0.99+

Alma InformaticaORGANIZATION

0.99+

10 timesQUANTITY

0.99+

2005DATE

0.99+

6%QUANTITY

0.99+

2010DATE

0.99+

4%QUANTITY

0.99+

firstQUANTITY

0.99+

2030DATE

0.99+

2%QUANTITY

0.99+

70%QUANTITY

0.99+

ELEXORGANIZATION

0.99+

2025DATE

0.99+

80%QUANTITY

0.99+

Pure StorageORGANIZATION

0.99+

BostonLOCATION

0.99+

twiceQUANTITY

0.99+

two teamsQUANTITY

0.99+

65%QUANTITY

0.99+

LombardiLOCATION

0.99+

tomorrowDATE

0.99+

secondQUANTITY

0.99+

MateaPERSON

0.99+

PureORGANIZATION

0.99+

2007DATE

0.99+

demand@thecube.netOTHER

0.99+

CengageORGANIZATION

0.99+

First questionQUANTITY

0.99+

AJPERSON

0.99+

Element InformaticaORGANIZATION

0.99+

todayDATE

0.99+

first partQUANTITY

0.99+

six weeksQUANTITY

0.99+

more than 97%QUANTITY

0.99+

oneQUANTITY

0.99+

FirstQUANTITY

0.99+

thirdQUANTITY

0.99+

TodayDATE

0.99+

twenty twenty fiveQUANTITY

0.99+

2020sDATE

0.99+

twoQUANTITY

0.99+

two thousandsQUANTITY

0.99+

six years agoDATE

0.99+

bothQUANTITY

0.99+

Breaking Analysis: Cyber Firms Revert to the Mean


 

(upbeat music) >> From theCube Studios in Palo Alto in Boston, bringing you data driven insights from theCube and ETR. This is Breaking Analysis with Dave Vellante. >> While by no means a safe haven, the cybersecurity sector has outpaced the broader tech market by a meaningful margin, that is up until very recently. Cybersecurity remains the number one technology priority for the C-suite, but as we've previously reported the CISO's budget has constraints just like other technology investments. Recent trends show that economic headwinds have elongated sales cycles, pushed deals into future quarters, and just like other tech initiatives, are pacing cybersecurity investments and breaking them into smaller chunks. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis we explain how cybersecurity trends are reverting to the mean and tracking more closely with other technology investments. We'll make a couple of valuation comparisons to show the magnitude of the challenge and which cyber firms are feeling the heat, which aren't. There are some exceptions. We'll then show the latest survey data from ETR to quantify the contraction in spending momentum and close with a glimpse of the landscape of emerging cybersecurity companies, the private companies that could be ripe for acquisition, consolidation, or disruptive to the broader market. First, let's take a look at the recent patterns for cyber stocks relative to the broader tech market as a benchmark, as an indicator. Here's a year to date comparison of the bug ETF, which comprises a basket of cyber security names, and we compare that with the tech heavy NASDAQ composite. Notice that on April 13th of this year the cyber ETF was actually in positive territory while the NAS was down nearly 14%. Now by August 16th, the green turned red for cyber stocks but they still meaningfully outpaced the broader tech market by more than 950 basis points as of December 2nd that Delta had contracted. As you can see, the cyber ETF is now down nearly 25%, year to date, while the NASDAQ is down 27% and change. Now take a look at just how far a few of the high profile cybersecurity names have fallen. Here are six security firms that we've been tracking closely since before the pandemic. We've been, you know, tracking dozens but let's just take a look at this data and the subset. We show for comparison the S&P 500 and the NASDAQ, again, just for reference, they're both up since right before the pandemic. They're up relative to right before the pandemic, and then during the pandemic the S&P shot up more than 40%, relative to its pre pandemic level, around February is what we're using for the pre pandemic level, and the NASDAQ peaked at around 65% higher than that February level. They're now down 85% and 71% of their previous. So they're at 85% and 71% respectively from their pandemic highs. You compare that to these six companies, Splunk, which was and still is working through a transition is well below its pre pandemic market value and 44, it's 44% of its pre pandemic high as of last Friday. Palo Alto Networks is the most interesting here, in that it had been facing challenges prior to the pandemic related to a pivot to the Cloud which we reported on at the time. But as we said at that time we believe the company would sort out its Cloud transition, and its go to market challenges, and sales compensation issues, which it did as you can see. And its valuation jumped from 24 billion prior to Covid to 56 billion, and it's holding 93% of its peak value. Its revenue run rate is now over 6 billion with a healthy growth rate of 24% expected for the next quarter. Similarly, Fortinet has done relatively well holding 71% of its peak Covid value, with a healthy 34% revenue guide for the coming quarter. Now, Okta has been the biggest disappointment, a darling of the pandemic Okta's communication snafu, with what was actually a pretty benign hack combined with difficulty absorbing its 7 billion off zero acquisition, knocked the company off track. Its valuation has dropped by 35 billion since its peak during the pandemic, and that's after a nice beat and bounce back quarter just announced by Okta. Now, in our view Okta remains a viable long-term leader in identity. However, its recent fiscal 24 revenue guide was exceedingly conservative at around 16% growth. So either the company is sandbagging, or has such poor visibility that it wants to be like super cautious or maybe it's actually seeing a dramatic slowdown in its business momentum. After all, this is a company that not long ago was putting up 50% plus revenue growth rates. So it's one that bears close watching. CrowdStrike is another big name that we've been talking about on Breaking Analysis for quite some time. It like Okta has led the industry in a key ETR performance indicator that measures customer spending momentum. Just last week, CrowdStrike announced revenue increased more than 50% but new ARR was soft and the company guided conservatively. Not surprisingly, the stock got absolutely crushed as CrowdStrike blamed tepid demand from smaller and midsize firms. Many analysts believe that competition from Microsoft was one factor along with cautious spending amongst those midsize and smaller customers. Notably, large customers remain active. So we'll see if this is a longer term trend or an anomaly. Zscaler is another company in the space that we've reported having great customer spending momentum from the ETR data. But even though the company beat expectations for its recent quarter, like other companies its Outlook was conservative. So other than Palo Alto, and to a lesser extent Fortinet, these companies and others that we're not showing here are feeling the economic pinch and it shows in the compression of value. CrowdStrike, for example, had a 70 billion valuation at one point during the pandemic Zscaler top 50 billion, Okta 45 billion. Now, having said that Palo Alto Networks, Fortinet, CrowdStrike, and Zscaler are all still trading well above their pre pandemic levels that we tracked back in February of 2020. All right, let's go now back to ETR'S January survey and take a look at how much things have changed since the beginning of the year. Remember, this is obviously pre Ukraine, and pre all the concerns about the economic headwinds but here's an X Y graph that shows a net score, or spending momentum on the y-axis, and market presence on the x-axis. The red dotted line at 40% on the vertical indicates a highly elevated net score. Anything above that we think is, you know, super elevated. Now, we filtered the data here to show only those companies with more than 50 responses in the ETR survey. Still really crowded. Note that there were around 20 companies above that red 40% mark, which is a very, you know, high number. It's a, it's a crowded market, but lots of companies with, you know, positive momentum. Now let's jump ahead to the most recent October survey and take a look at what, what's happening. Same graphic plotting, spending momentum, and market presence, and look at the number of companies above that red line and how it's been squashed. It's really compressing, it's still a crowded market, it's still, you know, plenty of green, but the number of companies above 40% that, that key mark has gone from around 20 firms down to about five or six. And it speaks to that compression and IT spending, and of course the elongated sales cycles pushing deals out, taking them in smaller chunks. I can't tell you how many conversations with customers I had, at last week at Reinvent underscoring this exact same trend. The buyers are getting pressure from their CFOs to slow things down, do more with less and, and, and prioritize projects to those that absolutely are critical to driving revenue or cutting costs. And that's rippling through all sectors, including cyber. Now, let's do a bit more playing around with the ETR data and take a look at those companies with more than a hundred citations in the survey this quarter. So N, greater than or equal to a hundred. Now remember the followers of Breaking Analysis know that each quarter we take a look at those, what we call four star security firms. That is, those are the, that are in, that hit the top 10 for both spending momentum, net score, and the N, the mentions in the survey, the presence, the pervasiveness in the survey, and that's what we show here. The left most chart is sorted by spending momentum or net score, and the right hand chart by shared N, or the number of mentions in the survey, that pervasiveness metric. that solid red line denotes the cutoff point at the top 10. And you'll note we've actually cut it off at 11 to account for Auth 0, which is now part of Okta, and is going through a go to market transition, you know, with the company, they're kind of restructuring sales so they can take advantage of that. So starting on the left with spending momentum, again, net score, Microsoft leads all vendors, typical Microsoft, very prominent, although it hadn't always done so, it, for a while, CrowdStrike and Okta were, were taking the top spot, now it's Microsoft. CrowdStrike, still always near the top, but note that CyberArk and Cloudflare have cracked the top five in Okta, which as I just said was consistently at the top, has dropped well off its previous highs. You'll notice that Palo Alto Network Palo Alto Networks with a 38% net score, just below that magic 40% number, is healthy, especially as you look over to the right hand chart. Take a look at Palo Alto with an N of 395. It is the largest of the independent pure play security firms, and has a very healthy net score, although one caution is that net score has dropped considerably since the beginning of the year, which is the case for most of the top 10 names. The only exception is Fortinet, they're the only ones that saw an increase since January in spending momentum as ETR measures it. Now this brings us to the four star security firms, that is those that hit the top 10 in both net score on the left hand side and market presence on the right hand side. So it's Microsoft, Palo Alto, CrowdStrike, Okta, still there even not accounting for a Auth 0, just Okta on its own. If you put in Auth 0, it's, it's even stronger. Adding then in Fortinet and Zscaler. So Microsoft, Palo Alto, CrowdStrike, Okta, Fortinet, and Zscaler. And as we've mentioned since January, only Fortinet has shown an increase in net score since, since that time, again, since the January survey. Now again, this talks to the compression in spending. Now one of the big themes we hear constantly in cybersecurity is the market is overcrowded. Everybody talks about that, me included. The implication there, is there's a lot of room for consolidation and that consolidation can come in the form of M&A, or it can come in the form of people consolidating onto a single platform, and retiring some other vendors, and getting rid of duplicate vendors. We're hearing that as a big theme as well. Now, as we saw in the previous, previous chart, this is a very crowded market and we've seen lots of consolidation in 2022, in the form of M&A. Literally hundreds of M&A deals, with some of the largest companies going private. SailPoint, KnowBe4, Barracuda, Mandiant, Fedora, these are multi billion dollar acquisitions, or at least billion dollars and up, and many of them multi-billion, for these companies, and hundreds more acquisitions in the cyberspace, now less you think the pond is overfished, here's a chart from ETR of emerging tech companies in the cyber security industry. This data comes from ETR's Emerging Technologies Survey, ETS, which is this diamond in a rough that I found a couple quarters ago, and it's ripe with companies that are candidates for M&A. Many would've liked, many of these companies would've liked to, gotten to the public markets during the pandemic, but they, you know, couldn't get there. They weren't ready. So the graph, you know, similar to the previous one, but different, it shows net sentiment on the vertical axis and that's a measurement of, of, of intent to adopt against a mind share on the X axis, which measures, measures the awareness of the vendor in the community. So this is specifically a survey that ETR goes out and, and, and fields only to track those emerging tech companies that are private companies. Now, some of the standouts in Mindshare, are OneTrust, BeyondTrust, Tanium and Endpoint, Net Scope, which we've talked about in previous Breaking Analysis. 1Password, which has been acquisitive on its own. In identity, the managed security service provider, Arctic Wolf Network, a company we've also covered, we've had their CEO on. We've talked about MSSPs as a real trend, particularly in small and medium sized business, we'll come back to that, Sneek, you know, kind of high flyer in both app security and containers, and you can just see the number of companies in the space this huge and it just keeps growing. Now, just to make it a bit easier on the eyes we filtered the data on these companies with with those, and isolated on those with more than a hundred responses only within the survey. And that's what we show here. Some of the names that we just mentioned are a bit easier to see, but these are the ones that really stand out in ERT, ETS, survey of private companies, OneTrust, BeyondTrust, Taniam, Netscope, which is in Cloud, 1Password, Arctic Wolf, Sneek, BitSight, SecurityScorecard, HackerOne, Code42, and Exabeam, and Sim. All of these hit the ETS survey with more than a hundred responses by, by the IT practitioners. Okay, so these firms, you know, maybe they do some M&A on their own. We've seen that with Sneek, as I said, with 1Password has been inquisitive, as have others. Now these companies with the larger footprint, these private companies, will likely be candidate for both buying companies and eventually going public when the markets settle down a bit. So again, no shortage of players to affect consolidation, both buyers and sellers. Okay, so let's finish with some key questions that we're watching. CrowdStrike in particular on its earnings calls cited softness from smaller buyers. Is that because these smaller buyers have stopped adopting? If so, are they more at risk, or are they tactically moving toward the easy button, aka, Microsoft's good enough approach. What does that mean for the market if smaller company cohorts continue to soften? How about MSSPs? Will companies continue to outsource, or pause on on that, as well as try to free up, to try to free up some budget? Adam Celiski at Reinvent last week said, "If you want to save money the Cloud's the best place to do it." Is the cloud the best place to save money in cyber? Well, it would seem that way from the standpoint of controlling budgets with lots of, lots of optionality. You could dial up and dial down services, you know, or does the Cloud add another layer of complexity that has to be understood and managed by Devs, for example? Now, consolidation should favor the likes of Palo Alto and CrowdStrike, cause they're platform players, and some of the larger players as well, like Cisco, how about IBM and of course Microsoft. Will that happen? And how will economic uncertainty impact the risk equation, a particular concern is increase of tax on vulnerable sectors of the population, like the elderly. How will companies and governments protect them from scams? And finally, how many cybersecurity companies can actually remain independent in the slingshot economy? In so many ways the market is still strong, it's just that expectations got ahead of themselves, and now as earnings forecast come, come, come down and come down to earth, it's going to basically come down to who can execute, generate cash, and keep enough runway to get through the knothole. And the one certainty is nobody really knows how tight that knothole really is. All right, let's call it a wrap. Next week we dive deeper into Palo Alto Networks, and take a look at how and why that company has held up so well and what to expect at Ignite, Palo Alto's big user conference coming up later this month in Las Vegas. We'll be there with theCube. Okay, many thanks to Alex Myerson on production and manages the podcast, Ken Schiffman as well, as our newest edition to our Boston studio. Great to have you Ken. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our EIC over at Silicon Angle. He does some great editing for us. Thank you to all. Remember these episodes are all available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibond.com and siliconangle.com, or you can email me directly David.vellante@siliconangle.com or DM me @DVellante, or comment on our LinkedIn posts. Please do checkout etr.ai, they got the best survey data in the enterprise tech business. This is Dave Vellante for theCube Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis. (upbeat music)

Published Date : Dec 5 2022

SUMMARY :

with Dave Vellante. and of course the elongated

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

MicrosoftORGANIZATION

0.99+

Dave VellantePERSON

0.99+

December 2ndDATE

0.99+

OktaORGANIZATION

0.99+

DeltaORGANIZATION

0.99+

Ken SchiffmanPERSON

0.99+

ZscalerORGANIZATION

0.99+

FortinetORGANIZATION

0.99+

Cheryl KnightPERSON

0.99+

Adam CeliskiPERSON

0.99+

CrowdStrikeORGANIZATION

0.99+

CiscoORGANIZATION

0.99+

August 16thDATE

0.99+

April 13thDATE

0.99+

Rob HofPERSON

0.99+

NASDAQORGANIZATION

0.99+

IBMORGANIZATION

0.99+

93%QUANTITY

0.99+

Kristin MartinPERSON

0.99+

Palo AltoLOCATION

0.99+

Arctic Wolf NetworkORGANIZATION

0.99+

38%QUANTITY

0.99+

40%QUANTITY

0.99+

71%QUANTITY

0.99+

JanuaryDATE

0.99+

Palo AltoORGANIZATION

0.99+

Palo Alto NetworksORGANIZATION

0.99+

50%QUANTITY

0.99+

February of 2020DATE

0.99+

Las VegasLOCATION

0.99+

7 billionQUANTITY

0.99+

six companiesQUANTITY

0.99+

SplunkORGANIZATION

0.99+

2022DATE

0.99+

BarracudaORGANIZATION

0.99+

34%QUANTITY

0.99+

24%QUANTITY

0.99+

FebruaryDATE

0.99+

last weekDATE

0.99+

last FridayDATE

0.99+

SailPointORGANIZATION

0.99+

FirstQUANTITY

0.99+

more than 50%QUANTITY

0.99+

85%QUANTITY

0.99+

each weekQUANTITY

0.99+

44%QUANTITY

0.99+

35 billionQUANTITY

0.99+

70 billionQUANTITY

0.99+

KenPERSON

0.99+

KnowBe4ORGANIZATION

0.99+

27%QUANTITY

0.99+

56 billionQUANTITY

0.99+

NetscopeORGANIZATION

0.99+

OctoberDATE

0.99+

Next weekDATE

0.99+

one factorQUANTITY

0.99+

bothQUANTITY

0.99+

hundredsQUANTITY

0.99+

44QUANTITY

0.99+

dozensQUANTITY

0.99+

BeyondTrustORGANIZATION

0.99+

David.vellante@siliconangle.comOTHER

0.99+

24 billionQUANTITY

0.99+

Keynote Analysis with theCUBE | AWS re:Invent 2022


 

(bright music) >> Hello, everyone. Welcome back to live coverage day two or day one, day two for theCUBE, day one for the event. I'm John Furrier, host of theCUBE. It's the keynote analysis segment. Adam just finished coming off stage. I'm here with Dave Vellante and Zeus Kerravala, with principal analyst at ZK Research, Zeus, it's great to see you. Dave. Guys, the analysis is clear. AWS is going NextGen. You guys had a multi-day analyst sessions in on the pre-briefs. We heard the keynote, it's out there. Adam's getting his sea legs, so to speak, a lot of metaphors around ocean. >> Yeah. >> Space. He's got these thematic exploration as he chunked his keynote out into sections. Zeus, a lot of networking in there in terms of some of the price performance, specialized instances around compute, this end-to-end data services. Dave, you were all over this data aspect going into the keynote and obviously, we had visibility into this business transformation theme. What's your analysis? Zeus, we'll start with you. What's your take on what Amazon web service is doing this year and the keynote? What's your analysis? >> Well, I think, there was a few key themes here. The first one is I do think we're seeing better integration across the AWS portfolio. Historically, AWS makes a lot of stuff and it's not always been easy to use say, Aurora and Redshift together, although most customers buy them together. So, they announce the integration of that. It's a lot tighter now. It's almost like it could be one product, but I know they like to keep the product development separately. Also, I think, we're seeing a real legitimization of AWS in a bunch of areas where people said it wasn't possible before. Last year, Nasdaq said they're running in the cloud. The Options Exchange today announced that they're going to be moving to the cloud. Contact centers running the cloud for a lot of real time voice. And so, things that we looked at before and said those will never move to the cloud have now moved to the cloud. And I think, my third takeaway is just AWS is changing and they're now getting into areas to allow customers to do things they couldn't do before. So, if you look at what they're doing in the area of AI, a lot of their AI and ML services before were prediction. And I'm not saying you need an AI, ML to do prediction, was certainly a lot more accurate, but now they're getting into generative data. So, being able to create data where data didn't exist before and that's a whole new use case for 'em. So, AWS, I think, is actually for all the might and power they've had, it's actually stepping up and becoming a much different company now. >> Yeah, I had wrote that post. I had a one-on-one day, got used of the transcript with Adam Selipsky. He went down that route of hey, we going to change NextGen. Oh, that's my word. AWS Classic my word. The AWS Classic, the old school cloud, which a bunch of Lego blocks, and you got this new NextGen cloud with the ecosystems emerging. So, clearly, it's Amazon shifting. >> Yeah. >> But Dave, your breaking analysis teed out the keynote. You went into the whole cost recovery. We heard Adam talk about macro at the beginning of his keynote. He talked about economic impact, sustainability, big macro issues. >> Yeah. >> And then, he went into data and spent most of the time on the keynote on data. Tools, integration, governance, insights. You're all over that. You had that, almost your breaking analysis almost matched the keynote, >> Yeah. >> thematically, macro, cost savings right-sizing with the cloud. And last night, I was talking to some of the marketplace people, we think that the marketplace might be the center where people start managing their cost better. This could have an impact on the ecosystem if they're not in in the marketplace. So, again, so much is going on. >> What's your analogy? >> Yeah, there's so much to unpack, a couple things. One is we get so much insight from theCUBE community plus your sit down 101 with Adam Selipsky allowed us to gather some nuggets, and really, I think, predict pretty accurately. But the number one question I get, if I could hit the escape key a bit, is what's going to be different in the Adam Selipsky era that was different from the Jassy era. Jassy was all about the primitives. The best cloud. And Selipsky's got to double down on that. So, he's got to keep that going. Plus, he's got to do that end-to-end integration and he's got to do the deeper business integration, up the stack, if you will. And so, when you're thinking about the keynote and the spirit of keynote analysis, we definitely heard, hey, more primitives, more database features, more Graviton, the network stuff, the HPC, Graviton for HPC. So, okay, check on that. We heard some better end-to-end integration between the elimination of ETL between Aurora and Redshift. Zeus and I were sitting next to each other. Okay, it's about time. >> Yeah. >> Okay, finally we got that. So, that's good. Check. And then, they called it this thing, the Amazon data zones, which was basically extending Redshift data sharing within your organization. So, you can now do that. Now, I don't know if it works across regions. >> Well, they mentioned APIs and they have the data zone. >> Yep. And so, I don't know if it works across regions, but the interesting thing there is he specifically mentioned integration with Snowflake and Tableau. And so, that gets me to your point, at the end of the day, in order for Amazon, and this is why they win, to succeed, they've got to have this ecosystem really cranking. And that's something that is just the secret sauce of the business model. >> Yeah. And it's their integration into that ecosystem. I think, it's an interesting trend that I've seen for customers where everybody wanted best of breed, everybody wanted disaggregated, and their customers are having trouble now putting those building blocks together. And then, nobody created more building blocks than AWS. And so, I think, under Adam, what we're seeing is much more concerted effort to make it easier for customers to consume those building blocks in an easy way. And the AWS execs >> Yeah. >> I talked to yesterday all committed to that. It's easy, easy, easy. And I think that's why. (Dave laughing) Yeah, there's no question they've had a lead in cloud for a long time. But if they're going to keep that, that needs to be upfront. >> Well, you're close to this, how easy is it? >> Yeah. >> But we're going to have Adrian Cockcroft (Dave laughing) on at the end of the day today, go into one analysis. Now, that- >> Well, less difficult. >> How's that? (indistinct) (group laughing) >> There you go. >> Adrian retired from Amazon. He's a CUBE analyst retiree, but he had a good point. You can buy the bag of Lego blocks if you want primitives >> Yeah. >> or you can buy the toy that's glued together. And it works, but it breaks. And you can't really manage it, and you buy a new one. So, his metaphor was, okay, if the primitives allow you to construct a durable solutions, a lot harder relative to rolling your own, not like that, but also the simplest out-of-the box capability is what people want. They want solutions. We call Adam the solutions CEO. So, I think, you're going to start to see this purpose built specialized services allow the ecosystem to build those toys, so that the customers can have an out-of-the box experience while having the option for the AWS Classic, which is if you want durability, you want to tune it, you want to manage it, that's the way to go for the hardcore. Now, can be foundational, but I just see the solutions things being very much like an out-of-the-box. Okay, throw away, >> Yeah. >> buy a new toy. >> More and more, I'm saying less customers want to be that hardcore assembler of building blocks. And obviously, the really big companies do, but that line is moving >> Yeah. >> and more companies, I think, just want to run their business and they want those prebuilt solutions. >> We had to cut out of the keynote early. But I didn't hear a lot about... The example that they often use is Amazon Connect, the call center solution. >> Yeah. >> I didn't hear a lot to that in the keynote. Maybe it's happening right now, but look, at the end of the day, suites always win. The best of breed does well, (John laughing) takes off, generate a couple billion, Snowflake will grow, they'll get to 10 billion. But you look at Oracle, suites work. (laughs) >> Yeah. >> What I found interesting about the keynote is that he had this thematic exploration themes. First one was space that was like connect the dot, the nebula, different (mumbles) lens, >> Ocean. >> ask the right questions. (Dave laughing) >> Ocean was security which bears more, >> Yeah. >> a lot more needed to manage that oxygen going deep. Are you snorkeling? Are you scuba diving? Barely interesting amount of work. >> In Antarctica. >> Antarctica was the performance around how you handle tough conditions and you've got to get that performance. >> Dave: We're laughing, but it was good. >> But the day, the Ocean Day- >> Those are very poetic. >> I tweeted you, Dave, (Dave laughing) because I sit on theCUBE in 2011. I hate hail. (Dave laughing) It's the worst term ever. It's the day the ocean's more dynamic. It's a lot more flowing. Maybe 10 years too soon, Dave. But he announces the ocean theme and then says we have a Security Lake. So, like lake, ocean, little fun on words- >> I actually think the Security Lake is pretty meaningful, because we were listening to talk, coming over here talking about it, where I think, if you look at a lot of the existing solutions, security solutions there, I describe 'em as a collection of data ponds that you can view through one map, but they're not really connected. And the amount of data that AWS holds now, arguably more than any other company, if they're not going to provide the Security Lake, who is? >> Well, but staying >> Yeah. >> on security for a second. To me, the big difference between Azure and Amazon is the ecosystem. So, CrowdStrike, Okta, Zscaler, name it, CyberArk, Rapid7, they're all part of this ecosystem. Whereas Microsoft competes with all of those guys. >> Yes. Yeah. >> So it's a lot more white space than the Amazon ecosystem. >> Well, I want to get you guys to take on, so in your reaction, because I think, my vision of what what's happening here is that I think that whole data portion's going to be data as code. And I think, the ecosystem harvests the data play. If you look at AWS' key announcements here, Security Lake, price performance, they're going to optimize for those kinds of services. Look at security, okay, Security Lake, GuardDuty, EKS, that's a Docker. Docker has security problems. They're going inside the container and looking at threat detection inside containers with Kubernetes as the runtime. That's a little nuance point, but that's pretty significant, Dave. And they're now getting into, we're talking in the weeds on the security piece, adding that to their large scale security footprint. Security is going to be one of those things where if you're not on the inside of their security play, you're probably going to be on the outside. And of course, the price performance is going to be the killer. The networking piece surprise me. Their continuing to innovate on the network. What does that mean for Cisco? So many questions. >> We had Ajay Patel on yesterday for VMware. He's an awesome middleware guy. And I was asking about serverless and architectures. And he said, "Look, basically, serverless' great for stateless, but if you want to run state, you got to have control over the run time." But the point he made was that people used to think of running containers with straight VMs versus Fargate or Knative, if you choose, or serverless. They used to think of those as different architectures. And his point was they're all coming together. And it's now you're architecting and calling, which service you need. And that's how people are thinking about future architectures, which I think, makes a lot of sense. >> If you are running managed Kubernetes, which everyone's doing, 'cause no one's really building it in-house themselves. >> No. >> They're running it as managed service, skills gaps and a variety of other reasons. This EKS protection is very interesting. They're managing inside and outside the container, which means that gives 'em visibility on both sides, under the hood and inside the application layer. So, very nuanced point, Zeus. What's your reaction to this? And obviously, the networking piece, I'd love to get your thought. >> Well, security, obviously, it's becoming a... It's less about signatures and more of an analytics. And so, things happen inside the container and outside the container. And so, their ability to look on both sides of that allows you to happen threats in time, but then also predict threats that could happen when you spin the container up. And the difficulty with the containers is they are ephemeral. It's not like a VM where it's a persistent workload that you can do analysis on. You need to know what's going on with the container almost before it spins up. >> Yeah. >> And that's a much different task. So, I do think the amount of work they're doing with the containers gives them that entry into that and I think, it's a good offering for them. On the network side, they provide a lot of basic connectivity. I do think there's a role still for the Ciscos and the Aristas and companies like that to provide a layer of enhanced network services that connects multicloud. 'Cause AWS is never going to do that. But they've certainly, they're as legitimate network vendor as there is today. >> We had NetApp on yesterday. They were talking about latency in their- >> I'll tell you this, the analyst session, Steven Armstrong said, "You are going to hear us talk about multicloud." Yes. We're not going to necessarily lead with it. >> Without a mention. >> Yeah. >> But you said it before, never say never with Amazon. >> Yeah. >> We talk about supercloud and you're like, Dave, ultimately, the cloud guys are going to get into supercloud. They have to. >> Look, they will do multicloud. I predict that they will do multicloud. I'll tell you why. Just like in networking- >> Well, customers are asking for it. >> Well, one, they have the, not by design, but by defaulter and multiple clouds are in their environment. They got to deal with that. I think, the supercloud and sky cloud visions, there will be common services. Remember networking back in the old days when Cisco broke in as a startup. There was no real shortest path, first thinking. Policy came in after you connected all the routers together. So, right now, it's going to be best of breed, low latency, high performance. But I think, there's going to be a need in the future saying, hey, I want to run my compute on the slower lower cost compute. They already got segmentation by their announcements today. So, I think, you're going to see policy-based AI coming in where developers can look at common services across clouds and saying, I want to lock in an SLA on latency and compute services. It won't be super fast compared to say, on AWS, with the next Graviton 10 or whatever comes out. >> Yeah. >> So, I think, you're going to start to see that come in. >> Actually, I'm glad you brought Graviton up too, because the work they're doing in Silicon, actually I think, is... 'Cause I think, the one thing AWS now understands is some things are best optimized in Silicon, some at software layers, some in cloud. And they're doing work on all those layers. And Graviton to me is- >> John: Is a home run. >> Yeah. >> Well- >> Dave, they've got more instances, it's going to be... They already have Gravitons that's slower than the other versions. So, what they going to do, sunset them? >> They don't deprecate anything ever. So, (John laughing) Amazon paid $350 million. People believe that it's a number for Annapurna, which is like one of the best acquisitions in history. (group laughing) And it's given them, it's put them on an arm curve for Silicon that is blowing away Intel. Intel's finally going to get Sapphire Rapids out in January. Meanwhile, Amazon just keeps spinning out new Gravitons and Trainiums. >> Yeah. >> And so, they are on a price performance curve. And like you say, no developer ever wants to run on slower hardware, ever. >> Today, if there's a common need for multicloud, they might say, hey, I got the trade off latency and performance on common services if that's what gets me there. >> Sure. >> If there's maybe a business case to do that. >> Well, that's what they're- >> Which by the way, I want to.... Selipsky had strong quote I thought was, "If you're looking to tighten your belt, the cloud is the place >> Yeah. >> to do it." I thought >> I tweeted that. >> that was very strong. >> Yeah. >> Yeah. >> And I think, he's right. And then, the other point I want to make on that is, I think, I don't have any data on this, but I believe believe just based on some of the discussions I've had that most of Amazon's revenue is on demand. Paid by the drink. Those on demand customers are at risk, 'cause they can go somewhere else. So, they're trying to get you into optimized pricing, whether it's reserved instances or one year or three-year subscriptions. And so, they're working really hard at doing that. >> My prediction on that is that's a great point you brought up. My prediction is that the cost belt tightening is going to come in the marketplace, is going to be a major factor as companies want to get their belts tighten. How they going to do that, Dave? They're going to go in the marketplace saying, hey, I already overpaid a three-year commitment. Can I get some cohesively in there? Can I get some of this or that and the other thing? >> Yep. >> You're going to start to see the vendors and the ecosystem. If they're not in the marketplace, that's where I think, the customers will go. There are other choices to either cut their supplier base or renegotiate. I think, it's going to happen in the marketplace. Let's watch. I think, we're going to watch that grow. >> I actually think the optimization services that AWS has to help customers lower spend is a secret sauce for them that they... Customers tell me all the time, AWS comes in, they'll bring their costs down and they wind up spending more with them. >> Dave: Yeah. >> And the other cloud providers don't do that. And that has been almost a silver bullet for them to get customers to stay with them. >> Okay. And this is always the way. You drop the price of storage, you drop the price of memory, you drop the price of compute, people buy more. And in the question, long term is okay. And does AWS get commoditized? Is that where they're going? Or do they continue to thrive up the stack? John, you're always asking people about the bumper sticker. >> Hold on. (John drowns out Dave) Before we get the bumper sticker, I want to get into what we missed, what they missed on the keynote. >> Yeah, there are some blind spots. >> I think- >> That's good call. >> Let's go around the horn and think what did they miss? I'll start, I think, they missed the developer productivity angle. Supply chain software was not talked about at all. We see that at all the other conferences. I thought that could have been weaved in. >> Dave: You mean security in the supply chain? >> Just overall developer productivity has been one of the most constant themes I've seen at events. Who are building the apps? Who are the builders? What are they actually doing? Maybe Werner will bring that up on his last day, but I didn't hear Adam talk about it all, developer productivity. What's your take in this? >> Yeah, I think, on the security side, they announced security data lake. I think, the other cloud providers do a better job of providing insights on how they do security. With AWS, it's almost a black hole. And I know there's a careful line they walk between what they do, what their partners do. But I do think they could be a little clearer on how they operate, much like Azure and GCP. They announce a lot of stuff on how their operations works and things like that. >> I think, platform across cloud is definitely a blind spot for these guys. >> Yeah. >> I think, look at- >> But none of the cloud providers have embraced that, right? >> It's true. >> Yeah. >> Maybe Google a little bit >> Yeah. >> and Microsoft a little bit. Certainly, AWS hasn't at this point in time, but I think, they perceive the likes of Mongo and Snowflake and Databricks, and others as ISVs and they're not. They're platform players that are building across clouds. They're leveraging, they're building superclouds. So, I think that's an opportunity for the ecosystem. And very curious to see how Amazon plays there down the stream. So, John, what do you think is the bumper sticker? We're only in day one and a half here. What do you think so far the bumper sticker is for re:Invent 2022? >> Well, to me, the day one is about infrastructure performance with the whole what's in the data center? What's at the chip level? Today was about data, specialized services, and security. I think that was the key theme here. And then, that's going to sequence into how they're going to reorganize their ecosystem. They have a new leader, Ruba Borno, who's going to be leading the charge. They've integrated all their bespoke fragmented partner network pieces into one leadership. That's going to be really important to hear that. And then, finally, Werner for developers and event-based services, micro services. What that world's going on, because that's where the developers are. And ultimately, they build the app. So, you got infrastructure, data, specialized services, and security. Machine learning with Swami is going to be huge. And again, how do developers code it all up is going to be key. And is it the bag of Legos or the glued toy? (Dave chuckles) So, what do you want? Out-of-the-box or you want to build your own? >> And that's the bottom line is connecting those dots. All they got to be is good enough. I think, Zeus, to your point, >> Yep. >> if they're just good enough, less complicated, the will keep people on the base. >> Yeah. I think, the bumper stickers, the more you buy, the more you're saving. (John laughing) Because from an operational perspective, they are trying to bring down the complexity level. And with their optimization services and the way their credit model works, I do think they're trending down that path. >> And my bumper sticker's ecosystem, ecosystem, ecosystem. This company has 100,000 partners and that is a business model secret weapon. >> All right, there it is. The keynote announced. More analysis coming up. We're going to have the leader of (indistinct) coming up next, here on to break down their perspective, you got theCUBE's analyst perspective here. Thanks for watching. Day two, more live coverage for the next two more days, so stay with us. I'm John Furrier with Dave Vellante and Zeus Kerravala here on theCUBE. Be right back. (bright music)

Published Date : Nov 29 2022

SUMMARY :

in on the pre-briefs. going into the keynote is actually for all the The AWS Classic, the old school cloud, at the beginning of his keynote. and spent most of the time This could have an impact on the ecosystem and the spirit of keynote analysis, And then, they called it this and they have the data zone. And so, that gets me to your And the AWS execs But if they're going to keep on at the end of the day You can buy the bag of Lego blocks allow the ecosystem to build those toys, And obviously, the and more companies, I think, the call center solution. but look, at the end of about the keynote ask the right questions. a lot more needed to around how you handle tough conditions But he announces the ocean theme And the amount of data that AWS holds now, and Amazon is the ecosystem. space than the Amazon ecosystem. And of course, the price performance But the point he made If you are running managed Kubernetes, And obviously, the networking piece, And the difficulty and the Aristas and companies like that We had NetApp on yesterday. the analyst session, But you said it before, the cloud guys are going I predict that they will do on the slower lower cost compute. to start to see that come in. And Graviton to me is- that's slower than the other versions. Intel's finally going to get And like you say, got the trade off latency business case to do that. the cloud is the place to do it." on some of the discussions I've had and the other thing? I think, it's going to happen Customers tell me all the time, And the other cloud And in the question, long term is okay. I want to get into what we missed, We see that at all the other conferences. Who are building the apps? on the security side, I think, platform across is the bumper sticker? And is it the bag of Legos And that's the bottom line on the base. stickers, the more you buy, and that is a business for the next two more

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Adrian CockcroftPERSON

0.99+

Steven ArmstrongPERSON

0.99+

AdamPERSON

0.99+

AWSORGANIZATION

0.99+

Dave VellantePERSON

0.99+

DavePERSON

0.99+

AdrianPERSON

0.99+

AmazonORGANIZATION

0.99+

Adam SelipskyPERSON

0.99+

JohnPERSON

0.99+

CiscoORGANIZATION

0.99+

Ruba BornoPERSON

0.99+

2011DATE

0.99+

John FurrierPERSON

0.99+

one yearQUANTITY

0.99+

AWS'ORGANIZATION

0.99+

ZK ResearchORGANIZATION

0.99+

three-yearQUANTITY

0.99+

AntarcticaLOCATION

0.99+

MicrosoftORGANIZATION

0.99+

Last yearDATE

0.99+

10 billionQUANTITY

0.99+

Zeus KerravalaPERSON

0.99+

JanuaryDATE

0.99+

Ajay PatelPERSON

0.99+

NasdaqORGANIZATION

0.99+

$350 millionQUANTITY

0.99+

CiscosORGANIZATION

0.99+

100,000 partnersQUANTITY

0.99+

yesterdayDATE

0.99+

GoogleORGANIZATION

0.99+

SelipskyPERSON

0.99+

Zeus KerravalaPERSON

0.99+

Options ExchangeORGANIZATION

0.99+

AristasORGANIZATION

0.99+

DatabricksORGANIZATION

0.99+

MongoORGANIZATION

0.99+

TodayDATE

0.99+

todayDATE

0.99+

Breaking Analysis: re:Invent 2022 marks the next chapter in data & cloud


 

from the cube studios in Palo Alto in Boston bringing you data-driven insights from the cube and ETR this is breaking analysis with Dave vellante the ascendancy of AWS under the leadership of Andy jassy was marked by a tsunami of data and corresponding cloud services to leverage that data now those Services they mainly came in the form of Primitives I.E basic building blocks that were used by developers to create more sophisticated capabilities AWS in the 2020s being led by CEO Adam solipski will be marked by four high-level Trends in our opinion one A Rush of data that will dwarf anything we've previously seen two a doubling or even tripling down on the basic elements of cloud compute storage database security Etc three a greater emphasis on end-to-end integration of AWS services to simplify and accelerate customer adoption of cloud and four significantly deeper business integration of cloud Beyond it as an underlying element of organizational operations hello and welcome to this week's wikibon Cube insights powered by ETR in this breaking analysis we extract and analyze nuggets from John furrier's annual sit-down with the CEO of AWS we'll share data from ETR and other sources to set the context for the market and competition in cloud and we'll give you our glimpse of what to expect at re invent in 2022. now before we get into the core of our analysis Alibaba has announced earnings they always announced after the big three you know a month later and we've updated our Q3 slash November hyperscale Computing forecast for the year as seen here and we're going to spend a lot of time on this as most of you have seen the bulk of it already but suffice to say alibaba's cloud business is hitting that same macro Trend that we're seeing across the board but a more substantial slowdown than we expected and more substantial than its peers they're facing China headwinds they've been restructuring its Cloud business and it's led to significantly slower growth uh in in the you know low double digits as opposed to where we had it at 15 this puts our year-end estimates for 2022 Revenue at 161 billion still a healthy 34 growth with AWS surpassing 80 billion in 2022 Revenue now on a related note one of the big themes in Cloud that we've been reporting on is how customers are optimizing their Cloud spend it's a technique that they use and when the economy looks a little shaky and here's a graphic that we pulled from aws's website which shows the various pricing plans at a high level as you know they're much more granular than that and more sophisticated but Simplicity we'll just keep it here basically there are four levels first one here is on demand I.E pay by the drink now we're going to jump down to what we've labeled as number two spot instances that's like the right place at the right time I can use that extra capacity in the moment the third is reserved instances or RIS where I pay up front to get a discount and the fourth is sort of optimized savings plans where customers commit to a one or three year term and for a better price now you'll notice we labeled the choices in a different order than AWS presented them on its website and that's because we believe that the order that we chose is the natural progression for customers this started on demand they maybe experiment with spot instances they move to reserve instances when the cloud bill becomes too onerous and if you're large enough you lock in for one or three years okay the interesting thing is the order in which AWS presents them we believe that on-demand accounts for the majority of AWS customer spending now if you think about it those on-demand customers they're also at risk customers yeah sure there's some switching costs like egress and learning curve but many customers they have multiple clouds and they've got experience and so they're kind of already up to a learning curve and if you're not married to AWS with a longer term commitment there's less friction to switch now AWS here presents the most attractive plan from a financial perspective second after on demand and it's also the plan that makes the greatest commitment from a lock-in standpoint now In fairness to AWS it's also true that there is a trend towards subscription-based pricing and we have some data on that this chart is from an ETR drill down survey the end is 300. pay attention to the bars on the right the left side is sort of busy but the pink is subscription and you can see the trend upward the light blue is consumption based or on demand based pricing and you can see there's a steady Trend toward subscription now we'll dig into this in a later episode of Breaking analysis but we'll share with you a little some tidbits with the data that ETR provides you can select which segment is and pass or you can go up the stack Etc but so when you choose is and paths 44 of customers either prefer or are required to use on-demand pricing whereas around 40 percent of customers say they either prefer or are required to use subscription pricing again that's for is so now the further mu you move up the stack the more prominent subscription pricing becomes often with sixty percent or more for the software-based offerings that require or prefer subscription and interestingly cyber security tracks along with software at around 60 percent that that prefer subscription it's likely because as with software you're not shutting down your cyber protection on demand all right let's get into the expectations for reinvent and we're going to start with an observation in data in this 2018 book seeing digital author David michella made the point that whereas most companies apply data on the periphery of their business kind of as an add-on function successful data companies like Google and Amazon and Facebook have placed data at the core of their operations they've operationalized data and they apply machine intelligence to that foundational element why is this the fact is it's not easy to do what the internet Giants have done very very sophisticated engineering and and and cultural discipline and this brings us to reinvent 2022 in the future of cloud machine learning and AI will increasingly be infused into applications we believe the data stack and the application stack are coming together as organizations build data apps and data products data expertise is moving from the domain of Highly specialized individuals to Everyday business people and we are just at the cusp of this trend this will in our view be a massive theme of not only re invent 22 but of cloud in the 2020s the vision of data mesh We Believe jamachtagani's principles will be realized in this decade now what we'd like to do now is share with you a glimpse of the thinking of Adam solipsky from his sit down with John Furrier each year John has a one-on-one conversation with the CEO of AWS AWS he's been doing this for years and the outcome is a better understanding of the directional thinking of the leader of the number one Cloud platform so we're now going to share some direct quotes I'm going to run through them with some commentary and then bring in some ETR data to analyze the market implications here we go this is from solipsky quote I.T in general and data are moving from departments into becoming intrinsic parts of how businesses function okay we're talking here about deeper business integration let's go on to the next one quote in time we'll stop talking about people who have the word analyst we inserted data he meant data data analyst in their title rather will have hundreds of millions of people who analyze data as part of their day-to-day job most of whom will not have the word analyst anywhere in their title we're talking about graphic designers and pizza shop owners and product managers and data scientists as well he threw that in I'm going to come back to that very interesting so he's talking about here about democratizing data operationalizing data next quote customers need to be able to take an end-to-end integrated view of their entire data Journey from ingestion to storage to harmonizing the data to being able to query it doing business Intelligence and human-based Analysis and being able to collaborate and share data and we've been putting together we being Amazon together a broad Suite of tools from database to analytics to business intelligence to help customers with that and this last statement it's true Amazon has a lot of tools and you know they're beginning to become more and more integrated but again under jassy there was not a lot of emphasis on that end-to-end integrated view we believe it's clear from these statements that solipsky's customer interactions are leading him to underscore that the time has come for this capability okay continuing quote if you have data in one place you shouldn't have to move it every time you want to analyze that data couldn't agree more it would be much better if you could leave that data in place avoid all the ETL which has become a nasty three-letter word more and more we're building capabilities where you can query that data in place end quote okay this we see a lot in the marketplace Oracle with mySQL Heatwave the entire Trend toward converge database snowflake [ __ ] extending their platforms into transaction and analytics respectively and so forth a lot of the partners are are doing things as well in that vein let's go into the next quote the other phenomenon is infusing machine learning into all those capabilities yes the comments from the michelleographic come into play here infusing Ai and machine intelligence everywhere next one quote it's not a data Cloud it's not a separate Cloud it's a series of broad but integrated capabilities to help you manage the end-to-end life cycle of your data there you go we AWS are the cloud we're going to come back to that in a moment as well next set of comments around data very interesting here quote data governance is a huge issue really what customers need is to find the right balance of their organization between access to data and control and if you provide too much access then you're nervous that your data is going to end up in places that it shouldn't shouldn't be viewed by people who shouldn't be viewing it and you feel like you lack security around that data and by the way what happens then is people overreact and they lock it down so that almost nobody can see it it's those handcuffs there's data and asset are reliability we've talked about that for years okay very well put by solipsky but this is a gap in our in our view within AWS today and we're we're hoping that they close it at reinvent it's not easy to share data in a safe way within AWS today outside of your organization so we're going to look for that at re invent 2022. now all this leads to the following statement by solipsky quote data clean room is a really interesting area and I think there's a lot of different Industries in which clean rooms are applicable I think that clean rooms are an interesting way of enabling multiple parties to share and collaborate on the data while completely respecting each party's rights and their privacy mandate okay again this is a gap currently within AWS today in our view and we know snowflake is well down this path and databricks with Delta sharing is also on this curve so AWS has to address this and demonstrate this end-to-end data integration and the ability to safely share data in our view now let's bring in some ETR spending data to put some context around these comments with reference points in the form of AWS itself and its competitors and partners here's a chart from ETR that shows Net score or spending momentum on the x-axis an overlap or pervasiveness in the survey um sorry let me go back up the net scores on the y-axis and overlap or pervasiveness in the survey is on the x-axis so spending momentum by pervasiveness okay or should have share within the data set the table that's inserted there with the Reds and the greens that informs us to how the dots are positioned so it's Net score and then the shared ends are how the plots are determined now we've filtered the data on the three big data segments analytics database and machine learning slash Ai and we've only selected one company with fewer than 100 ends in the survey and that's databricks you'll see why in a moment the red dotted line indicates highly elevated customer spend at 40 percent now as usual snowflake outperforms all players on the y-axis with a Net score of 63 percent off the charts all three big U.S cloud players are above that line with Microsoft and AWS dominating the x-axis so very impressive that they have such spending momentum and they're so large and you see a number of other emerging data players like rafana and datadog mongodbs there in the mix and then more established players data players like Splunk and Tableau now you got Cisco who's gonna you know it's a it's a it's a adjacent to their core networking business but they're definitely into you know the analytics business then the really established players in data like Informatica IBM and Oracle all with strong presence but you'll notice in the red from the momentum standpoint now what you're going to see in a moment is we put red highlights around databricks Snowflake and AWS why let's bring that back up and we'll explain so there's no way let's bring that back up Alex if you would there's no way AWS is going to hit the brakes on innovating at the base service level what we call Primitives earlier solipsky told Furrier as much in their sit down that AWS will serve the technical user and data science Community the traditional domain of data bricks and at the same time address the end-to-end integration data sharing and business line requirements that snowflake is positioned to serve now people often ask Snowflake and databricks how will you compete with the likes of AWS and we know the answer focus on data exclusively they have their multi-cloud plays perhaps the more interesting question is how will AWS compete with the likes of Specialists like Snowflake and data bricks and the answer is depicted here in this chart AWS is going to serve both the technical and developer communities and the data science audience and through end-to-end Integrations and future services that simplify the data Journey they're going to serve the business lines as well but the Nuance is in all the other dots in the hundreds or hundreds of thousands that are not shown here and that's the AWS ecosystem you can see AWS has earned the status of the number one Cloud platform that everyone wants to partner with as they say it has over a hundred thousand partners and that ecosystem combined with these capabilities that we're discussing well perhaps behind in areas like data sharing and integrated governance can wildly succeed by offering the capabilities and leveraging its ecosystem now for their part the snowflakes of the world have to stay focused on the mission build the best products possible and develop their own ecosystems to compete and attract the Mind share of both developers and business users and that's why it's so interesting to hear solipski basically say it's not a separate Cloud it's a set of integrated Services well snowflake is in our view building a super cloud on top of AWS Azure and Google when great products meet great sales and marketing good things can happen so this will be really fun to watch what AWS announces in this area at re invent all right one other topic that solipsky talked about was the correlation between serverless and container adoption and you know I don't know if this gets into there certainly their hybrid place maybe it starts to get into their multi-cloud we'll see but we have some data on this so again we're talking about the correlation between serverless and container adoption but before we get into that let's go back to 2017 and listen to what Andy jassy said on the cube about serverless play the clip very very earliest days of AWS Jeff used to say a lot if I were starting Amazon today I'd have built it on top of AWS we didn't have all the capability and all the functionality at that very moment but he knew what was coming and he saw what people were still able to accomplish even with where the services were at that point I think the same thing is true here with Lambda which is I think if Amazon were starting today it's a given they would build it on the cloud and I think we with a lot of the applications that comprise Amazon's consumer business we would build those on on our serverless capabilities now we still have plenty of capabilities and features and functionality we need to add to to Lambda and our various serverless services so that may not be true from the get-go right now but I think if you look at the hundreds of thousands of customers who are building on top of Lambda and lots of real applications you know finra has built a good chunk of their market watch application on top of Lambda and Thompson Reuters has built you know one of their key analytics apps like people are building real serious things on top of Lambda and the pace of iteration you'll see there will increase as well and I really believe that to be true over the next year or two so years ago when Jesse gave a road map that serverless was going to be a key developer platform going forward and so lipsky referenced the correlation between serverless and containers in the Furrier sit down so we wanted to test that within the ETR data set now here's a screen grab of The View across 1300 respondents from the October ETR survey and what we've done here is we've isolated on the cloud computing segment okay so you can see right there cloud computing segment now we've taken the functions from Google AWS Lambda and Microsoft Azure functions all the serverless offerings and we've got Net score on the vertical axis we've got presence in the data set oh by the way 440 by the way is highly elevated remember that and then we've got on the horizontal axis we have the presence in the data center overlap okay that's relative to each other so remember 40 all these guys are above that 40 mark okay so you see that now what we're going to do this is just for serverless and what we're going to do is we're going to turn on containers to see the correlation and see what happens so watch what happens when we click on container boom everything moves to the right you can see all three move to the right Google drops a little bit but all the others now the the filtered end drops as well so you don't have as many people that are aggressively leaning into both but all three move to the right so watch again containers off and then containers on containers off containers on so you can see a really major correlation between containers and serverless okay so to get a better understanding of what that means I call my friend and former Cube co-host Stu miniman what he said was people generally used to think of VMS containers and serverless as distinctly different architectures but the lines are beginning to blur serverless makes things simpler for developers who don't want to worry about underlying infrastructure as solipsky and the data from ETR indicate serverless and containers are coming together but as Stu and I discussed there's a spectrum where on the left you have kind of native Cloud VMS in the middle you got AWS fargate and in the rightmost anchor is Lambda AWS Lambda now traditionally in the cloud if you wanted to use containers developers would have to build a container image they have to select and deploy the ec2 images that they or instances that they wanted to use they have to allocate a certain amount of memory and then fence off the apps in a virtual machine and then run the ec2 instances against the apps and then pay for all those ec2 resources now with AWS fargate you can run containerized apps with less infrastructure management but you still have some you know things that you can you can you can do with the with the infrastructure so with fargate what you do is you'd build the container images then you'd allocate your memory and compute resources then run the app and pay for the resources only when they're used so fargate lets you control the runtime environment while at the same time simplifying the infrastructure management you gotta you don't have to worry about isolating the app and other stuff like choosing server types and patching AWS does all that for you then there's Lambda with Lambda you don't have to worry about any of the underlying server infrastructure you're just running code AS functions so the developer spends their time worrying about the applications and the functions that you're calling the point is there's a movement and we saw in the data towards simplifying the development environment and allowing the cloud vendor AWS in this case to do more of the underlying management now some folks will still want to turn knobs and dials but increasingly we're going to see more higher level service adoption now re invent is always a fire hose of content so let's do a rapid rundown of what to expect we talked about operate optimizing data and the organization we talked about Cloud optimization there'll be a lot of talk on the show floor about best practices and customer sharing data solipsky is leading AWS into the next phase of growth and that means moving beyond I.T transformation into deeper business integration and organizational transformation not just digital transformation organizational transformation so he's leading a multi-vector strategy serving the traditional peeps who want fine-grained access to core services so we'll see continued Innovation compute storage AI Etc and simplification through integration and horizontal apps further up to stack Amazon connect is an example that's often cited now as we've reported many times databricks is moving from its stronghold realm of data science into business intelligence and analytics where snowflake is coming from its data analytics stronghold and moving into the world of data science AWS is going down a path of snowflake meet data bricks with an underlying cloud is and pass layer that puts these three companies on a very interesting trajectory and you can expect AWS to go right after the data sharing opportunity and in doing so it will have to address data governance they go hand in hand okay price performance that is a topic that will never go away and it's something that we haven't mentioned today silicon it's a it's an area we've covered extensively on breaking analysis from Nitro to graviton to the AWS acquisition of Annapurna its secret weapon new special specialized capabilities like inferential and trainium we'd expect something more at re invent maybe new graviton instances David floyer our colleague said he's expecting at some point a complete system on a chip SOC from AWS and maybe an arm-based server to eventually include high-speed cxl connections to devices and memories all to address next-gen applications data intensive applications with low power requirements and lower cost overall now of course every year Swami gives his usual update on machine learning and AI building on Amazon's years of sagemaker innovation perhaps a focus on conversational AI or a better support for vision and maybe better integration across Amazon's portfolio of you know large language models uh neural networks generative AI really infusing AI everywhere of course security always high on the list that reinvent and and Amazon even has reinforce a conference dedicated to it uh to security now here we'd like to see more on supply chain security and perhaps how AWS can help there as well as tooling to make the cio's life easier but the key so far is AWS is much more partner friendly in the security space than say for instance Microsoft traditionally so firms like OCTA and crowdstrike in Palo Alto have plenty of room to play in the AWS ecosystem we'd expect of course to hear something about ESG it's an important topic and hopefully how not only AWS is helping the environment that's important but also how they help customers save money and drive inclusion and diversity again very important topics and finally come back to it reinvent is an ecosystem event it's the Super Bowl of tech events and the ecosystem will be out in full force every tech company on the planet will have a presence and the cube will be featuring many of the partners from the serial floor as well as AWS execs and of course our own independent analysis so you'll definitely want to tune into thecube.net and check out our re invent coverage we start Monday evening and then we go wall to wall through Thursday hopefully my voice will come back we have three sets at the show and our entire team will be there so please reach out or stop by and say hello all right we're going to leave it there for today many thanks to Stu miniman and David floyer for the input to today's episode of course John Furrier for extracting the signal from the noise and a sit down with Adam solipski thanks to Alex Meyerson who was on production and manages the podcast Ken schiffman as well Kristen Martin and Cheryl Knight helped get the word out on social and of course in our newsletters Rob hoef is our editor-in-chief over at siliconangle does some great editing thank thanks to all of you remember all these episodes are available as podcasts wherever you listen you can pop in the headphones go for a walk just search breaking analysis podcast I published each week on wikibon.com at siliconangle.com or you can email me at david.valante at siliconangle.com or DM me at di vallante or please comment on our LinkedIn posts and do check out etr.ai for the best survey data in the Enterprise Tech business this is Dave vellante for the cube insights powered by ETR thanks for watching we'll see it reinvent or we'll see you next time on breaking analysis [Music]

Published Date : Nov 26 2022

SUMMARY :

so now the further mu you move up the

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
David michellaPERSON

0.99+

Alex MeyersonPERSON

0.99+

Cheryl KnightPERSON

0.99+

AWSORGANIZATION

0.99+

AlibabaORGANIZATION

0.99+

oneQUANTITY

0.99+

Dave vellantePERSON

0.99+

David floyerPERSON

0.99+

Kristen MartinPERSON

0.99+

JohnPERSON

0.99+

sixty percentQUANTITY

0.99+

AmazonORGANIZATION

0.99+

Adam solipskiPERSON

0.99+

John FurrierPERSON

0.99+

MicrosoftORGANIZATION

0.99+

2022DATE

0.99+

Andy jassyPERSON

0.99+

GoogleORGANIZATION

0.99+

OracleORGANIZATION

0.99+

FacebookORGANIZATION

0.99+

hundredsQUANTITY

0.99+

2017DATE

0.99+

Palo AltoLOCATION

0.99+

40 percentQUANTITY

0.99+

alibabaORGANIZATION

0.99+

LambdaTITLE

0.99+

63 percentQUANTITY

0.99+

1300 respondentsQUANTITY

0.99+

Super BowlEVENT

0.99+

80 billionQUANTITY

0.99+

John furrierPERSON

0.99+

ThursdayDATE

0.99+

CiscoORGANIZATION

0.99+

three yearsQUANTITY

0.99+

Monday eveningDATE

0.99+

JessePERSON

0.99+

Stu minimanPERSON

0.99+

siliconangle.comOTHER

0.99+

OctoberDATE

0.99+

thecube.netOTHER

0.99+

fourthQUANTITY

0.99+

a month laterDATE

0.99+

thirdQUANTITY

0.99+

hundreds of thousandsQUANTITY

0.99+

fargateORGANIZATION

0.99+

Breaking Analysis: Snowflake caught in the storm clouds


 

>> From the CUBE Studios in Palo Alto in Boston, bringing you data driven insights from the Cube and ETR. This is Breaking Analysis with Dave Vellante. >> A better than expected earnings report in late August got people excited about Snowflake again, but the negative sentiment in the market is weighed heavily on virtually all growth tech stocks and Snowflake is no exception. As we've stressed many times the company's management is on a long term mission to dramatically simplify the way organizations use data. Snowflake is tapping into a multi hundred billion dollar total available market and continues to grow at a rapid pace. In our view, Snowflake is embarking on its third major wave of innovation data apps, while its first and second waves are still bearing significant fruit. Now for short term traders focused on the next 90 or 180 days, that probably doesn't matter. But those taking a longer view are asking, "Should we still be optimistic about the future of this high flyer or is it just another over hyped tech play?" Hello and welcome to this week's Wiki Bond Cube Insights powered by ETR. Snowflake's Quarter just ended. And in this breaking analysis we take a look at the most recent survey data from ETR to see what clues and nuggets we can extract to predict the near term future in the long term outlook for Snowflake which is going to announce its earnings at the end of this month. Okay, so you know the story. If you've been investor in Snowflake this year, it's been painful. We said at IPO, "If you really want to own this stock on day one, just hold your nose and buy it." But like most IPOs we said there will be likely a better entry point in the future, and not surprisingly that's been the case. Snowflake IPOed a price of 120, which you couldn't touch on day one unless you got into a friends and family Delio. And if you did, you're still up 5% or so. So congratulations. But at one point last year you were up well over 200%. That's been the nature of this volatile stock, and I certainly can't help you with the timing of the market. But longer term Snowflake is targeting 10 billion in revenue for fiscal year 2028. A big number. Is it achievable? Is it big enough? Tell you what, let's come back to that. Now shorter term, our expert trader and breaking analysis contributor Chip Simonton said he got out of the stock a while ago after having taken a shot at what turned out to be a bear market rally. He pointed out that the stock had been bouncing around the 150 level for the last few months and broke that to the downside last Friday. So he'd expect 150 is where the stock is going to find resistance on the way back up, but there's no sign of support right now. He said maybe at 120, which was the July low and of course the IPO price that we just talked about. Now, perhaps earnings will be a catalyst, when Snowflake announces on November 30th, but until the mentality toward growth tech changes, nothing's likely to change dramatically according to Simonton. So now that we have that out of the way, let's take a look at the spending data for Snowflake in the ETR survey. Here's a chart that shows the time series breakdown of snowflake's net score going back to the October, 2021 survey. Now at that time, Snowflake's net score stood at a robust 77%. And remember, net score is a measure of spending velocity. It's a proprietary network, and ETR derives it from a quarterly survey of IT buyers and asks the respondents, "Are you adopting the platform new? Are you spending 6% or more? Is you're spending flat? Is you're spending down 6% or worse? Or are you leaving the platform decommissioning?" You subtract the percent of customers that are spending less or churning from those that are spending more and adopting or adopting and you get a net score. And that's expressed as a percentage of customers responding. In this chart we show Snowflake's in out of the total survey which ranges... The total survey ranges between 1,200 and 1,400 each quarter. And the very last column... Oh sorry, very last row, we show the number of Snowflake respondents that are coming in the survey from the Fortune 500 and the Global 2000. Those are two very important Snowflake constituencies. Now what this data tells us is that Snowflake exited 2021 with very strong momentum in a net score of 82%, which is off the charts and it was actually accelerating from the previous survey. Now by April that sentiment had flipped and Snowflake came down to earth with a 68% net score. Still highly elevated relative to its peers, but meaningfully down. Why was that? Because we saw a drop in new ads and an increase in flat spend. Then into the July and most recent October surveys, you saw a significant drop in the percentage of customers that were spending more. Now, notably, the percentage of customers who are contemplating adding the platform is actually staying pretty strong, but it is off a bit this past survey. And combined with a slight uptick in planned churn, net score is now down to 60%. That uptick from 0% and 1% and then 3%, it's still small, but that net score at 60% is still 20 percentage points higher than our highly elevated benchmark of 40% as you recall from listening to earlier breaking analysis. That 40% range is we consider a milestone. Anything above that is actually quite strong. But again, Snowflake is down and coming back to churn, while 3% churn is very low, in previous quarters we've seen Snowflake 0% or 1% decommissions. Now the last thing to note in this chart is the meaningful uptick in survey respondents that are citing, they're using the Snowflake platform. That's up to 212 in the survey. So look, it's hard to imagine that Snowflake doesn't feel the softening in the market like everyone else. Snowflake is guiding for around 60% growth in product revenue against the tough compare from a year ago with a 2% operating margin. So like every company, the reaction of the street is going to come down to how accurate or conservative the guide is from their CFO. Now, earlier this year, Snowflake acquired a company called Streamlit for around $800 million. Streamlit is an open source Python library and it makes it easier to build data apps with machine learning, obviously a huge trend. And like Snowflake, generally its focus is on simplifying the complex, in this case making data science easier to integrate into data apps that business people can use. So we were excited this summer in the July ETR survey to see that they added some nice data and pick on Streamlit, which we're showing here in comparison to Snowflake's core business on the left hand side. That's the data warehousing, the Streamlit pieces on the right hand side. And we show again net score over time from the previous survey for Snowflake's core database and data warehouse offering again on the left as compared to a Streamlit on the right. Snowflake's core product had 194 responses in the October, 22 survey, Streamlit had an end of 73, which is up from 52 in the July survey. So significant uptick of people responding that they're doing business in adopting Streamlit. That was pretty impressive to us. And it's hard to see, but the net scores stayed pretty constant for Streamlit at 51%. It was 52% I think in the previous quarter, well over that magic 40% mark. But when you blend it with Snowflake, it does sort of bring things down a little bit. Now there are two key points here. One is that the acquisition seems to have gained exposure right out of the gate as evidenced by the large number of responses. And two, the spending momentum. Again while it's lower than Snowflake overall, and when you blend it with Snowflake it does pull it down, it's very healthy and steady. Now let's do a little pure comparison with some of our favorite names in this space. This chart shows net score or spending velocity in the Y-axis, an overlap or presence, pervasiveness if you will, in the data set on the X-axis. That red dotted line again is that 40% highly elevated net score that we like to talk about. And that table inserted informs us as to how the companies are plotted, where the dots set up, the net score, the ins. And we're comparing a number of database players, although just a caution, Oracle includes all of Oracle including its apps. But we just put it in there for reference because it is the leader in database. Right off the bat, Snowflake jumps out with a net score of 64%. The 60% from the earlier chart, again included Streamlit. So you can see its core database, data warehouse business actually is higher than the total company average that we showed you before 'cause the Streamlit is blended in. So when you separate it out, Streamlit is right on top of data bricks. Isn't that ironic? Only Snowflake and Databricks in this selection of names are above the 40% level. You see Mongo and Couchbase, they know they're solid and Teradata cloud actually showing pretty well compared to some of the earlier survey results. Now let's isolate on the database data platform sector and see how that shapes up. And for this analysis, same XY dimensions, we've added the big giants, AWS and Microsoft and Google. And notice that those three plus Snowflake are just at or above the 40% line. Snowflake continues to lead by a significant margin in spending momentum and it keeps creeping to the right. That's that end that we talked about earlier. Now here's an interesting tidbit. Snowflake is often asked, and I've asked them myself many times, "How are you faring relative to AWS, Microsoft and Google, these big whales with Redshift and Synapse and Big Query?" And Snowflake has been telling folks that 80% of its business comes from AWS. And when Microsoft heard that, they said, "Whoa, wait a minute, Snowflake, let's partner up." 'Cause Microsoft is smart, and they understand that the market is enormous. And if they could do better with Snowflake, one, they may steal some business from AWS. And two, even if Snowflake is winning against some of the Microsoft database products, if it wins on Azure, Microsoft is going to sell more compute and more storage, more AI tools, more other stuff to these customers. Now AWS is really aggressive from a partnering standpoint with Snowflake. They're openly negotiating, not openly, but they're negotiating better prices. They're realizing that when it comes to data, the cheaper that you make the offering, the more people are going to consume. At scale economies and operating leverage are really powerful things at volume that kick in. Now Microsoft, they're coming along, they obviously get it, but Google is seemingly resistant to that type of go to market partnership. Rather than lean into Snowflake as a great partner Google's field force is kind of fighting fashion. Google itself at Cloud next heavily messaged what they call the open data cloud, which is a direct rip off of Snowflake. So what can we say about Google? They continue to be kind of behind the curve when it comes to go to market. Now just a brief aside on the competitive posture. I've seen Slootman, Frank Slootman, CEO of Snowflake in action with his prior companies and how he depositioned the competition. At Data Domain, he eviscerated a company called Avamar with their, what he called their expensive and slow post process architecture. I think he actually called it garbage, if I recall at one conference I heard him speak at. And that sort of destroyed BMC when he was at ServiceNow, kind of positioning them as the equivalent of the department of motor vehicles. And so it's interesting to hear how Snowflake openly talks about the data platforms of AWS, Microsoft, Google, and data bricks. I'll give you this sort of short bumper sticker. Redshift is just an on-prem database that AWS morphed to the cloud, which by the way is kind of true. They actually did a brilliant job of it, but it's basically a fact. Microsoft Excel, a collection of legacy databases, which also kind of morphed to run in the cloud. And even Big Query, which is considered cloud native by many if not most, is being positioned by Snowflake as originally an on-prem database to support Google's ad business, maybe. And data bricks is for those people smart enough to get it to Berkeley that love complexity. And now Snowflake doesn't, they don't mention Berkeley as far as I know. That's my addition. But you get the point. And the interesting thing about Databricks and Snowflake is a while ago in the cube I said that there was a new workload type emerging around data where you have AWS cloud, Snowflake obviously for the cloud database and Databricks data for the data science and EML, you bring those things together and there's this new workload emerging that's going to be very powerful in the future. And it's interesting to see now the aspirations of all three of these platforms are colliding. That's quite a dynamic, especially when you see both Snowflake and Databricks putting venture money and getting their hooks into the loyalties of the same companies like DBT labs and Calibra. Anyway, Snowflake's posture is that we are the pioneer in cloud native data warehouse, data sharing and now data apps. And our platform is designed for business people that want simplicity. The other guys, yes, they're formidable, but we Snowflake have an architectural lead and of course we run in multiple clouds. So it's pretty strong positioning or depositioning, you have to admit. Now I'm not sure I agree with the big query knockoffs completely. I think that's a bit of a stretch, but snowflake, as we see in the ETR survey data is winning. So in thinking about the longer term future, let's talk about what's different with Snowflake, where it's headed and what the opportunities are for the company. Snowflake put itself on the map by focusing on simplifying data analytics. What's interesting about that is the company's founders are as you probably know from Oracle. And rather than focusing on transactional data, which is Oracle's sweet spot, the stuff they worked on when they were at Oracle, the founder said, "We're going to go somewhere else. We're going to attack the data warehousing problem and the data analytics problem." And they completely re-imagined the database and how it could be applied to solve those challenges and reimagine what was possible if you had virtually unlimited compute and storage capacity. And of course Snowflake became famous for separating the compute from storage and being able to completely shut down compute so you didn't have to pay for it when you're not using it. And the ability to have multiple clusters hit the same data without making endless copies and a consumption/cloud pricing model. And then of course everyone on the planet realized, "Wow, that's a pretty good idea." Every venture capitalist in Silicon Valley has been funding companies to copy that move. And that today has pretty much become mainstream in table stakes. But I would argue that Snowflake not only had the lead, but when you look at how others are approaching this problem, it's not necessarily as clean and as elegant. Some of the startups, the early startups I think get it and maybe had an advantage of starting later, which can be a disadvantage too. But AWS is a good example of what I'm saying here. Is its version of separating compute from storage was an afterthought and it's good, it's... Given what they had it was actually quite clever and customers like it, but it's more of a, "Okay, we're going to tier to storage to lower cost, we're going to sort of dial down the compute not completely, we're not going to shut it off, we're going to minimize the compute required." It's really not true as separation is like for instance Snowflake has. But having said that, we're talking about competitors with lots of resources and cohort offerings. And so I don't want to make this necessarily all about the product, but all things being equal architecture matters, okay? So that's the cloud S-curve, the first one we're showing. Snowflake's still on that S-curve, and in and of itself it's got legs, but it's not what's going to power the company to 10 billion. The next S-curve we denote is the multi-cloud in the middle. And now while 80% of Snowflake's revenue is AWS, Microsoft is ramping up and Google, well, we'll see. But the interesting part of that curve is data sharing, and this idea of data clean rooms. I mean it really should be called the data sharing curve, but I have my reasons for calling it multi-cloud. And this is all about network effects and data gravity, and you're seeing this play out today, especially in industries like financial services and healthcare and government that are highly regulated verticals where folks are super paranoid about compliance. There not going to share data if they're going to get sued for it, if they're going to be in the front page of the Wall Street Journal for some kind of privacy breach. And what Snowflake has done is said, "Put all the data in our cloud." Now, of course now that triggers a lot of people because it's a walled garden, okay? It is. That's the trade off. It's not the Wild West, it's not Windows, it's Mac, it's more controlled. But the idea is that as different parts of the organization or even partners begin to share data that they need, it's got to be governed, it's got to be secure, it's got to be compliant, it's got to be trusted. So Snowflake introduced the idea of, they call these things stable edges. I think that's the term that they use. And they track a metric around stable edges. And so a stable edge, or think of it as a persistent edge is an ongoing relationship between two parties that last for some period of time, more than a month. It's not just a one shot deal, one a done type of, "Oh guys shared it for a day, done." It sent you an FTP, it's done. No, it's got to have trajectory over time. Four weeks or six weeks or some period of time that's meaningful. And that metric is growing. Now I think sort of a different metric that they track. I think around 20% of Snowflake customers are actively sharing data today and then they track the number of those edge relationships that exist. So that's something that's unique. Because again, most data sharing is all about making copies of data. That's great for storage companies, it's bad for auditors, and it's bad for compliance officers. And that trend is just starting out, that middle S-curve, it's going to kind of hit the base of that steep part of the S-curve and it's going to have legs through this decade we think. And then finally the third wave that we show here is what we call super cloud. That's why I called it multi-cloud before, so it could invoke super cloud. The idea that you've built a PAS layer that is purpose built for a specific objective, and in this case it's building data apps that are cloud native, shareable and governed. And is a long-term trend that's going to take some time to develop. I mean, application development platforms can take five to 10 years to mature and gain significant adoption, but this one's unique. This is a critical play for Snowflake. If it's going to compete with the big cloud players, it has to have an app development framework like Snowpark. It has to accommodate new data types like transactional data. That's why it announced this thing called UniStore last June, Snowflake a summit. And the pattern that's forming here is Snowflake is building layer upon layer with its architecture at the core. It's not currently anyway, it's not going out and saying, "All right, we're going to buy a company that's got to another billion dollars in revenue and that's how we're going to get to 10 billion." So it's not buying its way into new markets through revenue. It's actually buying smaller companies that can complement Snowflake and that it can turn into revenue for growth that fit in to the data cloud. Now as to the 10 billion by fiscal year 28, is that achievable? That's the question. Yeah, I think so. Would the momentum resources go to market product and management prowess that Snowflake has? Yes, it's definitely achievable. And one could argue to $10 billion is too conservative. Indeed, Snowflake CFO, Mike Scarpelli will fully admit his forecaster built on existing offerings. He's not including revenue as I understand it from all the new stuff that's in the pipeline because he doesn't know what it's going to look like. He doesn't know what the adoption is going to look like. He doesn't have data on that adoption, not just yet anyway. And now of course things can change quite dramatically. It's possible that is forecast for existing businesses don't materialize or competition picks them off or a company like Databricks actually is able in the longer term replicate the functionality of Snowflake with open source technologies, which would be a very competitive source of innovation. But in our view, there's plenty of room for growth, the market is enormous and the real key is, can and will Snowflake deliver on the promises of simplifying data? Of course we've heard this before from data warehouse, the data mars and data legs and master data management and ETLs and data movers and data copiers and Hadoop and a raft of technologies that have not lived up to expectations. And we've also, by the way, seen some tremendous successes in the software business with the likes of ServiceNow and Salesforce. So will Snowflake be the next great software name and hit that 10 billion magic mark? I think so. Let's reconnect in 2028 and see. Okay, we'll leave it there today. I want to thank Chip Simonton for his input to today's episode. Thanks to Alex Myerson who's on production and manages the podcast. Ken Schiffman as well. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hove is our Editor in Chief over at Silicon Angle. He does some great editing for us. Check it out for all the news. Remember all these episodes are available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com. Or you can email me to get in touch David.vallante@siliconangle.com. DM me @dvellante or comment on our LinkedIn post. And please do check out etr.ai, they've got the best survey data in the enterprise tech business. This is Dave Vellante for the CUBE Insights, powered by ETR. Thanks for watching, thanks for listening and we'll see you next time on breaking analysis. (upbeat music)

Published Date : Nov 10 2022

SUMMARY :

insights from the Cube and ETR. And the ability to have multiple

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

Mike ScarpelliPERSON

0.99+

Dave VellantePERSON

0.99+

OracleORGANIZATION

0.99+

AWSORGANIZATION

0.99+

November 30thDATE

0.99+

Ken SchiffmanPERSON

0.99+

MicrosoftORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

Chip SimontonPERSON

0.99+

October, 2021DATE

0.99+

Rob HovePERSON

0.99+

Cheryl KnightPERSON

0.99+

Frank SlootmanPERSON

0.99+

Four weeksQUANTITY

0.99+

JulyDATE

0.99+

six weeksQUANTITY

0.99+

10 billionQUANTITY

0.99+

fiveQUANTITY

0.99+

Palo AltoLOCATION

0.99+

SlootmanPERSON

0.99+

BMCORGANIZATION

0.99+

DatabricksORGANIZATION

0.99+

6%QUANTITY

0.99+

80%QUANTITY

0.99+

last yearDATE

0.99+

OctoberDATE

0.99+

Silicon ValleyLOCATION

0.99+

40%QUANTITY

0.99+

1,400QUANTITY

0.99+

$10 billionQUANTITY

0.99+

SnowflakeORGANIZATION

0.99+

AprilDATE

0.99+

3%QUANTITY

0.99+

77%QUANTITY

0.99+

64%QUANTITY

0.99+

60%QUANTITY

0.99+

194 responsesQUANTITY

0.99+

Kristin MartinPERSON

0.99+

two partiesQUANTITY

0.99+

51%QUANTITY

0.99+

2%QUANTITY

0.99+

Silicon AngleORGANIZATION

0.99+

fiscal year 28DATE

0.99+

billion dollarsQUANTITY

0.99+

0%QUANTITY

0.99+

AvamarORGANIZATION

0.99+

52%QUANTITY

0.99+

BerkeleyLOCATION

0.99+

2028DATE

0.99+

MongoORGANIZATION

0.99+

Data DomainORGANIZATION

0.99+

1%QUANTITY

0.99+

late AugustDATE

0.99+

twoQUANTITY

0.99+

threeQUANTITY

0.99+

fiscal year 2028DATE

0.99+

Breaking Analysis: Latest CIO Survey Shows Steady Deceleration in IT Spend


 

>> From the Cube Studios in Palo Alto in Boston bringing you data driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> Is the glass half full or half empty? Well, it depends on how you want to look at it. CIOs are tapping the breaks on spending, that's clear. The latest macro survey data from ETR quantifies what we already know to be true, that IT spend is decelerating. CIOs and IT buyers forecast that their tech spend will grow by 5.5% this year. That's a meaningful deceleration from near year end 2021 expectations. But these levels are still well above historical norms. So while the feel good factor may be in some jeopardy, overall things are pretty good, at least for now. Hello and welcome to this week's Wikibon Cube Insights powered by ETR. In this Breaking Analysis, we update you in the latest macro tech spending data from Enterprise Technology Research, including strategies that organizations are employing to cut costs, and which project categories continue to see the most traction. Now, CIOs were much more optimistic at the end of last year than they are today. Back then they thought their aggregates spend would increase by more than 8%. Of course, at that time the expectation was that the economy was ready to make a semi ordered return to normal, and that didn't happen as you well know. And you can see here the expectation for spending this year is down to 5.5% growth, as we said, and this is based on the most recent ETR CIO and IT buyer survey, which includes more than 1100 responses. So we started the year above 8% then made a meaningful decline into the mid sixes and nine months into the year, we're now in the mid fives, but this is still two to 300 basis points above historical norms for IT spending. And looking ahead to next year, CIOs are expecting accelerated growth edging back up toward that 6% level. Now as noted here, the visibility on this is probably less clear than pre COVID years of course, but the bottom line is digital transformations are continuing to push it spending above historical levels. Now the problem as we know, is earning estimates are coming down and forecasts are being lowered every day. I mean, as the saying goes the first disappointment is rarely the last. Even the semiconductor industry is seeing softness. Just this past week we saw AMD lower its quarterly revenue forecast by more than a billion dollars, as PC demand in the second half has significantly softened. But again, that's relative to some pretty amazing PC growth in the past couple of years thanks to the isolation economy. So we do see CIOs tapping the brakes, and these data points here tell an interesting story. ETR asked respondents about various actions that they're taking and these two stood out. The top line is, "We're accelerating new IT projects," and the bottom line is, "We're freezing IT projects," and you can see the convergence of those two lines, which of course signals the down. But again, these are not alarming data points. If you think about history. If you go back to Q1 2020, for example, just before the pandemic, that top line that was at 12% versus where it is today at 25%. And if you look at project freezes, they were at 22% in Q1 of 2020, which is significantly higher than today. So relatively speaking the spending dynamic is still strong. It just doesn't feel that way because we're coming out of an historic anomaly. Now, ETR asked a follow up question to respondents that indicated that spending would be down this quarter relative to the same quarter last year. So they wanted to better understand the most common actions that organizations would take to save money, and that's what this chart shows. The most common approach is still to consolidate redundant vendors across the lines of business. That was over 30%, as you can see here in the first set of bars. So presumably CIOs now have the latitude to go after so-called shadow projects, shadow IT, and implement standards across the organization via vendor consolidation. As well, there's a big jump in the survey from 14% to 20% of respondents saying that they were going after the Cloud bill, and that relates to the fourth set of bars which is scrutinizing consumption based services. So combined, 45% of respondents are looking at reducing their on demand spend. Now, some of that may be SaaS related, but most of the SaaS spend is committed, so pre-committed, but we do see organizations doing more audits and trying to eliminate or reduce orphaned licenses. Now the last data point that we want to focus on is the technology sectors that are of the highest priority. You can see here on the set of bars on the left while cybersecurity remains the top technology area, even this sector is showing a little bit of softness. What's really notable is the uptick in data related areas, that second set of bars, this category is now the second most cited, taking over from Cloud, which as you can see, remain strong, and of course Cloud continues to be a key component of digital transformations. As we've previously reported, machine learning, AI, and RPA are somewhat more strategic and more discretionary, and they've dropped below the 40% mark in terms of net score in the overall survey. We're not showing that data here, but we covered this in our last Breaking Analysis ahead of our UI path event. Now you have to remember these are the top seven sectors, and there are dozens in the ETR taxonomy, so making this list is goodness from a spending perspective. So even though there's some softness in most of these categories, these are the ones CIOs are most focused on addressing. So the big takeaways of this data are spending targets are coming down to the mid 5% range, but this is meaningfully higher than historical norms. And while CIOs, they are pumping the brakes on projects, they're still moving forward at rates faster than pre COVID levels and they're freezing fewer projects. Remember, this as well, this could be a skill shortage in play, but the slowdown is more likely related to the economic uncertainty. You know, we're seeing the two-sided coin of pay by the drink consumption models, right? You can dial it up as as you need to but you can also dial it down, and that's one of the alluring features of on demand. And we're seeing firms give more scrutiny to the Cloud bill, why wouldn't they? And there's a bit of unsurprising backlash to the flaws in today's SaaS pricing model that locks you in for specified terms. So people, when their term comes up are really going to scrutinize whether or not they have orphan licenses and try to reduce those. And it appears that the real savings can come from eliminating redundant vendors. That seems to be the biggest, you know, number one strategy, and that could favor some of the larger firms, think Oracle, Dell, Salesforce ServiceNow, IBM, HPE, Cisco, and others, you know, they may benefit from having more of larger footprint across the organization. You know, having that one throat to choke, you know one back to pat, as some like to say, but they could benefit those larger companies in least in the near term. Now having said that, we do see an uptick in data related areas as a priority for CIOs, and that could mean companies like Snowflake are in a strong position and can continue to thrive. You know, even though as we reported a couple of weeks ago, virtually all companies and sectors in the ETR data set are showing some softness related to spending a momentum from previous quarters. ETR will have its... will release its results next week and then we'll dig into the specific vendor action relative to previous quarters. So look, it feels like a meaningful slowdown but the sky is by no means falling. There are these kind of out of our control factors like interest rates, and Ukraine, and oil supply, and wages, et cetera, that are creating this uncertainty and causing firms to be more cautious. But generally we remain optimistic as leading tech companies are pretty well managed and have a lot of runway on the balance sheets, and can adjust costs to reflect the uncertain environment and remain flexible in their business models in doing so. Okay, that's it for today. Thanks to Alex Myerson who's on production and he also manages the podcast for Breaking Analysis. Ken Schiffman is also out of our Boston studio as well. Kristin Martin and Cheryl Knight, they help get the word out on social media and in our newsletters, and Rob Hof is our editor in chief over at Silicon Angle who posts our Breaking Analysis and does some great editing. So thank you to all. Remember all these episodes are available as podcasts. Wherever you listen all you got to do is search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com, and you can email me at david.vellante@siliconangle.com or DM me @dvellante, or feel free to comment on our LinkedIn posts. And please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave for the theCUBE Insights powered by ETR. Thanks for watching and we'll see you next time on Breaking Analysis. (relaxing music)

Published Date : Oct 7 2022

SUMMARY :

From the Cube Studios in Palo Alto and that relates to the fourth set of bars

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MyersonPERSON

0.99+

Rob HofPERSON

0.99+

IBMORGANIZATION

0.99+

DellORGANIZATION

0.99+

CiscoORGANIZATION

0.99+

Cheryl KnightPERSON

0.99+

OracleORGANIZATION

0.99+

twoQUANTITY

0.99+

Dave VellantePERSON

0.99+

Ken SchiffmanPERSON

0.99+

HPEORGANIZATION

0.99+

40%QUANTITY

0.99+

Palo AltoLOCATION

0.99+

14%QUANTITY

0.99+

Kristin MartinPERSON

0.99+

45%QUANTITY

0.99+

two linesQUANTITY

0.99+

5.5%QUANTITY

0.99+

6%QUANTITY

0.99+

ETRORGANIZATION

0.99+

second halfQUANTITY

0.99+

next weekDATE

0.99+

25%QUANTITY

0.99+

more than 1100 responsesQUANTITY

0.99+

david.vellante@siliconangle.comOTHER

0.99+

22%QUANTITY

0.99+

BostonLOCATION

0.99+

todayDATE

0.99+

Silicon AngleORGANIZATION

0.99+

more than a billion dollarsQUANTITY

0.99+

fourth setQUANTITY

0.99+

DavePERSON

0.99+

Cube StudiosORGANIZATION

0.99+

more than 8%QUANTITY

0.99+

next yearDATE

0.99+

12%QUANTITY

0.99+

first setQUANTITY

0.99+

nine monthsQUANTITY

0.99+

each weekQUANTITY

0.99+

this yearDATE

0.99+

AMDORGANIZATION

0.99+

20%QUANTITY

0.99+

Q1DATE

0.99+

Salesforce ServiceNowORGANIZATION

0.98+

two-sidedQUANTITY

0.98+

dozensQUANTITY

0.98+

secondQUANTITY

0.98+

pandemicEVENT

0.98+

first disappointmentQUANTITY

0.97+

Q1 2020DATE

0.97+

over 30%QUANTITY

0.96+

Breaking AnalysisTITLE

0.96+

last yearDATE

0.96+

this weekDATE

0.95+

Enterprise Technology ResearchORGANIZATION

0.94+

LinkedInORGANIZATION

0.92+

second setQUANTITY

0.9+

UkraineLOCATION

0.9+

past couple of yearsDATE

0.88+

mid fivesQUANTITY

0.88+

sevenQUANTITY

0.88+

couple of weeks agoDATE

0.85+

above 8%QUANTITY

0.85+

quarterDATE

0.85+

this quarterDATE

0.82+

end of last yearDATE

0.82+

mid 5%QUANTITY

0.81+

300 basis pointsQUANTITY

0.8+

theCUBEORGANIZATION

0.79+

@dvellantePERSON

0.75+

SnowflakeORGANIZATION

0.72+

past weekDATE

0.71+

COVIDOTHER

0.7+

wikibon.comORGANIZATION

0.69+

year end 2021DATE

0.67+

Wikibon CubeORGANIZATION

0.63+

oneQUANTITY

0.58+

siliconangle.comORGANIZATION

0.57+

BreakingTITLE

0.57+

2020DATE

0.54+

halfQUANTITY

0.52+

Dell Tech Summit Jen Saavedra


 

(bright upbeat music) >> Okay, we're back with Jenn Saavedra, who's the Chief Human Resource Officer of Dell and we're going to discuss people culture and hybrid work and leadership in the post isolation economy. Jenn, the conversations that we had at Dell Tech World this past May around the new work environment were some of the most interesting and engaging that I had personally. So I'm really eager to get the update. It's great to see you again. Thanks for coming on theCUBE. >> Thanks for having me, Dave. There's been a lot of change in just a short amount of time. So I'm excited to share some of our learnings with you. >> I mean, I bet there has, I mean post pandemic companies, they're trying everybody's trying to figure out the return to work and what it looks like. Last May there was really a theme of flexibility but depending, and we talked about, well, millennial or not, young, old, and it's just really was mixed. So how have you approached the topic? What are your policies? What's changed since we last talked? What's working, what's still being worked? What would you recommend to other companies to... Over to you. >> Yeah, well, this isn't a topic that's necessarily new to Dell technology. So we've been doing hybrid before hybrid was a thing. So for over a decade we've been doing what we called connected workplace. So we have kind of a history and we have some great learnings from that. Although things did change for the entire world. In March of 2020, we went from kind of this hybrid to everybody being remote for a while. But what we wanted to do is we're such a data-driven company. There's so many headlines out there, about all these things that people think could happen will happen but there wasn't a lot of data behind it. So we took a step back and we asked our team members, how do you think we're doing? And we asked very kind of strong language because we've been doing this for a while, we asked them, do you think we're leading in the world of hybrid? And 86% of our team members said that we were which is great, but we always know there's nuance behind that macro level. So we asked 'em a lot of different questions and we just went on this kind of myth busting journey and we decided to test some of those things we're hearing about Culture Willow Road or new team members will have trouble being connected or millennials will be different. And we really just collected a lot of data asked our team members what their experiences. And what we have found is really you don't have to be together in the office all the time to have a strong culture, a sense of connection, to be productive, and to have a really healthy business. >> Well, I like that you were data driven around it with the data business here. But there is a lot of debate around your culture and how it suffers in a hybrid environment, how remote workers won't get promoted. And so I'm curious, and I've seen some like-minded companies like Dell say, Hey, we want you guys to work the way you want to work. But then I've seen them adjust and say, Well, yeah, but we also want you to know in the office week so we can collaborate a little bit more. So what are you seeing at Dell and do you maintain that cultural advantage that you're alluding to in this kind of strange new ever changing world? >> Yeah, well, I think, look, one approach doesn't fiddle. So I don't think that the approach that works for Dell Technologies is necessarily the approach that works for every company. It works with our strategy and culture. It is really important that we listen to our team members and that we support them through this journey. They tell us time and time again one of the most special things about our culture is that we provide flexibility and choice. So we're not a mandate culture. We really want to make sure that our team members know that we want them to be their best and do their best. And not every individual role has the same requirements. Not every individual person has the same needs. And so we really want to meet them where they are so that they can be productive. They feel connected to the team and to the company and engaged and inspired. So, for us it really does make sense to go forward with this. And so we haven't taken a step back. We've been doing hybrid, we'll continue to do hybrid. But just like if you, we talk about not being a mandate. I think the companies that say nobody will come in or you have to come in three days a week, all of that feels more limiting. And so what we really say is, work out with your team, work out with your role, work out with your leader what really makes the most sense to drive things forward. >> I mean, you talk- >> So that's what we do. >> You were talking before about myths and I want talk about team member performance 'cause there's, a lot of people believe that if you're not in the office, you have disadvantages, people in the office have the advantage 'cause they get FaceTime. Is is that a myth? Is there some truth to that? What do you think about that? >> Well, for us, we look, again we just looked at the data. So we said we don't want to create a have and have not culture that you're talking about. We really want to have an inclusive culture, we want to be outcome-driven. We're a meritocracy. But we went and we looked at the data. So pre pandemic, we looked at things like performance, we looked at rewards and recognition, we looked at attrition rates, we looked at sentiment. Do you feel like your leader is inspiring? And we found no meaningful differences in any of that or in engagement between those who worked fully remote, fully in the office or some combination between. So our data would bust that myth and say, you don't have to be in an office and be seen to get ahead. We have equitable opportunity. Now, having said that, you always have to be watching that data and that's something that we'll continue to do and make sure that we are creating equal opportunity regardless of where you work. >> And it's personal too, I think I think some people can be really productive at home. I happen to be one that I'm way more productive in the office 'cause the dogs aren't barking. I have less distractions. And so, yeah, and I think the takeaway that in just in talking to you Jenn and folks at Dell is, whatever works for you we're going to support. So I wanted to switch gears a little bit and talk about leadership and very specifically, empathic leadership has been said to have a big impact on attracting talent, retaining talent, but it's hard to have empathy sometimes. And I know I saw some stats in a recent Dell study, it was like two thirds of the people felt like their organization underestimates the people requirements. And I asked myself, I'm like, Hmm, what am I missing with our folks? So especially as it relates to transformation programs. So how can human resource practitioners support business leaders generally, specifically as it relates to leading with empathy? >> I think empathy's always been important. You have to develop trust. You can have the best strategy in the world, right? But if you don't feel like your leader understands who you are, appreciates the value that you bring to the company then you're not going to get very far. So I think empathetic leadership has always been part of the foundation of a trusting strong relationship between a leader and a team member. But if I think we look back on the last two years and I imagine it'll be even more so as we go forward. Empathetic leadership will be even more important. There's so much going on in the world, politically, socially, economically, that taking that time to say you want your team members to see you as credible and confident that you can take us forward, but also that you know and understand me as a human being. And that to me is really what it's about. And I think with regard to transformation that you brought up, I think one of the things we forget about as leaders we've probably been thinking about a decision or transformation for months or weeks and we're ready to go execute, we're ready to go operationalize that thing. And so sometimes when we get to that point because we've been talking about it for so long we send out the email, we have the all hands, and we just say we're ready to go. But our team members haven't always been on that journey for those months that we have. And so I think that empathetic moment to say, Okay, not everybody is on this change curve where I am, let's take a pause, let me put myself in their shoes and really think about how we bring everybody along the journey. >> Jenn, in the spirit of myth busting I mean, I'm one of those people who felt like that a business is going to have a harder time fostering this culture of collaboration and innovation in post isolation economy as they could pre-COVID. But I notice there's an announcement today that came across my desk, I think it's from Newsweek. Yes, and it's the list of top hundred companies recognized for employee motivation, satisfaction. And it was really interesting because you always see, oh, we're the top 10 or the top 100. But this says as a survey of 1.4 million employees from companies ranging from 50 to 10,000 employees. And it recognizes the companies that put respect, caring, and appreciation for their employees at the center of their business model, and in doing so, have earned the loyalty and respect of the people who work for them. Number one on the list is Dell, SAP. So congratulations. SAP was number two. I mean, there really isn't any other tech company on there certainly no large tech companies on there. So I always see these lists, I go, Yeah, okay that's cool, top a hundred, whatever. But top one in an industry where there's only two in the top is pretty impressive. And how does that relate to fostering my earlier skepticism of a culture of collaboration? So first of all, congratulations. How'd you do it? And how are you succeeding in this new world? >> Well, thanks. It does feel great to be number one, but it doesn't happen by accident. And I think while most companies have a culture, and a spouse values, we have ours called the culture code. But it's really been very important to us that it's not just a poster on the wall or words on paper. And so we embed our culture code into all of our HR practices that whole ecosystem, from recognition rewards, to performance evaluation, to interviewee, to development. We build it into everything so it really reflects who we are and you experience it every day. And then to make sure that we're not fooling ourselves, we ask all of our employees, do you feel like the behaviors you see and the experience you have every day reflects the culture code? And 94% of our team members say that in fact it does. So I think that that's really been kind of the secret to our success. If you listen to Michael Dell, he'll always say, "The most special thing about Dell "is our culture and our people." And that comes through being very thoughtful and deliberate to preserve and protect and continue to focus on our culture. >> I don't you think too that repetition and, well, first of all, belief in that cultural philosophy is important. And then kind of repeating, like you said, Yeah it's not just a poster on the wall. But I remember like, when we're kids your parents tell you, okay, power of positive thinking, do unto others as you have others do it to you. You're going to do some dumb things but don't do the same dumb things twice and you sort of fluff it up. But then as you mature you say, Wow, actually those were- >> They might have had a point, right? >> Values were instilled in me and now I'm bringing them forward and paying it forward. But I guess my point is, and it's kind of a point observation but I'll turn it into a question. Isn't consistency and belief in your values really, really important? >> I couldn't agree with you more, right? I think that's one of those things that we talk about it all the time. And as an HR professional, it's not the HR people just talking about our culture. It's our business leaders, it's our CEO, it's our COOs, it's our partners. We share our culture code with our partners and our vendors and our suppliers and everybody, this is important. We say when you interact with anybody at Dell Technologies, you should expect that this is the experience that you're going to get. And so it is something that we talk about that we embed into everything that we do. And I think it's really important that you don't just think it's a one and done 'cause that's not how things really work. >> Well, and it's a culture of respect, high performance, high expectations, accountability, having followed the company and worked with the company for many, many years, you always respect the dignity of your partners and your people. So really appreciate your time, Jenn. Again, congratulations on being number one. >> Thank you so much. >> You're very welcome. Okay, you've been watching a special presentation of theCUBE inside Dell Technology Summit 2022. Remember, these episodes are all available on demand at thecube.net and you can check out siliconangle.com for all the news and analysis. And don't forget to check out wikibon.com each week for a new episode of Breaking Analysis. This is Dave Vellante, thanks for watching and we'll see you next time. (bright upbeat music)

Published Date : Oct 6 2022

SUMMARY :

Jenn, the conversations that we had So I'm excited to share out the return to work we asked them, do you think we're leading say, Hey, we want you guys to and that we support them What do you think about that? and make sure that we are that in just in talking to And that to me is really what it's about. And how does that relate to and the experience you have every day and you sort of fluff it up. and it's kind of a point observation And so it is something that we talk about Well, and it's a culture and you can check out siliconangle.com

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Dave VellantePERSON

0.99+

Michael DellPERSON

0.99+

Jenn SaavedraPERSON

0.99+

JennPERSON

0.99+

DellORGANIZATION

0.99+

DavePERSON

0.99+

March of 2020DATE

0.99+

94%QUANTITY

0.99+

todayDATE

0.99+

86%QUANTITY

0.99+

Last MayDATE

0.99+

Jen SaavedraPERSON

0.99+

50QUANTITY

0.99+

DellPERSON

0.99+

SAPORGANIZATION

0.99+

Dell TechnologiesORGANIZATION

0.99+

twiceQUANTITY

0.99+

oneQUANTITY

0.99+

siliconangle.comOTHER

0.98+

Dell Tech SummitEVENT

0.98+

10,000 employeesQUANTITY

0.98+

1.4 million employeesQUANTITY

0.98+

thecube.netOTHER

0.98+

three days a weekQUANTITY

0.98+

each weekQUANTITY

0.98+

FaceTimeTITLE

0.98+

two thirdsQUANTITY

0.95+

one approachQUANTITY

0.94+

twoQUANTITY

0.93+

Dell Technology Summit 2022EVENT

0.92+

hundredQUANTITY

0.89+

over a decadeQUANTITY

0.87+

last two yearsDATE

0.85+

number twoQUANTITY

0.82+

pandemicEVENT

0.81+

Breaking AnalysisTITLE

0.79+

top oneQUANTITY

0.79+

top a hundredQUANTITY

0.76+

Culture Willow RoadORGANIZATION

0.75+

past MayDATE

0.72+

top 10QUANTITY

0.72+

100QUANTITY

0.7+

Number oneQUANTITY

0.68+

Dell TechORGANIZATION

0.63+

NewsweekORGANIZATION

0.58+

firstQUANTITY

0.58+

wikibon.comORGANIZATION

0.57+

individualQUANTITY

0.53+

WorldEVENT

0.49+

Breaking Analysis: As the tech tide recedes, all sectors feel the pinch


 

>> From theCUBE Studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR. This is "Breaking Analysis" with Dave Vellante. >> Virtually all tech companies have expressed caution in their respective earnings calls, and why not? I know you're sick in talking about the macroeconomic environment, but it's full of uncertainties and there's no upside to providing aggressive guidance when sellers are in control. They punish even the slightest miss. Moreover, the spending data confirms the softening market across the board, so it's becoming expected that CFOs will guide cautiously. But companies facing execution challenges, they can't hide behind the macro, which is why it's important to understand which firms are best positioned to maintain momentum through the headwinds and come out the other side stronger. Hello, and welcome to this week's Wikibon Cube Insights powered by ETR. In this "Breaking Analysis," we'll do three things. First, we're going to share a high-level view of the spending pinch that almost all sectors are experiencing. Second, we're going to highlight some of those companies that continue to show notably strong momentum and relatively high spending velocity on their platforms, albeit less robust than last year. And third, we're going to give you a peak at how one senior technology leader in the financial sector sees the competitive dynamic between AWS, Snowflake, and Databricks. So I landed on the red eye this morning and opened my eyes, and then opened my email to see this. My Barron's Daily had a headline telling me how bad things are and why they could get worse. The S&P Thursday hit a new closing low for the year. The safe haven of bonds are sucking wind. The market hasn't seemed to find a floor. Central banks are raising rates. Inflation is still high, but the job market remains strong. Oh, not to mention that the US debt service is headed toward a trillion dollars per year, and the geopolitical situation is pretty tense, and Europe seems to be really struggling. Yeah, so the Santa Claus rally is really looking pretty precarious, especially if there's a liquidity crunch coming, like guess why they call Barron's Barron's. Last week, we showed you this graphic ahead of the UiPath event. For months, the big four sectors, cloud, containers, AI, and RPA, have shown spending momentum above the rest. Now, this chart shows net score or spending velocity on specific sectors, and these four have consistently trended above the 40% red line for two years now, until this past ETR survey. ML/AI and RPA have decelerated as shown by the squiggly lines, and our premise was that they are more discretionary than the other sectors. The big four is now the big two: cloud and containers. But the reality is almost every sector in the ETR taxonomy is down as shown here. This chart shows the sectors that have decreased in a meaningful way. Almost all sectors are now below the trend line and only cloud and containers, as we showed earlier, are above the magic 40% mark. Container platforms and container orchestration are those gray dots. And no sector has shown a significant increase in spending velocity relative to October 2021 survey. In addition to ML/AI and RPA, information security, yes, security, virtualizations, video conferencing, outsourced IT, syndicated research. Syndicated research, yeah, those Gartner, IDC, Forrester, they stand out as seemingly the most discretionary, although we would argue that security is less discretionary. But what you're seeing is a share shift as we've previously reported toward modern platforms and away from point tools. But the point is there is no sector that is immune from the macroeconomic environment. Although remember, as we reported last week, we're still expecting five to 6% IT spending growth this year relative to 2021, but it's a dynamic environment. So let's now take a look at some of the key players and see how they're performing on a relative basis. This chart shows the net score or spending momentum on the y-axis and the pervasiveness of the vendor within the ETR survey measured as the percentage of respondents citing the vendor in use. As usual, Microsoft and AWS stand out because they are both pervasive on the x-axis and they're highly elevated on the vertical axis. For two companies of this size that demonstrate and maintain net scores above the 40% mark is extremely impressive. Although AWS is now showing much higher on the vertical scale relative to Microsoft, which is a new trend. Normally, we see Microsoft dominating on both dimensions. Salesforce is impressive as well because it's so large, but it's below those two on the vertical axis. Now, Google is meaningfully large, but relative to the other big public clouds, AWS and Azure, we see this as disappointing. John Blackledge of Cowen went on CNBC this past week and said that GCP, by his estimates, are 75% of Google Cloud's reported revenue and is now only five years behind AWS in Azure. Now, our models say, "No way." Google Cloud Platform, by our estimate, is running at about $3 billion per quarter or more like 60% of Google's reported overall cloud revenue. You have to go back to 2016 to find AWS running at that level and 2018 for Azure. So we would estimate that GCP is six years behind AWS and four years behind Azure from a revenue performance standpoint. Now, tech-wise, you can make a stronger case for Google. They have really strong tech. But revenue is, in our view, a really good indicator. Now, we circle here ServiceNow because they have become a generational company and impressively remain above the 40% line. We were at CrowdStrike with theCUBE two weeks ago, and we saw firsthand what we see as another generational company in the making. And you can see the company spending momentum is quite impressive. Now, HashiCorp and Snowflake have now surpassed Kubernetes to claim the top net score spots. Now, we know Kubernetes isn't a company, but ETR tracks it as though it were just for context. And we've highlighted Databricks as well, showing momentum, but it doesn't have the market presence of Snowflake. And there are a number of other players in the green: Pure Storage, Workday, Elastic, JFrog, Datadog, Palo Alto, Zscaler, CyberArk, Fortinet. Those last ones are in security, but again, they're all off their recent highs of 2021 and early 2022. Now, speaking of AWS, Snowflake, and Databricks, our colleague Eric Bradley of ETR recently held an in-depth interview with a senior executive at a large financial institution to dig into the analytics space. And there were some interesting takeaways that we'd like to share. The first is a discussion about whether or not AWS can usurp Snowflake as the top dog in analytics. I'll let you read this at your at your leisure, but I'll pull out some call-outs as indicated by the red lines. This individual's take was quite interesting. Note the comment that quote, this is my area of expertise. This person cited AWS's numerous databases as problematic, but Redshift was cited as the closest competitors to Snowflake. This individual also called out Snowflake's current cross-cloud Advantage, what we sometimes call supercloud, as well as the value add in their marketplace as a differentiator. But the point is this person was actually making, the point that this person was actually making is that cloud vendors make a lot of money from Snowflake. AWS, for example, see Snowflake as much more of a partner than a competitor. And as we've reported, Snowflake drives a lot of EC2 and storage revenue for AWS. Now, as well, this doesn't mean AWS does not have a strong marketplace. It does. Probably the best in the business, but the point is Snowflake's marketplace is exclusively focused on a data marketplace and the company's challenge or opportunity is to build up that ecosystem and to continue to add partners and create network effects that allow them to create long-term sustainable moat for the company, while at the same time, staying ahead of the competition with innovation. Now, the other comment that caught our attention was Snowflake's differentiators. This individual cited three areas. One, the well-known separation of compute and storage, which, of course, AWS has replicated sort of, maybe not as elegant in the sense that you can reduce the compute load with Redshift, but unlike Snowflake, you can't shut it down. Two, with Snowflake's data sharing capability, which is becoming quite well-known and a key part of its value proposition. And three, its marketplace. And again, key opportunity for Snowflake to build out its ecosystem. Close feature gaps that it's not necessarily going to deliver on its own. And really importantly, create governed and secure data sharing experiences for anyone on the data cloud or across clouds. Now, the last thing this individual addressed in the ETR interview that we'll share is how Databricks and Snowflake are attacking a similar problem, i.e. simplifying data, data sharing, and getting more value from data. The key messages here are there's overlap with these two platforms, but Databricks appeals to a more techy crowd. You open a notebook, when you're working with Databricks, you're more likely to be a data scientist, whereas with Snowflake, you're more likely to be aligned with the lines of business within sometimes an industry emphasis. We've talked about this quite often on "Breaking Analysis." Snowflake is moving into the data science arena from its data warehouse strength, and Databricks is moving into analytics and the world of SQL from its AI/ML position of strength, and both companies are doing well, although Snowflake was able to get to the public markets at IPO, Databricks has not. Now, even though Snowflake is on the quarterly shock clock as we saw earlier, it has a larger presence in the market. That's at least partly due to the tailwind of an IPO, and, of course, a stronger go-to market posture. Okay, so we wanted to share some of that with you, and I realize it's a bit of a tangent, but it's good stuff from a qualitative practitioner perspective. All right, let's close with some final thoughts. Look forward a little bit. Things in the short-term are really hard to predict. We've seen these oversold rallies peter out for the last couple of months because the world is such a mess right now, and it's really difficult to reconcile these counterveiling trends. Nothing seems to be working from a public policy perspective. Now, we know tech spending is softening, but let's not forget it, five to 6% growth. It's at or above historical norms, but there's no question the trend line is down. That said, there are certain growth companies, several mentioned in this episode, that are modern and vying to be generational platforms. They're well-positioned, financially sound, disciplined, with strong cash positions, with inherent profitability. What I mean by that is they can dial down growth if they wanted to, dial up EBIT, but being a growth company today is not what it was a year ago. Because of rising rates, the discounted cash flows are just less attractive. So earnings estimates, along with revenue multiples on these growth companies, are reverting toward the mean. However, companies like Snowflake, and CrowdStrike, and some others are able to still command a relative premium because of their execution and continued momentum. Others, as we reported last week, like UiPath for example, despite really strong momentum and customer spending, have had execution challenges. Okta is another example of a company with strong spending momentum, but is absorbing off zero for example. And as a result, they're getting hit harder from evaluation standpoint. The bottom line is sellers are still firmly in control, the bulls have been humbled, and the traders aren't buying growth tech or much tech at all right now. But long-term investors are looking for entry points because these generational companies are going to be worth significantly more five to 10 years down the line. Okay, that's it for today. Thanks for watching this "Breaking Analysis" episode. Thanks to Alex Myerson and Ken Schiffman on production. And Alex manages our podcast as well. Kristen Martin and Cheryl Knight. They help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at SiliconANGLE do some wonderful editing for us, so thank you. Thank you all. Remember that all these episodes are available as podcast wherever you listen. All you do is search "Breaking Analysis" podcast. I publish each week on wikibon.com and siliconangle.com and you can email me at david.vellante@siliconangle.com, or DM me @dvellante, or comment on my LinkedIn post. And please check out etr.ai for the very best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching, and we'll see you next time on "Breaking Analysis." (gentle music)

Published Date : Oct 2 2022

SUMMARY :

This is "Breaking Analysis" and come out the other side stronger.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
AWSORGANIZATION

0.99+

Eric BradleyPERSON

0.99+

Cheryl KnightPERSON

0.99+

Dave VellantePERSON

0.99+

Alex MyersonPERSON

0.99+

Kristen MartinPERSON

0.99+

Ken SchiffmanPERSON

0.99+

October 2021DATE

0.99+

John BlackledgePERSON

0.99+

fiveQUANTITY

0.99+

Rob HofPERSON

0.99+

two companiesQUANTITY

0.99+

MicrosoftORGANIZATION

0.99+

Last weekDATE

0.99+

GartnerORGANIZATION

0.99+

DatabricksORGANIZATION

0.99+

SnowflakeORGANIZATION

0.99+

ForresterORGANIZATION

0.99+

Palo AltoLOCATION

0.99+

2021DATE

0.99+

IDCORGANIZATION

0.99+

75%QUANTITY

0.99+

last weekDATE

0.99+

GoogleORGANIZATION

0.99+

FortinetORGANIZATION

0.99+

2018DATE

0.99+

2016DATE

0.99+

DatadogORGANIZATION

0.99+

AlexPERSON

0.99+

two yearsQUANTITY

0.99+

Palo AltoORGANIZATION

0.99+

OktaORGANIZATION

0.99+

four yearsQUANTITY

0.99+

last weekDATE

0.99+

UiPathORGANIZATION

0.99+

david.vellante@siliconangle.comOTHER

0.99+

40%QUANTITY

0.99+

last yearDATE

0.99+

CyberArkORGANIZATION

0.99+

60%QUANTITY

0.99+

six yearsQUANTITY

0.99+

both companiesQUANTITY

0.99+

FirstQUANTITY

0.99+

ZscalerORGANIZATION

0.99+

threeQUANTITY

0.99+

SecondQUANTITY

0.99+

ETRORGANIZATION

0.99+

CrowdStrikeORGANIZATION

0.99+

firstQUANTITY

0.99+

thirdQUANTITY

0.99+

JFrogORGANIZATION

0.99+

SiliconANGLEORGANIZATION

0.99+

three areasQUANTITY

0.99+

a year agoDATE

0.99+

SnowflakeTITLE

0.99+

each weekQUANTITY

0.99+

S&PORGANIZATION

0.99+

five yearsQUANTITY

0.99+

Pure StorageORGANIZATION

0.99+

twoQUANTITY

0.98+

ElasticORGANIZATION

0.98+

WorkdayORGANIZATION

0.98+

two weeks agoDATE

0.98+

Day 1 Keynote Analysis | CrowdStrike Fal.Con 2022


 

(upbeat music) >> Hello everyone, and welcome to Fal.Con 2022, CrowdStrike's big user conference. You're watching the Cube. My name is Dave Vallante. I'm here with my co-host David Nicholson. CrowdStrike is a company that was founded over 10 years ago. This is about 11 years, almost to the day. They're 2 billion company in revenue terms. They're growing at about 60% a year. They've got a path they've committed to wall street. They've got a path to $5 billion by mid decade. They got a $40 billion market cap. They're free, free cash flow positive and trying to build essentially a generational company with a very growing Tam and a modern platform. CrowdStrike has the fundamental belief that the unstoppable breach is a myth. David Nicholson, even though CSOs don't believe that, CrowdStrike is on a mission. Right? >> I didn't hear the phrase. Zero trust mentioned in the keynote >> Right. >> What was mentioned was this idea that CrowdStrike isn't simply a tool, it's a platform. And obviously it takes a platform to get to 5 billion. >> Yeah. So let's talk about the keynote. George Kurtz, the CEO came on. I thought the keynote was, was measured, but very substantive. It was not a lot of hype in there. Most security conferences, the two exceptions are this one and Reinforce, Amazon's big security conference. Steven Schmidt. The first time I was at a Reinforce said "All this narrative about security is such a bad industry" and "We're not doing a great job." And "It's so scary." That doesn't help the industry. George Kurtz sort of took a similar message. And you know what, Dave? When I think of security outside the context of IT I think of like security guards >> Right. >> Like protecting the billionaires. Right? That's a powerful, you know, positive thing. It's not really a defensive movement even though it is defensive but so that was kind of his posture there. But he talked about essentially what I call, not his words permanent changes in the, in the in the cyber defense industry, subsequent to the pandemic. Again, he didn't specifically mention the pandemic but he alluded to, you know, this new world that we live in. Fal.Con is a hundred sessions, eight tracks. And really his contention is we're in the early innings. These guys got 20,000 customers. And I think they got the potential to have hundreds of thousands. >> Yeah. Yeah. So, if I'm working with a security company I want them to be measured. I'm not looking for hype. I don't want those. I don't want those guards to be in disco shirts. I want them in black suits. So, you know, so the, the, the point about measured is is I think a positive one. I was struck by the competence of the people who were on stage today. I have seen very very large companies become kind of bureaucratic. And sometimes you don't get the best of the best up on stage. And we saw a lot of impressive folks. >> Yeah. Michael Santonis get up, but before we get to him. So, a couple points that Kurtz made he said, "digital transformation is needed to bring modern architectures to IT. And that brings modern security." And he laid out that whole sort of old way, new way very Andy Jassy-like old guard, new guard. He didn't hit on it that hard but he basically said "security is all about mitigating risk." And he mentioned that the the CSO I say CSO, he says CSO or CSO has a seat at the board. Now, many CSOs are board level participants. And then he went into the sort of four pillars of, of workload, and the areas that they focus on. So workload to them is end point, identity, and then data. They don't touch network security. That's where they partner with the likes of Cisco, >> Right. >> And Palo Alto networks. But then they went deep into identity threat protection, data, which is their observability platform from an acquisition called Humio. And then they went big time into XDR. We're going to talk about all this stuff. He said, "data is the new digital currency." Talked a lot about how they're now renaming, Humio, Log Scale. That's their Splunk killer. We're going to talk about that all week. And he talked a little bit about the single agent architecture. That is kind of the linchpin of CrowdStrike's architecture. And then Michael Santonis, the CTO came on and did a deep dive into each of those, and really went deep into XDR extended, right? Detection and response. XDR building on EDR. >> Yeah. I think the subject of XDR is something we'll be, we'll be touching on a lot. I think in the next two days. I thought the extension into observability was very, very interesting. When you look at performance metrics, where things are gathering those things in and being able to use a single agent to do so. That speaks to this idea that they are a platform and not just a tool. It's easy to say that you aspire to be a platform. I think that's a proof point. On the subject, by the way of their fundamental architecture. Over the years, there have been times when saying that your infrastructure requires an agent that would've been a deal killer. People say "No agents!" They've stuck to their guns because they know that the best way to deliver what they deliver is to have an agent in the environment. And it has proven to be the right strategy. >> Well, this is one of the things I want to explore with the technical architects that come on here today is, how do you build a lightweight agent that can do everything that you say it's going to do? Because they started out at endpoint, and then they've extended it to all these other modules, you know, identity. They're now into observability. They've got this data platform. They just announced that acquisition of another company they bought Preempt, which is their identity. They announced Responsify, responsify? Reposify, which is sort of extends the observability and gives them visualization or visibility. And I'm like, how do you take? How do you keep an agent lightweight? That's one of the things I want to better understand. And then the other is, as you get into XDR I thought Michael Santonis was pretty interesting. He had black hat last month. He did a little video, you know. >> That was great >> Man in the street, what's XDR what's XDR what's XDR. I thought the best response was, somebody said "a holistic approach to end point security." And so it's really an evolution of, of EDR. So we're going to talk about that. But, how do you keep an agent lightweight and still support all these other capabilities? That's something I really want to dig into, you know, without getting bloated. >> Yeah, Yeah. I think it's all about the TLAs, Dave. It's about the S, it's about SDKs and APIs and having an ecosystem of partners that will look at the lightweight agent and then develop around it. Again, going back to the idea of platform, it's critical. If you're trying to do it all on your own, you get bloat. If you try to be all things to all people with your agent, if you try to reverse engineer every capability that's out there, it doesn't work. >> Well that's one of the things that, again I want to explore because CrowdStrike is trying to be a generational company. In the Breaking Analysis that we published this week. One of the things I said, "In order to be a generational company you have to have a strong ecosystem." Now the ecosystem here is respectable, you know, but it's obviously not AWS class. You know, I think Snowflake is a really good example, ServiceNow. This feels to me like ServiceNow circa 2013. >> Yeah. >> And we've seen how ServiceNow has evolved. You know, Okta, bought Off Zero to give them the developer angle. We heard a little bit about a developer platform today. I want to dig into that some more. And we heard a lot about everybody hates their DLP. I want to get rid of my DLP, data loss prevention. And so, and the same thing with the SIM. One of the ETR round table, Eric Bradley, our colleague at a round table said "If it weren't for the compliance requirements, I would replace my SIM with XDR." And so that's again, another interesting topic. CrowdStrike, cloud native, lightweight agent, you know, some really interesting tuck in acquisitions. Great go-to-market, you know, not super hype just product that works and gets stuff done, you know, seems to have a really good, bright future. >> Yeah, no, I would agree. Definitely. No hype necessary. Just constant execution moving forward. It's clearly something that will be increasingly in demand. Another subject that came up that I thought was interesting, in the keynote, was this idea of security for elections, extending into the realm of misinformation and disinformation which are both very very loaded terms. It'll be very interesting to see how security works its way into that realm in the future. >> Yeah, yeah, >> Yeah. >> Yeah, his guy, Kevin Mandia, who is the CEO of Mandiant, which just got acquired. Google just closed the deal for $5.4 billion. I thought that was kind of light, by the way, I thought Mandiant was worth more than that. Still a good number, but, and Kevin, you know was the founder and, >> Great guy. >> they were self-funded. >> Yeah, yeah impressive. >> So. But I thought he was really impressive. He talked about election security in terms of hardening you know, the election infrastructure, but then, boom he went right to what I see as the biggest issue, disinformation. And so I'm sitting there asking myself, okay how do you deal with that? And what he talked about was mapping network effects and monitoring network effects, >> Right. >> to see who's pumping the disinformation and building career streams to really monitor those network effects, positive, you know, factual or non-factual network or information. Because a lot of times, you know, networks will pump factual information to build credibility. Right? >> Right. >> And get street cred, earn that trust. You know, you talk about zero trust. And then pump disinformation into the network. So they've now got a track. We'll get, we have Kevin Mandia on later with Sean Henry who's the CSO yeah, the the CSO or C S O, chief security officer of CrowdStrike >> more TLA. Well, so, you can think of it as almost the modern equivalent of the political ad where the candidate at the end says I support this ad or I stand behind whatever's in this ad. Forget about trying to define what is dis or misinformation. What is opinion versus fact. Let's have a standard for finding, for exposing where the information is coming from. So if you could see, if you're reading something and there is something that is easily de-code able that says this information is coming from a troll farm of a thousand bots and you can sort of examine the underlying ethos behind where this information is coming from. And you can take that into consideration. Personally, I'm not a believer in trying to filter stuff out. Put the garbage out there, just make sure people know where the garbage is coming from so they can make decisions about it. >> So I got a thought on that because, Kevin Mandia touched on it. Again, I want to ask about this. He said, so this whole idea of these, you know detecting the bots and monitoring the networks. Then he said, you can I think he said something that's to the effect of. "You can go on the offensive." And I'm thinking, okay, what does that mean? So for instance, you see it all the time. Anytime I see some kind of fact put out there, I got to start reading the comments and like cause I like to see both sides, you know. I'm right down the middle. And you'll go down and like 40 comments down, you're like, oh this is, this is fake. This video was edited, >> Right. >> Da, da, da, da, and then a bunch of other people. But then the bots take over and that gets buried. So, maybe going on the offensive is to your point. Go ahead and put it out there. But then the bots, the positive bots say, okay, by the way, this is fake news. This is an edited video FYI. And this is who put it out and here's the bot graph or something like that. And then you attack the bots with more bots and then now everybody can sort of of see it, you know? And it's not like you don't have to, you know email your friend and saying, "Hey dude, this is fake news." >> Right, right. >> You know, Do some research. >> Yeah. >> Put the research out there in volume is what you're saying. >> Yeah. So, it's an, it's just I thought it was an interesting segue into another area of security under the heading of election security. That is fraught with a lot of danger if done wrong, if done incorrectly, you know, you you get into the realm of opinion making. And we should be free to see information, but we also should have access to information about where the information is coming from. >> The other narrative that you hear. So, everything's down today again and I haven't checked lately, but security generally, we wrote about this in our Breaking Analysis. Security, somewhat, has held up in the stock market better than the broad tech market. Why? And the premise is, George Kurt said this on the last conference call, earnings call, that "security is non-discretionary." At the same time he did say that sales cycles are getting a little longer, but we see this as a positive for CrowdStrike. Because CrowdStrike, their mission, or one of their missions is to consolidate all these point tools. We've talked many, many times in the Cube, and in Breaking Analysis and on Silicon Angle, and on Wikibon, how the the security business use too many point tools. You know this as a former CTO. And, now you've got all these stove pipes, the number one challenge the CSOs face is lack of talent. CrowdStrike's premise is they can consolidate that with the Fal.Con platform, and have a single point of control. "Single pane of glass" to use that bromide. So, the question is, is security really non-discretionary? My answer to that is yes and no. It is to a sense, because security is the number one priority. You can't be lax on security. But at the same time the CSO doesn't have an open checkbook, >> Right. >> He or she can't just say, okay, I need this. I need that. I need this. There's other competing initiatives that have to be taken in balance. And so, we've seen in the ETR spending data, you know. By the way, everything's up relative to where it was, pre you know, right at the pandemic, right when, pandemic year everything was flat to down. Everything's up, really up last year, I don't know 8 to 10%. It was expected to be up 8% this year, let's call it 6 to 7% in 21. We were calling for 7 to 8% this year. It's back down to like, you know, 4 or 5% now. It's still healthy, but it's softer. People are being more circumspect. People aren't sure about what the fed's going to do next. Interest rates, you know, loom large. A lot of uncertainty out here. So, in that sense, I would say security is not non-discretionary. Sorry for the double negative. What's your take? >> I think it's less discretionary. >> Okay. >> Food, water, air. Non-discretionary. (David laughing) And then you move away in sort of gradations from that point. I would say that yeah, it is, it falls into the category of less-discretionary. >> Alright. >> Which is a good place to be. >> Dave Nicholson and David Vallante here. Two days of wall to wall coverage of Fal.Con 2022, CrowdStrike's big user conference. We got some great guests. Keep it right there, we'll be right back, right after this short break. (upbeat music)

Published Date : Sep 20 2022

SUMMARY :

that the unstoppable breach is a myth. I didn't hear the phrase. platform to get to 5 billion. And you know what, Dave? in the cyber defense industry, of the people who were on stage today. And he mentioned that the That is kind of the linchpin that the best way to deliver And then the other is, as you get into XDR Man in the street, It's about the S, it's about SDKs and APIs One of the things I said, And so, and the same thing with the SIM. into that realm in the future. of light, by the way, Yeah, as the biggest issue, disinformation. Because a lot of times, you know, into the network. And you can take that into consideration. cause I like to see both sides, you know. And then you attack the You know, Put the research out there in volume I thought it was an interesting And the premise is, George Kurt said this the fed's going to do next. And then you move away Two days of wall to wall coverage

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Eric BradleyPERSON

0.99+

Dave VallantePERSON

0.99+

Sean HenryPERSON

0.99+

8QUANTITY

0.99+

David NicholsonPERSON

0.99+

Kevin MandiaPERSON

0.99+

David VallantePERSON

0.99+

Michael SantonisPERSON

0.99+

CiscoORGANIZATION

0.99+

George KurtzPERSON

0.99+

KurtzPERSON

0.99+

Steven SchmidtPERSON

0.99+

George KurtPERSON

0.99+

KevinPERSON

0.99+

Dave NicholsonPERSON

0.99+

GoogleORGANIZATION

0.99+

DavePERSON

0.99+

AmazonORGANIZATION

0.99+

MandiantORGANIZATION

0.99+

7QUANTITY

0.99+

5 billionQUANTITY

0.99+

$5 billionQUANTITY

0.99+

40 commentsQUANTITY

0.99+

Andy JassyPERSON

0.99+

$40 billionQUANTITY

0.99+

$5.4 billionQUANTITY

0.99+

2 billionQUANTITY

0.99+

6QUANTITY

0.99+

20,000 customersQUANTITY

0.99+

4QUANTITY

0.99+

last yearDATE

0.99+

5%QUANTITY

0.99+

CrowdStrikeORGANIZATION

0.99+

last monthDATE

0.99+

ReinforceORGANIZATION

0.99+

two exceptionsQUANTITY

0.99+

AWSORGANIZATION

0.99+

oneQUANTITY

0.99+

both sidesQUANTITY

0.99+

todayDATE

0.99+

DavidPERSON

0.98+

this weekDATE

0.98+

eight tracksQUANTITY

0.98+

bothQUANTITY

0.98+

10%QUANTITY

0.98+

hundreds of thousandsQUANTITY

0.98+

7%QUANTITY

0.98+

this yearDATE

0.97+

OktaORGANIZATION

0.97+

OneQUANTITY

0.97+

Fal.Con 2022EVENT

0.97+

Day 1QUANTITY

0.97+

about 60% a yearQUANTITY

0.97+

Two daysQUANTITY

0.97+

zero trustQUANTITY

0.97+

8%QUANTITY

0.96+

21QUANTITY

0.96+

Fal.ConEVENT

0.96+

hundred sessionsQUANTITY

0.96+

eachQUANTITY

0.95+

over 10 years agoDATE

0.95+

single agentQUANTITY

0.95+

single pointQUANTITY

0.95+

CrowdStrikeTITLE

0.95+

pandemicEVENT

0.95+

first timeQUANTITY

0.95+

Off ZeroORGANIZATION

0.94+

CrowdStrikeEVENT

0.94+

2013DATE

0.92+

PreemptORGANIZATION

0.92+

HumioORGANIZATION

0.92+

Zero trustQUANTITY

0.9+

Breaking Analysis: We Have the Data…What Private Tech Companies Don’t Tell you About Their Business


 

>> From The Cube Studios in Palo Alto and Boston, bringing you data driven insights from The Cube at ETR. This is "Breaking Analysis" with Dave Vellante. >> The reverse momentum in tech stocks caused by rising interest rates, less attractive discounted cash flow models, and more tepid forward guidance, can be easily measured by public market valuations. And while there's lots of discussion about the impact on private companies and cash runway and 409A valuations, measuring the performance of non-public companies isn't as easy. IPOs have dried up and public statements by private companies, of course, they accentuate the good and they kind of hide the bad. Real data, unless you're an insider, is hard to find. Hello and welcome to this week's "Wikibon Cube Insights" powered by ETR. In this "Breaking Analysis", we unlock some of the secrets that non-public, emerging tech companies may or may not be sharing. And we do this by introducing you to a capability from ETR that we've not exposed you to over the past couple of years, it's called the Emerging Technologies Survey, and it is packed with sentiment data and performance data based on surveys of more than a thousand CIOs and IT buyers covering more than 400 companies. And we've invited back our colleague, Erik Bradley of ETR to help explain the survey and the data that we're going to cover today. Erik, this survey is something that I've not personally spent much time on, but I'm blown away at the data. It's really unique and detailed. First of all, welcome. Good to see you again. >> Great to see you too, Dave, and I'm really happy to be talking about the ETS or the Emerging Technology Survey. Even our own clients of constituents probably don't spend as much time in here as they should. >> Yeah, because there's so much in the mainstream, but let's pull up a slide to bring out the survey composition. Tell us about the study. How often do you run it? What's the background and the methodology? >> Yeah, you were just spot on the way you were talking about the private tech companies out there. So what we did is we decided to take all the vendors that we track that are not yet public and move 'em over to the ETS. And there isn't a lot of information out there. If you're not in Silicon (indistinct), you're not going to get this stuff. So PitchBook and Tech Crunch are two out there that gives some data on these guys. But what we really wanted to do was go out to our community. We have 6,000, ITDMs in our community. We wanted to ask them, "Are you aware of these companies? And if so, are you allocating any resources to them? Are you planning to evaluate them," and really just kind of figure out what we can do. So this particular survey, as you can see, 1000 plus responses, over 450 vendors that we track. And essentially what we're trying to do here is talk about your evaluation and awareness of these companies and also your utilization. And also if you're not utilizing 'em, then we can also figure out your sales conversion or churn. So this is interesting, not only for the ITDMs themselves to figure out what their peers are evaluating and what they should put in POCs against the big guys when contracts come up. But it's also really interesting for the tech vendors themselves to see how they're performing. >> And you can see 2/3 of the respondents are director level of above. You got 28% is C-suite. There is of course a North America bias, 70, 75% is North America. But these smaller companies, you know, that's when they start doing business. So, okay. We're going to do a couple of things here today. First, we're going to give you the big picture across the sectors that ETR covers within the ETS survey. And then we're going to look at the high and low sentiment for the larger private companies. And then we're going to do the same for the smaller private companies, the ones that don't have as much mindshare. And then I'm going to put those two groups together and we're going to look at two dimensions, actually three dimensions, which companies are being evaluated the most. Second, companies are getting the most usage and adoption of their offerings. And then third, which companies are seeing the highest churn rates, which of course is a silent killer of companies. And then finally, we're going to look at the sentiment and mindshare for two key areas that we like to cover often here on "Breaking Analysis", security and data. And data comprises database, including data warehousing, and then big data analytics is the second part of data. And then machine learning and AI is the third section within data that we're going to look at. Now, one other thing before we get into it, ETR very often will include open source offerings in the mix, even though they're not companies like TensorFlow or Kubernetes, for example. And we'll call that out during this discussion. The reason this is done is for context, because everyone is using open source. It is the heart of innovation and many business models are super glued to an open source offering, like take MariaDB, for example. There's the foundation and then there's with the open source code and then there, of course, the company that sells services around the offering. Okay, so let's first look at the highest and lowest sentiment among these private firms, the ones that have the highest mindshare. So they're naturally going to be somewhat larger. And we do this on two dimensions, sentiment on the vertical axis and mindshare on the horizontal axis and note the open source tool, see Kubernetes, Postgres, Kafka, TensorFlow, Jenkins, Grafana, et cetera. So Erik, please explain what we're looking at here, how it's derived and what the data tells us. >> Certainly, so there is a lot here, so we're going to break it down first of all by explaining just what mindshare and net sentiment is. You explain the axis. We have so many evaluation metrics, but we need to aggregate them into one so that way we can rank against each other. Net sentiment is really the aggregation of all the positive and subtracting out the negative. So the net sentiment is a very quick way of looking at where these companies stand versus their peers in their sectors and sub sectors. Mindshare is basically the awareness of them, which is good for very early stage companies. And you'll see some names on here that are obviously been around for a very long time. And they're clearly be the bigger on the axis on the outside. Kubernetes, for instance, as you mentioned, is open source. This de facto standard for all container orchestration, and it should be that far up into the right, because that's what everyone's using. In fact, the open source leaders are so prevalent in the emerging technology survey that we break them out later in our analysis, 'cause it's really not fair to include them and compare them to the actual companies that are providing the support and the security around that open source technology. But no survey, no analysis, no research would be complete without including these open source tech. So what we're looking at here, if I can just get away from the open source names, we see other things like Databricks and OneTrust . They're repeating as top net sentiment performers here. And then also the design vendors. People don't spend a lot of time on 'em, but Miro and Figma. This is their third survey in a row where they're just dominating that sentiment overall. And Adobe should probably take note of that because they're really coming after them. But Databricks, we all know probably would've been a public company by now if the market hadn't turned, but you can see just how dominant they are in a survey of nothing but private companies. And we'll see that again when we talk about the database later. >> And I'll just add, so you see automation anywhere on there, the big UiPath competitor company that was not able to get to the public markets. They've been trying. Snyk, Peter McKay's company, they've raised a bunch of money, big security player. They're doing some really interesting things in developer security, helping developers secure the data flow, H2O.ai, Dataiku AI company. We saw them at the Snowflake Summit. Redis Labs, Netskope and security. So a lot of names that we know that ultimately we think are probably going to be hitting the public market. Okay, here's the same view for private companies with less mindshare, Erik. Take us through this one. >> On the previous slide too real quickly, I wanted to pull that security scorecard and we'll get back into it. But this is a newcomer, that I couldn't believe how strong their data was, but we'll bring that up in a second. Now, when we go to the ones of lower mindshare, it's interesting to talk about open source, right? Kubernetes was all the way on the top right. Everyone uses containers. Here we see Istio up there. Not everyone is using service mesh as much. And that's why Istio is in the smaller breakout. But still when you talk about net sentiment, it's about the leader, it's the highest one there is. So really interesting to point out. Then we see other names like Collibra in the data side really performing well. And again, as always security, very well represented here. We have Aqua, Wiz, Armis, which is a standout in this survey this time around. They do IoT security. I hadn't even heard of them until I started digging into the data here. And I couldn't believe how well they were doing. And then of course you have AnyScale, which is doing a second best in this and the best name in the survey Hugging Face, which is a machine learning AI tool. Also doing really well on a net sentiment, but they're not as far along on that access of mindshare just yet. So these are again, emerging companies that might not be as well represented in the enterprise as they will be in a couple of years. >> Hugging Face sounds like something you do with your two year old. Like you said, you see high performers, AnyScale do machine learning and you mentioned them. They came out of Berkeley. Collibra Governance, InfluxData is on there. InfluxDB's a time series database. And yeah, of course, Alex, if you bring that back up, you get a big group of red dots, right? That's the bad zone, I guess, which Sisense does vis, Yellowbrick Data is a NPP database. How should we interpret the red dots, Erik? I mean, is it necessarily a bad thing? Could it be misinterpreted? What's your take on that? >> Sure, well, let me just explain the definition of it first from a data science perspective, right? We're a data company first. So the gray dots that you're seeing that aren't named, that's the mean that's the average. So in order for you to be on this chart, you have to be at least one standard deviation above or below that average. So that gray is where we're saying, "Hey, this is where the lump of average comes in. This is where everyone normally stands." So you either have to be an outperformer or an underperformer to even show up in this analysis. So by definition, yes, the red dots are bad. You're at least one standard deviation below the average of your peers. It's not where you want to be. And if you're on the lower left, not only are you not performing well from a utilization or an actual usage rate, but people don't even know who you are. So that's a problem, obviously. And the VCs and the PEs out there that are backing these companies, they're the ones who mostly are interested in this data. >> Yeah. Oh, that's great explanation. Thank you for that. No, nice benchmarking there and yeah, you don't want to be in the red. All right, let's get into the next segment here. Here going to look at evaluation rates, adoption and the all important churn. First new evaluations. Let's bring up that slide. And Erik, take us through this. >> So essentially I just want to explain what evaluation means is that people will cite that they either plan to evaluate the company or they're currently evaluating. So that means we're aware of 'em and we are choosing to do a POC of them. And then we'll see later how that turns into utilization, which is what a company wants to see, awareness, evaluation, and then actually utilizing them. That's sort of the life cycle for these emerging companies. So what we're seeing here, again, with very high evaluation rates. H2O, we mentioned. SecurityScorecard jumped up again. Chargebee, Snyk, Salt Security, Armis. A lot of security names are up here, Aqua, Netskope, which God has been around forever. I still can't believe it's in an Emerging Technology Survey But so many of these names fall in data and security again, which is why we decided to pick those out Dave. And on the lower side, Vena, Acton, those unfortunately took the dubious award of the lowest evaluations in our survey, but I prefer to focus on the positive. So SecurityScorecard, again, real standout in this one, they're in a security assessment space, basically. They'll come in and assess for you how your security hygiene is. And it's an area of a real interest right now amongst our ITDM community. >> Yeah, I mean, I think those, and then Arctic Wolf is up there too. They're doing managed services. You had mentioned Netskope. Yeah, okay. All right, let's look at now adoption. These are the companies whose offerings are being used the most and are above that standard deviation in the green. Take us through this, Erik. >> Sure, yet again, what we're looking at is, okay, we went from awareness, we went to evaluation. Now it's about utilization, which means a survey respondent's going to state "Yes, we evaluated and we plan to utilize it" or "It's already in our enterprise and we're actually allocating further resources to it." Not surprising, again, a lot of open source, the reason why, it's free. So it's really easy to grow your utilization on something that's free. But as you and I both know, as Red Hat proved, there's a lot of money to be made once the open source is adopted, right? You need the governance, you need the security, you need the support wrapped around it. So here we're seeing Kubernetes, Postgres, Apache Kafka, Jenkins, Grafana. These are all open source based names. But if we're looking at names that are non open source, we're going to see Databricks, Automation Anywhere, Rubrik all have the highest mindshare. So these are the names, not surprisingly, all names that probably should have been public by now. Everyone's expecting an IPO imminently. These are the names that have the highest mindshare. If we talk about the highest utilization rates, again, Miro and Figma pop up, and I know they're not household names, but they are just dominant in this survey. These are applications that are meant for design software and, again, they're going after an Autodesk or a CAD or Adobe type of thing. It is just dominant how high the utilization rates are here, which again is something Adobe should be paying attention to. And then you'll see a little bit lower, but also interesting, we see Collibra again, we see Hugging Face again. And these are names that are obviously in the data governance, ML, AI side. So we're seeing a ton of data, a ton of security and Rubrik was interesting in this one, too, high utilization and high mindshare. We know how pervasive they are in the enterprise already. >> Erik, Alex, keep that up for a second, if you would. So yeah, you mentioned Rubrik. Cohesity's not on there. They're sort of the big one. We're going to talk about them in a moment. Puppet is interesting to me because you remember the early days of that sort of space, you had Puppet and Chef and then you had Ansible. Red Hat bought Ansible and then Ansible really took off. So it's interesting to see Puppet on there as well. Okay. So now let's look at the churn because this one is where you don't want to be. It's, of course, all red 'cause churn is bad. Take us through this, Erik. >> Yeah, definitely don't want to be here and I don't love to dwell on the negative. So we won't spend as much time. But to your point, there's one thing I want to point out that think it's important. So you see Rubrik in the same spot, but Rubrik has so many citations in our survey that it actually would make sense that they're both being high utilization and churn just because they're so well represented. They have such a high overall representation in our survey. And the reason I call that out is Cohesity. Cohesity has an extremely high churn rate here about 17% and unlike Rubrik, they were not on the utilization side. So Rubrik is seeing both, Cohesity is not. It's not being utilized, but it's seeing a high churn. So that's the way you can look at this data and say, "Hm." Same thing with Puppet. You noticed that it was on the other slide. It's also on this one. So basically what it means is a lot of people are giving Puppet a shot, but it's starting to churn, which means it's not as sticky as we would like. One that was surprising on here for me was Tanium. It's kind of jumbled in there. It's hard to see in the middle, but Tanium, I was very surprised to see as high of a churn because what I do hear from our end user community is that people that use it, like it. It really kind of spreads into not only vulnerability management, but also that endpoint detection and response side. So I was surprised by that one, mostly to see Tanium in here. Mural, again, was another one of those application design softwares that's seeing a very high churn as well. >> So you're saying if you're in both... Alex, bring that back up if you would. So if you're in both like MariaDB is for example, I think, yeah, they're in both. They're both green in the previous one and red here, that's not as bad. You mentioned Rubrik is going to be in both. Cohesity is a bit of a concern. Cohesity just brought on Sanjay Poonen. So this could be a go to market issue, right? I mean, 'cause Cohesity has got a great product and they got really happy customers. So they're just maybe having to figure out, okay, what's the right ideal customer profile and Sanjay Poonen, I guarantee, is going to have that company cranking. I mean they had been doing very well on the surveys and had fallen off of a bit. The other interesting things wondering the previous survey I saw Cvent, which is an event platform. My only reason I pay attention to that is 'cause we actually have an event platform. We don't sell it separately. We bundle it as part of our offerings. And you see Hopin on here. Hopin raised a billion dollars during the pandemic. And we were like, "Wow, that's going to blow up." And so you see Hopin on the churn and you didn't see 'em in the previous chart, but that's sort of interesting. Like you said, let's not kind of dwell on the negative, but you really don't. You know, churn is a real big concern. Okay, now we're going to drill down into two sectors, security and data. Where data comprises three areas, database and data warehousing, machine learning and AI and big data analytics. So first let's take a look at the security sector. Now this is interesting because not only is it a sector drill down, but also gives an indicator of how much money the firm has raised, which is the size of that bubble. And to tell us if a company is punching above its weight and efficiently using its venture capital. Erik, take us through this slide. Explain the dots, the size of the dots. Set this up please. >> Yeah. So again, the axis is still the same, net sentiment and mindshare, but what we've done this time is we've taken publicly available information on how much capital company is raised and that'll be the size of the circle you see around the name. And then whether it's green or red is basically saying relative to the amount of money they've raised, how are they doing in our data? So when you see a Netskope, which has been around forever, raised a lot of money, that's why you're going to see them more leading towards red, 'cause it's just been around forever and kind of would expect it. Versus a name like SecurityScorecard, which is only raised a little bit of money and it's actually performing just as well, if not better than a name, like a Netskope. OneTrust doing absolutely incredible right now. BeyondTrust. We've seen the issues with Okta, right. So those are two names that play in that space that obviously are probably getting some looks about what's going on right now. Wiz, we've all heard about right? So raised a ton of money. It's doing well on net sentiment, but the mindshare isn't as well as you'd want, which is why you're going to see a little bit of that red versus a name like Aqua, which is doing container and application security. And hasn't raised as much money, but is really neck and neck with a name like Wiz. So that is why on a relative basis, you'll see that more green. As we all know, information security is never going away. But as we'll get to later in the program, Dave, I'm not sure in this current market environment, if people are as willing to do POCs and switch away from their security provider, right. There's a little bit of tepidness out there, a little trepidation. So right now we're seeing overall a slight pause, a slight cooling in overall evaluations on the security side versus historical levels a year ago. >> Now let's stay on here for a second. So a couple things I want to point out. So it's interesting. Now Snyk has raised over, I think $800 million but you can see them, they're high on the vertical and the horizontal, but now compare that to Lacework. It's hard to see, but they're kind of buried in the middle there. That's the biggest dot in this whole thing. I think I'm interpreting this correctly. They've raised over a billion dollars. It's a Mike Speiser company. He was the founding investor in Snowflake. So people watch that very closely, but that's an example of where they're not punching above their weight. They recently had a layoff and they got to fine tune things, but I'm still confident they they're going to do well. 'Cause they're approaching security as a data problem, which is probably people having trouble getting their arms around that. And then again, I see Arctic Wolf. They're not red, they're not green, but they've raised fair amount of money, but it's showing up to the right and decent level there. And a couple of the other ones that you mentioned, Netskope. Yeah, they've raised a lot of money, but they're actually performing where you want. What you don't want is where Lacework is, right. They've got some work to do to really take advantage of the money that they raised last November and prior to that. >> Yeah, if you're seeing that more neutral color, like you're calling out with an Arctic Wolf, like that means relative to their peers, this is where they should be. It's when you're seeing that red on a Lacework where we all know, wow, you raised a ton of money and your mindshare isn't where it should be. Your net sentiment is not where it should be comparatively. And then you see these great standouts, like Salt Security and SecurityScorecard and Abnormal. You know they haven't raised that much money yet, but their net sentiment's higher and their mindshare's doing well. So those basically in a nutshell, if you're a PE or a VC and you see a small green circle, then you're doing well, then it means you made a good investment. >> Some of these guys, I don't know, but you see these small green circles. Those are the ones you want to start digging into and maybe help them catch a wave. Okay, let's get into the data discussion. And again, three areas, database slash data warehousing, big data analytics and ML AI. First, we're going to look at the database sector. So Alex, thank you for bringing that up. Alright, take us through this, Erik. Actually, let me just say Postgres SQL. I got to ask you about this. It shows some funding, but that actually could be a mix of EDB, the company that commercializes Postgres and Postgres the open source database, which is a transaction system and kind of an open source Oracle. You see MariaDB is a database, but open source database. But the companies they've raised over $200 million and they filed an S-4. So Erik looks like this might be a little bit of mashup of companies and open source products. Help us understand this. >> Yeah, it's tough when you start dealing with the open source side and I'll be honest with you, there is a little bit of a mashup here. There are certain names here that are a hundred percent for profit companies. And then there are others that are obviously open source based like Redis is open source, but Redis Labs is the one trying to monetize the support around it. So you're a hundred percent accurate on this slide. I think one of the things here that's important to note though, is just how important open source is to data. If you're going to be going to any of these areas, it's going to be open source based to begin with. And Neo4j is one I want to call out here. It's not one everyone's familiar with, but it's basically geographical charting database, which is a name that we're seeing on a net sentiment side actually really, really high. When you think about it's the third overall net sentiment for a niche database play. It's not as big on the mindshare 'cause it's use cases aren't as often, but third biggest play on net sentiment. I found really interesting on this slide. >> And again, so MariaDB, as I said, they filed an S-4 I think $50 million in revenue, that might even be ARR. So they're not huge, but they're getting there. And by the way, MariaDB, if you don't know, was the company that was formed the day that Oracle bought Sun in which they got MySQL and MariaDB has done a really good job of replacing a lot of MySQL instances. Oracle has responded with MySQL HeatWave, which was kind of the Oracle version of MySQL. So there's some interesting battles going on there. If you think about the LAMP stack, the M in the LAMP stack was MySQL. And so now it's all MariaDB replacing that MySQL for a large part. And then you see again, the red, you know, you got to have some concerns about there. Aerospike's been around for a long time. SingleStore changed their name a couple years ago, last year. Yellowbrick Data, Fire Bolt was kind of going after Snowflake for a while, but yeah, you want to get out of that red zone. So they got some work to do. >> And Dave, real quick for the people that aren't aware, I just want to let them know that we can cut this data with the public company data as well. So we can cross over this with that because some of these names are competing with the larger public company names as well. So we can go ahead and cross reference like a MariaDB with a Mongo, for instance, or of something of that nature. So it's not in this slide, but at another point we can certainly explain on a relative basis how these private names are doing compared to the other ones as well. >> All right, let's take a quick look at analytics. Alex, bring that up if you would. Go ahead, Erik. >> Yeah, I mean, essentially here, I can't see it on my screen, my apologies. I just kind of went to blank on that. So gimme one second to catch up. >> So I could set it up while you're doing that. You got Grafana up and to the right. I mean, this is huge right. >> Got it thank you. I lost my screen there for a second. Yep. Again, open source name Grafana, absolutely up and to the right. But as we know, Grafana Labs is actually picking up a lot of speed based on Grafana, of course. And I think we might actually hear some noise from them coming this year. The names that are actually a little bit more disappointing than I want to call out are names like ThoughtSpot. It's been around forever. Their mindshare of course is second best here but based on the amount of time they've been around and the amount of money they've raised, it's not actually outperforming the way it should be. We're seeing Moogsoft obviously make some waves. That's very high net sentiment for that company. It's, you know, what, third, fourth position overall in this entire area, Another name like Fivetran, Matillion is doing well. Fivetran, even though it's got a high net sentiment, again, it's raised so much money that we would've expected a little bit more at this point. I know you know this space extremely well, but basically what we're looking at here and to the bottom left, you're going to see some names with a lot of red, large circles that really just aren't performing that well. InfluxData, however, second highest net sentiment. And it's really pretty early on in this stage and the feedback we're getting on this name is the use cases are great, the efficacy's great. And I think it's one to watch out for. >> InfluxData, time series database. The other interesting things I just noticed here, you got Tamer on here, which is that little small green. Those are the ones we were saying before, look for those guys. They might be some of the interesting companies out there and then observe Jeremy Burton's company. They do observability on top of Snowflake, not green, but kind of in that gray. So that's kind of cool. Monte Carlo is another one, they're sort of slightly green. They are doing some really interesting things in data and data mesh. So yeah, okay. So I can spend all day on this stuff, Erik, phenomenal data. I got to get back and really dig in. Let's end with machine learning and AI. Now this chart it's similar in its dimensions, of course, except for the money raised. We're not showing that size of the bubble, but AI is so hot. We wanted to cover that here, Erik, explain this please. Why TensorFlow is highlighted and walk us through this chart. >> Yeah, it's funny yet again, right? Another open source name, TensorFlow being up there. And I just want to explain, we do break out machine learning, AI is its own sector. A lot of this of course really is intertwined with the data side, but it is on its own area. And one of the things I think that's most important here to break out is Databricks. We started to cover Databricks in machine learning, AI. That company has grown into much, much more than that. So I do want to state to you Dave, and also the audience out there that moving forward, we're going to be moving Databricks out of only the MA/AI into other sectors. So we can kind of value them against their peers a little bit better. But in this instance, you could just see how dominant they are in this area. And one thing that's not here, but I do want to point out is that we have the ability to break this down by industry vertical, organization size. And when I break this down into Fortune 500 and Fortune 1000, both Databricks and Tensorflow are even better than you see here. So it's quite interesting to see that the names that are succeeding are also succeeding with the largest organizations in the world. And as we know, large organizations means large budgets. So this is one area that I just thought was really interesting to point out that as we break it down, the data by vertical, these two names still are the outstanding players. >> I just also want to call it H2O.ai. They're getting a lot of buzz in the marketplace and I'm seeing them a lot more. Anaconda, another one. Dataiku consistently popping up. DataRobot is also interesting because all the kerfuffle that's going on there. The Cube guy, Cube alum, Chris Lynch stepped down as executive chairman. All this stuff came out about how the executives were taking money off the table and didn't allow the employees to participate in that money raising deal. So that's pissed a lot of people off. And so they're now going through some kind of uncomfortable things, which is unfortunate because DataRobot, I noticed, we haven't covered them that much in "Breaking Analysis", but I've noticed them oftentimes, Erik, in the surveys doing really well. So you would think that company has a lot of potential. But yeah, it's an important space that we're going to continue to watch. Let me ask you Erik, can you contextualize this from a time series standpoint? I mean, how is this changed over time? >> Yeah, again, not show here, but in the data. I'm sorry, go ahead. >> No, I'm sorry. What I meant, I should have interjected. In other words, you would think in a downturn that these emerging companies would be less interesting to buyers 'cause they're more risky. What have you seen? >> Yeah, and it was interesting before we went live, you and I were having this conversation about "Is the downturn stopping people from evaluating these private companies or not," right. In a larger sense, that's really what we're doing here. How are these private companies doing when it comes down to the actual practitioners? The people with the budget, the people with the decision making. And so what I did is, we have historical data as you know, I went back to the Emerging Technology Survey we did in November of 21, right at the crest right before the market started to really fall and everything kind of started to fall apart there. And what I noticed is on the security side, very much so, we're seeing less evaluations than we were in November 21. So I broke it down. On cloud security, net sentiment went from 21% to 16% from November '21. That's a pretty big drop. And again, that sentiment is our one aggregate metric for overall positivity, meaning utilization and actual evaluation of the name. Again in database, we saw it drop a little bit from 19% to 13%. However, in analytics we actually saw it stay steady. So it's pretty interesting that yes, cloud security and security in general is always going to be important. But right now we're seeing less overall net sentiment in that space. But within analytics, we're seeing steady with growing mindshare. And also to your point earlier in machine learning, AI, we're seeing steady net sentiment and mindshare has grown a whopping 25% to 30%. So despite the downturn, we're seeing more awareness of these companies in analytics and machine learning and a steady, actual utilization of them. I can't say the same in security and database. They're actually shrinking a little bit since the end of last year. >> You know it's interesting, we were on a round table, Erik does these round tables with CISOs and CIOs, and I remember one time you had asked the question, "How do you think about some of these emerging tech companies?" And one of the executives said, "I always include somebody in the bottom left of the Gartner Magic Quadrant in my RFPs. I think he said, "That's how I found," I don't know, it was Zscaler or something like that years before anybody ever knew of them "Because they're going to help me get to the next level." So it's interesting to see Erik in these sectors, how they're holding up in many cases. >> Yeah. It's a very important part for the actual IT practitioners themselves. There's always contracts coming up and you always have to worry about your next round of negotiations. And that's one of the roles these guys play. You have to do a POC when contracts come up, but it's also their job to stay on top of the new technology. You can't fall behind. Like everyone's a software company. Now everyone's a tech company, no matter what you're doing. So these guys have to stay in on top of it. And that's what this ETS can do. You can go in here and look and say, "All right, I'm going to evaluate their technology," and it could be twofold. It might be that you're ready to upgrade your technology and they're actually pushing the envelope or it simply might be I'm using them as a negotiation ploy. So when I go back to the big guy who I have full intentions of writing that contract to, at least I have some negotiation leverage. >> Erik, we got to leave it there. I could spend all day. I'm going to definitely dig into this on my own time. Thank you for introducing this, really appreciate your time today. >> I always enjoy it, Dave and I hope everyone out there has a great holiday weekend. Enjoy the rest of the summer. And, you know, I love to talk data. So anytime you want, just point the camera on me and I'll start talking data. >> You got it. I also want to thank the team at ETR, not only Erik, but Darren Bramen who's a data scientist, really helped prepare this data, the entire team over at ETR. I cannot tell you how much additional data there is. We are just scratching the surface in this "Breaking Analysis". So great job guys. I want to thank Alex Myerson. Who's on production and he manages the podcast. Ken Shifman as well, who's just coming back from VMware Explore. Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor in chief over at SiliconANGLE. Does some great editing for us. Thank you. All of you guys. Remember these episodes, they're all available as podcast, wherever you listen. All you got to do is just search "Breaking Analysis" podcast. I publish each week on wikibon.com and siliconangle.com. Or you can email me to get in touch david.vellante@siliconangle.com. You can DM me at dvellante or comment on my LinkedIn posts and please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for Erik Bradley and The Cube Insights powered by ETR. Thanks for watching. Be well. And we'll see you next time on "Breaking Analysis". (upbeat music)

Published Date : Sep 7 2022

SUMMARY :

bringing you data driven it's called the Emerging Great to see you too, Dave, so much in the mainstream, not only for the ITDMs themselves It is the heart of innovation So the net sentiment is a very So a lot of names that we And then of course you have AnyScale, That's the bad zone, I guess, So the gray dots that you're rates, adoption and the all And on the lower side, Vena, Acton, in the green. are in the enterprise already. So now let's look at the churn So that's the way you can look of dwell on the negative, So again, the axis is still the same, And a couple of the other And then you see these great standouts, Those are the ones you want to but Redis Labs is the one And by the way, MariaDB, So it's not in this slide, Alex, bring that up if you would. So gimme one second to catch up. So I could set it up but based on the amount of time Those are the ones we were saying before, And one of the things I think didn't allow the employees to here, but in the data. What have you seen? the market started to really And one of the executives said, And that's one of the Thank you for introducing this, just point the camera on me We are just scratching the surface

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
ErikPERSON

0.99+

Alex MyersonPERSON

0.99+

Ken ShifmanPERSON

0.99+

Sanjay PoonenPERSON

0.99+

Dave VellantePERSON

0.99+

DavePERSON

0.99+

Erik BradleyPERSON

0.99+

November 21DATE

0.99+

Darren BramenPERSON

0.99+

AlexPERSON

0.99+

Cheryl KnightPERSON

0.99+

PostgresORGANIZATION

0.99+

DatabricksORGANIZATION

0.99+

NetskopeORGANIZATION

0.99+

AdobeORGANIZATION

0.99+

Rob HofPERSON

0.99+

FivetranORGANIZATION

0.99+

$50 millionQUANTITY

0.99+

21%QUANTITY

0.99+

Chris LynchPERSON

0.99+

19%QUANTITY

0.99+

Jeremy BurtonPERSON

0.99+

$800 millionQUANTITY

0.99+

6,000QUANTITY

0.99+

OracleORGANIZATION

0.99+

Redis LabsORGANIZATION

0.99+

November '21DATE

0.99+

ETRORGANIZATION

0.99+

FirstQUANTITY

0.99+

25%QUANTITY

0.99+

last yearDATE

0.99+

OneTrustORGANIZATION

0.99+

two dimensionsQUANTITY

0.99+

two groupsQUANTITY

0.99+

November of 21DATE

0.99+

bothQUANTITY

0.99+

BostonLOCATION

0.99+

more than 400 companiesQUANTITY

0.99+

Kristen MartinPERSON

0.99+

MySQLTITLE

0.99+

MoogsoftORGANIZATION

0.99+

The CubeORGANIZATION

0.99+

thirdQUANTITY

0.99+

GrafanaORGANIZATION

0.99+

H2OORGANIZATION

0.99+

Mike SpeiserPERSON

0.99+

david.vellante@siliconangle.comOTHER

0.99+

secondQUANTITY

0.99+

twoQUANTITY

0.99+

firstQUANTITY

0.99+

28%QUANTITY

0.99+

16%QUANTITY

0.99+

SecondQUANTITY

0.99+

Breaking Analysis: VMware Explore 2022 will mark the start of a Supercloud journey


 

>> From the Cube studios in Palo Alto and Boston, bringing you data driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> While the precise direction of VMware's future is unknown, given the plan Broadcom acquisition, one thing is clear. The topic of what Broadcom plans will not be the main focus of the agenda at the upcoming VMware Explore event next week in San Francisco. We believe that despite any uncertainty, VMware will lay out for its customers what it sees as its future. And that future is multi-cloud or cross-cloud services, what we call Supercloud. Hello, and welcome to this week's Wikibon Cube Insights powered by ETR. In this breaking analysis, we drill into the latest survey data on VMware from ETR. And we'll share with you the next iteration of the Supercloud definition based on feedback from dozens of contributors. And we'll give you our take on what to expect next week at VMware Explorer 2022. Well, VMware is maturing. You can see it in the numbers. VMware had a solid quarter just this week, which was announced beating earnings and growing the top line by 6%. But it's clear from its financials and the ETR data that we're showing here that VMware's Halcion glory days are behind it. This chart shows the spending profile from ETR's July survey of nearly 1500 IT buyers and CIOs. The survey included 722 VMware customers with the green bars showing elevated spending momentum, ie: growth, either new or growing at more than 6%. And the red bars show lower spending, either down 6% or worse or defections. The gray bars, that's the flat spending crowd, and it really tells a story. Look, nobody's throwing away their VMware platforms. They're just not investing as rapidly as in previous years. The blue line shows net score or spending momentum and subtracts the reds from the greens. The yellow line shows market penetration or pervasiveness in the survey. So the data is pretty clear. It's steady, but it's not remarkable. Now, the timing of the acquisition, quite rightly, is quite good, I would say. Now, this next chart shows the net score and pervasiveness juxtaposed on an XY graph and breaks down the VMware portfolio in those dimensions, the product portfolio. And you can see the dominance of respondents citing VMware as the platform. They might not know exactly which services they use, but they just respond VMware. That's on the X axis. You can see it way to the right. And the spending momentum or the net score is on the Y axis. That red dotted line at 4%, that indicates elevated levels and only VMware cloud on AWS is above that line. Notably, Tanzu has jumped up significantly from previous quarters, with the rest of the portfolio showing steady, as you would expect from a maturing platform. Only carbon black is hovering in the red zone, kind of ironic given the name. We believe that VMware is going to be a major player in cross cloud services, what we refer to as Supercloud. For months, we've been refining the concept and the definition. At Supercloud '22, we had discussions with more than 30 technology and business experts, and we've gathered input from many more. Based on that feedback, here's the definition we've landed on. It's somewhat refined from our earlier definition that we published a couple weeks ago. Supercloud is an emerging computing architecture that comprises a set of services abstracted from the underlying primitives of hyperscale clouds, e.g. compute, storage, networking, security, and other native resources, to create a global system spanning more than one cloud. Supercloud is three essential properties, three deployment models, and three service models. So what are those essential elements, those properties? We've simplified the picture from our last report. We show them here. I'll review them briefly. We're not going to go super in depth here because we've covered this topic a lot. But supercloud, it runs on more than one cloud. It creates that common or identical experience across clouds. It contains a necessary capability that we call a superPaaS that acts as a cloud interpreter, and it has metadata intelligence to optimize for a specific purpose. We'll publish this definition in detail. So again, we're not going to spend a ton of time here today. Now, we've identified three deployment models for Supercloud. The first is a single instantiation, where a control plane runs on one cloud but supports interactions with multiple other clouds. An example we use is Kubernetes cluster management service that runs on one cloud but can deploy and manage clusters on other clouds. The second model is a multi-cloud, multi-region instantiation where a full stack of services is instantiated on multiple clouds and multiple cloud regions with a common interface across them. We've used cohesity as one example of this. And then a single global instance that spans multiple cloud providers. That's our snowflake example. Again, we'll publish this in detail. So we're not going to spend a ton of time here today. Finally, the service models. The feedback we've had is IaaS, PaaS, and SaaS work fine to describe the service models for Supercloud. NetApp's Cloud Volume is a good example in IaaS. VMware cloud foundation and what we expect at VMware Explore is a good PaaS example. And SAP HANA Cloud is a good example of SaaS running as a Supercloud service. That's the SAP HANA multi-cloud. So what is it that we expect from VMware Explore 2022? Well, along with what will be an exciting and speculation filled gathering of the VMware community at the Moscone Center, we believe VMware will lay out its future architectural direction. And we expect it will fit the Supercloud definition that we just described. We think VMware will show its hand on a set of cross-cloud services and will promise a common experience for users and developers alike. As we talked about at Supercloud '22, VMware kind of wants to have its cake, eat it too, and lose weight. And by that, we mean that it will not only abstract the underlying primitives of each of the individual clouds, but if developers want access to them, they will allow that and actually facilitate that. Now, we don't expect VMware to use the term Supercloud, but it will be a cross-cloud multi-cloud services model that they put forth, we think, at VMworld Explore. With IaaS comprising compute, storage, and networking, a very strong emphasis, we believe, on security, of course, a governance and a comprehensive set of data protection services. Now, very importantly, we believe Tanzu will play a leading role in any announcements this coming week, as a purpose-built PaaS layer, specifically designed to create a common experience for cross clouds for data and application services. This, we believe, will be VMware's most significant offering to date in cross-cloud services. And it will position VMware to be a leader in what we call Supercloud. Now, while it remains to be seen what Broadcom exactly intends to do with VMware, we've speculated, others have speculated. We think this Supercloud is a substantial market opportunity generally and for VMware specifically. Look, if you don't own a public cloud, and very few companies do, in the tech business, we believe you better be supporting the build out of superclouds or building a supercloud yourself on top of hyperscale infrastructure. And we believe that as cloud matures, hyperscalers will increasingly I cross cloud services as an opportunity. We asked David Floyer to take a stab at a market model for super cloud. He's really good at these types of things. What he did is he took the known players in cloud and estimated their IaaS and PaaS cloud services, their total revenue, and then took a percentage. So this is super set of just the public cloud and the hyperscalers. And then what he did is he took a percentage to fit the Supercloud definition, as we just shared above. He then added another 20% on top to cover the long tail of Other. Other over time is most likely going to grow to let's say 30%. That's kind of how these markets work. Okay, so this is obviously an estimate, but it's an informed estimate by an individual who has done this many, many times and is pretty well respected in these types of forecasts, these long term forecasts. Now, by the definition we just shared, Supercloud revenue was estimated at about $3 billion in 2022 worldwide, growing to nearly $80 billion by 2030. Now remember, there's not one Supercloud market. It comprises a bunch of purpose-built superclouds that solve a specific problem. But the common attribute is it's built on top of hyperscale infrastructure. So overall, cloud services, including Supercloud, peak by the end of the decade. But Supercloud continues to grow and will take a higher percentage of the cloud market. The reasoning here is that the market will change and compute, will increasingly become distributed and embedded into edge devices, such as automobiles and robots and factory equipment, et cetera, and not necessarily be a discreet... I mean, it still will be, of course, but it's not going to be as much of a discrete component that is consumed via services like EZ2, that will mature. And this will be a key shift to watch in spending dynamics and really importantly, computing economics, the things we've talked about around arm and edge and AI inferencing and new low cost computing architectures at the edge. We're talking not the near edge, like, Lowes and Home Depot, we're talking far edge and embedded devices. Now, whether this becomes a seamless part of Supercloud remains to be seen. Look, if that's how we see it, the current and the future state of Supercloud, and we're committed to keeping the discussion going with an inclusive model that gathers input from all parts of the industry. Okay, that's it for today. Thanks to Alex Morrison, who's on production, and he also manages the podcast. Ken Schiffman, as well, is on production in our Boston office. Kristin Martin and Cheryl Knight, they help us get the word out on social media and in our newsletters. And Rob Hoffe is our editor in chief over at Silicon Angle and does some helpful editing. Thank you, all. Remember these episodes, they're all available as podcasts, wherever you listen. All you got to do is search Breaking Analysis Podcast. I publish each week on wikibon.com and siliconangle.com. You can email me directly at david.vellante@siliconangle.com or DM me @Dvellante or comment on our LinkedIn posts. Please do check out etr.ai. They've got some great enterprise survey research. So please go there and poke around, And if you need any assistance, let them know. This is Dave Vellante for the Cube Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis. (lively music)

Published Date : Aug 27 2022

SUMMARY :

From the Cube studios and subtracts the reds from the greens.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MorrisonPERSON

0.99+

Cheryl KnightPERSON

0.99+

Dave VellantePERSON

0.99+

Rob HoffePERSON

0.99+

VMwareORGANIZATION

0.99+

Ken SchiffmanPERSON

0.99+

David FloyerPERSON

0.99+

Kristin MartinPERSON

0.99+

30%QUANTITY

0.99+

BostonLOCATION

0.99+

2022DATE

0.99+

LowesORGANIZATION

0.99+

20%QUANTITY

0.99+

Palo AltoLOCATION

0.99+

722QUANTITY

0.99+

4%QUANTITY

0.99+

San FranciscoLOCATION

0.99+

david.vellante@siliconangle.comOTHER

0.99+

2030DATE

0.99+

Silicon AngleORGANIZATION

0.99+

JulyDATE

0.99+

BroadcomORGANIZATION

0.99+

Home DepotORGANIZATION

0.99+

6%QUANTITY

0.99+

next weekDATE

0.99+

AWSORGANIZATION

0.99+

second modelQUANTITY

0.99+

more than 6%QUANTITY

0.99+

ETRORGANIZATION

0.99+

more than one cloudQUANTITY

0.99+

siliconangle.comOTHER

0.99+

nearly $80 billionQUANTITY

0.99+

about $3 billionQUANTITY

0.99+

more than 30 technologyQUANTITY

0.99+

firstQUANTITY

0.99+

this weekDATE

0.98+

SupercloudORGANIZATION

0.98+

each weekQUANTITY

0.98+

one exampleQUANTITY

0.98+

three service modelsQUANTITY

0.98+

VMware ExploreEVENT

0.98+

dozens of contributorsQUANTITY

0.97+

todayDATE

0.97+

NetAppTITLE

0.97+

this weekDATE

0.97+

SupercloudTITLE

0.97+

SAP HANATITLE

0.97+

VMworld ExploreORGANIZATION

0.97+

three essential propertiesQUANTITY

0.97+

three deployment modelsQUANTITY

0.97+

one cloudQUANTITY

0.96+

TanzuORGANIZATION

0.96+

eachQUANTITY

0.96+

Moscone CenterLOCATION

0.96+

wikibon.comOTHER

0.95+

SAP HANA CloudTITLE

0.95+

Cube InsightsORGANIZATION

0.92+

single instantiationQUANTITY

0.9+

Breaking Analysis: What Black Hat '22 tells us about securing the Supercloud


 

>> From theCUBE Studios in Palo Alto in Boston, bringing you data driven insights from theCUBE and ETR, This is "Breaking Analysis with Dave Vellante". >> Black Hat 22 was held in Las Vegas last week, the same time as theCUBE Supercloud event. Unlike AWS re:Inforce where words are carefully chosen to put a positive spin on security, Black Hat exposes all the warts of cyber and openly discusses its hard truths. It's a conference that's attended by technical experts who proudly share some of the vulnerabilities they've discovered, and, of course, by numerous vendors marketing their products and services. Hello, and welcome to this week's Wikibon CUBE Insights powered by ETR. In this "Breaking Analysis", we summarize what we learned from discussions with several people who attended Black Hat and our analysis from reviewing dozens of keynotes, articles, sessions, and data from a recent Black Hat Attendees Survey conducted by Black Hat and Informa, and we'll end with the discussion of what it all means for the challenges around securing the supercloud. Now, I personally did not attend, but as I said at the top, we reviewed a lot of content from the event which is renowned for its hundreds of sessions, breakouts, and strong technical content that is, as they say, unvarnished. Chris Krebs, the former director of Us cybersecurity and infrastructure security agency, CISA, he gave the keynote, and he spoke about the increasing complexity of tech stacks and the ripple effects that that has on organizational risk. Risk was a big theme at the event. Where re:Inforce tends to emphasize, again, the positive state of cybersecurity, it could be said that Black Hat, as the name implies, focuses on the other end of the spectrum. Risk, as a major theme of the event at the show, got a lot of attention. Now, there was a lot of talk, as always, about the expanded threat service, you hear that at any event that's focused on cybersecurity, and tons of emphasis on supply chain risk as a relatively new threat that's come to the CISO's minds. Now, there was also plenty of discussion about hybrid work and how remote work has dramatically increased business risk. According to data from in Intel 471's Mark Arena, the previously mentioned Black Hat Attendee Survey showed that compromise credentials posed the number one source of risk followed by infrastructure vulnerabilities and supply chain risks, so a couple of surveys here that we're citing, and we'll come back to that in a moment. At an MIT cybersecurity conference earlier last decade, theCUBE had a hypothetical conversation with former Boston Globe war correspondent, Charles Sennott, about the future of war and the role of cyber. We had similar discussions with Dr. Robert Gates on theCUBE at a ServiceNow event in 2016. At Black Hat, these discussions went well beyond the theoretical with actual data from the war in Ukraine. It's clear that modern wars are and will be supported by cyber, but the takeaways are that they will be highly situational, targeted, and unpredictable because in combat scenarios, anything can happen. People aren't necessarily at their keyboards. Now, the role of AI was certainly discussed as it is at every conference, and particularly cyber conferences. You know, it was somewhat dissed as over hyped, not surprisingly, but while AI is not a panacea to cyber exposure, automation and machine intelligence can definitely augment, what appear to be and have been stressed out, security teams can do this by recommending actions and taking other helpful types of data and presenting it in a curated form that can streamline the job of the SecOps team. Now, most cyber defenses are still going to be based on tried and true monitoring and telemetry data and log analysis and curating known signatures and analyzing consolidated data, but increasingly, AI will help with the unknowns, i.e. zero-day threats and threat actor behaviors after infiltration. Now, finally, while much lip service was given to collaboration and public-private partnerships, especially after Stuxsnet was revealed early last decade, the real truth is that threat intelligence in the private sector is still evolving. In particular, the industry, mid decade, really tried to commercially exploit proprietary intelligence and, you know, do private things like private reporting and monetize that, but attitudes toward collaboration are trending in a positive direction was one of the sort of outcomes that we heard at Black Hat. Public-private partnerships are being both mandated by government, and there seems to be a willingness to work together to fight an increasingly capable adversary. These things are definitely on the rise. Now, without this type of collaboration, securing the supercloud is going to become much more challenging and confined to narrow solutions. and we're going to talk about that little later in the segment. Okay, let's look at some of the attendees survey data from Black Hat. Just under 200 really serious security pros took the survey, so not enough to slice and dice by hair color, eye color, height, weight, and favorite movie genre, but enough to extract high level takeaways. You know, these strongly agree or disagree survey responses can sometimes give vanilla outputs, but let's look for the ones where very few respondents strongly agree or disagree with a statement or those that overwhelmingly strongly agree or somewhat agree. So it's clear from this that the respondents believe the following, one, your credentials are out there and available to criminals. Very few people thought that that was, you know, unavoidable. Second, remote work is here to stay, and third, nobody was willing to really jinx their firms and say that they strongly disagree that they'll have to respond to a major cybersecurity incident within the next 12 months. Now, as we've reported extensively, COVID has permanently changed the cybersecurity landscape and the CISO's priorities and playbook. Check out this data that queries respondents on the pandemic's impact on cybersecurity, new requirements to secure remote workers, more cloud, more threats from remote systems and remote users, and a shift away from perimeter defenses that are no longer as effective, e.g. firewall appliances. Note, however, the fifth response that's down there highlighted in green. It shows a meaningful drop in the percentage of remote workers that are disregarding corporate security policy, still too many, but 10 percentage points down from 2021 survey. Now, as we've said many times, bad user behavior will trump good security technology virtually every time. Consistent with the commentary from Mark Arena's Intel 471 threat report, fishing for credentials is the number one concern cited in the Black Hat Attendees Survey. This is a people and process problem more than a technology issue. Yes, using multifactor authentication, changing passwords, you know, using unique passwords, using password managers, et cetera, they're all great things, but if it's too hard for users to implement these things, they won't do it, they'll remain exposed, and their organizations will remain exposed. Number two in the graphic, sophisticated attacks that could expose vulnerabilities in the security infrastructure, again, consistent with the Intel 471 data, and three, supply chain risks, again, consistent with Mark Arena's commentary. Ask most CISOs their number one problem, and they'll tell you, "It's a lack of talent." That'll be on the top of their list. So it's no surprise that 63% of survey respondents believe they don't have the security staff necessary to defend against cyber threats. This speaks to the rise of managed security service providers that we've talked about previously on "Breaking Analysis". We've seen estimates that less than 50% of organizations in the US have a SOC, and we see those firms as ripe for MSSP support as well as larger firms augmenting staff with managed service providers. Now, after re:Invent, we put forth this conceptual model that discussed how the cloud was becoming the first line of defense for CISOs, and DevOps was being asked to do more, things like securing the runtime, the containers, the platform, et cetera, and audit was kind of that last line of defense. So a couple things we picked up from Black Hat which are consistent with this shift and some that are somewhat new, first, is getting visibility across the expanded threat surface was a big theme at Black Hat. This makes it even harder to identify risk, of course, this being the expanded threat surface. It's one thing to know that there's a vulnerability somewhere. It's another thing to determine the severity of the risk, but understanding how easy or difficult it is to exploit that vulnerability and how to prioritize action around that. Vulnerability is increasingly complex for CISOs as the security landscape gets complexified. So what's happening is the SOC, if there even is one at the organization, is becoming federated. No longer can there be one ivory tower that's the magic god room of data and threat detection and analysis. Rather, the SOC is becoming distributed following the data, and as we just mentioned, the SOC is being augmented by the cloud provider and the managed service providers, the MSSPs. So there's a lot of critical security data that is decentralized and this will necessitate a new cyber data model where data can be synchronized and shared across a federation of SOCs, if you will, or mini SOCs or SOC capabilities that live in and/or embedded in an organization's ecosystem. Now, to this point about cloud being the first line of defense, let's turn to a story from ETR that came out of our colleague Eric Bradley's insight in a one-on-one he did with a senior IR person at a manufacturing firm. In a piece that ETR published called "Saved by Zscaler", check out this comment. Quote, "As the last layer, we are filtering all the outgoing internet traffic through Zscaler. And when an attacker is already on your network, and they're trying to communicate with the outside to exchange encryption keys, Zscaler is already blocking the traffic. It happened to us. It happened and we were saved by Zscaler." So that's pretty cool. So not only is the cloud the first line of defense, as we sort of depicted in that previous graphic, here's an example where it's also the last line of defense. Now, let's end on what this all means to securing the supercloud. At our Supercloud 22 event last week in our Palo Alto CUBE Studios, we had a session on this topic on supercloud, securing the supercloud. Security, in our view, is going to be one of the most important and difficult challenges for the idea of supercloud to become real. We reviewed in last week's "Breaking Analysis" a detailed discussion with Snowflake co-founder and president of products, Benoit Dageville, how his company approaches security in their data cloud, what we call a superdata cloud. Snowflake doesn't use the term supercloud. They use the term datacloud, but what if you don't have the focus, the engineering depth, and the bank roll that Snowflake has? Does that mean superclouds will only be developed by those companies with deep pockets and enormous resources? Well, that's certainly possible, but on the securing the supercloud panel, we had three technical experts, Gee Rittenhouse of Skyhigh Security, Piyush Sharrma who's the founder of Accurics who sold to Tenable, and Tony Kueh, who's the former Head of Product at VMware. Now, John Furrier asked each of them, "What is missing? What's it going to take to secure the supercloud? What has to happen?" Here's what they said. Play the clip. >> This is the final question. We have one minute left. I wish we had more time. This is a great panel. We'll bring you guys back for sure after the event. What one thing needs to happen to unify or get through the other side of this fragmentation and then the challenges for supercloud? Because remember, the enterprise equation is solve complexity with more complexity. Well, that's not what the market wants. They want simplicity. They want SaaS. They want ease of use. They want infrastructure risk code. What has to happen? What do you think, each of you? >> So I can start, and extending to the previous conversation, I think we need a consortium. We need a framework that defines that if you really want to operate on supercloud, these are the 10 things that you must follow. It doesn't matter whether you take AWS, Slash, or TCP or you have all, and you will have the on-prem also, which means that it has to follow a pattern, and that pattern is what is required for supercloud, in my opinion. Otherwise, security is going everywhere. They're like they have to fix everything, find everything, and so on and so forth. It's not going to be possible. So they need a framework. They need a consortium, and this consortium needs to be, I think, needs to led by the cloud providers because they're the ones who have these foundational infrastructure elements, and the security vendor should contribute on providing more severe detections or severe findings. So that's, in my opinion, should be the model. >> Great, well, thank you, Gee. >> Yeah, I would think it's more along the lines of a business model. We've seen in cloud that the scale matters, and once you're big, you get bigger. We haven't seen that coalesce around either a vendor, a business model, or whatnot to bring all of this and connect it all together yet. So that value proposition in the industry, I think, is missing, but there's elements of it already available. >> I think there needs to be a mindset. If you look, again, history repeating itself. The internet sort of came together around set of IETF, RSC standards. Everybody embraced and extended it, right? But still, there was, at least, a baseline, and I think at that time, the largest and most innovative vendors understood that they couldn't do it by themselves, right? And so I think what we need is a mindset where these big guys, like Google, let's take an example. They're not going to win at all, but they can have a substantial share. So how do they collaborate with the ecosystem around a set of standards so that they can bring their differentiation and then embrace everybody together. >> Okay, so Gee's point about a business model is, you know, business model being missing, it's broadly true, but perhaps Snowflake serves as a business model where they've just gone out and and done it, setting or trying to set a de facto standard by which data can be shared and monetized. They're certainly setting that standard and mandating that standard within the Snowflake ecosystem with its proprietary framework. You know, perhaps that is one answer, but Tony lays out a scenario where there's a collaboration mindset around a set of standards with an ecosystem. You know, intriguing is this idea of a consortium or a framework that Piyush was talking about, and that speaks to the collaboration or lack thereof that we spoke of earlier, and his and Tony's proposal that the cloud providers should lead with the security vendor ecosystem playing a supporting role is pretty compelling, but can you see AWS and Azure and Google in a kumbaya moment getting together to make that happen? It seems unlikely, but maybe a better partnership between the US government and big tech could be a starting point. Okay, that's it for today. I want to thank the many people who attended Black Hat, reported on it, wrote about it, gave talks, did videos, and some that spoke to me that had attended the event, Becky Bracken, who is the EIC at Dark Reading. They do a phenomenal job and the entire team at Dark Reading, the news desk there, Mark Arena, whom I mentioned, Garrett O'Hara, Nash Borges, Kelly Jackson, sorry, Kelly Jackson Higgins, Roya Gordon, Robert Lipovsky, Chris Krebs, and many others, thanks for the great, great commentary and the content that you put out there, and thanks to Alex Myerson, who's on production, and Alex manages the podcasts for us. Ken Schiffman is also in our Marlborough studio as well, outside of Boston. Kristen Martin and Cheryl Knight, they help get the word out on social media and in our newsletters, and Rob Hoff is our Editor-in-Chief at SiliconANGLE and does some great editing and helps with the titles of "Breaking Analysis" quite often. Remember these episodes, they're all available as podcasts, wherever you listen, just search for "Breaking Analysis Podcasts". I publish each on wikibon.com and siliconangle.com, and you could email me, get in touch with me at david.vellante@siliconangle.com or you can DM me @dvellante or comment on my LinkedIn posts, and please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next time on "Breaking Analysis". (upbeat music)

Published Date : Aug 21 2022

SUMMARY :

with Dave Vellante". and the ripple effects that This is the final question. and the security vendor should contribute that the scale matters, the largest and most innovative and the content that you put out there,

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Cheryl KnightPERSON

0.99+

Alex MyersonPERSON

0.99+

Robert LipovskyPERSON

0.99+

Eric BradleyPERSON

0.99+

Chris KrebsPERSON

0.99+

Charles SennottPERSON

0.99+

Becky BrackenPERSON

0.99+

Rob HoffPERSON

0.99+

Dave VellantePERSON

0.99+

TonyPERSON

0.99+

Ken SchiffmanPERSON

0.99+

John FurrierPERSON

0.99+

Kelly JacksonPERSON

0.99+

Gee RittenhousePERSON

0.99+

Benoit DagevillePERSON

0.99+

Tony KuehPERSON

0.99+

Mark ArenaPERSON

0.99+

Piyush SharrmaPERSON

0.99+

Kristen MartinPERSON

0.99+

Roya GordonPERSON

0.99+

CISAORGANIZATION

0.99+

SnowflakeORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

Palo AltoLOCATION

0.99+

Garrett O'HaraPERSON

0.99+

AccuricsORGANIZATION

0.99+

BostonLOCATION

0.99+

USLOCATION

0.99+

2021DATE

0.99+

Skyhigh SecurityORGANIZATION

0.99+

Black HatORGANIZATION

0.99+

10 thingsQUANTITY

0.99+

TenableORGANIZATION

0.99+

AWSORGANIZATION

0.99+

david.vellante@siliconangle.comOTHER

0.99+

Nash BorgesPERSON

0.99+

last weekDATE

0.99+

IntelORGANIZATION

0.99+

Las VegasLOCATION

0.99+

Robert GatesPERSON

0.99+

one minuteQUANTITY

0.99+

63%QUANTITY

0.99+

less than 50%QUANTITY

0.99+

SecondQUANTITY

0.99+

SiliconANGLEORGANIZATION

0.99+

last weekDATE

0.99+

eachQUANTITY

0.99+

Kelly Jackson HigginsPERSON

0.99+

AlexPERSON

0.99+

2016DATE

0.99+

Black Hat 22EVENT

0.99+

VMwareORGANIZATION

0.99+

thirdQUANTITY

0.99+

threeQUANTITY

0.99+

Black HatEVENT

0.98+

three technical expertsQUANTITY

0.98+

first lineQUANTITY

0.98+

fifth responseQUANTITY

0.98+

supercloudORGANIZATION

0.98+

ETRORGANIZATION

0.98+

UkraineLOCATION

0.98+

Boston GlobeORGANIZATION

0.98+

Dr.PERSON

0.98+

one answerQUANTITY

0.97+

wikibon.comOTHER

0.97+

first lineQUANTITY

0.97+

this weekDATE

0.96+

firstQUANTITY

0.96+

MarlboroughLOCATION

0.96+

siliconangle.comOTHER

0.95+

Saved by ZscalerTITLE

0.95+

Palo Alto CUBE StudiosLOCATION

0.95+

hundreds of sessionsQUANTITY

0.95+

LinkedInORGANIZATION

0.94+

bothQUANTITY

0.94+

oneQUANTITY

0.94+

dozens of keynotesQUANTITY

0.93+

todayDATE

0.93+

Breaking Analysis: How the cloud is changing security defenses in the 2020s


 

>> Announcer: From theCUBE studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR. This is "Breaking Analysis" with Dave Vellante. >> The rapid pace of cloud adoption has changed the way organizations approach cybersecurity. Specifically, the cloud is increasingly becoming the first line of cyber defense. As such, along with communicating to the board and creating a security aware culture, the chief information security officer must ensure that the shared responsibility model is being applied properly. Meanwhile, the DevSecOps team has emerged as the critical link between strategy and execution, while audit becomes the free safety, if you will, in the equation, i.e., the last line of defense. Hello, and welcome to this week's, we keep on CUBE Insights, powered by ETR. In this "Breaking Analysis", we'll share the latest data on hyperscale, IaaS, and PaaS market performance, along with some fresh ETR survey data. And we'll share some highlights and the puts and takes from the recent AWS re:Inforce event in Boston. But first, the macro. It's earning season, and that's what many people want to talk about, including us. As we reported last week, the macro spending picture is very mixed and weird. Think back to a week ago when SNAP reported. A player like SNAP misses and the Nasdaq drops 300 points. Meanwhile, Intel, the great semiconductor hope for America misses by a mile, cuts its revenue outlook by 15% for the year, and the Nasdaq was up nearly 250 points just ahead of the close, go figure. Earnings reports from Meta, Google, Microsoft, ServiceNow, and some others underscored cautious outlooks, especially those exposed to the advertising revenue sector. But at the same time, Apple, Microsoft, and Google, were, let's say less bad than expected. And that brought a sigh of relief. And then there's Amazon, which beat on revenue, it beat on cloud revenue, and it gave positive guidance. The Nasdaq has seen this month best month since the isolation economy, which "Breaking Analysis" contributor, Chip Symington, attributes to what he calls an oversold rally. But there are many unknowns that remain. How bad will inflation be? Will the fed really stop tightening after September? The Senate just approved a big spending bill along with corporate tax hikes, which generally don't favor the economy. And on Monday, August 1st, the market will likely realize that we are in the summer quarter, and there's some work to be done. Which is why it's not surprising that investors sold the Nasdaq at the close today on Friday. Are people ready to call the bottom? Hmm, some maybe, but there's still lots of uncertainty. However, the cloud continues its march, despite some very slight deceleration in growth rates from the two leaders. Here's an update of our big four IaaS quarterly revenue data. The big four hyperscalers will account for $165 billion in revenue this year, slightly lower than what we had last quarter. We expect AWS to surpass 83 billion this year in revenue. Azure will be more than 2/3rds the size of AWS, a milestone from Microsoft. Both AWS and Azure came in slightly below our expectations, but still very solid growth at 33% and 46% respectively. GCP, Google Cloud Platform is the big concern. By our estimates GCP's growth rate decelerated from 47% in Q1, and was 38% this past quarter. The company is struggling to keep up with the two giants. Remember, both GCP and Azure, they play a shell game and hide the ball on their IaaS numbers, so we have to use a survey data and other means of estimating. But this is how we see the market shaping up in 2022. Now, before we leave the overall cloud discussion, here's some ETR data that shows the net score or spending momentum granularity for each of the hyperscalers. These bars show the breakdown for each company, with net score on the right and in parenthesis, net score from last quarter. lime green is new adoptions, forest green is spending up 6% or more, the gray is flat, pink is spending at 6% down or worse, and the bright red is replacement or churn. Subtract the reds from the greens and you get net score. One note is this is for each company's overall portfolio. So it's not just cloud. So it's a bit of a mixed bag, but there are a couple points worth noting. First, anything above 40% or 40, here as shown in the chart, is considered elevated. AWS, as you can see, is well above that 40% mark, as is Microsoft. And if you isolate Microsoft's Azure, only Azure, it jumps above AWS's momentum. Google is just barely hanging on to that 40 line, and Alibaba is well below, with both Google and Alibaba showing much higher replacements, that bright red. But here's the key point. AWS and Azure have virtually no churn, no replacements in that bright red. And all four companies are experiencing single-digit numbers in terms of decreased spending within customer accounts. People may be moving some workloads back on-prem selectively, but repatriation is definitely not a trend to bet the house on, in our view. Okay, let's get to the main subject of this "Breaking Analysis". TheCube was at AWS re:Inforce in Boston this week, and we have some observations to share. First, we had keynotes from Steven Schmidt who used to be the chief information security officer at Amazon on Web Services, now he's the CSO, the chief security officer of Amazon. Overall, he dropped the I in his title. CJ Moses is the CISO for AWS. Kurt Kufeld of AWS also spoke, as did Lena Smart, who's the MongoDB CISO, and she keynoted and also came on theCUBE. We'll go back to her in a moment. The key point Schmidt made, one of them anyway, was that Amazon sees more data points in a day than most organizations see in a lifetime. Actually, it adds up to quadrillions over a fairly short period of time, I think, it was within a month. That's quadrillion, it's 15 zeros, by the way. Now, there was drill down focus on data protection and privacy, governance, risk, and compliance, GRC, identity, big, big topic, both within AWS and the ecosystem, network security, and threat detection. Those are the five really highlighted areas. Re:Inforce is really about bringing a lot of best practice guidance to security practitioners, like how to get the most out of AWS tooling. Schmidt had a very strong statement saying, he said, "I can assure you with a 100% certainty that single controls and binary states will absolutely positively fail." Hence, the importance of course, of layered security. We heard a little bit of chat about getting ready for the future and skating to the security puck where quantum computing threatens to hack all of the existing cryptographic algorithms, and how AWS is trying to get in front of all that, and a new set of algorithms came out, AWS is testing. And, you know, we'll talk about that maybe in the future, but that's a ways off. And by its prominent presence, the ecosystem was there enforced, to talk about their role and filling the gaps and picking up where AWS leaves off. We heard a little bit about ransomware defense, but surprisingly, at least in the keynotes, no discussion about air gaps, which we've talked about in previous "Breaking Analysis", is a key factor. We heard a lot about services to help with threat detection and container security and DevOps, et cetera, but there really wasn't a lot of specific talk about how AWS is simplifying the life of the CISO. Now, maybe it's inherently assumed as AWS did a good job stressing that security is job number one, very credible and believable in that front. But you have to wonder if the world is getting simpler or more complex with cloud. And, you know, you might say, "Well, Dave, come on, of course it's better with cloud." But look, attacks are up, the threat surface is expanding, and new exfiltration records are being set every day. I think the hard truth is, the cloud is driving businesses forward and accelerating digital, and those businesses are now exposed more than ever. And that's why security has become such an important topic to boards and throughout the entire organization. Now, the other epiphany that we had at re:Inforce is that there are new layers and a new trust framework emerging in cyber. Roles are shifting, and as a direct result of the cloud, things are changing within organizations. And this first hit me in a conversation with long-time cyber practitioner and Wikibon colleague from our early Wikibon days, and friend, Mike Versace. And I spent two days testing the premise that Michael and I talked about. And here's an attempt to put that conversation into a graphic. The cloud is now the first line of defense. AWS specifically, but hyperscalers generally provide the services, the talent, the best practices, and automation tools to secure infrastructure and their physical data centers. And they're really good at it. The security inside of hyperscaler clouds is best of breed, it's world class. And that first line of defense does take some of the responsibility off of CISOs, but they have to understand and apply the shared responsibility model, where the cloud provider leaves it to the customer, of course, to make sure that the infrastructure they're deploying is properly configured. So in addition to creating a cyber aware culture and communicating up to the board, the CISO has to ensure compliance with and adherence to the model. That includes attracting and retaining the talent necessary to succeed. Now, on the subject of building a security culture, listen to this clip on one of the techniques that Lena Smart, remember, she's the CISO of MongoDB, one of the techniques she uses to foster awareness and build security cultures in her organization. Play the clip >> Having the Security Champion program, so that's just, it's like one of my babies. That and helping underrepresented groups in MongoDB kind of get on in the tech world are both really important to me. And so the Security Champion program is purely purely voluntary. We have over 100 members. And these are people, there's no bar to join, you don't have to be technical. If you're an executive assistant who wants to learn more about security, like my assistant does, you're more than welcome. Up to, we actually, people grade themselves when they join us. We give them a little tick box, like five is, I walk on security water, one is I can spell security, but I'd like to learn more. Mixing those groups together has been game-changing for us. >> Now, the next layer is really where it gets interesting. DevSecOps, you know, we hear about it all the time, shifting left. It implies designing security into the code at the dev level. Shift left and shield right is the kind of buzz phrase. But it's getting more and more complicated. So there are layers within the development cycle, i.e., securing the container. So the app code can't be threatened by backdoors or weaknesses in the containers. Then, securing the runtime to make sure the code is maintained and compliant. Then, the DevOps platform so that change management doesn't create gaps and exposures, and screw things up. And this is just for the application security side of the equation. What about the network and implementing zero trust principles, and securing endpoints, and machine to machine, and human to app communication? So there's a lot of burden being placed on the DevOps team, and they have to partner with the SecOps team to succeed. Those guys are not security experts. And finally, there's audit, which is the last line of defense or what I called at the open, the free safety, for you football fans. They have to do more than just tick the box for the board. That doesn't cut it anymore. They really have to know their stuff and make sure that what they sign off on is real. And then you throw ESG into the mix is becoming more important, making sure the supply chain is green and also secure. So you can see, while much of this stuff has been around for a long, long time, the cloud is accelerating innovation in the pace of delivery. And so much is changing as a result. Now, next, I want to share a graphic that we shared last week, but a little different twist. It's an XY graphic with net score or spending velocity in the vertical axis and overlap or presence in the dataset on the horizontal. With that magic 40% red line as shown. Okay, I won't dig into the data and draw conclusions 'cause we did that last week, but two points I want to make. First, look at Microsoft in the upper-right hand corner. They are big in security and they're attracting a lot of dollars in the space. We've reported on this for a while. They're a five-star security company. And every time, from a spending standpoint in ETR data, that little methodology we use, every time I've run this chart, I've wondered, where the heck is AWS? Why aren't they showing up there? If security is so important to AWS, which it is, and its customers, why aren't they spending money with Amazon on security? And I asked this very question to Merrit Baer, who resides in the office of the CISO at AWS. Listen to her answer. >> It doesn't mean don't spend on security. There is a lot of goodness that we have to offer in ESS, external security services. But I think one of the unique parts of AWS is that we don't believe that security is something you should buy, it's something that you get from us. It's something that we do for you a lot of the time. I mean, this is the definition of the shared responsibility model, right? >> Now, maybe that's good messaging to the market. Merritt, you know, didn't say it outright, but essentially, Microsoft they charge for security. At AWS, it comes with the package. But it does answer my question. And, of course, the fact is that AWS can subsidize all this with egress charges. Now, on the flip side of that, (chuckles) you got Microsoft, you know, they're both, they're competing now. We can take CrowdStrike for instance. Microsoft and CrowdStrike, they compete with each other head to head. So it's an interesting dynamic within the ecosystem. Okay, but I want to turn to a powerful example of how AWS designs in security. And that is the idea of confidential computing. Of course, AWS is not the only one, but we're coming off of re:Inforce, and I really want to dig into something that David Floyer and I have talked about in previous episodes. And we had an opportunity to sit down with Arvind Raghu and J.D. Bean, two security experts from AWS, to talk about this subject. And let's share what we learned and why we think it matters. First, what is confidential computing? That's what this slide is designed to convey. To AWS, they would describe it this way. It's the use of special hardware and the associated firmware that protects customer code and data from any unauthorized access while the data is in use, i.e., while it's being processed. That's oftentimes a security gap. And there are two dimensions here. One is protecting the data and the code from operators on the cloud provider, i.e, in this case, AWS, and protecting the data and code from the customers themselves. In other words, from admin level users are possible malicious actors on the customer side where the code and data is being processed. And there are three capabilities that enable this. First, the AWS Nitro System, which is the foundation for virtualization. The second is Nitro Enclaves, which isolate environments, and then third, the Nitro Trusted Platform Module, TPM, which enables cryptographic assurances of the integrity of the Nitro instances. Now, we've talked about Nitro in the past, and we think it's a revolutionary innovation, so let's dig into that a bit. This is an AWS slide that was shared about how they protect and isolate data and code. On the left-hand side is a classical view of a virtualized architecture. You have a single host or a single server, and those white boxes represent processes on the main board, X86, or could be Intel, or AMD, or alternative architectures. And you have the hypervisor at the bottom which translates instructions to the CPU, allowing direct execution from a virtual machine into the CPU. But notice, you also have blocks for networking, and storage, and security. And the hypervisor emulates or translates IOS between the physical resources and the virtual machines. And it creates some overhead. Now, companies like VMware have done a great job, and others, of stripping out some of that overhead, but there's still an overhead there. That's why people still like to run on bare metal. Now, and while it's not shown in the graphic, there's an operating system in there somewhere, which is privileged, so it's got access to these resources, and it provides the services to the VMs. Now, on the right-hand side, you have the Nitro system. And you can see immediately the differences between the left and right, because the networking, the storage, and the security, the management, et cetera, they've been separated from the hypervisor and that main board, which has the Intel, AMD, throw in Graviton and Trainium, you know, whatever XPUs are in use in the cloud. And you can see that orange Nitro hypervisor. That is a purpose-built lightweight component for this system. And all the other functions are separated in isolated domains. So very strong isolation between the cloud software and the physical hardware running workloads, i.e., those white boxes on the main board. Now, this will run at practically bare metal speeds, and there are other benefits as well. One of the biggest is security. As we've previously reported, this came out of AWS's acquisition of Annapurna Labs, which we've estimated was picked up for a measly $350 million, which is a drop in the bucket for AWS to get such a strategic asset. And there are three enablers on this side. One is the Nitro cards, which are accelerators to offload that wasted work that's done in traditional architectures by typically the X86. We've estimated 25% to 30% of core capacity and cycles is wasted on those offloads. The second is the Nitro security chip, which is embedded and extends the root of trust to the main board hardware. And finally, the Nitro hypervisor, which allocates memory and CPU resources. So the Nitro cards communicate directly with the VMs without the hypervisors getting in the way, and they're not in the path. And all that data is encrypted while it's in motion, and of course, encryption at rest has been around for a while. We asked AWS, is this an, we presumed it was an Arm-based architecture. We wanted to confirm that. Or is it some other type of maybe hybrid using X86 and Arm? They told us the following, and quote, "The SoC, system on chips, for these hardware components are purpose-built and custom designed in-house by Amazon and Annapurna Labs. The same group responsible for other silicon innovations such as Graviton, Inferentia, Trainium, and AQUA. Now, the Nitro cards are Arm-based and do not use any X86 or X86/64 bit CPUs. Okay, so it confirms what we thought. So you may say, "Why should we even care about all this technical mumbo jumbo, Dave?" Well, a year ago, David Floyer and I published this piece explaining why Nitro and Graviton are secret weapons of Amazon that have been a decade in the making, and why everybody needs some type of Nitro to compete in the future. This is enabled, this Nitro innovations and the custom silicon enabled by the Annapurna acquisition. And AWS has the volume economics to make custom silicon. Not everybody can do it. And it's leveraging the Arm ecosystem, the standard software, and the fabrication volume, the manufacturing volume to revolutionize enterprise computing. Nitro, with the alternative processor, architectures like Graviton and others, enables AWS to be on a performance, cost, and power consumption curve that blows away anything we've ever seen from Intel. And Intel's disastrous earnings results that we saw this past week are a symptom of this mega trend that we've been talking about for years. In the same way that Intel and X86 destroyed the market for RISC chips, thanks to PC volumes, Arm is blowing away X86 with volume economics that cannot be matched by Intel. Thanks to, of course, to mobile and edge. Our prediction is that these innovations and the Arm ecosystem are migrating and will migrate further into enterprise computing, which is Intel's stronghold. Now, that stronghold is getting eaten away by the likes of AMD, Nvidia, and of course, Arm in the form of Graviton and other Arm-based alternatives. Apple, Tesla, Amazon, Google, Microsoft, Alibaba, and others are all designing custom silicon, and doing so much faster than Intel can go from design to tape out, roughly cutting that time in half. And the premise of this piece is that every company needs a Nitro to enable alternatives to the X86 in order to support emergent workloads that are data rich and AI-based, and to compete from an economic standpoint. So while at re:Inforce, we heard that the impetus for Nitro was security. Of course, the Arm ecosystem, and its ascendancy has enabled, in our view, AWS to create a platform that will set the enterprise computing market this decade and beyond. Okay, that's it for today. Thanks to Alex Morrison, who is on production. And he does the podcast. And Ken Schiffman, our newest member of our Boston Studio team is also on production. Kristen Martin and Cheryl Knight help spread the word on social media and in the community. And Rob Hof is our editor in chief over at SiliconANGLE. He does some great, great work for us. Remember, all these episodes are available as podcast. Wherever you listen, just search "Breaking Analysis" podcast. I publish each week on wikibon.com and siliconangle.com. Or you can email me directly at David.Vellante@siliconangle.com or DM me @dvellante, comment on my LinkedIn post. And please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching. Be well, and we'll see you next time on "Breaking Analysis." (upbeat theme music)

Published Date : Jul 30 2022

SUMMARY :

This is "Breaking Analysis" and the Nasdaq was up nearly 250 points And so the Security Champion program the SecOps team to succeed. of the shared responsibility model, right? and it provides the services to the VMs.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
Alex MorrisonPERSON

0.99+

David FloyerPERSON

0.99+

Mike VersacePERSON

0.99+

MichaelPERSON

0.99+

AWSORGANIZATION

0.99+

Steven SchmidtPERSON

0.99+

AmazonORGANIZATION

0.99+

Kurt KufeldPERSON

0.99+

AppleORGANIZATION

0.99+

Dave VellantePERSON

0.99+

TeslaORGANIZATION

0.99+

AlibabaORGANIZATION

0.99+

GoogleORGANIZATION

0.99+

MicrosoftORGANIZATION

0.99+

J.D. BeanPERSON

0.99+

Ken SchiffmanPERSON

0.99+

Arvind RaghuPERSON

0.99+

Lena SmartPERSON

0.99+

Kristen MartinPERSON

0.99+

Cheryl KnightPERSON

0.99+

40%QUANTITY

0.99+

Rob HofPERSON

0.99+

DavePERSON

0.99+

SchmidtPERSON

0.99+

Palo AltoLOCATION

0.99+

2022DATE

0.99+

fiveQUANTITY

0.99+

NvidiaORGANIZATION

0.99+

two daysQUANTITY

0.99+

Annapurna LabsORGANIZATION

0.99+

6%QUANTITY

0.99+

SNAPORGANIZATION

0.99+

five-starQUANTITY

0.99+

Chip SymingtonPERSON

0.99+

47%QUANTITY

0.99+

AnnapurnaORGANIZATION

0.99+

$350 millionQUANTITY

0.99+

BostonLOCATION

0.99+

Merrit BaerPERSON

0.99+

CJ MosesPERSON

0.99+

40QUANTITY

0.99+

MerrittPERSON

0.99+

15%QUANTITY

0.99+

25%QUANTITY

0.99+

AMDORGANIZATION

0.99+