Breaking Analysis: Snowflake caught in the storm clouds


 

>> From theCUBE Studios in Palo Alto and Boston, bringing you data driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> A better than expected earnings report in late August got people excited about Snowflake again, but the negative sentiment in the market has weighed heavily on virtually all growth tech stocks, and Snowflake is no exception. As we've stressed many times, the company's management is on a long term mission to dramatically simplify the way organizations use data. Snowflake is tapping into a multi hundred billion dollar total available market and continues to grow at a rapid pace. In our view, Snowflake is embarking on its third major wave of innovation, data apps, while its first and second waves are still bearing significant fruit. Now for short term traders focused on the next 90 or 180 days, that probably doesn't matter. But those taking a longer view are asking, "Should we still be optimistic about the future of this high flyer, or is it just another overhyped tech play?" Hello and welcome to this week's Wikibon CUBE Insights powered by ETR. Snowflake's quarter just ended, and in this Breaking Analysis we take a look at the most recent survey data from ETR to see what clues and nuggets we can extract to predict the near term future and the long term outlook for Snowflake, which is going to announce its earnings at the end of this month. Okay, so you know the story. If you've been an investor in Snowflake this year, it's been painful. We said at IPO, "If you really want to own this stock on day one, just hold your nose and buy it." But like most IPOs, we said there would likely be a better entry point in the future, and not surprisingly that's been the case. Snowflake IPOed at a price of 120, which you couldn't touch on day one unless you got into a friends and family deal. And if you did, you're still up 5% or so. So congratulations. But at one point last year you were up well over 200%.
That's been the nature of this volatile stock, and I certainly can't help you with the timing of the market. But longer term, Snowflake is targeting 10 billion in revenue for fiscal year 2028. A big number. Is it achievable? Is it big enough? Tell you what, let's come back to that. Now shorter term, our expert trader and Breaking Analysis contributor Chip Simonton said he got out of the stock a while ago after having taken a shot at what turned out to be a bear market rally. He pointed out that the stock had been bouncing around the 150 level for the last few months and broke that to the downside last Friday. So he'd expect 150 is where the stock is going to find resistance on the way back up, but there's no sign of support right now. He said maybe at 120, which was the July low and of course the IPO price that we just talked about. Now, perhaps earnings will be a catalyst when Snowflake announces on November 30th, but until the mentality toward growth tech changes, nothing's likely to change dramatically, according to Simonton. So now that we have that out of the way, let's take a look at the spending data for Snowflake in the ETR survey. Here's a chart that shows the time series breakdown of Snowflake's net score going back to the October 2021 survey. Now at that time, Snowflake's net score stood at a robust 77%. And remember, net score is a measure of spending velocity. It's a proprietary metric that ETR derives from a quarterly survey of IT buyers, asking the respondents, "Are you adopting the platform new? Are you spending 6% or more? Is your spending flat? Is your spending down 6% or worse? Or are you leaving the platform, decommissioning?" You subtract the percent of customers that are spending less or churning from those that are adopting or spending more, and you get a net score. And that's expressed as a percentage of customers responding. In this chart we also show Snowflake's Ns out of the total survey.
The total survey ranges between 1,200 and 1,400 respondents each quarter. And in the very last row, we show the number of Snowflake respondents that are coming into the survey from the Fortune 500 and the Global 2000. Those are two very important Snowflake constituencies. Now what this data tells us is that Snowflake exited 2021 with very strong momentum and a net score of 82%, which is off the charts, and it was actually accelerating from the previous survey. Now by April that sentiment had flipped and Snowflake came down to earth with a 68% net score. Still highly elevated relative to its peers, but meaningfully down. Why was that? Because we saw a drop in new adds and an increase in flat spend. Then into the July and most recent October surveys, you saw a significant drop in the percentage of customers that were spending more. Now, notably, the percentage of customers who are contemplating adding the platform is actually staying pretty strong, but it is off a bit this past survey. And combined with a slight uptick in planned churn, net score is now down to 60%. That uptick, from 0% to 1% and then 3%, is still small, but that net score at 60% is still 20 percentage points higher than our highly elevated benchmark of 40%, as you'll recall from earlier Breaking Analysis episodes. That 40% mark is what we consider a milestone. Anything above that is actually quite strong. But again, Snowflake is down. And coming back to churn, while 3% churn is very low, in previous quarters we've seen Snowflake at 0% or 1% decommissions. Now the last thing to note in this chart is the meaningful uptick in survey respondents citing that they're using the Snowflake platform. That's up to 212 in the survey. So look, it's hard to imagine that Snowflake doesn't feel the softening in the market like everyone else. Snowflake is guiding for around 60% growth in product revenue against a tough compare from a year ago, with a 2% operating margin.
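The net score arithmetic described earlier (percent adopting or spending more, minus percent spending less or churning) is simple enough to sketch in a few lines of Python. The respondent counts below are illustrative placeholders, not actual ETR figures:

```python
def net_score(adopting, spending_more, flat, spending_less, churning):
    """ETR-style net score: percent of respondents adopting or spending
    6%+ more, minus the percent spending less or decommissioning."""
    total = adopting + spending_more + flat + spending_less + churning
    positive = (adopting + spending_more) / total * 100
    negative = (spending_less + churning) / total * 100
    return positive - negative

# Illustrative respondent counts (not real survey data):
# 20 new adopters, 100 spending more, 70 flat, 8 down, 2 churning
print(round(net_score(20, 100, 70, 8, 2)))  # -> 55
```

Note that flat spenders dilute the score without contributing to either side, which is why a rise in flat spend alone pulls net score down.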
So like every company, the reaction of the street is going to come down to how accurate or conservative the guide is from their CFO. Now, earlier this year, Snowflake acquired a company called Streamlit for around $800 million. Streamlit is an open source Python library and it makes it easier to build data apps with machine learning, obviously a huge trend. And like Snowflake generally, its focus is on simplifying the complex, in this case making data science easier to integrate into data apps that business people can use. So we were excited this summer in the July ETR survey to see that they added some nice data on Streamlit, which we're showing here in comparison to Snowflake's core business on the left hand side. That's the data warehousing; the Streamlit piece is on the right hand side. And we show again net score over time from the previous survey for Snowflake's core database and data warehouse offering, again on the left, as compared to Streamlit on the right. Snowflake's core product had 194 responses in the October 2022 survey; Streamlit had an N of 73, which is up from 52 in the July survey. So a significant uptick in people responding that they're adopting Streamlit. That was pretty impressive to us. And it's hard to see, but the net score stayed pretty constant for Streamlit at 51%. It was 52% I think in the previous quarter, well over that magic 40% mark. But when you blend it with Snowflake, it does sort of bring things down a little bit. Now there are two key points here. One is that the acquisition seems to have gained exposure right out of the gate, as evidenced by the large number of responses. And two, the spending momentum. Again, while it's lower than Snowflake overall, and when you blend it with Snowflake it does pull it down, it's very healthy and steady. Now let's do a little peer comparison with some of our favorite names in this space.
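A quick aside on Streamlit itself: a plain Python script becomes an interactive web app, which is what makes it attractive for the data apps discussed here. The sketch below is a hypothetical minimal app, not Snowflake's product; the CSV column names are made up for illustration, and it requires `pip install streamlit` and runs via `streamlit run app.py`:

```python
# app.py -- a minimal, hypothetical Streamlit data app.
# Run with: streamlit run app.py
import pandas as pd
import streamlit as st

st.title("Revenue by region")  # page title

# Let the user upload a CSV; 'region' and 'revenue' columns are assumed.
uploaded = st.file_uploader("Upload a CSV with 'region' and 'revenue' columns")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    region = st.selectbox("Region", sorted(df["region"].unique()))
    st.line_chart(df.loc[df["region"] == region, "revenue"])
```

Every widget call (`st.file_uploader`, `st.selectbox`) both renders a control and returns its current value, so there is no callback wiring, which is the "simplifying the complex" point made above.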
This chart shows net score or spending velocity on the Y-axis, and overlap or presence, pervasiveness if you will, in the data set on the X-axis. That red dotted line again is that 40% highly elevated net score that we like to talk about. And the inserted table informs us as to how the companies are plotted, where the dots set up: the net scores and the Ns. And we're comparing a number of database players, although just a caution, Oracle includes all of Oracle, including its apps. But we just put it in there for reference because it is the leader in database. Right off the bat, Snowflake jumps out with a net score of 64%. The 60% from the earlier chart, again, included Streamlit. So you can see its core database and data warehouse business actually is higher than the total company average that we showed you before, because Streamlit is blended in. So when you separate it out, Streamlit is right on top of Databricks. Isn't that ironic? Only Snowflake and Databricks in this selection of names are above the 40% level. You see Mongo and Couchbase, you know, they're solid, and Teradata Cloud actually showing pretty well compared to some of the earlier survey results. Now let's isolate on the database data platform sector and see how that shapes up. And for this analysis, same XY dimensions, we've added the big giants, AWS and Microsoft and Google. And notice that those three plus Snowflake are just at or above the 40% line. Snowflake continues to lead by a significant margin in spending momentum, and it keeps creeping to the right. That's that N that we talked about earlier. Now here's an interesting tidbit. Snowflake is often asked, and I've asked them myself many times, "How are you faring relative to AWS, Microsoft and Google, these big whales with Redshift and Synapse and BigQuery?" And Snowflake has been telling folks that 80% of its business comes from AWS. And when Microsoft heard that, they said, "Whoa, wait a minute, Snowflake, let's partner up."
'Cause Microsoft is smart, and they understand that the market is enormous. And if they could do better with Snowflake, one, they may steal some business from AWS. And two, even if Snowflake is winning against some of the Microsoft database products, if it wins on Azure, Microsoft is going to sell more compute and more storage, more AI tools, more other stuff to these customers. Now AWS is really aggressive from a partnering standpoint with Snowflake. They're openly negotiating, not openly, but they're negotiating better prices. They're realizing that when it comes to data, the cheaper that you make the offering, the more people are going to consume. Scale economies and operating leverage are really powerful things that kick in at volume. Now Microsoft, they're coming along, they obviously get it, but Google is seemingly resistant to that type of go-to-market partnership. Rather than lean into Snowflake as a great partner, Google's field force is kind of fighting it. Google itself at Cloud Next heavily messaged what they call the open data cloud, which is a direct rip off of Snowflake. So what can we say about Google? They continue to be kind of behind the curve when it comes to go to market. Now just a brief aside on the competitive posture. I've seen Slootman, Frank Slootman, CEO of Snowflake, in action with his prior companies and how he depositioned the competition. At Data Domain, he eviscerated a company called Avamar with what he called their expensive and slow post-process architecture. I think he actually called it garbage, if I recall, at one conference I heard him speak at. And he sort of destroyed BMC when he was at ServiceNow, kind of positioning them as the equivalent of the Department of Motor Vehicles. And so it's interesting to hear how Snowflake openly talks about the data platforms of AWS, Microsoft, Google, and Databricks. I'll give you the sort of short bumper sticker.
Redshift is just an on-prem database that AWS morphed to the cloud, which by the way is kind of true. They actually did a brilliant job of it, but it's basically a fact. Microsoft's Synapse, a collection of legacy databases, which also kind of morphed to run in the cloud. And even BigQuery, which is considered cloud native by many if not most, is being positioned by Snowflake as originally an on-prem database to support Google's ad business, maybe. And Databricks is for those people smart enough to get into Berkeley that love complexity. Now, Snowflake doesn't mention Berkeley as far as I know. That's my addition. But you get the point. And the interesting thing about Databricks and Snowflake is, a while ago on theCUBE I said that there was a new workload type emerging around data, where you have the AWS cloud, Snowflake obviously for the cloud database, and Databricks for the data science and ML. You bring those things together and there's this new workload emerging that's going to be very powerful in the future. And it's interesting to see now the aspirations of all three of these platforms are colliding. That's quite a dynamic, especially when you see both Snowflake and Databricks putting venture money in and getting their hooks into the loyalties of the same companies, like dbt Labs and Collibra. Anyway, Snowflake's posture is: we are the pioneer in cloud native data warehouse, data sharing and now data apps, and our platform is designed for business people that want simplicity. The other guys, yes, they're formidable, but we, Snowflake, have an architectural lead, and of course we run in multiple clouds. So it's pretty strong positioning, or depositioning, you have to admit. Now I'm not sure I agree with the BigQuery knock completely. I think that's a bit of a stretch, but Snowflake, as we see in the ETR survey data, is winning.
So in thinking about the longer term future, let's talk about what's different with Snowflake, where it's headed and what the opportunities are for the company. Snowflake put itself on the map by focusing on simplifying data analytics. What's interesting about that is the company's founders are, as you probably know, from Oracle. And rather than focusing on transactional data, which is Oracle's sweet spot, the stuff they worked on when they were at Oracle, the founders said, "We're going to go somewhere else. We're going to attack the data warehousing problem and the data analytics problem." And they completely re-imagined the database and how it could be applied to solve those challenges, and reimagined what was possible if you had virtually unlimited compute and storage capacity. And of course Snowflake became famous for separating compute from storage, for being able to completely shut down compute so you didn't have to pay for it when you're not using it, for the ability to have multiple clusters hit the same data without making endless copies, and for a consumption/cloud pricing model. And then of course everyone on the planet realized, "Wow, that's a pretty good idea," and every venture capitalist in Silicon Valley has been funding companies to copy that move. And that today has pretty much become mainstream and table stakes. But I would argue that Snowflake not only had the lead, but when you look at how others are approaching this problem, it's not necessarily as clean and as elegant. Some of the early startups I think get it, and maybe had an advantage of starting later, which can be a disadvantage too. But AWS is a good example of what I'm saying here. Its version of separating compute from storage was an afterthought and it's good, it's...
Given what they had it was actually quite clever and customers like it, but it's more of a, "Okay, we're going to tier the storage to lower cost, we're going to sort of dial down the compute, not completely, we're not going to shut it off, we're going to minimize the compute required." It's really not true separation like, for instance, Snowflake has. But having said that, we're talking about competitors with lots of resources and coherent offerings. And so I don't want to make this necessarily all about the product, but all things being equal, architecture matters, okay? So that's the cloud S-curve, the first one we're showing. Snowflake's still on that S-curve, and in and of itself it's got legs, but it's not what's going to power the company to 10 billion. The next S-curve we denote is multi-cloud, in the middle. And now while 80% of Snowflake's revenue is AWS, Microsoft is ramping up and Google, well, we'll see. But the interesting part of that curve is data sharing, and this idea of data clean rooms. I mean it really should be called the data sharing curve, but I have my reasons for calling it multi-cloud. And this is all about network effects and data gravity, and you're seeing this play out today, especially in industries like financial services and healthcare and government, highly regulated verticals where folks are super paranoid about compliance. They're not going to share data if they're going to get sued for it, if they're going to be on the front page of the Wall Street Journal for some kind of privacy breach. And what Snowflake has done is said, "Put all the data in our cloud." Now, of course that triggers a lot of people because it's a walled garden, okay? It is. That's the trade off. It's not the Wild West, it's not Windows, it's Mac. It's more controlled.
But the idea is that as different parts of the organization, or even partners, begin to share data that they need, it's got to be governed, it's got to be secure, it's got to be compliant, it's got to be trusted. So Snowflake introduced the idea of what they call stable edges. I think that's the term that they use, and they track a metric around stable edges. A stable edge, or think of it as a persistent edge, is an ongoing relationship between two parties that lasts for some period of time, more than a month. It's not just a one shot deal, a one and done type of, "Oh, I shared it for a day, done. I sent you an FTP file, done." No, it's got to have trajectory over time, four weeks or six weeks or some period of time that's meaningful. And that metric is growing. Now there's a different metric that they track as well: I think around 20% of Snowflake customers are actively sharing data today, and then they track the number of those edge relationships that exist. So that's something that's unique, because again, most data sharing is all about making copies of data. That's great for storage companies; it's bad for auditors, and it's bad for compliance officers. And that trend is just starting out. That middle S-curve is going to kind of hit the base of the steep part of the S-curve, and it's going to have legs through this decade, we think. And then finally, the third wave that we show here is what we call supercloud. That's why I called it multi-cloud before, so it could invoke supercloud: the idea that you've built a PaaS layer that is purpose built for a specific objective, in this case building data apps that are cloud native, shareable and governed. And it's a long-term trend that's going to take some time to develop. I mean, application development platforms can take five to 10 years to mature and gain significant adoption, but this one's unique. This is a critical play for Snowflake.
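The stable-edge metric described earlier, an ongoing two-party sharing relationship rather than a one-and-done transfer, lends itself to a simple sketch. Snowflake's exact definition and threshold aren't public beyond what's described above, so the 28-day cutoff and the event format here are assumptions for illustration:

```python
from datetime import date

def stable_edges(share_events, min_days=28):
    """Count 'stable edges': provider->consumer sharing relationships
    whose observed activity spans at least `min_days`. The threshold is
    an assumed placeholder for the "more than a month" idea above."""
    spans = {}
    for provider, consumer, day in share_events:
        key = (provider, consumer)
        first, last = spans.get(key, (day, day))
        spans[key] = (min(first, day), max(last, day))
    return sum(1 for first, last in spans.values()
               if (last - first).days >= min_days)

events = [
    ("bank_a", "insurer_b", date(2022, 9, 1)),
    ("bank_a", "insurer_b", date(2022, 10, 15)),  # ~6 weeks apart: stable
    ("bank_a", "retailer_c", date(2022, 10, 1)),  # one-shot: not stable
]
print(stable_edges(events))  # -> 1
```

The key design point is that the metric counts relationships, not transfers, so a burst of one-day FTP-style handoffs contributes nothing.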
If it's going to compete with the big cloud players, it has to have an app development framework like Snowpark. It has to accommodate new data types like transactional data. That's why it announced this thing called Unistore last June at Snowflake Summit. And the pattern that's forming here is Snowflake is building layer upon layer with its architecture at the core. It's not, currently anyway, going out and saying, "All right, we're going to buy a company that's got another billion dollars in revenue and that's how we're going to get to 10 billion." So it's not buying its way into new markets through revenue. It's actually buying smaller companies that can complement Snowflake, that it can turn into revenue for growth, and that fit into the data cloud. Now as to the 10 billion by fiscal year 2028, is that achievable? That's the question. Yeah, I think so. With the momentum, resources, go-to-market, product and management prowess that Snowflake has? Yes, it's definitely achievable. And one could argue $10 billion is too conservative. Indeed, Snowflake CFO Mike Scarpelli will fully admit his forecast is built on existing offerings. He's not including revenue, as I understand it, from all the new stuff that's in the pipeline, because he doesn't know what it's going to look like. He doesn't know what the adoption is going to look like. He doesn't have data on that adoption, not just yet anyway. And of course things can change quite dramatically. It's possible that its forecasts for existing businesses don't materialize, or competition picks them off, or a company like Databricks actually is able in the longer term to replicate the functionality of Snowflake with open source technologies, which would be a very competitive source of innovation. But in our view, there's plenty of room for growth, the market is enormous, and the real key is, can and will Snowflake deliver on the promises of simplifying data?
Of course we've heard this before, from data warehouses, data marts and data lakes, master data management, ETL, data movers and data copiers, Hadoop, and a raft of technologies that have not lived up to expectations. And we've also, by the way, seen some tremendous successes in the software business with the likes of ServiceNow and Salesforce. So will Snowflake be the next great software name and hit that 10 billion magic mark? I think so. Let's reconnect in 2028 and see. Okay, we'll leave it there today. I want to thank Chip Simonton for his input to today's episode. Thanks to Alex Myerson, who's on production and manages the podcast, and Ken Schiffman as well. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our Editor in Chief over at SiliconANGLE. He does some great editing for us. Check it out for all the news. Remember, all these episodes are available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com. Or you can email me to get in touch at david.vellante@siliconangle.com. DM me @dvellante or comment on our LinkedIn posts. And please do check out etr.ai. They've got the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching, thanks for listening, and we'll see you next time on Breaking Analysis. (upbeat music)

Published Date : Nov 10 2022

Kirk Haslbeck, Collibra, Data Citizens 22


 

(atmospheric music) >> Welcome to theCUBE's coverage of Data Citizens 2022, Collibra's customer event. My name is Dave Vellante. With us is Kirk Haslbeck, who's the Vice President of Data Quality at Collibra. Kirk, good to see you. Welcome. >> Thanks for having me, Dave. Excited to be here. >> You bet. Okay, we're going to discuss data quality and observability. It's a hot trend right now. You founded a data quality company, OwlDQ, and it was acquired by Collibra last year. Congratulations. And now you lead data quality at Collibra. So we're hearing a lot about data quality right now. Why is it such a priority? Take us through your thoughts on that. >> Yeah, absolutely. It's definitely exciting times for data quality, which, you're right, has been around for a long time. So why now? And why is it so much more exciting than it used to be? I think it's a bit stale, but we all know that companies use more data than ever before, and the variety has changed and the volume has grown. And while I think that remains true, there are a couple other hidden factors at play that everyone's so interested in as to why this is becoming so important now. And I guess you could kind of break this down simply and think about it. If, Dave, you and I were going to build a new healthcare application and monitor the heartbeat of individuals, imagine if we get that wrong, what the ramifications could be, what those incidents would look like. Or maybe better yet, we try to build a new trading algorithm with a crossover strategy, where the 50 day crosses the 10 day average. And imagine if the data underlying the inputs to that is incorrect. We would probably have major financial ramifications in that sense. So it kind of starts there, where everybody's realizing that we're all data companies, and if we are using bad data, we're likely making incorrect business decisions. But I think there are kind of two other things at play.
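As a quick aside, the crossover strategy Kirk mentions can be sketched in a few lines of Python. This is an illustrative toy, not a real trading system; the window sizes are parameters, and the demo uses tiny windows so the signal is easy to see:

```python
def crossover_signals(prices, short=10, long=50):
    """Return +1 where the short moving average crosses above the long
    one, -1 where it crosses below, 0 elsewhere. Bad input prices mean
    bad signals downstream -- which is Kirk's point about data quality."""
    def sma(window, i):
        return sum(prices[i - window + 1:i + 1]) / window

    signals = [0] * len(prices)
    for i in range(long, len(prices)):
        prev = sma(short, i - 1) - sma(long, i - 1)
        curr = sma(short, i) - sma(long, i)
        if prev <= 0 < curr:
            signals[i] = 1    # short MA crossed above long MA
        elif prev >= 0 > curr:
            signals[i] = -1   # short MA crossed below long MA
    return signals

# Tiny demo with 2-day and 4-day windows: a price jump triggers a buy.
print(crossover_signals([1, 1, 1, 1, 10, 10, 10, 10], short=2, long=4))
# -> [0, 0, 0, 0, 1, 0, 0, 0]
```

A single corrupted price in the input shifts or fabricates crossings, so every downstream trade inherits the error, which is exactly the "financial ramifications" scenario above.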
I bought a car not too long ago and my dad called and said, "How many cylinders does it have?" And I realized in that moment I might have failed him, because I didn't know. And I used to ask those types of questions about anti-lock brakes and cylinders, and if it's manual or automatic. And I realized, I now just buy a car that I hope works. And it's so complicated with all the computer chips, I really don't know that much about it. And that's what's happening with data. We're just loading so much of it. And it's so complex that the way companies consume it in the IT function is that they bring in a lot of data and then they syndicate it out to the business. And it turns out that the individuals loading and consuming all of this data for the company actually may not know that much about the data itself, and that's not even their job anymore. So we'll talk more about that in a minute, but that's really what's setting the foreground for this observability play and why everybody's so interested. It's because we're becoming less close to the intricacies of the data, and we just expect it to always be there and be correct. >> You know, the other thing too about data quality, and for years we did the MIT CDOIQ event. We didn't do it last year; COVID messed everything up. But the observation I would make there, and your thoughts are welcome, is that data quality used to be information quality, used to be this back office function, and then it became sort of front office with financial services and government and healthcare, these highly regulated industries. And then the whole chief data officer thing happened, and people were realizing, well, they sort of flipped the bit from data as a risk to data as an asset. And now, as we say, we're going to talk about observability. And so it's really become front and center, just the whole quality issue, because data's so fundamental, hasn't it? >> Yeah, absolutely.
I mean, let's imagine we pull up our phones right now and I go to my favorite stock ticker app, and I check out the Nasdaq market cap. I really have no idea if that's the correct number. I know it's a number, it looks large, it's in a numeric field. And that's kind of what's going on. There are so many numbers, and they're coming from all of these different sources and data providers, and they're getting consumed and passed along. But there isn't really a way to tactically put controls on every number and metric across every field we plan to monitor. But with the scale that we've achieved, even in the early days before Collibra, what's been so exciting is, we have these types of observation techniques, these data monitors that can actually track past performance of every field at scale. And why that's so interesting, and why I think the CDO is listening intently nowadays to this topic, is that maybe we could surface all of these problems with the right solution of data observability and with the right scale, and then just be alerted on breaking trends. So we're sort of shifting away from this world where you must write a condition, and then when that condition breaks, that was always known as a break record. But what about breaking trends and root cause analysis? And is it possible to do that with less human intervention? And so I think most people are seeing now that it's going to have to be a software tool and a computer system. It's not ever going to be based on one or two domain experts anymore. >> So how does data observability relate to data quality? Are they sort of two sides of the same coin? Are they cousins? What's your perspective on that? >> Yeah, it's super interesting. It's an emerging market, so the language is changing and a lot of the topic areas are changing. The way that I like to say it or break it down, because the lingo is a constantly moving target in this space, is really breaking records versus breaking trends.
I could write a condition: when this thing happens it's wrong, and when it doesn't, it's correct. Or I could look for a trend, and I'll give you a good example. Everybody's talking about fresh data and stale data, and why would that matter? Well, if your data never arrived, or only part of it arrived, or it didn't arrive on time, it's likely stale, and there will not be a condition that you could write that would show you all the goods and the bads. That was kind of your traditional approach of data quality: break records. But your modern day approach is: you lost a significant portion of your data, or it did not arrive on time to make that decision accurately, on time. And that's a hidden concern. Some people call this freshness; we call it stale data. But it all points to the same idea: the thing that you're observing may not be a data quality condition anymore. It may be a breakdown in the data pipeline. And with thousands of data pipelines in play for every company out there, there's more than a couple of these happening every day. >> So what's the Collibra angle on all this stuff? You made the acquisition, you've got data quality and observability coming together. You guys have a lot of expertise in this area, but you hear provenance of data, you just talked about stale data, the whole trend toward realtime. How is Collibra approaching the problem, and what's unique about your approach? >> Well, I think where we're fortunate is with our background. Myself and the team, we sort of lived this problem for a long time in the Wall Street days about a decade ago, and we saw it from many different angles. And what we came up with, before it was called data observability or reliability, was basically the underpinnings of that. So we're a little bit ahead of the curve there when most people evaluate our solution. It's more advanced than some of the observation techniques that currently exist.
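The breaking-trend idea Kirk describes, flagging a load that deviates from its own history rather than testing a hand-written rule, can be sketched with a simple z-score on daily arrival volumes. Real observability tools use more sophisticated anomaly models; the threshold and counts here are illustrative:

```python
from statistics import mean, stdev

def breaking_trend(daily_row_counts, z_threshold=3.0):
    """Trend-style check: flag the latest load if it deviates from the
    historical arrival pattern. A simple z-score stand-in for the
    learned models observability tools actually use."""
    history, today = daily_row_counts[:-1], daily_row_counts[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu  # perfectly regular history: any change is a flag
    return abs(today - mu) / sigma > z_threshold

# A pipeline that normally lands ~100k rows/day suddenly delivers 4k:
counts = [101_000, 99_500, 100_200, 98_900, 100_800, 4_000]
print(breaking_trend(counts))  # -> True
```

No one wrote a "rows must exceed N" rule here; the check derives its expectation from past behavior, which is the distinction between a break record and a breaking trend.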
But we've also always covered data quality, and we believe that people want to know more; they need more insights. And they want to see break records and breaking trends together so they can correlate the root cause. And we hear that all the time: "I have so many things going wrong, just show me the big picture. Help me find the thing that, if I were to fix it today, would make the most impact." So we're really focused on root cause analysis, business impact, connecting it with lineage and catalog metadata. And as that grows, you can actually achieve total data governance. At this point, with the acquisition of what was a lineage company years ago, and then my company OwlDQ, now Collibra Data Quality, Collibra may be the best positioned for total data governance and intelligence in the space. >> Well, you mentioned financial services a couple of times and some examples. Remember the flash crash in 2010? Nobody had any idea what that was. They would just say, "Oh, it's a glitch." They didn't understand the root cause of it. So this is a really interesting topic to me. So we know at Data Citizens 22 that you're announcing, you've got to announce new products, right? It is your yearly event. What's new? Give us a sense as to what products are coming out, but specifically around data quality and observability. >> Absolutely. There's always a next thing on the forefront, and the one right now is these hyperscalers in the cloud. So you have databases like Snowflake and BigQuery, and Databricks with Delta Lake and SQL pushdown. And ultimately what that means is a lot of people are storing and loading data even faster in a SaaS-like model. And we've started to hook into these databases, and while we've always worked with the same databases in the past, they're supported today, we're doing something called native database pushdown, where the entire compute and data activity happens in the database. And why is that so interesting and powerful now?
It's because everyone's concerned with something called egress. Did my data, that I've spent all this time and money with my security team securing, ever leave my hands? Did it ever leave my secure VPC, as they call it? And with these native integrations that we're building, and about to unveil here as kind of a sneak peek for next week at Data Citizens, we're now doing all compute and data operations in databases like Snowflake. And what that means is, with no install and no configuration, you could log into the Collibra data quality app and have all of your data quality running inside the database that you've probably already picked as your go-forward, secured database of choice. So we're really excited about that. And I think if you look at the whole landscape of network cost, egress cost, data storage and compute, what people are realizing is it's extremely efficient to do it in the way that we're about to release here next week. >> So this is interesting because, what you just described, you mentioned Snowflake, you mentioned Google, and yeah, you mentioned Databricks. You know, Snowflake has the data cloud. If you put everything in the data cloud, okay, you're cool. But then Google's got the open data cloud, if you heard Google Next. And now Databricks doesn't call it the data cloud, but they have, like, the open source data cloud. So you have all these different approaches, and there's really no way, up until now I'm hearing, to really understand the relationships between all of them and have confidence across them. It's like Zhamak Dehghani says: you should just be a node on the mesh. I don't care if it's a data warehouse or a data lake, or where it comes from; it's a point on that mesh, and I need tooling to be able to have confidence that my data is governed and has the proper lineage and provenance. And that's what you're bringing to the table. Is that right? Did I get that right? >> Yeah, that's right. 
And for us, it's not that we haven't been working with those great cloud databases; it's the fact that we can send them the instructions now. We can send them the operating ability to crunch all of the calculations, the governance, the quality, and get the answers. And what that's doing is basically zero network cost, zero egress cost, zero latency of time. And so if you were to log into BigQuery tomorrow using our tool, or say Snowflake for example, you have instant data quality metrics, instant profiling, instant lineage, and access and privacy controls, things of that nature that just become less onerous. What we're seeing is there's so much technology out there, just like all of the major brands that you mentioned, but how do we make it easier? The future is about fewer clicks, faster time to value, faster scale, and eventually lower cost. And we think that this positions us to be the leader there. >> I love this example because we've heard talk about, well, the cloud guys are going to own the world. And of course now we're seeing that the ecosystem is finding so much white space to add value and connect across clouds. Sometimes we call it supercloud, or interclouding. Alright, Kirk, give us your final thoughts on the trends that we've talked about and Data Citizens 22. >> Absolutely. Well, I think one big trend is discovery and classification. We're seeing that across the board. People used to know it was a zip code, and nowadays, with the amount of data that's out there, they want to know where everything is, where their sensitive data is, if it's redundant; tell me everything, inside of three to five seconds. And with that comes, they want to know how fast they can get controls and insights out of their tools in all of these hyperscale databases. So I think we're going to see more one-click solutions, more SaaS-based solutions, and solutions that hopefully prove faster time to value on all of these modern cloud platforms. >> Excellent. 
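The pushdown idea discussed in this exchange can be pictured with a small sketch: instead of pulling rows out of the warehouse, a quality check is compiled to SQL and executed where the data lives, so only a tiny aggregate ever crosses the network. This illustrates the general technique, not Collibra's actual product code; the table and column names are made up.

```python
# Build a null-rate profile as SQL so the row-level work happens inside the
# warehouse (Snowflake, BigQuery, ...) and only two numbers come back.
# COUNT(col) counts non-null values, so total - COUNT(col) = null count.
def null_rate_query(table: str, column: str) -> str:
    return (
        f"SELECT COUNT(*) AS total_rows, "
        f"COUNT(*) - COUNT({column}) AS null_rows "
        f"FROM {table}"
    )

print(null_rate_query("orders", "customer_id"))
```

Run against a billion-row table, this query moves two integers out of the database instead of a billion rows, which is where the zero-egress, zero-network-cost argument comes from.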
All right, Kirk Haslbeck, thanks so much for coming on theCUBE and previewing Data Citizens 22. Appreciate it. >> Thanks for having me, Dave. >> You're welcome. All right. And thank you for watching. Keep it right there for more coverage from theCUBE. (atmospheric music)

Published Date: Nov 2, 2022



Collibra Data Citizens 22


 

>>Collibra is a company that was founded in 2008, right before the so-called modern big data era kicked into high gear. The company was one of the first to focus its business on data governance. Now, historically, data governance and data quality initiatives were back office functions, largely confined to regulated industries that had to comply with public policy mandates. But as the cloud went mainstream, the tech giants showed us how valuable data could become, and the value proposition for data quality and trust evolved from a primarily compliance-driven issue to a lynchpin of competitive advantage. But data in the decade of the 2010s was largely about getting the technology to work. You had these highly centralized technical teams that were formed with hyper-specialized skills to develop data architectures and processes to serve the myriad data needs of organizations. And it resulted in a lot of frustration with data initiatives for most organizations that didn't have the resources of the cloud guys and the social media giants to really attack their data problems and turn data into gold. This is why today, for example, there's quite a bit of momentum toward rethinking monolithic data architectures. You hear about initiatives like data mesh and the idea of data as a product; they're gaining traction as a way to better serve the data needs of decentralized business unit users, and you hear a lot about data democratization. So these decentralization efforts around data are great, but they create a new set of problems. Specifically, how do you deliver a self-service infrastructure to business users and domain experts? Now, the cloud is definitely helping with that, but also, how do you automate governance? This becomes especially tricky as protecting data privacy has become more and more important. 
>>In other words, while it's enticing to experiment and run fast and loose with data initiatives, kinda like the Wild West, to find new veins of gold, it has to be done responsibly. As such, the idea of data governance has had to evolve to become more automated and intelligent. Governance and data lineage are still fundamental to ensuring trust as data moves like water through an organization; no one is gonna use data that isn't trusted. Metadata has become increasingly important for data discovery and data classification. As data flows through an organization, the ability to continuously check for data flaws and automate data quality becomes a functional requirement of any modern data management platform. And finally, data privacy has become a critical adjacency to cybersecurity. So you can see how data governance has evolved into a much richer set of capabilities than it was 10 or 15 years ago. >>Hello and welcome to the Cube's coverage of Data Citizens, made possible by Collibra, a leader in so-called data intelligence and the host of Data Citizens 2022, which is taking place in San Diego. My name is Dave Vellante and I'm one of the hosts of our program, which is running in parallel to Data Citizens. Now, at the Cube we like to say we extract the signal from the noise, and over the next couple of days we're gonna feature some of the themes from the keynote speakers at Data Citizens, and we'll hear from several of the executives. Felix Van de Maele, who is the co-founder and CEO of Collibra, will join us, along with one of the other founders of Collibra, Stijn Christiaens, who's gonna join my colleague Lisa Martin. I'm also gonna sit down with Laura Sellers, the Chief Product Officer at Collibra. We'll talk about some of the announcements and innovations they're making at the event, and then we'll dig in further to data quality with Kirk Haslbeck. He's the vice president of Data Quality at Collibra. 
He's an amazingly smart dude who founded OwlDQ, a company that he sold to Collibra last year. Now, many companies didn't make it through the Hadoop era; you know, they missed the industry waves and they became driftwood. Collibra, on the other hand, has evolved its business. They've leveraged the cloud, expanded their product portfolio, and leaned in heavily to some major partnerships with cloud providers, as well as receiving a strategic investment from Snowflake earlier this year. So it's a really interesting story that we're thrilled to be sharing with you. Thanks for watching, and I hope you enjoy the program. >>Last year the Cube covered Data Citizens, Collibra's customer event. And the premise that we put forth prior to that event was that despite all the innovation that's gone on over the last decade or more with data, starting with the Hadoop movement, then data lakes, Spark, the ascendancy of programming languages like Python, the introduction of frameworks like TensorFlow, the rise of AI, low code, no code, et cetera, businesses still find it's too difficult to get more value from their data initiatives. And we said at the time, you know, maybe it's time to rethink data innovation. While a lot of the effort has been focused on more efficiently storing and processing data, perhaps more energy needs to go into thinking about the people and the process side of the equation, meaning making it easier for domain experts to both gain insights from data, trust the data, and begin to use that data in new ways, fueling data products, monetization and insights. Data Citizens 2022 is back, and we're pleased to have Felix Van de Maele, who is the co-founder and CEO of Collibra. He's on the Cube. We're excited to have you, Felix. Good to see you again. >>Likewise Dave. Thanks for having me again. >>You bet. 
All right, we're gonna get the update from Felix on the current data landscape, how he sees it, why data intelligence is more important now than ever, and get current on what Collibra has been up to over the past year and what's changed since Data Citizens 2021. And we may even touch on some of the product news. So Felix, we're living in a very different world today with businesses and consumers. They're struggling with things like supply chains, uncertain economic trends, and we're not just snapping back to the 2010s; that's clear, and that's really true as well in the world of data. So what's different in your mind in the data landscape of the 2020s from the previous decade, and what challenges does that bring for your customers? >>Yeah, absolutely. And I think you said it well, Dave, in the intro: that rising complexity and fragmentation in the broader data landscape hasn't gotten any better over the last couple of years. When we talk to our customers, that level of fragmentation, the complexity, how do we find data that we can trust, that we know we can use, has only gotten more difficult. So that trend is continuing. I think what is changing is that the trend has become much more acute. The other thing we've seen over the last couple of years is that the level of scrutiny that organizations are under with respect to data, as data becomes more mission critical, more impactful and important, the level of scrutiny with respect to privacy, security and regulatory compliance is only increasing as well, which again is really difficult in this environment of continuous innovation, continuous change, and continuously growing complexity and fragmentation. So it's become much more acute. And to your earlier point, we do live in a different world, and in the past couple of years we could probably just kind of brute-force it, right? We could focus on the top line. 
There were enough investments to be had. I think nowadays organizations are in a very different environment, where there's much more focus on cost control, productivity, efficiency. How do we truly get value from that data? So again, I think it's just another incentive for organizations to now truly look at data and how to scale it, not just from a technology and infrastructure perspective, but how do you actually scale data from an organizational perspective, right? You said it: the people and process. How do we do that at scale? And that's only becoming more important. And we do believe that the economic environment we find ourselves in today is gonna be a catalyst for organizations to really dig in more seriously, if you will, than they maybe have in the past. >>You know, I don't know when you guys founded Collibra if you had a sense as to how complicated it was gonna get, but you've been on a mission to really address these problems from the beginning. How would you describe your mission, and what are you doing to address these challenges? >>Yeah, absolutely. We started Collibra in 2008, so in some sense in the last financial crisis, and that was really the start of Collibra, where we found product-market fit working with large financial institutions to help them cope with the increasing compliance requirements they were faced with because of the financial crisis. And kind of here we are again, in a very different environment of course, almost 15 years later, but with data only becoming more important. But our mission, to deliver trusted data for every user, every use case, and across every source, frankly, has only become more important. 
So while it has been an incredible journey over the last 14, 15 years, I think we're still relatively early in our mission to, again, be able to provide everyone, and that's why we call it data citizens; we truly believe that everyone in the organization should be able to use trusted data in an easy manner. That mission is only becoming more important and more relevant. We definitely have a lot more work ahead of us, because we are still relatively early in that journey. >>Well, that's interesting, because, you know, in my observation it takes seven to 10 years to actually build a company, and then the fact that you're still in the early days is kind of interesting. I mean, Collibra's had a good 12 months or so since we last spoke at Data Citizens. Give us the latest update on your business. What do people need to know about your current momentum? >>Yeah, absolutely. Again, there's a long tail of organizations that are only now maturing their data practices, and we've seen that transform and influence a lot of the business growth that we've seen: broader adoption of the platform. We work with some of the largest organizations in the world, like Adobe, Heineken, Bank of America, and many more. We have now over 600 enterprise customers, all industry leaders, in every single vertical. So it's really exciting to see that, and to continue to partner with those organizations. On the partnership side, again, a lot of momentum in the markets with some of the cloud partners like Google, Amazon, Snowflake, Databricks and others, right? As those kind of new modern data infrastructures, modern data architectures, are definitely all moving to the cloud, it's a great opportunity for us, our partners, and of course our customers, to help them transition to the cloud even faster. 
>>And so we see a lot of excitement and momentum there. We did an acquisition about 18 months ago around data quality and data observability, which we believe is an enormous opportunity. Of course, data quality isn't new, but I think there are a lot of reasons why we're so excited about quality and observability now. One is around leveraging AI and machine learning, again, to drive more automation. And the second is that those data pipelines that are now being created in the cloud, in these modern data architectures, have become mission critical. They've become real time. And so monitoring and observing those data pipelines continuously has become absolutely critical, so we're really excited about that as well. And on the organizational side, I'm sure you've heard the term data mesh, something that's gaining a lot of momentum, rightfully so. It's really the type of governance that we've always believed in: federated, focused on domains, giving a lot of ownership to different teams. I think that's the way to scale data organizations, and so that aligns really well with our vision, and from a product perspective we've seen a lot of momentum with our customers there as well. >>Yeah, you know, a couple things there. I mean, the acquisition of OwlDQ, you know, Kirk Haslbeck and their team. It's interesting, you know, the whole data quality thing used to be this back office function, really confined to highly regulated industries. It's come to the front office; it's top of mind for chief data officers. Data mesh, you mentioned: you guys are a connective tissue for all these different nodes on the data mesh. That's key. And of course we see you at all the shows. You're a critical part of many ecosystems, and you're developing your own ecosystem. So let's chat a little bit about the products. 
We're gonna go deeper into products later on at Data Citizens 22, but we know you're debuting some new innovations, you know, whether it's under the covers in security, making data more accessible for people, or dealing with workflows and processes, as you talked about earlier. Tell us a little bit about what you're introducing. >>Yeah, absolutely. We're super excited, a ton of innovation. And if we think about the big theme, like I said, we're still relatively early in this journey towards that mission of data intelligence, that really bold and compelling mission. Many customers are just starting on that journey, and we wanna make it as easy as possible for organizations to actually get started, because we know it's important that they do. And for organizations and customers that have been with us for some time, there's still a tremendous amount of opportunity to expand the platform further. And again, to make it easier to really accomplish that mission and vision around the data citizen, where everyone has access to trustworthy data in a very easy way. So that's really the theme of a lot of the innovation that we're driving: a lot of ease of adoption, ease of use, but also, how do we make sure that Collibra becomes this kind of mission critical enterprise platform, from a security, performance, architecture, scale and supportability standpoint, so that we're truly able to deliver that kind of enterprise mission critical platform. And so that's the big theme from an innovation perspective. From a product perspective, there's a lot of new innovation that we're really excited about. A couple of highlights: one is around the data marketplace. Again, a lot of our customers have plans in that direction. How do we make it easy? 
How do we make available a true kind of shopping experience, where anybody in your organization can, in a very easy, search-first way, find the right data product, find the right dataset, and then consume it? Usage analytics: how do we help organizations drive adoption, tell them where things are working really well and where they have opportunities? Homepages, again, to make things easy for people, for anyone in your organization, to get started. And you mentioned the workflow designer; again, we have a very powerful enterprise platform. 
Again, integrations, we talk about being able to connect to every source. Integrations are absolutely critical and we're really excited to deliver new integrations with Snowflake, Azure and Google Cloud storage as well. So there's a lot coming out. The, the team has been work at work really hard and we are really, really excited about what we are coming, what we're bringing to markets. >>Yeah, a lot going on there. I wonder if you could give us your, your closing thoughts. I mean, you, you talked about, you know, the marketplace, you know, you think about data mesh, you think of data as product, one of the key principles you think about monetization. This is really different than what we've been used to in data, which is just getting the technology to work has been been so hard. So how do you see sort of the future and, you know, give us the, your closing thoughts please? >>Yeah, absolutely. And I, and I think we we're really at this pivotal moment, and I think you said it well. We, we all know the constraint and the challenges with data, how to actually do data at scale. And while we've seen a ton of innovation on the infrastructure side, we fundamentally believe that just getting a faster database is important, but it's not gonna fully solve the challenges and truly kind of deliver on the opportunity. And that's why now is really the time to deliver this data intelligence vision, this data intelligence platform. We are still early, making it as easy as we can. It's kind of, of our, it's our mission. And so I'm really, really excited to see what we, what we are gonna, how the marks gonna evolve over the next, next few quarters and years. I think the trend is clearly there when we talk about data mesh, this kind of federated approach folks on data products is just another signal that we believe that a lot of our organization are now at the time. >>The understanding need to go beyond just the technology. 
I really, really think about how do we actually scale data as a business function, just like we've done with it, with, with hr, with, with sales and marketing, with finance. That's how we need to think about data. I think now is the time given the economic environment that we are in much more focus on control, much more focused on productivity efficiency and now's the time. We need to look beyond just the technology and infrastructure to think of how to scale data, how to manage data at scale. >>Yeah, it's a new era. The next 10 years of data won't be like the last, as I always say. Felix, thanks so much and good luck in, in San Diego. I know you're gonna crush it out there. >>Thank you Dave. >>Yeah, it's a great spot for an in-person event and, and of course the content post event is gonna be available@collibra.com and you can of course catch the cube coverage@thecube.net and all the news@siliconangle.com. This is Dave Valante for the cube, your leader in enterprise and emerging tech coverage. >>Hi, I'm Jay from Collibra's Data Office. Today I want to talk to you about Collibra's data intelligence cloud. We often say Collibra is a single system of engagement for all of your data. Now, when I say data, I mean data in the broadest sense of the word, including reference and metadata. Think of metrics, reports, APIs, systems, policies, and even business processes that produce or consume data. Now, the beauty of this platform is that it ensures all of your users have an easy way to find, understand, trust, and access data. But how do you get started? Well, here are seven steps to help you get going. One, start with the data. What's data intelligence? Without data leverage the Collibra data catalog to automatically profile and classify your enterprise data wherever that data lives, databases, data lakes or data warehouses, whether on the cloud or on premise. >>Two, you'll then wanna organize the data and you'll do that with data communities. 
This can be by department, line of business, or functional team, however your organization organizes work and accountability. And for that you'll establish community owners. Communities make it easy for people to navigate through the platform and find the data, and they'll help create a sense of belonging for users. An important and related side note here: we find it's typical in many organizations that data is thought of as just an asset, and IT and data offices are viewed as the owners of it, really central teams performing analytics as a service provider to the enterprise. We believe data is more than an asset; it's a true product that can be converted to value. And that also means establishing business ownership of data, where strategy and ROI come together with subject matter expertise. Okay, three. Next, back to those communities: there, the data owners should explain and define their data, not just the tables and columns, but also the related business terms, metrics and KPIs. These objects, which we call assets, are typically organized into business glossaries and data dictionaries. I definitely recommend starting with the topics that are most important to the business. Four, these steps enable you and your users to have some fun with it: linking everything together builds your knowledge graph, also known as a metadata graph, by linking or relating these assets together. For example, linking a data set to a KPI to a report now enables your users to see what we call the lineage diagram, which visualizes where the data in your dashboards actually came from, what the data means, and who's responsible for it. Speaking of which, here's five: leverage the Collibra trusted business reporting solution on the marketplace, which comes with workflows for those owners to certify their reports, KPIs, and data sets. 
Six, easy to navigate dashboards or landing pages right in your platform for your company's business processes are the most effective way for everyone to better understand and take action on data. Here's a pro tip, use the dashboard design kit on the marketplace to help you build compelling dashboards. Finally, seven, promote the value of this to your users and be sure to schedule enablement office hours and new employee onboarding sessions to get folks excited about what you've built and implemented. Better yet, invite all of those community and data owners to these sessions so that they can show off the value that they've created. Those are my seven tips to get going with Collibra. I hope these have been useful. For more information, be sure to visit collibra.com. >>Welcome to the Cube's coverage of Data Citizens 2022 Collibra's customer event. My name is Dave Valante. With us is Kirk Hasselbeck, who's the vice president of Data Quality of Collibra Kirk, good to see you. Welcome. >>Thanks for having me, Dave. Excited to be here. >>You bet. Okay, we're gonna discuss data quality observability. It's a hot trend right now. You founded a data quality company, OWL dq, and it was acquired by Collibra last year. Congratulations. And now you lead data quality at Collibra. So we're hearing a lot about data quality right now. Why is it such a priority? Take us through your thoughts on that. >>Yeah, absolutely. It's, it's definitely exciting times for data quality, which you're right, has been around for a long time. So why now and why is it so much more exciting than it used to be? I think it's a bit stale, but we all know that companies use more data than ever before and the variety has changed and the volume has grown. And, and while I think that remains true, there are a couple other hidden factors at play that everyone's so interested in as, as to why this is becoming so important now. 
And I guess you could break this down simply and think about it: if, Dave, you and I were gonna build, you know, a new healthcare application that monitors the heartbeat of individuals, imagine if we got that wrong. You know, what could the ramifications be? What would those incidents look like? Or, maybe better yet, we try to build a new trading algorithm with a crossover strategy, where the 50-day average crosses the 10-day average. 
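The crossover strategy Kirk mentions can be sketched in a few lines; the point is how quietly a single corrupted input flips the trading signal. This toy uses short 3-day and 10-day windows and made-up prices rather than the 50/10-day averages he describes.

```python
def sma(prices, window):
    """Simple moving average over the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, fast=3, slow=10):
    """'buy' when the fast moving average sits above the slow one."""
    return "buy" if sma(prices, fast) > sma(prices, slow) else "hold"

good = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]  # clean rising feed
bad = good[:-1] + [0]                                # one corrupted tick

print(crossover_signal(good))  # -> buy
print(crossover_signal(bad))   # -> hold (one bad value flips the signal)
```

Nothing errors and nothing looks broken; the algorithm happily computes the wrong answer, which is exactly the hidden-ramifications scenario being described.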
It's because we're becoming less close to the intricacies of the data, and we just expect it to always be there and be correct. >>You know, the other thing too about data quality: for years we did the MIT CDO IQ event. We didn't do it last year; Covid messed everything up. But the observation I would make there, and I'd love your thoughts, is that data quality, which used to be called information quality, used to be this back office function, and then it became sort of front office with financial services and government and healthcare, these highly regulated industries. And then the whole chief data officer thing happened, and people sort of flipped the bit from data as a risk to data as an asset. And now, as we say, we're gonna talk about observability. And so the whole quality issue has really become front and center, because data's so fundamental, hasn't it? >>Yeah, absolutely. I mean, let's imagine we pull up our phones right now and I go to my favorite stock ticker app and I check out the NASDAQ market cap. I really have no idea if that's the correct number. I know it's a number, it looks large, it's in a numeric field. And that's kind of what's going on. There are so many numbers, and they're coming from all of these different sources and data providers, and they're getting consumed and passed along. But there isn't really a way to tactically put controls on every number and metric across every field we plan to monitor. But with the scale that we've achieved in the early days, even before Collibra, what's been so exciting is we have these types of observation techniques, these data monitors, that can actually track the past performance of every field at scale.
And why that's so interesting, and why I think the CDO is listening intently nowadays to this topic, is that maybe we could surface all of these problems with the right data observability solution at the right scale, and then just be alerted on breaking trends. So we're sort of shifting away from this world where you must write a condition, and when that condition breaks, that was always known as a break record. But what about breaking trends and root cause analysis? And is it possible to do that, you know, with less human intervention? And so I think most people are seeing now that it's going to have to be a software tool and a computer system. It's not ever going to be based on one or two domain experts anymore. >>So how does data observability relate to data quality? Are they sort of two sides of the same coin? Are they cousins? What's your perspective on that? >>Yeah, it's super interesting. It's an emerging market, so the language is changing, and a lot of the topics and areas are changing. The way that I like to say it, or break it down, because the lingo is constantly moving as is the target in this space, is really breaking records versus breaking trends. I could write a condition: when this thing happens, it's wrong, and when it doesn't, it's correct. Or I could look for a trend, and I'll give you a good example. You know, everybody's talking about fresh data and stale data, and why would that matter? Well, if your data never arrived, or only part of it arrived, or it didn't arrive on time, it's likely stale, and there will not be a condition that you could write that would show you all the goods and the bads. That was kind of your traditional approach of data quality break records. But in your modern day approach, you lost a significant portion of your data, or it did not arrive on time to make that decision accurately, on time. And that's a hidden concern.
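The break-records-versus-breaking-trends distinction can be sketched in a few lines. This is a hedged illustration in plain Python, not Collibra's engine; the row counts and the three-sigma threshold are invented for the example:

```python
import statistics

# Daily row counts arriving from a pipeline. The last load was
# partial: data "arrived", but most of it is missing.
daily_rows = [98_000, 101_000, 99_500, 102_000, 100_500, 99_000, 41_000]

# Break RECORD: a hand-written condition ("did any data arrive?").
# The partial load slips right past it.
def record_check(count):
    return count > 0              # holds -> "looks fine"

# Breaking TREND: compare today against recent history instead.
# Flag anything more than k standard deviations below the mean.
def trend_check(history, today, k=3.0):
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return today >= mean - k * stdev   # False -> breaking trend

today, history = daily_rows[-1], daily_rows[:-1]
print(record_check(today))            # True  -- static rule sees no problem
print(trend_check(history, today))    # False -- trend monitor flags it
```

The point of the sketch: no static condition on "arrival" describes a partial load, but a monitor that has tracked the field's past behavior notices the drop immediately.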
Some people call this freshness, we call it stale data, but it all points to the same idea: the thing that you're observing may not be a data quality condition anymore. It may be a breakdown in the data pipeline. And with thousands of data pipelines in play for every company out there, there's more than a couple of these happening every day. >>So what's the Collibra angle on all this? You made the acquisition, you've got data quality and observability coming together, you guys have a lot of expertise in this area. But you hear about provenance of data, you just talked about, you know, stale data, and there's the whole trend toward real time. How is Collibra approaching the problem, and what's unique about your approach? >>Well, I think where we're fortunate is, with our background, myself and the team sort of lived this problem for a long time, you know, in the Wall Street days about a decade ago. And we saw it from many different angles. And what we came up with, before it was called data observability or reliability, was basically the underpinnings of that. So we're a little bit ahead of the curve there. When most people evaluate our solution, it's more advanced than some of the observation techniques that currently exist. But we've also always covered data quality, and we believe that people want to know more, they need more insights, and they want to see break records and breaking trends together so they can correlate the root cause. And we hear that all the time: I have so many things going wrong, just show me the big picture. Help me find the thing that, if I were to fix it today, would make the most impact. So we're really focused on root cause analysis, business impact, and connecting it with lineage and catalog metadata.
And as that grows, you can actually achieve total data governance. At this point, with the acquisition of what was a lineage company years ago, and then my company, OwlDQ, now Collibra Data Quality, Collibra may be the best positioned for total data governance and intelligence in the space. >>Well, you mentioned financial services a couple of times, and some examples. Remember the flash crash in 2010? Nobody had any idea what that was, you know; they just said, oh, it's a glitch, so they didn't understand the root cause of it. So this is a really interesting topic to me. So we know at Data Citizens 22 that you're announcing, you gotta announce new products, right? It's your yearly event. What's new? Give us a sense as to what products are coming out, but specifically around data quality and observability. >>Absolutely. There's always a next thing on the forefront, and the one right now is these hyperscalers in the cloud. So you have databases like Snowflake and BigQuery, and Databricks' Delta Lake and SQL pushdown. And ultimately what that means is a lot of people are storing and loading data even faster, in a SaaS-like model. And we've started to hook into these databases. And while we've always worked with the same databases in the past, and they're supported, today we're doing something called native database pushdown, where the entire compute and data activity happens in the database. And why that is so interesting and powerful now is everyone's concerned with something called egress. Did my data, that I've spent all this time and money with my security team securing, ever leave my hands? Did it ever leave my secure VPC, as they call it? >>And with these native integrations that we're building and about to unveil, here's kind of a sneak peek for next week at Data Citizens: we're now doing all compute and data operations in databases like Snowflake.
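The pushdown pattern described here can be sketched generically: ship one aggregate query to the engine and pull back only a tiny summary, so no raw rows ever leave the database. This illustrative sketch uses SQLite as a stand-in for a warehouse like Snowflake or BigQuery; it is not Collibra's actual integration, and the table and column names are invented:

```python
import sqlite3

# Stand-in warehouse: in a real deployment this would be Snowflake,
# BigQuery, or Databricks; in-memory SQLite just shows the pattern.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
db.executemany("INSERT INTO trades VALUES (?, ?)",
               [("AAPL", 189.5), ("AAPL", None), ("MSFT", 402.1),
                ("MSFT", 398.7), ("GOOG", None)])

# Pushdown: profile the column with ONE aggregate query that runs
# inside the engine. Only the summary row crosses the wire -- no raw
# data egress, no rows pulled into the quality tool.
row = db.execute("""
    SELECT COUNT(*)                                 AS total_rows,
           SUM(CASE WHEN price IS NULL THEN 1 END)  AS null_prices,
           MIN(price), MAX(price)
    FROM trades
""").fetchone()

total, nulls, lo, hi = row
print(f"rows={total} null_rate={nulls / total:.0%} min={lo} max={hi}")
# rows=5 null_rate=40% min=189.5 max=402.1
```

The alternative, pulling all rows out to profile them in the tool, is what incurs the network, egress, and latency costs Kirk mentions.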
And what that means is, with no install and no configuration, you could log into the Collibra Data Quality app and have all of your data quality running inside the database that you've probably already picked as your go-forward, secured database of choice. So we're really excited about that. And I think if you look at the whole landscape of network cost, egress cost, data storage and compute, what people are realizing is it's extremely efficient to do it in the way that we're about to release here next week. >>So this is interesting, because what you just described... you know, you mentioned Snowflake, you mentioned Google, and actually you mentioned Databricks. You know, Snowflake has the data cloud. If you put everything in the data cloud, okay, you're cool. But then Google's got the open data cloud, if you watched Google Next. And now Databricks doesn't call it the data cloud, but they have, like, the open source data cloud. So you have all these different approaches, and there's really no way, up until now I'm hearing, to really understand the relationships between all those and have confidence across them. You know, it's like Zhamak Dehghani says: you should just be a node on the mesh. And I don't care if it's a data warehouse or a data lake or where it comes from, but it's a point on that mesh, and I need tooling to be able to have confidence that my data is governed and has the proper lineage and provenance. And that's what you're bringing to the table. Is that right? Did I get that right? >>Yeah, that's right. And for us, it's not that we haven't been working with those great cloud databases, but it's the fact that we can send them the instructions now. We can send them the operating ability to crunch all of the calculations, the governance, the quality, and get the answers. And what that's doing is basically zero network cost, zero egress cost, zero latency of time.
And so when you log into BigQuery tomorrow using our tool, or say Snowflake, for example, you have instant data quality metrics, instant profiling, instant lineage, and access privacy controls, things of that nature that just become less onerous. What we're seeing is there's so much technology out there, just like all of the major brands that you mentioned, but how do we make it easier? The future is about fewer clicks, faster time to value, faster scale, and eventually lower cost. And we think that this positions us to be the leader there. >>I love this example because, you know, Barry talks about, wow, the cloud guys are gonna own the world, and of course now we're seeing that the ecosystem is finding so much white space to add value and connect across clouds. Sometimes we call it supercloud, or interclouding. All right, Kirk, give us your final thoughts on the trends that we've talked about and Data Citizens 22. >>Absolutely. Well, I think, you know, one big trend is discovery and classification. We're seeing that across the board. People used to just know that a field was a zip code; nowadays, with the amount of data that's out there, they wanna know where everything is, where their sensitive data is, whether it's redundant. Tell me everything, inside of three to five seconds. And with that, they want to know, in all of these hyperscale databases, how fast they can get controls and insights out of their tools. So I think we're gonna see more one-click solutions, more SaaS-based solutions, and solutions that hopefully prove faster time to value on all of these modern cloud platforms. >>Excellent. All right, Kirk Hasselbeck, thanks so much for coming on the Cube and previewing Data Citizens 22. Appreciate it. >>Thanks for having me, Dave. >>You're welcome. Right, and thank you for watching. Keep it right there for more coverage from the Cube. Welcome to the Cube's virtual coverage of Data Citizens 2022.
My name is Dave Vellante, and I'm here with Laura Sellers, who's the Chief Product Officer at Collibra, the host of Data Citizens. Laura, welcome. Good to see you. >>Thank you. Nice to be here. >>Yeah, your keynote at Data Citizens this year focused on, you know, your mission to drive ease of use and scale. Now, when I think about it, historically, fast access to the right data at the right time, in a form that's really easily consumable, has been kind of challenging, especially for business users. Can you explain to our audience why this matters so much, and what's actually different today in the data ecosystem to make this a reality? >>Yeah, definitely. So I think what we really need, and what I hear from customers every single day, is that we need a new approach to data management. What inspired me to come to Collibra a little over a year ago was really the fact that they're very focused on bringing trusted data to more users, across more sources, for more use cases. And so, as we look at what we're announcing with these innovations of ease of use and scale, it's really about making teams more productive in getting started with, and being able to manage, data across the entire organization. So we've been very focused on richer experiences, a broader ecosystem of partners, as well as a platform that delivers the performance, scale, and security that our users and teams need and demand. So as we look at... oh, go ahead. >>I was gonna say, you know, when I look back at, like, the last 10 years, it was all about getting the technology to work, and it was just so complicated. But please carry on. I'd love to hear more about this. >>Yeah. You know, Collibra is a system of engagement for data, and we really are working on bringing that entire system of engagement to life for everyone to leverage, here and now. So what we're announcing, on the ease of use side of the world, is first our data marketplace.
This is the ability for all users to discover and access data quickly and easily, to shop for it, if you will. The next thing that we're also introducing is the new homepage. It's really about the ability to drive adoption and have users find data more quickly. And then the two more areas on the ease of use side of the world: one is our world of usage analytics. One of the big pushes and passions we have at Collibra is to help with this data-driven culture that all companies are trying to create, and also to help with data literacy. With something like usage analytics, it's really about driving adoption of the Collibra platform: understanding what's working, who's accessing it, what's not. And then finally, we're also introducing what's called workflow designer. We love our workflows at Collibra; it's a big differentiator to be able to automate business processes. The designer is really about a way for more people to be able to create those workflows and collaborate on those workflows, as well as for people to be able to easily interact with them. So a lot of exciting things when it comes to ease of use, to make it easier for all users to find data. >>Yes, there's definitely a lot to unpack there. You know, you mentioned this idea of shopping for the data. That's interesting to me. Why this analogy... metaphor or analogy, I always get those confused. Let's go with analogy. Why is it so important to data consumers? >>I think when you look at the world of data, and I talked about this system of engagement, it's really about making it more accessible to the masses. And what users are used to is a shopping experience, like your Amazon, if you will.
And so having a consumer-grade experience, where users can quickly go in and find the data, trust that data, understand where the data's coming from, and then be able to quickly access it, is the idea of being able to shop for it: just making it as simple as possible and really speeding the time to value for any of the business analysts and data analysts out there. >>Yeah, you see a lot of discussion about rethinking data architectures, putting data in the hands of the users and business people, decentralized data, and of course that's awesome. I love that. But of course then you have to have self-service infrastructure, and you have to have governance, and those are really challenging. And I think so many organizations are facing adoption challenges, you know, when it comes to enabling teams generally, and especially domain experts, to adopt new data technologies. You know, the tech comes fast and furious. You've got all these open source projects, and it gets really confusing. Of course that raises security and governance risks and all that good stuff. You've got all this jargon. So where do you see, you know, the friction in adopting new data technologies? What's your point of view, and how can organizations overcome these challenges? >>You're dead on. There's so much technology, and there's so much to stay on top of, which is part of the friction, right? It's just being able to stay ahead of, and understand, all the technologies that are coming. You also look at how there are so many more sources of data, and people are migrating data to the cloud and migrating to new sources. Where the friction comes is really in that ability to understand where the data came from, where it's moving to, and then also to be able to put the access controls on top of it, so people are only getting access to the data that they should be getting access to.
So one of the other things we're announcing, with all of the innovations that are coming, is what we're doing around performance and scale. With all of the data movement, with all of the data that's out there, the first thing we're launching in the world of performance and scale is our world of data quality. >>It's something that Collibra has been working on for the past year and a half, and we're launching the ability to have data quality in the cloud. So it's currently an on-premise offering, but we'll now be able to carry that over into the cloud for us to manage that way. We're also introducing the ability to push down data quality into Snowflake. So this is, again, one of those challenges: making sure that the data that you have is high quality as you move forward. And so, really, we're just reducing friction. You already have Snowflake stood up; it's not another machine for you to manage. It's just pushdown capabilities into Snowflake to be able to track that quality. Another thing that we're launching with that is what we call Collibra Protect. And this is the ability for users to ingest metadata, understand where the PII data is, and then set policies up on top of it. So you can very quickly set policies and have them enforced at the data level, so anybody in the organization is only getting access to the data they should have access to. >>Here's the topic of data quality. It's interesting; it's something that I've followed for a number of years. It used to be a back office function, you know, really confined only to highly regulated industries like financial services and healthcare and government. You know, if you look back over a decade ago, you didn't have this worry about personal information. GDPR and, you know, the California Consumer Privacy Act have all become so much more important.
The cloud has really changed things in terms of performance and scale, and of course, partnering with Snowflake, it's all about sharing data and monetization, anything but a back office function. So it was kind of smart that you guys were early on, and of course attracting them as an investor as well was very strong validation. What can you tell us about the nature of the relationship with Snowflake? I'm specifically interested in sort of joint engineering or product innovation efforts, you know, beyond the standard go-to-market stuff. >>Definitely. So you mentioned they became a strategic investor in Collibra about a year ago, a little less than that, I guess. We've been working with them, though, for over a year, really tightly with their product and engineering teams, to make sure that Collibra is adding real value. All pieces of our unified platform are touching all pieces of Snowflake. And when I say that, what I mean is, first, we're, you know, able to ingest data with Snowflake, which has always existed. We're able to profile and classify that data. We're announcing with Collibra Protect this week that you're now able to create those policies on top of Snowflake and have them enforced. So again, people can get more value out of their Snowflake more quickly, as far as time to value goes, with our policies for all business users to create. >>We're also announcing Snowflake Lineage 2.0. This is the ability to take stored procedures in Snowflake and understand the lineage of where the data came from and how it was transformed within Snowflake, as well as the data quality pushdown. As I mentioned, data quality, you brought it up: it is a big industry push, and, you know, one of the things I think Gartner mentioned is that people are losing up to $15 million by not having great data quality.
So this pushdown capability for Snowflake really is, again, a big ease of use push for us at Collibra: the ability to push it into Snowflake, take advantage of the data source and the engine that already lives there, and make sure you have the right quality. >>I mean, the nice thing about Snowflake is, if you play in the Snowflake sandbox, you can get sort of a, you know, high degree of confidence that the data sharing can be done in a safe way. Bringing Collibra into the story allows me to have that data quality and that governance that I need. You know, we've said many times on the Cube that one of the notable differences in cloud this decade versus last decade, I mean, obviously there are differences just in terms of scale and scope, but it's shaping up to be about the strength of the ecosystems. That's really a hallmark of these big cloud players. It's a key factor for innovating, accelerating product delivery, and filling gaps in the hyperscale offerings, because you've got more mature stack capabilities, and, you know, it creates this flywheel momentum, as we often say. So my question is, how do you work with the hyperscalers, whether it's AWS or Google, whomever? What do you see as your role, and what's the Collibra sweet spot? >>Yeah, definitely. So, you know, one of the things I mentioned early on is that the broader ecosystem of partners is what it's all about. And so we have that strong partnership with Snowflake. We also are doing more with Google around, you know, GCP and Collibra Protect there, but also tighter Dataplex integration. So, similar to what you've seen with our strategic moves around Snowflake, and really covering the broad ecosystem of what Collibra can do on top of that data source, we're extending that to the world of Google as well, and the world of Dataplex.
We also have great partners in SIs. Infosys is somebody we spoke with at the conference who's done a lot of great work with Levi's; they're really important to help people with their whole data strategy and driving that data-driven culture, with Collibra being the core of it. >>Laura, we're gonna end it there, but I wonder if you could kind of put a bow on this year, the event, your perspectives. Just give us your closing thoughts. >>Yeah, definitely. So I wanna say this is one of the biggest releases Collibra's ever had, definitely the biggest one since I've been with the company, a little over a year. We have all these great new product innovations coming to really drive the ease of use, to make data more valuable for users everywhere and companies everywhere. And so it's all about everybody being able to easily find, understand, trust, and get access to that data going forward. >>Well, congratulations on all the progress. It was great to have you on the Cube, first time I believe, and I really appreciate you taking the time with us. >>Yes, thank you for your time. >>You're very welcome. Okay, you're watching the coverage of Data Citizens 2022 on the Cube, your leader in enterprise and emerging tech coverage. >>So data modernization oftentimes means moving some of your storage and compute to the cloud, where you get the benefit of scale and security and so on. But ultimately it doesn't take away the silos that you have. We have more locations, more tools, and more processes with which we try to get value from this data. To do that at scale in an organization, the people involved in this process have to understand each other. So you need to unite those people across those tools, processes, and systems with a shared language. When I say customer, do you understand the same thing as when you're hearing customer?
Are we counting them in the same way? That shared language unites us, and it gives the organization as a whole the opportunity to get the maximum value out of their data assets. And then they can democratize data, so everyone can properly use that shared language to find, understand, and trust the data assets that are available. >>And that's where Collibra comes in. We provide a centralized system of engagement that works across all of those locations and combines all of those different user types across the whole business. At Collibra, we say we're united by data, and that also means that we're united by data with our customers. So here is some data about some of our customers. There was the case of an online do-it-yourself platform who grew their revenue almost three times from a marketing campaign that put the right product in the hands of the right people. Another case that comes to mind is a financial services organization who saved over 800K every year, because they were able to reuse the same data in different kinds of reports. Before, that was spread out over different tools and processes and silos, and now the platform brought them together, so they realized, oh, we're actually using the same data; let's find a way to make this more efficient. And the last example that comes to mind is that of a large home mortgage loan provider, where they have a very complex landscape, a very complex architecture, legacy, in the cloud, et cetera. And they're using our software, our platform, to unite all the people and those processes and tools to get a common view of data and to manage their compliance at scale. >>Hey everyone, I'm Lisa Martin, covering Data Citizens 22, brought to you by Collibra. This next conversation is gonna focus on the importance of data culture. One of our Cube alumni is back: Stan Christiaens, Collibra's co-founder and its Chief Data Citizen. Stan, it's great to have you back on the Cube.
>>Hey Lisa, nice to be back. >>So we're gonna be talking about the importance of data culture, data intelligence, maturity, all those great things. When we think about the data revolution that every business is going through, you know, it's so much more than technology innovation. It also really requires cultural transformation, community transformation. Those are challenging for customers to undertake. Talk to us about what you mean by data citizenship and the role that creating a data culture plays in that journey. >>Right. So, as you know, our event is called Data Citizens, because we believe that, in the end, a data citizen is anyone who uses data to do their job. And we believe that in today's organizations you have a lot of people, most of the employees in an organization, who are somehow gonna be a data citizen, right? So you need to make sure that these people are aware of it. You need to make sure that people have the skills and competencies to do with data what's necessary, and act on it, right? So what does it mean to have a good data culture? It means that if you're building a beautiful dashboard to try and convince your boss we need to make this decision, your boss is also open to, and able to interpret, you know, the data presented in the dashboard, to actually make that decision and take that action. Right? >>And once you have that working through the organization, that's when you have a good data culture. Now, that's a continuous effort for most organizations, because they're always moving; somehow they're always hiring new people. And it has to be a continuous effort, because we've seen that, on the one hand, organizations are continuously challenged with their data sources and where all the data is flowing, right? Which in itself creates a lot of risk. But on the other hand of the equation, you have the benefit. You know, you might look at regulatory drivers, like, we have to do this, right?
But it's much better right now to consider the competitive drivers, for example. And we did an IDC study earlier this year, quite interesting; I can recommend it to anyone. And one of the conclusions they found, as they surveyed over a thousand people across organizations worldwide, is about the ones who are higher in maturity. >>The organizations that really look at data as an asset, look at data as a product, and actively try to be better at it, have three times as good a business outcome as the ones who are lower on the maturity scale, right? So you can say, okay, I'm doing this, you know, data culture for everyone, waking them up as data citizens; I'm doing this for competitive reasons; I'm doing this for regulatory reasons. You're trying to bring both of those together, and the ones that get data intelligence right are successful and competitive. That's what we're seeing out there in the market. >>Absolutely. We know that just generally, Stan, right, the organizations that are really creating a data culture and enabling everybody within the organization to become data citizens are... we know that, in theory, they're more competitive, they're more successful. But the IDC study that you just mentioned demonstrates they're three times more successful and competitive than their peers. Talk about how Collibra advises customers to create that community, that culture of data, when it might be challenging for an organization to adapt culturally. >>Of course it's difficult for an organization to adapt, but it's also necessary, as you just said. Imagine that, you know, you're a modern day organization with laptops, what have you, and you're not using them, right? Or, you know, you're delivering them throughout the organization, but not enabling your colleagues to actually do something with that asset. The same thing is true with data today, right? If you're not properly using the data asset, and competitors are, they're gonna get more of an advantage.
So, as to how you get this done and establish this, there are a couple of angles to look at, Lisa. One angle is obviously the leadership, whereby whoever is the boss of data in the organization... you typically have multiple bosses there, like chief data officers. Sometimes there are multiple, and they may have a different title, right? So I'm just gonna summarize it as a data leader for a second. >>So whoever that is, they need to make sure that there's a clear vision, a clear strategy for data. And that strategy needs to include the monetization aspect: how are you going to get value from data? Now, that's one part, because then you can show leadership in the organization, and also the business value. And that's important, because those people, their job in essence really is to make everyone in the organization think about data as an asset. And I think that's the second part of the equation of getting it right: it's not enough to just have that leadership out there; you also have to get the hearts and minds of the data champions across the organization. You really have to win them over. And if you have those two combined, and obviously a good technology to, you know, connect those people and have them execute on their responsibilities, such as a data intelligence platform, then you're in place to really start upgrading that culture inch by inch, if you will. >>Yes, I like that. The recipe for success. So, you are the co-founder of Collibra. You've worn many different hats along this journey. Now you're building Collibra's own data office. I like how, before we went live, we were talking about how Collibra is drinking its own champagne. I always love to hear stories about that. You're speaking at Data Citizens 2022. Talk to us about how you are building a data culture within Collibra, and what maybe some of the specific projects are that Collibra's data office is working on. >>Yes, and it is indeed Data Citizens. There are a ton of speakers here; we are very excited.
You know, we have Barb from MIT speaking about data monetization. We have Dilla at the last minute. So a really exciting agenda; can't wait to get back out there, essentially. So over the years at Collibra, we've been doing this since two thousand eight, so a good number of years, and I think we have another decade of work ahead in the market, just to be very clear. Data is here to stick around, as are we. And myself, you know, when you start a company, we were four people, so everybody's wearing all sorts of hats at the time. But over the years I've run, you know, presales, sales, partnerships, product, et cetera. And as our company got a little bit bigger, we're now two thousand something people in the company. >> I believe systems and processes become a lot more important. So we said, you know, Collibra is now the size of our customers; we're getting there in terms of organization structure, process, systems, et cetera. So we said it's really time for us to put our money where our mouth is and build our own data office, which is what we were seeing at customers' organizations worldwide. These organizations have HR units, they have a finance unit, and over time they'll all have a data department, if you will, that is responsible somehow for the data. So we said, okay, let's try to set an example that other people can take away from, right? So we set up a data strategy, we started building data products, took care of the data infrastructure, that sort of good stuff. And in doing all of that, Lisa, exactly as you said, we said, okay, we need to also use our own product and our own practices, and from that use learn how we can make the product better, learn how we can make the practice better, and share that learning with everyone. On Monday mornings we sometimes refer to that as eating our own dog food; on Friday evenings we refer to that as drinking our own champagne. >> I like it. >> So we had the driver to do this. You know, there's a clear business reason.
So we included that in the data strategy, and that's a little bit of our origin. Now, how do we organize this? We have three pillars, and by no means is this a template that everyone should follow; this is just the organization that works at our company, but it can serve as an inspiration. So we have a pillar which is data science, the data product builders, if you will, or the people who help the business build data products. We have the data engineers who help keep the lights on for that data platform, to make sure that the data products can run, the data can flow and, you know, the quality can be checked. >> And then we have a data intelligence or data governance pillar, where we have those data governance, data intelligence stakeholders who help the business as a sort of data partner to the business stakeholders. So that's how we've organized it. And then we started following the Collibra approach, which is: well, what are the challenges that our business stakeholders have in HR, finance, sales, marketing, all over? And how can data help overcome those challenges? And from those use cases we then just started to build a roadmap and started executing on the use cases. And the important ones are very simple. We see them with our customers as well: people talking about the catalog, right? The catalog for the data scientists to know what's in their data lake, for example, and for the people in privacy, so they have their process registry and they can see how the data flows. >> So that's a starting place, and that turns into a marketplace, so that if new analysts and data citizens join Collibra, they immediately have a place to go to, to look at and see, okay, what data is out there for me as an analyst or a data scientist or whatever to do my job, right? So they can immediately get access to data. And another one that we have is around trusted business reporting.
We're seeing that since, you know, self-service BI allowed everyone to make beautiful dashboards, you know, pie charts. My pet peeve is the pie chart, because I love pie and you shouldn't always be using pie charts. But essentially there's been a proliferation of those reports. And now executives don't really know, okay, should I trust this report or that report? They're reporting on the same thing, but the numbers seem different, right? So that's why we have trusted business reporting. So we know that when a dashboard, a data product essentially, is built, all the right steps are being followed, and whoever is consuming it can be quite confident in the result. >> Right. >> Exactly. Yes. >> Absolutely. Talk a little bit about some of the key performance indicators that you're using to measure the success of the data office. What are some of those KPIs? >> KPIs and measuring is a big topic in the chief data officer profession, I would say, and again, it always varies with your organization, but there are a few that we use that might be of interest. We use those pillars, right? And we have metrics across those pillars. So for example, a pillar on the data engineering side is gonna be more related to uptime, right? Is the data platform up and running? Are the data products up and running? Is the quality in them good enough? Is it going up? Is it going down? What's the usage? But also, and especially if you're in the cloud and if consumption's a big thing, you have metrics around cost, for example, right? So that's one set of examples. Another one is around the data science and products. Are people using them? Are they getting value from it? Can we calculate that value, right? Yeah.
So that we can continue to say to the rest of the business: we're tracking all those numbers, those numbers indicate that value is generated, and this is roughly how much value we estimate. And then you have some data intelligence, data governance metrics, which is, for example: you have a number of domains in a data mesh. People talk about being the owner of a data domain, for example, like product or customer. So how many of those domains do you have covered? How many of them are already part of the program? How many of them have owners assigned? How well are those owners organized and executing on their responsibilities? How many tickets are open and closed? How many data products are built according to process? And so on and so forth. So these are a set of examples of KPIs. There are a lot more, but hopefully those can already inspire the audience. >> Absolutely. So we've talked about the rise of chief data offices; it's only accelerating. You mentioned this is like a 10-year journey. So if you were to look into a crystal ball, what do you see in terms of the maturation of data offices over the next decade? >> So we've seen indeed the role sort of grow up. I think in 2010 there may have been like 10 chief data officers or something. Gartner has exact numbers on them, but then they grew, you know, across industries, and the number is estimated to be about 20,000 right now. >> Wow. >> And they evolved in a sort of stack of competencies: defensive data strategy, because the first chief data officers were more regulatory driven; offensive data strategy; support for the digital program; and now it's all about data products, right? So as a data leader, you now need all of those competencies and need to include them in your strategy. >> How is that going to evolve over the next couple of years? >> I wish I had one of those crystal balls, right?
But essentially I think for the next couple of years there's gonna be a lot of people, you know, still moving along those four levels of the stack. A lot of people I see are still in version one and version two of the chief data officer role. So you'll see over the years that's gonna evolve to more digital and more data products. So for the next years, my prediction is it's all data products, because it's an immediate link between data and the business, essentially, right? So that's gonna be important, and quite likely some new things will be added on which nobody can predict yet. But we'll see those pop up in a few years. I think there's gonna be a continued challenge for the chief data officer role to become a real executive role, as opposed to, you know, somebody who claims that they're an executive but then they're not, right? >> So the real reporting level, into the board, into the CEO for example, will continue to be a challenging point. But the ones who do get that done will be the ones that are successful, and the ones who get that done will be the ones that do it on the basis of data monetization, right? Connecting value to the data and making that value clear to all the data citizens in the organization, right? And in that sense, they'll need to have both, you know, technical audiences and non-technical audiences aligned, of course. And they'll need to focus on adoption. Again, it's not enough to just have your data office be involved in this. It's really important that you're waking up data citizens across the organization and you make everyone in the organization think about data as an asset. >> Absolutely. Because there's so much value that can be extracted when organizations really strategically build that data office and democratize access across all those data citizens. Stan, this is an exciting arena. We're definitely gonna keep our eyes on this. Sounds like a lot of evolution and maturation coming, from the data office perspective and from the data citizen perspective.
And as the data shows in that IDC study you mentioned, and you mentioned Gartner as well, organizations are so much more likely to be successful and competitive. So we're gonna watch this space. Stan, thank you so much for joining me on theCUBE at Data Citizens '22. We appreciate it. >> Thanks for having me over. >> From Data Citizens '22, I'm Lisa Martin. You're watching theCUBE, the leader in live tech coverage. >> Okay, this concludes our coverage of Data Citizens 2022, brought to you by Collibra. Remember, all these videos are available on demand at thecube.net. And don't forget to check out siliconangle.com for all the news, and wikibon.com for our weekly Breaking Analysis series, where we cover many data topics and share survey research from our partner ETR, Enterprise Technology Research. If you want more information on the products announced at Data Citizens, go to collibra.com. There are tons of resources there. You'll find analyst reports, product demos. It's really worthwhile to check those out. Thanks for watching our program and digging into Data Citizens 2022 on theCUBE, your leader in enterprise and emerging tech coverage. We'll see you soon.

Published Date : Nov 2 2022
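The three KPI pillars Stan describes above (uptime and cost for data engineering, usage and value for data products, and domain coverage for governance) can be sketched as a simple rollup. This is a minimal illustration only; every metric name and number below is hypothetical, not Collibra's actual KPI set.

```python
# Hypothetical sketch of the three KPI pillars described in the interview.
# Metric names and numbers are invented for illustration.

PILLARS = {
    "data_engineering": {            # keep-the-lights-on metrics
        "platform_uptime_pct": 99.7,
        "monthly_cloud_cost_usd": 42_000,
    },
    "data_products": {               # adoption and value metrics
        "monthly_active_users": 310,
        "estimated_value_usd": 1_200_000,
    },
    "data_governance": {             # domain-coverage metrics
        "domains_total": 12,
        "domains_with_owner": 9,
    },
}

def governance_coverage(pillars):
    """Share of data domains that already have an assigned owner."""
    g = pillars["data_governance"]
    return g["domains_with_owner"] / g["domains_total"]

def summarize(pillars):
    """One-line rollup a data office might report back to the business."""
    cov = governance_coverage(pillars)
    up = pillars["data_engineering"]["platform_uptime_pct"]
    return f"uptime={up}% owner_coverage={cov:.0%}"

print(summarize(PILLARS))  # -> uptime=99.7% owner_coverage=75%
```

The same dictionary structure extends naturally to the ticket-flow and built-to-process metrics mentioned in the interview.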


Nick Barcet, Red Hat & Greg Forrest, Lockheed Martin | KubeCon + CloudNativeCon NA 2022


 

(lighthearted music) >> Hey all. Welcome back to theCUBE's coverage of KubeCon North America '22, CloudNativeCon. We're in Detroit. We've been here all day, covering day one of the event from our perspective. Three days of coverage coming at you. Lisa Martin here with John Furrier. John, a lot of buzz today. A lot of talk about the maturation of Kubernetes, with different services that vendors are offering. We talked a little bit about security earlier today. One of the things that is a hot topic is national security. >> Yeah, this is a huge segment we got coming up. It really takes all that nerd talk about Kubernetes and puts it into action. We actually see demonstrable results. This is about advanced artificial intelligence for tactical decision making at the edge to support our military operations, because a lot of the deaths are because of bad technology. And this has been talked about. We've been covering it on SiliconANGLE; we wrote a story there on this topic. This should be a really exciting segment, so I'm really looking forward to it. >> Excellent, so am I. Please welcome back one of our alumni, Nick Barcet, senior director, customer-led open innovation at Red Hat. Great to have you back. Greg Forrest joins us as well from Lockheed Martin, director of AI Foundations. Guys, great to have you on the program. Nick, what's been your perception before we dig into the news and break open KubeCon 2022? >> So, KubeCon is always a wonderful event, because we can see people working with us in the community developing new stuff, people that we see virtually all year. But it's the time at which we can really establish human contact, and that's wonderful. And it's also the moment where we can make big topics move forward, and the topics have been plenty at this KubeCon, from MicroShift to KCP to AI; all domains have been covered. >> Greg, you're the director of AI Foundations at Lockheed Martin.
Obviously a well-known contractor to the military, a lot of intellectual property, storied history. >> Greg: Sure. >> Talk about this announcement with Red Hat, 'cause I think this is really indicative of what's happening at the edge. Data, compute, industrial equipment, and people; in this case lives are in danger, or it's about preserving peace. This is a killer story in terms of understanding what this all means. What's your take on this relationship with Red Hat? What's the secret sauce? >> Yeah, it's really important for us. So part of our 21st century security strategy as a company is to partner with companies like Red Hat and Big Tech and bring the best of the commercial world into the Department of Defense for our soldiers on the ground. And that's exactly what we announced today, or Tuesday, in our partnership. And so the ability to take commercial products and utilize them in theater is really important for saving lives on the ground. And so we can go through exactly what we did as part of this demonstration, but we took MicroShift at the edge and we were able to run our AI payloads on that. That provided us with the ability to do things like AI-based RF sensing, so radio frequency sensing. And we were also able to do computer vision-based technologies at the edge. So we went out, we had a small UAV that went out and searched for a target on the ground. It found a target using its radio frequency capabilities, the RF capabilities. Then once we were able to hone in on that target, what Red Hat Device Edge and MicroShift enable us to do is actually switch sensing modalities. And then we're able to look at this target via the camera and use computer vision-based technologies to actually more accurately locate the target and then track that target in real time. So that's one of the keys: being able to actually switch modalities in real time on one platform is really important for our joint all-domain operations construct.
The idea of how you actually connect all of these assets in the environment, in the battle space. >> Talk about the challenge and how hard it is to do this. The backhaul, you'll go back to the central server, bring data back, connecting things. What if there's insecurity around connectivity? I mean, there's a lot going on; can you just scope the magnitude of how hard it is to actually deploy something at a tactical edge? >> It is. There's a lot of data that comes from all of these sensors, whether they're RF sensors or EO or IR. We're working across multiple domains, right? And so we want to take that data back, train on it, and then redeploy to the edge. And so with MicroShift, we're able to do that in a way that's robust, that's repeatable, and that's automated. And that really instills trust in us and our customers that when we deploy new software capabilities to the edge over the air, like we did in this demonstration, they're going to run right on the target hardware. And so that's a huge advantage to what we're doing here: when we push software to the edge in real time, we know it's going to run. >> And in real time is absolutely critical. We talk about it in so many different industries. Oh, customers expect real-time access, whether it's your banking app or whatnot. But here we're talking about literally life-and-death situations on the battlefield. So that real-time data access is literally life and death. >> It's paramount to what we're doing. In this case, the aircraft started with one role, which was to go find a radio frequency emitter, and then switch roles to then go get cameras and eyes on that. So where is that coming from? Are there people on the ground? Are there dangerous people on the ground? And it gives the end user on the ground complete situational awareness of what is actually happening. And that is key for enhanced decision making. Enhanced decision making is critical to what we're doing.
And so that's really where we're advancing this technology and where we can save lives. >> I read a report from General Mattis, when he was in service, that a lot of the deaths are due to not having enough information right at the edge. >> Greg: Friendly fire. >> Friendly fire, a lot of stuff that goes on there. So this is really, really important. Nick, you're sitting there saying this is great. My customer's talking about the product. This is your innovation, Red Hat Device Edge, in action. This is real. This is industrial- >> So it's more than real. Actually, this type of use case is what convinced us to transform a technology we had been working on, a small form factor of Kubernetes, into a product. Because sometimes, us engineers have a tendency to invent stuff that's great on paper, but it's a solution trying to find a problem. And we need customers to work with us to make sure that the solution does solve a real problem. And Lockheed was great. They worked with us upstream on that project, helped us prove out that the concept was actually worth it, and we waited until Lockheed had tested the concept in the air. >> Okay, so Red Hat Device Edge and MicroShift: explain that, how that works, real quick for the folks that don't know. >> So one of the things we learned is that Kubernetes is great, but it's only part of the journey. In order to get those workloads on those aircraft, or in order to get those workloads in a factory, you also need to consider the full life cycle of the device itself. And you don't handle a device that is inside of a UAV, or inside of a factory, the same way you handle a server. You have to deal with those devices in a way that is much more akin to a set-top box. So we had to modify how the OS was behaving to deal with devices, and we took what we had built in RHEL for Edge, combined it with MicroShift, and that's what became Red Hat Device Edge.
>> We're in a low-SWaP environment: space, weight and power, right? Very limited. We're on a small UAS in this demonstration. So the ability to spool up and spool down containers, to save computing power, and to do that on demand and orchestrate it with MicroShift is paramount to what we're doing. We wouldn't be able to do it without that capability. >> John: That's awesome. >> I want to get both of your opinions. Nick, we'll start with you, and then Greg, we'll go to you. In terms of MicroShift, what is its superpower? What differentiates it from other competing solutions in the market? >> So MicroShift is Kubernetes, but reduced to the strict minimum of a runtime version of Kubernetes, so that it takes a minimal footprint and we maximize the space available for the workload in those very constrained environments. On a board where you have eight or 16 gig of RAM, if you use only two gig of that to run the infrastructure component, you leave the rest for the AI workload that you need on the drone. And that's what is really important. >> And these AI payloads, the inference that we're doing at the edge, are very compute intensive. So again, the ability to manage and orchestrate that is paramount to running on these very small board computers. These are small drones that don't allow a lot of weight, that don't allow a lot of space. >> John: Got to be efficient. >> And be efficient with it. >> How were you guys involved? Talk about the relationship. So you guys were tightly involved. Talk about the roles you guys played together. Was it co-development? Was it customer/partner? Talk about the relationship. >> Yeah, so we started actually with satellites. So you can think of small CubeSats, a very similar environment to a low-powered UAV. And it started there. And then in the last, I would say, year or so, Nick, we have worked together to develop MicroShift. We work closely on Slack channels together, like we're part of the same team. >> John: That's great.
>> And, hey Red Hat, this is what we need, this is what we're looking for. These are the constraints that we have. And this team has been amazing and just delivered on everything that we've asked for. >> I mean, this is really an example of innovation at the edge, the industrial edge specifically. You've got an operating system, you've got form factor challenges, you've got operating parameters. And you've got to have that flexibility; you can't just take this and put it over there. >> But that's what it really is: a community applied to an industrial context. So what happened there is we worked as part of the MicroShift community together, with a real-time communication channel, the same Slack that anybody developing Kubernetes uses. We've been using it to identify where the problems were, how to solve them, and to bring new ideas, and that's how we tackle these problems. >> Yeah, a true open source model. I mean, the Red Hat and the Lockheed teams were in it together on a daily basis, communicating like we were part of the same company. And that's really how you move these things forward. >> Yeah, and of course open source is great, but you've also got to lock down the security. How did you guys handle that? What's going on with the security? 'Cause you've got to make sure nobody takes over the devices. >> So the funny thing is that even though what we produce is highly inclusive of security concerns, our development model is completely open. So it's not security by obscurity; it's security because we apply the best practices. >> John: You see everything. >> Absolutely. >> Yes. >> And then you harden it in the joint development, there it is. >> Yeah, but what we support, what we offer as a product, is the same for Lockheed or for any other customer, because there is no domain where security is not important.
When you control the recognition on a drone, or when you control the behavior of a robot in a factory, security is paramount, because you can immobilize a country by infecting a robot the same way you could immobilize a military operation- >> Greg: That's right. >> By infecting a UAV. >> Not to change the subject, but I've got to go on a tangent here 'cause it pops into my head. You mentioned CubeSats; not related to theCUBE, of course, we're theCUBE for the video. CubeSats are very powerful. People can launch into space right now very inexpensively. So it's a highly contested and congested environment. Any space activity going on around the corner with you guys? 'Cause remember, the edge is now in space. Mars is the edge. >> That's right. >> Our first prototype for MicroShift was actually a CubeSat. >> Greg: That's where it started. >> An IBM project, a project called Endurance. That's the first time we actually put MicroShift into use. And that was a very interesting project, a very early version of MicroShift. And now we have talks with many other people on reproducing that at a more industrial level; this was more like a cool high school project. >> But to your point, the scalability across different platforms is there. If we're running on top of MicroShift on this common OS, it just eases the development. Behind the scenes, we have a whole AI factory at Lockheed Martin where we have a common ecosystem for how we actually develop and deploy these algorithms to the edge. And now we've got a common ecosystem at the edge. And so it helps that whole process, being able to do that in automated ways, repeatable ways, so we can instill trust in our DoD customer; the validation and verification of this is a really important aspect. >> John: Must be a fun place to work. >> It is, it's exciting. There's endless opportunities. >> You must get a lot of young kids applying for those jobs. They're barely into the field.
I mean, AI's a hot field and people want to get their hands on real applications. I was serious about space. Is there space activity going on with you guys, or is it just now military edge, not yet military space? Or is that classified? >> Yeah, so we're working across multiple fronts, absolutely. >> That's awesome. >> What excite, oh, sorry John. What excites you most, never a dull moment with what you're doing, but just the potential to enable a safer, a more secure world, what excites you most about this partnership and the direction and the, we'll say, the trajectory it's going on? >> Yeah, I think, for me, the safer and more secure world is paramount to what we're doing. We're here for national defense and for our allies, and that's really critical to what we're doing. That's what motivates me. That's what gets me up in the morning, to know that there is a soldier on the ground who will be using this technology and we will be giving that person the situational awareness to make the right decisions at the right time. So we can go from small UAVs to larger aircraft, or we can do it in a small confined edge device like a Stalker UAV. We can scale this up to different products, different platforms, and they don't even have to be Lockheed Martin. >> John: And more devices that are going to be imagined. >> More devices that we haven't even imagined yet. >> Right, that aren't even on the frontier yet. Nick, what's next from your perspective? >> In the domain we are in, next is always plenty of things. Sustainability is a huge domain right now on which we're working. We have lots of things going on in the AI space, stuff going on with Lockheed Martin. We have things going on in the radio network domain. We've been very heavily involved in telecommunication, and this is constantly evolving. There is not one domain, in terms of infrastructure, that Red Hat is not touching. >> Well, this is the first of multiple demonstrations.
The scenarios will get more complex, with multiple aircraft, and in the future we're also looking at bringing in a lot of the 5G work. Lockheed has put a large focus on 5G.mil for military applications, and running some of those workloads on top of MicroShift as well is something to come in the future that we are already planning and looking at. >> Yeah, and it's needed in theater to have connectivity. Got to have your own connectivity. >> It's paramount, absolutely. >> Absolutely, it's paramount. It's game-changing. Guys, thank you so much for joining John and me on theCUBE talking about how Red Hat and Lockheed Martin are working together to leverage AI to really improve decision making and save more lives. It was a wonderful conversation. We're going to have to have you back 'cause we got to follow this. >> Yeah, of course. >> This was great, thank you so much. >> Thank you very much for having us. >> Lisa: Our pleasure, thank you. >> Greg: Really appreciate it. >> Excellent. For our guests and John Furrier, I'm Lisa Martin. You're watching theCUBE live from KubeCon CloudNativeCon '22 from Detroit. Stick around. Next guest is going to join John and Savannah in just a minute. (lighthearted music)

Published Date : Oct 27 2022


Kirk Haslbeck, Collibra | Data Citizens '22


 

(bright upbeat music) >> Welcome to theCUBE's coverage of Data Citizens 2022, Collibra's customer event. My name is Dave Vellante. With us is Kirk Haslbeck, who's the Vice President of Data Quality at Collibra. Kirk, good to see you. Welcome. >> Thanks for having me, Dave. Excited to be here. >> You bet. Okay, we're going to discuss data quality, observability. It's a hot trend right now. You founded a data quality company, OwlDQ, and it was acquired by Collibra last year. Congratulations! And now you lead data quality at Collibra. So we're hearing a lot about data quality right now. Why is it such a priority? Take us through your thoughts on that. >> Yeah, absolutely. It's definitely exciting times for data quality, which, you're right, has been around for a long time. So why now, and why is it so much more exciting than it used to be? I think it's a bit stale, but we all know that companies use more data than ever before, and the variety has changed and the volume has grown. And while I think that remains true, there are a couple other hidden factors at play that everyone's so interested in as to why this is becoming so important now. And I guess you could kind of break this down simply and think about if, Dave, you and I were going to build, you know, a new healthcare application and monitor the heartbeat of individuals, imagine if we get that wrong, what the ramifications could be? What those incidents would look like? Or maybe better yet, we try to build a new trading algorithm with a crossover strategy where the 50 day crosses the 10 day average. And imagine if the data underlying the inputs to that is incorrect. We'll probably have major financial ramifications in that sense. So, it kind of starts there, where everybody's realizing that we're all data companies, and if we are using bad data, we're likely making incorrect business decisions. But I think there's kind of two other things at play.
I bought a car not too long ago and my dad called and said, "How many cylinders does it have?" And I realized in that moment, I might have failed him, because I didn't know. And I used to ask those types of questions about anti-lock brakes and cylinders and if it's manual or automatic, and I realized I now just buy a car that I hope works. And it's so complicated with all the computer chips, I really don't know that much about it. And that's what's happening with data. We're just loading so much of it. And it's so complex that the way companies consume it in the IT function is that they bring in a lot of data and then they syndicate it out to the business. And it turns out that the individuals loading and consuming all of this data for the company actually may not know that much about the data itself, and that's not even their job anymore. So, we'll talk more about that in a minute, but that's really what's setting the foreground for this observability play, and why everybody's so interested; it's because we're becoming less close to the intricacies of the data, and we just expect it to always be there and be correct. >> You know, the other thing too about data quality, and for years we did the MIT CDOIQ event, we didn't do it last year, COVID messed everything up. But the observation I would make there, and I'd love your thoughts, is that data quality, it used to be information quality, used to be this back office function, and then it became sort of front office with financial services and government and healthcare, these highly regulated industries. And then the whole chief data officer thing happened, and people were realizing, well, they sort of flipped the bit from data as a risk to data as an asset. And now, as we say, we're going to talk about observability. And so it's really become front and center, just the whole quality issue, because data's fundamental, hasn't it? >> Yeah, absolutely.
I mean, let's imagine we pull up our phones right now and I go to my favorite stock ticker app and I check out the NASDAQ market cap. I really have no idea if that's the correct number. I know it's a number, it looks large, it's in a numeric field. And that's kind of what's going on. There are so many numbers, and they're coming from all of these different sources and data providers, and they're getting consumed and passed along. But there isn't really a way to tactically put controls on every number and metric across every field we plan to monitor. But with the scale that we've achieved in early days, even before Collibra, what's been so exciting is we have these types of observation techniques, these data monitors that can actually track past performance of every field at scale. And why that's so interesting, and why I think the CDO is listening intently nowadays to this topic, is maybe we could surface all of these problems with the right solution of data observability and with the right scale, and then just be alerted on breaking trends. So we're sort of shifting away from this world of must write a condition, and then when that condition breaks, that was always known as a break record. But what about breaking trends and root cause analysis? And is it possible to do that with less human intervention? And so I think most people are seeing now that it's going to have to be a software tool and a computer system. It's not ever going to be based on one or two domain experts anymore. >> So, how does data observability relate to data quality? Are they sort of two sides of the same coin? Are they cousins? What's your perspective on that? >> Yeah, it's super interesting. It's an emerging market, so the language is changing a lot, and the topics and areas are changing. The way that I like to say it, or break it down, because the lingo is constantly a moving target in this space, is really breaking records versus breaking trends.
And I could write a condition: when this thing happens it's wrong, and when it doesn't, it's correct. Or I could look for a trend, and I'll give you a good example. Everybody's talking about fresh data and stale data, and why would that matter? Well, if your data never arrived, or only part of it arrived, or it didn't arrive on time, it's likely stale, and there will not be a condition that you could write that would show you all the good and the bad. That was kind of your traditional approach of data quality break records. But your modern day approach is: you lost a significant portion of your data, or it did not arrive on time to make that decision accurately on time. And that's a hidden concern. Some people call this freshness, we call it stale data, but it all points to the same idea: the thing that you're observing may not be a data quality condition anymore. It may be a breakdown in the data pipeline. And with thousands of data pipelines in play for every company out there, there's more than a couple of these happening every day. >> So what's the Collibra angle on all this stuff? You made the acquisition, you've got data quality and observability coming together, you guys have a lot of expertise in this area, but you hear provenance of data, you just talked about stale data, the whole trend toward real time. How is Collibra approaching the problem, and what's unique about your approach?
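The breaking-records-versus-breaking-trends distinction drawn above can be sketched in a few lines. This is a minimal illustration of the idea, not Collibra's implementation; the z-score threshold and the fixed history of daily row counts are assumptions for the example:

```python
import statistics

def check_record(value):
    """Classic 'break record': a fixed, hand-written condition
    that flags an individual bad value."""
    return value >= 0  # breaks when a negative value appears

def check_trend(history, latest, z_threshold=3.0):
    """'Breaking trend': flag the latest batch when it deviates
    sharply from the historical distribution of, say, row counts."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero
    return abs((latest - mean) / stdev) > z_threshold

row_counts = [1000, 1020, 980, 1010, 995]   # past daily loads
assert check_record(5) and not check_record(-1)
assert not check_trend(row_counts, 1005)    # normal arrival volume
assert check_trend(row_counts, 400)         # likely stale/partial load
```

A condition like `check_record` only catches values it was written for; the trend check notices that a load silently lost most of its rows, which is the stale-data case described above, with no hand-written condition for it.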
But we've also always covered data quality, and we believe that people want to know more, they need more insights, and they want to see break records and breaking trends together so they can correlate the root cause. And we hear that all the time: I have so many things going wrong, just show me the big picture. Help me find the thing that, if I were to fix it today, would make the most impact. So we're really focused on root cause analysis, business impact, connecting it with lineage and catalog, metadata. And as that grows, you can actually achieve total data governance. At this point, with the acquisition of what was a lineage company years ago, and then my company OwlDQ, now Collibra Data Quality, Collibra may be the best positioned for total data governance and intelligence in the space. >> Well, you mentioned financial services a couple of times and some examples; remember the flash crash in 2010. Nobody had any idea what that was, they just said, "Oh, it's a glitch." So they didn't understand the root cause of it. So this is a really interesting topic to me. So we know at Data Citizens '22 that you're announcing, you got to announce new products, right? Your yearly event, what's new? Give us a sense as to what products are coming out, but specifically around data quality and observability. >> Absolutely. There's always a next thing on the forefront. And the one right now is these hyperscalers in the cloud. So you have databases like Snowflake, BigQuery, and Databricks Delta Lake, and SQL pushdown. And ultimately what that means is a lot of people are storing and loading data even faster in a SaaS-like model. And we've started to hook into these databases. And while we've always worked with these same databases in the past, and they're supported today, we're now doing something called native database pushdown, where the entire compute and data activity happens in the database. And why that is so interesting and powerful now is everyone's concerned with something called egress.
Did my data that I've spent all this time and money with my security team securing ever leave my hands? Did it ever leave my secure VPC, as they call it? And with these native integrations that we're building, and about to unveil here as kind of a sneak peek for next week at Data Citizens, we're now doing all compute and data operations in databases like Snowflake. And what that means is, with no install and no configuration, you could log into the Collibra Data Quality app and have all of your data quality running inside the database that you've probably already picked as your go-forward, secure database of choice. So we're really excited about that. And I think if you look at the whole landscape of network cost, egress cost, data storage and compute, what people are realizing is it's extremely efficient to do it in the way that we're about to release here next week. >> So this is interesting, because what you just described, you mentioned Snowflake, you mentioned Google, oh actually you mentioned, yeah, Databricks. Snowflake has the data cloud. If you put everything in the data cloud, okay, you're cool, but then Google's got the open data cloud, if you heard Google Next. And now Databricks doesn't call it the data cloud, but they have like the open source data cloud. So you have all these different approaches, and there's really no way, up until now I'm hearing, to really understand the relationships between all those and have confidence across them. It's like, (indistinct) you should just be a node on the mesh. And I don't care if it's a data warehouse or a data lake or where it comes from, but it's a point on that mesh, and I need tooling to be able to have confidence that my data is governed and has the proper lineage, provenance. And that's what you're bringing to the table. Is that right? Did I get that right? >> Yeah, that's right.
And for us, it's not that we haven't been working with those great cloud databases, but it's the fact that we can send them the instructions now; we can send them the operating ability to crunch all of the calculations, the governance, the quality, and get the answers. And what that's doing, it's basically zero network cost, zero egress cost, zero latency of time. And so when you log into BigQuery tomorrow using our tool, or let's say Snowflake, for example, you have instant data quality metrics, instant profiling, instant lineage, and access privacy controls, things of that nature that just become less onerous. What we're seeing is there's so much technology out there, just like all of the major brands that you mentioned, but how do we make it easier? The future is about less clicks, faster time to value, faster scale, and eventually lower cost. And we think that this positions us to be the leader there. >> I love this example, because everyone talks about how the cloud guys are going to own the world, and of course now we're seeing that the ecosystem is finding so much white space to add value, connect across clouds. Sometimes we call it supercloud, or interclouding. Alright, Kirk, give us your final thoughts on the trends that we've talked about and Data Citizens '22. >> Absolutely. Well I think one big trend is discovery and classification. Seeing that across the board, people used to know it was a zip code, and nowadays, with the amount of data that's out there, they want to know where everything is, where their sensitive data is. If it's redundant, tell me, everything inside of three to five seconds. And with that comes, they want to know, in all of these hyperscale databases, how fast they can get controls and insights out of their tools. So I think we're going to see more one click solutions, more SaaS-based solutions, and solutions that hopefully prove faster time to value on all of these modern cloud platforms. >> Excellent, all right.
Kirk Haslbeck, thanks so much for coming on theCUBE and previewing Data Citizens '22. Appreciate it. >> Thanks for having me, Dave. >> You're welcome. All right, and thank you for watching. Keep it right there for more coverage from theCUBE.
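The native pushdown approach described in the interview, where profiling runs inside the warehouse so data never leaves it, can be illustrated by generating a single aggregate query instead of pulling rows out over the network. A hedged sketch only; the table and column names are hypothetical, and a real engine would add identifier quoting and type handling:

```python
def pushdown_profile_sql(table, columns):
    """Build one aggregate query so profiling executes in the
    database itself (no egress): row count, plus non-null and
    distinct counts per column."""
    parts = ["COUNT(*) AS row_count"]
    for col in columns:
        parts.append(f"COUNT({col}) AS {col}_non_null")
        parts.append(f"COUNT(DISTINCT {col}) AS {col}_distinct")
    return f"SELECT {', '.join(parts)} FROM {table}"

# Only this small result row ever crosses the network, never the data.
sql = pushdown_profile_sql("orders", ["customer_id", "amount"])
```

The design choice is the point made about egress above: shipping the instructions to the data, rather than the data to the instructions, keeps network cost and security exposure near zero.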

Published Date : Oct 24 2022


Amit Eyal Govrin, Kubiya.ai | Cube Conversation


 

(upbeat music) >> Hello everyone, welcome to this special Cube Conversation here in Palo Alto, California. I'm John Furrier, host of theCUBE in theCUBE Studios. We've got a special video here. We love when we have startups that are launching. It's an exclusive video of a hot startup that's launching. Got great reviews so far. You know, word on the street is, they got something different and unique. We're going to dig into it. Amit Govrin, who's the CEO and co-founder of Kubiya, which stands for cube in Hebrew, and they're headquartered in the Bay Area and in Tel Aviv. Amit, congratulations on the startup launch, and thanks for coming in and talking to us on theCUBE. >> Thank you, John, very nice to be here. >> So, first of all, a little, 'cause we love the Cube, 'cause theCUBE's kind of an open brand. We've never seen the Cube in Hebrew, so is that true? Kubiya is? >> Kubiya literally means cube. You know, clearly there's some additional meanings that we can discuss. Obviously we're also launching at KubeCon, so there's a dual meaning to this event. >> KubeCon, not to be confused with CubeCon. Which is an event we might have someday and compete. No, I'm only kidding, good stuff. I want to get into the startup because I'm intrigued by your story. One, you know, conversational AI's been around, been a category. We've seen chat bots be all the rage, and you know, I kind of don't mind chat bots on some sites. I can interact with some, you know, form based knowledge graph, whatever, knowledge database, and get basic stuff self served. So I can see that, but it never really scaled or took off. And now with Cloud Native kind of going to the next level, we're starting to see a lot more open source and a lot more automation, in what I call AI as code or, you know, AI as a service, machine learning, developer focused action. I think you guys might have an answer there.
So if you don't mind, could you take a minute to explain what you guys are doing, what's different about Kubiya, what's happening? >> Certainly. So thank you for that. Kubiya is what we would consider the first, or one of the first, advanced virtual assistants with a domain specific expertise in DevOps. So, we respect all of the DevOps concepts, GitOps, workflow automation, all of those categories you've mentioned, but also the added value of the conversational AI. That's really one of the few elements that we can really bring to the table to extract what we call intent based operations. And we can get into what that means in a little bit. I'll save that maybe for the next question. >> So the market you're going after is kind of, it's, I love to hear it from startups when they don't have a Gartner Magic Quadrant they can fit nicely into; it means they're onto something. What is the market you're going after? Because you're seeing a lot of developers driving a lot of the key successes in DevOps. DevOps has evolved to the point where, and DevSecOps, where developers are driving the change. And so having something that's developer focused is key. Are you guys targeting the developers, IT buyers, cloud architects? Who are you looking to serve with this new opportunity? >> So essentially self-service in the world of DevOps. The end user typically would be a developer, but not only, and obviously the operators; those are the folks that we're actually looking to help, to augment a lot of their efforts, a lot of the toil that they're experiencing in the day to day. So there's subcategories within that. We can talk about the different internal developer tools, or platforms, shared services platforms; service catalogs are tangential categories that this kind of comes on. But on top of that, we're adding the element of conversational AI. Which, as I mentioned, that's really the "got you". >> I think you're starting to see a lot of autonomous stuff going on, autonomous pen testing.
There's a company out there doing that; I've seen autonomous AI. Automation is a big theme of it. And I got to ask, are you guys on the business side purely in the cloud? Are you born in the cloud, is it a cloud service? What's the product choice there? It's a service, right? >> Software as a service. We have the classic multi-tenancy SaaS, but we also have a hybrid SaaS solution, which allows our customers to run workflows using remote runners, essentially hosted at their own location. >> So primarily cloud, but you're agnostic on where they could consume, how they want to consume the product. >> Technology agnostic. >> Okay, so that's cool. So let's get into the problem you're solving. So take me through, this will drive a lot of value here: when you guys did the company, what problems did you hone in on, and what are you guys seeing as the core problem that you solve? >> So we, this is a unique, I don't know how unique, but this is an interesting proposition, because I come from the business side, so call it the top down. I've been in enterprise sales, I've been in a CRO, VP sales hat. My co-founder comes from the bottom up, right? He ran DevOps teams and SRE teams in his previous company. That's actually what he did. So, we met each other halfway, essentially, with me seeing a lot of these problems of self-service not being so self-service after all, platforms hitting walls with adoption. And he actually created his own self-service platform, within his last company, to address his own personal pains. So we essentially kind of met with both perspectives. >> So you're absolutely hardcore on self-service. >> We're enabling self-service. >> And that basically is what everybody wants. I mean, the developers want self-service. I mean, that's kind of like, you know, that's the nirvana. So take us through what you guys are offering, give us an example of use cases and who's buying your product, why, and take us through that whole piece.
>> Do you mind if I take a step back and say why we believe self-service has somewhat failed, or not gotten off the ground? >> Yeah, absolutely. >> So look, this is essentially how we're looking at it. All the analysts and the industry insiders are talking about self-service platforms as being what's going to remove the dependency of the operator in the loop the entire time, right? Because the operator, that scarce resource, it's hard to hire, hard to train, hard to retain those folks. Developers are obviously dependent on them for productivity. So the operators in this case could be a DevOps, could be a SecOps, it could be a platform engineer. It comes in different flavors. But the common denominator: somebody needs an access request, provisioning a new environment, you name it, right? They go to somebody, and that person is the operator. The operator typically has a few things on their plate. It's not just attending and babysitting platforms, but it's also innovating, spinning up, and scaling services. So they see this typically as kind of, we don't really want to be here, we're going to go and do this because we're on call. We have to take it on the chin, if you may, for this. >> It's their child, they got to do it.
So the question, you're trying to make it easier then, you're trying to free up the talent. >> Talent to operate and have essentially a human, like in the loop, essentially augment that person and give the end users all of the answers they require, as if they're talking to a person. >> I mean it's basically, you're taking the virtual assistant concept, or chat bot, to a level of expertise where there's intelligence, jargon, experience into the workflows that's known. Not just talking to chat bot, get a support number to rebook a hotel room. >> We're converting operational workflows into conversations. >> Give me an example, take me through an example. >> Sure, let's take a simple example. I mean, not everyone provisions EC2's with two days (indistinct). But let's say you want to go and provision new EC2 instances, okay? If you wanted to do it, you could go and talk to the assistant and say, "I want to spin up a new server". If it was a human in the loop, they would ask you the following questions: what type of environment? what are we attributing this to? what type of instance? security groups, machine images, you name it. So, these are the questions that typically somebody needs to be armed with before they can go and provision themselves, serve themselves. Now the problem is users don't always have these questions. So imagine the following scenario. Somebody comes in, they're in Jira ticket queue, they finally, their turn is up and the next question they don't have the answer to. So now they have to go and tap on a friend, or they have to go essentially and get that answer. By the time they get back, they lost their turn in queue. And then that happens again. So, they lose a context, they lose essentially the momentum. And a simple access request, or a simple provision request, can easily become a couple days of ping pong back and forth. This won't happen with the virtual assistant. 
>> You know, I think, you know, and you mentioned chat bots, but also RPA is out there, you've seen a lot of that growth. One of the hard things, and you brought this up, I want to get your reaction to, is contextualizing the workflow. It might not be apparent, but the answer might be there, it disrupts the entire experience at that point. RPA and chat bots don't have that contextualization. Is that what you guys do differently? Is that the unique flavor here? Is that difference between current chat bots and RPA? >> The way we see it, I alluded to the intent based operations. Let me give a tangible experience. Even not from our own world, this will be easy. It's a bidirectional feedback loop 'cause that's actually what feeds the context and the intent. We all know Waze, right, in the world of navigation. They didn't bring navigation systems to the world. What they did is they took the concept of navigation systems that are typically satellite guided and said it's not just enough to drive down the 280, which typically have no traffic, right, and to come across traffic and say, oh, why didn't my satellite pick that up? So they said, have the end users, the end nodes, feed that direction back, that feedback, right. There has to be a bidirectional feedback loop that the end nodes help educate the system, make the system be better, more customized. And that's essentially what we're allowing the end users. So the maintenance of the system isn't entirely in the hands of the operators, right? 'Cause that's the part that they dread. And the maintenance of the system is democratized across all the users that they can teach the system, give input to the system, hone in the system in order to make it more of the DNA of the organization. >> You and I were talking before you came on this camera interview, you said playfully that the Siri for DevOps, which kind of implies, hey infrastructure, do something for me. You know, we all know Siri, so we get that. 
So that kind of illustrates where the direction is. Explain why you say that, what does that mean? Is that like a NorthStar vision that you guys are approaching? You want to have a state where everything's automated and conversational, deployments, that kind of thing. Take us through why you say Siri for DevOps. >> I think it helps anchor people to what a virtual assistant is. Because when you hear virtual assistant, that can mean any one of various connotations. So Siri is actually a conversational assistant, but it's not necessarily a virtual assistant. So what we're saying is we're anchoring people to that thought and saying, we're actually allowing it to be operational, turning complex operations into simple conversations. >> I mean basically you take the voice-automated Google search or query, what's the score of the game? And talking to the guy who invented Siri, who I actually interviewed on theCUBE, it's a learning system. It actually learns as it gets more usage, it learns. How do you guys see that evolving in DevOps? There's a lot of jargon in DevOps, a lot of configurations, a lot of different use cases, a lot of new technologies. What's the secret sauce behind what you guys do? Is it the conversational AI, is it the machine learning, is it the data, is it the model? Take us through the secret sauce. >> In fact, it's all of the above. And I don't think we're bringing any one element to the table that hasn't been explored before, hasn't been done. It's a recipe, right? You give two people the same ingredients, they can have completely different results in terms of what they come out with. We, because of our domain expertise in DevOps, because of our familiarity with developer workflows, with operators, we know how to give a very well suited recipe. A five course meal, hopefully with Michelin stars as part of that. 
So a few things, maybe a few of the secret sauce elements: conversational AI, the ability to essentially go and extract the intent of the user, so that if we're missing context, the system is smart enough to go and get that feedback and to essentially feed itself into that model. >> Someone might say, hey, you know, conversational AI, that was yesterday's trend, it never happened. It was kind of weak, chat bots were lame. What's different now with you guys, and the market, that makes this a redo, or a second shot at this, a second bite at the apple, as they say? What do you guys see? 'Cause you know, I would argue that it's, you know, it's still early, real early. >> Certainly. >> How do you guys view that? How would you handle that objection? >> It's a fair question. I wasn't around the first time around to tell you what didn't work. I'm not afraid to share that the feedback that we're getting is phenomenal. People understand that we're actually customizing the workflows, the intent-based operations, to really help hone in on the dark spots. We call it last mile, you know, bottlenecks. And that's really where we're helping. We're helping, in a way, operationalize tribal knowledge that typically hasn't been documented, because it's painful enough to where people care about it, but not painful enough to where you're going to go and sit down for an entire day and document it. And that's essentially what the virtual assistant can do. It can go and get into those crevices and help document and operationalize all of those toils into workflows. >> Yeah, I mean some will call it grunt work, or low level work. And I think the automation is interesting. I think we're seeing this in a lot of these high scale situations where the talented, hard to hire person is hired to do, say, things that were hard to do, but now harder things are coming around the corner. So, you know, serverless is great and all this is good, but it doesn't make the complexity go away. 
As these inflection points continue to drive more scale, the complexity kind of grows, but at the same time so does the ability to abstract away the complexity. So you're starting to see the smart hired guns move to higher, bigger problems. And the automation seems to take the low level kind of capabilities, or the toil, or the grunt work, or the low level tasks that, you know, you don't want a high salaried person doing. Or, I mean, it's not so much that they don't want to do it, they'll take one for the team, as you said, or take it on the chin, but there's other things to work on. >> I want to add one more thing, 'cause this goes into essentially what you just said. Think about it, what the virtual assistant gives you is not just the intent, that's one element of it, it's the ability to carry your operations with you to the place where you're not breaking your workflows, where you're actually comfortable operating. So the virtual assistant lives inside of a command line interface, it lives inside of chat like Slack, and Teams, and Mattermost, and so forth. It also lives within a low-code editor. So we're not forcing anyone to use uncomfortable language or operations if they're not comfortable with it. It's almost like Siri, it travels on your mobile phone, it's on your laptop, it's with you everywhere. >> It makes total sense. And the reason why I like this, and I want to get your reaction on this, because we've done a lot of interviews with DevOps, we've met at every KubeCon since it started, and Kubernetes kind of highlights the value of the containers at the orchestration level. But what's really going on is the DevOps developers, and the CICD pipeline, with infrastructure as code, they basically have infrastructure configuration at their disposal all the time. And all the ops challenges have been around that, the repetitive, mundane tasks that most people do. There's like six or seven main use cases in DevOps. So the guardrails just need to be set. 
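The "assistant travels with you" idea above is essentially one command handler behind many surfaces (CLI, Slack, Teams, low-code editor). A rough sketch of that shape, with illustrative class names that are not Kubiya's real interfaces:

```python
# Sketch of the surface-agnostic assistant described above: one handler,
# many frontends. Class and method names are assumptions for illustration.

class Assistant:
    def handle(self, text: str) -> str:
        if text.startswith("spin up"):
            return "Okay, what environment should I use?"
        return "I didn't catch that. Try 'spin up <something>'."

class CLIFrontend:
    def __init__(self, assistant):
        self.assistant = assistant
    def run(self, line: str) -> str:
        return self.assistant.handle(line)

class SlackFrontend:
    def __init__(self, assistant):
        self.assistant = assistant
    def on_message(self, event: dict) -> str:
        # strip the @mention a chat surface would add; same brain underneath
        return self.assistant.handle(event["text"].removeprefix("@assistant ").strip())

bot = Assistant()
print(CLIFrontend(bot).run("spin up a new server"))
print(SlackFrontend(bot).on_message({"text": "@assistant spin up a new server"}))
```

Both surfaces return the same follow-up question, which is the point: the user stays in whatever tool they are comfortable in while the operational logic lives in one place.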
So it sounds like you guys are going down the road of saying, hey, here's the use cases, you can bounce around these use cases all day long, and just keep doing your jobs, 'cause they're bolting on infrastructure to every application. >> There's one more element to this that we haven't really touched on. It's not just workflows and use cases, but it's also knowledge, right? Tribal knowledge. Like, you asked me for an example. You can type or talk to the assistant and ask, "How much am I spending on AWS, on US East 1, on so and so customer environment, last week?", and it will know how to give you that information. >> Can I ask, should I buy reserved instances or not? Can I ask that question? 'Cause there's always good trade offs between buying the reserved instances. I mean, that's kind of the thing that. >> This is where our ecosystem actually comes in handy, because we're not necessarily going to go down every single domain and try to be the experts here. We can tap into the partnerships, API, we have full extensibility in the API and the software development kit that goes with it. >> It's interesting, opinionated and declarative are buzzwords in developer language. So you started to get into this editorial thing. So I can bring up an example. Hey cube, implement the best service mesh. What answer does it give you? 'Cause there's different choices. >> Well, this is actually where the operator, there's clearly guardrails. Like, you can go and say, I want to spin up a machine, and it will give you all of the machines on AWS. Doesn't mean you have to get the X1 that's good for an SAP environment. You could go and have guardrails in place where only the ones that are relevant to your team, ones that fit your resource and budgetary, you know, guidelines, can be shown. So, the operator still has all the control. >> It was kind of tongue in cheek around the editorializing, but actually the answer seems to be, as you're saying, whatever the customer decided their service mesh is. 
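The cost question quoted above ("How much am I spending on AWS, on US East 1, last week?") implies turning natural language into a structured query a backend can answer. A toy illustration of that step, assuming a naive regex-based parser rather than anything the product actually ships:

```python
# A toy illustration (not the product's parser) of turning the natural-language
# cost question quoted above into a structured query a backend could answer.

import re

def parse_cost_question(text: str) -> dict:
    query = {"metric": "spend"}
    if m := re.search(r"on (AWS|GCP|Azure)", text, re.I):
        query["provider"] = m.group(1).upper()
    if m := re.search(r"on (us[- ]east[- ]\d)", text, re.I):
        query["region"] = m.group(1).lower().replace(" ", "-")
    if "last week" in text.lower():
        query["window"] = "last_7_days"
    return query

q = parse_cost_question("How much am I spending on AWS, on us-east-1, last week?")
print(q)
```

A real system would extract intent and entities with a trained model instead of regexes, but the input/output contract is the same: free text in, a structured query out.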
So I think this is where it gets into being an assistant to architecting and operating, that seems to be the real value. >> Now code snippets is a different story, because that goes out to the web, that goes onto Stack Overflow, and that's actually one of the things. So inside the CLI, you could actually go and ask for code snippets and we could actually go and populate that, it's a smart CLI. So that's actually one of the things that's an added value of that. >> I was saying to a friend, and we were talking about open source and how, when I grew up, there was no open source. If you're a developer now, I mean, there's so much code, it's not so much coding anymore as it is connecting and integrating. >> Certainly. >> And writing glue layers, if you will. I mean, there's still code, but it's not, you don't have to build it from scratch. There's so much code out there. This low-code notion of a smart system is interesting, 'cause it's very Matrix-like. It can build its own code. >> Yes, but I'm also a little wary of low-code and no code. I think part of the problem is we're so constantly focused on categories and categorizing ourselves, and different categories take on a life of their own. So low-code, no code is not necessarily, even though we have the low-code editor, we're not necessarily considering ourselves low-code. >> Serverless, no code, low-code. I was thrown by a term the other day, architecture-less. As a joke, no, we don't need architecture. >> There's a use case around that, by the way. Yeah, we do. Show me my AWS architecture, and it will build the architecture diagram for you. >> Again, serverless, architecture-less, this is all part of infrastructure as code. At the end of the day, the developer has infrastructure as code. Again, how they deploy it is the new norm. That's what we've been striving for. >> But infrastructure as code, you can destroy, you know, with Terraform, you can go and create one. It's not necessarily going to operate it for you. 
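The "smart CLI" snippet idea above, where you ask for a snippet instead of leaving the terminal for Stack Overflow, can be sketched as a lookup from request to stored command. The snippet table and matching here are illustrative assumptions, not the actual product:

```python
# Sketch of the smart-CLI snippet lookup mentioned above: instead of leaving
# the terminal, the CLI maps a request to a stored snippet. The snippet table
# and naive matching are illustrative, not the real implementation.

SNIPPETS = {
    "list s3 buckets": "aws s3 ls",
    "tail pod logs": "kubectl logs -f <pod-name>",
}

def ask_snippet(question: str) -> str:
    # naive substring match; a real system would rank by intent, not exact text
    for key, snippet in SNIPPETS.items():
        if key in question.lower():
            return snippet
    return "No snippet found, try rephrasing."

print(ask_snippet("How do I list S3 buckets?"))
```

The interesting design question is where the table comes from: the interview's answer is that it can be populated from the web and from the organization's own captured tribal knowledge.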
That's kind of where this comes in, on top of that. So it's really complementary to infrastructure. >> So, final question before we get into the origination story. Data and security are two hot areas we're seeing fill the IT gap, that has moved into the developer role. IT is essentially provisioned by developers now, but the ops side shifted to large scale SRE-like environments. Security and data are critical. What's your opinion on those two things? >> I agree. Do you want me to give you the normal "data has gravity" answer? >> So you agree that IT has now kind of moved into the developer realm, but the new IT is data ops and security ops, basically. >> A hundred percent, and the lines are so blurred. Like, who's what in today's world? I mean, I can tell you, I have customers who call themselves five different roles in the same day. So, you know, at the end of the day I call 'em operators, 'cause I don't want to offend anybody, because that's just the way it is. >> Architecture-less, we're going to come back to that. Well, I know we're going to see you at KubeCon. >> Yes. >> We should catch up there and talk more. I'm looking forward to seeing how you guys get the feedback from the marketplace. It should be interesting to hear. The curious question I have for you is, what was the origination story? Why did you guys come together, was it a shared problem? Was it a big market opportunity? Was it an itch you guys were scratching? Did you feel like you needed to come together and start this company? What was the real vision behind the origination? Take a minute to explain the story. >> No, absolutely. So I've been living in Palo Alto for the last couple years. Previously, also a founder. So, you know, from my perspective, I always saw myself getting back in the game. I spent a few years at AWS, essentially managing partnerships for tier one DevOps partners, you know, all of the known players. Some of them public, some of them not. And really the itch was there, right. 
I saw what everyone's doing. I started seeing consistency in the pains that I was hearing back, in terms of what hasn't been solved. So I already had an opinion on where I wanted to go. And when I was visiting Israel with the family, I was introduced by a mutual friend to Shaked, Shaked Askayo, my co-founder and CTO. Amazing guy, an unbelievable technologist, probably one of the most, you know, impressive folks I've had a chance to work with. And he actually solved a very similar problem, you know, in his own way, in a previous company, BlueVine, a FinTech company where he was head of SRE, having to, essentially, oversee 200 developers with a very small team. The ratio was incongruent with what the SRE guidelines would suggest. >> That's more than a 10x developer ratio. >> Oh, absolutely. Sure enough. And just imagine, it's four different time zones. He finishes the day shift and you already have the US team coming in, asking questions. He said, this is kind of a, >> Got to clone himself, basically. >> Well, yes. He essentially said to me, I had no day, I had no life, but I had Corona, I had COVID, which meant I could work from home. And I essentially programmed myself in the form of a bot. Essentially, when people came to him, he said, "Don't talk to me, talk to the bot". Now that was a different generation. >> Just a trivial example, but the idea was to automate the same queries all the time. There's an answer for that, go here. And that's the benefit of it. >> Yes, so he was able to see how easy it was to solve, I mean, how effective it was, solving 70% of the toil in his organization. He scaled his team, froze the headcount, and the developer team kept on going. So that meant he was doing something right. >> When you have a problem and you need to solve it, the creativity comes out of the woodwork; you know, necessity is the mother of invention. So, final question for you, what's next? You've got the launch. What do you guys hope to do over the next six months to a year? Hiring? 
Put a plug in for the company. What are you guys looking to do? Take a minute to share the future vision and get a plug in. >> A hundred percent. So, Kubiya, as you can imagine, is announcing ourselves at KubeCon in a couple weeks, opening the gates towards the public beta, and then GA in the next couple months. We're essentially working with dozens of customers, Aston Martin and (indistinct). We have quite a few, our website's full of quotes, you can go ahead. But effectively we're looking to go and bring in the next generation of operators, who value their time, who value, essentially, the tribal knowledge that travels between organizations and could be shared. >> How many customers do you guys have in your pre-launch? >> It's above a dozen. Without saying exactly, because we're actually looking to onboard 10 more next week. So that's just an understatement. It changes from day to day. >> What's the number one thing people are saying about you? >> You got that right. I know, I'm trying to be a little bit more, you know. >> It's okay, you can be cocky, startups are good. But I mean, obviously they're using the product and you're getting good feedback. Saving time? Are they saying this is a dream product? Got it right? What are some of the things? >> I think anybody who doesn't feel the pain won't know, but the folks who are in the trenches, or feeling the pain, or experiencing this toil, who know what this means, they said, "You're doing this different, you're doing this right. You architected it right. You know exactly what the developer workflows are," you know, where all the areas are, you know, where all the skeletons are hidden within that, and you're attending to that. So we're happy about that. >> Everybody wants to clone themselves, again, the tribal knowledge. I think this is a great example of where we see the world going. Make things autonomous, operationally automated, for the use cases you know are rock solid. 
Why wouldn't you just deploy? >> Exactly, and we have a very generous free tier. People can, you know, there's a plugin, you can sign up for free until the end of the year, and there's a free forever tier as well. So we're looking for people to try us out and to give us feedback. >> I think the self-service, I think the point is, we've talked about it on theCUBE at our events, everyone says the same thing. Every developer wants self-service, period. Full stop, done. >> What they don't say is they need somebody to help babysit them to make sure they're doing it right. >> The old dashboard, green, yellow, red. >> I know it's an analogy that's not related, but have you been to Whole Foods? Have you gone through their self-service line? That's the beauty of it, right? Having someone in the loop helping you out the whole time. You don't get confused, and if something's not working, someone's helping you out. That's what people want. They want a human in the loop, or a human-like in the loop. We're giving that next best thing. >> It's really the ratio, it's scale. It's a force multiplier, for sure. Amit, thanks for coming on, congratulations. >> Thank you so much. >> See you at KubeCon. Thanks for coming in, sharing the story. >> KubiyaCon. >> CubeCon. Cube in Hebrew, Kubiya. The founder, co-founder and CEO here, sharing the story and the launch. Conversational AI for DevOps, the Siri of DevOps, really kind of changing the game, bringing efficiency, solving a lot of the pain points of large scale infrastructure. This is theCUBE, a CUBE Conversation. I'm John Furrier, thanks for watching. (upbeat electronic music)

Published Date : Oct 18 2022



Horizon3.ai Signal | Horizon3.ai Partner Program Expands Internationally


 

Hello, I'm John Furrier with theCUBE, and welcome to this special presentation of theCUBE and Horizon3.ai. They're announcing a global partner-first approach, expanding their successful pen testing product, NodeZero. You're going to hear from leading experts on their staff and their CEO, positioning themselves for a successful channel distribution expansion internationally, in Europe, Middle East, Africa and Asia Pacific. In this CUBE special presentation you'll hear about the expansion of the expanded partner program, giving partners a unique opportunity to offer NodeZero to their customers. Innovation in pen testing is going international with Horizon3.ai. Enjoy the program. [Music] Welcome back, everyone, to theCUBE and Horizon3.ai special presentation. I'm John Furrier, host of theCUBE. We're here with Jennifer Lee, head of channel sales at Horizon3.ai. Jennifer, welcome to theCUBE, thanks for coming on. >> Great, well, thank you for having me. >> So, big news around Horizon3.ai driving a channel-first commitment. You guys are expanding the channel partner program to include all kinds of new rewards, incentives, training programs, helping educate, you know, partners, really drive more recurring revenue. Certainly cloud and cloud scale has done that. You've got a great product that fits into that kind of channel model, great services you can wrap around it, good stuff. So let's get into it. What are you guys doing with this news? Why is this so important? >> Yeah, for sure. So, like you said, we recently expanded our channel partner program. The driving force behind it was really just to align our, like you said, our channel-first commitment, and creating awareness around the importance of our partner ecosystems. So that's really how we go to market, is through the channel. >> And a great international focus. I've talked with the CEO, so, you know, about the solution, and he broke down all the action on why it's important on the product side, but why now on the go-to-market 
change? What's the why behind this big news on the channel? >> Yeah, for sure. So we are doing this now really to align our business strategy, which is built on the concept of enabling our partners to create a high value, high margin business on top of our platform. And so we offer a solution called NodeZero. It provides autonomous pen testing as a service, and it allows organizations to continuously verify their security posture. So, our company vision, we have this tagline that states that our pen testing enables organizations to see themselves through the eyes of an attacker, and we use the attacker's perspective to identify exploitable weaknesses and vulnerabilities. So we created this partner program from the perspective of the partner, and we've built it through the eyes of our partner, right? So we're prioritizing really what the partner is looking for, and that will ensure mutual success for us. >> Yeah, the partners always want to get in front of the customers and bring new stuff to them. Pen tests have traditionally been really expensive, and so bringing it down to a service level that's, one, affordable, and has flexibility to it, allows a lot of capability, so I imagine people are getting excited by it. So I have to ask you about the program. What specifically are you guys doing? Can you share any details around what it means for the partners, what they get, what's in it for them? Can you just break down some of the mechanics and mechanisms, or details? >> Yeah, yep. You know, we're really looking to create business alignment and, like I said, establish mutual success with our partners. So we've got two key elements that we're really focused on that we bring to the partners. The opportunity, the profit margin expansion, is one of them, and a way for our partners to really differentiate themselves and stay relevant in the market. So we've restructured our discount model, really, you know, highlighting 
profitability and maximizing profitability, and this includes our deal registration. We've created a deal registration program, we've increased discounts for partners who take part in our partner certification trainings, and we have some other partner incentives that we've created that are going to help out there. We've put this all, so we've recently gone live with our partner portal. It's a consolidated experience for our partners where they can access our sales tools, and we really view our partners as an extension of our sales and technical teams. And so we've extended all of our training material that we use internally, and we've made it available to our partners through our partner portal. I'm trying, I'm thinking now back, what else is in that partner portal? We've got our partner certification information, so all the content that's delivered during that training can be found in the portal. We've got deal registration, co-branded marketing materials, pipeline management, and so this portal gives our partners a one-stop place to go to find all that information. And then, just really quickly, on the second part that I mentioned: our technology really is disruptive to the market. So, you know, like you said, autonomous pen testing, it's still a relatively new topic for security practitioners, and it's proven to be really disruptive. So, on top of that, just recently we found an article that mentioned, by Markets and Markets, that reports that the global pen testing market is really expanding, and it's expected to grow to like 2.7 billion by 2027. 
So the market's there, right? The market's expanding, it's growing, and so for our partners it really allows them to grow their revenue across their customer base, expand their customer base, and offer this high profit margin while, you know, getting in early to market on this disruptive technology. >> Big market, a lot of opportunities to make some money. People love to put more margin on those deals, especially when you can bring a great solution that everyone knows is hard to do, so I think that's going to provide a lot of value. Is there a type of partner that you guys see emerging, or that you're aligning with? You mentioned the alignment with the partners, I can see how the training and the incentives are all there, sounds like it's all going well. Is there a type of partner that's resonating the most, or are there categories of partners that can take advantage of this? >> Yeah, absolutely. So we work with all different kinds of partners. We work with our traditional resale partners, we're working with systems integrators, we have a really strong MSP/MSSP program, we've got consulting partners, and the consulting partners, especially the ones that offer pen test services, they use us as, we act as, a force multiplier, just really offering them profit margin expansion opportunity there. We've got some technology partners that we really work with for co-sell opportunities, and then we've got our cloud partners. You'd mentioned that earlier, and so we are in AWS Marketplace, our CPPO partners, we're part of the ISV Accelerate program, so we're doing a lot there with our cloud partners, and of course we go to market with distribution partners as well. >> Gotta love the opportunity for more margin expansion, every kind of partner wants to put more gross profit on their deals. Is there a certification involved? I have to ask, do people get certified, or is it just you get trained? Is it self-paced 
training, or is it in person? How are you guys doing the whole training certification thing? Because is that a requirement? >> Yeah, absolutely. So we do offer a certification program, and it's been very popular. This includes a seller's portion and an operator portion, and this is at no cost to our partners. And we operate both virtually, it's virtual but live, it's not self-paced, and we also have in person, you know, sessions as well. And we can also customize these for any partners that have a large group of people, and we can do one in person or virtual just specifically for that partner. >> Well, any kind of incentive opportunities and marketing opportunities? Everyone loves to get the deals just kind of rolling in, leads. From what we can see with our early reporting, this looks like a hot product, price wise, service level wise. What incentives are you guys thinking about, and joint marketing? You mentioned co-sell earlier, and pipeline, so I was kind of honing in on that piece. >> Sure, and yes. So, to follow along with our partner certification program, we do incentivize our partners there: if they have a certain number certified, their discount increases. So that's part of it. We have our deal registration program that increases discount as well, and then we do have some partner incentives that are wrapped around meeting setting and moving opportunities along to proof of value. >> Gotta love the education driving value. I have to ask you, so you've been around the industry, you've seen the channel relationships out there, you're seeing companies old school, new school. You know, Horizon3.ai is kind of like that new school, very cloud specific, a lot of leverage with, we mentioned, AWS and all the clouds. Why is the company so hot right now? Why did you join them? Why are people attracted to this company? What's the attraction, what's the vibe? What do you see, and what 
did you see in this company? >> Well, this is just, you know, like I said, it's very disruptive, it's really in high demand right now, and just because it's new to market and a newer technology. So we can collaborate with a manual pen tester, we can allow our customers to run their pen tests with no specialty teams, and, like I said, we can allow our partners to actually build profitable businesses. They can use our product to increase their services revenue and build their business model, you know, around our services. >> What's interesting about the pen test thing is that it's very expensive and time consuming, and the people who do them are very talented people that could be working on really bigger things for the customers. So bringing this into the channel allows them, if you look at the price delta between a pen test and what you guys are offering, I mean, that's a huge margin gap between the street price of, say, today's pen test and what you guys offer. When you show people that, do they say too good to be true? I mean, what are some of the things that people say when you kind of show them that? Are they, like, scratching their heads, like, come on, what's the catch here? >> Right, so the cost savings is huge for us, and then also, you know, like I said, working as a force multiplier with a pen testing company that offers the services. So they can do their annual manual pen tests that may be required around compliance regulations, and then we can act as the continuous verification of their security, you know, that they can run weekly. And so it's just, you know, it's just an addition to what they're offering already, and an expansion. >> So Jennifer, thanks for coming on theCUBE, really appreciate you coming on and sharing the insights on the channel. What's next? What can we expect from the 
channel group? What are you thinking? What's going on? >> Right, so we're really looking to expand our channel footprint, very strategically, and we've got some big plans for Horizon3.ai. >> Awesome. Well, thanks for coming on. Really appreciate it. You're watching theCUBE, the leader in high tech enterprise coverage. [Music] [Music] >> Hello, and welcome to theCUBE's special presentation with Horizon3.ai, with Rainer Richter, vice president of EMEA — Europe, Middle East, and Africa — and Asia Pacific, APAC, for Horizon3. Today, welcome to this special CUBE presentation. Thanks for joining us. >> Thank you for the invitation. >> So, Horizon3.ai driving global expansion, big international news, with a partner-first approach. You guys are expanding internationally. Let's get into it. You guys are driving this new expanded partner program to new heights. Tell us about it. What are you seeing in the momentum? Why the expansion? What's all the news about? >> Well, I would say internationally we have a similar situation to the US. There is a global shortage of well-educated penetration testers on the one hand; on the other side, we have a rising demand for network and infrastructure security. And with our approach of autonomous penetration testing, I believe we are totally on top of the game, especially as we are also now starting with an international instance. That means, for example, if a customer in Europe is using our service NodeZero, he will be connected to a NodeZero instance which is located inside the European Union, and therefore he doesn't have to worry about the conflict between the European GDPR regulations and the US CLOUD Act. And I would say there we have a really good package for our partners, so that they can provide differentiators to their customers. >> You know, we've had great conversations here on theCUBE with the CEO and the founder of the company around the leverage of the cloud and how successful that's
been for the company. And honestly, I can just connect the dots here, but I'd like you to weigh in more on how that translates into the go-to-market here, because you've got great cloud scale with the security product you guys are having success with. Great leverage there; I've seen a lot of success there. What's the momentum on the channel partner program internationally? Why is it so important to you? Is it just the regional segmentation? Is it the economics? Why the momentum? >> Well, there are multiple issues. First of all, there is a rising demand for penetration testing. And don't forget that internationally we have a much higher number — a higher percentage — of SMB and mid-market customers. Most of these customers typically didn't even have a pen test done once a year, so for them, pen testing was just too expensive. Now, with our offering, together with our partners, we can provide different ways customers can get an autonomous pen test done more than once a year, with even lower costs than they had with a traditional manual pen test. And that is because we have our Consulting Plus package, which is typically for pen testers: they can go out and do a much faster, much quicker pen test at many customers, one after the other, so they can do more pen tests at a lower, more attractive price. On the other side, there are others — even the same ones — who are providing NodeZero as an MSSP service, so they can go after SMB customers, saying, "Okay, you only have a couple of hundred IP addresses? No worries, we have the perfect package for you." And then you have, let's say, the mid-market — the companies with thousands of employees and more — and they might even have an annual subscription, very traditional. But for all of them it's all the same: the customer or the service provider doesn't need a piece of hardware. They only need to install a small Docker container, and that's it. And that makes it so smooth to
go in and say, "Okay, Mr. Customer, we just put this virtual attacker into your network, and that's it." All the rest is done, and within three clicks they can act like a pen tester with 20 years of experience. >> And that's going to be very channel friendly and partner friendly, I can almost imagine. So I have to ask you — and thank you for calling out that breakdown and segmentation, that was very helpful for me to understand — but I want to follow up, if you don't mind: what type of partners are you seeing the most traction with, and why? >> Well, I would say at the beginning you typically have the innovators, the early adopters, typically boutique-sized partners. They start because they are always looking for innovation, so those are the ones who start in the beginning. We have a wide range of partners, mostly managed by the owner of the company, so they immediately understand: okay, there is the value, and they can change their offering. They're changing their offering in terms of penetration testing, because they can do more pen tests, and then they can add other ones. Or we have those who offered pen test services but did not have their own pen testers, so they had to go out on the open market and source pen testing experts to get the pen test at a particular customer done. Now, with NodeZero, they're totally independent. They can go out and say, "Okay, Mr. Customer, here's the service. That's it. We turn it on, and within an hour you're up and running totally." >> Yeah, and those pen tests are usually expensive and hard to do. Now it's right in line with the sales delivery. Pretty interesting for a partner. >> Absolutely. But on the other hand, we are not killing the pen testers' business. With NodeZero, we're providing something like the foundational work: having an ongoing penetration test of the infrastructure, the operating
system. And the pen testers themselves can concentrate in the future on things like application pen testing, for example — those services which we are not touching. So we're not killing the pen tester market; we're just taking away the ongoing, let's say, foundational work, call it that way. >> Yeah, that was one of my questions. I was going to ask: there's a lot of interest in this autonomous pen testing, one, because it's expensive to do, and because those skills that are required are in need, and they're expensive. So you kind of cover the entry level and the blockers that are in there. I've seen people say to me that the pen test becomes a blocker for getting things done. So there's been a lot of interest in autonomous pen testing, and for organizations to have that posture. And it's an ongoing issue too, because now you have that continuous thing. So can you explain that particular benefit — for an organization to have something continuously verifying its posture? >> Yep, certainly. So I would say, typically, you have to do your patches, you have to bring in new versions of operating systems, of different services, of components, and they are always bringing new vulnerabilities. The difference here is that with NodeZero, we are telling the customer or the partner which are the executable vulnerabilities. Because previously they might have had a vulnerability scanner, and this vulnerability scanner brought up hundreds or even thousands of CVEs, but didn't say anything about which of them are really executable. And then you need an expert digging into one CVE after the other, finding out: is it really executable, yes or no? And that is where you need highly paid experts, of which we have a shortage. So with NodeZero, now we can say, "Okay, we tell you exactly which ones you should work on, because those are the ones which are executable." We rank them according to the risk level and how easily they can be exploited.
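The prioritization Rainer describes — separating the CVEs a scanner reports from the ones that are actually executable, then ranking those by risk and ease of exploitation — can be sketched roughly as follows. This is a minimal, illustrative stand-in, not NodeZero's actual logic; the field names, placeholder CVE IDs, scores, and weighting are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    cve: str          # placeholder ID, purely illustrative
    severity: float   # scanner-reported score, 0-10
    executable: bool  # did an attack run actually exploit it?
    ease: float       # 0 (hard) .. 1 (trivial) to exploit

def prioritize(findings):
    """Keep only findings proven executable, ranked by risk.

    A scanner may report thousands of CVEs; the point made in the
    interview is that only the executable ones deserve expert time.
    """
    actionable = [f for f in findings if f.executable]
    # Illustrative ranking: blend severity with ease of exploitation.
    return sorted(actionable, key=lambda f: f.severity * f.ease, reverse=True)

scan = [
    Finding("CVE-X1", severity=9.8, executable=True,  ease=0.9),
    Finding("CVE-X2", severity=9.9, executable=False, ease=0.7),  # not exploitable here
    Finding("CVE-X3", severity=6.5, executable=True,  ease=0.4),
]
worklist = prioritize(scan)  # only CVE-X1 and CVE-X3 survive, X1 first
```

The key move is the `executable` filter: the highest-scoring scanner finding (CVE-X2) drops out entirely because no attack path actually used it.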
And the good thing is, in contrast to the traditional penetration test, they don't have to wait a year for the next pen test to find out if the fixing was effective. They just rerun the scan and say, "Yes, closed, the vulnerability is gone." >> The time is really valuable, and if you're doing any DevOps, cloud native, you're always pushing new things, so ongoing pen testing is actually a benefit just in general, as a kind of hygiene. Really interesting solution — and bringing that global scale is going to be a new coverage area for us, for sure. I have to ask you, if you don't mind answering: what particular region are you focused on, or plan to target, for this next phase of growth? >> Well, at this moment we are concentrating on the countries inside the European Union, plus the United Kingdom. And of course, logically, I'm based in the Frankfurt area, which means we cover more or less the countries just around: the DACH region — Germany, Switzerland, Austria — plus the Netherlands. But we also already have partners in the Nordics, like in Finland and in Sweden, and we have partners already in the UK, so it's rapidly growing. For example, we are now starting some activities in Singapore, and also in the Middle East area, which is very important. Depending on, let's say, the way business is done, currently we try to concentrate on those countries where we can have at least English as an accepted business language. >> Great. Is there any particular region you're having the most success with right now? It sounds like the European Union is kind of the first wave. What's next? >> Yes, that's definitely the first wave, and now we're also getting the European instance up and running. It's clearly our commitment to the market, saying, "Okay, we know there are certain dedicated requirements, and we take care of this,"
and we're just launching it. We're building up this instance in the AWS service center here in Frankfurt, also with some dedicated hardware, in a data center in Frankfurt where we have — with DE-CIX, by the way — the highest internet interconnection bandwidth on the planet. So we have very short latency to wherever you are on the globe. >> That's a great call out, and a benefit too. I was going to ask that. What are some of the benefits your partners are seeing in EMEA and Asia Pacific? >> Well, I would say the benefit for them is clearly that they can talk with customers and offer them penetration testing which those customers didn't even think about before, because penetration testing in a traditional way was simply too expensive for them, too complex, the preparation time was too long, and they didn't even have the capacity to support an external pen tester. Now, with this service, you can go in and say, "Mr. Customer, we can do a test with you in a couple of minutes." Within ten minutes of installing the Docker container, we have the pen test started. That's it, and then we just wait. And I would say we are seeing so many aha moments now, because on the partner side, when they see NodeZero working for the first time, it's like, "Wow, that is great." And then they take it out to their customers — at the beginning, typically, mostly the friendly customers — who are like, "Wow, that's great, I need that." And I would say the feedback from the partners is that this is a service where I do not have to evangelize the customer. Everybody understands penetration testing; I don't have to describe what it is. The customer understands immediately: "Yes, penetration testing, good, I know I should do it, but it's too complex, too expensive." Now, with NodeZero — for example, as an MSSP service provided by one of our partners — it's
getting easy. >> Yeah, it's a great benefit there. I mean, I've got to say, I'm a huge fan of what you guys are doing. I like this continuous automation. That's a major benefit to anyone doing DevOps or any kind of modern application development. This is just a godsend for them; this is really good. And like you said, the pen testers that are doing it were kind of coming down from their expertise to do things that should have been automated. They get to focus on the bigger ticket items. That's a really big point. >> So we free the pen testers for the higher-level elements of the penetration testing segment, and that is typically the application testing, which is currently far away from being automated. >> Yeah, and that's where the most critical workloads are, and I think this is the nice balance. Congratulations on the international expansion of the program, and thanks for coming on this special presentation. I really appreciate it. Thank you. >> You're welcome. >> Okay, this is theCUBE special presentation. You know, check out pen test automation, international expansion, Horizon3.ai — a really innovative solution. In our next segment, Chris Hill, sector head for strategic accounts, will discuss the power of Horizon3.ai and Splunk in action. You're watching theCUBE, the leader in high tech enterprise coverage. [Music] [Music] >> Welcome back, everyone, to theCUBE and Horizon3.ai's special presentation. I'm John Furrier, host of theCUBE. We're with Chris Hill, sector head for strategic accounts and federal at Horizon3.ai, a great innovative company. Chris, great to see you. Thanks for coming on theCUBE. >> Yeah, like I said, great to meet you, John. Long time listener, first time caller, so excited to be here with you guys. >> Yeah, we were talking before camera: you had Splunk back in 2013, and I think 2012 was our first Splunk .conf. And boy, man, talk about being in the right place at the right time. Now we're at another inflection point, and Splunk continues to be
relevant, continuing to have that data driving security, and that interplay. And your CEO — a former CTO at Splunk as well — at Horizon, who's been on before. Really innovative product you guys have. But you know: don't wait for a breach to find out if you're logging the right data. This is the topic of this thread. Splunk is very much part of this new international expansion announcement with you guys. Tell us, what are some of the challenges that you see where this is relevant for Splunk and Horizon3.ai, as you guys expand NodeZero out internationally? >> Yeah, well, so, you know, my role within Splunk was working with our most strategic accounts. And so I look back to 2013, and I think about the sales process, like working with our small customers — it was still very siloed back then. I was selling to an IT team that was using this for IT operations; we generally would always even say, "Yeah, although we do security, we weren't really designed for it; we're a log management tool." And I'm sure you remember, back then, John, we were sort of stepping into the security space, and in the public sector domain that I was in, security was 70% of what we did. When I look back at the transformation I was witnessing — that digital transformation — when I look at, like, 2019 to today, you look at how the IT teams and the security teams have been forced to break down those barriers. They used to be siloed away, would not communicate. You know, the security guys would be like, "Oh, this is my box. IT, you're not allowed in." Today you can't get away with that. And I think the value that we bring — and of course, Splunk has been a huge leader in that space and continues to do innovation across the board — but I think what we're seeing in the space, and I was talking with Patrick Coughlin, the SVP of security markets, about this, is that what we've
been able to do with Splunk is build a purpose-built solution that allows Splunk to eat more data. So Splunk itself, as you know, is an ingest engine, right? The great reason people bought it was you could build these really fast dashboards and grab intelligence out of it, but without data it doesn't do anything, right? So how do you drive and bring more data in, and most importantly, from a customer perspective, how do you bring the right data in? And so if you think about what NodeZero and what we're doing at Horizon3 is: sure, we do pen testing, but because we're an autonomous pen testing tool, we do it continuously. So this whole thought of, "Oh, crud, my customers..." — "Oh yeah, we've got a pen test coming up, it's going to be six weeks." "Oh yeah, you know, everyone's going to sit on their hands. Call me back in two months, Chris, we'll talk to you then." Right? Not a real efficient way to test your environment. And shoot, we saw that with Uber this week, right? And that's a case where we could have helped. >> Oh, that's right. Could you explain the Uber thing? Because it was a contractor. Just give a quick highlight of what happened, so you can connect the dots. >> Yeah, no problem. So it was, I think, one of those games where they would try and test an environment, and what the pen tester did was he kept on calling the MFA guys, being like, "I need to reset my password. We need to reset my password." And eventually the customer service guy said, "Okay, I'm resetting it." Once he had reset it and bypassed the multi-factor authentication, he then was able to get in — I think not the domain, but he was able to gain access to a partial part of that network. He then paralleled over to what I would assume is, like, a VMware or some virtual machine that had notes with all of the credentials for logging into various domains, and so within minutes they had access. And that's the
sort of stuff that we do. You know, a lot of these tools... you think about the cacophony of tools that are out there in a zero-trust architecture, right? I'm going to get, like, a Zscaler, or I'm going to have Okta, and I have a Splunk, I've thrown in SolarWinds — I mean, I don't mean to name names — we have CrowdStrike or SentinelOne in there. It's just a cacophony of things that don't work together; they weren't designed to work together. And so we have seen so many times in our business — through our customer support, and just working with customers when we do their pen tests — that there will be 5,000 servers out there, three are misconfigured, and those three misconfigurations will create the open door. Because remember: the hacker only needs to be right once; the defender needs to be right all the time. And that's the challenge. And so that's what I'm really passionate about with what we're doing here at Horizon3. I see this digital transformation, migration, and security going on, and we're at the tip of the spear. It's why I joined Snehal on this journey, and I'm just super excited about where the path is going, and super excited about the relationship with Splunk. I'll get into more details on some of the specifics of that, but... >> You know, well, you're nailing it. I mean, we've been doing a lot of things on supercloud and this next-gen environment. We're calling it next gen: you're really seeing DevOps — obviously DevSecOps has already won — the IT role has moved to the developer; shift left is an indicator of that. It's one of the many examples: higher velocity code, software supply chain. You hear these things. That means it is now in the developer's hands; it is replaced by the new ops, DataOps teams, and security, where there's a lot of horizontal thinking. To your point about access, there's no more perimeter — huge, 100%, you're really right on this. The attacker only has to be right one time to get in there; once you're in, then you can hang out, move around, move laterally. Big problem. Okay, so we
get that. Now, the challenge for these teams, as they are transitioning organizationally, is: how do they figure out what to do? Okay, this is the next step. They already have Splunk, so now they're kind of in transition, while protecting for a hundred percent ratio of success. So how would you look at that, and describe the challenges? What do they do? What are the teams facing with their data, and what's next? What action do they take? >> So let's use some vernacular that folks will know. If I think about DevSecOps, right, we both know what that means: I'm going to build security into the app. It normally also talks about SecDevOps, right: how am I building security around the perimeter of what's going on inside my ecosystem, and what are they doing? And so if you think about what we're able to do with somebody like Splunk: we can pen test the entire environment, from soup to nuts, right? So I'm going to test the endpoints through to the infrastructure. I'm going to look for misconfigurations, I'm going to look for exposed credentials, I'm going to look for anything I can in the environment. Again, I'm going to do it at light speed. And what we're doing for that SecDevOps space is: did you detect that we were in your environment? So did we alert Splunk, or the SIEM, that there's someone in the environment, laterally moving around? More importantly, did they log us in their environment, and when did they detect that log? Did that log trigger? Did they alert on us? And then, finally, most importantly for every CISO out there: did they stop us? And so that's how we do this. And I think, when speaking with Snehal before — we've coined this, we call it find, fix, verify. So what we do is we go in and act as the attacker, right? We act in a production environment, so we're a passive attacker, but we will go in uncredentialed, with no agents, but we have
to have an assumed-breach model, which means we're going to put a Docker container in your environment, and then we're going to fingerprint the environment. So we're going to go out and do an asset survey. Now, that's something that Splunk doesn't do super well, you know. So: can Splunk see all the assets? Do the same assets marry up? We're going to log all that data and then load that into the Splunk SIEM, or the Splunk logging tools, just to have it in the enterprise, right? That's an immediate feature add that they've got. And then we've got the fix: once we've completed our pen test, we are then going to generate a report — and we can talk about these a little bit later — but the reports will show an executive summary, the assets that we found (which would be your asset discovery aspect of that), and a fix report. And the fix report, I think, is probably the most important one. It will go down and identify what we did, how we did it, and then how to fix that. And then, from that, the pen tester or the organization should fix those, then go back and run another test, and then validate — like a change-detection environment — to see, "Hey, did those fixes take place?" And, you know, Snehal, when he was the CTO of JSOC, shared with me a number of times that it's like, "Man, there would be 15 more items on next week's punch sheet that we didn't know about." And it has to do with how they were prioritizing the CVEs and whatnot, because they would treat all CVEs as either critical or non-critical. And it's like, we are able to create context in that environment that feeds better information into Splunk, and that brings up the efficiency for Splunk, specifically the teams out there. >> By the way, the burnout thing is real. I mean, this whole "I just finished my list and I've got 15 more," or whatever — the list just keeps growing. How did NodeZero specifically help Splunk teams be more efficient? That's the question I want to get at.
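The find-fix-verify loop Chris walks through — attack the environment, report what was actually exploitable, fix it, then re-run and diff the two runs like a change-detection system — can be sketched as a small loop. Everything here is an illustrative stand-in under an assumed toy model (a dict of hosts marked hardened or not), not Horizon3's API:

```python
def find(environment):
    """Stand-in for a pen-test run: return the set of hosts that
    were actually exploitable in this environment."""
    return {host for host, hardened in environment.items() if not hardened}

def fix(environment, findings):
    """Apply the fix report: harden every host the attack run flagged."""
    for host in findings:
        environment[host] = True

def verify(environment, previous_findings):
    """Re-run the test and diff against the last run, like change
    detection: anything still present means a fix didn't take."""
    return find(environment) & previous_findings

# Toy environment: three of five "servers" are misconfigured,
# echoing the 5,000-servers / three-misconfigured example.
env = {"web-1": True, "web-2": True, "db-1": False, "ilo-1": False, "print-1": False}

first_run = find(env)               # the three misconfigured hosts
fix(env, first_run)                 # apply the fix report
still_open = verify(env, first_run) # empty set -> the fixes took place
```

The point of the sketch is the cycle, not the data model: instead of waiting a year for the next manual test, `verify` is just another `find` run diffed against the last one.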
Because this seems like a very scalable way for Splunk customers and service teams to be more efficient. So the question is: how does NodeZero help make Splunk service teams, specifically, more efficient? >> So today, in our early interactions with customers, we've seen five things, and I'll start with sort of identifying the blind spots, right? So, kind of what I just talked about with you: did we detect, did we log, did we alert, did they stop NodeZero? Right? And so, to put that in more layman's, third-grade terms — and if I was going to beat a fifth grader at this game — it would be: we can be the sparring partner for a Splunk Enterprise customer, a Splunk Essentials customer, someone using Splunk SOAR, or even just an enterprise Splunk customer that may be a small shop with three people and just wants to know, "Where am I exposed?" So by creating and generating these reports, and then having the API that actually generates the dashboard, they can take all of these events that we've logged and log them in. And then where that comes in is number two: how do we prioritize those logs, right? How do we create visibility into the logs that have critical impacts? And again, as I mentioned earlier, not all CVEs are high impact, and also not all are low, right? So if you daisy-chain a bunch of low CVEs together — boom, I've got a mission-critical CVE that needs to be fixed now, such as a credential moving to an NT box that's got a text file with a bunch of passwords on it. That would be very bad. And then third would be verifying that you have all of the hosts. So one of the things that Splunk's not particularly great at — and they'll admit it themselves — is they don't do asset discovery. So: what assets do we see, and what are they logging from? And then, for every event that they are able to identify, one of the cool things that we can do is actually create this low-code, no-code environment, so that, you know, Splunk customers
can use Splunk SOAR to actually triage events and prioritize them, so they're being routed within it to optimize the SOC team's time to market, or time to triage, for any given event — obviously reducing MTTR. And then, finally, I think one of the neatest things that you'll be seeing us develop is our ability to build glass tables. So behind me you'll see one of our triage events and how we build a Lockheed Martin kill chain on that, with a glass table, which is very familiar to the community. We're going to have the ability, in the not-too-distant future, to allow people to search and observe on those IOCs — and if people aren't familiar with it, an IOC is an indicator of compromise. So that's a vector that we want to drill into, and of course, who's better at drilling into the data than Splunk? >> Yeah, Chris, this is an awesome synergy there. I mean, I can see a Splunk customer going, "Man, this just gives me so much more capability, actionability, and also real understanding." And I think this is what I want to dig into, if you don't mind: understanding that critical impact, okay? That's kind of where I see this coming. You've got the data, data ingest — now, data's data, but the question is what not to log, you know, where things are misconfigured. These are critical questions. So can you talk about what it means to understand critical impact? >> Yeah, so I think, going back to the things that I just spoke about: a lot of those CVEs where you'll see low, low, low, and then you daisy-chain them together and suddenly it's like, "Oh, this is high now." But then there's the other impact: if you're a Splunk customer — and I had several of them; I had one customer with terabytes of McAfee data being brought in, and it was like, all right, there's a lot of other data that you probably also want to bring in, but they could only afford to do certain data sets, and they didn't know how to prioritize or filter those data sets — and so we provide that opportunity to say,
"Hey, these are the critical ones to bring in, and these are the ones that you don't necessarily need to bring in, because a low CVE, in this case, really does mean low." Like an iLO server would be one, or the print server where your admin credentials are sitting — on, like, a printer. There will be credentials on that, and that's something a hacker might go in to look at. So although the CVE on it is low, if you daisy-chain it with somebody that's able to get into that, you might say, "Ah, that's high," and we would then potentially rank it, using our AI logic, to say, "That's a moderate." So we put it on the scale, and we prioritize those — versus all of these scanners that are just going to give you a bunch of CVEs and "good luck." >> And translating that, if I can — and tell me if I'm wrong — that kind of speaks to that whole lateral movement challenge, right? Print server: a great example. It looks stupid, low-end. Who's going to want to deal with the print server? Oh, but it's connected into a critical system. There's a path. Is that kind of what you're getting at? >> Yeah. I use "daisy chain" — I think that's from the community I came from — but it's just lateral movement. It's exactly what they're doing, and those low-level, low-critical lateral movements are where the hackers are getting in, right? So that's the beautiful thing about the Uber example: who would have thought? "I've got my multi-factor authentication going," and a human made a mistake. We can't expect humans not to make mistakes; we're fallible, right? The reality is that once they were in the environment, they could have protected themselves by running enough pen tests to know that they had certain exposed credentials that would have stopped the breach, and they had not done that in their environment. And I'm not poking at them. >> Yeah, but it's an interesting trend, though. I mean, it's obvious that sometimes those low-end items are also not protected well, so it's easy to get at from a hacker's standpoint.
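The daisy-chain point — several individually low CVEs composing into one critical attack path, like the low-severity print server that leads to admin credentials — can be illustrated with a tiny graph walk. The hosts, edges, and severities below are all invented for illustration; the only claim is the structure of the argument:

```python
# Each edge is a low-severity step an attacker can take; none of them
# looks alarming on its own, but chained together they reach a
# critical asset -- the lateral-movement problem described above.
steps = {
    "internet":     [("print-server", "low: default credentials")],
    "print-server": [("admin-creds",  "low: plaintext config file")],
    "admin-creds":  [("domain-ctrl",  "low: credential reuse")],
}

def chains_to(start, target):
    """Depth-first search: is there a path of 'low' steps from start
    to the critical target? If so, the chain as a whole is critical."""
    stack, seen = [(start, [])], set()
    while stack:
        node, path = stack.pop()
        if node == target:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, severity in steps.get(node, []):
            stack.append((nxt, path + [f"{node} -> {nxt} ({severity})"]))
    return None

path = chains_to("internet", "domain-ctrl")
# A non-empty path means three 'low' findings add up to one critical one.
```

A scanner scoring each edge in isolation would report three "lows"; the graph view is what reveals the path, which is why the interview keeps stressing ranking chains rather than individual CVEs.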
But also, the people in charge of them can be phished easily, or spear-phished, because they're not paying attention — because they don't have to. No one ever told them, "Hey, be careful." >> Yeah, for the community that I came from, John, that's exactly how they would do it: they would meet you at an international event, introduce themselves as a graduate student — these are nation-state actors — "Would you mind reviewing my thesis on such and such?" And I was at Adobe at the time that I was working on this. Instead of having to get the PDF another way, they'd open the PDF, and whatever that payload was, it launches. And I don't know if you remember, back in, like, the 2008 time frame, there were a lot of issues around IP being stolen from the United States by nation states, and that's exactly how they did it. >> And, John, it's also LinkedIn: "Hey, we want to hire you, double the salary." "Oh, I'm going to click on that for sure," you know? >> Yeah, right, exactly. >> The one thing I would say to you is, when we look at it — because I think we did 10,000 pen tests last year, and it's probably over that now — we have these sort of top 10 ways that we see and find people coming into the environment, and the funniest thing is that only one of them is a CVE-related vulnerability. You guys know what they are, right? So it's like two percent of the attacks are occurring through the CVEs, but there's all that attention spent on that, and very little attention spent on this pen testing side, which is sort of this continuous threat monitoring space, this vulnerability space, where I think we play such an important role. And I'm so excited to be a part of the tip of the spear on this one. >> Yeah, I'm old enough to know the movie Sneakers, which I loved. Watching that movie — you know, professional hackers are testing, testing, always testing the environment. I love this. I've got to ask you, as we kind of wrap up here, Chris, if you don't mind: the
benefits to professional services from this alliance. Big news: Splunk and you guys work well together; we see that clearly. What other benefits do professional services teams see from the Splunk and Horizon3.ai alliance? >> So I think, for both of our partners — as we bring these guys together, and many of them already are the same partner, right — first off, the licensing model is probably one of the key areas that we really excel at. If you're an end user, you can buy for the enterprise by the number of IP addresses you're using. But if you're a partner working with this, there are ways that you can go in, and we'll license to MSSPs, with what that business model for MSSPs looks like. But the unique thing that we do here is this "C plus" license. The Consulting Plus license allows somebody — small, mid-sized, up to some very large, you know, Fortune 100 consulting firms use this — to buy into a license where they can have unlimited access to as many IPs as they want, but you can only run one test at a time. And as you can imagine, when we're going and hacking passwords and checking hashes and decrypting hashes, that can take a while. But for the right customer, it's a perfect tool. And so I'm so excited about our ability to go to market with our partners — so that we understand ourselves, understand how not just to sell to, or just to sell through, but how to sell with them, as a good vendor partner. I think that's one thing that we've done a really good job of building and bringing to the market. >> Yeah, I think also Splunk has had great success with how they've enabled partners and professional services. >> Absolutely. >> You know, the services that layer on top of Splunk are multi-fold, tons of great benefits, so you guys vector right into that, ride that wave without friction. >> And the cool thing is that one of our reports can be totally customized with someone else's logo.
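The Consulting Plus constraint Chris describes — unlimited IP addresses, but only one running test at a time — is essentially a concurrency limit of one. A minimal sketch of such a gate follows; the class and method names are invented for illustration and are not the product's real licensing code:

```python
import threading

class ConsultingPlusGate:
    """Illustrative license gate: any number of target IPs, but at
    most one pen test may be running at any moment."""

    def __init__(self):
        self._slot = threading.Semaphore(1)  # exactly one test slot

    def start_test(self):
        # Non-blocking: a second concurrent test is refused rather
        # than queued, mirroring "you can only run one test at a time".
        return self._slot.acquire(blocking=False)

    def finish_test(self):
        self._slot.release()

gate = ConsultingPlusGate()
first = gate.start_test()    # slot was free, test starts
second = gate.start_test()   # refused: a test is already running
gate.finish_test()           # first test completes
third = gate.start_test()    # slot is free again
```

The design choice worth noting is refusing rather than queuing: since a single hash-cracking run "can take a while," an explicit refusal lets the consultant schedule engagements one after another instead of silently blocking.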
totally customized uh with someone else's logo we're going to generate you know so I I used to work in another organization it wasn't Splunk but we we did uh you know pen testing as for for customers and my pen testers would come on site they'd do the engagement and they would leave and then another release someone would be oh shoot we got another sector that was breached and they'd call you back you know four weeks later and so by August our entire pen testings teams would be sold out and it would be like well even in March maybe and they're like no no I gotta breach now and and and then when they do go in they go through do the pen test and they hand over a PDF and they pack on the back and say there's where your problems are you need to fix it and the reality is that what we're going to generate completely autonomously with no human interaction is we're going to go and find all the permutations of anything we found and the fix for those permutations and then once you've fixed everything you just go back and run another pen test it's you know for what people pay for one pen test they can have a tool that does that every every Pat patch on Tuesday and that's on Wednesday you know triage throughout the week green yellow red I wanted to see the colors show me green green is good right not red and one CIO doesn't want who doesn't want that dashboard right it's it's exactly it and we can help bring I think that you know I'm really excited about helping drive this with the Splunk team because they get that they understand that it's the green yellow red dashboard and and how do we help them find more green uh so that the other guys are in red yeah and get in the data and do the right thing and be efficient with how you use the data know what to look at so many things to pay attention to you know the combination of both and then go to market strategy real brilliant congratulations Chris thanks for coming on and sharing um this news with the detail around the Splunk in 
action around the alliance thanks for sharing John my pleasure thanks look forward to seeing you soon all right great we'll follow up and do another segment on devops and I.T and security teams as the new new Ops but and super cloud a bunch of other stuff so thanks for coming on and our next segment the CEO of horizon 3.aa will break down all the new news for us here on thecube you're watching thecube the leader in high tech Enterprise coverage [Music] yeah the partner program for us has been fantastic you know I think prior to that you know as most organizations most uh uh most Farmers most mssps might not necessarily have a a bench at all for penetration testing uh maybe they subcontract this work out or maybe they do it themselves but trying to staff that kind of position can be incredibly difficult for us this was a differentiator a a new a new partner a new partnership that allowed us to uh not only perform services for our customers but be able to provide a product by which that they can do it themselves so we work with our customers in a variety of ways some of them want more routine testing and perform this themselves but we're also a certified service provider of horizon 3 being able to perform uh penetration tests uh help review the the data provide color provide analysis for our customers in a broader sense right not necessarily the the black and white elements of you know what was uh what's critical what's high what's medium what's low what you need to fix but are there systemic issues this has allowed us to onboard new customers this has allowed us to migrate some penetration testing services to us from from competitors in the marketplace But ultimately this is occurring because the the product and the outcome are special they're unique and they're effective our customers like what they're seeing they like the routineness of it many of them you know again like doing this themselves you know being able to kind of pen test themselves parts of their 
networks. And then there are the new use cases: I'm a large organization, I have eight to ten acquisitions per year — wouldn't it be great to have a tool that can perform a penetration test, both internal and external, of that acquisition before we integrate the two companies and maybe bring on some risk? It's a very effective partnership, one that has really taken our engineers and our account executives by storm. This is a partnership that's been very valuable to us. [Music] >> A key part of the value and business model at Horizon 3 is enabling partners to leverage NodeZero to make more revenue for themselves. Our goal is that sixty percent of our revenue this year will be originated by partners, and that 95 percent of our revenue next year will be originated by partners, so a key to that strategy is making us an integral part of your business models as a partner. A key quote from one of our partners is that we enable every one of their business units to generate revenue. Let's talk about that in a little more detail. First, if you have a pen test consulting business — take Deloitte as an example — what was six weeks of human labor per pen test has been cut down to four days of labor, using NodeZero to conduct reconnaissance, find all the juicy, interesting areas of the enterprise that are exploitable, and assess the entire organization; all of those details then get served up to the human, who can look at them, understand them, and determine where to probe deeper. In that pen test consulting business, NodeZero becomes a force multiplier: those consulting teams can cover way more accounts, and way more IPs within those accounts, with the same or fewer consultants, and that directly leads to profit margin expansion for the pen testing business itself. The second business model: if you're an MSSP, you're already making money providing defensive cybersecurity operations for a large volume of customers. MSSPs will license NodeZero and use us as an upsell, to start to deliver continuous red teaming, continuous verification, or purple teaming as a service. In that business model they've got an additional line of revenue, where they can increase the spend of their existing customers by bolting on NodeZero as a purple-team-as-a-service offering. The third business model, or customer type, is the IT services provider. As an IT services provider you make money installing and configuring security products like Splunk, CrowdStrike, or Humio; you make money reselling those products; and you make money generating follow-on services to continue to harden your customer environments. Those IT service providers will use us to verify that they've installed Splunk correctly, prove to their customer that Splunk (or CrowdStrike) was installed correctly using our results, and then use our results to drive follow-on services and revenue. And finally we've got the value-added reseller, which is just a straight-up reseller. Because of how fast our sales cycles are, these VARs are typically able to go from cold email to deal close in six to eight weeks. At Horizon 3, a single sales engineer is able to run 30 to 50 POCs concurrently, because our POCs are very lightweight and don't require any on-prem customization or heavy pre-sales or post-sales activity. As a result we're able to have a small number of sellers driving a lot of revenue and volume for us, and the same thing applies to VARs: there isn't a lot of effort to sell the product or prove its value, so VARs are able to sell a lot more Horizon 3 NodeZero product without having to build up a huge specialist sales organization. What I'm going to do now is talk through scenario three, the IT service provider, and just how powerful NodeZero can be in driving additional revenue. Think of it this way: for every one dollar of NodeZero license purchased by the IT service provider to do their business, it generates ten dollars of additional revenue for that partner. In this example, Kinney Group uses NodeZero to verify that they have installed and deployed Splunk correctly. Kinney Group is a Splunk partner; they sell IT services to install, configure, deploy, and maintain Splunk. As they deploy Splunk, they're going to use NodeZero to attack the environment and make sure that the right logs, alerts, and monitoring are being handled within the Splunk deployment — a way of doing QA, verifying that Splunk has been configured correctly, used internally by Kinney Group to prove the quality of the services they've just delivered. Then they're going to show, and leave behind, that NodeZero report with their client, and that creates a resell opportunity for Kinney Group to resell NodeZero to their client, because the client is seeing the reports and the results and saying, "wow, this is pretty amazing." Those reports can be co-branded: a pen testing report branded with Kinney Group, but with "powered by Horizon 3" under it. From there, Kinney Group is able to take the fix actions report that's automatically generated with every pen test and use it as the starting point for a statement of work to sell follow-on services to fix all of the problems that NodeZero identified — fixing LLMNR misconfigurations, fixing or patching VMware, updating credential policies, and so on. NodeZero has found a bunch of problems that the client often lacks the capacity to fix, and Kinney Group can use that lack of capacity as an opportunity to sell follow-on services. And finally, based on the findings from NodeZero, Kinney Group can look at that report
and say to the customer: "if you had bought CrowdStrike, you'd have been able to prevent NodeZero from attacking and succeeding in the way that it did" — or if you had bought Humio, or Palo Alto Networks, or some privileged access management solution, because of what NodeZero was able to do with credential harvesting and attacks. As a result, Kinney Group is able to resell other security products within their portfolio — CrowdStrike Falcon, Humio, Palo Alto Networks, Demisto, Phantom, and so on — based on the gaps that were identified by NodeZero in that pen test. And that creates another feedback loop, where Kinney Group will then use NodeZero to verify that the CrowdStrike product has actually been installed and configured correctly. This becomes the cycle: use NodeZero to verify a deployment, use that verification to drive a bunch of follow-on services and resell opportunities, which then further drives more usage of the product. Now, the way that we license is a usage-based licensing model, so the partner grows their NodeZero Consulting Plus license as they grow their business. For example, if you're Kinney Group, then in week one you're going to use NodeZero to verify your Splunk install; in week two, if you have a pen testing business, you're going to use NodeZero as a force multiplier for your pen testing client opportunity; and in week three, if you have an MSSP business, you're going to use NodeZero to execute a purple team MSSP offering for your clients. And not necessarily a Kinney Group — if you're a Deloitte or an AT&T, these larger companies with multiple lines of business, or if you're an Optiv, for instance, all you have to do is buy one Consulting Plus license and you're able to run as many pen tests as you want, sequentially. So you can buy a single license and use it to meet your week-one client commitments, then meet your week two, then your week three. As you grow your business, you start to run multiple pen tests concurrently: if in week one you've got to verify a Splunk install, and run a pen test, and deliver a purple team opportunity, you simply expand from one Consulting Plus license to three. As you systematically grow your business, you're able to grow your NodeZero capacity with you, giving you predictable COGS, predictable margins, and once again a 10x additional revenue opportunity for that investment in the NodeZero Consulting Plus license. >> My name is Snehal, and I'm the co-founder and CEO here at Horizon 3. I'm going to talk to you today about why it's important to look at your enterprise through the eyes of an attacker. The challenge I had — when I was a CIO in banking, the CTO at Splunk, and serving within the Department of Defense — is that I had no idea whether I was secure until the bad guys showed up. Am I logging the right data? Am I fixing the right vulnerabilities? Are the security tools that I've paid millions of dollars for actually working together to defend me? The answer is: I don't know. Does my team actually know how to respond to a breach in the middle of an incident? I don't know. I've got to wait for the bad guys to show up. So the challenge I had was: how do we proactively verify our security posture? I tried a variety of techniques. The first was the use of vulnerability scanners, and the challenge with vulnerability scanners is that being vulnerable doesn't mean you're exploitable. I might have a hundred thousand findings from my scanner, of which maybe five or ten can actually be exploited in my environment. The other big problem with scanners is that they can't chain weaknesses together from machine to machine. If you've got a thousand machines in your environment, or more, a vulnerability scanner will tell you that you have a problem on machine one, and separately a problem on machine two — but what it can't tell you is that an attacker could use a low from machine one plus a low from machine two to equal a critical in your environment. And what attackers do in their tactics is chain together misconfigurations, dangerous product defaults, harvested credentials, and exploitable vulnerabilities into attack paths across different machines. To address the attack paths across different machines, I tried layering in consulting-based pen testing, and the issue is that when you've got thousands of hosts, or hundreds of thousands of hosts, in your environment, human-based pen testing simply doesn't scale to test an infrastructure of that size. Moreover, when they actually do execute a pen test and you get the report, oftentimes you lack the expertise within your team to quickly retest and verify that you've actually fixed the problem. So you end up with pen test reports that are incomplete snapshots, quickly going stale. To mitigate that problem I tried using breach and attack simulation tools, and the struggle with those tools is: one, I had to install credentialed agents everywhere; two, I had to write my own custom attack scripts, which I didn't have much talent for and also had to maintain as my environment changed; and three, these types of tools were not safe to run against production systems, which was the majority of my attack surface. So that's why we went off to start Horizon 3.
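That "low plus low equals critical" chaining can be illustrated with a toy graph search — a minimal sketch, not Horizon 3's actual implementation; the hosts and weaknesses below are invented for illustration. A per-host scanner scores each edge individually as low, but a path that chains them from an entry point to a crown-jewel host is what actually matters:

```python
# Toy illustration: chaining individually low-severity weaknesses across
# machines into a critical attack path (hypothetical hosts and weaknesses).
from collections import deque

# Each edge is a single "low" weakness an attacker could traverse.
edges = {
    "entry":  [("host-a", "LLMNR poisoning yields an NTLM hash")],
    "host-a": [("host-b", "cracked hash reused as local admin")],
    "host-b": [("dc",     "over-permissive AD group grants domain admin")],
}

def attack_paths(graph, start, target):
    """Breadth-first search for chains of weaknesses from start to target."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        host, path = queue.popleft()
        if host == target:
            yield path
            continue
        for nxt, weakness in graph.get(host, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [f"{host} -> {nxt}: {weakness}"]))

for path in attack_paths(edges, "entry", "dc"):
    print(" THEN ".join(path))
```

In a real environment the graph would be built from reconnaissance data rather than hand-written edges, and each edge would carry actual exploitability evidence; the point of the sketch is only that the criticality lives in the path, which a per-host scanner view never sees.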
So Tony and I met when we were in Special Operations together, and the challenge we wanted to solve was: how do we do infrastructure security testing at scale, by putting the power of a 20-year pen testing veteran into the hands of an IT admin or a network engineer, in just three clicks? The whole idea is that we enable these fixers, the blue team, to run NodeZero, our pen testing product, to quickly find problems in their environment. That blue team will then go off and fix the issues that were found, and then quickly rerun the attack to verify that they fixed the problem. And we deliver this without requiring custom scripts to be developed, without requiring credentialed agents to be installed, and without requiring external third-party consulting services or professional services — self-service pen testing to quickly drive find, fix, verify. There are three primary use cases that our customers use us for. The first is the SOC manager, who uses us to verify that their security tools are actually effective: to verify that they're logging the right data in Splunk or in their SIEM; to verify that their managed security services provider is able to quickly detect and respond to an attack, and to hold them accountable for their SLAs; or that the SOC itself understands how to quickly detect and respond, and to measure and verify that; or that the variety of tools in the stack — most organizations have 130-plus cybersecurity tools, none of which are designed to work together — are actually working together. The second primary use case is proactively hardening and verifying your systems. This is when the IT admin or network engineer runs self-service pen tests to verify that their Cisco environment is installed, hardened, and configured correctly, or that their credential policies are set up right, or that their vCenter or WebSphere or Kubernetes environments are actually designed to be secure. What this allows the IT admins and network engineers to do is shift from running one or two pen tests a year to 30, 40, or more pen tests a month, and you can actually wire those pen tests into your DevOps process, or into your detection engineering and change management processes, to automatically trigger pen tests every time there's a change in your environment. The third primary use case is for those organizations lucky enough to have their own internal red team. They'll use NodeZero to do reconnaissance and exploitation at scale, and then use the output as a starting point for the humans to step in and focus on the really hard, juicy stuff that gets them on stage at DEF CON. So those are the three primary use cases, and what we'll do is zoom into the find-fix-verify loop, because what I've found in my experience is that find-fix-verify is the future operating model for cybersecurity organizations. What I mean is this: in the find step, using continuous pen testing, what you want to enable is on-demand, self-service pen tests. You want those pen tests to find attack paths at scale, spanning your on-prem infrastructure, your cloud infrastructure, and your perimeter, because attackers don't stay in one place — they will find ways to chain together a perimeter breach and a credential from your on-prem to gain access to your cloud, or some other permutation. And the third part of continuous pen testing is that attackers don't focus on critical vulnerabilities anymore; they know we've built vulnerability management programs to reduce those vulnerabilities. Attackers have adapted: they chain together misconfigurations in your infrastructure, software, and applications with dangerous product defaults, exploitable vulnerabilities, and collected credentials, through a mix of techniques, at scale. Once you've found those problems, the next question is: what do you do about it? You want to prioritize fixing the problems that are actually exploitable in your environment and that truly matter — meaning they're going to lead to domain compromise, or domain user compromise, or access to your sensitive data. The second thing you want in the fix step is making sure you understand what risk your crown-jewels data is exposed to. Where is your crown-jewels data? Is it in the cloud? Is it on-prem? Has it been copied to a share drive you weren't aware of? If a domain user were compromised, could they access that crown-jewels data? You want to use the attacker's perspective to secure the critical data in your infrastructure. And finally, as you fix these problems, you want to quickly remediate and retest that you've actually fixed the issue; this find-fix-verify cycle becomes the accelerator that drives purple team culture. The third part is verify. What you want in the verify step is to confirm that your security tools, processes, and people can effectively detect and respond to a breach. You want to integrate that into your detection engineering processes, so you know you're catching with the right security rules, or that you've deployed the right configurations. You also want to make sure your environment is adhering to the best practices around systems hardening and cyber resilience. And finally, you want to be able to prove your security posture over time to your board, to your leadership, and to your regulators. So what I'll do now is zoom into each of these three steps. When we zoom into find, here's the first example using NodeZero and autonomous pen testing. What an attacker will do is find a way to break through the perimeter. In this example, it's very easy to misconfigure Kubernetes to allow an attacker to gain remote code execution into your on-prem Kubernetes environment and break through the perimeter. From there, the attacker is going to conduct network reconnaissance and then find ways to gain code execution on other machines in the environment, and as they get code execution they start to
dump credentials — collecting a bunch of NTLM hashes, cracking those hashes using open source and dark-web-available data as part of those attacks, and then reusing those credentials to log in and laterally maneuver throughout the environment. As they laterally maneuver they can reuse those credentials, use credential spraying techniques, and so on, to compromise your business email or to log in as admin into your cloud. This is a very common attack, and rarely is a CVE actually needed to execute it — often it's just a misconfiguration in Kubernetes, with a bad credential policy or password policy, combined with bad practices of credential reuse across the organization. Here's another example of an internal pen test, and this is from an actual customer. They had 5,000 hosts within their environment, they had EDR and UBA tools installed, and they initiated an internal pen test on a single machine. From that single initial access point, NodeZero enumerated the network, conducted reconnaissance, and found that 5,000 hosts were accessible. What NodeZero does under the covers is organize all of that reconnaissance data into a knowledge graph that we call the cyber terrain map, and that cyber terrain map becomes the key data structure we use to efficiently maneuver, attack, and compromise your environment. So NodeZero will try to find ways to get code execution, reuse credentials, and so on. In this customer example, they had Fortinet installed as their EDR, but NodeZero was still able to get code execution on a Windows machine. From there it was able to successfully dump credentials, including sensitive credentials from the LSASS process on the Windows box, and then reuse those credentials to log in as domain admin in the network. And once an attacker becomes domain admin, they have the keys to the kingdom; they can do anything they want. So what happened here? Well, it turns out Fortinet was misconfigured on three out of 5,000 machines — bad automation — and the customer had no idea this had happened; they would have had to wait for an attacker to show up to realize it was misconfigured. The second question is: why didn't Fortinet stop the credential pivot and the lateral movement? It turned out the customer didn't buy the right modules or turn on the right services within that particular product. We see this not only with Fortinet but with Trend Micro and all the other defensive tools, where it's very easy to miss a checkbox in the configuration that would do things like prevent credential dumping. The next story I'll tell you is: attackers don't have to hack in, they log in. Another infrastructure pen test. A typical technique attackers take is man-in-the-middle attacks that collect hashes. In this case, an attacker will leverage a tool called Responder to collect NTLM hashes being passed around the network — there's a variety of reasons these hashes get passed around, and it's a pretty common misconfiguration. As an attacker collects those hashes, they start to apply techniques to crack them: they'll pass the hash, and from there they'll use open source intelligence, common password structures and patterns, and other techniques to try to crack those hashes into cleartext passwords. Here, NodeZero automatically collected hashes, automatically passed the hashes and cracked those credentials, and then started to take the domain user IDs and passwords it had collected and try to access different services and systems in the enterprise. In this case, NodeZero was able to successfully gain access to the Office 365 email environment because three employees didn't have MFA configured. So now NodeZero has placement and access in the business email system, which sets up the conditions for fraud, lateral phishing, and other techniques. But what's especially insightful here is that 80 percent of the hashes collected in this pen test were cracked in 15 minutes or less. 80 percent. 26 percent of the user accounts had a password that followed a pretty obvious pattern: first initial, last initial, and four random digits. The other interesting thing is that 10 percent of service accounts had their user ID the same as their password — VMware admin / VMware admin, WebSphere admin / WebSphere admin, and so on and so forth. Attackers don't have to hack in; they just log in with credentials they've collected. The next story is becoming AWS admin. In this example — once again an internal pen test — NodeZero gets initial access, discovers 2,000 hosts are network-reachable from that environment, and fingerprints and organizes all of that data into a cyber terrain map. From there, it fingerprints that HP iLO, the integrated lights-out service, is running on a subset of hosts. iLO is a service that is often not instrumented or observed by security teams, nor is it easy to patch; as a result, attackers know this and immediately go after those types of services. In this case that iLO service was exploitable, and we were able to get code execution on it. iLO stores all the user IDs and passwords in clear text in a particular set of processes, so once we gained code execution we were able to dump all of the credentials and then laterally maneuver to log in to the Windows box next door as admin. On that admin box we were able to gain access to the share drives, and we found a credentials file saved on a share drive. It turned out to be the AWS admin credentials file, giving us full admin authority to their AWS accounts. Not a single security alert was triggered in this attack, because the customer wasn't observing the iLO service, and every step thereafter was a valid login in the environment. So what do you do? Step one, patch the server. Step two, delete the credentials file from the share drive. And step three, get better instrumentation on privileged access users and
logins. The final story I'll tell is a typical pattern we see across the board, one that combines the various techniques I've described. An attacker will go off and use open source intelligence to find all of the employees that work at your company. From there, they'll look up those employees on dark-web breach databases and other sources of information, and then use that as a starting point to password spray and compromise a domain user. All it takes is one employee reusing a breached password for their corporate email, or a single employee with a weak, easily guessable password. All it takes is one. Once the attacker gains domain user access — and in most shops the domain user is also the local admin on their laptop — then once you're local admin you can dump SAM and get local admin NTLM hashes, reuse those credentials as local admin on neighboring machines, and rinse and repeat. Eventually they get to a point where they can dump LSASS — by unhooking the antivirus, defeating the EDR, or finding a misconfigured EDR, as we've talked about earlier — to compromise the domain. What's consistent is that the fundamentals are broken at these shops: poor password policies, least-privilege access not implemented, Active Directory groups too permissive (domain admin or domain user is also the local admin), AV or EDR solutions misconfigured or easily unhooked, and so on. And what we found in 10,000 pen tests is that user behavior analytics tools never caught us in that lateral movement — in part because those tools require pristine logging data in order to work, and also because it becomes very difficult to find the baseline of normal versus abnormal credential login usage. Another interesting insight: there were several marquee, brand-name MSSPs defending our customers' environments, and it took them seven hours to detect and respond to the pen test. Seven hours — the pen test was over in less than two. So what you had was an egregious violation of the service level agreements that the MSSP had in place, and the customer was able to use us to get service credit and drive accountability of their SOC and of their provider. The third interesting thing: in one case it took us seven minutes to become domain admin in a bank, and that bank had every Gucci security tool you could buy. Yet in 7 minutes and 19 seconds, NodeZero started as an unauthenticated member of the network and was able to escalate privileges — through chaining, misconfigurations, lateral movement, and so on — to become domain admin. If it's seven minutes today, we should assume it'll be less than a minute a year or two from now, making it very difficult for humans to detect and respond to that type of blitzkrieg attack. So that's the find. It's not just about finding problems, though — the bulk of the effort should be what to do about it: the fix and the verify. As you find those problems — back to Kubernetes as an example — we will show you the path: here is the kill chain we took to compromise that environment. We'll show you the impact, and here is the proof of exploitation we used to compromise it, including the actual command we executed — you could copy and paste that command and compromise that kubelet yourself if you want. Then the impact: we got code execution, and we'll show you that this is a critical, here's why (it enabled perimeter breach), the affected applications, the specific IPs where you've got the problem, how it maps to the MITRE ATT&CK framework, and exactly how to fix it. We'll also show you what this problem enabled, so you can accurately prioritize why it is or isn't important. The next part is accurate prioritization, because the hardest part of my job as a CIO was deciding what not to fix. Take "SMB signing not required" as an example: by default, that CVSS score is a 1 out of 10 — and this misconfiguration is not a CVE, it's a misconfig — but it enabled an attacker to gain access to 19 credentials, including one domain admin and two local admins, plus access to a ton of data. Because of that context, this is really a 10 out of 10: you'd better fix it as soon as possible. However, of the seven occurrences we found, it's only a critical on three of the seven, and those are the three specific machines — we'll tell you the exact way to fix it, and you'd better fix those as soon as possible. On these four machines over here, the same issue didn't allow us to do anything of consequence, so — because the hardest part is deciding what not to fix — you can justifiably choose not to fix those four issues right now, just add them to your backlog, and surge your team to fix these three as quickly as possible. And once you fix those three, you don't have to re-run the entire pen test: you can select those three and then one-click verify, running a very narrowly scoped pen test that tests only that specific issue. What that creates is a much faster cycle of finding and fixing problems. The other part of fixing is verifying that you don't have sensitive data at risk. Once we become a domain user, we're able to use those domain user credentials to try to gain access to databases, file shares, S3 buckets, git repos, and so on, and help you understand what sensitive data you have at risk. In this example, a green checkbox means we logged in as a valid domain user and were able to get read-write access on the database, and this is how many records we could have accessed. We don't actually look at the values in the database, but we'll show you the schema so you can quickly characterize that PII data was at risk, and we'll do the same for your file shares and other sources of data. So now you can accurately articulate the data you have at risk and prioritize cleaning that data up,
especially data that will lead to a fine or a big news issue. So that's the find; that's the fix. Now we're going to talk about the verify. The key part in verify is embracing and integrating with detection engineering practices. When you think about your layers of security tools, you've got lots of tools in place, on average 130 tools at any given customer, but these tools were not designed to work together. So when you run a pen test, what you want to ask is: did you detect us, did you log us, did you alert on us, did you stop us? And from there, what you want to see is, okay, what are the techniques that are commonly used to defeat an environment, to actually compromise it? If you look at the top ten techniques we use (and there's far more than just these ten, but these are the most often executed), nine out of ten have nothing to do with CVEs. It has to do with misconfigurations, dangerous product defaults, bad credential policies, and how we chain those together to become a domain admin or compromise a host. So what customers will do is this: every single attacker command we executed is provided to you as an attack activity log, so you can actually see every single attacker command we ran, the timestamp it was executed, the host it executed on, and how it maps to MITRE ATT&CK tactics. Our customers will have these attacker logs on one screen, and then they'll go look into Splunk or Exabeam or SentinelOne or CrowdStrike and say: did you detect us, did you log us, did you alert on us, or not? And to make that even easier, take this example: hey Splunk, what logs did you see at this time on the VMware host? Because that's when NodeZero was able to dump credentials, and that allows you to identify and fix your logging blind spots. To make that easier, we've got app integration. This is an actual Splunk app in the Splunk App Store, and what you can do, inside the Splunk console itself, is fire up the Horizon3 NodeZero app. All of the pen test results are here, so that you can see all of
the results in one place, and you don't have to jump out of the tool. And what I'll show you as I skip forward is: hey, there's a pen test, and here are the critical issues that we've identified. For that weak-default issue, here are the exact commands we executed, and then we will automatically query into Splunk all the terms between these times on that endpoint that relate to this attack. So you can now quickly, within the Splunk environment itself, figure out that you're missing logs, or that you're appropriately catching this issue. And that becomes incredibly important in that detection engineering cycle that I mentioned earlier. So how do our customers end up using us? They shift from running one pen test a year to 30 or 40 pen tests a month, oftentimes wiring us into their deployment automation to automatically run pen tests. The other part is that as they run more pen tests, they find more issues, but eventually they hit this inflection point where they're able to rapidly clean up their environment. And that inflection point comes because the red and the blue teams start working together in a purple team culture, and now they're working together to proactively harden their environment. The other thing our customers will do is run us from different perspectives. They'll first run an RFC 1918 scope to see: once the attacker gained initial access in a part of the network that had wide access, what could they do? Then from there, they'll run us within a specific network segment: okay, from within that segment, could the attacker break out and gain access to another segment? Then they'll run us from their work-from-home environment: could they traverse the VPN and do something damaging, and once they're in, could they traverse the VPN and get into my cloud? Then they'll break in from the outside. All of these perspectives are available to you in Horizon3 and NodeZero as a single SKU, and you can run as many pen tests as you want. If you run a phishing campaign and find that
an intern in the finance department had the worst phishing behavior, you can then inject their credentials and actually show the end-to-end story of how an attacker phished, gained the credentials of an intern, and used that to gain access to sensitive financial data. So what our customers end up doing is running multiple attacks from multiple perspectives and looking at those results over time. I'll leave you with two things. One is: what is the AI in Horizon3.ai? Those knowledge graphs are the heart and soul of everything that we do, and we use machine learning, reinforcement learning techniques, Markov decision models, and so on to be able to efficiently maneuver and analyze the paths in those really large graphs. We also use context-based scoring to prioritize weaknesses, and we're able to drive collective intelligence across all of the operations, so the more pen tests we run, the smarter we get. All of that is based on the knowledge graph analytics infrastructure that we have. Finally, I'll leave you with this: these were my decision criteria when I was a buyer for my security testing strategy. What I cared about was coverage: I wanted to be able to assess my on-prem, cloud, perimeter, and work-from-home environments, and be safe to run in production. I wanted to be able to do that as often as I wanted. I wanted to be able to run pen tests in hours or days, not weeks or months, so I could accelerate that find-fix-verify loop. I wanted my IT admins and network engineers with limited offensive experience to be able to run a pen test in a few clicks through a self-service experience, without having to install agents or write custom scripts. And finally, I didn't want to get nickel-and-dimed on having to buy different types of attack modules or different types of attacks; I wanted a single annual subscription that allowed me to run any type of attack as often as I wanted, so I could look at my trends and directions over time. So I hope you found this talk valuable; we're easy to find,
and I look forward to seeing you use the product and letting our results do the talking. When you look at the way NodeZero's pen testing algorithms work, we dynamically select how to compromise an environment based on what we've discovered, and the goal is to become a domain admin, compromise a host, compromise domain users, find ways to encrypt data, steal sensitive data, and so on. But when you look at the top ten techniques that we ended up using to compromise environments, the first nine have nothing to do with CVEs. And that's the reality: CVEs are, yes, a vector, but less than two percent of CVEs are actually used in a compromise. Oftentimes it's some sort of credential collection, credential cracking, or credential pivoting, using that to become an admin, and then compromising environments from that point on. So I'll leave this up for you to read through, and you'll have the slides available, but I found it very insightful that organizations, ourselves included when I was at GE, invested heavily in just standard vulnerability management programs. When I was at DOD, all DISA cared about asking us about was our CVE posture. But the attackers have adapted to not rely on CVEs to get in, because they know that organizations are actively looking at and patching those CVEs. Instead, they're chaining together credentials from one place with misconfigurations and dangerous product defaults in another to take over an environment. A concrete example: by default, vCenter backups are not encrypted, and so if an attacker finds vCenter, what they'll do is find the backup location, and there are specific vCenter MTD files where the admin credentials are persisted in the binaries. So you can actually, as an attacker, find the right MTD file, parse out the binary, and now you've got the admin credentials for the vCenter environment and can start to log in as admin. There's a bad habit by signal officers and signal
practitioners in the Army and elsewhere where the VM notes section of a virtual image has the password for the VM. Well, those VM notes are not stored encrypted, and attackers know this: they're able to go off and find the VMs that are unencrypted, find the notes section, pull out the passwords for those images, and then reuse those credentials across the board. So I'll pause here; Patrick, I'd love to get some commentary on these techniques and other things that you've seen, and what we'll do in the last, say, 10 to 15 minutes is roll through a little bit more on what to do about it. >> Yeah, no, I love it. I think this is pretty exhaustive. What I like about what you've done here is this: we've seen double-digit increases in the number of organizations that are reporting actual breaches year over year for the last three years, and often, in the zeitgeist, we peg that on ransomware, which of course is incredibly important and very top of mind. But what I like about what you have here is that we're reminding the audience that the attack surface area and the vectors that matter have to be more comprehensive than just thinking about ransomware scenarios. >> Yeah, right on. So let's build on this. When you think about your defense in depth, you've got multiple security controls that you've purchased and integrated, and you've got that redundancy if a control fails. But the reality is that these security tools aren't designed to work together. So when you run a pen test, what you want to ask yourself is: did you detect NodeZero, did you log NodeZero, did you alert on NodeZero, and did you stop NodeZero? And when you think about how to do that, every single attacker command executed by NodeZero is available in an attacker log, so you can now see, at the bottom here, a vCenter exploit at that time on that IP, and how it aligns to MITRE ATT&CK. What you want to be
able to do is go figure out: did your security tools catch this or not? And that becomes very important in using the attacker's perspective to improve your defensive security controls. The way we've tried to make this easier (and I bleed green in many ways still, from my Splunk background) is what our customers do: they'll look at the attacker logs on one screen, and they'll look at what Splunk saw or missed on another screen, and then they'll use that to figure out what their logging blind spots are. And where that becomes really interesting is that we've actually built out an integration into Splunk. There's a Splunk app you can download off of Splunkbase, and you'll get all of the pen test results right there in the Splunk console. From that Splunk console you're going to be able to see: these are all the pen tests that were run, and these are the issues that were found. So you can look at a particular pen test: here are all of the weaknesses that were identified for that particular pen test and how they categorize out. For each of those weaknesses, you can click on any one of them that are critical, in this case, and then (this is where the punch line comes in, so I'll pause the video here) for that weakness, these are the commands that were executed on these endpoints at this time, and then we'll actually query Splunk for that IP address, or events containing that IP, and these are the source types that surfaced any sort of activity. So what we try to do is help you, as quickly and efficiently as possible, identify the logging blind spots in your Splunk environment based on the attacker's perspective. So as this video plays through, you can see it. Patrick, I'd love to get your thoughts, having seen so many Splunk deployments and the effectiveness of those deployments, on how this is going to help really elevate the effectiveness of all of your
Splunk customers. >> Yeah, I'm super excited about this. I think these kinds of purpose-built integrations really move the needle for our customers. At the end of the day, when I think about the power of Splunk, I think about a product I was first introduced to 12 years ago that was an on-prem piece of software, and at the time it sold on sort of perpetual and term licenses. But what made it special was that it could eat data at a speed that nothing else I'd ever seen could: you can ingest massively scalable amounts of data. It did cool things like schema-on-read, which facilitated that. There was this language called SPL that you could nerd out about, and you went to a conference once a year and talked about all the cool things you were Splunking, right? But now, as we think about the next phase of our growth, we live in a heterogeneous environment where our customers have so many different tools and data sources that are ever expanding. And as you look at the role of the CISO, it's mind-blowing to me the number of sources, services, and apps that have come into the CISO's span of, let's just call it a span of influence, in the last three years. We're seeing things like infrastructure-service-level visibility and application performance monitoring, stuff that just never made sense for the security team to have visibility into, at least not at the size and scale at which we're demanding it today. And that's different, and this is why it's so important that we have these joint purpose-built integrations that really provide more prescription to our customers about how they walk that journey towards maturity: what does zero to one look like, what does one to two look like? Whereas ten years ago customers were happy with platforms, today they want integration, they want solutions, and they want to drive outcomes. And I think this is a great example of how together we are stepping up to the
evolving nature of the market, the ever-evolving nature of the threat landscape, and, what I would say is, the maturing needs of the customer in that environment. >> Yeah, for sure. I think, especially as we all anticipate budget pressure over the next 18 months due to the economy and elsewhere, while security budgets are not going to get cut, I don't think, they're not going to grow as fast, and there's a lot more pressure on organizations to extract more value from their existing investments, as well as more value and more impact from their existing teams. And so security effectiveness, fierce prioritization, and automation, I think, become the three key themes of security over the next 18 months. So what I'll do very quickly is run through a few other use cases. Every host that we identified in the pen test, we're able to score and say: this host allowed us to do something significant, therefore it's really critical, and you should be increasing your logging here. Hey, these hosts down here, we couldn't really do anything with as an attacker, so if you do have to make trade-offs, you can trade off some logging resolution at the lower end in order to increase logging resolution on the upper end. So you've got that level of justification for where to increase or adjust your logging resolution. Another example: every host we've discovered as an attacker, we expose, and you can export, and what you want to make sure is that every host we found as an attacker is being ingested from a Splunk standpoint. A big issue I had as a CIO and user of Splunk and other tools is that I had no idea if there were rogue Raspberry Pis on the network, or if a new box was installed and whether Splunk was installed on it or not. So now you can quickly start to correlate what hosts we saw and how that reconciles with what you're logging. Finally, or second to last use case here on the Splunk integration side: for every single problem we've found, we give
multiple options for how to fix it. This becomes a great way to prioritize what fix actions to automate in your SOAR platform, and what we want to get to eventually is being able to automatically trigger SOAR actions to fix well-known problems, like automatically invalidating poor passwords in your credentials, amongst a whole bunch of other things we could go off and do. And then finally, if there is a well-known kill chain or attack path: one of the things I really wish I could have done when I was a Splunk customer was take this type of kill chain, one that actually shows a path to domain admin that I'm sincerely worried about, and use it as a glass table over which I could start to layer possible indicators of compromise. Now you've got a great starting point for glass tables and IOCs for actual kill chains that we know are exploitable in your environment, and that becomes some super cool integrations that we've got on the roadmap between us and the Splunk security side of the house. So, what I'll leave with... actually, Patrick, before I do that, I'd love to get your comments, and then I'll leave with one last slide on this wartime security mindset, assuming there are no other questions. >> No, I love it. I think this kind of glass-table approach to how you visualize these workflows, and then use things like SOAR and orchestration and automation to operationalize them, is exactly where we see all of our customers going: getting away from, I think, an over-engineered approach to SOAR, where it has to be super technical-heavy with Python programmers, and getting more to this visual view of workflow creation that really demystifies the power of automation and also democratizes it, so you don't have to have these programming languages on your resume in order to start really moving the needle on workflow creation, policy enforcement, and ultimately driving automation
coverage across more and more of the workflows that your team is seeing. >> Yeah, I think that between us being able to visualize the actual kill chain or attack path, and the SOAR market, I think, going towards this no-code, low-code, configurable SOAR versus coded SOAR, that's going to really be a game changer and give security teams a force multiplier. So what I'll leave you with is this: the peacetime mindset of security is no longer sustainable. We really have to get out of checking the box and then waiting for the bad guys to show up to verify whether security tools are working or not. And the reason why we've got to do that quickly is that there are over a thousand companies that withdrew from the Russian economy over the past nine months due to the Ukrainian war. You should expect every one of them to be punished by the Russians for leaving, and punished from a cyber standpoint. This is no longer about financial extortion, that is, ransomware; this is about punishing and destroying companies. And you can punish any one of these companies by going after them directly, or by going after their suppliers and their distributors. So suddenly your attack surface is no longer just your own enterprise: it's how you bring your goods to market and how you get your goods created. Because while I may not be able to disrupt your ability to harvest fruit, if I can get those trucks stuck at the border, I can increase spoilage and have the same effect. What we should expect to see is this idea of cyber-enabled economic warfare: if we issue a sanction, like banning the Russians from traveling, there is a cyber-enabled counterpunch, which is to corrupt and destroy the American Airlines database. That is below the threshold of war; it's not going to trigger the 82nd Airborne to be mobilized, but it's going to achieve the right effect. Ban the sale of luxury goods: disrupt the supply chain and create shortages. Ban Russian oil
and gas: attack refineries to cause a 10x spike in gas prices three days before the election. This is the future, and therefore I think what we have to do is shift towards a wartime mindset, which is: don't trust your security posture, verify it; see yourself through the eyes of the attacker; build that incident response muscle memory; and drive better collaboration between the red and the blue teams, your suppliers and distributors, and the information sharing organizations you have in place. What was really valuable for me as a Splunk customer was this: when a router crashes, at that moment you don't know if it's due to an IT administration problem or an attacker, and what you want to have are different people asking different questions of the same data. You want that integrated triage process of an IT lens on the problem and a security lens on the problem, and from there figuring out: is this an IT workflow to execute or a security incident to execute? And you want to have all of that as an integrated team, integrated process, integrated technology stack. This is something that I cared very deeply about as both a Splunk customer and a Splunk CTO, and that I see time and time again across the board. So, Patrick, I'll leave you with the last word, the final three minutes here, and I don't see any open questions, so please take us home. >> Oh man, and to think we spent hours and hours prepping for this together; that last 40 seconds of your talk track is probably one of the things I'm most passionate about in this industry right now. I think NIST has done some really interesting work here around building cyber-resilient organizations, and that has really helped the industry see that incidents can come from adverse conditions, you know, stresses or performance taxations in the infrastructure, service, or app layer, and they can come from malicious compromises, insider threats, external threat actors. And the more that we look
at this from the perspective of a broader cyber resilience mission, in a wartime mindset, I think we're going to be much better off. And, as you talk about with operationally minded ISACs, information sharing and intelligence sharing become so important in these wartime situations. We know not all ISACs are created equal, but we're also seeing a lot more ad hoc information sharing groups popping up. So look, I think you framed it really well. I love the concept of a wartime mindset, and I like the idea of applying a cyber resilience lens, like one more layer on top of that bottom-right cake: the IT lens and the security lens roll up to this concept of cyber resilience, and I think NIST has done some great work there for us. >> Yeah, you're spot on, and that's going to be, I think, the next terrain that you're going to see vendors try to get after, but one that I think Splunk is best positioned to win. >> Okay, that's a wrap for this special Cube presentation. You heard all about the global expansion of Horizon3.ai's partner program, where their partners have a unique opportunity to take advantage of their NodeZero product: international go-to-market expansion, North America channel partnerships, and overall relationships with companies like Splunk, to make things more comprehensive in this disruptive cybersecurity world we live in. All the videos are available on thecube.net, and check out Horizon3.ai for their pen test automation and, ultimately, the defense system they use for always testing the environment that you're in. Great innovative product, and I hope you enjoyed the program. Again, I'm John Furrier, host of the Cube. Thanks for watching.
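The reconciliation loop described in the talk (for every attacker command, did the SIEM log it near that time on that host; and is every host the attacker discovered actually being ingested at all) can be sketched as follows. This is a minimal illustration, not Horizon3's or Splunk's actual data model: the field names and the flat list-of-dicts format are assumptions made for the example.

```python
from datetime import datetime, timedelta

def detection_gaps(attacker_log, siem_events, window_seconds=60):
    """Return attacker commands with no SIEM event on the same host
    within +/- window_seconds of execution: the logging blind spots."""
    window = timedelta(seconds=window_seconds)
    gaps = []
    for cmd in attacker_log:
        executed = datetime.fromisoformat(cmd["timestamp"])
        logged = any(
            ev["host"] == cmd["host"]
            and abs(datetime.fromisoformat(ev["timestamp"]) - executed) <= window
            for ev in siem_events
        )
        if not logged:
            gaps.append(cmd)
    return gaps

def unlogged_hosts(attacker_log, ingested_hosts):
    """Hosts the attacker discovered that the SIEM never ingests at all,
    e.g. a rogue Raspberry Pi with no forwarder installed."""
    return {cmd["host"] for cmd in attacker_log} - set(ingested_hosts)
```

For example, if the pen test dumped credentials on a VMware host at 10:00 and the SIEM has no event for that host around that time, `detection_gaps` flags that command as a blind spot to fix before the next verify run.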

Published Date : Sep 28 2022


Manoj Sharma, Google Cloud | VMware Explore 2022



>>Welcome back everyone to the Cube's live coverage here in San Francisco of VMware Explorer, 2022. I'm John furrier with Dave ante coast of the hub. We're two sets, three days of wall to wall coverage. Our 12 year covering VMware's annual conference day, formerly world. Now VMware Explorer. We're kicking off day tube, no Sharma director of product management at Google cloud GCP. No Thankss for coming on the cube. Good to see you. >>Yeah. Very nice to see you as well. >>It's been a while. Google next cloud. Next is your event. We haven't been there cuz of the pandemic. Now you got an event coming up in October. You wanna give that plug out there in October 11th, UHS gonna be kind of a hybrid show. You guys with GCP, doing great. Getting up, coming up on in, in the rear with third place, Amazon Azure GCP, you guys have really nailed the developer and the AI and the data piece in the cloud. And now with VMware, with multicloud, you guys are in the mix in the universal program that they got here had been, been a partnership. Talk about the Google VMware relationship real quick. >>Yeah, no, I wanna first address, you know, us being in third place. I think when, when customers think about cloud transformation, you know, they, they, for them, it's all about how you can extract value from the data, you know, how you can transform your business with AI. And as far as that's concerned, we are in first place. Now coming to the VMware partnership, what we observed was, you know, you know, first of all, like there's a lot of data gravity built over the past, you know, 20 years in it, you know, and you know, VMware has, you know, really standardized it platforms. 
And when it comes to the data gravity, what we found was that, you know, customers want to extract the value that, you know, lives in that data as I was just talking about, but they find it hard to change architectures and, you know, bring those architectures into, you know, the cloud native world, you know, with microservices and so forth. >>Especially when, you know, these applications have been built over the last 20 years with off the shelf, you know, commercial off the shelf in, you know, systems you don't even know who wrote the code. You don't know what the IP address configuration is. And it's, you know, if you change anything, it can break your production. But at the same time, they want to take advantage of what the cloud has to offer. You know, the self-service the elasticity, you know, the, the economies of scale efficiencies of operation. So we wanted to, you know, bring CU, you know, bring the cloud to where the customer is with this service. And, you know, with, like I said, you know, VMware was the defacto it platform. So it was a no brainer for us to say, you know what, we'll give VMware in a native manner yeah. For our customers and bring all the benefits of the cloud into it to help them transform and take advantage of the cloud. >>It's interesting. And you called out that the, the advantages of Google cloud, one of the things that we've observed is, you know, VMware trying to be much more cloud native in their messaging and their positioning. They're trying to connect into that developer world for cloud native. I mean, Google, I mean, you guys have been cloud native literally from day one, just as a company. Yeah. Infrastructure wise, I mean, DevOps was an infrastructures code was Google's DNA. I, you had Borg, which became Kubernetes. Everyone kind of knows that in the history, if you, if you're in, in the, inside the ropes. Yeah. 
So as you guys have that core competency of essentially infrastructures code, which is basically cloud, how are you guys bringing that into the enterprise with the VMware, because that's where the puck is going. Right. That's where the use cases are. Okay. You got data clearly an advantage there, developers, you guys do really well with developers. We see that at say Coon and CNCF. Where's the use cases as the enterprise start to really figure out that this is now happening with hybrid and they gotta be more cloud native. Are they ramping up certain use cases? Can you share and connect the dots between what you guys had as your core competency and where the enterprise use cases are? >>Yeah. Yeah. You know, I think transformation means a lot of things, especially when you get into the cloud, you want to be not only efficient, but you also wanna make sure you're secure, right. And that you can manage and maintain your infrastructure in a way that you can reason about it. When, you know, when things go wrong, we took a very unique approach with Google cloud VMware engine. When we brought it to the cloud to Google cloud, what we did was we, we took like a cloud native approach. You know, it would seem like, you know, we are to say that, okay, VMware is cloud native, but in fact that's what we've done with this service from the ground up. One of the things we wanted to do was make sure we meet all the enterprise needs availability. We are the only service that gives four nines of SLA in a single site. >>We are the only service that has fully redundant networking so that, you know, some of the pets that you run on the VMware platform with your operational databases and the keys to the kingdom, you know, they can be run in a efficient manner and in a, in a, in a stable manner and, and, you know, in a highly available fashion, but we also paid attention to performance. One of our customers Mitel runs a unified communication service. 
And what they found was, you know, the high performance infrastructure, low latency infrastructure actually helps them deliver, you know, highly reliable, you know, communication experience to their customers. Right. And so, you know, we, you know, while, you know, so we developed the service from the ground up, making sure we meet the needs of these enterprise applications, but also wanted to make sure it's positioned for the future. >>Well, integrated into Google cloud VPC, networking, billing, identities, access control, you know, support all of that with a one stop shop. Right? And so this completely changes the game for, for enterprises on the outset, but what's more like we also have built in integration to cloud operations, you know, a single pane of glass for managing all your cloud infrastructure. You know, you have the ability to easily ELT into BigQuery and, you know, get a data transformation going that way from your operational databases. So, so I think we took a very like clean room ground from the ground of approach to make sure we get the best of both worlds to our customers. So >>Essentially made the VMware stack of first class citizen connecting to all the go Google tool. Did you build a bare metal instance to be able to support >>That? We, we actually have a very customized infrastructure to make sure that, you know, the experience that customers looking for in the VMware context is what we can deliver to them. And, and like I said, you know, being able to manage the pets in, in addition to the cattle that, that we are, we are getting with the modern containerized workloads. >>And, and it's not likely you did that as a one off, I, I would presume that other partners can potentially take advantage of that, that approach as well. Is that >>True? Absolutely. So one of our other examples is, is SAP, you know, our SAP infrastructure runs on very similar kind of, you know, highly redundant infrastructure, some, some parts of it. 
And, and then, you know, we also have in the same context partners such as NetApp. So, so customers want to, you know, truly... so, so there's two parts to it, right? One is to meet customers where they already are, but also take them to the future. And partner NetApp has delivered a cloud service that is well integrated into the platform, serves use cases like VDI, serves use cases for, you know, tier two data protection scenarios, DR, and also the high performance contexts that customers are looking for. >>Explain that to people, because I think a lot of times people say, oh, NetApp, but doesn't Google have storage? Yeah. So explain that relationship and why that is complementary. Yeah. And not just some kind of divergence from your strategy. >>Yeah. Yeah. No. So I think the, the idea here is NetApp, the NetApp platform living on-prem, you know, for so many years, it's built a lot of capabilities that customers take advantage of. Right. So for example, it has the SnapMirror capabilities that enable, you know, instant DR between locations. And customers, when they think of the cloud, they are also thinking of heterogeneous contexts where some of the infrastructure still needs to live on-prem. So, you know, they have the DR going on from the on-prem side using SnapMirror into Google cloud. And so, you know, it enables that entry point into the cloud. And so we believe, you know, partnering with NetApp kind of enables the high performance, you know, high reliability, and also enables the customers to meet regulatory needs for, you know, the DR and data protection that they're looking for. >>And NetApp, obviously a big VMware partner as well. So I can take that partnership with VMware and NetApp into the Google cloud. >>Correct. Yeah. Yeah. It's all about leverage.
Like I said, you know, meeting customers where they already are and ensuring that we smoothen their journey into the future rather than making it like a single step, you know, quantum leap, so to speak, between two worlds. You know, I like to say, for the longest time the cloud was being presented as a false choice between, you know, the infrastructure of the past and the infrastructure of the future, like the red pill and the blue pill. Right. And, you know, I like to say we've brought into this context the purple pill. Right. Which gives you really the best of both worlds. >>Yeah. And this is a tailwind for you guys now, and I wanna get your thoughts on this and your differentiation around multi-cloud that's around the corner. Yeah. I mean, everyone now recognizes at least multi-cloud is a reality. People have workloads on AWS, Azure and GCP. That is technically multi-cloud. Yeah. Now the notion of spanning applications across clouds is coming, certainly hybrid cloud is a steady state, which is essentially DevOps on prem or edge and in the cloud. So, so you have now the recognition that's here, you guys are positioned well for this. How is that evolving and how are you positioning yourself with, and how you're differentiating around, as clients start thinking, Hey, you know what, I can start running things on AWS and GCP. Yeah. And on-prem in a really kind of a distributed way. Yeah. With abstractions and these things that people are talking about, super cloud, what we call it. And, and this is really the conversation. Okay. What does that next future around-the-corner architecture look like? And how do you guys fit in, because this is an opportunity for you guys. It's almost, it's like Wayne Gretzky, the puck is coming to you. Yeah. Yeah. It seems that way to me. What, how do you respond to >>That? Yeah, no, I think, you know, Raghu said it yesterday. Right.
It's all about being cloud smart in this new heterogeneous world. I think Google cloud has always been the most open and the most customer oriented cloud. And the reason I say that is because, you know, looking at like our Kubernetes platform, right. What we've enabled with Kubernetes and Anthos is the ability for a customer to run containerized infrastructure in the same consistent manner, no matter what the platform. So while, you know, Kubernetes runs on GKE, you can run using Anthos on the VMware platform and you can run using Anthos on any other cloud on the planet, including AWS and Azure. And, and so, you know, we've taken an open approach with Kubernetes to begin with, but, you know, the fact that, you know, with Anthos and this multicloud management experience that we can provide customers, we are letting customers get the full freedom and advantage of what multicloud has to offer. And I like to say, you know, VMware is the IaaS of IaaSes, right. Cuz if you think about it, it's the only hypervisor that you can run in the same consistent manner, take the same image and run it on any of the providers. Right. And you can, you know, link it, you know, with the L2 extensions and create a fabric that spans the world and, and, and multiple >>Products, with almost every company using VMware. >>That's pretty much right. It's the largest, like the VMware network of, of infrastructure is the largest network on the planet. Right. And so, so it's, it's truly about enabling customer choice. We believe that every cloud, you know, brings its advantages and, you know, at the end of the day, the technology, you know, capabilities of the provider, the differentiation of the provider need to stand on their merit. And so, you know, we truly embrace this notion of multicloud. >>Those ops guys have opportunities to connect to you, you guys, in, in the cloud. >>Yeah.
Absolutely. >>I'd like to ask you a question sort of about database philosophy and maybe, maybe futures a little bit. There seem to be two camps. I mean, you've got multiple databases, you've got Spanner for, you know, kind of a global distributed database. You've got BigQuery for analytics. There seems to be a trend in the industry for some providers to say, okay, let's converge the transactions and analytics and kind of maybe eliminate the need to do a lot of ELT-ing, and others are saying, no, no, we want to be, you know, really precise and distinct with our capabilities and have a bespoke set of capabilities, right tool for the right job, let's call it. What's Google's philosophy in that regard? And, and how do you think about database in the future? >>So, so I think, you know, when it comes to, you know, something as general and as complex as data, right, you know, data lives in all shapes and forms, it moves at various velocities, it moves at various scales. And so, you know, we truly believe that, you know, customers should have the flexibility and freedom to put things together using, you know, these various contexts and, and, you know, build the right set of outcomes for themselves. So, you know, we provide Cloud SQL, right, where customers can run their own, you know, dedicated infrastructure, fully managed and operated by Google at a high level of SLA compared to any other way of doing it. We have a data warehouse born in the cloud, BigQuery, which enables zero ops, you know, zero touch, you know, instant, you know, high performance analytics at scale. You know, Spanner gives customers high levels of reliability and redundancy in a worldwide context, with extreme levels of innovation coming from, you know, the NTP, you know, that happens across different instances. Right?
So I, you know, we do think that, you know, data moves at different scales and different velocities, and, you know, customers have a complex set of needs. And, and so our portfolio of database services put together can truly address all ends of the spectrum. >>Yeah. And we've certainly been following you guys at CNCF and the work that Google cloud's doing, extremely strong technical people. Yeah. Really open source focused, great products, technology. You guys do a great job. And I, I would imagine, and it's clear that VMware is an opportunity for you guys, given the DNA of their customer base. The installed base is huge. You guys have that nice potential connection where these customers are kind of going where the puck is going. You guys are there. Now for the next couple minutes, give a plug for Google cloud to the VMware customer base out there. Yeah. Why Google cloud, why now, what's in it for them? What's the value prop? Give the plug for Google cloud to the VMware community. >>Absolutely. So, so I think, you know, especially with VMware engine, what we've built, you know, is truly like a cloud native, next generation enterprise platform. Right. And it does three specific things, right? It gives you a cloud optimized experience, right? Like the, the idea being, you know, self-service, efficiencies, economies, you know, operational benefits, you get that from the platform, and a customer like Mitel was able to take advantage of that. Being able to use the same platform that they were running in their co-located context and migrate more than a thousand VMs in less than 90 days, something that they weren't able to do for, for over two years. The second aspect of our, you know, our transformation journey that we enable with this service is cloud integration. What that means is the same VPC experience that you get in the global networking that Google cloud has to offer.
The VMware platform is fully integrated into that. And so the benefits of, you know, having a subnet that can live anywhere in the world, you know, having multi VPC, but more importantly, the benefits of having these Google cloud services like BigQuery and Spanner and cloud operations management at your fingertips in the same layer three domain, you know, just make an IP call and your data is transformed into BigQuery from your operational databases. And Carrefour, the retailer in Europe, actually was able to do that with our service. And not only that, you know, do the operational transform into BigQuery, you know, from the data gravity living on VMware engine, but they were able to do it in, you know, a cost effective manner. They saved, you know, over 40% compared to the, the current context, and also lowered their costs and increased the agility of operations at the same time. >>Right. And so for them, this was extremely transformative. And lastly, we believe in the context of being open, we are also a very partner friendly cloud. And so, you know, customers come bring the VMware platform because of all the, you know, ecosystem that comes along with it, right. You've got your Veeam or your Zerto or your Rubrik or your Cohesity for data protection and, and backup. You've got security from the likes of Fortinet, you know, you've got, you know, like we'd already talked about, NetApp storage. So we, you know, we are open in that technology context, ISVs, you know, fully supported. >>Integrations are key. Yeah, >>Yeah, exactly. And, and, you know, that's how you build a platform, right? Yeah. And so, so we enable that, but, but, you know, we also enable customers going into the future, through the AI capabilities and services that are once again available at, at their fingertips. >>So, thanks for coming on. Really appreciate it.
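The ELT pattern described above, landing operational data in the warehouse first and transforming it there, can be sketched generically. This is a minimal simulation only: it uses SQLite in place of both the operational database and BigQuery, and the table and column names are invented for illustration, not taken from any Google API.

```python
import sqlite3

# Stand-in "operational database" (in a real setup: Cloud SQL or an in-VM database).
oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "EU", 120.0), (2, "EU", 80.0), (3, "US", 50.0)])

# Stand-in "warehouse" (in a real setup: BigQuery).
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE orders_raw (id INTEGER, region TEXT, amount REAL)")

# Extract + Load: copy raw rows first, transform afterwards (the "ELT" order).
rows = oltp.execute("SELECT id, region, amount FROM orders").fetchall()
dwh.executemany("INSERT INTO orders_raw VALUES (?, ?, ?)", rows)

# Transform inside the warehouse: aggregate revenue by region.
summary = dwh.execute(
    "SELECT region, SUM(amount) FROM orders_raw GROUP BY region ORDER BY region"
).fetchall()
print(summary)  # [('EU', 200.0), ('US', 50.0)]
```

In an actual Google cloud setup, the load step would be handled by a federated query or a transfer service rather than an in-process copy; the shape of the pipeline is the point here, not the mechanism.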
And, you know, as supercloud, as we call it, or multi-cloud comes around the corner, you got the edge exploding, you guys do a great job in networking and security, which is well known. What's your view of this supercloud, multi-cloud world? What's different about it? Why isn't it just SaaS on cloud? What's this next gen cloud really about? If you had to kind of explain that to, to business folks and technical folks out there, is it, is it something unique? Do you see a, a refactoring? Is it something that does something different? Yeah. What, what makes it not just SaaS? >>Yeah. Yeah. No, I think that, you know, there's, there's different use cases that customers have, have in mind when they, when they think about multi-cloud. I think the first thing is they don't want to have, you know, all eggs in a single basket. Right. And, and so, you know, it, it helps diversify their risk. I mean, and it's a real problem. Like you, you see outages in, you know, in, in availability zones that take out entire businesses. So customers do wanna make sure that they're, they're able to increase their availability, increase their resiliency through the use of multiple providers. But I think, so that's like getting the same thing in different contexts. But at the same time, the context is shifting, right? There are some data sources that originate, you know, elsewhere, and the scale and the velocity of those sources is so vast. You know, you might be producing video from retail stores and, you know, you wanna make sure, you know, there's security and there's, you know, information awareness built about those sources. >>And so you want to process that data at the source and take instant decisions with that proximity.
And that's why we believe with GDC, you know, with both the edge versions and the hosted versions (GDC stands for Google Distributed Cloud), where we bring the benefit and value of Google cloud to different locations on the edge, as well as on-prem. And so I think, you know, those kinds of contexts become important. And so I think, you know, not only do we need to be open and pervasive, you know, but we also need to be compatible and also have the proximity to where information lives and value lives. >>Manoj, thanks for coming on the cube here at VMware Explorer, formerly VMworld. Thanks for your time. >>Thank you so much. Okay. >>This is the cube. I'm John Furrier with Dave Vellante, live day two coverage here in the Moscone West lobby for VMware Explorer. We'll be right back with more after the short break.

Published Date : Aug 31 2022


SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Europe | LOCATION | 0.99+
Google | ORGANIZATION | 0.99+
Raghu | PERSON | 0.99+
San Francisco | LOCATION | 0.99+
Manoj Sharma | PERSON | 0.99+
October 11th | DATE | 0.99+
Wayne Gretsky | PERSON | 0.99+
October | DATE | 0.99+
two words | QUANTITY | 0.99+
two parts | QUANTITY | 0.99+
Amazon | ORGANIZATION | 0.99+
John | PERSON | 0.99+
less than 90 days | QUANTITY | 0.99+
BigQuery | TITLE | 0.99+
Dave | PERSON | 0.99+
12 year | QUANTITY | 0.99+
second aspect | QUANTITY | 0.99+
yesterday | DATE | 0.99+
AWS | ORGANIZATION | 0.99+
CNCF | ORGANIZATION | 0.99+
2022 | DATE | 0.99+
20 years | QUANTITY | 0.99+
both | QUANTITY | 0.99+
more than a thousand VMs | QUANTITY | 0.99+
two sets | QUANTITY | 0.99+
both tools | QUANTITY | 0.99+
one | QUANTITY | 0.98+
over two years | QUANTITY | 0.98+
VMware | ORGANIZATION | 0.98+
One | QUANTITY | 0.98+
Coon | ORGANIZATION | 0.98+
three days | QUANTITY | 0.98+
both worlds | QUANTITY | 0.98+
first thing | QUANTITY | 0.98+
third place | QUANTITY | 0.98+
Moscone | LOCATION | 0.98+
over 40% | QUANTITY | 0.98+
first place | QUANTITY | 0.97+
Anthos | TITLE | 0.97+
GDC | ORGANIZATION | 0.96+
NetApp | TITLE | 0.96+
two camps | QUANTITY | 0.96+
VMware Explorer | ORGANIZATION | 0.95+
first address | QUANTITY | 0.95+
single step | QUANTITY | 0.95+
Kubernetes | TITLE | 0.95+
VMware | TITLE | 0.93+
single basket | QUANTITY | 0.93+
GCP | ORGANIZATION | 0.93+
tier two | QUANTITY | 0.92+
Mitel | ORGANIZATION | 0.92+
SQL | TITLE | 0.91+
single site | QUANTITY | 0.91+
OnPrem | ORGANIZATION | 0.91+
Google VMware | ORGANIZATION | 0.9+
Forex | ORGANIZATION | 0.88+
day one | QUANTITY | 0.88+
pandemic | EVENT | 0.87+
ISAs | TITLE | 0.87+
three specific things | QUANTITY | 0.86+
VMware Explorer | ORGANIZATION | 0.86+
Antho | TITLE | 0.86+

David Linthicum, Deloitte US | Supercloud22


 

(bright music) >> "Supermetafragilisticexpialadotious." What's in a name? In an homage to the inimitable Charles Fitzgerald, we've chosen this title for today's session because of all the buzz surrounding "supercloud," a term that we introduced last year to signify a major architectural trend and shift that's occurring in the technology industry. Since that time, we've published numerous videos and articles on the topic, and on August 9th, kicked off "Supercloud22," an open industry event designed to advance the supercloud conversation, gathering input from more than 30 experienced technologists and business leaders in "The Cube" and broader technology community. We're talking about individuals like Benoit Dageville, Kit Colbert, Ali Ghodsi, Mohit Aron, David McJannet, and dozens of other experts. And today, we're pleased to welcome David Linthicum, who's a Chief Strategy Officer of Cloud Services at Deloitte Consulting. David is a technology visionary, a technical CTO. He's an author and a frequently sought after keynote speaker at high profile conferences like "VMware Explore" next week. David Linthicum, welcome back to "The Cube." Good to see you again. >> Oh, it's great to be here. Thanks for the invitation. Thanks for having me. >> Yeah, you're very welcome. Okay, so this topic of supercloud, what you call metacloud, has created a lot of interest. VMware calls it cross-cloud services, Snowflake calls it their data cloud, there's a lot of different names, but recently, you published a piece in "InfoWorld" where you said the following. "I really don't care what we call it, "and I really don't care if I put "my own buzzword into the mix. "However, this does not change the fact "that metacloud is perhaps the most important "architectural evolution occurring right now, "and we need to get this right out of the gate. "If we do that, who cares what it's named?" So very cool. 
And you also mentioned in a recent article that you don't like to put new terms out in the wild without defining them. So what is a metacloud, or what we call supercloud? What's your definition? >> Yeah, and again, I don't care what people call it. The reality is it's the ability to have a layer of cross-cloud services. It sits above existing public cloud providers. So the idea here is that instead of building different security systems, different governance systems, different operational systems in each specific cloud provider, using whatever native features they provide, we're trying to do that in a cross-cloud way. So in other words, we're pushing out data integration, security, all these other things that we have to take care of as part of deploying a particular cloud provider. And in a multicloud scenario, we're building those in and between the clouds. And so we've been tracking this for about five years. We understood that multicloud is not necessarily about the particular public cloud providers, it's about things that you build in and between the clouds. >> Got it, okay. So I want to come back to that, to the definition, but I want to tie us to the so-called multicloud. You guys did a survey recently. We've said that multicloud was mostly a symptom of multi-vendor, Shadow Cloud, M&A, and only recently has become a strategic imperative. Now, Deloitte published a survey recently entitled "Closing the Cloud Strategy, Technology, Innovation Gap," and I'd like to explore that a little bit. And so in that survey, you showed data. What I liked about it is you went beyond what we all know, right? The old, "Our research shows that on average, X number of clouds are used at an individual company." I mean, you had that too, but you really went deeper. You identified why companies are using multiple clouds, and you developed different categories of practitioners across 500 survey respondents.
But the reasons were very clear for "why multicloud," as this becomes more strategic. Service choice, scale, negotiating leverage, improved business resiliency, minimizing lock-in, interoperability of data, et cetera. So my question to you, David, is what's the problem supercloud or metacloud solves, and what's different from multicloud? >> That's a great question. The reality is that if we're... Well, supercloud or metacloud, whatever, is really something that exists above a multicloud, but I kind of view them as the same thing. It's an architectural pattern. We can name it anything. But the reality is that if we're moving to these multicloud environments, we're doing so to leverage best of breed things. In other words, best of breed technology to enable the innovators within the company to take the business to the next level, and we determined that in the survey. And so if we're looking at what a multicloud provides, it's the ability to provide different choices of different services or piece parts that allows us to build anything that we need to do. And so what we found in the survey, and what we found in just practice in dealing with our clients, is that ultimately, the value of cloud computing is going to be the innovation aspects. In other words, the ability to take the company to the next level by being more innovative and more disruptive in the marketplace that they're in. And the only way to do that, instead of basically leveraging the services of a particular walled garden of a single public cloud provider, is to cast a wider net and get out and leverage all kinds of services to make these happen. So if you think about that, that's basically how multicloud has evolved. In other words, it wasn't planned. They didn't say, "We're going to go do a multicloud." It was different developers and innovators in the company that went off and leveraged these cloud services, sometimes with the consent of IT leadership, sometimes not.
And now we have these multitudes of different services that we're leveraging. And so many of these enterprises are going from 1000 to, say, 3000 services under management. That creates a complexity problem. We have a problem of heterogeneity, different platforms, different tools, different services, different AI technology, database technology, things like that. So the metacloud, or the supercloud, or whatever you want to call it, is the ability to deal with that complexity on the complexity's terms. And so instead of building all these various things that we have to do individually in each of the cloud providers, we're trying to do so within a cross-cloud service layer. We're trying to create this layer of technology, which removes us from dealing with the complexity of the underlying multicloud services and makes it manageable. Because right now, I think we're getting to a point of complexity we just can't operate at the budgetary limits that we have right now. We can't keep the number of skills around, the number of operators around, to keep these things going. We're going to have to get creative in terms of how we manage these things, how we manage a multicloud. And that's where the supercloud, metacloud, whatever they want to call it, comes in.
We've seen examples a little bit, Outposts, Anthos, Azure Arc, but the hyperscalers really aren't building superclouds or metaclouds, at least today, are they? >> No, they're not. And I always have the prediction for every major cloud conference that this is the conference where the hyperscaler is going to figure out some sort of a cross-cloud strategy. In other words, building services that are able to operate across clouds. That really has never happened. It has happened in dribs and drabs, and you just mentioned a few examples of that, but the ability to own the space, to understand that we're not going to be the center of the universe, that how people are going to leverage it is going to be multiple things, including legacy systems and other cloud providers, and even industry clouds that are emerging these days, and SaaS providers, and all these things. So we're going to assist you in dealing with complexity, and we're going to provide the core services of being there. That hasn't happened yet. And they may be worried about conflicting with their market, and the messaging is a bit different, even actively pushing back on the concept of multicloud, but the reality is the market's going to take them there. So in other words, if enough of their customers are asking for this and asking that they take the lead in building these cross-cloud technologies, even if they're participating in the stack and not being the stack, it's too compelling a market not to drag a lot of the existing public cloud providers there.
But I want to ask you, what are the essential elements of supercloud, coming back to the definition, if you will, and what's different about metacloud, as you call it, from plain old SaaS or PaaS? What are the key elements there? >> Well, the key elements would be holistic management of all of the IT infrastructure. So even though it's sitting above a multicloud, I view metacloud, supercloud as the ability to also manage your existing legacy systems, your existing security stack, your existing network operations, basically everything that exists under the purview of IT. If you think about it, we're moving our infrastructure into the clouds, and we're probably going to hit a saturation point of about 70%. And really, the supercloud, metacloud, which is going to be expensive to build for most enterprises, needs to support these things holistically. So it needs to have all the services that are going to be shareable across the different providers, and also existing legacy systems, and also edge computing, and IoT, and all these very diverse systems that we're building there right now. So if complexity is a core challenge to operate these things at scale, and the ability to secure these things at scale, we have to have commonality in terms of security architecture and technology, commonality in terms of our directory services, commonality in terms of network operations, commonality in terms of cloud operations, commonality in terms of FinOps. All these things should exist in some holistic cross-cloud layer that sits above all this complexity. And you pointed out something very profound. In other words, that is going to mean that we're hiding a lot of the existing cloud providers in terms of their interfaces and dashboards and things like that that we're dealing with today, their APIs. But the reality is that if we're able to manage these things at scale, the public cloud providers are going to benefit greatly from that.
They're going to sell more services because people are going to find they're able to leverage them easier. And so in other words, if we're removing the complexity wall, which many in the industry are calling it right now, then suddenly we're moving from, say, the 25 to 30% migrated in the cloud, which most enterprises are today, to 50, 60, 70%. And we're able to do this at scale, and we're doing it at scale because we're providing some architectural optimization through the supercloud, metacloud layer. >> Okay, thanks for that. David, I just want to tap your CTO brain for a minute. At "Supercloud22," we came up with these three deployment models. Kit Colbert put forth the idea that one model would be your control planes running in one cloud, let's say AWS, but it interacts with and can manage and deploy on other clouds, the Kubernetes Cluster Management System. The second one, Mohit Aron from Cohesity laid out, where you instantiate the stack on different clouds and different cloud regions, and then you create a layer, a common interface across those. And then Snowflake was the third deployment model where it's a single global instance, it's one instantiation, and basically building out their own cloud across these regions. Help us parse through that. Do those seem like reasonable deployment models to you? Do you have any thoughts on that? >> Yeah, I mean, that's a distributed computing trick we've been doing, which is, in essence, an agent of the supercloud that's carrying out some of the cloud native functions on that particular cloud, but is, in essence, a slave to the metacloud, or the supercloud, whatever, that's able to run across the various cloud providers. In other words, when it wants to access a service, it may not go directly to that service. It goes directly to the control plane, and that control plane is responsible... Very much like Kubernetes and Docker works, that control plane is responsible for reaching out and leveraging those native services. 
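The agent model described here, where a control plane never calls a cloud's native services directly but delegates to a per-cloud agent, can be sketched in a few lines. This is an illustrative skeleton under assumed names (CloudAgent, ControlPlane, the provider strings are placeholders); it is not any vendor's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class CloudAgent:
    """Runs 'inside' one provider and wraps its native services."""
    provider: str
    log: list = field(default_factory=list)

    def provision(self, service: str) -> str:
        # In a real system this would call the provider's native API.
        self.log.append(service)
        return f"{service}@{self.provider}"

class ControlPlane:
    """The supercloud layer: callers ask it, never the clouds directly."""
    def __init__(self, agents):
        self.agents = {a.provider: a for a in agents}

    def provision(self, service: str, provider: str) -> str:
        # Delegate to the agent that lives in the target cloud.
        return self.agents[provider].provision(service)

cp = ControlPlane([CloudAgent("aws"), CloudAgent("gcp")])
print(cp.provision("object-store", "gcp"))  # object-store@gcp
```

The design choice mirrors the Kubernetes analogy in the answer above: one control plane holds the cross-cloud logic, while each agent stays small and cloud-local.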
I think that that's thinking that's a step in the right direction. I think these things unto themselves, at least initially, are going to be a very complex array of technology. Even though we're trying to remove complexity, the supercloud unto itself, in terms of the ability to build this thing that's able to operate at scale across-cloud, is going to be a collection of many different technologies that are interfacing with the public cloud providers in different ways. And so we can start putting these meta architectures together, and I certainly have written and spoke about this for years, but initially, this is going to be something that may escape the detail or the holistic nature of these meta architectures that people are floating around right now. >> Yeah, so I want to stay on this, because anytime I get a CTO brain, I like to... I'm not an engineer, but I've been around a long time, so I know a lot of buzzwords and have absorbed a lot over the years, but so you take those, the second two models, the Mohit instantiate on each cloud and each cloud region versus the Snowflake approach. I asked Benoit Dageville, "Does that mean if I'm in "an AWS east region and I want to do a query on Azure West, "I can do that without moving data?" And he said, "Yes and no." And the answer was really, "No, we actually take a subset of that data," so there's the latency problem. From those deployment model standpoints, what are the trade-offs that you see in terms of instantiating the stack on each individual cloud versus that single instance? Is there a benefit of the single instance for governance and security and simplicity, but a trade-off on latency, or am I overthinking this? >> Yeah, you hit it on the nose. The reality is that the trade-off is going to be latency and performance. 
If we get wiggy with the distributed nature, like the distributed data example you just provided, we have to basically separate the queries and communicate with the databases on each instance, and then reassemble the result set that goes back to the people who are requesting it. And so we can do caching systems and things like that. But the reality is, if it's a distributed system, we're going to have latency and bandwidth issues that are going to be limiting us. And also security issues, because if we're moving lots of information over the open internet, or even private circuits, those are going to be attack vectors that hackers can leverage. You have to keep that in mind. We're trying to reduce those attack vectors. So it would be, in many instances, and I think we have to think about this, that we're going to keep the data in the same physical region for just that reason. In other words, it's going to provide the best performance and also the most simplistic way of dealing with security. And so we're not, in essence, thinking about where the data's going, how it's moving across things, things like that. So the challenge, when you're dealing with a supercloud or metacloud, is: when do you make those decisions? And I think, in many instances, even though we're leveraging multiple databases across multiple regions and multiple public cloud providers, and that's the idea of it, we're still going to localize the data for performance reasons. I mean, I just wrote a blog in "InfoWorld" a couple of months ago and talked about people who are trying to distribute data across different public cloud providers for different reasons, distribute an application development system, things like that. You can do it. With enough time and money, you can do anything. I think the challenge is going to be operating that thing, and also providing a viable business return based on the application. 
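The distributed-query pattern described here is essentially scatter-gather: split the query across per-region databases, run the parts concurrently, and reassemble the result set, with end-to-end latency gated by the slowest region. A minimal sketch, with made-up region names and data standing in for real databases:

```python
# Minimal scatter-gather sketch of the distributed-query example above: the
# query fans out to per-region stores in parallel, and the partial result
# sets are reassembled for the caller. Region names and rows are invented.
from concurrent.futures import ThreadPoolExecutor

REGIONS = {
    "aws-east":   [("alice", 120), ("bob", 80)],
    "azure-west": [("carol", 200)],
}

def query_region(region, min_score):
    # Stand-in for a per-region database query.
    return [row for row in REGIONS[region] if row[1] >= min_score]

def federated_query(min_score):
    # Scatter: one sub-query per region, run concurrently. End-to-end
    # latency is gated by the slowest region -- the trade-off discussed above.
    with ThreadPoolExecutor() as pool:
        parts = pool.map(lambda r: query_region(r, min_score), REGIONS)
    # Gather: reassemble the partial result sets into one answer.
    return sorted(row for part in parts for row in part)

print(federated_query(100))  # [('alice', 120), ('carol', 200)]
```

Caching, as mentioned above, can hide some of this, but the structural cost of the reassembly step is what pushes real deployments toward keeping data in one physical region.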
And so while it may look like a good science experiment, and it's cool unto itself as an architect, the reality is the more pragmatic approach is going to be to leave it in a single region on a single cloud. >> Very interesting. The other reason I like to talk to companies like Deloitte and experienced people like you is 'cause I can get... You're agnostic, right? I mean, you're technology agnostic, vendor agnostic. So I want to come back with another question, which is, how do you deal with what I call the lowest common denominator problem? What I mean by that is if one cloud has, let's say, a superior service... Let's take an example of Nitro and Graviton. AWS seems to be ahead on that, but let's say some other cloud isn't quite there yet, and you're building a supercloud or a metacloud. How do you rationalize that? Does it have to be like a caravan in the army where you slow down so all the slowest trucks can keep up, or are there ways to adjudicate that which are advantageous, to hide that deficiency? >> Yeah, and that's a great thing about leveraging a supercloud or a metacloud: we're putting that management in a single layer. So as far as a user or even a developer on those systems, they shouldn't worry about the performance that may come back, because we're dealing with the... You hit the nail on the head with that one. The slowest component is the one that dictates performance. And so we have to have some sort of a performance management layer. We're also making dynamic decisions to move data, to move processing, from one server to the other to try to minimize the amount of latency that's coming from a single component. 
So the great thing about that is we're putting that volatility into a single domain, and it's making architectural decisions in terms of where something will run and where it's getting its data from, things are stored, things like that, based on the performance feedback that's coming back from the various cloud services that are under management. And so if you're running across clouds, it becomes even more interesting, because ultimately, you're going to make some architectural choices on the fly in terms of where that stuff runs based on the active dynamic performance that that public cloud provider is providing. So in other words, we may find that it automatically shut down a database service, say MySQL, on one cloud instance, and moved it to a MySQL instance on another public cloud provider because there was some sort of a performance issue that it couldn't work around. And by the way, it does so dynamically. Away from you making that decision, it's making that decision on your behalf. Again, this is a matter of abstraction, removing complexity, and dealing with complexity through abstraction and automation, and this is... That would be an example of fixing something with automation, self-healing. >> When you meet with some of the public cloud providers and they talk about on-prem private cloud, the general narrative from the hyperscalers is, "Well, that's not a cloud." Should on-prem be inclusive of supercloud, metacloud? >> Absolutely, I mean, and they're selling private cloud instances with the edge cloud that they're selling. The reality is that we're going to have to keep a certain amount of our infrastructure, including private clouds, on premise. 
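The self-healing placement decision described above, where the layer re-homes a service based on performance feedback, might look like this in miniature. The `PlacementManager` class, the latency threshold, and the provider names are all illustrative assumptions, not a real system:

```python
# Sketch of the automated placement decision described above: the supercloud
# layer watches per-provider latency feedback and, when a service's current
# home degrades past a threshold, re-homes it to the fastest alternative --
# away from the user making that decision. Numbers are invented.

class PlacementManager:
    def __init__(self, placements):
        self.placements = placements      # service -> current provider
        self.latency_ms = {}              # (service, provider) -> observed latency

    def report(self, service, provider, ms):
        self.latency_ms[(service, provider)] = ms

    def rebalance(self, service, threshold_ms=100):
        current = self.placements[service]
        if self.latency_ms.get((service, current), 0) <= threshold_ms:
            return current                # healthy: stay put
        # Degraded: move to the lowest-latency provider we have feedback for.
        candidates = {p: ms for (s, p), ms in self.latency_ms.items() if s == service}
        self.placements[service] = min(candidates, key=candidates.get)
        return self.placements[service]

mgr = PlacementManager({"mysql": "cloud-a"})
mgr.report("mysql", "cloud-a", 250)   # cloud-a is struggling
mgr.report("mysql", "cloud-b", 40)
print(mgr.rebalance("mysql"))         # cloud-b
```

This is the "fixing something with automation" idea in the passage: the abstraction layer absorbs the volatility, and the caller only ever sees the service, not where it runs.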
It's something that's shrinking as a market share, and it's going to be tougher and tougher to justify as the public cloud providers become better and better at what they do, but we certainly have edge clouds now, and hyperscalers have examples of that where they run an instance of their public cloud infrastructure on premise, on physical hardware and software. And the reality is, too, we have data centers and we have systems that just won't go away for another 20 or 30 years. They're just too sticky. They're economically unviable to move into the cloud. That's the core thing. It's not that we can't do it. The fact of the matter is we shouldn't do it, because there's not going to be an economic... There's not going to be an economic incentive to make that happen. So if we're going to create this meta layer or this infrastructure which is going to run across clouds, and everybody agrees that's what the supercloud is, we have to include the on-premise systems, including private clouds, including legacy systems. And by the way, include the rising number of IoT systems that are out there, and edge-based systems out there. So we're managing it using the same infrastructure and cloud services. So they have metadata systems and they have specialized services, serving finance and retail and doing things like risk analytics. So it gets them further down that path, but not necessarily giving them a SaaS application where they're forced into all of the business processes. We're giving you piece parts. So we'll give you 1000 different parts that are related to the finance industry. You can assemble anything you need, but the thing is, it's not going to be like building it from scratch. We're going to give you risk analytics, we're giving you the financial analytics, all these things that you can leverage within your applications how you want to leverage them. We'll maintain them. So in other words, you don't have to maintain 'em, just like a cloud service. 
And suddenly, we can build applications in a couple of weeks that used to take a couple of months, in some cases, a couple of years. So that seems to be a large part of it moving forward. So get it up in the supercloud. Those become just other services that are under management on the supercloud, the metacloud. So we're able to take those services, abstract them, assemble them, use them in different applications. And the ability to manage where those services are originated versus where they're consumed is going to be managed by the supercloud layer, where you're dealing with the governance, the service governance, the security systems, the directory systems, identity access management, things like that. They're going to get you further along down the pike, and that comes back as real value. If I'm able to build something in two weeks that used to take me two months, and I'm able to give my creators in the organization the ability to move faster, that's a real advantage. And suddenly, we are going to be valued by our digital footprint, our ability to do things in a creative and innovative way. And so organizations that are able to move that fast, leveraging cloud computing for what it should be leveraged as, a true force multiplier for the business, they're going to win the game. They're going to get the most value. They're going to be around in 20 years; the others won't. >> David Linthicum, always love talking to you. You have a dangerous combination of business and technology expertise. Let's tease "VMware Explore" next week. You're giving a keynote, for those who are going to be there. Which day are you on? >> Tuesday. Tuesday, 11 o'clock. >> All right, that's a big day. Tuesday, 11 o'clock. And David, please do stop by "The Cube." We're in Moscone West. Love to get you on and continue this conversation. I got 100 more questions for you. Really appreciate your time. >> I always love talking to people at "The Cube." Thank you very much. 
>> All right, and thanks for watching our ongoing coverage of "Supercloud22" on "The Cube," your leader in enterprise tech and emerging tech coverage. (bright music)

Published Date : Aug 24 2022


Ed Walsh, ChaosSearch | AWS re:Inforce 2022


 

(upbeat music) >> Welcome back to Boston, everybody. This is the birthplace of theCUBE. In 2010, May of 2010 at EMC World, right in this very venue, John Furrier called it the chowder and lobster post. I'm Dave Vellante. We're here at RE:INFORCE 2022, Ed Walsh, CEO of ChaosSearch. Doing a drive by Ed. Thanks so much for stopping in. You're going to help me wrap up in our final editorial segment. >> Looking forward to it. >> I really appreciate it. >> Thank you for including me. >> How about that? 2010. >> That's amazing. It was really in this-- >> Really in this building. Yeah, we had to sort of bury our way in, tunnel our way into the Blogger Lounge. We did four days. >> Weekends, yeah. >> It was epic. It was really epic. But I'm glad they're back in Boston. AWS was going to do June in Houston. >> Okay. >> Which would've been awful. >> Yeah, yeah. No, this is perfect. >> Yeah. Thank God they came back. You saw Boston in summer is great. I know it's been hot, And of course you and I are from this area. >> Yeah. >> So how you been? What's going on? I mean, it's a little crazy out there. The stock market's going crazy. >> Sure. >> Having the tech lash, what are you seeing? >> So it's an interesting time. So I ran a company in 2008. So we've been through this before. By the way, the world's not ending, we'll get through this. But it is an interesting conversation as an investor, but also even the customers. There's some hesitation but you have to basically have the right value prop, otherwise things are going to get sold. So we are seeing longer sales cycles. But it's nothing that you can't overcome. But it has to be something not nice to have, has to be a need to have. But I think we all get through it. And then there is some, on the VC side, it's now buckle down, let's figure out what to do which is always a challenge for startup plans. >> In pre 2000 you, maybe you weren't a CEO but you were definitely an executive. 
And so now it's different and a lot of younger people haven't seen this. You've got interest rates now rising. Okay, we've seen that before, but it looks like you've got inflation, you've got interest rates rising. >> Yep. >> The consumer spending patterns are changing. You had $6, $7 gas at one point. So you have these weird crosscurrents, >> Yup. >> And people are thinking, "Okay, post-September now, maybe because of the recession, the Fed won't have to keep raising interest rates and tightening." But I don't know what to root for. It's like half full, half empty. (Ed laughing) >> But we haven't been in an environment with high inflation. At least not in my career. >> Right. Right. >> I mean, I got in in '92, like that was long gone, right? >> Yeah. >> So it is an interesting regime change that we're going to have to deal with, but there's a lot of analogies between 2008 and now that you still have to work through too, right? So, anyway, I don't think the world's ending. I do think you have to run a tight shop. So I think the grow-at-all-costs is gone. I do think discipline's back in, which, for most of us, discipline never left, right? So, to me that's the name of the game. >> What do you tell, just generally, I mean you've been the CEO of a lot of private companies. And of course one of the things that you do to retain people and attract people is you give 'em stock, and it's great and everybody's excited. >> Yeah. >> I'm sure they're excited 'cause you guys are a rocket ship. But so what's the message now that, okay, the market's down, valuations are down, the trees don't grow to the moon, we all know that. But what are you telling your people? What's their reaction? How do you keep 'em motivated? >> So like anything, you want to over communicate during these times. So I actually over communicate, you get all these, you know, the Sequoia decks, 2008 and the recent... >> (chuckles) Rest in peace good times, that one, right? >> I literally share it. Why? 
It's like, Hey, this is what's going on in the real world. It's going to affect us. It has almost nothing to do with us specifically, but it will affect us. Now, we can't not pay attention to it. It does change how you're going to raise money, so you've got to make sure you have the right runway to be there. So it does change what you do, but I think you over communicate. So that's what I've been doing, and I think of it more like a student of the game, so I try to share it. Some appreciate it, others... I'm just saying, this is normal, we'll get through this, and this is what happened in 2008. And trust me, once the market hits bottom, give it another month afterwards, then everyone says, oh, the bottom's in and we're back to business. Valuations don't go immediately back up, but right now, no one knows where the bottom is, and that's where you get kind of the world's-ending type of talk. >> Well, it's interesting because you talked about, I said rest in peace good times >> Yeah >> that was the Sequoia deck, and the message was tighten up. Okay, and I'm not saying you shouldn't tighten up now, but the difference is, there was this period of two years of easy money, and even before that, it was pretty easy money. >> Yeah. >> And so companies are well capitalized, they have runway, so it's like, okay, I was talking to Frank Slootman about this, now of course they're public companies, like we're not taking the foot off the gas. We're inherently profitable, >> Yeah. >> we're growing like crazy, we're going for it. You know? So that's a little bit of a different dynamic. There's a lot of good runway out there, isn't there? >> But also you look at the different companies that were either born in or were able to power through those environments; they're actually better off. You come out stronger, in a more dominant position. So Frank, listen, if you see what Frank's done, it's been unbelievable to watch his career, right? 
In fact, he was at Data Domain, I was at Avamar, so... but look at what he's done since; he's crushed it. Right? >> Yeah. >> So for him to say, Hey, I'm going to literally hit the gas and keep going. I think that's the right thing for Snowflake and the right thing for a lot of people. But for people in different roles, I literally say that you have to take it seriously. What you can't be is, well, Frank's in a different situation. What is it...? How many billion does he have in the bank? So it's... >> He's over a billion, you know, over a billion. Well, you're on your way, Ed. >> No, no, no, it's good. (Dave chuckles) Okay, I want to ask you about this concept, this term we coined called Supercloud. >> Sure. >> You could think of it as the next generation of multi-cloud. The basic premise is that multi-cloud was largely a symptom of multi-vendor. Okay. I've done some M&A, I've got some Shadow IT spinning up, you know, Shadow clouds, projects. But it really wasn't a strategy to have a continuum across clouds. And now we're starting to see ecosystems really build, you know, you've used the term before, standing on the shoulders of giants, you've used that a lot. >> Yep. >> And so we're seeing that. Jerry Chen wrote a seminal piece on Castles in the Cloud, so we coined this term Supercloud to connote this abstraction layer that hides the underlying complexities and primitives of the individual clouds, and then adds value on top of it and can adjudicate and manage, irrespective of physical location: Supercloud. >> Yeah. >> Okay. What do you think about that concept? How does it maybe relate to some of the things that you're seeing in the industry? >> So, standing on shoulders of giants, right? So I always like to do hard tech, either at big companies or small companies. So we're probably your definition of a Supercloud. We had a big vision, how to literally solve the core challenge of analytics at scale. How are you going to do that? 
You're not going to build it on your own. So literally we're leveraging the primitives, everything you can get out of the Amazon cloud, everything you can get out of the Google cloud. In fact, we're even looking at what we can get out of the Snowflake cloud, and how do we abstract that out, add value to it? That's where all our patents are. But it becomes a simplified approach. The customers don't care. Well, they care where their data is. But they don't care how you got there; they just want to know the end result. So you simplify, but you gain the advantages. One thing's interesting is, in this particular company, ChaosSearch, people always say, at some point in the sales cycle, no way, hold on, no way that can be that fast, or whatever the different issue. And initially we used to try to explain our technology, and I would say 60% was explaining the public cloud capabilities and then how we harvest those, I guess, make them better, add value on top, and what you're able to get is something you couldn't get from the public clouds themselves, and then how we did that across public clouds and then abstracted it. So if you think about that, it's the shoulders of giants. But what we now do, literally to avoid that conversation, because it became a lengthy conversation: how do you have a platform for analytics that you can't possibly overwhelm on ingest? All your messy data, no pipelines. Well, you leverage things like S3 and EC2, and you do the different security things. You can go to environments and say, you can't possibly overrun me. I could not say that if I didn't literally build on the shoulders of giants of all these public clouds. But the value... So if you're going to do hard tech as a startup, you're going to build, you're going to be on the principles of Supercloud. 
Maybe they're not the same size of Supercloud, just looking at Snowflake, but basically, you're going to leverage all that, you abstract it out, and that's where you're able to add a lot of value on top of it. >> So let me ask you, so I don't know if there's a strict definition of Supercloud. We sort of put it out to the community and said, help us define it. So you've got to span multiple clouds. It's not just running in each cloud. There's a metadata layer that kind of understands where you're pulling data from. Like you said, you can pull data from Snowflake; it sounds like we're not running on Snowflake, correct? >> No, complementary to them and their different customers. >> Yeah. Okay. >> They want to build on top of a data platform, data apps. >> Right. And of course they're going cross cloud. >> Right. >> Is there a PaaS layer in there? We've said there's probably a Super PaaS layer. You're probably not doing that, but you're allowing people to bring their own, bring-your-own-PaaS sort of thing, maybe. >> So we're a little bit different, but basically we publish open APIs. We don't have a user interface. We say, keep the user interface. Again, we're solving the challenge of analytics at scale; we're not trying to retrain your analytics, either analysts or your DevOps or your SOC or your Secop team. They use the tools they already use: Elasticsearch APIs, SQL APIs. So really they program, they build applications on top of us. Equifax is a good example. Case study coming out later on this week, after 18 months in production, but basically they're building, we provide the abstraction layer, the quote, I'm going to kill it, Jeff Tincher, who owns all of SREs worldwide, said to the effect of, Hey, I'm able to rethink what I do for my data pipelines. But then he also talked about how he really doesn't have to worry about the data he puts in it. We deal with that. And he just has to query on the other side. That simplicity. 
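The "publish open APIs, hide the primitives" approach Walsh describes can be illustrated with a toy facade. This is not ChaosSearch's actual code; every class, method, and record here is invented for the sketch. Callers query one interface, while the cloud object stores underneath stay an implementation detail.

```python
# Illustrative sketch of "standing on the shoulders of giants": one query
# facade over whichever cloud's object store the data landed in, so callers
# never touch the provider primitives directly.

class ObjectStore:
    """Stand-in for S3 / GCS / Azure Blob -- the primitive being abstracted."""
    def __init__(self, scheme):
        self.scheme = scheme
        self.objects = {}

    def put(self, key, records):
        self.objects[key] = records

class AnalyticsFacade:
    """Single open API; the cloud of choice is an implementation detail."""
    def __init__(self):
        self.stores = {}

    def attach(self, name, store):
        self.stores[name] = store

    def search(self, term):
        # Fan out over every attached store's raw objects -- no pipelines,
        # no per-provider query syntax exposed to the caller.
        return sorted(
            rec for store in self.stores.values()
            for records in store.objects.values()
            for rec in records if term in rec
        )

facade = AnalyticsFacade()
s3 = ObjectStore("s3"); s3.put("logs/1", ["error: disk full", "ok"])
gcs = ObjectStore("gs"); gcs.put("logs/2", ["error: timeout"])
facade.attach("aws", s3); facade.attach("gcp", gcs)
print(facade.search("error"))  # ['error: disk full', 'error: timeout']
```

The design choice matches the quote above: application teams keep their own interfaces and query through the facade, and where the data physically landed stops being their problem.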
We couldn't have done that without that. So anyway, what I like about the definition is, if you're going to do something hard in the world, why would you try to rebuild what Amazon, Google and Azure or Snowflake did? You're going to add things on top. We can still do intellectual property. We're still doing patents. So, five granted patents, all on this. But literally the abstraction layer is the simplification. The end users do not want to know that complexity, even though they ask the questions. >> And I think too, the other attribute is it's ecosystem enablement. Whereas I think, >> Absolutely >> in general, in the Multicloud 1.0 era, the ecosystem wasn't thinking about, okay, how do I build on top and abstract that. So maybe it is Multicloud 2.0; we chose to use Supercloud. So I'm wondering, we're at the security conference, >> RE:INFORCE, is there a security Supercloud? Maybe Snyk has the developer Supercloud, or maybe Okta has the identity Supercloud. I think CrowdStrike maybe not, 'cause CrowdStrike competes with Microsoft. So maybe, because Microsoft, what's interesting, Merritt Baer was just saying, look, we don't show up in the spending data for security because we're not charging for most of our security. We're not trying to make a big business. So that's kind of interesting, but is there a potential for the security Supercloud? >> So, I think so. But also, I'll give you one thing: I had, just today, at least three different conversations where everyone wants to log data. It's a little bit specific to us, but basically they want to do the security data lake. The idea of, and Snowflake talks about this too, but the idea of putting all the data in one repository, and then how do you abstract out and get value from it? Maybe not perfect, but it becomes simple to do, but hard to get value out. So the different players are going to do that. That's what we do. 
We're able to, once you land it in your S3, or it doesn't matter, cloud of choice, simple storage, we allow you to get after that data, but we take the primitives and hide them from you. And all you do is query the data, and we're spinning up stateless compute to go after it. So then if I look around the floor, there's going to be a bunch of these players. I don't think... why would someone on this floor try to recreate what Amazon or Google or Azure had? They're going to build on top of it. And now the key thing is, do you leave it in standards? And now we're open APIs. People are building on top of my open APIs, or do you try to put 'em in a walled garden? And they're in, now, your Supercloud. Our belief is, part of it is, it needs to be open access and let you go after it. >> Well. And build your applications on top of it openly. >> They come back to Snowflake. That's what Snowflake's doing. And they're basically saying, Hey, come into our proprietary environment. And the benefit is, and I think both can win. There's a big market. >> I agree. But I think the benefit of Snowflake's is, okay, we're going to have federated governance, we're going to have data sharing, you're going to have access to all the ecosystem players. >> Yep. >> And as everything's going to be controlled and you know what you're getting. The flip side of that is, Databricks is the other end >> Yeah. >> of that spectrum, which is no, no, you've got to be open. >> Yeah. >> So what's going to happen, well, what's happening clearly, is Snowflake's saying, okay, we've got Snowpark, we're going to allow Python, we're going to have Apache Iceberg. We're going to have open source tooling that you can access. By the way, it's not going to be as good as our walled garden, where the flip side of that is you get Databricks coming at it from a data science and data engineering perspective. And there's a lot of gaps in between, aren't there? >> 
Like for instance, so we didn't do Snowpark integration. But we work with people building data apps on top of Snowflake or data bricks. And what we do is, we can add value to that, or what we've done, again, using all the Supercloud stuff we're done. But we deal with the unstructured data, the four V's coming at you. You can't pipeline that to save. So we actually could be additive. As they're trying to do like a security data cloud inside of Snowflake or do the same thing in Databricks. That's where we can play. Now, we play with them at the application level that they get some data from them and some data for us. But I believe there's a partnership there that will do it inside their environment. To us they're just another large scaler environment that my customers want to get after data. And they want me to abstract it out and give value. >> So it's another repository to you. >> Yeah. >> Okay. So I think Snowflake recently added support for unstructured data. You chose not to do Snowpark because why? >> Well, so the way they're doing the unstructured data is not bad. It's JSON data. Basically, This is the dilemma. Everyone wants their application developers to be flexible, move fast, securely but just productivity. So you get, give 'em flexibility. The problem with that is analytics on the end want to be structured to be performant. And this is where Snowflake, they have to somehow get that raw data. And it's changing every day because you just let the developers do what they want now, in some structured base, but do what you need to do your business fast and securely. So it completely destroys. So they have large customers trying to do big integrations for this messy data. And it doesn't quite work, cause you literally just can't make the pipelines work. So that's where we're complimentary do it. So now, the particular integration wasn't, we need a little bit deeper integration to do that. So we're integrating, actually, at the data app layer. 
But we could, see us and I don't, listen. I think Snowflake's a good actor. They're trying to figure out what's best for the customers. And I think we just participate in that. >> Yeah. And I think they're trying to figure out >> Yeah. >> how to grow their ecosystem. Because they know they can't do it all, in fact, >> And we solve the key thing, they just can't do certain things. And we do that well. Yeah, I have SQL but that's where it ends. >> Yeah. >> I do the messy data and how to play with them. >> And when you talk to one of their founders, anyway, Benoit, he comes on the cube and he's like, we start with simple. >> Yeah. >> It reminds me of the guy's some Pure Storage, that guy Coz, he's always like, no, if it starts to get too complicated. So that's why they said all right, we're not going to start out trying to figure out how to do complex joins and workload management. And they turn that into a feature. So like you say, I think both can win. It's a big market. >> I think it's a good model. And I love to see Frank, you know, move. >> Yeah. I forgot So you AVMAR... >> In the day. >> You guys used to hate each other, right? >> No, no, no >> No. I mean, it's all good. >> But the thing is, look what he's done. Like I wouldn't bet against Frank. I think it's a good message. You can see clients trying to do it. Same thing with Databricks, same thing with BigQuery. We get a lot of same dynamic in BigQuery. It's good for a lot of things, but it's not everything you need to do. And there's ways for the ecosystem to play together. >> Well, what's interesting about BigQuery is, it is truly cloud native, as is Snowflake. You know, whereas Amazon Redshift was sort of Parexel, it's cobbled together now. It's great engineering, but BigQuery gets a lot of high marks. But again, there's limitations to everything. That's why companies like yours can exist. >> And that's why.. so back to the Supercloud. 
It allows me as a company to participate in that because I'm leveraging all the underlying pieces. Which we couldn't be doing what we're doing now, without leveraging the Supercloud concepts right, so... >> Ed, I really appreciate you coming by, help me wrap up today in RE:INFORCE. Always a pleasure seeing you, my friend. >> Thank you. >> All right. Okay, this is a wrap on day one. We'll be back tomorrow. I'll be solo. John Furrier had to fly out but we'll be following what he's doing. This is RE:INFORCE 2022. You're watching theCUBE. I'll see you tomorrow.

Published Date : Jul 26 2022



Breaking Analysis: Amping it up with Frank Slootman


 

>> From theCUBE studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> Organizations have considerable room to improve their performance without making expensive changes to their talent, their structure, or their fundamental business model. You don't need a slew of consultants to tell you what to do. You already know. What you need is to immediately ratchet up expectations, energy, urgency, and intensity. You have to fight mediocrity every step of the way. Amp it up and the results will follow. This is the fundamental premise of a hard-hitting new book written by Frank Slootman, CEO of Snowflake, and published earlier this year. It's called "Amp It Up, Leading for Hypergrowth "by Raising Expectations, Increasing Urgency, "and Elevating Intensity." Hello and welcome to this week's Wikibon CUBE Insights, powered by ETR. At Snowflake Summit last month, I was asked to interview Frank on stage about his new book. I've read it several times. And if you haven't read it, you should. Even if you have read it, in this Breaking Analysis, we'll dig deeper into the book and share some clarifying insights and nuances directly from Slootman himself from my one-on-one conversation with him. My first question to Slootman was why he wrote this book. Okay, it's kind of a common throwaway question. And how the heck did you find time to do it? It's fairly well-known that a few years ago, Slootman put up a post on LinkedIn with the title Amp It Up. It generated so much buzz and so many requests for Frank's time that he decided that the best way to efficiently scale and share his thoughts on how to create high-performing companies and organizations was to publish a book. Now, he wrote the book during the pandemic. And I joked that they must not have Netflix in Montana where he resides. In a pretty funny moment, he said that writing the book was easier than promoting it. Take a listen.
>> Denise, our CMO, you know, she just made sure that this process wasn't going to. It was more work for me to promote this book with all these damn podcasts and other crap, than actually writing the book, you know. And after a while, I was like I'm not doing another podcast. >> Now, the book gives a lot of interesting background information on Slootman's career and what he learned at various companies that he led and participated in. Now, I'm not going to go into most of that today, which is why you should read the book yourself. But Slootman, he's become somewhat of a business hero to many people, myself included. Leaders like Frank, Scott McNealy, Jayshree Ullal, and my old boss, Pat McGovern at IDG, have inspired me over the years. And each has applied his or her own approach to building cultures and companies. Now, when Slootman first took over the reins at Snowflake, I published a Breaking Analysis talking about Snowflake and what we could expect from the company now that Slootman and CFO Mike Scarpelli were back together. In that post, buried toward the end, I referenced the playbook that Frank used at Data Domain and ServiceNow, two companies that I followed quite closely as an analyst, and how it would be applied at Snowflake, that playbook if you will. Frank reached out to me afterwards and said something to the effect of, "I don't use playbooks. "I am a situational leader. "Playbooks, you know, they work in football games. "But in the military, they teach you "situational leadership." Pretty interesting learning moment for me. So I asked Frank on the stage about this. Here's what he said. >> The older you get, the more experience that you have, the more you become a prisoner of your own background because you sort of think in terms of what you know as opposed to, you know, getting outside of what you know and trying to sort of look at things like a five-year-old that has never seen this before. And then how would you, you know, deal with it? 
And I really try to force myself into I've never seen this before and how do I think about it? Because at least they're very different, you know, interpretations. And be open-minded, just really avoid that rinse and repeat mentality. And you know, I've brought people in who have worked with me before. Some of them come with me from company to company. And they were falling prey to, you know, rinse and repeat. I would just literally go like that's not what we want. >> So think about that for a moment. I mean, imagine coming in to lead a new company and forcing yourself and your people to forget what they know that works and has worked in the past, put that aside and assess the current situation with an open mind, essentially start over. Now, that doesn't mean you don't apply what has worked in the past. Slootman talked to me about bringing back Scarpelli and the synergistic relationship that they have and how they build cultures and the no BS and hard truth mentality they bring to companies. But he bristles when people ask him, "What type of CEO are you?" He says, "Do we have to put a label on it? "It really depends on the situation." Now, one of the other really hard-hitting parts of the book was the way Frank deals with who to keep and who to let go. He uses the Volkswagen tagline of drivers wanted. He says in his book, in companies there are passengers and there are drivers, and we want drivers. He said, "You have to figure out really quickly "who the drivers are and basically throw the wrong people "off the bus, keep the right people, bring in new people "that fit the culture and put them "in the right seats on the bus." Now, these are not easy decisions to make. But as it pertains to getting rid of people, I'm reminded of the movie "Moneyball." Art Howe, the manager of the Oakland A's, refused to play Scott Hatteberg at first base. So the GM, Billy Beane, played by Brad Pitt, says to Peter Brand, who was played by Jonah Hill, "You have to fire Carlos Pena."
Brand doesn't know how to fire people, and Billy Beane says, "Just keep it quick. "Tell him he's been traded and that's it." So I asked Frank, "Okay, I get it. "Like the movie, when you have the wrong person "on the bus, you just have to make the decision, "be straightforward, and do it." But I asked him, "What if you're on the fence? "What if you're not completely sure if this person "is a driver or a passenger, if he or she "should be on the bus or not on the bus? "How do you handle that?" Listen to what he said. >> I have a very simple way to break ties. And when there's doubt, there's no doubt, okay? >> When there's doubt, there's no doubt. Slootman's philosophy is you have to be emphatic and have high conviction. You know, back to the baseball analogy, if you're thinking about taking the pitcher out of the game, take 'em out. Confrontation is the single hardest thing in business according to Slootman, but you have to be intellectually honest and do what's best for the organization, period. Okay, so wow, that may sound harsh but that's how Slootman approaches it, very Belichickian if you will. But how can you amp it up on a daily basis? What's the approach that Slootman takes? We got into this conversation with a discussion about MBOs, management by objectives. Slootman in his book says he's killed MBOs at every company he's led. And I asked him to explain why. His rationale was that individual MBOs invariably end up in a discussion about relief of the MBO if the person is not hitting his or her targets. And that detracts from the organizational alignment. He said at Snowflake everyone gets paid the same way, from the execs on down. It's a key way he creates focus and energy in an organization, by creating alignment, urgency, and putting more resources into the most important things. This is especially hard, Slootman says, as the organization gets bigger. But if you do approach it this way, everything gets easier. The cadence changes, the tempo accelerates, and it works.
Now, and to emphasize that point, he said the following. Play the clip. >> Every meeting that you have, every email, every encounter in the hallway, whatever it is, is an opportunity to amp things up. That's why I use that title. But do you take that opportunity? >> And according to Slootman, if you don't take that opportunity, if you're not in the moment, amping it up, then you're thinking about your golf game or the tennis match that's going on this weekend or being out on your boat. And to the point, this approach is not for everyone. You're either built for it or you're not. But if you can bring people into the organization that can handle this type of dynamic, it creates energy. It becomes fun. Everything moves faster. The conversations are exciting. They're inspiring. And it becomes addictive. Now let's talk about priorities. I said to Frank that for me anyway, his book was an uncomfortable read. And he was somewhat surprised by that. "Really," he said. I said, "Yeah. "I mean, it was an easy read but uncomfortable "because over my career, I've managed thousands of people, "not tens of thousands but thousands, "enough to have to take this stuff very seriously." And I found myself throughout the book, oh, you know, on the one hand saying to myself, "Oh, I got that right, good job, Dave." And then other times, I was thinking to myself, "Oh wow, I probably need to rethink that. "I need to amp it up on that front." And the point is to Frank's leadership philosophy, there's no one correct way to approach all situations. You have to figure it out for yourself. But the one thing in the book that I found the hardest was Slootman challenged the reader. If you had to drop everything and focus on one thing, just one thing, for the rest of the year, what would that one thing be? Think about that for a moment. Were you able to come up with that one thing? What would happen to all the other things on your priority list? Are they all necessary? 
If so, how would you delegate those? Do you have someone in your organization who can take those off your plate? What would happen if you only focused on that one thing? These are hard questions. But Slootman really forces you to think about them and do that mental exercise. Look at Frank's body language in this screenshot. Imagine going into a management meeting with Frank and being prepared to share all the things you're working on that you're so proud of and all the priorities you have for the coming year. Listen to Frank in this clip and tell me it doesn't really make you think. >> I've been in, you know, on other boards and stuff. And I got a PowerPoint back from the CEO and there's like 15 things. They're our priorities for the year. I'm like you got 15, you got none, right? It's like you just can't decide, you know, what's important. So I'll tell you everything because I just can't figure out. And the thing is it's very hard to just say one thing. But it's really the mental exercise that matters. >> Going through that mental exercise is really important according to Slootman. Let's have a conversation about what really matters at this point in time. Why does it need to happen? And does it take priority over other things? Slootman says you have to pull apart the hairball and drive extraordinary clarity. You could be wrong, he says. And he admits he's been wrong on many things before. He, like everyone, is fearful of being wrong. But if you don't have the conversation according to Slootman, you're already defeated. And one of the most important things Slootman emphasizes in the book is execution. He said that's one of the reasons he wrote "Amp It Up." In our discussion, he referenced Pat Gelsinger, his former boss, who bought Data Domain when he was working for Joe Tucci at EMC. Listen to Frank describe the interaction with Gelsinger. >> Well, one of my prior bosses, you know, Pat Gelsinger, when they acquired Data Domain through EMC, Pat was CEO of Intel. 
And he quoted Andy Grove as saying, 'cause he was at Intel for a long time when he was a younger man. And he said no strategy is better than its execution, which I find one of the most brilliant things. >> Now, before you go changing your strategy, says Slootman, you have to eliminate execution as a potential point of failure. All too often, he says, Silicon Valley wants to change strategy without really understanding whether the execution is right. All too often companies don't consider that maybe the product isn't that great. They will frequently, for example, make a change to sales leadership without questioning whether or not there's a product fit. According to Slootman, you have to drive hardcore intellectual honesty. And as uncomfortable as that may be, it's incredibly important and powerful. Okay, one of the other contrarian points in the book was whether or not to have a customer success department. Slootman says this became really fashionable in Silicon Valley with the SaaS craze. Everyone was following and pattern matching the lead of salesforce.com. He says he's eliminated the customer success department at every company he's led that had one. Listen to Frank Slootman in his own words talk about the customer success department. >> I view the whole company as a customer success function. Okay, I'm customer success, you know. I said it in my presentation yesterday. We're a customer-first organization. I don't need a department. >> Now, he went on to say that sales owns the commercial relationship with the customer. Engineering owns the technical relationship. And oh, by the way, he always puts support inside of the engineering department because engineering has to back up support. And rather than having a separate department for customer success, he focuses on making sure that the existing departments are functioning properly. Slootman also has always been big on net promoter score, NPS. And Snowflake's is very high at 72.
And according to Slootman, it's not just the product. It's the people that drive that type of loyalty. Now, Slootman stresses amping up the big things and even the little things too. He told a story about someone who came into his office to ask his opinion about a tee shirt. And he turned it around on her and said, "Well, what do you think?" And she said, "Well, it's okay." So Frank made the point by flipping the situation. Why are you coming to me with something that's just okay? If we're going to do something, let's do it. Let's do it all out. Let's do it right and get excited about it, not just check the box and get something off your desk. Amp it up, all aspects of our business. Listen to Slootman talk about Steve Jobs and the relevance of demanding excellence and shunning mediocrity. >> He was incredibly intolerant of anything that he didn't think of as great. You know, he was immediately done with it and with the person. You know, I'm not that aggressive, you know, in that way. I'm a little bit nicer, you know, about it. But I still, you know, I don't want to give in to expediency and mediocrity. I just don't, I'm just going to fight it, you know, every step of the way. >> Now, that story was about a little thing like some swag. But Slootman talked about some big things too. And one of the major ways Snowflake was making big, sweeping changes to amp up its business was reorganizing its go-to-market around industries like financial services, media, and healthcare. Here's some ETR data that shows Snowflake's net score or spending momentum for key industry segments over time. The red dotted line at 40% is an indicator of highly elevated spending momentum. And you can see for the key areas shown, Snowflake is well above that level. And we cut this data where the response numbers were greater than 15. So not huge Ns, but large enough to have meaning. Most were in the 20s.
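Both numbers in this discussion, the NPS of 72 and the ETR net score with its 40% line, are difference-of-percentages survey metrics. As a rough illustration only (ETR's actual weighting and methodology are not described in this episode, so the formulas below are simplified assumptions), they can be computed like this:

```python
# Simplified illustrations of two survey metrics (assumed formulas; ETR's
# actual net score methodology is more nuanced than this sketch).

def net_promoter_score(ratings):
    """NPS on a 0-10 scale: % promoters (9-10) minus % detractors (0-6)."""
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / n)

def net_score(responses, min_n=15):
    """Rough net score: % of accounts increasing spend minus % decreasing.
    Returns None for small samples, mirroring the N > 15 cut described above."""
    n = len(responses)
    if n <= min_n:
        return None
    up = sum(1 for r in responses if r == "increase")
    down = sum(1 for r in responses if r == "decrease")
    return round(100 * (up - down) / n)

ratings = [10, 9, 9, 8, 10, 7, 9, 10, 3, 9]                  # toy survey data
spend = ["increase"] * 12 + ["flat"] * 4 + ["decrease"] * 2
print(net_promoter_score(ratings))   # 60
print(net_score(spend))              # 56
```

The `min_n` guard mirrors the cut mentioned above, where only segments with more than 15 responses were charted.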
Now, it's relatively uncommon to see a company that's having the success of Snowflake make this kind of non-trivial change in the middle of steep S-curve growth. Why did they make this move? Well, I think it's because Snowflake realizes that its data cloud is going to increasingly have industry diversity and unique value by industry, that ecosystems and data marketplaces are forming around industries. So the more industry affinity Snowflake can create, the stronger its moat will be. It also aligns with how the largest and most prominent global system integrators, global SIs, go to market. This is important because as companies are transforming, they are radically changing their data architecture, how they think about data, how they approach data as a competitive advantage, and they're looking at data as specifically a monetization opportunity. So having industry expertise and knowledge and aligning with those customer objectives is going to serve Snowflake and its ecosystems well in my view. Slootman even said he joined the board of Instacart not because he needed another board seat but because he wanted to get out of his comfort zone and expose himself to other industries as a way to learn. So look, we're just barely scratching the surface of Slootman's book and I've pulled some highlights from our conversation. There's so much more that I can share just even from our conversation. And I will as the opportunity arises. But for now, I'll just give you the kind of bumper sticker of "Amp It Up." Raise your standards by taking every opportunity, every interaction, to increase your intensity. Get your people aligned and moving in the same direction. If it's the wrong direction, figure it out and course correct quickly. Prioritize and sharpen your focus on things that will really make a difference. If you do these things and increase the urgency in your organization, you'll naturally pick up the pace and accelerate your company. 
Do these things and you'll be able to transform, better identify adjacent opportunities and go attack them, and create a lasting and meaningful experience for your employees, customers, and partners. Okay, that's it for today. Thanks for watching. And thank you to Alex Myerson who's on production and he manages the podcast for Breaking Analysis. Kristin Martin and Cheryl Knight help get the word out on social and in our newsletters. And Rob Hof is our EIC over at SiliconANGLE who does some wonderful and tremendous editing. Thank you all. Remember, all these episodes are available as podcasts. Wherever you listen, just search Breaking Analysis podcast. I publish each week on wikibon.com and siliconangle.com. And you can email me at david.vellante@siliconangle.com or DM me @dvellante or comment on my LinkedIn posts. And please do check out etr.ai for the best survey data in enterprise tech. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching. Be well. And we'll see you next time on Breaking Analysis. (upbeat music)

Published Date : Jul 17 2022



Jon Loyens, data.world | Snowflake Summit 2022


 

>> Good morning, everyone. Welcome back to theCUBE's coverage of Snowflake Summit 22, live from Caesars Forum in Las Vegas. Lisa Martin here with Dave Vellante. This is day three of our coverage. We've had an amazing, amazing time. Great conversations talking with Snowflake executives, partners, customers. We're gonna be digging into data mesh with data.world. Please welcome Jon Loyens, the chief product officer. Great to have you on the program, Jon. >> Thank you so much for, for having me here. I mean, the summit, like you said, has been incredible, so many great people, such a good time, really, really nice to be back in person with folks. >> It is fabulous to be back in person. The fact that we're on day four for, for them, and this, the solution showcase, is as packed as it is at 10, 11 in the morning, yeah, is saying something. >> Yeah. Usually... >> Chomping at the bit to hear what they're doing and innovating. >> Absolutely. Usually those last days of conferences, everybody starts getting a little tired, but we're not seeing that at all here, especially >> In Vegas. This is impressive. Talk to the audience a little bit about data.world, what you guys do, and talk about the Snowflake relationship. >> Absolutely. data.world is the only true cloud native enterprise data catalog. We've been an incredible Snowflake partner and Snowflake's been an incredible partner to us really since 2018, when we became the first data catalog in the Snowflake Partner Connect experience. You know, Snowflake and the Data Cloud make it so possible.
And it's changed so much in terms of being able to, you know, very easily transition data into the cloud to break down those silos and to have a platform that enables folks to be incredibly agile with data. From an engineering and infrastructure standpoint, data.world is able to provide a layer of discovery and governance that matches that agility and the ability for a lot of different stakeholders to really participate in the process of data management and data governance. >> So data mesh, basically Zhamak Dehghani lays out, first of all, the fault domains of existing data and big data initiatives. And she boils it down to the fact that it's just this monolithic architecture with hyper specialized teams that you have to go through, and it just slows everything down and it doesn't scale. They don't have domain context. So she came up with four principles, if I may. Yep. Domain ownership. So push it out to the businesses. They have the context; they should own the data. The second is data as product. We're certainly hearing a lot about that today this week. The third, and that all sounds good, push out the data, great, but it creates two problems, is self-serve infrastructure. Okay. But her premise is infrastructure should be an operational detail. And then the fourth is computational governance. So you talked about data catalogs, where do you fit in those four principles? >> You know, honestly, we are able to help teams realize the data mesh architecture. And we know that data mesh is really, it's, it's both a process and a culture change. But then when you want to enact a process and a culture change like this, you also need to select the appropriate tools to match the culture that you're trying to build, the process and the architecture that you're trying to build. And the data.world data catalog can really help along all four of those axes. When you start thinking first about, let's say like, let's take the first one, you know, data as a product, right?
We even, like, very meta of us for a metadata management platform at the end of the day. But very meta of us. When you talk about data as a product, we track adoption and usage of all your data assets within your organization and provide program teams and, you know, offices of the CDO with incredible evented analytics, very detailed, that gives them the right audit trail, that enables them to direct very scarce data engineering, data architecture resources to make sure that their data assets are getting adopted and used properly. On the, on the domain driven side, we are entirely knowledge graph and open standards based, enabling those different domains. We have, you know, incredible joint Snowflake customers like Prologis. And we chatted a lot about this in our session here yesterday, where, because of our knowledge graph underpinnings, because of the flexibility of our metadata model, it enables those domains to actually model their assets uniquely from, from group to group, without having to, to relaunch or run different environments. Like you can do that all within one data catalog platform, without having to have separate environments for each of those domains. Federated governance, again, the amount of, like, data exhaust that we create really enables ambient governance and participatory governance as well. We call it agile data governance, really the adoption of agile and open principles applied to governance to make it more inclusive and transparent. And we provide that in a way that can federate across those domains and make it consistent. >> Okay. So you facilitate across that whole spectrum of, of principles. And so in the, in the early examples of data mesh that I've studied and actually collaborated with, like with JPMC, who I don't think is using your data catalog, but HelloFresh, who may or may not be, but I mean, there are numbers and I wanna get to that.
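The mechanics Loyens describes, one catalog spanning many domains, each domain owning and registering its own data products, with every access logged so adoption can be measured, can be sketched as a toy in Python. All names here are invented for illustration; this is not data.world's actual API:

```python
# Toy sketch (invented names, not data.world's API) of the pattern described:
# one catalog, many domains, each owning its data products, with usage
# tracked so adoption of every asset can be measured.

from collections import defaultdict

class Catalog:
    def __init__(self):
        self.products = {}             # (domain, name) -> metadata dict
        self.usage = defaultdict(int)  # (domain, name) -> access count

    def register(self, domain, name, **metadata):
        """Domain teams register their own data products (domain ownership)."""
        self.products[(domain, name)] = metadata

    def access(self, domain, name):
        """Every access is logged, forming the audit trail for adoption analytics."""
        self.usage[(domain, name)] += 1
        return self.products[(domain, name)]

    def adoption_report(self):
        """A federated view across all domains from the one shared catalog."""
        return {key: self.usage[key] for key in self.products}

catalog = Catalog()
catalog.register("logistics", "shipments", owner="logistics-team", format="parquet")
catalog.register("finance", "invoices", owner="finance-team", format="table")
catalog.access("logistics", "shipments")
catalog.access("logistics", "shipments")
print(catalog.adoption_report())
# {('logistics', 'shipments'): 2, ('finance', 'invoices'): 0}
```

A real catalog layers search, governance policies, and lineage on top of this skeleton, but the shape, a shared registry plus per-asset usage telemetry, is the same.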
But what they've done is they've enabled the domains to spin up their own, whatever, data lakes, data warehouses, data hubs. At least in, in concept, most of 'em are data lakes on AWS, but still in concept, they wanna be inclusive and they've created a master data catalog. And then each domain has its sub-catalog, which feeds into the master, and that's how they get consistency and governance and everything else. Is, is that the right way to think about it? Or do you have a different spin on that? >> Yeah, I, I, you know, I have a slightly different spin on it. I think organizationally it's the right way to think about it. And in absence of a catalog that can truly have multiple federated metadata models, multiple graphs in one platform, that is really kind of the, the, the only way to do it, right? With data.world, you don't have to do that. You can have one platform, one environment, one instance of data.world that spans all of your domains, enable them to operate independently, and then federate across. So... >> You just answered my question as to why I should use data.world versus AWS Glue. >> Oh, absolutely. >> And that's, that's awesome. Now, how have you done that? What, what's your secret >> Sauce? The, the secret sauce here is really, all credit to our CTO, one of my closest friends, who was a true student of knowledge graph practices and principles and really felt that the right way to manage metadata and knowledge about the data analytics ecosystem that companies were building was through federated linked data, right? So we use standards and we've built a, a, an open and extensible metadata model that we call costs, that really takes the best parts of existing open standards in the semantics space, things like schema.org, DCAT, Dublin Core, brings them together, and models out the most typical enterprise data assets, providing you with an ontology that's ready to go.
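To make the linked-data point concrete, here is a toy triple store using DCAT and Dublin Core style term names. It shows why a graph of triples lets each domain attach whatever properties it needs without a fixed schema; a real system would use an RDF library and full URIs, and this sketch is not data.world's actual model:

```python
# Toy linked-data sketch of catalog metadata as subject-predicate-object
# triples, using abbreviated DCAT / Dublin Core style terms. Illustrative
# only; a real implementation would use an RDF library and full URIs.

DCAT = "dcat:"
DCT = "dct:"

triples = set()

def add(subject, predicate, obj):
    """Assert one fact about an asset; no fixed schema is required."""
    triples.add((subject, predicate, obj))

# Each domain describes its assets with whatever properties it needs.
add("ex:shipments", "rdf:type", DCAT + "Dataset")
add("ex:shipments", DCT + "title", "Daily shipment snapshots")
add("ex:shipments", DCT + "publisher", "ex:logisticsDomain")
add("ex:shipments", "ex:freshnessSLA", "24h")   # domain-specific extension

add("ex:invoices", "rdf:type", DCAT + "Dataset")
add("ex:invoices", DCT + "publisher", "ex:financeDomain")

def datasets_by(publisher):
    """A federated query: one graph answers questions across all domains."""
    return sorted(s for (s, p, o) in triples
                  if p == DCT + "publisher" and o == publisher)

print(datasets_by("ex:logisticsDomain"))   # ['ex:shipments']
```

Because facts are independent triples, the logistics domain's `ex:freshnessSLA` extension coexists with finance's assets in the same graph, which is the flexibility being described here.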
But because of the graph nature of what we do, it's instantly accessible without having to rebuild environments, without having to do a lot of management against it. It's, it's really quite something. And it's something all of our customers are, are very impressed with and, and, you know, are getting a lot of leverage out of. >> And, and we have a lot of time today, so we're not gonna shortchange this topic. So one last question, then I'll shut up and let you jump in. This is an open standard. It's not open source. >> No, it's open, built on open standards, built on open standards. We also fundamentally believe in extensibility and openness. We do not want to, like, vertically lock you into our platform. So everything that we have is API-driven, API-available. Your metadata belongs to you. If you need to export your graph, you know, it's instantly available in open, machine-readable formats. That's really, we come from the open data community. That was a lot of the founding of data.world. We, we worked a lot with the open data community and we, we fundamentally believe in that. And that's enabled a lot of our customers as well to truly take data.world and not have it be a data catalog application, but really an entire metadata management platform, and extend it even further into their enterprise to, to really catalog all of their assets, but also to build incredible integrations to things like corporate search. You know, having data assets show up in corporate wiki search, along with all the, the descriptive metadata that people need, has been incredibly powerful and an incredible extension of our platform that I'm so happy to see from our customers. >> So listen, so it's not exclusive to, to Snowflake. It's not exclusive to AWS. You can bring it anywhere. Azure, GCP...
And we've always had a great relationship with Snowflake, and really leaned in there, because we really believe in Snowflake's principles, particularly around cloud, being cloud native, and the operating advantages that it affords companies. That's really aligned with what we do. And so Snowflake was really the first of the cloud data warehouses that we integrated with, and to see them transition to really building out the data cloud has been awesome. >>Talk about how data.world and Snowflake enable companies like Prologis to be data companies. These days, every company has to be a data company, but they have to be able to do so quickly to be competitive and to really win. How do you help them, if we up-level the conversation to really impacting the overall business? >>That's a great question, especially right now, as everybody knows. And Prologis is a great example. They're a logistics and supply chain company at the end of the day, and we know how important logistics and supply chain is nowadays, for them and for a lot of our customers. I think one of the advantages of having a data catalog is the ability to build trust, transparency, and inclusivity into your data analytics practice. By adopting agile principles, by adopting a data mesh, you're able to extend your data analytics practice to a much broader set of stakeholders and to involve them in the process while the work is getting done. One of the greatest things about agile software development, when it became a thing in the early two thousands, was how inclusive it was, and that inclusivity led to a much faster ROI on software projects. And we see the same thing happening in data analytics. You know, we have amazing data scientists and data analysts coming up with these insights that could be business changing, that could make their company significantly more resilient, especially in the face of economic uncertainty.
>>But if you have to sit there and argue with your business stakeholders about the validity of the data, about the techniques that were used to do the analysis, and it takes you three months to get people to trust what you've done, that opportunity's passed. So how do we shorten those cycles? How do we bring them closer? And that's really a huge benefit that Prologis has realized, just tightening that cycle time, building trust, building inclusion, and making sure, ultimately, humans learn by doing. And if you can be inclusive, it even increases things like data literacy, which we all want to help with, cuz Lord knows the world needs it. Yeah. Right. >>So data.world can inform me as to where on the spectrum of data quality my data set lives. So I can say, okay, this is usable, shareable, gold standard, versus, fix this. Right. Okay. Yep. >>Yep. >>Okay. And you could do that with one data catalog, not a bunch of them. >>Yeah. And trust is really a multifaceted, multi-angle idea, right? It's not just necessarily data quality or data observability. And we have incredible partnerships in that space, like our partnership with Monte Carlo, where we can ingest all their amazing observability information and display it in a really consumable way in our data catalog. But it also includes things like the lineage, who touched it, who is involved in the process, can I get a question answered quickly about this data? What's it been used for previously? It's so multifaceted that you have to be able to really model and present that in a way that's unique to any given organization, even unique within domains within a single organization. >>That's not to suggest you're a data quality supplier. >>No, no. >>But you partner with them, and then you become the master catalog. >>That's brilliant.
I love it. Exactly. >>And you just raised your Series C, 15 million. >>We did, yeah. You know, we're really lucky to have incredible investors like Goldman Sachs, who led our Series C. It really, I think, communicates the trust that they have in our vision, what we're doing, and the impact that we can have on an organization's ability to be agile and resilient around data analytics. >>Enabling customers to have that single source of truth is so critical. You talked about trust. That is absolutely no joke. >>Absolutely. >>That is critical, and there's a tremendous amount of positive business impact that can come from that. What are some of the things that are next for data.world that we're gonna see? >>Oh, you know, I love this. We have such an incredibly innovative team that's so dedicated to this space and the mission of what we're doing. We're out there trying to fundamentally change how people get data analytics work done together. One of the big reasons I founded the company is I truly believe that data analytics needs to be a team sport. It needs to go from single player mode to team mode, and everything that we've worked on in the last six years has leaned into that. Our architecture being cloud native, we've done over a thousand releases a year that nobody has to manage. You don't have to worry about upgrading your environment. It's a lot of the same story that's made Snowflake so great. We're really excited to have announced in March, at our own summit, a new package of features that we call data.world Eureka, which we're rolling out over the course of the year: a suite of automations and knowledge-driven functionality that really helps you leverage a knowledge graph to make decisions faster, and to operationalize your data in the DataOps way with significantly less effort. >>Big, big impact there.
John, thank you so much for joining Dave and me, unpacking what data.world is doing, the data mesh, the opportunities that you're giving to customers in every industry. We appreciate your time, and congratulations on the news and the funding. >>Ah, thank you. It's been a true pleasure. Thank you for having me on, and I hope you guys enjoy the rest of the day and your other guests. Thank you. >>We will. All right. For our guest and Dave Vellante, I'm Lisa Martin. You're watching theCUBE's third day of coverage of Snowflake Summit 22, live from Vegas. Dave and I will be right back with our next guest, so stick around.

Published Date : Jun 16 2022


Matt Hicks, Red Hat | Red Hat Summit 2022


 

>>We're back at the Red Hat Summit 2022, theCUBE's continuous coverage. This is day one. We're here all day tomorrow as well. My name is Dave Vellante. I'm here with Paul Gillon. Matt Hicks is here. He's executive vice president of products and technologies at Red Hat. Matt, good to see you. Thanks for coming on. Nice to see you face to face. >>Thanks. Thanks Dave, thanks Paul. It's, uh, good to be here. >>So you took a different tack with your, uh, keynote today, paid homage to Ada Lovelace and Srinivasa Ramanujan, which was kind of cool. And your point was they weren't noted in their time, and nobody was there to build on their early ideas. I mean, Ada Lovelace, I think it was a century before, right? Ramanujan was, you know, a decade-plus. And you tied that to open source. Can you give us your kind of bumper sticker of your premise there? >>Yeah. You know, I think I have a unique seat in this from Red Hat, where we see new engineers that come in that sort of compete on a world stage in open source, and the best, which is easy to track just in contributions, are not necessarily from the background you would expect them from. And for me, it's always really inspiring. Like, you have this potential in people, and open source is a great model for getting that out. We told the history story cuz, I think when you look over history, some of that potential has just been ignored before. Um, sure, it's happening right now. But getting that tied into open source models, we think can hopefully let us tap into a little more than we have in the past. >>Great. So when you're thinking about innovation specific to open source, is it a case where, I wonder, I don't really know the history here of open source, maybe you can educate me, is it the case where open source observes, uh, a de facto standard, let's say, or some other proprietary approach, and says, hey, we can build that in the open, and that's the inspiration? Or is it an innovation flywheel that just invents? >>I think it's both at this stage. So in the early days, if you take something like Linux, it was a little more of, you know, there was the famous memo of like, this is gonna be a hobbyist project. We're just gonna light up x86 hardware and have an operating system we can work with. That was a little more of, like, the standards were there, but it was, can we just build a better operating system with it, and be
Is it the case where open source observes, uh, a de factacto standard let's say, or some other proprietary approach and says, Hey, we can build that in open and that's so the, the inspiration, or is it an innovation flywheel that just invents? >>I think it's both at this stage. So in the, in the early days, if you take something like Linux, it was a little more of, you know, there was the famous memo of like, this is gonna be a hobbyist project. We're just gonna light up X 86 hardware and have an operating system we can work with. That was a little more of like this standards were there, but it was, can we just build a better operating system with it, be >>Better than Unix cuz would live up to the promise of units. >>That's right. Where in Unix you had some standardization to models, but it wasn't open in that same sense. Uh, Linux has gone well beyond a hobbyist project at this point. Uh, but that was maybe that clone model, um, to units these days though, if you take something like Kubernetes or take something like Ansible, that's just more pure innovation, you didn't necessarily have a Kubernetes model that you're building a better version of it was distributed computing and how can we really make that tick and, um, bring a lot of great minds into that to build it. Um, so I think you see both of 'em, which is it's one of the things that makes open source fun. Like it, it has a broad reach at this point. >>There's one major area of software that opensource has not penetrated yet. And that is applications. I mean, we, there have been, you know, sugar CRM there have been open E R P applications and, and such, none of them really taken off and in fact tend to be drawn back to being proprietary. Why do you suppose opensource has been limited to infrastructure and has hasn't branched out further? 
>>Yeah, I think part of it is, uh, where can you find a model where lots of different companies are comfortable contributing? If you have one solution in one domain from one company, you're gonna struggle more getting a real vibrant community built around it. When you pick an area like infrastructure or core platforms, you have a lot of hardware providers, the use cases span from traditional apps to AI, and you have a lot of places to run it; it's massive companies. >>So volume, really. >>It really is. You just have an interest that spans beyond companies, and that's where we've seen open source projects really pick up and build critical mass. >>How about crypto DAOs? I mean, isn't that a form of open source? Isn't that the application, really, exactly what you're talking about? Is it true or >>Well, if you look at cryptography, encryption algorithms, even going to, um, quantum going forward, I think a lot of quantum access will be driven in an open source model. The machines themselves, uh, will be machines, but things like Qiskit, uh, that is how most people will access that. So it is a powerful model for getting into areas that are, um, pretty bleeding edge on it as well. >>We were talking, go ahead. >>We were talking before; Andy mentioned that hardware and software are increasingly intersecting. That was the theme we heard at the keynote this morning. Yeah. Why do you believe that's happening, and how do you see it? How does that affect what you do? >>Uh, I think the reason that's happening is there is a push to make decisions closer and closer to users, because on one side there are laws of physics, and on the other it's just a better experience. And so whether that is in transportation or it's in telecommunications, you see this push outside of data centers to be able to get at that data locally.
Uh, but if that's the draw, I think also we're seeing hardware architectures are changing. There are, um, standards like Arm that are lower power, that let you run pretty powerful compute at the edge as well. And I think it's that combination, saying we can do a lot at the edge now, and that actually benefits us building user experiences in a lot of different domains, that is making this pull to the edge happen really quickly. But it's an exciting time to be seeing that happening. >>And pretty powerful is almost an understatement when you think about the innovations that are going on, right? I mean, in particular at the edge, mm-hmm, <affirmative> I mean, you're seeing Moore's law be blown away. Everybody says Moore's law is dead, but you're seeing the performance when you combine the GPU and the CPU and the NPU and the accelerators, I mean, it blows away anything we've historically known. Yeah. So you think about the innovations in software that occurred as a result of Moore's law. What are the new beachheads that we could potentially see in open source? >>I think when you start taking the, um, AI patterns on this, and AI is a broad space, but if you go even to, like, machine learning for optimization-type use cases, you start, uh, leveraging how you're gonna train those models, which gets you into, you know, CPUs and GPUs and TPUs in that world. And then you also have the, how am I gonna take that trained model, put it on a really lightweight device, and efficiently ask that model questions? And that gets you into a different architecture design. Uh, but that combination, I think we're gonna see these domains build differently, where you have mass compute training-type capabilities, and then push that as close to the user as you can, to make decisions that are more dynamic than traditional code.
And what you're talking about at the edge, and you think about, you know, vehicles is real time influencing. Yep. And that's, that's massive amounts of data. It's a different architecture. Right. And requires different hardware presumably and different software. So, and you guys, well, Linux is obviously there. Yeah. >>That's, that is the, where we get excited about things like the GM announcement you are in the square, in that, um, aspect of running compute right at the end user and actually dealing with sensor and data, that's changing there to help, you know, in this case, like driver's assistance capabilities with it. But I think that the innovation we'll see in that space will be limitless on it. So it's, it's a nice combination of it too. And you'll still have traditional applications that are gonna use those models. I think of it almost as it's like the new middleware, we have our traditional middleware techniques that we know and patterns. Um, they will actually be augmented with things like, um, machine learning models and those capabilities to just be more dynamic. So it's a fun time right now seeing >>That conversion a lot of data too. And again, I wonder how much of that is even gonna be persisted prob probably enough, cuz there's gonna be so much of it, how much it'll come back to the cloud a lot, but maybe not most of it, but it's still massive amounts relative to what we've seen before >>It is. And this is, you know, you've heard our announcement around OpenShift streams in those capabilities. So in red hat, what we do, we will always focus on hybrid with it because a lot of that data it'll be dropped at the edge cuz you won't need it, but the data you act on and the data you need, you will probably need at your indice and in your cloud. And maybe even on premise and capabilities like Kafka and the ability to pick and stream and stay consistent. 
We think there's a set of really exciting services to be able to enable that class of development where, um, hopefully we'll be at the center of, of that. >>You, you announced, uh, today an agreement with GM, uh, to, to build on their all to five platform, uh, auto industry, very proprietary historically, uh, with their technology. Do you think that this is an opportunity to crank that open? >>A absolutely. I think in, I've been involved with opensource for, for a while, but I think all of them started in a very proprietary model. And then you get to a tipping point where open source models can just unlock more innovation than proprietary models and you see 'em tip and flip. And I think in the automotive industry and actually in a lot of other industries, the capabilities of being able to combine hardware and software fast with the latest capabilities, it'll drive more innovation than just sticking to proprietary models. So yeah, I believe it will be one of many things to come there. >>You've been involved in open surf for a while. Like how long of a while people must joke about when they look at you, Matt, they must say, oh, did you start when you were five? Yeah. >>It's >>Uh, you get that a lot. >>I, I do, uh, it's my, my children, I think aged me a bit, but uh, but yeah, for me it was the mid nineties. That's when I started with, uh, with open source. >>It was uh, wow. So >>It's been a long, long >>Run. You made the statement in your keynote, that software development is, is, is messy. I presumably part of your job is to make it less messy. But now we talk about all this, these new beachheads, this new new innovations, a lot of it's unknown. Yeah. And it could be really messy. So who are the, who is there a new breed of developer that's emerging? Are they gonna come over from the cloud developers or is it the, is it the OT crowd and the, and the OT crowd? That's gonna be the new developers. 
>>I wish I knew, but I would say, I do think you'll get to almost a laws-of-physics type challenge where you won't learn everything. You're not gonna know, uh, the depths of 5G implementation and Kubernetes and Linux on that. And so for us, this is where ecosystem providers are really, really critical, where you have to know your intersection points, but you also have to partner really well to actually drive innovation in some of these spaces, cuz, uh, the domains themselves are massive. So the areas we're gonna know: hybrid, we're gonna know, you know, open source based platforms to enable hybrid. And then we're gonna partner with companies that know their domains and industries really well, to bring solutions to customers. >>I'm curious about partnering, uh, cuz Paul Cormier mentioned that as well as being critical. Do you have sort of a template for partnering, or is each partnership unique? >>I think at this point, uh, the market's changing so fast that, uh, we do have templates of, uh, who are you going to embed solutions with, who are you going to co-sell with and co-create. Uh, the challenge in technology, though, is it shifts so quickly. If you go back five years, maybe even 10 years, public cloud probably wasn't as dominant, um, as it is now. Now we're starting to see the uptick of edge solutions, probably having as much draw as public cloud. And so I think for us, the partnership follows the innovation on those curves, and finding the right model where that works for customers is the key thing for us. But I wish there was more of a pattern we could say stays stable for decades; I think it changes with the market, and we change with it. >>But you know, it's funny, cuz you see, every 15 years or so, the industry gets disrupted. I mean, we certainly saw it with mainframes and PC, and then the internet, and then the cloud. Uh, you guys have kind of been there, with Linux, throughout, I mean, okay.
It built the internet, built the cloud, it's building the edge. So it's almost, I don't wanna say you're disruption-proof, cause that's just gonna jinx you, but you've architected the products in a way that they're compatible with these new eras of industry. Mm-hmm <affirmative>. >>Everything needs an operating system. >>Everything needs an operating system, but you've seen operating systems come and go, you know, and Linux has survived so many different waves. Why? How? >>You know, I think for us, when you see open source projects, they definitely get to a critical mass, where you have so much contribution, so much innovation there, that they're gonna be able to follow the trends pretty well. If you look at Linux, whatever the next hardware innovation that comes out is, Linux has enough gravity that, um, it's open, it's successful, you're gonna design to it. The capability will be there. I think you're seeing similar things in Kubernetes now, where if you're going to try to drive application innovation, it is a model that gives you a ton of reach. You have thousands of contributors. That's been our model, though: find those projects, be influential in 'em, be able to drive value in life cycles. But I think it's that open source model that gives us the durability, where it can keep changing and tracking to new patterns. >>Yeah, there's been a lot of open source that wasn't able to sustain. So I think you guys obviously have a magic formula. That's true. >>There is some art to picking, among, I think, millions of projects. Uh, but you've gotta watch for that. >>Yeah. Open source is also a place where failed products go to die. Yeah. <laugh> So you have to be sure you're not in that corner. >>Yeah. Well, look at Kubernetes. I mean, the fact that that actually happened is astounding to me when you think about it. I mean, even Red Hat was ready to go on a different path. What if that had happened?
Who knows? Maybe it never would've happened. Maybe, to your point about Ada Lovelace, the revolution would've taken another decade to occur. >>You know, I think in some of these you have to watch really closely. We obviously have a lot of signals of what will make good long-term health, and I don't think everyone looks at those the same. We look at 'em from trademark controls, and how foundations are structured, and, um, who the contributors are, and the spread of that. And it's not perfect, but I think for us, you have to have that longevity built in there, where you will otherwise have a spike of popularity that has the tendency to just, um, fall apart. So we've been, yeah, doing that pretty well. >>Conditions for a long life, that's something that's, maybe it's an art form. I don't know if it's a data form. It's a culture, maybe. Maybe it's >>Cultural. Yeah. Probably a combination. Some days I think it's part art, part science. Yeah. But, uh, it's certainly a fun space to be in and see that happen. It, um, yeah, it's inspiring to me. Yeah. >>Matt Hicks, great to have you back on theCUBE, and, uh, good job on the keynote. Really, um, interesting angle that you took. So, congratulations. >>Thanks for having me. >>Yeah, you're very welcome. All right. Keep it right there. Dave Vellante for Paul Gillon, Red Hat Summit 2022 from Boston. You're watching theCUBE.

Published Date : May 10 2022


Brad Kam, Unstoppable Domains | Unstoppable Domains Partner Showcase


 

(bright upbeat music) >> Hello, welcome to this CUBE Unstoppable Domains Showcase. I'm John Furrier, host of theCUBE. We've been showcasing all the great content about Web3 and what's around the corner for Web4. Of course, Unstoppable Domains is one of the big growth stories in the business. Brad Kam, the Co-founder of Unstoppable Domains, is here with me. Brad, great to see you, thanks for coming on this showcase. >> Thanks, it's a pleasure to be here. >> So you have a lot of history in what they're calling Web3 now, but it's basically crypto and blockchain. You know, the white paper came out, and we saw how it developed organically. Now you're the co-founder of Unstoppable Domains. You're seeing the mainstream scene, I would say: Super Bowl commercials, okay? You're seeing it everywhere. So it is here. Stadiums are named after crypto companies. It's here. Hey, it's no longer fringe, it is reality. You guys are in the middle of it. What's going on with the trend, and where does Unstoppable fit in? Where do you guys tie in here? >> I mean, I think that what's been happening in general, this whole revolution around cryptocurrencies, and then NFTs, and what Unstoppable Domains is doing, it's all around creating this idea that people can own something that's digital. And this wasn't really possible before Bitcoin. Bitcoin was the first case. You could own money. You don't need a bank, no one else. You know, you can completely control it. No one else can turn you off. Then there was this next phase of the revolution, which is assets beyond just currencies. So NFTs, digital art. What we're working on is like a decentralized identity, like a username for Web3, and each individual domain name is an NFT. But yeah, it's been a crazy ride over the past 10 years.
In fact, on our first website, the developer wanted to be paid in crypto. It's interesting. The price of Bitcoin, I won't say how low it was. But then you saw the ICO wave, the tokens started coming in. You started seeing much more engineering focus, a lot of white papers coming out, a lot of cool ideas. And now you've got this mainstream moment. So I got to ask you, what are the coolest things you guys are working on? Because Unstoppable has a solution that solves a problem people are facing today, and at the same time it is part of this new architecture. What problem do you guys solve right now, in market, that you're seeing the most traction on? >> Yeah, so it's really about, so whenever you interact with a blockchain, you wind up having to deal with one of these really, really crazy public keys, public addresses. And they're like anywhere from 20 to 40 characters long, they're random, they're impossible to memorize. And going back to even early days in crypto, I think people knew that this tech was not going to go mainstream if you have to copy and paste these things around. If I'm getting ready to send you like a million dollars, I'm going to copy and paste some random string of numbers and letters. I'm going to have no confirmation about who I'm sending it to, and I'm going to hope that it works out. It's just not practical. People have kind of always known there was going to be a solution. And one of the more popular ideas was doing kind of like what DNS did, which is, instead of having to deal with these crazy IP addresses, this long random string of numbers to find a website, you have a name, like a keyword, something that's easy to remember. You know, like a hotels.com or something like that. And so what NFT domains are is basically the same thing, but for blockchain addresses. And yeah, it's just better and easier.
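Brad's DNS analogy can be sketched in a few lines. This is a minimal, hypothetical illustration of the idea, not Unstoppable's actual resolution API: a registry maps a human-readable name to payment addresses, the way DNS maps a hostname to an IP address. The registry contents, names, and addresses below are made up.

```python
# Hypothetical sketch of NFT domain resolution: one name can hold
# addresses for several currencies ("contexts"). Nothing here is a
# real record or a real address.
REGISTRY = {
    "brad.crypto": {
        "BTC": "bc1qexampleonlynotreal0000000000000000000",
        "ETH": "0xExampleOnlyNotARealAddress0000000000000000",
    },
}

def resolve(name: str, currency: str) -> str:
    """Look up the payment address recorded under a domain name."""
    record = REGISTRY.get(name)
    if record is None:
        raise LookupError(f"no such domain: {name}")
    try:
        return record[currency]
    except KeyError:
        raise LookupError(f"{name} has no {currency} address") from None

# Instead of copy-pasting a 20-40 character address, the sender types a name:
addr = resolve("brad.crypto", "ETH")
```

The point of the sketch is the indirection: the sender deals only in memorable names, and a typo fails loudly instead of silently routing money to the wrong place.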
There's this joke that everybody, you know, if you want to send me money, you're going to send me a test transaction of, you know, like a dollar first, just to make sure that I get it. Call me up and make sure that I get it before you go and send the big amount. That's just not the way moving billions of dollars of value is going to work in the future. >> Yeah, and I think one of the things you just pointed out: make it easier. When you have these new waves, these shifts, we saw it with the web pages. More and more web pages were coming on, more online users. They called it the online population growing. Here, the same thing's happening. And if the focus is on ease of use, making things simpler to understand, and reducing the steps it takes to do things, right? This is kind of what's going on with the developer community, and what Ethereum has done really well is brought in the developers. So that's the convergence of all the action. And so, when you (John chuckles) so that's where you're at right now. How do you go forward from here? Obviously, there's business development deals to do, you guys are partnering a lot. What's the strategy? What are some of the things that you can share about some of your business activity that points to how mainstream it is and where it's going? >> So I think the way to think about an NFT domain name is that it's meant to be like your identity on Web3. So it's going to have a lot of different contexts. So it's kind of like your Venmo account, where you could send me money to brad.crypto; it can be your decentralized website, where you can check out my content at brad.crypto. It can also be my login, kind of like a decentralized Facebook OAuth, where I can log into DApps and share information about myself and bring my data along with me. So it's got all of these different things that it can do, but where it's starting is inside of crypto wallets and crypto apps, and they are adopting it for this identity idea.
And it's the same form of identity across all your apps. That's the thing that's new here. So, yeah, that's the really big and profound shift that's happening. And the reason why this is going to be maybe even more important than a lot of, you know, your listeners think is that everyone's going to have a crypto wallet. Every person in the world is going to have a crypto wallet. Every app, every consumer app that you use is going to build one in. Twitter just built one. Reddit is building one. You're seeing it across all the consumer finance apps. So it's not just the crypto companies that you're thinking of, every app's going to have a wallet. And it's going to really change the way that we use the internet. >> I think there's a couple things you pointed out that I want to get your reaction to, and more thoughts on this concept of DApps, or decentralized applications, depending on what you call it. These are applications that take advantage of the architecture, and then there's this idea of users owning their own data. And this absolutely reverses the script today. Today, you see Facebook, you see LinkedIn, all these silos; they own the data and you are the product. Here, the users are in control. They have their data, but the apps are being built for this paradigm shift, right? That's what's happening. Is that right? >> Totally, totally. And so, it all starts. I mean, DApp is just this crazy term. It feels like it's this really foreign, weird thing. All it means is that you sign in with your wallet instead of signing in with a username and password, where the data is stored inside of that app, like inside of Facebook. So that's the only real, core underlying difference to keep in mind: signing in with the wallet. But that is like a complete sea change in the way the internet works. Because I have this key, this private key, it's on my phone or my device or whatever. And I'm the only one that has it.
So, if somebody wanted to hack me, they need to get access to my device. Two years ago, when Twitter got hacked, Barack Obama and Elon Musk were tweeting the same stuff. That's because Twitter had all the data. And so, you needed to hack Twitter instead of each individual person. It's a completely different security model. It's way better for users to have that. But, if you're thinking from the user perspective, what's going to happen is that instead of Facebook storing all of my data, and then me being trapped inside of Facebook, I'm going to store it, and I'm going to move around on the internet, logging in with my Web3 username, my NFT domain name, and I'm going to have all my data with me. And then I could use 100 different Facebooks all in one day. And it would be effortless for me to go and move from one to the other. So, the monopoly situation that we exist in as a society is because of the way data storage works and- >> So that's a huge point. So let's double down on that for one more second. This is a huge point. I want to get your thoughts. So I think people don't understand that in the mainstream, having that horizontal traversal, or ability to move around with your identity, in this case your Unstoppable Domain, and your data allows the user to take it from place to place. It's like going to other apps that could be like Facebook, where the user's in charge. They're deciding whether to share their data or not, and they're in control of their data. And this allows for more of a horizontal scalability for the user, not for a company.
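The "sign in with your wallet" model Brad describes is, at its core, a challenge-response protocol. The sketch below shows only the shape of it: real wallets sign with an asymmetric key pair (typically ECDSA on secp256k1), so the app verifies against a public key and the secret never leaves the device. To stay runnable with just the standard library, an HMAC over a shared secret stands in for the signature here; that substitution is the sketch's main simplification.

```python
import hashlib
import hmac
import secrets

def new_challenge() -> str:
    # The app issues a fresh random challenge so an old signature
    # can't be replayed by an attacker who intercepted it.
    return secrets.token_hex(16)

def wallet_sign(private_key: bytes, challenge: str) -> str:
    # Runs on the user's device. Only the holder of the key can
    # produce this value. (Stand-in for a real ECDSA signature.)
    return hmac.new(private_key, challenge.encode(), hashlib.sha256).hexdigest()

def app_verify(private_key: bytes, challenge: str, signature: str) -> bool:
    # With real ECDSA the app would verify against the PUBLIC key only;
    # the HMAC stand-in forces us to reuse the secret here, which a
    # production system must never do.
    expected = hmac.new(private_key, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

key = secrets.token_bytes(32)  # lives only on the user's phone
challenge = new_challenge()
signature = wallet_sign(key, challenge)
ok = app_verify(key, challenge, signature)
```

This is why hacking one user means hacking one device: there is no central password database to steal, only a per-user key that never travels.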
And now apps are looking at that, and they're starting to create social networks and other things to provide me services because it started with the user. And so, the user is starting to collect all this valuable data, and then apps are saying, well, hey, let me give you a special experience based on that. But the real thing, and this is like the core, I mean, this is just like a core capitalist idea, in general. If you have more competition, you get a better experience for users. We have not had competition in Web2 for decades because these companies have become monopolies. And what Web3 is really allowing is, this wide open competition. And that's the core thing. Like, it's not like, you know, it's going to take time for Web3 to get better than Web2. You know, it's very, very early days. But the reason why it's going to work is because of the competitive aspect here. Like it's just so much better for consumers when this happens. >> I would also add to that, first of all, great point, great insight. I would also add that the web presence technology based upon DNS specifically is, first of all, it's asking, so it's not foreign characters, it's not Unicode for the geeks out there. But that's limiting too, it limits you to be on a site. And so, I think the combination of kind of inadequate or antiquated DNS has limitations. So if... And that doesn't help communities, right? So when you're in the communities, you have potentially marketplaces that could be anywhere. So if you have ID, I'm just kind of thinking it forward here. But if you have your own data and your own ID, you can jump into a marketplace, two-sided marketplace anywhere. An app can provide that, if the community's robust, this is kind of where I see the use case going. How do you guys, do you guys agree with that statement and how do you see that ability for the user to take advantage of other competitive or new emerging communities or marketplaces? >> So I think it all comes down. 
So identity is just this huge problem in Web2. And part of the reason why it's very, very hard for new marketplaces and new communities to emerge is 'cause you need all kinds of trust and reputation. And it's very hard to get real information about the users that you're interacting with. If you're in the Web3 paradigm, then what happens is, is you can go and check certain things on the blockchain to see if they're true. And you can know that they're true 100%. You can know that I have used Uniswap in the past 30 days, and OpenSea in the past 30 days. You can know for sure that this wallet is mine. The same owner of this wallet also owns this other wallet, owns this asset. So having the ability to know certain things about a stranger is really what's going to change behavior. And one of the things that we're really excited about is being able to prove information about yourself without sharing it. So I can tell you, hey, I'm a unique person. I'm an American, I'm not an American, but I don't have to tell you who I am. And you can still know that it's true. And that concept is going to be what enables what you're talking about. I'm going to be able to show up in some new community that was created two hours ago, and we can all trust each other that a certain set of facts are true. And that's possible because- >> And exchange value with smart contracts and other with no middle men involved activities, which is the promise of the new decentralized web. All right, so let me ask you a question on that. Because I think this is key. The anonymous point is huge. If you look at any kind of abstraction layers or any evolution in technology over the years, it's always been about cleaning up the mess or extending capabilities of something that was inadequate. We mentioned DNS, now you got this. There's a lot of problems with Web2, 2.0, social bots. You mentioned bots. Bots are anonymous and they don't have a lot of time in market. 
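The simplest building block behind "prove information about yourself without sharing it" is a hash commitment: publish a hash of a claim now, and reveal the claim (or not) later. This toy sketch shows only that commit-then-reveal shape; the zero-knowledge proofs Brad is alluding to are far more powerful, since they can convince a verifier a statement is true without ever revealing it at all.

```python
import hashlib
import secrets

def commit(claim: str) -> tuple[str, str]:
    """Return (commitment, nonce). Publish the commitment; keep the nonce.

    The random nonce prevents anyone from guessing the claim by
    hashing likely candidates ("used Uniswap", "is American", ...).
    """
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{claim}".encode()).hexdigest()
    return digest, nonce

def verify(commitment: str, nonce: str, claim: str) -> bool:
    """Anyone can check that a later-revealed claim matches the commitment."""
    return hashlib.sha256(f"{nonce}:{claim}".encode()).hexdigest() == commitment

# Commit publicly today; selectively reveal to one counterparty later.
commitment, nonce = commit("used Uniswap in the past 30 days")
```

The on-chain facts Brad mentions (wallet age, past transactions) work the other way around: they are already public, so a stranger can check them directly without any reveal step.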
So it's easy to start bots, and anyone who's run scraping bots knows this. What you just pointed out was, in an ops environment that was user choice, but has all the data that could be verified. So it's almost like a blue check mark on Twitter without having your name, kind of- >> It's going to be 100s of check marks, but exactly. 'Cause there's so many different things that you're going to want to communicate to strangers, but that's exactly the right mental model. It's going to be these check marks for all kinds of different contexts. And that's what's going to enable people to trust that they're, you know, you're talking to a real person or you're talking to the type of person you thought you were talking to, et cetera. But yeah, it's, you know, I think that the issues that we have with bots today are because Web2 has failed at solving identity. I think Facebook at one point was deleting half a billion fake accounts per quarter, something like their entire number of user profiles per year. So it's just a total- >> And they spring up like mushrooms. They just pop up, and I think that's the problem. I mean, the data that you acquire in these siloed platforms is used by them, the company. So you don't own the data, so you become the product, as the cliche goes. But what you guys are saying is, if you have an identity and you pop around to multiple sites, you also have your digital footprints and your exhaust that you own. Okay, that's time, that's reputation data. I mean, you can cut it any way you want, but the point is, it's your stuff over time, that's yours. And that's immutable on the blockchain; you can store it and then make that permanent and add to it. >> Exactly. >> That's a time based thing, versus today, bots that are spreading misinformation can just pop back up when they get killed. They just start another one. So time actually is a metric for quality here. >> Absolutely.
And people already use it in the crypto world to say like, hey, this wallet was created greater than two years ago. This wallet has had transactions for at least three or four years. Like, this is probably a real, you know, this is probably a legitimate user. And anybody can look that up. I mean, we can go look it up together right now on Etherscan, it would take a minute. >> Yeah, (indistinct). Yeah, I'm a big fan, I can tell, I love this product. I think you guys are going to do really well. Congratulations, I'm a big fan. I think this is needed. What are some of the deals you've done? blockchain.com is one, and Opera. Can you take us through those deals and why they're working with you? Let's start with blockchain.com. >> Yeah, so the whole thing here is that this is an identity standard, and Web3 apps need to choose to support it. So, you know, we spent several years as a company working to get as many crypto wallets and browsers and crypto exchanges to support this identity standard. Some of the largest and probably most popular companies to have done this are, blockchain.com, for example, one of the largest crypto wallets in the world. And you can use your domain names instead of crypto addresses. And this is super cool because blockchain.com in particular focuses on onboarding new users. So they're very focused on how we're going to get the next 4 billion internet users to use this tech. And they said usernames are going to be essential. Like, how can we onboard the next several billion people if we have to explain all these crazy addresses to them? And it's not just one; we want to give you 10 40-character addresses for all these different contexts. Like, there's just no way people are going to be able to do that without having a username. So, that's why we're really excited about what blockchain.com's doing. They want to train users that this is the way you should use the tech. >> Yeah, and certainly no one wants to remember.
I remember writing down all my... You know, I was never a big wallet fan, 'cause of all the hacks. I used to write it down and store it in my safe. But if the house burns down, or I kick the can, who's going to find it, right? So again, these are all important things. Your key: storing it, securing it, super important. Talk about Opera. That's an interesting partnership, because it's got a browser that people know. What are they doing differently? I imagine they're innovating around identity and people's experience with what they touch. >> Yeah, so this is one of those things that's a little bit easier, and I strongly encourage everybody to go and try DApps after this. 'Cause this is going to be one of those concepts; it can be a little easier if you try it than if you hear about it. But the concepts of a wallet and a browser are kind of merging. So it makes sense to have a wallet inside of your browser. Because when you go to a website, the website's going to want you to sign in with your wallet. So having that be in one app is quite convenient for users. And so Opera was one of the trailblazers, a traditional browser that added a crypto wallet so that you can store money in there. And then also added support for domain names for payments and for websites. So, you can type in brad.crypto and you can send me money, or you can type in brad.crypto into the browser and you can check out my website. I've got a little NFT gallery. You can see my collection up there right now. So that's the idea: browsers have this kind of superpower in Web3. Opera and Brave have been kind of the trailblazers here. What I think is going to happen is that these traditional browsers are going to wake up and they're going to see that integrating a wallet is critical for them to be able to provide services to consumers. >> I mean, it is an app. I mean, why not make it a DApp as well?
Because why wouldn't I want to just send you crypto, like Venmo, which you mentioned earlier; people can understand that concept. Venmo, let me move my cash. Same concept here. But built into the browser, which is not a browser anymore; it's a reader, a DApp reader, basically with a wallet. All right, so what does this mean for you guys and the marketplace? You got Opera pushing the envelope on browsing, changing the experience, enabling the applications to be discovered and navigated and consumed. You got blockchain.com with the wallets and being embedded there. Good distribution. Who are you looking for as partners? How do people partner? Let's just say theCUBE wants to do NFTs, and we want to have a login for our communities, which are all open. How do we partner with you? Or do we? Do we have to wait, or is there a... I mean, take us through the partnership strategy. How do people engage with Unstoppable Domains? >> Yeah, so, I mean, I think that if you're a wallet or a crypto exchange, it's super easy; we would love to have you support being able to send money using domains. We also have all sorts of different kinds of marketing activities we can do together. We can give out free stuff to your communities. We have a bunch of education that we do. We're really trying to be this onboarding point to Web3. So there's, I think, a lot of cool stuff we can do together on the commercial side and on the marketing side. And then the other category that we didn't talk about was DApps. And we now have this login with Unstoppable Domains, which you kind of alluded to there. And so you can log in with your domain name, and then you can give the app permission to get certain information about you, or proof of information about you, not the actual information, if you don't want to share it, because it's your choice and you're in control. And so, that would be another thing. Like, if you all launch a DApp, we should absolutely have login with Unstoppable there.
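The permissioned login Brad describes, where the app receives only the fields or proofs the user consents to, can be sketched as a simple filter over a user-held profile. The field names and profile shape below are hypothetical illustrations, not Unstoppable's actual login payload.

```python
# Hypothetical user-held profile: the data lives with the user,
# not inside any one app. Field names are made up for illustration.
PROFILE = {
    "domain": "brad.crypto",
    "email": "private@example.com",
    "proof_unique_human": True,   # a proof, not the underlying identity
}

def login_payload(profile: dict, granted_fields: set[str]) -> dict:
    """Return only the fields the user consented to share with this app."""
    return {k: v for k, v in profile.items() if k in granted_fields}

# A DApp asks for an identity and a proof-of-humanity, but never
# sees the email, because the user didn't grant it:
shared = login_payload(PROFILE, {"domain", "proof_unique_human"})
```

The design point is the direction of control: each app receives a projection of the profile chosen per-login by the user, instead of the user's full record living on the app's servers.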
>> Yeah, there's so much headroom here. You got a short term solution with exchanges. Get that distribution, I get that; that's early days of the foundation, push the distribution, get you guys everywhere. But the real success comes in for the login. I mean, the sign-in, single sign-on concept. I think that's going to be powerful, great stuff. Okay, future: tell us something we don't know about Unstoppable Domains that people might be interested in. >> I think the thing that you're going to hear about a lot from us in the future is going to be around this idea of identity, of being able to prove that you're a human and be able to tell apps that. And apps are going to give you all kinds of special access and rewards and all kinds of other things, because you gave 'em that information. So that's probably, that's the hint I'm going to drop. >> You know, it's interesting, Brad. You bring trust, you bring quality, verified data to intelligent software and machine learning, AI, and access to distributed communities and distributed applications. Interesting to see what the software does with that. 'Cause it traditionally didn't have that before. I mean, it's just mind blowing. I mean, it's pretty crazy. Great stuff. Brad, thanks for coming on. Thanks for sharing the insight. The Co-founder of Unstoppable Domains, Brad Kam. Thanks for stopping by theCUBE's Showcase with Unstoppable Domains. >> Thanks for having me. (bright upbeat music)

Published Date : Mar 10 2022



Matt Gould, Unstoppable Domains | Unstoppable Domains Partner Showcase


 

(upbeat music) >> Hello, welcome to theCUBE's special showcase with Unstoppable Domains. I'm John Furrier, your host of theCUBE here in Palo Alto, California. And Matt Gould, who's the founder and CEO of Unstoppable Domains. Matt, great to come on. Congratulations on the success of your company, Unstoppable Domains. Thanks for kicking off this showcase. >> Well, thank you, happy to be here. >> So first of all, love the story you've got going on here. Love the approach, very innovative, but you're also on the big Web3 wave, which we know leads into the metaverse: unlimited new ways people are consuming information and content, and applications are being built differently. This is a major wave and it's happening. Some people are trying to squint through the hype versus reality, but you don't have to be a rocket scientist to realize that it's a cultural shift and a technical shift going on with Web3. So this is kind of what's happening in the market. So give us your take. What's your reaction? You're in the middle of it. You're on this wave. >> Yeah, well, I would say it's a torrent of change that got unleashed just over a decade ago with Bitcoin coming out and giving people the ability to have digital items that they could actually own themselves online. And this is a new thing. And people coming, especially from my generation of millennials, they spend their time online in these digital spaces and they've wanted to be able to own these items, and you see it from, you know, gaming and Fortnite and skins and Warcraft and all these other places. But this is really being enabled by this new crypto technology to just extend to a whole lot more applications, from money, which everyone's familiar with, to NFT projects like Bored Apes or CryptoPunks.
But on the last podcast, Matt, you mentioned some of the older generations like me: I grew up with IP addresses, and before the web, they called it the information superhighway. It wasn't even called the web yet, but IP was generated by the United States Department of Commerce and R&D; that became the internet, and the internet became the web. Back then it was just get some web pages up and find what you're looking for. Very analog compared to what's here today. Now you mentioned gaming. You mentioned how people are changing. Can you talk about your view of this cultural shift? We've been talking about it on theCUBE for many, many years now, but it's actually happening now, where the expectation of the audience and the users, the people consuming and communicating and bonding in groups, whether it's gaming or communities, is for new behaviors, new applications, and it's a forcing function. This shift is happening now. What's your reaction to that? What's your explanation? >> Yeah, well, I think it just goes back to the shift of where people are spending their time. And if you look today, most people spend 50% plus of their time in front of a screen. And that's just a tremendous amount of effort. But if you look at how much of their assets are digital, it's like less than 1% of their portfolio would be some sort of digital asset, compared to literally 50% of every day sitting in front of a screen. And simultaneously what's happening is these new technologies are emerging around cryptocurrencies, blockchain systems, ways for you to track digital ownership of things, and then kind of bring that into your different applications. So one of the big things that's happening with Web3 is this concept of data portability, meaning that I can own something on one application and then I could potentially take that with me to several other applications across the internet.
And so this is like the emerging digital property rights that are happening right now as we transition from a model in Web2, where you are on a hosted service like Facebook; it's a walled garden, they own and control everything. You are the product, they're mining you for data and they're just selling ads, right? To a system where it's much more open. You can go into these worlds and experiences. You can take things with you and you can leave with them. And most people are doing this with cryptocurrency. Maybe you earn an in-game currency; you can leave and take that to a different game, and you can spend it somewhere else. So the user is now enabled to bring their data to the party. Whereas before, you couldn't really do that. And that data includes their money, and that includes their digital items. And so I think that's the big shift that we're seeing, and that changes a lot in how applications serve up to users. It's going to change their user experiences, for instance. >> I think the script has flipped, and you're right on. I agree with you. I think you guys are smart to see it. And I think everyone who's on this wave will see it. Let's get into that, because this is happening. People are saying, "I'm done with being mined "and being manipulated by the big Facebook "and the LinkedIns of the world who are using the user." Now, the contract was a free product and you gave up your data, but then it got too far. Now people want to be in charge of their data. They want to broker their data. They want to collect their digital exhaust, maybe collect some things in a game, or maybe do some commerce in an application or marketplace. So these are the new use cases. How does a digital identity architecture work with Unstoppable? How are you guys enabling that? Can you take us through the vision of where you guys came in on this? Because it's unique: you had an NFT and kind of the domain name concept coming together. Can you explain?
>> Yeah, so we think we approach the problem for if we're going to rebuild the way that people interact online, what are kind of the first primitives that they're going to need in order to make that possible? And we thought that one of the things that you have on every network, like when you log on Twitter, you have a Twitter handle, when you log on Instagram, you have an Instagram handle. It's your name, right? You have that name that's on those applications. And right now what happens is if users get kicked off the platform, they lose a 100% of their followers, right? And they also, in some cases, they can't even directly contact their followers on some of these platforms. There's no way for them to retain this social network. So you have all these influencers, who are today's small businesses, who build up these large, you know profitable, small businesses online, being key opinion leaders to their demographic. And then they could be deplatformed, or they're unable to take this data and move to another platform if that platform raises their fees. You've seen several platforms increase their take rates. You have 10, 20, 30, 40%, and they're getting locked in and they're getting squeezed, right? So we just said, "You know what, "the first thing you're going to want to own "that this is going to be your piece of digital property "is going to be your name across these applications." If you look at every computing network in the history of computing networks, they end up with a naming system. And when we looked back at DNS, you know which came out in the 90s, it was just a way for people to find these webpages much easier instead of mapping these IP addresses. And then we said to ourselves, "What's going to happen in the future?" Is just like everyone has an email address that they use in their Web2 world in order to identify themselves as they log into all these applications. 
They're going to have an NFT domain in the Web3 world in order to authenticate and bring their data with them across these applications. So we saw a direct correlation there between DNS and what we're doing with NFT domain name systems. And the bigger breakthrough here is that with NFT domain systems, all of these NFT assets live on a blockchain. They are owned by users. They're built on these open systems so that multiple applications can read data off of them, and that makes them portable. So we were looking for an infrastructure play, like a picks and shovels play, for the emerging Web3 metaverse. And we thought that names were just something that, if we wanted a future to happen where all 3.5 billion people with cell phones are sending crypto and digital assets back and forth, they're going to need to have a name to make this a lot easier, instead of these long IP addresses, or hex addresses in the case of crypto. >> Yeah, and also people have multiple wallets too. It's not like there's just one; there's all kinds of wallet variations, name verification; you see Linktree everywhere. You know, that's essentially just an app. I mean, it doesn't really do anything. I mean, so you're seeing people trying to figure it out. I got a GitHub handle. I got a LinkedIn handle. I mean, what do you do with it? >> Yeah, and then specific to crypto, there was a very hair-on-fire use case for people who buy their first Bitcoin. And for those in the audience who haven't done this yet, when you go into an app and you buy your first Bitcoin or Ethereum or whatever cryptocurrency, and then the first time you try to send it, there's this field where you want to send it, and it's this very long hex address. And it looks like an IP address from the 1980s, right? And it's like a bank number, and no one's going to use that to send money back and forth to each other.
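The hex-address problem Matt describes has a UX property worth spelling out: a one-character typo in a raw hex address can still look perfectly valid, and the funds are simply gone, while a mistyped name fails loudly before anything is sent. A minimal sketch of that difference, with made-up names and addresses:

```python
# Hypothetical name registry; the entry and address are not real.
ADDRESS_BOOK = {
    "johnfurrier.nft": "0x00000000000000000000000000000000000000aa",
}

def resolve_recipient(destination: str) -> str:
    """Accept either a registered name or a raw address.

    A raw hex address passes straight through with no safety net:
    a typo there still "succeeds" and the money goes to the wrong
    place. A name, by contrast, either resolves or raises.
    """
    if destination.startswith("0x"):
        return destination  # raw address: nothing to catch a typo
    try:
        return ADDRESS_BOOK[destination]
    except KeyError:
        raise LookupError(f"unknown name: {destination}") from None

recipient = resolve_recipient("johnfurrier.nft")
```

This fail-fast property, rather than the lookup itself, is the UX argument for names over hex strings: the error surfaces before value moves, not after.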
And so just like domain names and the DNS system replaced IP addresses, NFT domains on blockchain systems replace hex addresses for sending and receiving cryptocurrency, Bitcoin, Ethereum, whatever. And that's its first use case, it really plugs in there. So when you want to send money to someone, instead of sending money to a large hex address that you have to copy and paste, where you could have an error or you could send it to the wrong place, which is pretty scary, you could send it to johnfurrier.nft. And so we thought that you're just not going to get global adoption without better UX; the same thing worked with .com domains. And this is the same thing for Bitcoin and other crypto. >> It's interesting, look at the Web1 to Web2 trend: it was all about ease of use, right? And making things simpler. Clutter, more pages, can't find things, that was search, that was Google. Since then, has there actually been an advancement? >> Hmm. >> Facebook certainly is not an advancement, they're hoarding all the data. So I think we're broken between that step of a free search to all the resources in the world, to which, by the way, they're mining a lot of data too, with the Toolbar and Chrome. But now where's that Web3 crossover? So take us through your vision on digital identity: from Web2, Google searching, Facebook's broken, democracy's broken, users aren't in charge, to Web3? >> Got it. Well, we can start at Web1. So the way that I think about it is, if you go to Web1, it was very simple, just text web pages. So it was just a way for someone to put up a billboard, and here's a piece of information and here's some things that you could read about it, right? And then what happened with Web2 was you started having applications being built that had backend infrastructure to provide services.
So if you think about Web2, these are all websites or web portals that have services attached to them, whether that's a social network service or a search engine or whatever. And then as we move to Web3, the new thing that's happening here, is the user is coming onto that experience and they're able to connect in their wallet or their Web3 identity to that app and they can bring their data to the party. So it's kind of like Web1, you just have a static web page, Web2, you have a static webpage with a service, like a server back here, and then Web3, the user can come in and bring their database with them in order to have much better app experiences. So how does that change things? Well, for one, that means that you want data to be portable across apps. So we touched on gaming earlier and maybe if I have an in-game item for one game that I'm playing for a certain company, I can take it across two or three different games. It also impacts money. Money is just digital information. So now I can connect to a bunch of different apps and I can just use cryptocurrency to make those payments across those things instead of having to use a credit card. But then another thing that happens is I can bring unlimited amount of additional information about myself when I plug in my wallet. And as an example, when I plug into Google search, for instance, they could take a look at my wallet that I've connected and they could pull information about me that I enable that I share with them. And this means that I'm going to get a much more personalized experience on these websites and I'm also going to have much more control over my data. There's a lot of people out there right now who are worried about data privacy, especially in places like Europe. And one of the ways to solve that is simply to not store the data and instead have the user bring it with them. 
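The "bring your data with you" model just described can be sketched in a few lines. This is an illustrative toy, not a real Web3 stack: all class and field names are invented, and a real implementation would verify wallet signatures rather than trusting the caller.

```python
import time

# The user's profile lives with the user (in their wallet), not in the
# app's database. The app reads only the fields the user permissions,
# and only while the grant is live, so nothing sits at rest to steal.

class UserWallet:
    def __init__(self, profile: dict):
        self._profile = profile  # user-held data

    def grant(self, fields, ttl_seconds=300):
        """Share only the requested fields, with an expiry."""
        shared = {k: self._profile[k] for k in fields if k in self._profile}
        return {"data": shared, "expires_at": time.time() + ttl_seconds}

class App:
    def personalize(self, grant: dict) -> str:
        if time.time() >= grant["expires_at"]:
            raise PermissionError("grant expired")
        city = grant["data"].get("city", "somewhere")
        return f"Welcome, visitor from {city}!"

wallet = UserWallet({"city": "New York", "email": "user@example.com"})
app = App()
# The user shares only their city, not their email:
print(app.personalize(wallet.grant(["city"])))
```

The design choice mirrors the privacy argument in the interview: because the app never stores the profile, a breach of the app exposes nothing about the user.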
>> You know, I've always thought about this and always debated it with Dave, my co-host: do top-down governance privacy laws outweigh organic bottoms-up innovation? So what you're getting at here is, "Hey, if you can actually have that solved (laughing) before it even starts." It was almost as if those services were built for the problem of Web2. >> Yes. >> Not three. >> Right. >> What's your reaction to that? >> I think that is right on the money. And if you look at it from a security standpoint, like if I put my security researcher hat on, I think the biggest problem we have with security and privacy on the web today is that we have these large organizations that are collecting so much data on us, and they just become these honeypots, and there have been huge breaches. Like Equifax, you know, a few years back is a big one, where all your credit card data got leaked, right? And all your credit information got leaked. And we just have this model where these big companies silo your data, they create a giant database, which is worth hundreds of millions of dollars, if not billions, to be attacked. And then someone eventually is going to hack that in order to pull that information. Well, look at this in Web3 instead. So for those in the audience who have used a Web3 application, one of these D apps, to trade cryptocurrencies or something, you'll know that when you go there, you actually connect your wallet. So when you're working with these Web3 apps, you bring your information with you and you connect it. That means that the app has none of that stored, right? So these apps that people are using for trading cryptocurrency or whatever, they have no stored information. So if someone hacks one of these defi exchanges, for instance, there's nothing to steal. And that's because the only time the information is being accessed is when the user is actively using the site.
And so as someone who cares about security and privacy, I go, "Wow, that's a much better data model." And that gives so much more control to the user, 'cause the user just permissions access to the data only during the time period in which they're interacting with the application. And so I think you're right, and we are very excited to be building these tools, right? Because I see it like, if you look at Europe, they basically passed GDPR and then all the companies are going, "We can't comply with that." They keep postponing it or changing it a little bit and trying to make it easier to comply with. But honestly, we just need to switch the data models, so the companies aren't even taking the data, and then they're going to be in a much better spot. >> And GDPR is again a nightmare. I think it's the wrong approach. I always said it was screwed up because most companies don't even know where stuff is stored, never mind how they delete someone's entry in a database. They don't even know what they're collecting. At some level they become so complicated. So right on the money there, good call out. Here's a question for you then. Okay, so do you decouple the wallet from the ID, or are they together, and is it going to be a universal wallet? Do you guys see yourselves as universal domains? Take me through the thinking around how you're looking at the wallet and the actual identity of the user, which obviously is important on the identity side. Is the wallet just universal, or is that going to be coming together? >> Well, I think so. The way that we kind of think about it is that wallets are where people have their financial interactions online. And then identity is much more like your passport. So it's like your driver's license for the internet. So these are two kind of separate products we see longer term that actually work together.
So, if you have a domain name, it actually is easier to make deposits into your wallet, because it's easier to remember to send money to mattgould.crypto. And that way it's easier for me to receive payments or whatever. And then inside my wallet, I'm going to be doing defi trades or whatever. And that doesn't really have an interaction with names necessarily in order to do those transactions. But then if I want to sign into a website or something, I could connect that with my NFT domain. And I do think that these two things are kind of separate. It's still early, so figuring out exactly how the industry is going to shake out over like a five to 10 year time horizon may be a little bit more difficult, and we could see some other emerging, what you would consider, cornerstones of the crypto ecosystem. But I do think identity and reputation is one of those. And I also think that your financial applications of defi are going to be another. So those are the two areas where I see it. And just a note on this: when you have a wallet, it usually has multiple cryptocurrency addresses, so you're going to have like 50 cryptocurrency addresses in a wallet. You're going to want to have one domain name that links back to all those, because you're just not going to remember those 50 different addresses. So that's how I think they collaborate, and we collaborate with several large wallets as well, like blockchain.com and another 30 plus of these, to make it easier for sending and receiving cryptocurrency. >> So the wallet basically is a D app, the way you look at it, it's integrated. >> Yeah. >> Whatever you want, just integrate it in. How do I log into a decentralized application with my NFT domain name? Because this becomes, okay, I love the idea, love my identity. I'm my own NFT. I mean, this video's going to be an NFT soon. We're on board with the program here, but how do I log into my app?
I'm going to have a D app and I've got my domain name. Do I have to submit, is there benchmarking? Is there an approval process? Are there APIs and SDK kind of thinking around it? How are you thinking about dealing with the apps? >> Yeah, so all of the above. And what we're trying to do here is build like an SSO solution, but it's consumer based. So what we've done is adapted some SSO protocols that other people have used, the standard ones, in order to connect that back to an NFT domain in this case. And that way you get the best of both worlds. So you can use these authorization protocols for data permissioning that are, you know, standard Web2 APIs, but then the permissioning system is actually based on the user-controlled NFT. So they're signing it with their private/public key pair in order to make those updates. So that allows you to connect into both of these systems. We think that that's how technology typically impacts the world: it's not like you have something that just replaces something overnight. You have an integration of these technologies over time. And we really see these Web3 components and NFT domains integrating nicely into regular apps. So as an example in the future, when you log in right now, you see Google auth, Facebook auth, or you can type in an email address; you could see NFT, Unstoppable Domains, or NFT authorization, and you can SSO into that website. When you go to a website like an e-commerce website, you could share information about yourself, 'cause you've connected your wallet now. So you could say, "Yes, I am a unique individual. I do live in New York, and I just bought a new house." And then when you permission all that information about yourself to that application, it can serve up a new user experience for you. And we think it's going to be very interesting for doing rewards and discounts online for e-commerce specifically in the future, because that opens up a whole new market.
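The login flow just described, standard authorization protocols with the permissioning anchored to a user-controlled key, can be sketched as a challenge-response. This is a toy: it uses a shared-secret HMAC purely as a runnable stand-in for the asymmetric wallet signature (secp256k1 or similar) a real Web3 SSO would verify against keys published in the domain's on-chain records, and every name here is illustrative.

```python
import hashlib
import hmac
import secrets

# Toy challenge-response login keyed to a domain name. HMAC is a
# self-contained stand-in for a real asymmetric wallet signature.
KEYS = {"mattgould.crypto": secrets.token_bytes(32)}  # user-controlled key

def sign(domain: str, challenge: bytes) -> bytes:
    """User side: prove control of the domain's key."""
    return hmac.new(KEYS[domain], challenge, hashlib.sha256).digest()

class Website:
    def __init__(self):
        self.sessions = {}

    def login_challenge(self) -> bytes:
        return secrets.token_bytes(16)  # fresh nonce per login attempt

    def verify(self, domain: str, challenge: bytes, signature: bytes) -> bool:
        expected = hmac.new(KEYS[domain], challenge, hashlib.sha256).digest()
        ok = hmac.compare_digest(expected, signature)
        if ok:
            self.sessions[domain] = "logged-in"
        return ok

site = Website()
nonce = site.login_challenge()
assert site.verify("mattgould.crypto", nonce, sign("mattgould.crypto", nonce))
print(site.sessions)  # {'mattgould.crypto': 'logged-in'}
```

The fresh nonce per attempt is what makes the proof non-replayable, which is the property that lets a site accept "I control this name" without ever storing a password.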
'Cause they can ask you questions about yourself, and you can deliver that information directly to the app. >> I really think that the gaming market has totally nailed the future use case, which is in-game currency, in-game engagement, in-game data. And now bringing that to kind of horizontally scalable surface areas is huge, right? So, you know, I think that's a huge success on the concept. The question I have to ask you is, are you getting any pushback from ICANN, the Internet Corporation for Assigned Names and Numbers? They've got dot everything now: .club 'cause of Clubhouse, they got .party, .live. I mean, the real domain name people are over here in Web2, and you guys are coming out with Web3. Where is that connect for people who are not following along the Web3 trend? How do you rationalize the domain angle here? >> Yeah, well, so I would say that NFT domains are what domains on DNS were always meant to be, 30 plus years ago. They just didn't have blockchain systems back in the nineties when they were building these things, so there was no way to make them for individuals. So what happened was, DNS actually ended up being for business. If you look at DNS names, there's about 350 million registrations. They're basically all small business. And it's like 20 to 50 million small businesses who own the majority of these .com or these regular DNS domain names. And that's their focus. NFT domains, because all of a sudden you have them in your crypto wallet, are actually for individuals. So that market, instead of being for small businesses, is actually end users. So instead of being for 20 to 50 million small businesses, we're talking about being useful for three to four billion people who have an internet connection. And so we actually think that the market size for NFT domains is somewhere 50 to a hundred X the market size for traditional domain names.
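The "50 to a hundred X" sizing claim can be sanity-checked directly from the figures quoted in the conversation (20 to 50 million small-business domain owners versus three to four billion connected individuals); the arithmetic below just brackets the ratio.

```python
# Sanity check of the market-sizing argument above, using the figures
# quoted in the conversation.
dns_owners = (20e6, 50e6)     # 20-50 million small-business DNS owners
individuals = (3e9, 4e9)      # three to four billion internet users

low = individuals[0] / dns_owners[1]   # most conservative ratio
high = individuals[1] / dns_owners[0]  # most aggressive ratio
print(f"{low:.0f}x to {high:.0f}x")    # prints "60x to 200x"
```

The quoted 50-100X figure sits inside this 60x-200x range, so the claim is at least internally consistent with the speaker's own numbers, whatever one thinks of the underlying assumptions.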
And then the use cases are going to be much more for individuals on a day-to-day basis. So people are going to want to use them for receiving cryptocurrency or receiving dollars or payments or USDC; they're going to want to use them as identifiers on social networks; they're going to want to use them for SSO. And they're not going to want to use them as much for things like websites, which is what Web2 is. And if I'm being perfectly honest, if I'm looking out 10 years from now, I think that these traditional domain name systems are going to want to work with and adopt this new NFT technology, 'cause they're going to want to have these features for the domain names. So in short, I think NFT domain names are domain names with superpowers. This is the next generation of naming systems. And naming systems were always meant to be identity networks. >> Yeah, they hit a glass ceiling. I mean, they just can't, they're not built for that. And people having their own names is essentially what decentralization is all about. 'Cause what is a company? It's a collection of humans that aren't working in one place, they're decentralized. So then you decentralize the identity and everything's changed. So I completely love it. I think you guys are onto something really huge here. You pretty much laid out what's next for Web3, but you guys are in this state of growth. You've seen people signing up for names. That's great. What are the best practices? What are the steps people are taking? What's the common use case for folks who are putting this to work right now? What do you see, what's the progression? >> Yeah, so the thing that we want to solve for people most immediately is we want to make it easier for sending and receiving crypto payments. And I know that sounds like a niche market, but there's over 200 million people right now who have some form of cryptocurrency, right?
And 99.9% of them are still sending crypto using these really long hex addresses. And that market is growing at 60 to 100% year over year. So first we need to get crypto into everybody's pocket, and that's going to happen over the next three to five years. Let's call it, if it doubles every year for the next five years, we'll be there. And then we want to make it easier for all those people to send crypto back and forth. And I will admit I'm a big fan of these stable coins and these, I would say, utility-focused tokens that are coming out, just to make it easier for transferring money from here to Turkey and back or whatever. And that's really the first step for NFT domain names. But what happens is, when you have an NFT domain and that's what you're using to receive payments, then you realize, oh, I can also use this to log into my favorite apps. It starts building that identity piece. And so we're also building products and services to make it more like your identity. And we think that it's going to build up over time. So instead of doing an identity network top down, where you're like a government or corporation and you say, oh, you have to have an ID, here's your password, you have to have it, we're going to do it bottoms up. We're going to give everyone on the planet an Unstoppable Domain name; it's going to give them some utility to make it easier to send and receive cryptocurrency. And we're going to say, "Hey, do you want to verify your Twitter profile?" Yes. Okay, great. You just attach that back. Hey, you want to verify your Reddit? Yes. Instagram? Yes. TikTok? Yes. You want to verify your driver's license? Okay, yeah, we can attach that back. And then what happens is you end up organically building up digital identifiers for people using these blockchain naming systems. And once they have that, they're going to just...
They're going to be able to share that information, and that's going to lead to better experiences online, both for commerce, but also just better user experiences in general. >> When the web first came along, everyone pooh-poohed it at first. Oh, it's terrible. A bad idea. Oh, it's so unreliable, so slow. Hard to find things. Web2, everyone bought a domain name for their company, but then as they added webpages, these permalinks became so long, the fully qualified webpage address, the permalink string, so they bought keywords. And then that's another layer on top. So you started to see that evolution in the web. Now it's kind of hit a ceiling. Here, everyone gets their NFT, they start doing more things. Then it becomes much more of a use case where it's more usable, not just for one thing. So we saw that movie before, so it's like a permalink, permanent. >> Yeah. >> Excellent. >> Yes indeed. I mean, if we're lucky it will be a decentralized, bottoms-up global identity that appreciates user privacy and allows people to opt in. And that's what we want to build. >> And the gas prices thing, that always comes out, always an objection here. I mean, blockchain's perfect for this 'cause it's immutable, it's written on the chain. All good, totally secure. What about the efficiency? How do you see that evolving, real quick? >> Well, so a couple comments on efficiency. First of all, we picked domains as our first product to market, 'cause you need to take a look and see if the technology is capable of handling what you're trying to do, and for domain names, you're not updating them every day. So if you look at traditional domain names, you only update them a couple times per year. Most people set it up and configure it, and then they only have a few changes per year. So the overall usage, it's not like a game. >> An IO problem. >> Right, right, right. So that part's good.
So we picked a good place to start for going to market. And then the second piece is, you're really just asking, are computer systems going to get more efficient over time? And you know, the history of that has always been yes. And I remember the 90s, I had a modem and it was 14 kilobits, and then it was 28, and then 56, and then 100. And now I have a hundred megabits up and down. And I look at blockchain systems, and I don't know if anyone has a law for this yet, but throughput of blockchains is going up over time, and there are going to be continued improvements over the next decade. We need them. We're going to use all of it. And you just need to make sure you're planning a business that makes sense for the current environment. Just as an example, if you had tried to launch Netflix for online streaming in 1990, you would've had a bad time, because no one had bandwidth. So yeah, some applications are going to have to wait until a little bit later in the cycle, but I actually think identity is perfectly fine to go ahead and get off the ground now. >> Yeah, there are motivated parties for innovation here. I mean, PointCast failed miserably; they tried to stream video over T1 lines, but back in those days, nothing. So again, we've seen those speeds double, triple in homes right now. Matt, congratulations, great stuff. Final, TikTok moment here. How would you summarize, in a short clip, the difference between digital identity in Web2 and Web3? >> In Web2, you don't get to own your own online presence, and in Web3 you do get to own it. So I think if you were going to simplify it, really Web3 is about ownership, and we're excited to give everyone on the planet a chance to own their name and choose when and where and how they want to share information about themselves. >> So now users are in charge. >> Exactly, you got it. >> They're not the product anymore. If you're going to be the product, you might as well monetize the product, and that's the data.
Real quick thoughts just to close out: the role of data in all this, your view. >> We haven't enabled users to own their data online since the beginning of the internet. And we're now starting to do that. It's going to have profound changes for how every application on the planet interacts with its users. >> Awesome stuff, Matt, take a minute to give a plug for the company. How many employees have you got? What are you guys looking for in hiring, fundraising? Give a quick commercial for what's going on at Unstoppable Domains. >> Yeah, so if you haven't already, check us out at unstoppabledomains.com; we're also on Twitter at Unstoppable web. And we have a wonderful podcast as well that you should check out if you haven't already. And we just crossed a hundred people. We're growing three to five hundred percent year over year. We're basically hiring every position across the company right now. So if you're interested in getting into Web3, even if you're coming from a traditional tech background, please reach out. We love teaching people about this new world and how you can be a part of it. >> And you're a virtual company. Do you have a little headquarters, or is it all virtual? What's the situation there? >> Yeah, as you assumed, we are 100% remote and asynchronous, and we're currently in five countries across the planet, mostly concentrated in the US and the EU areas. >> I heard a rumor too. Maybe you can confirm or deny this rumor. I heard a rumor that you have a mandatory vacation policy. >> This is true. And that's because we are a team of people who like to get things done. But we also know that recovery is an important part of any organization. So if you push too hard, we want to remind people we're on a marathon, right? This is not a sprint. And so we want people to be with us long term, and we do think that this is a 10 year move. And so yeah, we do force people. We'll unplug you at the end of the year, if you- >> That's what I was going to ask you.
So what's the consequence if I don't take vacation? >> Yeah, we literally unplug you. (both laughing) >> You won't be able to get into Slack. Right, and that's (indistinct). >> Well, when people start having their avatars be their bot and you don't even know what you're unplugging at some point, that's where you guys come in with the NFT, saying that that's not the real person, it's not the real human. >> Yeah, exactly. We'll be able to check. >> NFTs are a great innovation, great use case. Matt, congratulations. Thanks for coming on and sharing the story to kick off this showcase with theCUBE. Thanks for sharing all that great insight. Appreciate it. >> Yeah, John, I had a wonderful time. >> All right, this is theCUBE's Unstoppable Domains showcase. We've got 10 great pieces of content we're dropping all today. Check them out. Stay with us for more coverage. I'm John Furrier with theCUBE. Thanks for watching. (upbeat music)

Published Date : Mar 10 2022


Matt Mickiewicz, Unstoppable Domains | Unstoppable Domains Partner Showcase


 

(upbeat music) >> Hello, welcome to theCUBE's presentation with Unstoppable Domains. It's a showcase where we're featuring all the best content in Web3, and with the Unstoppable showcase, I'm John Furrier, your host of theCUBE. We've got a great guest here, Matt Mickiewicz, who's the Chief Revenue Officer of Unstoppable Domains. Matt, welcome to the showcase, appreciate it. >> Thank you for having me. >> So the theme of this segment is the potential of the Web3 marketplace with Unstoppable Domains. You're the Chief Revenue Officer, you guys have a very interesting concept that's going extremely well, congratulations. But you're using NFTs for access and domains, and of course the metaverse is huge. People want their own domains, but it's not just like real estate in the sense of a website. It's bigger than that; there's a lot going on. So take us through, what is the value proposition and what is the product? >> Absolutely. So for the past 20 years, most of us have been interacting on the internet using usernames issued to us by big corporations like Facebook, Google, Twitter, TikTok, Snapchat, et cetera. Whenever we get these usernames for free, it's because we and our data are the product. As some of the recent leaks in the media have shown, the incentives of individuals and companies are not always aligned. And most importantly, individuals are not in control of their own digital identity and their data, which means they can't economically benefit from the value they create online. Think of Twitter as a two-sided marketplace with 0% revenue share back to its creators. We're now living in the creator economy, and we believe that individuals should see the economic rewards of what they do and create online. That's what we're trying to do at Unstoppable Domains: provide user-owned and controlled identity to four and a half billion internet users.
>> It's interesting to see the change that's happening with Web3, and just in cultural terms: users are expecting to be part of the creator, the personality of the company. There's almost this disintermediation of the middleman; whether it's an ad network or a gatekeeper of any kind, people are going direct, right? So if I'm an artist, I can go direct to my fans. >> Exactly. So Web3 really shifts the power away from aggregators. Aggregators and marketplaces have been some of the best business models of the last 20 years on the internet, but Web3 is going to dramatically change all that over the next decade, bringing more power back into the hands of consumers. >> What type of companies do you guys work with and partner with that we see out there? Give us some examples of the kinds of companies you're doing business with and partnering with. >> Yeah, so let's talk about use cases first, actually. So the big use case that we identified initially for NFT domain names was around cryptocurrency transfers. Anyone who's ever bought cryptocurrency and tried to transfer it between accounts or wallets is familiar with these awkwardly long hexadecimal strings of random numbers and letters, where if you make a single typo your money is lost forever. That's a pretty scary experience that exists today, in a $2 trillion asset class with 250 million users. So the first set of partners that we worked on integrating with were actually crypto wallets and exchanges. What we allow users to do is replace all their long hexadecimal wallet addresses with a single human-readable name, like John.NFT or MattMickiewicz.crypto, to allow for simple crypto transfers. >> And how do the exchanges work with you guys on that? Is it a plugin, is it co-locating code together? What's the relationship between exchanges and Unstoppable Domains? >> Yeah, absolutely, great question.
So exchanges actually have to do a little bit of engineering list to work with us and they can do that by either using our resolution libraries or using one of our APIs in order to look up an Unstoppable Domain and figure out all the wallet addresses that's associated with that name. So today we work with dozens of the world's top exchanges and wallets ranging from OKX to Coinbase wallet, to Trust wallet, to bread wallet, and many many others. >> I got to ask you on the wallet side, is that a requirement in terms of having specific code and are the wallets that you work well with? Explain the wallet dynamic between Unstoppable Domains and wallets. >> Yeah, so wallets all have this huge usability problem for their users because every single cryptocurrency held by every single one of their users has a different hexadecimal wallet address. And once again every user is subject to the same human fallacies and errors where if they make a single type their money can be lost forever. So what we enable these wallets to do is to make crypto transfer simple and less scary than the current status quo by giving the users an Unstoppable name that they can use to attach to all the wallet addresses on the back end. So companies like Trust Wallet for example, which has 10 million user or Coinbase Wallet. When you go to the crypto transfer fields, there you can just type in an unstoppable name It'll correctly route the currency to the right person, to the right wallet, without any chance for human error. >> When these big waves coming out I got to ask this question, 'cause a lot of people in the mainstream are getting into it now. It reminds me of the web wave that hit the big thing was how many people are coming online, was one of the key metrics and how many web pages are being developed was another metric, which meant that people were building out webpages. And it's hard to look back and think, wow, that was actually a KPI. 
So internet users and webpages were the two proxies, 'cause then search engines came out and everything else happened. So I got to ask you, there are people watching, they're seeing it in commercials on TV, they're seeing it everywhere, stadiums are named after crypto companies. So, the bottom line is people want to know how NFT domains take the fear out of working with crypto and sending crypto. >> Yeah, absolutely. So imagine we had to navigate the web using IP addresses rather than typing in Google.com. You'd have to type in a random string of numbers that you'd have to memorize. That would be super painful for users, and the internet wouldn't have gotten to where it is today with almost 5 billion people online. Throughout the history of computer networks, we've had human-readable naming systems built on top in every single instance. It's almost crazy that we got to a $2 trillion asset class with 250 million users worldwide, 13 years after the Satoshi white paper, without a human-readable naming system other than Unstoppable Domains and a few of our competitors. That's a fundamental problem that we need to solve in order to go from 250 million crypto users in 2022 to 5 billion crypto users a decade from now. >> And just to point out, not to look back and maybe make a correlation, but I will: if you look at the naming system of DNS, what it did for IP addresses, that's one major innovation that enabled the web. Then you look at what keyword navigation has done on top of DNS, what that did for the industry, and that basically birthed Google keywords, basically ads. So that's trillions and trillions of dollars. Again, now shifting to you guys, is that how you see it? Obviously it's decentralized, so what's different? Okay, I get it, so if you compare: Google was successful, the keyword advertising industry, for the last 20 or 25 years. >> What's different now is? >> Yeah, what's different now is the technology inflection points.
So blockchains have evolved to a point where they enable high throughput, high transaction volume, and true decentralized ownership. The NFT standard, which is only a couple years old, has taken off massively around trading of profile pictures like CryptoPunks and the Bored Ape Yacht Club, but the use cases extend much further than just a cool JPEG that goes up in value two or three X year over year. There is a true use case here around ownership of identity, ownership over data, decentralized login authentication, and permissioned data sharing. One of the sad things that happened on the internet in the last decade, really, was that the platforms built out have not allowed developers to build on top of them in a trustless, permissionless way. Developers who built applications on top of the early monopolies in the last decade got the rules changed on them: APIs cut off, new fees instituted. That's not going to happen in Web3, because it's all permissionless. Once an NFT is minted, it's custodied in a user's own wallet; we cannot take it away, and it will continue to exist in perpetuity, regardless of what happens to Unstoppable Domains, which gives developers a lot more confidence in building new products for the Web3 identity standard that we're building out. >> You know what's amazing is that's a whole other generational shift. I've always been a big fan of abstractions: when innovation is needed, when there are problems that need to be solved, messes to be cleaned up, a good abstraction layer on top of new architecture is really, really phenomenal. I guess the key question I have for you is: theCUBE, we have all this video, where's our NFT, how should we implement NFTs? >> There's a couple different ways you could think about it. You could do proof of attendance protocol NFTs, which are a really interesting way for users to show that they were at a particular event.
So just in the same way that people collect T-shirts from conferences, people will be collecting NFTs to show they were attending in-person cultural moments, or that they were part of an event online or offline. You could do NFTs for your employees to show that they were at your company during certain periods of the company's growth. So think of replacing their resume with a cryptographically secure resume like this on the blockchain, in perpetuity. Now, more than half of all resumes contain lies, which is a pretty gnarly problem as a hiring manager that we constantly have to sort through. So that's where this can impact that side of the market as well. >> That's awesome, and I think this is a use case for everything. We appreciate that. And of course we can have the most favorite CUBE moments; it could be a CUBE host NFT, a Bored Ape out there. Why not have a Bored CUBE host going on and then... >> We ran an auction for charity on OpenSea. >> All right, great stuff. Now let's get into some of the cool tech nerd stuff, which is really the login piece, which I think is fascinating. Having NFTs be a login mechanism is another great innovation, okay. So this is cool, 'cause it's like, think of it as one-click NFTs, if you will. What's the response been on this login with Unstoppable for that product? What are some of the use cases? Can you give some examples of the momentum and traction? >> Yeah, absolutely. So we launched the product less than 90 days ago and we already have 90 committed or integrated partners live today with the login product. And this replaces login with Google, login with Facebook, with a way that is user-owned and user-controlled. And over time, people will be attaching additional information back to their NFT domain name, such as their reputation, their history, things they've done online, and be able to selectively share that with applications they interact with in order to gain rewards.
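The login flow just described, authenticating with a domain instead of an email/password pair, comes down to proving control of the wallet that owns the name. Here is a minimal challenge-response sketch of that idea. Everything in it is an assumption for illustration: the ownership table, the function names, and especially the use of HMAC, which stands in for a real on-chain ownership check and ECDSA wallet signature purely to keep the sketch self-contained. It is not Unstoppable's actual protocol.

```python
import hashlib
import hmac
import secrets

# Hypothetical ownership record: domain -> secret key of the owning wallet.
# In reality ownership lives on-chain and signing is done by the wallet's
# private key; HMAC with a shared key is only a self-contained stand-in.
OWNERS = {"alice.crypto": b"alice-wallet-key"}

def make_challenge() -> str:
    # The application issues a fresh one-time nonce for the user to sign,
    # so a captured signature cannot be replayed for a later login.
    return secrets.token_hex(16)

def sign(domain: str, challenge: str) -> str:
    # The user's wallet "signs" the challenge with the key that owns the name.
    key = OWNERS[domain]
    return hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()

def verify(domain: str, challenge: str, signature: str) -> bool:
    # The application recomputes the expected signature from the recorded
    # owner key and compares in constant time.
    expected = hmac.new(OWNERS[domain], challenge.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The design choice worth noting is that no password ever exists: the site stores no credential to leak, and the same domain name works as the login identity across every integrated application.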
Once you own all of your data and you can choose who you share it with, companies will incentivize you to share data. For example, imagine you just bought a new house and you have 3000 square feet to furnish. If you could tell that fact to a company like Wayfair and prove it, would they be incentivized to give you discounts? You're spending 10, 20, $30,000, and you'll do all of your purchasing there rather than spreading it across other e-commerce retailers. For sure they would. But right now, when you go to that website, you're just another random email address. They have no idea who you are, what you've done, what your credit score is, whether you're a new house buyer or not. But if you could permission-share that using a login with Unstoppable product, I mean, the web would just be much, much different. >> And I think one of the things too, as these, I call them analog old-school companies, old guard companies as we refer to them in theCUBE talk here. We always call the old guard the people who aren't innovating. You could think about companies having more community too, because if you have more sharing, and you have this marketplace concept, and you have these new dynamics of how people are working together, sharing will provide more transparency but also security on identity. Therefore things are going to happen organically. That's a community dynamic; what's your view on that, and what's your reaction? >> Communities are such an important part of Web3 and the crypto ecosystem in general. People are very tightly knit, they all support each other. There's a huge amount of collaboration in this space, because we're all trying to onboard the next billion users into the ecosystem. And we know we have some fundamental challenges and problems to solve, whether it's complex wallet addresses, whether it's the lack of portable data sharing, whether it's just simple education, right?
I'm sure tens of millions of people have come to crypto for the first time during this year's Super Bowl, based on some of those awesome ads they ran. >> Yeah, love the QR code, that's a direct response. QR codes have been around for a long time; I remember in the late 90's there was a device that read QR codes and did navigation to a webpage. So I mean, QR codes are super cool, a great way to get people in, and we're all using them too, with the pandemic, for ordering food. So I think QR codes are here to stay. In fact, we should have a QR code on all of our images here on the screen too. So we'll work on that. But I got to ask you on the project side, now let's get into the devs and kind of the applications, the users that are adopting Unstoppable and this new way of things. Why are they gravitating towards this login concept? Can you give some examples and some color commentary on why these dApp, decentralized application, guys and gals are programming with you guys? >> Yeah, they all believe in the potential of what we're trying to create around user-owned, controlled identity. We're the only company in the market right now with a product that's live and working today. There's been a lot of promises made, and we're the first ones to actually deliver. So companies like Cook Finance, for example, are seeing the benefit of being able to have their users go through a simple process to check in and authenticate into the application using their NFT domain name, rather than having to create an email address and password combination as a login, which inevitably leads to problems such as lost passwords, password resets, all those fun things that we used to deal with on a daily basis. >> Okay, so now I got to ask you about the kind of partnerships you guys are looking at doing. I can only imagine, in the old school days you had a registry and you had registrars, you had a sales mechanism. I noticed you guys are selling NFT kind of like domain names on your website.
Is that the current situation, is that going to be ongoing? How do you envision your business model evolving, and what kind of partnerships do you see coming along? >> Yeah, absolutely. So we're working with a lot of different companies, from browsers to exchanges, to wallets, to individual NFT projects, to more recently even exploring partnership opportunities with fashion brands, for example. The market is moving so, so fast, and what we're trying to essentially do here is create the standard naming system for Web3. So a big part of that for us will be working with partners like blockchain.com and with Circle, who's behind the USDC coin, on creating registries such as .blockchain and .coin and making those available to tens of millions, and ultimately hundreds of millions and billions, of users worldwide. We want an Unstoppable domain name to be the first asset that every user in crypto gets, even before they buy their Bitcoin, Ethereum or Dogecoin. >> It makes a lot of sense to abstract away the long hexadecimal string we all know, that we all write down, put in a safe, and hopefully don't forget about. I always say, make sure you tell someone where your address is, so in case something happens you don't lose all that crypto. All good stuff. I got to ask the question around the ecosystem. Okay, can you share your view and vision, either yourself or the company's, when you have this kind of new market? You have all kinds of, we mentioned the web was a good example, right? Web pages, you needed web developers and tools. You had HTML by hand, then you had all these tools. So tools and platforms and things kind of grew together. How is the Web3 stakeholder ecosystem space evolving? What are some of the white spaces? What are some of the clearly defined areas that are developing?
>> Yeah, I mean, we've seen an explosion in new smart contract blockchains in the past couple of years actually going live, which is really interesting, because they support a huge number of different use cases, with different trade-offs on each. We recently partnered and moved over our primary infrastructure to Polygon, which is a leading EVM-compatible smart chain, which allows us to provide free gas fees to users for minting and managing their domain name. So we're trying to remove all obstacles around user adoption. You no longer need to have Ethereum in your wallet in order to be an Unstoppable Domains customer or user; you don't have to worry about paying transaction fees every time you want to update the wallet addresses associated with your domain name. We want to make this really big and accessible for everybody, and that means driving down costs as much as possible. >> Yeah, it's a whole nother wave. It's a wave that's built on the shoulders of others. It's a shift in infrastructure, new capabilities, new applications. I think it's a great thing you guys do with the naming system, it makes a lot of sense. The abstraction layer creates that ease of use, it simplifies things, makes things easier. I mean, that was the promise of these abstraction layers. Final question: if I want to get involved, say we want to do a CUBE NFT with Unstoppable, how do we work with you? How do we engage? Can you give a quick plug on what companies can do to engage with you guys on a business level? >> Yeah, absolutely. So we're looking to partner with wallets, exchanges, browsers and companies who are in the crypto space already and realize they have a huge problem around usability with crypto transfers and wallet addresses. Additionally, we're looking to partner with decentralized applications as well as Web2 companies who perhaps want to offer login with Unstoppable domain functionality.
In addition to, or in replacement of, the login with Google and login with Facebook buttons that we all know and love. And we're looking to work with fashion brands and companies in the sports sector who perhaps want to claim their Unstoppable name free of charge from us, I might add, in order to use that on Twitter or in other marketing materials that they may have out there in the world, to signal that they're not only forward-looking, but that they're supportive of this huge wave that we're all riding at the moment. >> Matt, great insight. Chief Revenue Officer, Unstoppable Domains. Thanks for coming on the showcase, theCUBE and Unstoppable Domains sharing the insights. Thanks for coming on. >> Thank you. >> Okay, this is theCUBE's coverage here of the Unstoppable Domains showcase. I'm John Furrier, your host, thanks for watching. (upbeat music)

Published Date : Mar 10 2022



Ren Besnard & Jeremiah Owyang | Unstoppable Domains Partner Showcase


 

(bright upbeat music) >> Hello, welcome to theCUBE, "Unstoppable Domains Showcase." I'm John Furrier, your host of theCUBE. We got a great discussion here with the influencers around what's going on in Web 3.0, and also this new sea change, cultural change around this next-generation internet, web, cloud, all happening. Jeremiah Owyang, Industry Analyst and Founding Partner of Kaleido Insights. Jeremiah, great to see you, thanks for coming on, I appreciate it. Ren Besnard, Vice President of Marketing at Unstoppable Domains, in the middle of all the action. Gentlemen, thanks for coming on theCUBE for this showcase. >> Wow, my pleasure. >> Thanks for having us, John. >> Jeremiah, I want to start with you. You've seen many waves of the web in all of your work for over a decade now. You've seen the Web 2.0 wave; now Web 3.0 is here. And it's not, I wouldn't say hyped up, it's really just ramping up. And you're seeing real practical examples. You're in the middle of all the action. What is this Web 3.0? Can you frame it for us? I mean, you've seen many webs. What does Web 3.0 mean, what is it all about? >> Well John, you and I worked in the Web 2.0 space, and essentially that enabled peer-to-peer media where people could upload their thoughts and ideas and videos without having to rely on centralized media. Unfortunately, that distributed and decentralized movement actually became centralized on the platforms, which are the big social networks and big tech companies. And this has caused an uproar, because the people who were creating the content did not have control, could not control their identities, and could not really monetize or make decisions. So Web 3.0, which is a moniker for a lot of different trends, including crypto, blockchain and sometimes the metaverse, is about undoing the control that has become centralized. And the power is now shifting back into the hands of the participants again.
And in this movement, they want to have more control over their identities, their governance, the content that they're creating, how they're actually building it, and then how they're monetizing it. So in many ways it's changing the power, and it's a new economic model. So that's Web 3.0, without really even mentioning the technologies. Is that helpful? >> Yeah, it's great. And Ren, we've talked about this on theCUBE many times, and one notable stat, I don't think it's been reported, it's been more kind of a rumor: I hear that 30% of the Berkeley computer science students are dropping out and going into crypto or blockchain or decentralized startups. Which means that there's a big wave of talent coming in. You're seeing startups, you're seeing a lot more formation, you're seeing a lot more, I would say, kind of a ramping up of real people, not just people with dreams, it's actual builders out here doing stuff. What's your take on the Web 3.0 movement with all this kind of change happening, from people and also the new ideas being refactored? >> I think that the competition for talent is extremely real. And when we start looking at the stats, we see that there is an enormous draw of people that are moving into this space: people that are fascinated by the technology and are embracing the ethos of Web 3.0. And at this stage I think it's not only engineers and developers, but we have moved into a second phase where we see that a lot of supporting functions, you know, marketing being one of them, sales, business development, are being built up quite rapidly. It's not without reminding me of the mid 2000s, you know, when I started working with Google. At that point in time the walled gardens were rapidly absorbing vast, vast cohorts of young graduates and more experienced professionals that were passionate and moving into the web environment. And I think we are seeing a movement right now which is not dissimilar, except faster.
>> Yeah, Jeremiah, you've seen the conversations of the cloud, I call it kind of the cloud revolution. You had mobile in 2007. But you got Amazon Web Services, which changed the application space and how people developed in the cloud. And again, that created a lot of value. Now you're seeing the role of data as a huge part of how people are scaling, and the decentralized movements. So you've got cloud, which is kind of classic today, state-of-the-art enterprise and/or app development, and you've got now the decentralized wave coming, okay. You're seeing apps being developed on that architecture. Data is central in all this, right. So how do you view this as someone who's watching the landscape? You know, these walled gardens are hoarding all the data, I mean, LinkedIn, Facebook. They're not sharing that data with anyone, they're using it for themselves. So as- >> That's right. >> Taking control back comes to the forefront. How do you see this market with the applications, and what comes out of that? >> So of the five things that I had mentioned that are decentralizing, (Jeremiah coughing) the ones that have been easier to move across have been the ability to monetize and to build. But the data aspect has actually stayed pretty much central, frankly. What has decentralized is the contracts, the blockchain ledgers; those have decentralized. But the funny thing is, often a big portion of these blockchain networks are on Amazon, 63 to 70%, same thing with (indistinct). So they're still using the Web 2.0 architectures. However, we're also seeing other forms like IPFS, where the data could be spread across a wider range of folks. But right now we're still dependent on Web 2.0. So the vision and the promise of Web 3.0, when it comes to full decentralization, is not here by any means. I'd say we're at Web 2.25. >> Pre-Web 3.0, but the action's there.
How do you guys see the dangers? 'Cause there's a lot of negative press, but also there's a lot of positive press. You're seeing a lot of fraud, we've seen a lot of the crypto fraud over the past years. You've seen a lot of positive now. It's almost a self-governance thing, and environment, the way the culture is. But what are the dangers? How do you guys educate people, what should people pay attention to, what should people look for to understand, you know, where to position themselves? >> Yes, so we've learned a lot from Web 1.0, Web 2.0, the sharing economy. And we are walking into Web 3.0 with eyes wide open. So people have rightfully put forth a number of challenges: the sustainability issues with excessive use of computing and mining, the excessive amount of scams that are happening in part due to unknown identities, also the architecture breaks down in some periods, and there's a lack of regulation. This is something different, though. In the last periods that we've gone through, we didn't really know what was going to happen, and we walked in thinking this is going to be great. The sharing economy, the gig economy, the social media's going to change the world around. It's very different now. People are a little bit jaded. So I think that's a change. And so I think we're going to see that sorted out and sussed out, just like we've seen with other trends. It's still very much in the early years. >> Ren, I got to get your take on this whole, should influencers and should people be anonymous, or should they be doxxed out there? You saw the Bored Ape guys that were kind of doxxed a little bit there. And that went viral. This is an issue, right? Because we just had a problem of fake news, fake people, fake information. And now you have a much more secure environment, immutability is a wonderful thing. It's a feature, not a bug, right? So how is this all coming down? And I know you guys are in the middle of it with NFTs as authentication.
Take us through it. What's your take on this? Because this is a big issue. >> Look, I think first, I am extremely optimistic about technology in general. So I'm super, super bullish about this. And yet, you know, I think that while crypto has so many upsides, it's important to be super conscious and aware of the downsides that come with it too, you know. If you think about it, at every Fortune 500 company there is always training required of all employees on internet safety, reporting of potential attacks and so on. In Web 3.0, we don't have those kinds of standard reporting mechanisms yet for bad actors in that space. And so when you think about influencers in particular, they do have a responsibility to educate people about the potential, but also the dangers, of the technology, of Web 3.0, of crypto basically. Whether you're talking about hacks, or online safety, the need for hardware wallets, impersonators on Discord, you know, securely storing your seed phrase. So every actor, influencer or else, has got a role to play. I think that in that context, to your point, it's very hard to tell whether influencers should be anonymous, pseudonymous, or fully doxxed. The decentralized nature of Web 3.0 will probably lead us to see a combination of those anonymity levels, so to speak. And the movements that we've seen around some influencers' identities becoming public are particularly interesting. I think there's probably a convergence of Web 2.0 and Web 3.0 at play here, you know, maybe converging on the notion of 2.5. But for now, I think in Web 2.0, all business founders and employees are known, and they're held accountable for their public comments and their actions. If Web 3.0 enables us to be anonymous, if DAOs have voting control, you know, what happens if people make comments and there is no way to know who they are, basically? What if the DAO doesn't take appropriate action?
I think eventually there will be an element of community self-regulation, where influencers will be acting in the best interest of their reputation. And I believe that the communities will self-regulate themselves and will create natural boundaries around what can be said or not said. >> I think that's a really good point about influencers and reputation. Because, Jeremiah, does it matter that you're anonymous and have an icon that could be an NFT or a picture, if I have an ongoing reputation? I have trust, there's trust there. It's not like just a bot that was created just to spam someone. You know, I'm starting to get into this new way. >> You're right, and that word you said, trust, that's what this is really about. But we've seen that publicly doxxed people, with their full identities, have made mistakes. They have pulled the wool over people's eyes and really scammed them out of a lot of money. We've seen that; that doesn't change anything in human behavior. So I think over time we will see a new form of a reputation system emerge, even for pseudonyms, and perhaps for people that are just anonymous, that only show their wallet address, a series of numbers and letters. That might take a new form of a Web 3.0 FICO score, and you could look at their behaviors. Did they transact, you know, how did they behave? Were they involved in projects that were not healthy? And because all of that information is public on the chain, you can go back in time and see that. We might see a new form of scoring emerge. Of course, who controls that scoring? That's a whole nother topic going on, control and trust. So right now, John, we do see that there are a number of projects, new NFT projects, where the founders will claim, and use this as a point of differentiation, that they are fully doxxed. So you know who they are and their names. Secondly, we're seeing a number of products or platforms that require KYC, know your customer.
So that's self-identification, often with a government ID or credit card, in order to bridge out your coins and turn them into fiat. In some cases that's required in some of these marketplaces. So we're seeing a collision here between full names, pseudonyms, and being anonymous. >> That's awesome. And I think this is, again, a whole new form of governance. Ren, you mentioned some comments about DAOs. I want to get your thoughts again. You know, Jeremiah, we've become historians over the years. We're getting old, I'm a little bit older than you. (Jeremiah laughs) But we've seen the- >> You're a young man. >> You know, I remember breaking into the business when the computer standards bodies were built to be more organic, and then they became much more of a, kind of an anti-innovation environment, where the companies would get involved in the standards organization just to slow things down and mark things up a little bit. So, you know, you look at DAOs like, hmm, is a DAO a good thing or a bad thing? The answer, from people I talk to, is: it depends. So I'd love to get your thoughts on getting momentum and becoming de facto with value, a value proposition, vis-a-vis just a DAO for the sake of having a DAO. This has been a conversation that's been kind of inside baseball here, inside the ropes of the industry, but there are trade-offs. Can you guys share your thoughts on when to do a DAO and when not to do a DAO, and the benefits and trade-offs of that? >> Sure, maybe I'll start off with a definition and then we'll go to Ren. So a DAO, a decentralized autonomous organization: the best way to think about this is it's a digital cooperative, and we've heard of worker cooperatives before. The difference is that they're using blockchain technologies in order to do three things: identity, governance, and reward mechanisms. They're relying on Web 2.0 tools and technologies like Discord and Telegram and social networks to communicate.
And as a cooperative, they're trying to come up with a common goal. Ren, what's your take? That's the setup. >> So, you know, for me, when I started my journey into crypto and Web 3.0, I had no idea what a DAO actually meant. And an easy way for me to think of it and to grasp its nature was the comparison between a DAO and perhaps a more traditional company structure, you know. In the traditional company structure, you have (indistinct), the company's led by a CEO and other executives. The DAO is a flat structure, and it's very much led by a group of core contributors. So to Jeremiah's point, you know, you get that notion of a cooperative type of structure. The decision making is very different, you know. We're talking about a super high level of transparency, proposals getting submitted, and voting systems using (indistinct), as opposed to, you know, management making decisions behind closed doors. I think that speaks to a totally new form of governance. And I think we have hardly, hardly scratched the surface. We have seen recently very interesting moments in Web 3.0 culture, and we have seen how DAOs suddenly have to make certain decisions and come to moments of claiming responsibility in order to police the behavior of some of the members. I think that's important. I think it's going to redefine how we're thinking about governance, particularly new governance models. And I think it's going to pave the way for a lot of super interesting structures in the near future. >> Yeah, and that's a great point. >> Go ahead, Jeremiah. >> That's a great point, Ren, around the transparency for governance. So, John, you posed the question, does this make things faster or slower? And right now most DAOs are actually pretty slow, because they're set up as a flat organization. So as a response to that, they're actually shifting to become representative democracies. Does that sound familiar?
Where you can appoint delegates and use tokens to vote for them, and they have decision power. Almost like a committee, and they can function. And so we've actually seen there sometimes is a hierarchy, except the person at the top is voted in by those that have the tokens. In some cases, the people at the top had the most tokens. But that's a whole other topic. So we're seeing a wide variety of governance structures. >> You know, Ren, I was talking with Matt G, the founder of Unstoppable. And I was telling him about the Domain Name System. And one little trivia note that many people don't know is that the US government, because the internet was started by the US, the Department of Commerce, kept it on a tight leash, because the international telecommunications bodies wanted to get their hands on it, because of ccTLDs and other things. So at that time, because the innovation wasn't yet baked out, it was organically growing the governance, the rules of the road, keeping it very stable versus meddling with it. So there are certain technologies, Jeremiah, that we as a community should keep an eye on and not formalize anything, like the government did with the Domain Name System. They kept it tight and then finally released it, I think multiple years after 2004; I think it went over to the ITU. But this is a big point. I mean, if you get too structured, organic innovation can't grow. What's your reaction to that? >> So I think, you know, to take a stab at it: we have as a business, you know, thinking of Unstoppable Domains, a strong incentive to innovate. And this is what is going to determine long-term value growth for the organization, for partners, for users, for customers. So you know, the degree of formalization actually gives us a sense of purpose and a sense of action. And if you compare that to a DAO, for instance, you can see how some of the upsides and downsides can pan out either way. It's not to say that there is a perfect solution.
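The token-weighted, delegate-style voting Jeremiah describes can be sketched in a few lines. This is a minimal illustration only, not any particular DAO's implementation; the holder names and the yes/no ballot format are hypothetical.

```python
from collections import defaultdict

def tally_votes(token_balances, delegations, votes):
    """Token-weighted tally that honors delegations.

    token_balances: {holder: tokens held}
    delegations:    {holder: delegate} for holders who hand their
                    voting power to a delegate
    votes:          {voter: choice} ballots actually cast
    """
    # Each holder's tokens count toward themselves or their delegate.
    power = defaultdict(int)
    for holder, tokens in token_balances.items():
        power[delegations.get(holder, holder)] += tokens

    # Sum the token weight behind each choice.
    tally = defaultdict(int)
    for voter, choice in votes.items():
        tally[choice] += power[voter]
    return dict(tally)

balances = {"alice": 100, "bob": 40, "carol": 10}
delegations = {"carol": "alice"}  # carol delegates her 10 tokens to alice
votes = {"alice": "yes", "bob": "no"}
print(tally_votes(balances, delegations, votes))  # {'yes': 110, 'no': 40}
```

Note how alice's ballot carries 110 tokens of weight, her own plus carol's: the "person at the top" is whoever attracts delegated tokens, which is the representative-democracy shift described here.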
I think one of the advantages of the DAO is that you can let more people contribute. You can probably remove biases quite effectively, and you can have a high level of participation and involvement in decisions, and own the upside in many ways. You know, as a company, it's a slightly different setup. We have the opportunity to coordinate a very diverse and part-time workforce in a very different way. And we do not have to deal with the inefficiencies that might be inherent to some forms of extreme decentralization. So there is a balance in organizational structure that comes with either side. >> Awesome. Jeremiah, I want to get your thoughts on a trend that you've been involved in, that we've both been involved in. And you're seeing it now in the social media world: the role of the influencer. It's kind of moved from what it was in open source, where an influencer was someone who shared, created content, and enabled things, to much more of a vanity play: updating the photo on Instagram and having a large audience. So is there a new influencer model with Web 3.0, or is it, I control the audience, I'm making money that way? Is there a shift in the influencer role, or ideas that you see that should be in place for what the role of an influencer is? 'Cause as Web 3.0 comes, you're going to see that role become instrumental. We've seen it in open source projects. Influencers, you know, are the people who write code or ship code. So what's your take on that? Because this has been a conversation. People have been taking the word influencer and redefining and reframing it. >> Sure, the influencer model really hasn't changed that much, but the way influencers behave has when it comes to Web 3.0. In this market, I mean, there's a couple of things. Some of the influencers are investors. And so when you see their name on a project or a new startup, that's an indicator of a higher level of success. You might want to pay more attention to it or not.
Secondly, influencers themselves are launching their own NFT projects. So, Gary Vaynerchuk and a number of celebrities, Paris Hilton is involved, are doing theirs as well. Steve Aoki, the famous DJ, launched his as well. So they're going in head first, participating and building in this model. And their communities are coming around them, and they're building an economy. Now the difference is, it's not, I speak as an influencer to the fans. The difference is that the fans are now part of the community and they literally hold and own some of the economic value, whether it's tokens or the NFTs. So it's a collaborative economy, if you will, where they're all benefiting together. And that's a big difference as well. >> Can you see- >> Lastly, there's one little tactic we're seeing where marketers are airdropping NFTs, branded NFTs, into influencers' wallets, so you can see them in there. So there are new tactics forming as well. Back to you. >> That's super exciting. Ren, what's your reaction to that? Because he just hit on a whole new way of how engagement's happening, how people are closed-looping their votes, their votes of confidence, or voting with their wallets. And the brands, which are artists, are now influencers. I mean, this is a whole game-changing instrumentation level. >> I think that what we are seeing right now is super reinvigorating as a marketeer who's been around for a few years, basically. I think that the shift in the way brands are going to communicate and engage with their audiences is profound. It's probably as revolutionary, even more revolutionary, than the move brands made in getting into digital. And you have that sentiment of a gold rush right now, with a lot of brands that are trying to understand NFTs and how to actually engage with those communities and those audiences. There are many levels on which brands and influencers are going to engage.
There are many influencers that actually advance the message and the mission, because the explosion of content on Web 3.0 has been crazy. Part of that is due to the network-effect nature of crypto. Because, as Jeremiah mentioned, people are incentivized to promote projects. Holders of an NFT are also incentivized to promote it. So you end up with a flywheel, which is pretty unique, of people that are hyping their project, educating other people about it, and commenting on the ecosystem. With IP rights being given to NFT holders, you're going to see people promote brands instead of the brands actually having to. And so the notion of brands delivering elements of the value to their fans is something that's super attractive, extremely interesting. And I think, again, we have hardly scratched the surface of all that is possible in that particular space. >> That's interesting. You guys are bringing some great insight here. Jeremiah, in the old days the word authentic was kind of a cliche, and brands tried to be authentic. They didn't really know what to do, so they called it organic, right? And now you have the trust concept with authenticity in an environment like Web 3.0, where you can actually measure it and monetize it and capture it if you're actually authentic and trustworthy. >> That's right, and because it's on the blockchain, you can see how somebody has behaved with their economic behavior in the past. Of course, big corporations aren't going to have that type of trail on the blockchain just yet, but individuals and executives who participate in this market might. And we'll also see new types of affinity. Do executives participate in these NFT communities? Do they purchase them? We've seen numerous brands like Adidas acquire, you know, different NFT projects to participate. And of course the big brands are grabbing their domains.
Of course you could talk to Ren about that, because owning your own name is a part of this trust and being found. >> That's awesome. Great insight, guys. Closing comments, takeaways for the audience here. Each of you take a minute to share your thoughts on what you think is happening now and where it's going to go. Jeremiah, we'll start with you. >> Sure, I think the vision of Web 3.0, where full decentralization happens, where the power is completely shifted to the edges, I don't think it's going to happen. I think we will reach Web 2.5. And I've been through so many tech trends where we said that the power's going to shift completely to the edge, and it just doesn't. In part, there are two reasons. One is that the venture capitalists are the ones who tend to own the programs in the first place. And secondly, the startups themselves end up becoming the one-percenters. We see Airbnb and Uber are one-percenters now. So that trend happens over and over and over. Now with that said, the world will be in a better place. We will have more transparency. We will see economic power shifted to the people, the participants. And so they will have more control over the internet that they are building. >> Awesome. Ren, final comments. >> I'm fully aligned with Jeremiah on the notion of control being returned to users, the notion of ownership, and the notion of redistribution of the economic value that is created across all the different chains that we are going to see and all those ecosystems. I believe that we are going to witness two parallel movements of expansion. One is going to be very lateral. When you think of crypto and Web 3.0, essentially you think of a few hundred tribes. And I think that more projects are going to emerge, more coalitions of individuals and entities, and those are going to exist around those projects. So you're going to see, you know, an increase in the number of tribes that one might join.
And I also think that we're going to progress rapidly from the low hundreds of millions of crypto and NFT holders into the billions, basically. And that's going to be extremely interesting. I think that the next waves of crypto users and NFT fans are going to look very different from the early adopters that we witnessed in the very early days. So it's not going to be your traditional model of technology adoption curves. I think the demographics are going to shift and the motivations are going to be different as well, which is going to be a wonderful time to educate and engage with new community members. >> All right, Ren and Jeremiah, thank you both for that great insight and a great segment breaking down Web 3.0, or Web 2.5 as Jeremiah says, but we're in a better place. This is a segment with the influencers, as part of theCUBE and the Unstoppable Domains Showcase. I'm John Furrier, your host. Thanks for watching. (bright upbeat music)

Published Date : Mar 10 2022


Sandy Carter, Unstoppable Domains, announces Women of Web3 | WoW3


 

(upbeat music) >> Hello, everyone, welcome to theCube special presentation of the Unstoppable Domains partner showcase. I'm John Furrier, your host of theCube. We have here Cube alumni Sandy Carter, SVP and channel chief of Unstoppable Domains. Sandy, great to see you. Congratulations on your new assignment. Exciting new company, and thanks for coming on for the showcase. >> Well, thank you, John. It's so fun to always be here with you through all my companies, it's really great. Thanks for having me. >> Well, it's been pretty amazing what's going on in the world right now. We just had the Super Bowl, which is the biggest event in the world around advertising, a lot of Web 3.0, crypto, blockchain, decentralized applications. It's here, it's mainstream. We've talked off camera many times around the shifts in technology, cloud computing. We're now with Web 3.0 and some are even saying Web 4.0. (Sandy laughs) A lot of technology programmers, people who are building new things, are all in the Web 3.0 world. It's really going mainstream. So what's your view on that? I see you're in it too. You're leading it. >> I am in it too. And it's so exciting to be on the verge of the next technology trend that's out there. And I'm really excited about this one, John, because this is all about ownership. It's about members, not users. It's quite fascinating, to be honest. >> What is Web 3.0? Define it for us, 'cause you have a good knack for putting things in perspective. People want to know: what is this Web 3.0? What does it mean? >> Okay, great. That's a great question. In fact, I have just a couple of slides because I'm a visual learner. So I don't know if you guys could pop up just a couple of slides for us. So first, to me, Web 3.0 is really all about this area of ownership, and that's whether it's in gaming or art or even business applications today. In fact, let me show you an example.
If you go to the next slide, you will see, like with Twitter, and John, you and I were there, I was the first person onstage to announce that we were going to do tweets during a major event. And of course I started on Twitter back in 2008, pretty early on. And now the valuation of Twitter is going up, I got a lot of value, and I helped to attract a lot of those early users. But my value was really based on the people, building my network, not based on that monetary valuation. So I really wasn't an owner. I was a user of Twitter and helped Twitter to grow. Now, if you go with me to the next slide, you'll see just a little bit more about what we're talking about here, and I know this is one of your favorites. So Web 1.0 was about discovery. We discovered a lot of information. Web 2.0 was about reading the information but also contributing, with that two-way dialogue with social. But Web 3.0 is now all about membership, not being a user but being a member, and therefore having an ownership stake in the power of what's coming. And I think this is a big differential, John. If I had to just nail one thing, this would be the big differential. >> That's awesome. And I love that slide because it goes to the progression. Most people think of Web 1.0 as data, the worldwide web, web pages, browsers, search engines; Web 2.0, better interfaces, mobile, social networks. And then it got messy: bots and misinformation, users of the product being used by the companies. So clearly Web 3.0 is changing all that, and I think the ownership thing is interesting, because when you think about it, we should own our data. We should have a data wallet. We should have all that stored. So this is really at the heart of what you guys are doing. So I think that's a great way to put it. I would ask you, what's your impression when people you talk to in the mainstream industry, who aren't in Web 3.0, are coming in? What's their reaction? What do they think? What do they see?
>> Well, a lot of what I see from Web 2.0 folks is that they don't understand it, first of all. They're not sure about it. And I always like to say that we're in the early days of Web 3.0. So we're in that dial-up phase. What was that? Was that AOL? Remember that little sound they used to make? >> (laughs) You've got mail. >> Yeah, you've got mail. That's right. That's where we are today with Web 3.0. And so it is early days, and I think people are looking for something they can hang their hat on. And so one of the things that we've been working on is, what would be the elements of Web 3.0? And if you could take me to one more slide, and this will be my last slide, but again, I'm a very visual person. I think there are really five basic assumptions that Web 3.0 really hangs its hat on. The first is decentralization, or I say at least partially decentralized, because today we're building on Web 2.0 technology and that is okay. Number two is that digital identity. That identity you just talked about, John, where you take your identity with you. You don't have an identity for Twitter, an identity for LinkedIn, an identity for a game. I can take my identity today, play a game with it, bank with it, now move on to a metaverse with it, the same identity. The other thing we like to say is it's built on blockchain, and we know that blockchain is still making a lot of improvements, but it's getting better and better and better. It's trustless, meaning there's no in-between party. You're going direct, user or member to institution, if you will. So there's no bank in between, for example. And then last but not least, it's financially beneficial for the people involved. It's not just that network effect that you're getting, it's actually financially beneficial for those folks. All five of those give us that really big push towards that ownership notion.
>> One thing I would point out, first of all, great insight. I would also add, and I'd love to get your reaction to it, and this is a great lead-in to the news, that there's also a diversity angle, because this is a global phenomenon, okay? And there's also a lot of cultural shift happening with the younger generation, but technologists from all ages are participating, and all genders. Everything's coming together. It's a melting pot. It's a global... This is like the earth-is-flat moment for us. This is an interesting time. What's your reaction to that? >> Absolutely, and I believe that the more diverse the community can be, the more innovative it will be. And that's been proven out by studies, by McKinsey and Deloitte and more. I think this is a moment for Web 3.0 to be very inclusive. And the more inclusive that Web 3.0 is, the bigger the innovation, and the bigger the power, and the bigger that dream of ownership will become a reality. So I'm 100% with you on the diversity angle, for sure. >> So big news launching tomorrow. This is super exciting. First of all, I love the acronym, but I love the news. Take us through the big announcement that you're having. >> Yeah. So John, we are so excited. We have over 55 different companies joining together to form Unstoppable Women of Web 3.0, or as we call it, WOW3. Unstoppable WOW3. And the mission is really clear and very inclusive. The first is that we want to make Web 3.0 accessible for everyone. The second is we don't want to just say we want it accessible for everyone, we want to help with that first step. We're going to be giving away $10 million worth of domains from Unstoppable, which we believe is that first step into Web 3.0. And then we're going to be action oriented. We don't want to just say we're going to help you get started, or just say that Web 3.0 is accessible; we're going to launch education, networking, and events.
So for example, we've got our first in-person event that will occur at South by Southwest. Our first virtual event will occur on March 8th, which is International Women's Day, and there'll be two components to it. One is an hour-long YouTube Live so that people can come in and ask questions, and then we've got a 24-hour Twitter space. So almost every half hour or every hour on the hour, you're going to have these amazing women talk to you about what is DeFi, what is minting, what is Web 3.0 all about, why gaming in Web 3.0. I mean, it's just going to be phenomenal. And in that we want to support each other as we're moving forward. This whole concept, from the very beginning, is that we want Web 3.0 to be diverse. >> And I want to also point out that you've got some activities on the March 8th International Women's Day, but it's always every day in this community, because it's a community. So this whole idea of community inclusion continues every day. Talk about those activities you're having on March 8th. Can you share what's happening on International Women's Day? >> Yeah, so first we're going to have a YouTube Live where we're going to go in detail into what is Web 3.0, what is DeFi, what is an NFT and why do they exist. Then we're going to have this 24-hour Twitter space where we've got all these different guest speakers from the 55 different companies that are supporting the initiative. We're also going to launch a list of the 100 most inspirational women of Web 3.0. We're going to do that twice a year. And we decided, John, not to do the top women, but the women who are inspirational, who are pioneering the trail, who are having an impact. And so we want it to be a community. So it's 100 of the most inspirational women of Web 3.0. We're also setting up a Web 3.0 Women's Speakers Bureau. So I cannot tell you, John, how many times people will call me up and they'll be like, "We really want you to speak here."
And when I really get down to it, they really want me because I'm a woman who can speak about Web 3.0, but there are so many women who can do this. And so I wanted to have a place where everybody could come and see how many different, diverse people we have who could speak on this. >> Yeah, and that's a great thing, because there are a lot of women who can speak on this. They just have to have their voices found. So there's a lot of discovery in that format. Are there any plans to go beyond? You mentioned some workshops, what other things... Can you give a quick highlight of the other things you're doing post-event? >> Yeah, so one of the big things post-event is working with Girls in Tech, and I know you know Adriana. We are going to host on their platform. They have a platform for mentoring. We're going to host a track for Web 3.0, and during International Women's Day we're going to auction off some NFTs that will contribute to that mentoring platform. So we've got folks like Lazy Lions and Bella and Deadheads that are going to donate NFTs. We'll auction those off, and that will enable the ongoing platform of Girls in Tech to have that mentoring available for the next generation. We'll also do events, both virtually through Twitter spaces and other means, as well as in-person events. I just mentioned South by Southwest, which I'm really looking forward to. We're going to have our first in-person event on March the 12th. It's going to be a brunch. A lot of the women told me, John, that they go to all these Web 3.0 or crypto events and everything's like a frat party in the evening. And they're like, "Why can't we just have a nice brunch and sit down and talk about it?" (John laughs) So at South by Southwest that is exactly what we're going to do. We're going to have a brunch, and we're going to sit down and talk about it with all of these companies.
And John, one of the things that's amazing to me is that we have over 55 companies that are all coming together to support this initiative. To me, that was just overwhelming. I was hoping to get about 20 companies, and so far we have 55. So I'm feeling so excited and so empowered by what I see as the potential for this group. >> Yeah, well, first of all, congratulations. That's a really great thing you're doing. If you need a place on theCube to post those videos, if you can get copies, we'd be glad to share them as well, 'cause it's super important to get all the great minds out there that are working on Web 3.0 and have them showcased. I got to ask you, now that you're in the trenches doing all this great work: what are some of the buzzwords that people should know about in Web 3.0? You mentioned the five main pillars as well as the ownership, the paradigm shift, we got that. What are some of the buzzwords people should know about? How would you rank those? >> Well, I think there are a couple. Let's see. I mean, one is, if you think about it, what is a decentralized application? Some people call them Dapps. Dapps, you'll hear that a lot. And a decentralized application just means that you are leveraging and using multiple forms; there's no centralization of the back end. So everything is decentralized or moving around. Another is the gas fee. This comes up a lot; many people think, "Oh yeah, I put gas in my car." But a gas fee in Web 3.0 is you're actually paying for those decentralized computers that you're using. So in a centralized land, a company owns those computers. In a decentralized land, since you're using all these different assets, you've got to pay for them, and that's what the gas fee is for. The gas fee is to pay for those particular types of solutions. And many of these terms that we're talking about, minting, what is an NFT, we'll be explaining all of these terms on International Women's Day in that 24-hour Twitter space as well.
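Sandy's gas-fee explanation, paying per unit of decentralized compute consumed, reduces to simple arithmetic. A rough sketch using Ethereum-style units (gas consumed times a price in gwei, where 1 gwei = 10^-9 ETH); the numbers below are illustrative, not live prices.

```python
def transaction_fee(gas_used, gas_price_gwei, eth_price_usd):
    """Estimate the fee paid for decentralized compute.

    fee = gas units consumed * price per gas unit (gwei, 1e-9 ETH each)
    Returns (fee in ETH, fee in USD).
    """
    fee_eth = gas_used * gas_price_gwei * 1e-9
    return fee_eth, fee_eth * eth_price_usd

# A simple transfer consuming 21,000 gas at an assumed 50 gwei / $3,000 ETH.
eth, usd = transaction_fee(gas_used=21_000, gas_price_gwei=50, eth_price_usd=3_000)
print(f"{eth:.6f} ETH (~${usd:.2f})")
```

The key contrast with "centralized land" is exactly what Sandy says: nobody owns the machines for you, so the fee is metered per unit of work rather than absorbed by a single company.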
>> We'll look forward to that Twitter space. We'll share as well. In the Web 3.0 world, when you look at it, when you look at what Unstoppable's doing, it's a paradigm shift. You laid it out there. What is the bottom line? What's the most practical thing people are doing with the domains? 'Cause there is definitely headroom in terms of capability: single sign-on, you own your own data, integrating into wallets and decentralized applications, and creating this new wave just like the web. More web pages, better search. More pages, the search has to get better, flywheel kicking in. What's the flywheel for Unstoppable? >> Well, I think the flywheel is really around digital identity. It's why I came to Unstoppable, because I believe that the data about you should be owned by you, and that identity now travels with you. It's your wallet, it's your healthcare data, it's your educational records, and it's more. So in the future, that digital identity is going to become so much more important than it is today. And oh my gosh, John, it's going to be used in so many different ways that we can't even imagine now. So for me, I think that digital identity really puts that ownership right in the hands of the members, not in anyone else's hands, a company, a government, et cetera. It puts the ownership of that data in your hands. >> I just love these big waves, these shifts, because you mentioned healthcare. Imagine an NFT as that sign-on, where you don't have to worry about all these HIPAA regulations. You can just say, "Here's me. Here's who I'm trusted by." And they don't even know my name, but they know it's trusted. >> And everything just trickles down from there. >> That's right. >> And all the databases are called. It's all immutable. I got my private key. It unlocks so much potential in a new way. It really is amazing. >> I agree. And even just think about education.
I was with Arizona State University, and my daughter took some classes at a community college, and I wanted to get those classes and have those credits available for her university. How hard is that? Just to get that education record, everything is paper, and I had to physically sign it, I had to physically mail it. It was pretty crazy. So now imagine that your digital identity contains all of your degrees, all of the skills you've gone through, all of your experiences, John. You told me before the show all the different experiences that you have that I didn't know about. I'm sure a lot of people didn't. What if you had that piece of you that would be available, that you could use at any time? >> It's locked in LinkedIn. There's a silo. Again, I'm a huge believer in the silo busting going on. This new generation is not going to tolerate experiences that don't fit their mission. They want to have liberation of their data. They don't want to be the product. They want to have the value.
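The paper-transcript problem Sandy describes is what on-chain credential verification addresses: an issuer publishes only a hash of the record, and anyone can later check a claimed record against it. This is a toy sketch of the idea, not a real registry; the record format and the `on_chain` set standing in for a blockchain are assumptions.

```python
import hashlib

def fingerprint(record: str) -> str:
    """Hash a credential record; only this digest would go on-chain."""
    return hashlib.sha256(record.encode("utf-8")).hexdigest()

# The issuer (e.g., a university) anchors the digest of a real record.
on_chain = {fingerprint("Jane Doe|BSc Computer Science|2021")}

def verify(claimed_record: str) -> bool:
    """A verifier recomputes the digest and checks it against the chain."""
    return fingerprint(claimed_record) in on_chain

print(verify("Jane Doe|BSc Computer Science|2021"))  # True
print(verify("Jane Doe|PhD Computer Science|2021"))  # False
```

Because any edit to the record changes the digest, the claim cannot be faked after the fact, which is the immutable-verification property the conversation returns to with the Yale example.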
So for example, if you're a company today, how are you going to entice users to let you see some of their data? How are you going to look at ownership when it might be done via a dow and maybe a part of a piece of art, a part of a company, a part of real estate, like Parcel who you guys are going to talk to later on. Look at how that is going to change the world. It's going to change the way funds are raised. It's going to change the way you buy carbon credits, the way you buy art. If you're a consumer company, think about games and endgame economics. People are now playing game that money is real and your brand could be positioned. Have you thought about that? >> Yeah, I think that point you mentioned earlier about Twitter being the user, you had some personal connection, we didn't monetize it. Now with Web 3.0, you own it. One of the things that I see happening and it's coming out a lot of the Unstoppable interviews as well as what we're seeing in the marketplace is that the communities are part owners of the talent of whether it's an artist, a music artist, could be theCube team. The communities are part of the fabric of the overall group ownership. So you're starting to see you mentioned dows, okay? It's one kind of it. So as users become in control of their data and owning it, they're also saying, "Hey I want to be part of someone else." Artists are saying, " Be my stockholder. Own my company." >> That's right. >> So you start to see ownership concept not just be about the individual, it's about the groups. >> Right. And it's about companies too. So I'm hoping that as part of our Unstoppable Women of Web 3.0, we do have several companies who have joined us that are what I would say, traditionally Web 2.0 companies, trying to go over the chasm into Web 3.0. And I do think it's really important that companies of all types and sizes start looking at the implication of that ownership model and what that does. So for example, it's a silly one, but a simple one. 
I bought a Lazy Lion. It was actually part of my signing bonus, which is also interesting. My signing bonus was an NFT and now my Lazy Lion, I now own that Lazy Lion but the artist also gets a potential percentage of that. I can put my Lazy Lion on a t-shirt. I could name a store after my Lazy Lion because now it's mine. I own it. I own that asset. And now myself and the artists are teamed together. We're like a joint venture together. It's fascinating new models and there are so many of them. After ETHDenver, I was reading some of the key takeaways. And I think the biggest key takeaway was that this space is moving so fast with so much new information that you really have to pick one or two things and just go really deep so that you really understand them versus trying to go so wide that you can't understand everything at one time and to keep up it's a mission today to keep up. >> That interesting example about the Lazy Lion, the artist in relationship with you, that's a smart contract. There's no law firm doing that. It's the blockchain. Disintermediation is happening. >> It's trustless. Back to those five things we talked about. It's on the blockchain, it's decentralized at least partially, it's a digital identity, it's financially beneficial to you and it's trustless. That's what that is. It's a smart contract. There's no in between >> Can't change. It's immutable. Can't hack. Once it's on the blockchain, you're good to go. Sandy, well, congratulations. Great to see you. Unstoppable Women of Web3, WOW3. Great acronym. We're going to support you. We're going to put you on our March 8th site we're putting together. Great to have you on. Congratulations and thanks for sharing the big news. >> Thank you so much, John. Great to be on. >> Okay, this is theCube coverage of Unstoppable Domain partner showcase. I'm John Furrier, your host, here with Sandy Carter. Thanks for watching. (upbeat music)

Published Date : Mar 8 2022

SUMMARY :

and thanks for coming on for the showcase. It's so fun to always be here with you are all in the Web 3.0 world. It's quite fascinating to be honest. you have a good knack and I helped to attract And I love that slide And I always like to say And so one of the things This is like the earth that the more diverse First of all, I love the And in that we want to support each other on the March 8th International Women's Day So it's 100 of the most highlight of the things else that they go to all these I got to ask you now that that you are leveraging More pages, the search has to get better, and that identity now travels with you. Imagine an NFT is that sign on And everything just And all the databases are called. all different experiences that you have going to tolerate experiences and be able to be horizontally scalable that I find to be very powerful. One of the things that I see happening So you start to see ownership that you really have to It's the blockchain. to you and it's trustless. We're going to put you Great to be on. of Unstoppable Domain partner showcase.

SENTIMENT ANALYSIS :

ENTITIES

EntityCategoryConfidence
JohnPERSON

0.99+

YaleORGANIZATION

0.99+

Sandy CarterPERSON

0.99+

AdrianaPERSON

0.99+

2008DATE

0.99+

Arizona State UniversityORGANIZATION

0.99+

John FurrierPERSON

0.99+

$10 millionQUANTITY

0.99+

100QUANTITY

0.99+

SandyPERSON

0.99+

55QUANTITY

0.99+

March 8thDATE

0.99+

LinkedInORGANIZATION

0.99+

oneQUANTITY

0.99+

McKenzieORGANIZATION

0.99+

24 hourQUANTITY

0.99+

International Women's DayEVENT

0.99+

100%QUANTITY

0.99+

Unstoppable DomainsORGANIZATION

0.99+

DeloitteORGANIZATION

0.99+

TwitterORGANIZATION

0.99+

Super BowlEVENT

0.99+

first stepQUANTITY

0.99+

OneQUANTITY

0.99+

two thingsQUANTITY

0.99+

firstQUANTITY

0.99+

secondQUANTITY

0.99+

todayDATE

0.99+

five main pillarsQUANTITY

0.99+

CubeORGANIZATION

0.99+

fiveQUANTITY

0.98+

one more slideQUANTITY

0.98+

Lazy LionsORGANIZATION

0.98+

Girls in TechORGANIZATION

0.98+

AOLORGANIZATION

0.98+

bothQUANTITY

0.98+

FirstQUANTITY

0.97+

South by SouthwestORGANIZATION

0.97+

singleQUANTITY

0.97+

one timeQUANTITY

0.96+

HIPAATITLE

0.96+

over 55 companiesQUANTITY

0.96+

55 different companiesQUANTITY

0.96+

over 55 different companiesQUANTITY

0.96+

five thingsQUANTITY

0.95+

an hourQUANTITY

0.95+

One thingQUANTITY

0.94+

Sandy Carter, Unstoppable Domains, announces Women of Web3 | WoW3


 

(upbeat music) >> Hello, everyone welcome to theCube special presentation of the Unstoppable Domains partner showcase. I'm John Furrier, your host of theCube. We have here, Cube alumni, Sandy Carter, SVP and channel chief of Unstoppable Domains. Sandy, great to see you. Congratulations on your new assignment. Exciting new company, and thanks for coming on for the showcase. >> Well, thank you, John. It's so fun to always be here with you through all my companies, it's really great. Thanks for having me. >> Well, it's been pretty amazing what's going on in the world right now. We just had the past Super Bowl which is the biggest event in the world around advertising, a lot of Web 3.0, crypto, blockchain, decentralized applications. It's here, it's mainstream. We've talked off camera many times around the shifts in technology, cloud computing. We're now with Web 3.0 and some are even saying Web 4.0. (Sandy laughs) A lot of technology programmers, people who are building new things are all in the Web 3.0 world. It's really going mainstream. So what's your view on that? I see you're in it too. You're leading it. >> I am in it too. And it's so exciting to be at the verge of the next technology trend that's out there. And I'm really excited about this one, John because this is all about ownership. It's about members not users. It's quite fascinating to be honest. >> What is Web 3.0? What is Web 3.0? Define it for us 'cause you have a good knack for putting things in the perspective. People want to know what does this Web 3.0? What does it mean? >> Okay, great. That's a great question. In fact, I have just a couple of slides because I'm a visual learner. So I don't know if you guys could pop up just a couple of slides for us. So first to me, Web 3.0 is really all about this area of ownership and that's whether it's in gaming or art or even business applications today. In fact, let me show you an example. 
If you go to the next slide, you will see like with Twitter, and John, you and I were there, I was the first person to onstage announce that we were going to do tweets during a major event. And of course I started on Twitter back in 2008, pretty early on. And now the valuation of Twitter is going up, I got a lot of value and I helped to attract a lot of those early users. But my value was really based on the people, building my network, not based on that monetary valuation. So I really wasn't an owner. I was a user of Twitter and helped Twitter to grow. Now, if you go with me to the next slide you'll see just a little bit more about what we're talking about here and I know this is one of your favorites. So Web 1.0 was about discovery. We discovered a lot of information. Web 2.0 was about reading the information but also contributing with that two-way dialogue with social but Web 3.0 is now all about membership, not being a user but being a member and therefore having an ownership stake in the power of what's coming. And I think this is a big differential, John, if I had to just nail one thing. This would be the big differential. >> That's awesome. And I love that slide because it goes to the progression. Most people think of web 1.0 data, the worldwide web, web pages, browsers, search engines, Web 2.0, better interfaces. You got mobile, you got social networks. And then it got messy, bots and misinformation, users of the product being used by the companies. So clearly Web 3.0 is changing all that and I think the ownership thing is interesting because you think about it, we should own our data. We should have a data wallet. We should have all that stored. So this is really at the heart of what you guys are doing. So I think that's a great way to put it. I would ask you what's your impression when people you talk to in the mainstream industry that aren't in Web 3.0 that are coming in, what's their reaction? What do they think? What do they see? 
>> Well, a lot of what I see from Web 2.0 folks is that they don't understand it, first of all. They're not sure about it. And I always like to say that we're in the early days of Web 3.0. So we're in that dial-up phase. What was that? Was that AOL? Remember that little sound they used to make? >> (laughs) You've got mail. >> Yeah, you've got mail. That's right. That's where we are today with Web 3.0. And so it is early days, and I think people are looking for something they can hang their hat on. And so one of the things we've been working on is: what would be the elements of Web 3.0? And if you could take me to one more slide, and this will be my last slide, but again, I'm a very visual person. I think there are really five basic assumptions that Web 3.0 really hangs its hat on. The first is decentralization, or I say at least partially decentralized, because today we're building on Web 2.0 technology, and that is okay. Number two is that digital identity. That identity you just talked about, John, where you take your identity with you. You don't have an identity for Twitter, an identity for LinkedIn, an identity for a game. I can take my identity today, play a game with it, bank with it, now move on to a Metaverse with it, the same identity. The other thing we like to say is it's built on blockchain, and we know that blockchain is still making a lot of improvements, but it's getting better and better. It's trustless, meaning there's no in-between party. You're going direct, user or member to institution, if you would. So there's no bank in between, for example. And then last but not least, it's financially beneficial for the people involved. It's not just that network effect that you're getting; it's actually financially beneficial for those folks. All five of those give us that really big push towards that ownership notion. 
>> One thing I would point out, first of all, great insight. I would also add, and I'd love to get your reaction to it, and this is a great lead-in to the news, but there's also a diversity angle, because this is a global phenomenon, okay? And there's also a lot of cultural shift happening with the younger generation, but technologists of all ages are participating, and all genders. Everything's coming together. It's a melting pot. It's global. This is like the earth-is-flat moment for us. This is an interesting time. What's your reaction to that? >> Absolutely, and I believe that the more diverse the community can be, the more innovative it will be. And that's been proven out by studies, by McKinsey and Deloitte and more. I think this is a moment for Web 3.0 to be very inclusive. And the more inclusive that Web 3.0 is, the bigger the innovation, and the bigger the power, and the bigger that dream of ownership will become a reality. So I'm 100% with you on the diversity angle, for sure. >> So big news launching tomorrow. This is super exciting. First of all, I love the acronym, but I love the news. Take us through the big announcement that you're having. >> Yeah. So John, we are so excited. We have over 55 different companies joining together to form Unstoppable Women of Web 3.0, or we call it WOW3. Unstoppable WOW3. And the mission is really clear and very inclusive. The first is that we want to make Web 3.0 accessible for everyone. The second is we don't want to just say we want it accessible for everyone; we want to help with that first step. We're going to be giving away $10 million worth of domains from Unstoppable, which we believe is that first step into Web 3.0. And then we're going to be action-oriented. We don't want to just say we're going to help you get started or just say that Web 3.0 is accessible; we're going to launch education, networking, and events. 
So for example, we've got our first in-person event that will occur at South by Southwest. Our first virtual event will occur on March 8th, which is International Women's Day, and there'll be two components of it. One is an hour-long YouTube Live so that people can come in and ask questions, and then we've got a 24-hour Twitter space. So almost every half hour or every hour on the hour, you're going to have these amazing women talk to you about: what is DeFi? What is minting? What is Web 3.0 all about? Why gaming in Web 3.0? I mean, it's just going to be phenomenal. And in that, we want to support each other as we're moving forward. This whole concept, from the very beginning, is that we want Web 3.0 to be diverse. >> And I want to also point out that you've got some activities on the March 8th International Women's Day, but it's always every day in this community, because it's a community. So this whole idea of community inclusion continues every day. Talk about those activities you're having on March 8th. Can you share what's happening on International Women's Day? >> Yeah, so first we're going to have a YouTube Live where we're going to go in detail into: what is Web 3.0? What is DeFi? What is an NFT, and why do they exist? Then we're going to have this 24-hour Twitter space where we've got all these different guest speakers from the 55 different companies that are supporting the initiative. We're also going to launch a list of the 100 most inspirational women of Web 3.0. We're going to do that twice a year. And we decided, John, not to do the top women, but the women who are inspirational, who are pioneering the trail, who are having an impact. And so we want it to be a community. So it's 100 of the most inspirational women of Web 3.0. We're also setting up a Web 3.0 Women's Speakers Bureau. So I cannot tell you, John, how many times people will call me up and they'll be like, "We really want you to speak here." 
And when I really get down to it, they really want me because I'm a woman who can speak about Web 3.0, but there are so many women who can do this. And so I wanted to have a place where everybody could come and see how many different, diverse people we have who could speak on this. >> Yeah, and that's a great thing, because there are a lot of women who can speak on this. They just have to have their voices found. So there's a lot of discovery in that format. Are there any plans to go beyond? You mentioned some workshops. Can you give another quick highlight of the other things you're doing after the event? >> Yeah, so one of the big things after the event is working with Girls in Tech, and I know you know Adriana. We are going to host on their platform. They have a platform for mentoring. We're going to host a track for Web 3.0, and during International Women's Day, we're going to auction off some NFTs that will contribute to that mentoring platform. So we've got folks like Lazy Lions and Bella and Deadheads that are going to donate NFTs. We'll auction those off, and then that will enable the ongoing platform of Girls in Tech to have that mentoring available for the next generation. We'll also do events, both virtually through Twitter spaces and other means, as well as in-person events. I just mentioned South by Southwest, which I'm really looking forward to. We're going to have our first in-person event on March the 12th. It's going to be a brunch. A lot of the women told me, John, that they go to all these Web 3.0 or crypto events and everything's like a frat party in the evening. And they're like, "Why can't we just have a nice brunch and sit down and talk about it?" (John laughs) So at South by Southwest, that is exactly what we're going to do. We're going to have a brunch, and we're going to sit down and talk about it with all of these companies. 
And John, one of the things that's amazing to me is that we have over 55 companies that are all coming together to support this initiative. To me, that was just overwhelming. I was hoping to get about 20 companies, and so far we have 55. So I'm feeling so excited and so empowered by what I see as the potential for this group. >> Yeah, well, first of all, congratulations. That's a really great thing you're doing. If you need a place on theCube to post those videos, if you can get copies, we'd be glad to share them as well, 'cause it's super important to get all the great minds out there that are working on Web 3.0 and have them showcased. I got to ask you, now that you're in the trenches doing all this great work: what are some of the buzzwords that people should know about in Web 3.0? You mentioned the five main pillars as well as the ownership, the paradigm shift; we got that. What are some of the buzzwords people should know about? How would you rank those? >> Well, I think there are a couple. Let's see. I mean, one is, if you think about it: what is a decentralized application? Some people call them Dapps. Dapps, you'll hear that a lot. And a decentralized application just means that you are leveraging and using multiple forms. There's no centralization of the back end. So everything is decentralized or moving around. Another is the gas fee. This comes up a lot; many people think, "Oh yeah, I put gas in my car." But a gas fee in Web 3.0 is you're actually paying for those decentralized computers that you're using. So in a centralized land, a company owns those computers. In a decentralized land, since you're using all these different assets, you've got to pay for them, and that's what the gas fee is for. The gas fee is to pay for those particular types of solutions. And many of these terms that we're talking about, minting, what is an NFT, we'll be explaining all of these terms on International Women's Day in that 24-hour Twitter space as well. 
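The gas-fee explanation above can be made concrete with a small sketch: the fee is simply the units of gas a transaction consumes times the price paid per unit for the decentralized compute. The 21,000-gas transfer and 50-gwei price below are common Ethereum figures assumed for illustration, not numbers from the interview.

```python
# Sketch of how a gas fee is computed on an Ethereum-style network:
# fee = units of gas the transaction consumes * price paid per unit.
# The figures below are illustrative assumptions.

GWEI_PER_ETH = 1_000_000_000  # 1 ETH = 10^9 gwei

def transaction_fee_eth(gas_used: int, gas_price_gwei: float) -> float:
    """Total fee, in ETH, paid for the decentralized compute used."""
    return gas_used * gas_price_gwei / GWEI_PER_ETH

# A plain token transfer consumes 21,000 gas; assume a 50 gwei gas price:
print(transaction_fee_eth(21_000, 50))  # 0.00105 (ETH)
```

When the network is busy the gas price rises, so the same 21,000-gas transfer costs proportionally more; the gas consumed is fixed by the work done, the price per unit is set by the market.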
>> We'll look forward to that Twitter space. We'll share as well. In the Web 3.0 world, when you look at it, when you look at what Unstoppable's doing, it's a paradigm shift. You laid it out there. What is the bottom line? What's the most practical thing people are doing with the domains? 'Cause it is definitely headroom in terms of capability, single sign on, you own your own data, integrating into wallet and decentralized applications and creating this new wave just like the web. More web pages, better search. More pages, the search has to get better, flywheel kicking in. What's the flywheel for Unstoppable? >> Well, I think the flywheel is the really around digital identity. It's why I came to Unstoppable because I believe that the data about you should be owned by you and that identity now travels with you. It's your wallet, it's your healthcare data, it's your educational records, and it's more. So in the future, that digital identity is going to become so much more important than it is today. And oh my gosh, John, it's going to be used in so many different ways that we can't even imagine it now. So for me, I think that digital identity and it really puts that ownership right in the hands of the members, not in anyone else's hands, a company, a government, et cetera. It puts the ownership of that data in your hands. >> I just love these big waves, these shifts, because you mentioned healthcare. Imagine an NFT is that sign on where you don't have to worry about all these HIPAA regulations. You can just say, "Here's me. Here's who I'm trusted." And they don't even know my name, but they know it's trusted. >> And everything just trickles down from there. >> That's right. >> And all the databases are called. It's all immutable. I got my private key. It unlocks so much potential in a new way. Really is amazing. >> I agree. And even just think about education. 
I was with Arizona State University, and my daughter took some classes at a community college, and I wanted to get those classes and have those credits available for her university. How hard is that? Just to get that education, and everything is paper, and I had to physically sign, I had to physically mail it. It was pretty crazy. So now imagine that your digital identity contains all of your degrees, all of the skills that you've gone through, all of your experiences, John. You told me before the show about all the different experiences you have that I didn't know about. I'm sure a lot of people didn't. What if you had that piece of you available that you could use at any time? >> It's locked in LinkedIn. There's a silo. Again, I'm a huge believer in the silo busting going on. This new generation is not going to tolerate experiences that don't fit their mission. They want to have liberation of their data. They don't want to be the product. They want to have the value. >> That's right. >> And then broker that value for services, and be able to be horizontally scalable and pop around from place to place without logging in again, or having that siloed platform have the data, like LinkedIn. You mentioned my resume's basically on LinkedIn, but I got webpages. I got some stories. I got videos. I'm all over the place. I need an NFT. >> And just think about LinkedIn, John. You could say that you graduated from Yale and didn't even graduate from Yale, because nobody double-checks that. But in a wallet, if Yale actually sent that information in, you could verify it. It's that verification that's done over the blockchain, that immutable verification, that I find to be very powerful. And John, we were just chatting with some companies earlier today that are Web 2.0 companies, and they're like, "Oh, okay. All this is just for people? It's just for consumers?" And I was like, "No, this is for B2B. You've got to start thinking about this as a company." 
So for example, if you're a company today, how are you going to entice users to let you see some of their data? How are you going to look at ownership when it might be done via a DAO, and maybe a part of a piece of art, a part of a company, a part of real estate, like Parcel, who you guys are going to talk to later on? Look at how that is going to change the world. It's going to change the way funds are raised. It's going to change the way you buy carbon credits, the way you buy art. If you're a consumer company, think about games and in-game economics. People are now playing games where the money is real, and your brand could be positioned. Have you thought about that? >> Yeah, I think that point you mentioned earlier about Twitter: being the user, you had some personal connection; we didn't monetize it. Now with Web 3.0, you own it. One of the things that I see happening, and it's coming out of a lot of the Unstoppable interviews as well as what we're seeing in the marketplace, is that the communities are part owners of the talent, whether it's an artist, a music artist, could be theCube team. The communities are part of the fabric of the overall group ownership. So you're starting to see, you mentioned DAOs, okay? That's one kind of it. So as users become in control of their data and own it, they're also saying, "Hey, I want to be part of someone else." Artists are saying, "Be my stockholder. Own my company." >> That's right. >> So you start to see the ownership concept not just be about the individual; it's about the groups. >> Right. And it's about companies too. So I'm hoping that as part of our Unstoppable Women of Web 3.0, we do have several companies who have joined us that are, what I would say, traditionally Web 2.0 companies, trying to go over the chasm into Web 3.0. And I do think it's really important that companies of all types and sizes start looking at the implication of that ownership model and what that does. So for example, it's a silly one, but a simple one. 
I bought a Lazy Lion. It was actually part of my signing bonus, which is also interesting. My signing bonus was an NFT, and now I own that Lazy Lion, but the artist also gets a potential percentage of that. I can put my Lazy Lion on a t-shirt. I could name a store after my Lazy Lion, because now it's mine. I own it. I own that asset. And now myself and the artist are teamed together. We're like a joint venture together. It's fascinating new models, and there are so many of them. After ETHDenver, I was reading some of the key takeaways. And I think the biggest key takeaway was that this space is moving so fast, with so much new information, that you really have to pick one or two things and just go really deep, so that you really understand them, versus trying to go so wide that you can't understand everything at one time. And keeping up, it's a mission today to keep up. >> That's an interesting example about the Lazy Lion, the artist in relationship with you. That's a smart contract. There's no law firm doing that. It's the blockchain. Disintermediation is happening. >> It's trustless. Back to those five things we talked about. It's on the blockchain, it's decentralized, at least partially, it's a digital identity, it's financially beneficial to you, and it's trustless. That's what that is. It's a smart contract. There's no in-between. >> Can't change. It's immutable. Can't hack. Once it's on the blockchain, you're good to go. Sandy, well, congratulations. Great to see you. Unstoppable Women of Web3, WOW3. Great acronym. We're going to support you. We're going to put you on our March 8th site we're putting together. Great to have you on. Congratulations, and thanks for sharing the big news. >> Thank you so much, John. Great to be on. >> Okay, this is theCube coverage of the Unstoppable Domains partner showcase. I'm John Furrier, your host, here with Sandy Carter. Thanks for watching. (upbeat music)
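The Lazy Lion arrangement discussed in the interview, where the artist automatically receives a cut of each resale, boils down to a split like the one below. This is a plain-Python sketch of the idea, not an actual smart contract, and the 5% royalty figure is an invented assumption for illustration.

```python
# Sketch of an automatic resale royalty: each time the NFT changes hands,
# a fixed percentage of the sale price flows back to the original artist.
# The 5% default is an assumed figure, not one stated in the interview.

def split_sale(price: float, royalty_pct: float = 5.0) -> dict:
    """Return how a sale price is divided between artist and seller."""
    artist_cut = round(price * royalty_pct / 100, 2)
    return {"artist": artist_cut, "seller": round(price - artist_cut, 2)}

# A $1,000 resale with the default 5% royalty:
print(split_sale(1000.0))  # {'artist': 50.0, 'seller': 950.0}
```

On-chain, the same split is enforced by the contract itself on every transfer, which is the "no law firm in between" point made above.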

Published Date : Feb 25 2022


SENTIMENT ANALYSIS :

ENTITIES

Entity                         Category       Confidence
John                           PERSON         0.99+
Yale                           ORGANIZATION   0.99+
Sandy Carter                   PERSON         0.99+
Adriana                        PERSON         0.99+
2008                           DATE           0.99+
Arizona State University       ORGANIZATION   0.99+
John Furrier                   PERSON         0.99+
$10 million                    QUANTITY       0.99+
100                            QUANTITY       0.99+
Sandy                          PERSON         0.99+
55                             QUANTITY       0.99+
March 8th                      DATE           0.99+
LinkedIn                       ORGANIZATION   0.99+
one                            QUANTITY       0.99+
McKenzie                       ORGANIZATION   0.99+
24 hour                        QUANTITY       0.99+
International Women's Day      EVENT          0.99+
100%                           QUANTITY       0.99+
Unstoppable Domains            ORGANIZATION   0.99+
Deloitte                       ORGANIZATION   0.99+
Twitter                        ORGANIZATION   0.99+
Super Bowl                     EVENT          0.99+
first step                     QUANTITY       0.99+
One                            QUANTITY       0.99+
two things                     QUANTITY       0.99+
first                          QUANTITY       0.99+
second                         QUANTITY       0.99+
today                          DATE           0.99+
five main pillars              QUANTITY       0.99+
Cube                           ORGANIZATION   0.99+
five                           QUANTITY       0.98+
one more slide                 QUANTITY       0.98+
Lazy Lions                     ORGANIZATION   0.98+
Girls in Tech                  ORGANIZATION   0.98+
AOL                            ORGANIZATION   0.98+
both                           QUANTITY       0.98+
First                          QUANTITY       0.97+
South by Southwest             ORGANIZATION   0.97+
single                         QUANTITY       0.97+
one time                       QUANTITY       0.96+
HIPAA                          TITLE          0.96+
over 55 companies              QUANTITY       0.96+
55 different companies         QUANTITY       0.96+
over 55 different companies    QUANTITY       0.96+
five things                    QUANTITY       0.95+
an hour                        QUANTITY       0.95+
One thing                      QUANTITY       0.94+

2022 007 Matt Gould


 

>> Hello, and welcome to theCube's special showcase with Unstoppable Domains. I'm John Furrier, your host of theCube, here in Palo Alto, California, with Matt Gould, who is the founder and CEO of Unstoppable Domains. Matt, great to come on. Congratulations on the success of your company, Unstoppable Domains. Thanks for kicking off this showcase. >> Thank you. Happy to be here. >> Love, first of all, love the story you got going on here. Love the approach, very innovative. But you're also on the big Web 3.0 wave, which we know where that leads into: Metaverse, unlimited new ways people are consuming information, content, applications being built differently. This is a major wave and it's happening. Some people are trying to squint through the hype versus reality, but you don't have to be a rocket scientist to realize that it's a cultural shift and a technical shift going on with Web 3.0. So this is kind of what's happening in the market. So give us your take. What's your reaction? You're in the middle of it. You're on this wave. >> Yeah, well, I would say it's a torrent of change that got unleashed just over a decade ago with Bitcoin coming out and giving people the ability to have digital items that they could actually own themselves online. And this is a new thing. And people, especially from my generation of millennials, spend their time online in these digital spaces, and they've wanted to be able to own these items. You see it from gaming, in Fortnite with skins, and Warcraft, and all these other places. But this is really being enabled by this new crypto technology to extend to a whole lot more applications, from money, which everyone's familiar with, to NFT projects like Bored Apes. >> You know, I was listening to your podcast. You guys got a great podcast. I think you're on 117 episodes now and growing, and you guys do a deep dive. 
So people watching, check out the Unstoppable podcast. But in the last podcast, Matt, you mentioned, you know, some of the older generations like me, I grew up with IP addresses; before the web, they called it the information superhighway. It wasn't even called the web yet. But IP came out of United States Department of Commerce and government R&D that became the internet, and the internet became the web. Back then it was just get some webpages up and find what you're looking for. Right? Very analog compared to now, today. Now you mentioned gaming, you mentioned how people are changing. Can you talk about your view of this cultural shift? We've been talking about it on theCube for many, many years now, but it's actually happening now, where the expectation of the audience, the users, the people consuming and communicating and bonding in groups, whether it's gaming or communities, is for new behaviors, new applications, and it's a forcing function. This shift is happening now. What's your reaction to that? What's your explanation? >> Yeah, well, I think it just goes back to the shift in where people are spending their time. And if you look today, most people spend 50% plus of their time in front of a screen. And that's just a tremendous amount of effort. But if you look at how much of their assets are digital, it's like less than 1% of their portfolio would be some sort of digital asset, compared to, you know, literally 50% of every day sitting in front of a screen. And simultaneously, what's happening is these new technologies are emerging around cryptocurrencies, blockchain systems, ways for you to track the digital ownership of things, and then bring that into your different applications. So one of the big things that's happening with Web 3.0 is this concept of data portability, meaning that I can own something on one application. 
And I could potentially take that with me to several other applications across the internet. So these are the emerging digital property rights that are happening right now, as we transition from a model in web two where you're on a hosted service, like Facebook. It's a walled garden; they own and control everything. You are the product, you know; they're mining you for data and they're just selling ads, right? So we move to a system where it's much more open. You can go into these worlds and experiences, you can take things with you, and you can leave with them. Most people are doing this with cryptocurrency: maybe you earn an in-game currency, and you can leave and take that to a different game and spend it somewhere else. So the user is now enabled to bring their data to the party, whereas before you couldn't really do that. And that data includes their money, and it includes their digital items. So I think that's the big shift that we're seeing, and that changes a lot about how applications serve up to users. It's going to change their user experiences. >> The script has flipped, and you're right on. I agree with you. I think you guys are smart to see it, and I think everyone who's on this wave will see it. Let's get into that, because this is happening. People are saying, "I'm done with being mined and being manipulated by the big Facebooks and the LinkedIns of the world, who were using the user." The contract was a free product, and you gave it your data. But then it got too far. Now people want to be in charge of their data. They want to broker their data, they want to collect their digital exhaust, maybe collect some things in a game, or do some commerce in an application or a marketplace. So these are the new use cases. How does the digital identity architecture work with Unstoppable? How are you guys enabling that?
Could you take us through the vision of how you came to this? Because it's unique: an NFT and the domain name concept coming together. Can you explain? >> Yeah. So we approached the problem by asking: if we're going to rebuild the way that people interact online, what are the first primitives they're going to need to make that possible? And we thought that one of the things you have on every network is a name. When you log on Twitter, you have a Twitter handle; when you log on Instagram, you have an Instagram handle. It's your name, right? You have that name on those applications. And right now what happens is, if users get kicked off the platform, they lose a hundred percent of their followers, right? And in some cases they can't even directly contact their followers on those platforms. There's no way for them to retain this social network. So you have all these influencers, who are today's small businesses, who build up these large, profitable small businesses online, being key opinion leaders to their demographic. And then they can be deplatformed, or they're unable to take this data and move to another platform if that platform raises its fees. You've seen several platforms increase their take rates to 10, 20, 30, 40%, and creators are getting locked in and getting squeezed, right? So we just said, you know what? The first thing you're going to want to own is this piece of digital property: your name across these applications. And if you look at every computer network in the history of computing, networks end up with a naming system. When we looked back at DNS, which came out in the nineties, it was just a way for people to find webpages much more easily, instead of mapping IP addresses.
And then we said to ourselves: what's going to happen in the future is, just like everyone has an email address that they use in their web two world to identify themselves as they log into all these applications, they're going to have an NFT domain in the web three world to authenticate and bring their data with them across these applications. So we saw a direct correlation between DNS and what we're doing with NFT domain name systems. And the bigger breakthrough here is that NFT domains are assets that live on a blockchain. They are owned by users and built on open systems, so multiple applications can read data off of them, and that makes them portable. So we were looking for an infrastructure play, a picks-and-shovels play, for the emerging web three metaverse. And we thought that names were something that, if we wanted a future where all 3.5 billion people with cell phones are sending crypto and digital assets back and forth, they're going to need, to make this a lot easier than these long IP addresses, or hex addresses in the case of crypto. >> And people have multiple wallets, too. There are all kinds of wallet variations, name verification; you see link trees everywhere. That's essentially just an app and it doesn't really do anything. So you're seeing people trying to figure it out. You've got to get a handle; I've got a LinkedIn handle. I mean, what do you do with it? >> Yeah. And then, specific to crypto, there was a very hair-on-fire use case for people who buy their first Bitcoin. For those in the audience who haven't done this yet: you go into an app, you buy your first Bitcoin or Ethereum or whatever cryptocurrency, and then the first time you try to send it, there's this field where you enter where you want to send it.
And it's this very long text address. It looks like an IP address from the 1980s, right? It's like a bank number, and no one's going to use that to send money back and forth to each other. So just like domain names and the DNS system replaced IP addresses, NFT domains on blockchain systems replace hex addresses for sending and receiving cryptocurrency: Bitcoin, Ethereum, whatever. That's the first use case; it really plugs in there. So when you want to send money to someone, instead of sending it to a long hex address that you have to copy and paste, where you can make an error or send it to the wrong place, which is pretty scary, you can just send it to johnfurrier.nft. So we thought you're just not going to get global adoption without better UX. It's the same thing that worked with the .com domains, and this is the same thing for Bitcoin and other crypto. >> It's interesting to look at the web one to web two trend. Web one went to web two, and it was all about user ease of use, right? Making things simpler. Clutter: you have more pages, you can't find things. That was search; that was Google. Since then, has there actually been an advancement? Facebook certainly is not an advancement; they're hoarding all the data. So I think we're broken between that step of, you know, free search to all the resources in the world, through which, by the way, they're mining a lot of data too, with the toolbar and Chrome. But now, where's that web three crossover? So take us through your vision on digital identity: from web two, with Google searching, Facebook, broken democracy, users not in charge, to web three. >> Got it. Well, we can start at web one. The way that I think about it is, if you go to web one, it was very simple: just text web pages. It was just a way for someone to put up a billboard, a piece of information, and here are some things that you could read about it. Right?
And then what happened with web two was you started having applications built with backend infrastructure to provide services. So if you think about web two, these are websites or web portals with services attached to them, whether that's a social network service or a search engine or whatever. And then as we move to web three, the new thing is that the user is coming on to that experience, and they're able to connect their wallet, their web three identity, to that app and bring their data to the party. So it's kind of like this: in web one, you just have a static web page; in web two, you have a static web page with a service, like a server behind it; and in web three, the user can come in and bring their database with them, in order to have much better app experiences. So how does that change things? Well, for one, it means you want data to be portable across apps. We touched on gaming earlier: maybe I have an in-game item for one game that I'm playing for a certain company, and I can take it across two or three different games. It also impacts money. Money is just digital information, so now I can connect to a bunch of different apps and just use cryptocurrency to make payments across those things, instead of having to use a credit card. But then another thing that happens is I can bring in an unlimited amount of additional information about myself when I plug in my wallet. As an example, when I plug in to Google search, they could take a look at the wallet that I've connected, and they could pull information about me that I've enabled, that I share with them. And this means I'm going to get a much more personalized experience on these websites, and I'm also going to have much more control over my data.
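The data-portability idea described above can be sketched in a few lines: the user's wallet holds their own records, and an app only ever sees the intersection of what it requested and what the user granted, for the duration of the session. This is a hypothetical sketch; the field names, values, and `connect` function are made up for illustration and are not a real wallet API.

```python
# Sketch of wallet-based data permissioning (web three style):
# the user, not the app, holds the data and grants access per field.
# All record keys and values below are illustrative only.

WALLET_DATA = {
    "crypto.ETH.address": "0x0000000000000000000000000000000000000001",
    "profile.location": "New York",
    "profile.recent_purchase": "house",
}

def connect(app_requested: list[str], user_granted: set[str]) -> dict:
    """Return only the fields the app asked for AND the user granted."""
    return {
        field: WALLET_DATA[field]
        for field in app_requested
        if field in user_granted and field in WALLET_DATA
    }

# An e-commerce app asks for three fields; the user opts in to two.
shared = connect(
    app_requested=[
        "profile.location",
        "profile.recent_purchase",
        "crypto.ETH.address",
    ],
    user_granted={"profile.location", "crypto.ETH.address"},
)
# Nothing is stored server side; `shared` exists only for the session.
```

The design point is that the app holds no database of user records at all, which is what makes the "nothing to steal" argument later in the conversation work.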
There are a lot of people out there right now who are worried about data privacy, especially in places like Europe, and one of the ways to solve that is simply to not store the data, and instead have the user bring it with them. >> I always thought about this, and I always debated it with Dave Vellante, my cohost: does top-down governance, privacy law, outweigh the organic bottom-up innovation? So what you're getting at here is, hey, you can actually have that solved before it even starts. It's almost as if those services were built for the problem of web two, not web three. What's your reaction to that? >> I think that is right on the money. And if I put my security researcher hat on, I think the biggest problem we have with security and privacy on the web today is that we have these large organizations collecting so much data on us, and they just become honeypots. There have been huge breaches; Equifax, a few years back, is a big one, where all your credit card data got leaked, right? All your credit information got leaked. We have this model where these big companies silo your data. They create a giant database, which is worth hundreds of millions of dollars, if not billions, to attackers, and then someone eventually is going to hack it to pull that information. Look instead at web three. For those in the audience who have used a web three application, one of these dapps, to trade cryptocurrencies or something, you'll know that when you go there, you actually connect your wallet. So when you're working with these web three apps, you bring your information with you and you connect it. That means the app has none of that storage, right? These apps that people are using for trading cryptocurrency on DEXs, or whatever, have no stored information.
So if someone hacks one of these DeFi exchanges, for instance, there's nothing to steal, because the only time the information is being accessed is when the user is actively using the site. And as someone who cares about security and privacy, I go, wow, that's a much better data model. It gives so much more control to the user, because the user permissions access to the data only during the time period in which they're interacting with the application. So I think you're right, and we are very excited to be building these tools, right? Because if you look at Europe, they basically passed GDPR, and then all the companies are going, "We can't comply with that," and they keep postponing it, or changing it a little bit and trying to make it easier to comply with. But honestly, we just need to switch the data models. If the companies aren't even taking the data, they're going to be in a much better spot. >> GDPR, again, is a nightmare. I think it's the wrong approach. I've said it was screwed up, because most companies don't even know where stuff is stored, never mind how they delete someone's entry in a database. They don't even know what they're collecting. At some level it becomes so complicated. So, right on the money; good call-out there. Question for you, then: do you decouple the wallet from the ID, or are they together? And is it going to be a universal wallet? Do you guys see yourselves as universal domains? Take me through the thinking around how you're looking at the wallet and the actual identity of the user, which obviously is super important. On the identity side, is that just universal, or is that going to be coming together? >> Well, I think so. The way that we think about it is that wallets are where people have their financial interactions online, right? And identity is much more like your passport.
So it's like your driver's license for the internet. These are two kind of separate products we see longer term, and they actually work together. So, you know, if you have a domain name, it's actually easier to make deposits into your wallet, because it's easier to remember to send money to, say, mattgould.crypto, and that way it's easier for me to receive payments or whatever. And then inside my wallet, I'm going to be doing DeFi trades or whatever, and that doesn't really have an interaction with names necessarily in order to do those transactions. But if I want to sign into a website or something, I can connect that with my NFT domain. And I do think these two things are kind of separate. We're still early, so figuring out exactly how the industry is going to shake out over a five-to-ten-year time horizon may be a little more difficult, and we could see some other emerging cornerstones of the crypto ecosystem. But I do think identity and reputation is one of those, and I also think the financial applications of DeFi are going to be another. So those are the two areas where I see it. And just a note on this: when you have a wallet, it usually has multiple cryptocurrency addresses. You're going to have like 50 cryptocurrency addresses in a wallet, and you're going to want one domain name that links back to all of those, because you're just not going to remember 50 different addresses. So that's how I think they collaborate. And we collaborate with several large wallets as well, like blockchain.com and another 30-plus of these, to make it easier for sending and receiving cryptocurrency. >> So the wallet is basically a dapp, the way you look at it; you integrate whatever you want, just integrate it in. How do I log into decentralized applications with my NFT domain name?
Because this becomes: okay, I've got to love the idea, love my identity, I'm in my own NFT. I mean, hell, this video is going to be an NFT soon; we'll get on board with the program here. But how do I log into my app? I'm going to have a dapp and I've got my domain name. Do I have to submit something? Is there benchmarking, an approval process? Are there APIs and an SDK kind of thinking around it? How are you thinking about dealing with the apps? >> Yeah, all of the above. What we're trying to do here is build an SSO solution, but one that's consumer based. So what we've done is adapt some SSO protocols that other people have used, the standard ones, in order to connect that back to an NFT domain in this case. And that way you keep the best of both worlds: you can use these authorization protocols for data permissioning that are standard web two APIs, but the permissioning system is actually based on the user-controlled NFT. So they're signing with their private/public key pair in order to make those updates. That allows you to connect into both of these systems. We think that's how technology typically impacts the world: it's not like you have something that just replaces something else overnight. You have an integration of these technologies over time, and we really see these NFT domains integrating nicely into regular apps. So as an example, in the future, when you log in, right now you see Google or Facebook, or you can type in an email address; you'll also see Unstoppable Domains, or NFT authorization, and you can SSO in with that to that website. When you go to a website, like an e-commerce website, you could share information about yourself, because you've connected your wallet now. So you could say, yes, I am a unique individual, I do live in New York, and I just bought a new house, right?
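The sign-in flow described above is essentially a challenge-response protocol: the app issues a one-time challenge, the user's wallet signs it, and the app verifies the signature against the key associated with the domain. A minimal sketch, with heavy assumptions: real systems use elliptic-curve signatures and on-chain records, but HMAC with a shared secret stands in here so the sketch runs with the standard library alone, and the domain, record name, and functions are all made up for illustration.

```python
# Sketch of NFT-domain SSO as challenge-response.
# HMAC stands in for a real public/private key signature scheme.

import hashlib
import hmac
import secrets

# Hypothetical record store; in reality this would live on chain.
DOMAIN_RECORDS = {"mattgould.crypto": {"auth.key": b"demo-wallet-secret"}}

def issue_challenge() -> str:
    """App -> user: a random one-time nonce to be signed."""
    return secrets.token_hex(16)

def wallet_sign(domain: str, challenge: str) -> str:
    """User's wallet signs the challenge (here: HMAC as a stand-in)."""
    key = DOMAIN_RECORDS[domain]["auth.key"]
    return hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()

def app_verify(domain: str, challenge: str, signature: str) -> bool:
    """App checks the signature against the domain's published key."""
    key = DOMAIN_RECORDS[domain]["auth.key"]
    expected = hmac.new(key, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

challenge = issue_challenge()
signature = wallet_sign("mattgould.crypto", challenge)
login_ok = app_verify("mattgould.crypto", challenge, signature)
```

Because the challenge is random and single-use, a captured signature can't be replayed, which is the property that lets the NFT domain act as a consumer SSO credential without the app ever storing a password.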
And when you permission all that information about yourself to that application, it can serve up a new user experience for you. We think it's going to be very interesting for doing rewards and discounts online, for e-commerce specifically, in the future, because that opens up a whole new market: they can ask you questions about yourself and you can deliver that information. >> Yeah, I really think the gaming market has totally nailed the future use case: in-game currency, in-game engagement, in-game data. And now bringing that out horizontally, the scalable surface area is huge, right? So, you know, I think that's huge success on the concept. The question I have to ask you is, are you getting any pushback from ICANN, the Internet Corporation for Assigned Names and Numbers? They've got dot everything now: .club, because of Clubhouse, .party, .live. I mean, the real domain name people are over here in web two, and you guys are coming out with web three. Where's the connection for people who are not following along the web three trend? How do you rationalize the domain angle here? >> Yeah, well, I would say that NFT domains are what domains on DNS were always meant to be, 30-plus years ago; they just didn't have blockchain systems back in the nineties when they were building these things, so there was no way to make them for individuals. So what happened was, DNS actually ended up being for business. If you look at DNS names, there are about 350 million registrations, and they're basically all small businesses: it's like 20 to 50 million small businesses who own the majority of these .com or regular DNS domain names, and that's their focus. NFT domains, because all of a sudden you have them in your crypto wallet, are actually for individuals.
So that market, instead of being for small businesses, is actually for end users. Instead of being for 20 to 50 million small businesses, we're talking about being useful for three to four billion people who have an internet connection. And so we actually think the market size for NFT domains is somewhere around 50 to 100x the market size for traditional domain names. And then the use cases are going to be much more for individuals on a day-to-day basis. People are going to want to use them for receiving cryptocurrency, versus receiving dollars or payments or USDC, or they're going to want to use them as identifiers on social networks, or they're going to want to use them for SSO. And they're not going to want to use them as much for things like websites, which is what web two is. And if I'm being perfectly honest, looking out 10 years from now, I think these traditional domain name systems are going to want to work with and adopt this new NFT technology, because they're going to want these features for their domains. So, in short, I think NFT domains are domain names with superpowers. This is the next generation of naming systems, and naming systems were always meant to be identity networks. >> Yeah, they hit a glass ceiling. They just can't; they're not built for that, right? And having people own their own names is essentially what decentralization is all about. Because what is a company? It's a collection of humans that aren't working in one place; they're decentralized. So then you decentralize the identity, and everything can be changed. So I completely love it. I think you guys are onto something really huge here. You pretty much laid out what's next for web three, but you guys are in this state of growth. You've seen people signing up for names; that's great. What are the best practices?
What steps are people taking? What's the common use case for folks who are putting this to work right now? What's the progression? >> Yeah. So the thing that we want to solve for people most immediately is to make it easier to send and receive crypto payments. And I know that sounds like a niche market, but there are over 200 million people right now who have some form of cryptocurrency, and 99.9% of them are still sending crypto using these really long hex addresses. And that market is growing at 60 to a hundred percent year over year. So first, we need to get crypto into everybody's pocket, and that's going to happen over the next three to five years; call it, if it doubles every year for the next five years, we'll be there. And then we want to make it easier for all those people to send crypto back and forth. And I will admit I'm a big fan of these stablecoins and these, I would say, utility-focused tokens that are coming out, just to make it easier for transferring money from here to Turkey and back, or whatever. And that's really the first step for NFT domain names. But what happens is, when you have an NFT domain and that's what you're using to receive payments, you then realize, oh, I can also use this to log into my favorite apps. It starts building that identity piece, and so we're also building products and services to make it more like your identity. We think it's going to build up over time. So instead of doing an identity network top-down, where a government or a corporation says, "You have to have an ID, here's your password, you have to have it," we're going to do it bottom-up. We're going to give everyone on the planet an NFT domain name and give them the utility to make it easier to send and receive cryptocurrency.
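The payment use case described above comes down to a name lookup: the sender types a human-readable domain, and the wallet resolves it to the hex address published in that domain's records. A minimal sketch, with a made-up in-memory registry, addresses, and record-key format; this is not the real Unstoppable Domains resolver API.

```python
# Sketch of NFT-domain -> crypto-address resolution.
# Registry contents and record keys are illustrative only.

REGISTRY = {
    "johnfurrier.nft": {
        "crypto.BTC.address": "bc1qexampleonly000000000000000000000000",
        "crypto.ETH.address": "0x0000000000000000000000000000000000000001",
    },
}

def resolve(domain: str, currency: str) -> str:
    """Return the receiving address a domain has published for a currency."""
    records = REGISTRY.get(domain.lower())
    if records is None:
        raise KeyError(f"domain not registered: {domain}")
    key = f"crypto.{currency.upper()}.address"
    if key not in records:
        raise KeyError(f"{domain} has no {currency} address on record")
    return records[key]

# "Send to johnfurrier.nft" is then just a lookup before the transfer:
eth_addr = resolve("johnfurrier.nft", "eth")
```

One name can front any number of per-currency addresses, which is the point made earlier about a single domain linking back to the 50 addresses in a typical wallet.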
Then they're going to say: hey, do you want to verify your Twitter profile? Yes. Okay, great, we attach that back. Hey, do you want to verify your Reddit? Yes. Instagram? Yes. TikTok? Yes. Do you want to verify your driver's license? Okay, yeah, we can attach that back. And then what happens is you end up organically building up digital identifiers for people using these blockchain naming systems. And once they have that, they're going to be able to share that information, and that's going to lead to better experiences online, both for commerce and just for better user experiences. >> You know, when the web came along, first of all, everyone poo-pooed web one: that was terrible, a bad idea, so unreliable, so slow, hard to find things. In web two, everyone bought a domain name for their company, but then as they added webpages, these permalinks became so long, the fully qualified web page address, you know, that permalink string. They bought keywords, and that's another layer on top. So you started to see that evolution in the web. Now it's kind of hit a ceiling. Here, everyone gets their NFT, they start doing more things, and then it becomes much more of a use case where it's more usable, not just for one thing. So we saw that movie before. It's like a permalink, but permanent. >> Yes. I mean, if we're lucky, it will be a decentralized, bottom-up global identity that appreciates user privacy and allows people to opt in. And that's what we want to build. >> And the gas prices thing, that's always coming up; it's always an objection here. I mean, blockchain is perfect for this because it's immutable, it's written on the chain, all good, totally secure. But what about the efficiency? How do you see that evolving, real quick? >> Well, a couple of comments on efficiency.
First of all, we picked domains as a first product to market because you need to take a look and see if the technology is capable of handling what you're trying to do. And for domain names, you're not updating them every day, right? If you look at traditional domain names, you only update them a couple of times per year. Most people set one up and configure it, and then it will have only a few changes for years. So the overall load isn't like a gaming problem, right? That part's good; we picked a good place to start for going to market. And then the second piece is, you're really just asking, are computer systems going to get more efficient over time? And if you know the history of that, the answer has always been yes. I remember in the nineties I had a modem, and it was, you know, whatever, 14 kilobits, and then it was 28, then 56, then 100. And now I have a hundred megabits up and down. I look at blockchain systems, and I don't know if anyone has a law for this yet, but the throughput of blockchains is going up over time, and there are going to be continued improvements over the next decade. We need them; we're going to use all of it. You just need to make sure you're planning a business that makes sense for the current environment. Just as an example, if you had tried to launch Netflix for online streaming in 1990, you would have had a bad time, because no one had bandwidth. So some applications are going to wait until a little later in the cycle, but I actually think identity is perfectly fine to get off the ground now. >> Yeah, the motivated parties for innovation are here. I mean, PointCast failed miserably; they tried to stream video over T1 lines back in the day, and nothing. So again, we've seen those speeds double and triple in homes right now. Matt,
congratulations, great stuff. Final TikTok moment here: how would you summarize, in a short clip, the difference between digital identity in web two and web three? >> In web two, you don't get to own your own online presence, and in web three, you do get to own it. So if I were going to simplify it, web three is really about ownership, and we're excited to give everyone on the planet a chance to own their name and choose when and where and how they want to share information about themselves. >> So now users are in charge. >> Exactly. >> They're not the product anymore. And if you're going to be the product, you might as well monetize the product, and that's the data. Real quick thoughts, just to close out: the role of data in all this, your view. >> We haven't enabled users to own their data online since the beginning of the internet, and we're now starting to do that. It's going to have profound changes for how every application on the planet interacts with its users. >> Awesome stuff. Matt, take a minute to give a plug for the company. How many employees have you got? What are you looking for in hiring? Fundraising? Give a quick commercial for what's going on at Unstoppable Domains. >> Yeah. So if you haven't already, check us out at unstoppabledomains.com. We're also on Twitter at @unstoppableweb, and we have a wonderful podcast as well that you should check out, if you haven't already. And we just crossed a hundred people. We're growing, you know, three to five hundred percent year over year. We're basically hiring for every position across the company right now. So if you're interested in getting into web three, even if you're coming from a traditional web two background, please reach out. We love teaching people about this new world and how you can be a part of it. >> And you're a virtual company. Do you have a little headquarters, or is it all virtual? What's the situation there?
>> Yeah, we're actually a hundred percent remote and asynchronous, and we're currently in five countries across the planet, mostly concentrated in the US and EU areas. >> A rumor; maybe you can confirm or deny this rumor: I heard a rumor that you have a mandatory vacation policy. >> This is true. And that's because we are a team of people who like to get things done, but we also know that recovery is an important part of any organization. If you push too hard, we want to remind people we're on a marathon, right? This is not a sprint, and we want people to be with us long term. We do think this is a ten-year move, and so yes, we do force people: we'll unplug you at the end of the year if you have vacation left. >> So what's the consequence if I don't take vacation? >> Yeah, we literally unplug you. You won't be able to get in; you won't be able to get into Slack, right? And that's how we regulate it. >> Well, when people start having their avatars be their bots and you don't even know what you're unplugging, at some point that's where you guys come in with the NFT, saying that's not the real person; it's not the real human. NFTs: great innovation, great use case. Matt, congratulations. Thanks for coming on and sharing the story to kick off this showcase with theCUBE. Thanks for sharing all that great insight. Appreciate it. >> John, I had a wonderful time. >> All right, just theCUBE, Unstoppable Domains showcase. We've got 10 great pieces of content we're dropping all today, so check them out. Stay with us for more coverage. I'm John Furrier with theCUBE. Thanks for watching.

Published Date : Feb 15 2022

SUMMARY :

Congratulations on the success of your company, Unstoppable Domains. Happy to be here. Love, first of all, love the story you got going on here. Do you see it from, you know, gaming and Fortnite and skins and Warcraft and all these other places. Can you talk about your view of this cultural shift? And if you look today, most people spend 50% plus of their time in front of a screen. You are the product, you know, they're mining you for data and they're just selling ads, right? And you gave it your data, but then it got too far. And we thought that one of the things that you have on every network, like when you log on Twitter, you have a Twitter handle. Uh, and then we said to ourselves, you know, this is a lot easier instead of, you know, these long IP addresses or a hex addresses in the case of Porto. I mean, what do you do with it? And then the first time you try to send it, there's this, there's this field where you want to send it. You know, a free search to all the resources in the world, to which, by the way, they're mining a lot of data too. So the way that I think about it is if you go to web one... So it's kind of like web one, you just have a static web page; web two, you have a static web page with a service. Uh, but then another thing that happens is I can bring in from, you know, an unlimited amount of additional information about... So what you're getting at here is, hey, if you can actually have that solved before... You know, a few years back is a big one and just all your credit card data got leaked, um, you know, trade cryptocurrencies or something, you'll know that when you go there, you actually connect to your wallet. So do you decouple the wallet? But then if I want to, uh, you know, sign into a website or something... And we collaborate with several large wallets as well, uh, like blockchain.com, uh, and you know... So the wallet, basically as a dApp, the way you look at it, you integrate whatever. And that way you keep the best of both worlds. 
And then when you permission all that information about yourself to that application, you can serve up a new user experience. So, you know, I think you're, that's huge success on the concept. So, and instead of being for, you know, 20 to 50 million small businesses... So it's like people are gonna want you on to use them for receiving cryptocurrency. What's the common, uh, use case for folks we're putting this to work right now for you guys? To make it easier for, you know, transferring money from here to Turkey and back or whatever. Uh, and then what happens is you end up building up... So you started to see that evolution in the web. And that's what we want to build. How do you see that evolving real quick? So, so the usage for that to set this up and configure it, you know... And if you know, the history of that has always been yes. How would you summarize, in a short clip? Uh, in, in web two, you don't get to own your own online presence. And that's the data. And we're now starting to do that. What are you guys looking for, for hiring, um, fundraising? Give a quick... Uh, we love teaching people about this new world and how you can be a part. Do you have a little headquarters or is it all virtual? Uh, mostly concentrated in the U.S. and EU areas. Rumor, maybe you can confirm or admit or deny this rumor. So if you push too hard... And that's a, that's how we regulate. Well, when people start having their avatars be their bot and you don't even know what you're unplugging at some point... Just the... Stay with us for more coverage. John Furrier

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Angela | PERSON | 0.99+
Matt Gould | PERSON | 0.99+
Turkey | LOCATION | 0.99+
Matt | PERSON | 0.99+
New York | LOCATION | 0.99+
two | QUANTITY | 0.99+
1990 | DATE | 0.99+
20 | QUANTITY | 0.99+
117 episodes | QUANTITY | 0.99+
three | QUANTITY | 0.99+
Facebook | ORGANIZATION | 0.99+
John | PERSON | 0.99+
10 | QUANTITY | 0.99+
billions | QUANTITY | 0.99+
50% | QUANTITY | 0.99+
second piece | QUANTITY | 0.99+
40% | QUANTITY | 0.99+
100 | QUANTITY | 0.99+
30 | QUANTITY | 0.99+
ten-year | QUANTITY | 0.99+
99.9% | QUANTITY | 0.99+
Equifax | ORGANIZATION | 0.99+
GDPR | TITLE | 0.99+
five countries | QUANTITY | 0.99+
Europe | LOCATION | 0.99+
50 different addresses | QUANTITY | 0.99+
10 great pieces | QUANTITY | 0.99+
Google | ORGANIZATION | 0.99+
John Furrier | PERSON | 0.99+
hundreds of millions of dollars | QUANTITY | 0.99+
LinkedIn | ORGANIZATION | 0.99+
U.S. | LOCATION | 0.99+
56 | QUANTITY | 0.99+
28 | QUANTITY | 0.99+
NFTE | ORGANIZATION | 0.99+
five | QUANTITY | 0.99+
two areas | QUANTITY | 0.99+
50 | QUANTITY | 0.99+
first product | QUANTITY | 0.99+
14 kilobits | QUANTITY | 0.99+
less than 1% | QUANTITY | 0.99+
both | QUANTITY | 0.99+
ICANN | ORGANIZATION | 0.99+
Chrome | TITLE | 0.99+
Instagram | ORGANIZATION | 0.99+
first | QUANTITY | 0.99+
Netflix | ORGANIZATION | 0.99+
today | DATE | 0.99+
over 200 million people | QUANTITY | 0.98+
both worlds | QUANTITY | 0.98+
50 million | QUANTITY | 0.98+
LinkedIns | ORGANIZATION | 0.98+
first time | QUANTITY | 0.98+
Twitter | ORGANIZATION | 0.98+
Fortnite | TITLE | 0.98+
one | QUANTITY | 0.98+
30 plus years ago | DATE | 0.98+
10 year | QUANTITY | 0.98+
Facebooks | ORGANIZATION | 0.98+
two things | QUANTITY | 0.98+
First | QUANTITY | 0.98+
Reddit | ORGANIZATION | 0.97+
FTE | ORGANIZATION | 0.97+
30 plus | QUANTITY | 0.97+
3.5 billion people | QUANTITY | 0.97+
about 350 million registrations | QUANTITY | 0.97+
60 | QUANTITY | 0.97+
1980s | DATE | 0.97+
Walton | ORGANIZATION | 0.97+
first use case | QUANTITY | 0.96+
100 X | QUANTITY | 0.96+
4 billion people | QUANTITY | 0.96+
EU | LOCATION | 0.96+

Breaking Analysis: Data Mesh...A New Paradigm for Data Management


 

from the cube studios in palo alto in boston bringing you data driven insights from the cube and etr this is breaking analysis with dave vellante data mesh is a new way of thinking about how to use data to create organizational value leading edge practitioners are beginning to implement data mesh in earnest and importantly data mesh is not a single tool or a rigid reference architecture if you will rather it's an architectural and organizational model that's really designed to address the shortcomings of decades of data challenges and failures many of which we've talked about on the cube as important by the way it's a new way to think about how to leverage data at scale across an organization and across ecosystems data mesh in our view will become the defining paradigm for the next generation of data excellence hello and welcome to this week's wikibon cube insights powered by etr in this breaking analysis we welcome the founder and creator of data mesh author thought leader technologist zhamak dehghani zhamak thank you for joining us today good to see you hi dave it's great to be here all right real quick let's talk about what we're going to cover i'll introduce or reintroduce you to zhamak she joined us earlier this year in our cube on cloud program she's the director of emerging tech at thoughtworks north america and a thought leader practitioner software engineer architect and a passionate advocate for decentralized technology solutions and and data architectures and zhamak since we last had you on as a guest which was less than a year ago i think you've written two books in your spare time one on data mesh and another called software architecture the hard parts both published by o'reilly so how are you you've been busy i've been busy yes um good it's been a great year it's been a busy year i'm looking forward to the end of the year and the end of these two books but it's great to be back and um speaking with you well you got to be pleased with the the momentum that 
data mesh has and let's just jump back to the agenda for a bit and get that out of the way we're going to set the stage by sharing some etr data our partner our data partner on the spending profile and some of the key data sectors and then we're going to review the four key principles of data mesh just it's always worthwhile to sort of set that framework we'll talk a little bit about some of the dependencies and the data flows and we're really going to dig today into principle number three and a bit around the self-service data platforms and to that end we're going to talk about some of the learnings that zhamak has captured since she embarked on the data mesh journey with her colleagues and her clients and we specifically want to talk about some of the successful models for building the data mesh experience and then we're going to hit on some practical advice and we'll wrap with some thought exercises maybe a little tongue-in-cheek some of the community questions that we get so the first thing i want to do we'll just get this out of the way is introduce the spending climate we use this xy chart to do this we do this all the time it shows the spending profiles and the etr data set for some of the more data related sectors of the etr taxonomy they they dropped their october data last friday so i'm using the july survey here we'll get into the october survey in future weeks but about 1500 respondents i don't see a dramatic change coming in the october survey but the the y-axis is net score or spending momentum the horizontal axis is market share or presence in the data set and that red line that 40 percent anything over that we consider elevated so for the past eight quarters or so we've seen machine learning slash ai rpa containers and cloud as the four areas where cios and technology buyers have shown the highest net scores and as we've said what's so impressive for cloud is it's both pervasive and it shows high velocity from a spending standpoint and we plotted 
the three other data related areas database edw analytics bi and big data and storage the first two well under the red line are still elevated the storage market continues to kind of plot along and we've we've plotted the outsourced it just to balance it out for context that's an area that's not so hot right now so i just want to point out that these areas ai automation containers and cloud they're all relevant to data and they're fundamental building blocks of data architectures as are the two that are directly related to data database and analytics and of course storage so it just gives you a picture of the spending sector so i wanted to share this slide zhamak that you presented in your webinar i love this it's a taxonomy put together by matt turck who's a vc and he called this the the mad landscape machine learning and ai and data and zhamak the key point here is there's no lack of tooling you've you've made the the data mesh concept sort of tools agnostic it's not like we need more tools to succeed in data mesh right absolutely great i think we have plenty of tools i think what's missing is a meta architecture that defines the landscape in a way that it's in step with organizational growth and then defines that meta architecture in a way that these tools can actually interoperate and integrate really well like the the clients right now have a lot of challenges in terms of picking the right tool regardless of the technology they go down the path either they have to go in and big you know bite into a big data solution and then try to fit the other integrated solutions around it or as you see go to that menu of large list of applications and spend a lot of time trying to kind of integrate and stitch this tooling together so i'm hoping that data mesh creates that kind of meta architecture for tools to interoperate and plug in and i think our conversation today around self-serve data platform um hopefully eliminates that
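The "meta architecture for tools to interoperate" idea above can be illustrated as small, stable interfaces that interchangeable tools plug into. This is a minimal hypothetical sketch in Python, not part of any real data mesh standard or product; the names (`DataStore`, `InMemoryStore`, `publish`) are illustrative assumptions only.

```python
# hypothetical sketch: a "meta architecture" expressed as a small interface
# that any storage tool (lake, warehouse, vendor product) could implement,
# so mesh-level code never binds to one tool; all names are illustrative
from typing import Any, Iterable, Protocol


class DataStore(Protocol):
    """interface a conforming storage tool must satisfy"""
    def write(self, dataset: str, rows: Iterable[dict]) -> None: ...
    def read(self, dataset: str) -> list: ...


class InMemoryStore:
    """stand-in implementation; a real deployment would plug in a vendor tool"""
    def __init__(self) -> None:
        self._data: dict = {}

    def write(self, dataset: str, rows: Iterable[dict]) -> None:
        self._data.setdefault(dataset, []).extend(rows)

    def read(self, dataset: str) -> list:
        return list(self._data.get(dataset, []))


def publish(store: DataStore, dataset: str, rows: list) -> int:
    """code written against the interface works with any conforming tool;
    returns the number of rows now readable in the dataset"""
    store.write(dataset, rows)
    return len(store.read(dataset))
```

The point of the sketch is the swap-out property discussed later in the episode: because `publish` depends only on the interface, the implementation underneath can change without touching mesh-level code.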
yeah we'll definitely circle back because that's one of the questions we get all the time from the community okay let's review the four main principles of data mesh for those who might not be familiar with it and those who are it's worth reviewing zhamak allow me to introduce them and then we can discuss a bit so a big frustration i hear constantly from practitioners is that the data teams don't have domain context the data team is separated from the lines of business and as a result they have to constantly context switch and as such there's a lack of alignment so principle number one is focused on putting end-to-end data ownership in the hands of the domain or what i would call the business lines the second principle is data as a product which does cause people's brains to hurt sometimes but it's a key component and if you start sort of thinking about it and talking to people who have done it it actually makes a lot of sense and this leads to principle number three which is a self-serve data infrastructure which we're going to drill into quite a bit today and then the question we always get when we introduce data mesh is how to enforce governance in a federated model so let me bring up a more detailed slide zhamak with the dependencies and ask you to comment please sure but as you said the the really the root cause we're trying to address is the siloing of the data external to where the action happens where the data gets produced where the data needs to be shared when the data gets used right in the context of the business so it's about the the really the root cause of the centralization gets addressed by distribution of the accountability end to end back to the domains and these domains this distribution of accountability technical accountability to the domains have already happened in the last you know decade or so we saw the transition from you know one general it group addressing all of the needs of the organization to technology groups within the it or 
even outside of the it aligning themselves to build applications and services that the different business units need so what data mesh does it just extends that model and say okay we're aligning business with the tech and data now right so both application of the data in ml or insight generation in the domains related to the domain's needs as well as sharing the data that the domains are generating with the rest of the organization but the moment you do that then you have to solve other problems that may arise and that you know gives birth to the second principle which is about um data as a product as a way of preventing data siloing happening within the domain so changing the focus of the domains that are now producing data from i'm just going to create that data i collect for myself and that satisfy my needs to in fact the responsibility of domain is to share the data as a product with all of the you know wonderful characteristics that a product has and i think that leads to really interesting architectural and technical implications of what actually constitutes data as a product and we can have a separate conversation but once you do that then that's the point in the conversation that cio says well how do i even manage the cost of operation if i decentralize you know building and sharing data to my technical teams to my application teams do i need to go and hire another hundred data engineers and i think that's the role of a self-serve data platform in a way that it enables and empowers generalist technologists that we already have in the technical domains the the majority population of our developers these days right so the data platform attempts to mobilize the generalist technologists to become data producers to become data consumers and really rethink what tools these people need um and the last last principle so data platform is really to giving autonomy to domain teams and empowering them and reducing the cost of ownership of the data products and finally 
as you mentioned the question around how do i still assure that these different data products are interoperable are secure you know respecting privacy now in a decentralized fashion right when we are respecting the sovereignty or the domain ownership of um each domain and that leads to uh this idea of both from operational model um you know applying some sort of a federation where the domain owners are accountable for interoperability of their data product they have incentives that are aligned with global harmony of the data mesh as well as from the technology perspective thinking about this data is a product with a new lens with a lens that all of those policies that need to be respected by these data products such as privacy such as confidentiality can we encode these policies as computational executable units and encode them in every data product so that um we get automation we get governance through automation so that's uh those that's the relationship the complex relationship between the four principles yeah thank you for that i mean it's just a couple of points there's so many important points in there but the idea of the silos and the data as a product sort of breaking down those cells because if you have a product and you want to sell more of it you make it discoverable and you know as a p l manager you put it out there you want to share it as opposed to hide it and then you know this idea of managing the cost you know number three where people say well centralize and and you can be more efficient but that but that essentially was the the failure in your other point related point is generalist versus specialist that's kind of one of the failures of hadoop was you had these hyper specialist roles emerge and and so you couldn't scale and so let's talk about the goals of data mesh for a moment you've said that the objective is to extend exchange you call it a new unit of value between data producers and data consumers and that unit of value is a data product 
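The four principles just discussed — domain ownership, data as a product, a self-serve platform, and federated governance through policies "encoded as computational executable units" — can be sketched as a single self-contained unit of value. This is a hypothetical Python illustration of the concept only; the class and field names are assumptions, not an API from any data mesh implementation.

```python
# hypothetical sketch of a data product as a self-contained unit of value
# that carries its own executable policies; all names are illustrative
from dataclasses import dataclass, field
from typing import Callable


def mask_email(row: dict) -> dict:
    """example privacy policy encoded as a computational, executable unit"""
    out = dict(row)  # copy so the stored data is left untouched
    if "email" in out:
        out["email"] = "***"
    return out


@dataclass
class DataProduct:
    name: str
    owner_domain: str                              # principle 1: domain ownership
    rows: list = field(default_factory=list)
    policies: list = field(default_factory=list)   # principle 4: governance embedded in the product

    def serve(self) -> list:
        """principle 2: data shared as a product, with every embedded
        policy enforced automatically on the way out"""
        served = []
        for row in self.rows:
            for policy in self.policies:
                row = policy(row)
            served.append(row)
        return served
```

In this framing, governance is not a central gate: each product enforces its own policies at the point of sharing, which is the "governance through automation" idea described above.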
and you've stated that a goal is to lower the cognitive load on our brains i love this and simplify the way in which data are presented to both producers and consumers and doing so in a self-serve manner that eliminates the tapping on the shoulders or emails or raising tickets so how you know i'm trying to understand how data should be used etc so please explain why this is so important and how you've seen organizations reduce the friction across the data flows and the interconnectedness of things like data products across the company yeah i mean this is important um as you mentioned you know initially when this whole idea of a data-driven innovation came to exist and we needed all sorts of you know technology stacks we we centralized um creation of the data and usage of the data and that's okay when you first get started with where the expertise and knowledge is not yet diffused and it's only you know the privilege of a very few people in the organization but as we move to a data driven um you know innovation cycle in the organization as we learn how data can unlock new new programs new models of experience new products then it's really really important as you mentioned to get the consumers and producers talk to each other directly without a broker in the middle because even though that having that centralized broker could be a cost-effective model but if you if we include uh the cost of missed opportunity for something that we could have innovated well we missed that opportunity because of months of looking for the right data then that cost parented the cost benefit parameters and formula changes so um so to to have that innovation um really embedded data-driven innovation embedded into every domain every team we need to enable a model where the producer can directly peer-to-peer discover the data uh use it understand it and use it so the litmus test for that would be going from you know a hypothesis that you know as a data scientist i think there is a pattern 
and there is an insight in um you know in in the customer behavior that if i have access to all of the different information about the customer all of the different touch points i might be able to discover that pattern and personalize the experience of my customer the litmus test is going from that hypothesis to finding all of the different sources be able to understanding and be able to connect them um and then turn them into you know training of my machine learning and and the rest is i guess known as an intelligent product got it thank you so i i you know a lot of what we do here in breaking analysis is we try to curate and then point people to new resources so we will have some additional resources because this this is not superficial uh what you and your colleagues in the community are creating but but so i do want to you know curate some of the other material that you had so if i bring up this next chart the left-hand side is a curated description both sides of your observations of most of the monolithic data platforms they're optimized for control they serve a centralized team that has hyper-specialized roles as we talked about the operational stacks are running running enterprise software they're on kubernetes and the microservices are isolated from let's say the spark clusters you know which are managing the analytical data etc whereas the data mesh proposes much greater autonomy and the management of code and data pipelines and policy as independent entities versus a single unit and you've made this the point that we have to enable generalists to borrow from so many other examples in the in the industry so it's an architecture based on decentralized thinking that can really be applied to any domain really domain agnostic in a way yes and i think if i pick one key point from that diagram is really um or that comparison is the um the the the data platforms or the the platform capabilities need to present a continuous experience from an application 
developer building an application that generates some data let's say i have an e-commerce application that generates some data to the data product that now presents and shares that data as as temporal immutable facts that can be used for analytics to the data scientist that uses that data to personalize the experience to the deployment of that ml model now back to that e-commerce application so if we really look at this continuous journey um the walls between these separate platforms that we have built needs to come down the platforms underneath that generate you know that support the operational systems versus supported data platforms versus supporting the ml models they need to kind of play really nicely together because as a user i'll probably fall off the cliff every time i go through these stages of this value stream um so then the interoperability of our data solutions and operational solutions need to increase drastically because so far we've got away with running operational systems an application on one end of the organization running and data analytics in another and build a spaghetti pipeline to you know connect them together neither of the ends are happy i hear from data scientists you know data analysts pointing finger at the application developer saying you're not developing your database the right way and application developers pointing back saying my database is for running my application it wasn't designed for sharing analytical data so so we've got to really what data mesh as a mesh tries to do is bring these two worlds together closer because and then the platform itself has to come closer and turn into a continuous set of you know services and capabilities as opposed to this disjointed big you know isolated stacks very powerful observations there so we want to dig a little bit deeper into the platform uh zhamak can you explain your thinking here because it's everybody always goes to the platform what do i do with the infrastructure what do i do so 
you've stressed the importance of interfaces the entries to and the exits from the platform and you've said you use a particular parlance to describe it and and this chart kind of shows what you call the planes not layers the planes of the platform it's complicated with a lot of connection points so please explain these planes and how they fit together sure i mean there was a really good point that you started with that um when we think about capabilities that enable build of applications builds of our data products build their analytical solutions usually we jump too quickly to the deep end of the actual implementation of these technologies right do i need to go buy a data catalog or do i need you know some sort of a warehouse storage and what i'm trying to kind of elevate us up and out is to to to force us to think about interfaces and apis the experiences that the platform needs to provide to run this secure safe trustworthy you know performant mesh of data products and if you focus on then the interfaces the implementation underneath can swap out right you can you can swap one for the other over time so that's the purpose of like having those lollipops and focusing and emphasizing okay what is the interface that provides a certain capability like the storage like the data product life cycle management and so on the purpose of the planes the mesh experience plane data product experience plane utility plane is really giving us a language to classify different set of interfaces and capabilities that play nicely together to provide that cohesive journey of a data product developer data consumer so then the three planes are really around okay at the bottom layer we have a lot of utilities we have that matt turck you know kind of mad data tooling chart so we have a lot of utilities right now they they manage workflow management you know they they do um data processing you've got your spark you've got your storage you've got your lake storage you've got your um 
time series of storage you've got a lot of tooling at that level but the layer that we kind of need to imagine and build today we don't buy yet as as long as i know is this linger that allows us to uh exchange that um unit of value right to build and manage these data products so so the language and the apis and interface of this product data product experience plan is not oh i need this storage or i need that you know workflow processing is that i have a data product it needs to deliver certain types of data so i need to be able to model my data it needs to as part of this data product i need to write some processing code that keeps this data constantly alive because it's receiving you know upstream let's say user interactions with a website and generating the profile of my user so i need to be able to to write that i need to serve the data i need to keep the data alive and i need to provide a set of slos and guarantees for my data so that good documentation so that some you know someone who comes to data product knows but what's the cadence of refresh what's the retention of the data and a lot of other slos that i need to provide and finally i need to be able to enforce and guarantee certain policies in terms of access control privacy encryption and so on so as a data product developer i just work with this unit a complete autonomous self-contained unit um and the platform should give me ways of provisioning this unit and testing this unit and so on that's why kind of i emphasize on the experience and of course we're not dealing with one or two data product we're dealing with a mesh of data products so at the kind of mesh level experience we need a set of capabilities and interfaces to be able to search the mesh for the right data to be able to explore the knowledge graph that emerges from this interconnection of data products need to be able to observe the mesh for any anomalies did we create one of these giant master data products that all the data goes into 
and all the data comes out of how we found ourselves the bottlenecks to be able to kind of do those mesh level capabilities we need to have a certain level of apis and interfaces and once we decide what constitutes that to satisfy this mesh experience then we can step back and say okay now what sort of a tool do i need to build or buy to satisfy them and that is not what the data community or data part of our organizations is used to i think traditionally we're very comfortable with buying a tool and then changing the way we work to serve to serve the tool and this is slightly inverse to that model that we might be comfortable with right and pragmatists will tell you people who've implemented data mesh they'll tell you they spent a lot of time on figuring out data as a product and the definitions there the organizational the getting getting domain experts to actually own the data and and that's and and they will tell you look the technology will come and go and so to your point if you have those lollipops and those interfaces you'll be able to evolve because we know one thing's for sure in this business technology is going to change um so you you had some practical advice um and i wanted to discuss that for those that are thinking about data mesh i scraped this slide from your presentation that you made and and by the way we'll put links in there your colleague emily who i believe is a data scientist had some really great points there as well that that practitioners should dig into but you made a couple of points that i'd like you to summarize and to me that you know the big takeaway was it's not a one and done this is not a 60-day project it's a it's a journey and i know that's kind of cliche but it's so very true here yes um this was a few starting points for um people who are embarking on building or buying the platform that enables the people enables the mesh creation so it was it was a bit of a focus on kind of the platform 
angle and i think the first one is what we just discussed you know instead of thinking about mechanisms that you're building think about the experiences that you're enabling uh identify who are the people like what are the what is the persona of data scientists i mean data scientist has a wide range of personas or did a product developer the same what is the persona i need to develop today or enable empower today what skill sets do they have and and so think about experience as mechanisms i think we are at this really magical point i mean how many times in our lifetime we come across a complete blanks you know kind of white space to a degree to innovate so so let's take that opportunity and use a bit of a creativity while being pragmatic of course we need solutions today or yesterday but but still think about the experiences not not mechanisms that you need to buy so that was kind of the first step and and the nice thing about that is that there is an evolutionary there is an iterative path to maturity of your data mesh i mean if you start with thinking about okay which are the initial use cases i need to enable what are the data products that those use cases depend on that we need to unlock and what is the persona of my or general skill set of my data product developer what are the interfaces i need to enable you can start with the simplest possible platform for your first two use cases and then think about okay the next set of data you know data developers they have a different set of needs maybe today i just enable the sql-like querying of the data tomorrow i enable the data scientists file based access of the data the day after i enable the streaming aspect so so have this evolutionary kind of path ahead of you and don't think that you have to start with building out everything i mean one of the things we've done is taking this harvesting approach that we work collaboratively with those technical cross-functional domains that are building the data products and 
see how they are using those utilities and harvesting what they are building as the solutions for themselves back into the back into the platform but at the end of the day we have to think about mobilization of the large you know largest population of technologists we have we'd have to think about diffusing the technology and making it available and accessible by the generalist technologists that you know and we've come a long way like we've we've gone through these sort of paradigm shifts in terms of mobile development in terms of functional programming in terms of cloud operation it's not that we are we're struggling with learning something new but we have to learn something that works nicely with the rest of the tooling that we have in our you know toolbox right now so so again put that generalist as the uh as one of your center personas not the only persona of course we will have specialists of course we will always have data scientist specialists but any problem that can be solved as a general kind of engineering problem and i think there's a lot of aspects of data mesh that can be just a simple engineering problem um let's just approach it that way and then create the tooling um to empower those generalists great thank you so listen i've i've been around a long time and so as an analyst i've seen many waves and we we often say language matters um and so i mean i've seen it with the mainframe language it was different than the pc language it's different than internet different than cloud different than big data et cetera et cetera and so we have to evolve our language and so i was going to throw a couple things out here i often say data is not the new oil because because data doesn't live by the laws of scarcity we're not running out of data but i get the analogy it's powerful it powered the industrial economy but it's it's it's bigger than that what do you what do you feel what do you think when you hear the data is the new oil yeah i don't respond to those 
data as the gold or oil or whatever scarce resource because as you said it evokes a very different emotion it doesn't evoke the emotion of i want to use this i want to utilize it feels like i need to kind of hide it and collect it and keep it to myself and not share it with anyone it doesn't evoke that emotion of sharing i really do think that data and i with it with a little asterisk and i think the definition of data changes and that's why i keep using the language of data product or data quantum data becomes the um the most important essential element of existence of uh computation what do i mean by that i mean that you know a lot of applications that we have written so far are based on logic imperative logic if this happens do that and else do the other and we're moving to a world where those applications generating data that we then look at and and the data that's generated becomes the source the patterns that we can exploit to build our applications as in you know um curate the weekly playlist for dave every monday based on what he has listened to and the you know other people has listened to based on his you know profile so so we're moving to the world that is not so much about applications using the data necessarily to run their businesses that data is really truly is the foundational building block for the applications of the future and then i think in that we need to rethink the definition of the data and maybe that's for a different conversation but that's that's i really think we have to converge the the processing that the data together the substance substance and the processing together to have a unit that is uh composable reusable trustworthy and that's that's the idea behind the kind of data product as an atomic unit of um what we build from future solutions got it now something else that that i heard you say or read that really struck me because it's another sort of often stated phrase which is data is you know our most valuable asset and and you 
push back a little bit on that. When you hear people call data an asset... people have often said they think data should be, or will eventually be, listed as an asset on the balance sheet. And in hearing what you said, I thought, well, maybe data as a product is an income-statement thing: it's generating revenue or it's cutting costs. It's not necessarily a balance-sheet asset, because I don't share my assets with people; I don't make them discoverable. Add some color to this discussion. >> I think it's actually interesting you mention that, because I read about the new policy in China where CFOs actually have a line item around the data that they capture. We don't have to go into the political conversation around the authoritarianism of collecting data, and the power that creates and the society that leads to; that big conversation aside, I think you're right. I mean, data as an asset generates a different behavior. It creates different performance metrics that we would measure. Before the conversation around data mesh came to exist, we were measuring the success of our data teams by the terabytes of data they were collecting, by the thousands of tables that they had stamped as golden data. None of that necessarily leads to value; there's no direct line I can see between that and the value the data actually generated. So I think it's rather harmful, because it leads to the wrong measures, the wrong metrics of success. But if you invert that to a bit of product thinking, something that you share to delight the experience of users, your measures are very different. Your measures are the happiness of the user, the decreased lead time for them to actually use it and get value out of it, the growth of the population of the users. So it evokes a very different kind of behavior and set of success metrics. I do say, if I may, that I'll probably
come back and regret the choice of the word "product" one day, because of the monetization aspect of it. Maybe there is a better word to use, but that's the best I think we can use at this point in time. >> Why do you say that, Zhamak? Because it's too directly related to monetization, which has a negative connotation, or might not apply in things like healthcare? >> I think because people want to take shortcuts. I remember this conversation from years back: people think that the reason to collect data, or have data, is so that we can sell it; it's just the monetization of the data. And we have this idea of the data marketplaces and so on. I think that is actually the least valuable outcome we can get from thinking about data as a product: the direct sale and exchange of data as a monetary exchange of value. So that might redirect our attention away from what really matters, which is enabling the use of data for ultimately generating value for people, for the customers, for the organizations, for the partners, as opposed to thinking about it as a unit of exchange for money. >> I love data as a product. I think your instinct was right on, and I'm glad you brought that up, because I think people misunderstood, in the last decade, data as selling data directly. What you're really talking about is using data as an ingredient to actually build a product that has value, and that value either generates revenue, cuts costs, or helps with a mission; it could be saving lives. But in some way, for a commercial company, it's about the bottom line, and that's just the way it is. So I love data as a product; I think it's going to stick. Now, one of the other things that struck me in one of your webinars was one of the Q&A questions: can I finally get rid of my data warehouse? So I want to talk about the data warehouse, the data
lake, which some people don't like, I know John Furrier, my business partner, doesn't like that term, and the data hub. One of the things I've learned from observing your work is that whether it's a data lake, a data warehouse, a data hub, data whatever, it should be a discoverable node on the mesh; the technology really doesn't matter. What are your thoughts on that? >> Yeah, I think the real shift is from a centralized data warehouse to a data warehouse where it fits. So if you just cross out that centralized piece, we are all in agreement that data warehousing provides interesting and capable capabilities that are still required, perhaps as an edge node of the mesh that is optimized for certain queries, let's say financial reporting. We still want to direct a fair bit of data into a node that is just for those financial reportings, and it requires the precision and the speed of operation that the warehouse technology provides. So I think that technology definitely has a place. Where it falls apart is when you want one warehouse to rule all of your data, and to canonically model your data, because you have to put so much energy into trying to harness this model and create these very complex and fragile snowflake schemas and so on, that that's all you do. You spend energy against the entropy of your organization, trying to get your arms around this model, and the model is constantly out of step with what's happening in reality, because the reality of the business is moving faster than our ability to model everything into one canonical representation. I think that's the thing we need to challenge, not necessarily the application of data warehousing on a node. >> I want to close by coming back to the issue of standards. You've specifically envisioned data mesh to be technology agnostic, as I said before, and of course everyone,
myself included, is going to run a vendor's technology platform through a data mesh filter. The reality is, per the Matt Turck chart we showed earlier, there are lots of technologies that can be nodes within the data mesh, or facilitate data sharing or governance, et cetera, but there's clearly a lack of standardization. I'm sometimes skeptical that the vendor community will drive this, but maybe, like Kubernetes, Google or some other internet giant is going to contribute something to open source that addresses this problem. Talk a little bit more about your thoughts on standardization. What kinds of standards are needed, and where do you think they'll come from? >> Sure. I mean, you're right that the vendors are not incentivized today to create those open standards, because the operational model of some vendors, not all of them, is "bring your data to my platform, and then bring your computation to me, and all will be great." And that will be great for a portion of the clients, and a portion of environments where the complexity we're talking about doesn't exist. So yes, we need other players, perhaps some of the cloud providers, or people who are more incentivized to open their platforms for data sharing. As a starting point, I think standardization around data sharing. If you look at the spectrum right now, we have a de facto, it's not even a standard, for something like SQL; I mean, everybody has bastardized and extended it with so many things that I don't even know what standard SQL is anymore. But we have that for some form of querying. Beyond that, I know, for example, the folks at Databricks have started to create some standards around Delta Sharing, sharing the data in different models. So I think data sharing as a concept, the same way that APIs were about capability sharing: we need to have data APIs, or analytical data APIs, and data sharing extended to go beyond simply SQL or languages
like that. I think we need standards around computational policies. This is, again, something that is formulating in the operational world. We have a few standards around how you articulate access control, how you identify the agents who are trying to access, with different authentication mechanisms. We need to bring our own data-specific articulation of policies. Something as simple as identity management across different technologies is non-existent: if you want to secure your data across three different technologies, there is no common way of saying who the agent is that is acting to access the data, and whether I can authenticate and authorize them. So those are some of the very basic building blocks, and then the gravy on top would be new standards around enriched semantic modeling of the data, so we have a common language to describe the semantics of the data in different nodes, and the relationships between them. We have prior work with RDF, and the folks who were focused on linking data across the web, the data web work, I guess, that we had in the past. We need to revisit those and see their practicality in the enterprise context. So: data modeling, a rich language for data semantic modeling, and data connectivity, most importantly. Those are some of the items on my wish list. >> That's good. Well, we'll do our part to try to push that standards movement. Zhamak, we're going to leave it there. I'm so grateful to have you come on the Cube; I really appreciate your time. It's just always a pleasure; you're such a clear thinker. So thanks again. >> Thank you, Dave. It's wonderful to be here. >> Now, we're going to post a number of links to some of the great work that Zhamak and her team have done, and her books, so check that out. Remember, we publish each week on siliconangle.com and wikibon.com, and these episodes are all available as podcasts
wherever you listen; just search "Breaking Analysis podcast." Don't forget to check out etr.plus for all the survey data. Do keep in touch: I'm @dvellante, follow Zhamak at @zhamakd, or you can email me at david.vellante@siliconangle.com, or comment on the LinkedIn post. This is Dave Vellante for the Cube Insights, powered by ETR. Be well, and we'll see you next time.

Published Date : Oct 25 2021



Venkat Venkataramani and Dhruba Borthakur, Rockset | CUBE Conversation


 

(bright intro music) >> Welcome to this "Cube Conversation". I'm your host, Lisa Martin. This is part of our third AWS Start-up Showcase. And I'm pleased to welcome two gentlemen from Rockset: Venkat Venkataramani is here, the CEO and co-founder, and Dhruba Borthakur, CTO and co-founder. Gentlemen, welcome to the program. >> Thanks for having us. >> Thank you. >> Excited to learn more about Rockset. Venkat, talk to me about Rockset and how it's putting real-time analytics within the reach of every company. >> If you see the Confluent IPO, if you see where the world is going in terms of analytics, the way we look at this, real-time analytics is like the last frontier. Everybody wants fast queries on fresh data. Nobody wants to say, "I don't need that. You know, give me slow queries on stale data," right? I think if you see what data warehouses and data lakes have done, especially in the cloud, they've really, really made batch analytics extremely accessible, but real-time analytics still seems too clumsy, too complex, and too expensive for most people. And we are on a mission to make real-time analytics very, very easy and affordable for everybody to be able to take advantage of. So that's what we do.
You know, if you think about one domain that actually understood real-time and made everything work in real-time, it's the DevOps world: metrics and monitoring coming out of all these machines, because they really want to know as soon as something goes wrong, and immediately be able to dive in and click and see what happens. But now businesses are demanding the same thing, right? Like, a CEO wants to know, "Are we on track to hit our quarterly estimates or not? And tell me now what's happening," because, you know, the larger the company, the more complex their operations dashboards are. And if you don't give them real-time visibility, the window of opportunity to do something about it disappears. And so businesses are really demanding that. So that is one big use case we have. And the other strange thing we're also seeing is that customers are demanding real-time even from the products they are using. So you could be using a SaaS product for sales automation, support automation, marketing automation. Now I don't want to use a product if it doesn't have real-time analytics baked into the product itself. And so all these software companies providing a SaaS service to their cloud customers and clients are also finding that their proof of value really comes from the analytics they can show within the product. And if that is not interactive and real-time, then they are also going to be left behind. So it's really a huge differentiator. Whether you're building a software product or you're running a business, real-time observability gives you a window of opportunity to actually do something when something goes wrong; you can act on it very, very quickly. >> Right, which is absolutely critical. Dhruba, I want to get your take on this.
As the CTO and co-founder, as I introduced you, what were some of the gaps in the market back in 2016 that you saw that really necessitated the development of this technology? >> Yeah, for real-time analytics, the difference compared to what came earlier is that things used to be a lot of batch processes. The reason is that there was something called MapReduce, a scanning system, kind of an invention from Google, which was about processing big datasets by scanning large datasets to give answers. Whereas for real-time analytics, the new trend is: how can you index these big datasets so that you can answer queries really fast? So this is what Rockset does as well: we have capabilities to index humongous amounts of data cheaply, efficiently, and in a way that is economically feasible for our customers. And the queries leverage the index to give fast (indistinct). This is one of the big changes. The other change, obviously, is that a lot of analytics have moved to the cloud. So Rockset is built natively for the cloud, which is why we can scale resources up and down as queries come, and we can provide a great (indistinct) for people on both data latency and query latency. So these two trends, I think, are the power behind making people use more real-time analytics. >> Right, and as Venkat was talking about how it's an absolute differentiator for businesses, you know, last year we saw all these quick pivots to survive and ultimately thrive. And we're seeing the businesses now coming out of this that were able to do that, able to pivot to digital, be successful, and out-compete those who maybe were not as fast.
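As an editorial aside, the scan-versus-index distinction Dhruba draws above can be sketched in a few lines of Python. This is a hypothetical toy, not Rockset's engine or API: the record shape, field names, and functions are invented for the example. A batch-style query touches every record, while an inverted index built at ingest time answers the same question without a scan.

```python
# Toy contrast: MapReduce-style full scan vs. an inverted index built at
# ingest time. Field names and records are invented for illustration.
from collections import defaultdict

records = [
    {"id": 1, "city": "Boston", "amount": 40},
    {"id": 2, "city": "Palo Alto", "amount": 25},
    {"id": 3, "city": "Boston", "amount": 35},
]

def query_by_scan(records, city):
    # O(N): every record is touched, the way a batch scan does it.
    return [r["id"] for r in records if r["city"] == city]

def build_index(records):
    # One-time cost paid at ingest instead of at query time.
    index = defaultdict(list)
    for r in records:
        index[r["city"]].append(r["id"])
    return index

index = build_index(records)

def query_by_index(index, city):
    # O(matches): the index did the work, so queries stay fast as data grows.
    return index.get(city, [])

assert query_by_scan(records, "Boston") == query_by_index(index, "Boston")
```

The trade-off the sketch makes visible is the one Dhruba names: you pay compute at write time (maintaining the index) so that read latency no longer scales with the size of the dataset.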
I saw that recently, Venkat, you guys had a major product release a few weeks ago that is making real-time analytics on streaming data sources like Apache Kafka, Amazon Kinesis, Amazon DynamoDB, and data lakes a lot more accessible and affordable. Break down that launch for me: how does it deliver the accessibility and affordability that you talked about before? >> Extremely good question. So we're really excited about what we call SQL-based roll-ups; that's the name of the release. So what does that do? If you think about real-time analytics, and teeing off the previous question you asked about the gap in the market: the gap is really that warehouses and lakes are built for batch. They're really good at letting people accumulate huge volumes of data, and once a week an analyst asks a question, generates a report, and everybody looks at it. But with real-time, the data never stops coming, and the queries never stop coming. So if I want real-time metrics on all these huge volumes of data coming in, and I drain it into a huge data lake and then do analytics on that, it gets very expensive and very complex very quickly. And so the new release is called SQL-based roll-ups, where, simply using SQL, you can define any real-time metric that you want to track across any dimensions you care about. It could be geo, demographics, or other dimensions, and Rockset will automatically maintain all those real-time metrics for you, in real-time, in a highly accurate fashion. So you never have to doubt whether the metrics are valid, and they will be accurate up to the second. And the best part is you don't have to learn a new language. You can actually use SQL to define those metrics, and Rockset will automatically maintain and scale that for you in the cloud. And that, I think, reduces the barrier.
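The essence of what Venkat describes, metrics maintained incrementally as events arrive rather than recomputed by re-scanning raw data, can be made concrete with a small sketch. This is an illustrative toy, not Rockset's implementation; the class, event fields, and dimension names are invented for the example (in the product itself, the roll-up would be defined in SQL).

```python
# Toy incremental roll-up: pre-aggregate a metric per dimension tuple,
# updating on every event so reads never re-scan the raw stream.
from collections import defaultdict

class RollUp:
    """Maintain running count and sum per dimension tuple."""

    def __init__(self, dimensions):
        self.dimensions = dimensions
        self.counts = defaultdict(int)
        self.sums = defaultdict(float)

    def ingest(self, event):
        # The aggregation cost is paid once, at ingest time.
        key = tuple(event[d] for d in self.dimensions)
        self.counts[key] += 1
        self.sums[key] += event["value"]

    def metric(self, *key):
        # Reads are O(1) lookups, accurate up to the last ingested event.
        return {"count": self.counts[key], "sum": self.sums[key]}

rollup = RollUp(dimensions=["geo"])
rollup.ingest({"geo": "US", "value": 10.0})
rollup.ingest({"geo": "US", "value": 5.0})
rollup.ingest({"geo": "EU", "value": 7.0})

assert rollup.metric("US") == {"count": 2, "sum": 15.0}
```

The design point is the one in the interview: because the metric is folded in as data streams by, freshness and query speed no longer trade off against each other.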
So if somebody wants to track something for their business in real-time, today you have to duct-tape together multiple disparate components and systems that were never meant to work with each other. Now you have a real-time database built for the cloud that, you know, fully supports full-featured SQL. So you can do this in a matter of minutes, which would probably take you days or weeks with alternate technologies. >> That's a dramatic reduction in time there. I want to mention the Snowflake IPO, since you guys mentioned the Confluent IPO. You say that Rockset does for real-time what Snowflake did for batch. Dhruba, I want to get your perspective on that. Tell me about that. What do you mean by that? >> Yeah, so we see this trend in the market where a lot of analytics, which are very batch, get a lot of value if they move more real-time, right? Like Venkat mentioned, when analytics powers actual products, which need to use analytics to make the product better. So Rockset very much plays in this area. Rockset is the only solution, I shouldn't say solution, it's a database, a real-time database, which powers these kinds of analytic systems. If you don't use Rockset, then you might be using maybe a warehouse or something, but you cannot get real-time, because there is always a latency in putting data into the warehouse. It could be minutes, it could be hours. And then also you can't have too many people making concurrent queries on the warehouse. So this is another difference for real-time analytics: because it powers applications, the query volume could be large. So that's why you need a real-time database, and not a real-time warehouse or other technologies, for this. And this trend has really caught up, because most people are pretty much on this journey already. You asked me this previous question about what has changed since 2016 as well.
And this is a journey that most enterprises we see are already embarking upon. >> One thing, too, that we're seeing is that more and more applications are becoming data-intensive applications, right? Whether it's Instagram or DoorDash, or even our banking app, we expect to have the information updated immediately. How do you help, Dhruba, sticking with you, how do you help businesses build and power those data-intensive applications that consumers are demanding? >> That's a great question. And both of us, me and Venkat, have seen these data applications at large scale when we were at Facebook earlier. We were both part of the Facebook team. So we saw how real-time was really important for building that kind of a business; that was social media. But now we are taking the same kind of back ends, which can scale to huge volumes of data, to the enterprises as well. Venkat, do you have anything to add? >> Yeah, I think when you're trying to go from batch to real-time, you're 100% spot on that a static report, a static dashboard, becomes an application, a data application, and it has to be interactive. You're not just showing a newspaper that people only get to read. You want to click and deep-dive, slice and dice the data, to understand not only what happened but why it happened, and come up with hypotheses to figure out what to do about it. So the interactivity is important, and the real-timeliness now becomes important. The way we think about it is, once you go into real-time analytics, the data never stops coming. That's obvious. Data freshness is important. But the queries never stop coming either, because once your dashboards and metrics are up to date in real-time, you really want alerts and anomaly detection to be automatically built in. And so you don't even have to look at the graphs once a week.
When something is off, the system will come and tap you on the shoulder and say, "Hey, something is going on." And so that really is a real-time application at that point, because it's constantly looking at the data, querying on your behalf, and only alerting you when something interesting is actually happening that you might need to look at. So yeah, the whole movement towards data applications and data-intensive apps is a huge use case for us. I think most of our customers, I would say, are building a data application in one shape or form or another. >> And if I think of use cases like, you know, customer 360: as customers and consumers of whatever product or solution we're talking about, we expect that these brands know who we are, know what we've done with them, what we've bought. What to show me next is what I expect, whether, again, it's my bank or Instagram or something else. So that personalization approach is absolutely critical, and I imagine another big game changer and differentiator for the customers that use Rockset. What do you guys think about that? >> Absolutely, personalized recommendation is a huge use case. We see this everywhere; Ritual is one of our customers, and we have a case study on that, I think. They want to personalize: they generate offline recommendations for anything that the user is buying, but they want to use behavioral data from the product to personalize that experience, and combine the two before they serve anything in the checkout lane, right? We also see real-time analytics and data applications becoming very important in B2B companies. We have another customer, Command Alkon, who have a supply chain platform for heavy construction; 80% of concrete in North America flows through their platform, for example. And what they want to know, in real-time, is reporting on how many concrete trucks are arriving at a big construction site, which ones are late, and whatnot.
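The "tap on the shoulder" behavior Venkat described a moment ago, the system querying on your behalf and alerting only when something is off, can be illustrated with a deliberately simple rule. This is a toy threshold check on a metric's recent history, not a description of any product's actual anomaly detection; the function and data are invented for the example.

```python
# Toy alert rule: flag a reading that deviates from recent history by more
# than `sigma` standard deviations. Real systems use far richer models.
import statistics

def check_alert(history, latest, sigma=3.0):
    """Return True when `latest` is an outlier relative to `history`."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(latest - mean) > sigma * stdev

history = [100, 102, 98, 101, 99, 100, 103, 97]  # e.g. trucks/hour, orders/min
normal = check_alert(history, 100)    # in line with history: stay quiet
anomaly = check_alert(history, 150)   # far outside history: tap the shoulder
```

Run continuously against a fresh metric stream, a rule like this is what turns a dashboard nobody watches into the application that watches the data for you.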
And the real-time analytics needs to be accurate and needs to be up to the second. Don't tell me what trucks were coming an hour ago; no, I need this right now. And so even in a B2B platform, we see that very similar trend where real-time reporting, real-time search, real-time indexing is actually a very, very important piece of the puzzle, and not just in the B2C examples that you mentioned. And the Instagram comment is also very appropriate, because a hedge fund customer came to us and said, "I have dashboards built on top of, like, Snowflake. Parts of them are taking two to five seconds, and I actually have 50 or 60 visualizations. You do the math; it takes many minutes to load." And so they said, "Hey, you have some indexing tech. Can you make this faster?" Three weeks later, the queries that would take two to five seconds on a traditional warehouse or a cloud data warehouse came back in 18 milliseconds with Rockset. And it is so fast that they said, "If my internal dashboards are not as fast as Instagram, no one in my company uses them." These are their words. And so, really, the speed is really, really important. The scale is really, really important. Data freshness is important. If you combine all of these things, and also make it simple for people to access with SQL, that's really the unique value prop that we have at Rockset, which is what our customers love. >> You brought up something interesting, Venkat, that kind of made me think of the employee experience. You know, we always think of the customer 360. The customer experience and the employee experience, in my opinion, are inextricably linked. Employees have to have access to what they need to deliver and help these great customer relationships.
And as you were saying, you know, employees are expecting databases to be as fast as what they see on Instagram when they're, you know, surfing in their free time. Then adoption, I imagine, gets better, and obviously the benefit from the end users' and customers' perspective is that speed. Talk to me a little bit about how Rockset, and I would like to get both of your opinions here, is a facilitator of that employee productivity for your customers. >> This is a great question. In fact, with that same hedge fund customer, I pushed them to go and measure how many times people even look at all the data that they produce. (laughs) How many analysts and investors actually use your dashboards? Go investigate that. And one of the things they eventually showed me was a huge uptake: their dashboards went from two-to-three-second lags to 18 milliseconds, and the daily active users of their own internal dashboards went from five people to almost the entire company. So I think you're absolutely spot on. It really goes back to leveraging the data and actually doing something about it. Like, if I ask a question and the system is going to take 20 minutes to answer it, I will probably not ask as many questions as I want to. When it becomes interactive and very, very fast, all of a sudden I not only start with a question; I can ask a follow-up question, and then another follow-up question, and really drive that to a conclusion, and then I can actually act upon it. And this really accelerates. So even if you look at the macro, you hear these phrases, "the world is going from batch to real-time," and in my opinion, when I look at this, people want to accelerate their growth. People want to make faster decisions.
People want to get to "What can I do about this?" and get actionable insights. And that is not really going to come from systems that take 20 minutes to give a response. It's going to come from systems that are interactive and real-time, and that need for acceleration is what's really driving this movement from batch to real-time. And we're very happy to facilitate and accelerate that movement. >> And it really drives the opportunity for your customers to monetize more and more data, so that they can actually act on it, as you said, in real-time, and do something about it, whether it's a positive experience or it is, you know, remediating a challenge. Last question, guys, since we're almost out of time here, but I want to understand: talk to me about the Rockset-AWS partnership and what the value is for your customers. >> Okay, yeah. I'll get to that in a second, but I wanted to add something to your previous question. My opinion, for all the customers that we see, is that real-time analytics is addictive. Once they get used to it, they can't go back to the old stuff. This is what we have found with all our customers. So, yeah, for the AWS question, I think maybe Venkat can answer that better than me. >> Yeah, I mean, we love partnering with AWS. I think they are the world's leader when it comes to public clouds. We have a lot of joint happy customers that are all AWS customers. Rockset is entirely built on top of AWS, and we love that. And there are a lot of integrations that Rockset natively comes with. So if you're already managing your data in AWS, there are no data transfer costs or anything like that involved for you to also index that data in Rockset, and actually build real-time applications and stream the data to Rockset. So the partnership goes very, very deep: we are an AWS customer, we are a partner, and our go-to-market teams work with them.
And so, yeah, we're very, very happy, you know, like, AWS fanboys here, yeah. >> Excellent, it sounds like a very great synergistic collaborative relationship, and I love, Dhruba, what you said. This is like, this is a great quote. "Real-time analytics is addictive." That sounds to me like a good addiction (all subtly laugh) for businesses and every industry to take out. Guys, it's been a pleasure talking to you. Thank you for joining me, talking to the audience about Rockset, what differentiates you, and how you're helping customers really improve their customer productivity, their employee productivity, and beyond. We appreciate your time. >> Thanks, Lisa. >> Thank you, thanks a lot. >> For my guests, I'm Lisa Martin. You're watching this "Cube Conversation". (bright ending music)

Published Date : Sep 14 2021

SUMMARY :

Lisa Martin talks with Rockset's Dhruba Borthakur and Venkat Venkataramani about the shift from batch to real-time analytics, how Rockset indexes data so developers can build interactive, real-time applications, why customers find real-time analytics addictive, and the depth of Rockset's partnership with AWS.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Lisa Martin | PERSON | 0.99+
AWS | ORGANIZATION | 0.99+
Rockset | ORGANIZATION | 0.99+
Facebook | ORGANIZATION | 0.99+
20 minutes | QUANTITY | 0.99+
Dhruba Borthakur | PERSON | 0.99+
2016 | DATE | 0.99+
two | QUANTITY | 0.99+
80% | QUANTITY | 0.99+
100% | QUANTITY | 0.99+
Lisa | PERSON | 0.99+
five people | QUANTITY | 0.99+
last year | DATE | 0.99+
Google | ORGANIZATION | 0.99+
five seconds | QUANTITY | 0.99+
Amazon | ORGANIZATION | 0.99+
one | QUANTITY | 0.99+
Venkat Venkataramani | PERSON | 0.99+
North America | LOCATION | 0.99+
two categories | QUANTITY | 0.99+
18 milliseconds | QUANTITY | 0.99+
both | QUANTITY | 0.99+
Instagram | ORGANIZATION | 0.99+
Dhruba | ORGANIZATION | 0.99+
SQL | TITLE | 0.99+
Snowflake | ORGANIZATION | 0.98+
one domain | QUANTITY | 0.98+
two gentlemen | QUANTITY | 0.98+
third | QUANTITY | 0.98+
Three weeks later | DATE | 0.97+
three second | QUANTITY | 0.97+
two trends | QUANTITY | 0.97+
One thing | QUANTITY | 0.96+
second | QUANTITY | 0.96+
Venkat | ORGANIZATION | 0.95+
Ritual | ORGANIZATION | 0.93+
an hour ago | DATE | 0.92+
both parts | QUANTITY | 0.91+
once a week | QUANTITY | 0.91+
Snowflake | TITLE | 0.9+
one big use case | QUANTITY | 0.89+
50/60 | QUANTITY | 0.89+
few weeks ago | DATE | 0.87+
one shape | QUANTITY | 0.86+
Cube Conversation | TITLE | 0.84+
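The entity listings above pair each extracted entity with a category and a confidence score. A minimal sketch of how such records might be filtered by confidence; the records are a small sample from the table above, and the `confident` helper and its 0.95 threshold are illustrative, not part of any real sentiment-analysis tool:

```python
# A handful of records from the entity table above: (entity, category, confidence).
ENTITIES = [
    ("Lisa Martin", "PERSON", 0.99),
    ("AWS", "ORGANIZATION", 0.99),
    ("Rockset", "ORGANIZATION", 0.99),
    ("Snowflake", "ORGANIZATION", 0.98),
    ("Venkat", "ORGANIZATION", 0.95),
    ("Ritual", "ORGANIZATION", 0.93),
    ("Cube Conversation", "TITLE", 0.84),
]

def confident(records, category=None, threshold=0.95):
    """Keep records at or above the confidence threshold, optionally by category."""
    return [
        (name, cat, conf)
        for name, cat, conf in records
        if conf >= threshold and (category is None or cat == category)
    ]

# Organizations the tagger is at least 95% confident about.
orgs = confident(ENTITIES, category="ORGANIZATION")
print([name for name, _, _ in orgs])  # ['AWS', 'Rockset', 'Snowflake', 'Venkat']
```

A threshold like this is one simple way a reader of these tables could separate strong tags (e.g. "Lisa Martin, PERSON, 0.99+") from shakier ones (e.g. "Cube Conversation, TITLE, 0.84+").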

Thomas Scheibe | Cisco Future Cloud


 

(upbeat music) >> Narrator: From around the globe, it's theCUBE. Presenting Future Cloud. One event, a world of opportunities. Brought to you by Cisco. >> Okay. We're here with Thomas Scheibe, who's the vice president of Product Management, aka VP of all things Data Center Networking, SDN, cloud, you name it in that category. Welcome Thomas, good to see you again. >> Hey, same here. Thanks for having me on. >> Yeah, it's our pleasure. Okay. Let's get right into observability. When you think about observability, visibility, infrastructure monitoring, problem resolution across the network, how does cloud change things? In other words, what are the challenges that networking teams are currently facing as they're moving to the cloud and trying to implement hybrid cloud? >> Yeah. (scoffs) Yeah. Visibility, as always, is very, very important, and quite frankly, it's not just the networking team, it's actually the application team too, right? And as you pointed out, the underlying impetus to what's going on here is that the data center is wherever the data is, and I think we said this a couple years back. And really what happens is the applications are going to be deployed in different locations, right? Whether it's in a public cloud, whether it's on-prem, and they're built differently, right? They're built as microservices, so the same application might actually be distributed as well. And so what that really means is you need, as an operator as well as actually a user, better visibility, "where are my pieces?", and you need to be able to correlate between where the app is and what the underlying network is that is in place in these different locations. So you actually have a good knowledge of why the app is running so fantastic, or sometimes not. So I think that's really the problem statement, what we're trying to go after with observability. >> Okay. Let's double click on that. 
So a lot of customers tell me that you've got to stare at log files until your eyes bleed, then you've got to bring in guys with lab coats who have PhDs to figure all this stuff out. >> Thomas: Yeah. >> So you just described, it's getting more complex, but at the same time, you have to simplify things. So how are you doing that? >> Correct. So what we basically have done is we have this fantastic product that is called ThousandEyes. And what this does is basically (chuckles), as the name says, which I think is a fantastic, fantastic name, you have these sensors everywhere, and you can have a good correlation on links: if I run from a site to a site, from a site to a cloud, from cloud to cloud, you basically can measure what the performance of these links is. And so what we're doing here is we're actually extending the footprint of the ThousandEyes agent, right? Instead of just having them in virtual machines in clouds, we are now embedding them in the Cisco network devices, right? We announced this with the Catalyst 9000, and we're extending this now to our 8000 Catalyst product line for the SD-WAN products, as well as to the data center products, the Nexus line. And so what you see is, you have a thousand eyes, you get a million insights, and you get a billion dollars of improvements for how your applications run. And this is really the power of tying together the footprint of what a network is with the visibility of what is going on, so you actually know the application behavior that is attached to this network. >> I see. So, okay. So as the cloud evolves, it expands, it connects, you're actually enabling ThousandEyes to go further, not just confined within a single data center location, but out to the network, across clouds, et cetera. >> Thomas: Correct. 
>> Wherever the network is, you're going to have a ThousandEyes sensor, and you can bring this together, and you can quite frankly pick, if you want to say, hey, I have my application in public cloud provider A, domain one, and I have another one in domain two, I can monitor that link. I can also monitor, I have a user that has a campus location or a branch location, I can put an agent there, and then I can monitor the connectivity from that branch location all the way to the, let's say, corporation's data center or headquarters, or to the cloud. And I can have these probes and just have visibility in saying, hey, if there's a performance issue, I know where the issue is. And then I obviously can use all the other tools that we have to address those. >> All right, let's talk about the cloud operating model. Everybody tells us that, you know, it's really the change in the model that drives big numbers in terms of ROI. And I want you to maybe address how you're bringing automation and DevOps to this world of hybrid, and specifically, how is Cisco enabling IT organizations to move to a cloud operating model as that cloud definition expands? >> Yeah, no, that's another interesting topic beyond the observability. So really what we're seeing, and this has been going on for, I want to say, a couple of years now, is this transition from operating infrastructure as a networking team to operating it more like a service, like what you would expect from a cloud provider, right? This is really around the networking team offering services like a cloud provider does. And that's really what the meaning is of cloud operating model, right? Whether this is infrastructure running in your own data center, or whether that's linking that infrastructure with whatever runs on the public cloud, it's operating it like a cloud service. And so we are on this journey for a while. 
So one of the examples that we have: we're moving some of the control software assets that customers today can deploy on-prem to an instance that they can deploy in a cloud provider, and just basically instantiate things there and then just run it that way, right? And so the latest example for this is what we have with our Identity Services Engine, that is now in limited availability on AWS and will become generally available mid this year, both on AWS and Azure, as a service. You can just go to Marketplace, you can load it there, and you can start running your policy control in the cloud, managing your access infrastructure in your data center, in your campus, wherever you want to do it. And so that's just one example of how we see our customers' network operations teams taking advantage of a cloud operating model and basically deploying their tools where they need them and when they need them. >> Dave: So what's the scope of, I hope I'm saying it right, ISE, right? I.S.E, I think it's, um, you call it ISE. What's the scope of that? Like, for instance, to, in effect, manage my, or even, you know, address, simplify my security approach? >> Absolutely. That's now coming to what is the beauty of the product itself, yes. What you can do, really, and a lot of people are talking about this, is, how do I get to a Zero Trust approach to networking? How do I get to a much more dynamic, flexible segmentation in my infrastructure, again, whether this is on the campus access as well as the data center? And ISE helps you there. You can use it as a point to define your policies and then interconnect from there, right. In this particular case, with the instance of ISE in the cloud as software, you now can connect and say, hey, I want to manage and program my network infrastructure in my data center or my campus, going to the respective controller, whether it's DNA Center for the campus or whether it's the ACI policy controller. 
And so yes, what you get as an effect out of this is a very elegant way to automatically manage, in one place, "what is my policy", and then drive the right segmentation in your network infrastructure. >> Yeah, Zero Trust. Pre-pandemic it was kind of a buzzword, now it's become a mandate. I wonder if we could talk about- >> Thomas: Yes. >> Yeah, right. I mean, so- >> Thomas: Pretty much. >> I wondered if we could talk about cloud native apps. You've got all these developers that are working inside organizations, they're maintaining legacy apps, they're connecting their data to systems in the cloud, they're sharing that data. These developers, they're rapidly advancing their skillsets. How is Cisco enabling its infrastructure to support this world of cloud native, making infrastructure more responsive and agile for application developers? >> Yeah. So, to the point we covered: first was the visibility, then we talked about the operating model, how our network operations teams actually want to use tools going forward. Now the next step to this is, it's not just the operator. Where do they want to put these tools, and how do they interact with these tools? As well as, quite frankly, how a, let's say, DevOps team, or application team, or cloud team also wants to take advantage of the programmability of the underlying network. And this is where we're moving into this whole cloud native discussion, right, which has really two angles. There is the cloud native way how applications are being built, and then there is the cloud native way how you interact with infrastructure, right? And so what we have done is we're, A, putting in place the on-ramps between clouds, and then, on top of it, we're exposing for all these tools APIs that can be used and leveraged by standard cloud tools or cloud-native tools, right? And one example or two examples we always have. 
And again, we're on this journey for a while: it's both Ansible script capabilities from Red Hat, as well as HashiCorp Terraform capabilities, that you can orchestrate across infrastructure to drive infrastructure automation. And what really stands behind it is what either the networking operations team wants to do, or even the app team: they want to be able to describe the application as code and then drive, automatically or programmatically, the instantiation of the infrastructure needed for that application. And so what you see us doing is providing all these capabilities as an interface for all our network tools, right. Whether this is ISE, which I just mentioned, whether this is our DCN controllers in the data center, whether these are the controllers in the campus, for all of those we have cloud-native interfaces. So an operator or a DevOps team can actually interact directly with that infrastructure the way they would do today with everything that lives in the cloud, or with everything in how they built the application. >> Yeah, this is key. You can't even have the conversation of a cloud operating model that includes and comprises on-prem without programmable infrastructure. So that's very important. Last question, Thomas, are customers actually using this? You made the announcement today. Are there any examples of customers out there doing this? >> We do have a lot of customers out there that are moving down the path and using the Cisco high-performance infrastructure, both on the compute side as well as on the Nexus side. One of the customers, and this is an interesting case, is Rakuten. Rakuten is a large telco provider, a mobile 5G operator, in Japan and expanding into different countries. And so some people think, "Oh, cloud, you must be talking about the public cloud providers, the big three or four." 
But if you look at it, a lot of the telco service providers are actually cloud providers as well, and expanding very rapidly. And so we're actually very proud to work together with Rakuten and help them build high-performance data center infrastructure based on 100-gig, and actually 400-gig, to drive the deployment of its 5G mobile cloud infrastructure, which is where the whole world, quite frankly, is going. And so it's really exciting to see this development and see the power of automation and visibility, together with the high-performance infrastructure, becoming a reality and actually delivering services. >> Yeah, some great points you're making there. Yes, you have the big four clouds, they're enormous, but then you have a lot of actually quite large clouds, telcos that are either proximate to those clouds or they're in places where those hyper-scalers may not have a presence, and building out their own infrastructure. So that's a great case study. Thomas, hey, great having you on. Thanks so much for spending some time with us. >> Yeah, same here. I appreciate it. Thanks a lot. >> All right. And thank you for watching everybody. This is Dave Vellante for theCUBE, the leader in tech event coverage. (upbeat music)
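The declarative flow described in the interview, describing the application and its infrastructure as code and letting a controller drive the changes needed to reach that state, can be sketched in miniature. This is illustrative only: the resource names and the `plan` helper below are hypothetical and do not reflect any real Cisco controller API; they just show the desired-state-versus-current-state diffing that tools like Terraform perform:

```python
def plan(desired: dict, current: dict) -> dict:
    """Compute which resources to create, update, or delete to reach desired state."""
    create = {k: v for k, v in desired.items() if k not in current}
    update = {k: v for k, v in desired.items() if k in current and current[k] != v}
    delete = [k for k in current if k not in desired]
    return {"create": create, "update": update, "delete": delete}

# Hypothetical declared state for two network segments.
desired = {
    "vlan-10": {"name": "app-tier"},
    "vlan-20": {"name": "db-tier"},
}
# Hypothetical state the controller currently reports.
current = {
    "vlan-10": {"name": "web-tier"},  # drifted from the declared name
    "vlan-30": {"name": "legacy"},    # no longer declared
}

print(plan(desired, current))
# {'create': {'vlan-20': {'name': 'db-tier'}}, 'update': {'vlan-10': {'name': 'app-tier'}}, 'delete': ['vlan-30']}
```

The design point is the one Scheibe makes: the operator edits the declared state (the code), and the automation, not a human, works out the create/update/delete steps against whatever controller is in play.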

Published Date : Jun 2 2021

SUMMARY :

Thomas Scheibe of Cisco joins Dave Vellante to discuss how cloud changes network observability, extending ThousandEyes agents from cloud virtual machines into Catalyst, SD-WAN, and Nexus devices, the shift to a cloud operating model with ISE delivered as a service on AWS and Azure, cloud-native interfaces such as Ansible and Terraform for infrastructure automation, and Rakuten's high-performance data center build-out for its 5G mobile cloud.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Thomas | PERSON | 0.99+
Dave Vellante | PERSON | 0.99+
Japan | LOCATION | 0.99+
Cisco | ORGANIZATION | 0.99+
Dave | PERSON | 0.99+
Rakuten | ORGANIZATION | 0.99+
Thomas Scheibe | PERSON | 0.99+
two examples | QUANTITY | 0.99+
AWS | ORGANIZATION | 0.99+
ThousandEyes | ORGANIZATION | 0.99+
one example | QUANTITY | 0.99+
mid this year | DATE | 0.99+
two angles | QUANTITY | 0.99+
ACI | ORGANIZATION | 0.99+
today | DATE | 0.99+
one | QUANTITY | 0.98+
HANA Gig | TITLE | 0.98+
One | QUANTITY | 0.98+
both | QUANTITY | 0.98+
One event | QUANTITY | 0.97+
8000 | COMMERCIAL_ITEM | 0.96+
four | QUANTITY | 0.96+
ISE | TITLE | 0.95+
one place | QUANTITY | 0.94+
Data Center Networking | ORGANIZATION | 0.91+
billion dollar | QUANTITY | 0.91+
Cisco Future Cloud | ORGANIZATION | 0.9+
STN | ORGANIZATION | 0.87+
a million insights | QUANTITY | 0.86+
a couple years back | DATE | 0.86+
three | QUANTITY | 0.85+
pandemic | EVENT | 0.82+
Catalyst 9000 | COMMERCIAL_ITEM | 0.82+
RedHat | TITLE | 0.81+
double | QUANTITY | 0.8+
theCUBE | ORGANIZATION | 0.78+
single data center | QUANTITY | 0.76+
Hashi Terraform | TITLE | 0.75+
couple | QUANTITY | 0.75+
DevOps | ORGANIZATION | 0.73+
Azure | TITLE | 0.71+
half a thing | QUANTITY | 0.66+
Thomas.Hey | PERSON | 0.64+
Marketplace | TITLE | 0.62+
years | QUANTITY | 0.6+
Catalyst | ORGANIZATION | 0.58+
two | QUANTITY | 0.58+
domain | QUANTITY | 0.56+
Nexus | COMMERCIAL_ITEM | 0.47+
Ansible | ORGANIZATION | 0.38+


Zhamak Dehghani, ThoughtWorks | theCUBE on Cloud 2021


 

>> From around the globe, it's theCUBE, presenting theCUBE on Cloud, brought to you by SiliconANGLE. >> In 2009, Hal Varian, Google's chief economist, said that statistician would be the sexiest job of the coming decade. The modern big data movement really took off later in the following year, after the second Hadoop World, which was hosted by Cloudera in New York City. Jeff Hammerbacher famously declared to me and John Furrier in the Cube that the best minds of his generation were trying to figure out how to get people to click on ads, and he said that sucks. The industry was abuzz with the realization that data was the new competitive weapon. Hadoop was heralded as the new data management paradigm. Now, what actually transpired over the next 10 years? Only a small handful of companies could really master the complexities of big data and attract the data science talent really necessary to realize massive returns. As well, back then, cloud was in the early stages of its adoption at the beginning of the last decade, and as the years passed, more and more data got moved to the cloud, and the number of data sources absolutely exploded. Experimentation accelerated, as did the pace of change. Complexity just overwhelmed big data infrastructures and data teams, leading to a continuous stream of incremental technical improvements designed to try and keep pace: things like data lakes, data hubs, new open source projects, new tools, which piled on even more complexity. And as we reported, we believe what's needed is a complete bit flip in how we approach data architectures. Our next guest is Zhamak Dehghani, who is the director of emerging technologies at ThoughtWorks. Zhamak is a software engineer, architect, thought leader, and adviser to some of the world's most prominent enterprises. She's, in my view, one of the foremost advocates for rethinking and changing the way we create and manage data architectures. 
Favoring a decentralized over a monolithic structure, and elevating domain knowledge as a primary criterion in how we organize so-called big data teams and platforms. Zhamak, welcome to the Cube. It's a pleasure to have you on the program. >> Hi, David. It's wonderful to be here. >> Well, okay, so you're pretty outspoken about the need for a paradigm shift in how we manage our data and our platforms at scale. Why do you feel we need such a radical change? What are your thoughts there? >> Well, I think if you just look back over the last decade, you gave a summary of what happened since 2010. But even if we go before then, what we have done over the last few decades is basically repeating and, as you mentioned, incrementally improving how we've managed data, based on certain assumptions around, as you mentioned, centralization: data has to be in one place so we can get value from it. But if you look at the parallel movement of our industry in general since the birth of the Internet, we are actually moving towards decentralization. If we think today, leaving data aside, if we said the only way the Web would work, the only way we'd get access to the various applications and Web pages, is to centralize them, we would laugh at that idea. But for some reason we don't question that when it comes to data, right? So I think it's time to embrace the complexity that comes with the growth of the number of sources, the proliferation of sources and consumption models; to embrace the fact that the sources of data are distributed, that they're not just within one part of the organization, not even just within the bounds of the organization, but beyond the bounds of the organization.
And then look back and say: okay, if that's the trend of our industry in general, given the fabric of computation and data that we have put in place globally, then how do the architecture, the technology, the organizational structure and the incentives need to move to embrace that complexity? And to me, that requires a paradigm shift, a full stack, from how we organize our organizations and our teams to how we put technology in place, to look at it from a decentralized angle. >> Okay, so let's unpack that a little bit. You've spoken about, and written, that today's big data architecture is flawed, as you basically just mentioned. So I want to bring up one of your diagrams, a simple one. Guys, if you could bring up figure one. So on the left here we're ingesting data from the operational systems and other enterprise data sets and, of course, external data. We cleanse it, we do the quality thing, and then serve it up to the business. So what's wrong with that picture that we just described, granted that it's a simplified form? >> Yeah, quite a few things. I would maybe flip the question back to you or the audience. There are so many sources of data, and the data comes from systems and from teams that are very diverse in terms of domains. If you just think about retail: e-commerce versus order management versus customer, these are very diverse domains. The data comes from many different domains, and then we expect to put it under the control of a centralized team, a centralized system. And that centralization: if you zoom out, it's centralized.
If you zoom in, it's compartmentalized based on functions, and we can talk about that. And we assume that the centralized model will serve us, getting that data, making sense of it, cleansing and transforming it, and then satisfying the needs of a very diverse set of consumers, without really understanding the domains, because the teams responsible for it are not close to the source of the data. So there is a cognitive gap and a domain-understanding gap, without really understanding how the data is going to be used. When I came up with this idea, I talked to a lot of data teams globally just to see what the pain points were and how they were doing it. And one thing that was evident in all of those conversations: after they built these pipelines and put the data in the data warehouse tables or the lake, they actually didn't know how the data was being used, yet they were responsible for making the data available for this diverse set of use cases. So a centralized, monolithic system often is a bottleneck. What you find is that a lot of the teams are struggling to satisfy the needs of the consumers, struggling to really understand the data. The domain knowledge is lost; there is a loss of understanding in that transformation. Often we end up training machine learning models on data that is not really representative of the reality of the business, and then we put them into production and they don't work, because the semantics and the syntax of the data get lost within that translation. And we're struggling to find people to manage a centralized system, because the technology is still fairly low level, in my opinion, and exposes the users of those technologies, let's say a warehouse, to a lot of complexity.
So in summary, I think it's a bottleneck that is not going to satisfy the pace of change, the pace of innovation and the pace of availability of sources. It's disconnected and fragmented, even though it's centralized: disconnected and fragmented from where the data comes from and from where the data gets used. And it's managed by a team of hyper-specialized people who are struggling to understand the actual value and the actual format of the data. So it's not going to get us where our aspirations and ambitions need to be. >> Yes. So the big data platform is essentially, I think you call it, context agnostic. And as data becomes more important in our lives, you've got all these new data sources injected into the system, and experimentation, as we said, becomes much, much easier with the cloud. So one of the blockers that you've cited, you just mentioned it, is that you've got these hyper-specialized roles: the data engineer, the quality engineer, the data scientist. And it's illusory: these roles seemingly are independent and can scale independently, but I think you've made the point that in fact they can't, that a change in a data source has an effect across the entire data lifecycle, the entire data pipeline. So maybe you could add some color to why that's problematic for some of the organizations that you work with, and maybe give some examples. >> Yeah, absolutely. In fact, initially the hypothesis behind that image came from a series of requests that we received from our large-scale and progressive clients, progressive in terms of their investment in data architectures. These were clients that were larger scale, with diverse and rich sets of domains. Some of them were big tech companies, some of them were retail companies, big health care companies.
So they had that diversity of data and that number of sources and domains. They had invested for quite a few years across generations of technology: they had multiple generations of proprietary data warehouses on-prem that they were moving to the cloud, and they had moved through various revisions of their Hadoop clusters that they were also moving to the cloud. And the challenge they were facing was simply, if I want to simplify it in one phrase, that they were not getting value from the data they were collecting. They were continuously struggling to shift the culture, because there was so much friction across all three phases: consuming the data from sources, transforming it, and making it available and serving it to the consumers. That whole process was full of friction, and everybody was unhappy. So the bottom line is that you're collecting all this data and there is delay, there is a lack of trust in the data itself, because the data is not representative of the reality: it has gone through a transformation by people who didn't really understand what the data was, and it got delayed. So there is no trust, it's hard to get to the data, and ultimately it's hard to create value from the data, and people are working really hard and under a lot of pressure, but still struggling. So we as technologists often point to technology as the solution. We go: okay, this version of the proprietary data warehouse we're using is not the right thing; we should go to the cloud, and that certainly will solve our problems. Or: the warehouse wasn't a good one; let's do a data lake instead of extracting and then transforming and loading into the warehouse.
And that transformation is a heavy process, because with warehouses you fundamentally made the assumption that if I transform this data into this multi-dimensional, perfectly designed schema, then everybody can run whatever query they want, and that's going to solve everybody's problem. In reality it doesn't, because you are delayed, and there is no universal model that serves everybody's needs. The data scientists, who need diverse data, don't necessarily like the perfectly modeled data; they're looking for both the signals and the noise. So then we've gone from ETLs to, let's say, ELTs: okay, let's move the transformation to the last mile, let's just load the data into the object stores, into semi-structured files, and let the data scientists use it. But they're still struggling, because of the problems that we mentioned. So then what is the solution? Well, a next-generation data platform; let's put it on the cloud. And we saw clients that had actually gone through a year, or multiple years, of migration to the cloud: 18-month migrations, nine-month migrations of the warehouse, two-year migrations of the various data sources to the cloud. But ultimately the result is the same: unsatisfied, frustrated data users and data providers, with no ability to innovate quickly on relevant data, and without the experience they deserve to have, a delightful experience of discovering and exploring data that they trust. All of that was still amiss, so something more fundamental needed to change than just the technology. >> So the linchpin of your scenario is this notion of context. And you've made the observation that, look, we've made our operational systems context aware.
But our data platforms are not. Like a CRM system: the sales guys are very comfortable with what's in the CRM system; they own the data. So let's talk about the answer that you and your colleagues are proposing. You're essentially flipping the architecture, whereby those domain knowledge workers, the builders, if you will, of data products or data services, are now first-class citizens in the data flow, and they're injecting, by design, domain knowledge into the system. So I want to put up another one of your charts. Guys, bring up figure two. It talks about convergence: you show distributed domain-driven architecture, self-serve platform design, and this notion of product thinking. So maybe you could explain why this approach is so desirable, in your view. >> Sure. The motivation and inspiration for the approach came from studying what has happened over the last few decades in operational systems. We had a very similar problem prior to microservices, with monolithic systems: monolithic systems were the bottleneck, and the changes we needed to make were always slowed down by how centralized the architecture was. And we found a solution. I'm not saying it's the perfect way of decoupling a monolith, but for where we currently are in our journey to become data driven, it's a nice place to be: distribution, or decomposition, of your system as well as your organization. Whenever we talk about systems, we've got to talk about the people and teams responsible for managing those systems. So we decompose the systems, the teams and the data around domains, because that's how we are decoupling our businesses today, and that's a good thing. And what does that really do for us? It localizes change to the bounded context of that business.
It creates clear boundaries, interfaces and contracts between that particular team and the rest of the organization, so it removes the friction we often have in both managing change and serving data or capability. So the first principle of data mesh is: let's decouple this world of analytical data, mirroring the way we have decoupled our systems, our teams and our business. Why should data be any different? And the moment you do that, the moment you bring the ownership to the people who understand the data best, you get the question: well, how is that any different from the siloed, disconnected databases we have today, where nobody can get to the data? So the rest of the principles are really there to address the challenges that come with this first principle of decomposition around domain context. The second principle is: we have to expect a certain level of quality, accountability and responsibility from the teams that provide the data. So let's bring product thinking, treating data as a product, to the data that these teams now share, and let's put accountability around it. We need a new set of incentives and metrics for domain teams to share the data, and a new set of quality metrics that define what it means for the data to be a product; we can go through that conversation perhaps later. So the second principle is: the domain teams responsible for the analytical data need to provide that data with a certain level of quality and assurance. Let's call that a product, and bring product thinking to it. And then there's the next question you get asked by CEOs or CIOs, the people who build the infrastructure and, you know, spend the money.
They say: well, it's actually quite complex to manage big data, and now you want every independent team to manage the full stack of storage and computation and pipelines and access control and all of that? Well, we have solved that problem in the operational world, and it requires a new level of platform thinking, providing infrastructure and tooling to the domain teams so that they can manage and serve their big data themselves. I think that requires reimagining the world of our tooling and technology. But for now, let's just assume that we need a new level of abstraction to hide away the ton of complexity that people unnecessarily get exposed to. That's the third principle: creating self-serve data infrastructure to allow autonomous teams to build their domains. And then the last fundamental pillar: once you've decomposed the problem into smaller problems, you find yourself with another set of problems, which is how you connect this data, because the insights emerge from the interconnection of the data domains, right? They're not necessarily locked into one domain. So the concerns around interoperability and standardization, and getting value as a result of the composition and interconnection of these domains, require a new approach to governance; we have to think about governance very differently, based on a federated model and a computational model. Once we have this powerful self-serve platform, we can computationally automate a lot of the governance decisions, the security decisions and policy decisions that apply to this fabric of the mesh, not just to a single domain, and not in a centralized way either. Really, as you mentioned, the most important components of the data mesh are the distribution of ownership and the distribution of architecture and data; the rest of the principles are there to solve the problems that come with that.
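To make the "computational" part of that fourth principle concrete: the federation agrees on a small set of global policies once, codifies them, and the platform applies them automatically to every domain's data product. Below is a minimal, hypothetical Python sketch of the idea; the policy names, fields and products are invented for illustration and are not from any specific data mesh tooling.

```python
# Hypothetical sketch of federated computational governance: global policies,
# agreed once by domain representatives, applied automatically to every
# domain-owned data product. All policy and field names are invented.

GLOBAL_POLICIES = [
    # every data product must name an accountable owner
    ("has_owner", lambda p: bool(p.get("owner"))),
    # products containing PII must mask it before serving
    ("pii_masked", lambda p: not p.get("contains_pii") or p.get("pii_masked", False)),
]

def evaluate(product):
    """Return the names of global policies this data product violates."""
    return [name for name, rule in GLOBAL_POLICIES if not rule(product)]

# Two domain-owned products, checked the same way with no central team in the loop.
orders = {"owner": "order-mgmt-team", "contains_pii": False}
customers = {"owner": "crm-team", "contains_pii": True, "pii_masked": False}

print(evaluate(orders))     # no violations
print(evaluate(customers))  # the PII policy is violated
```

The point of the sketch is the division of labor Dehghani describes: the federated governance group only writes the policy list; enforcement happens computationally, per domain, inside the platform.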
>> So very powerful. Guys, we actually have a picture of what Zhamak just described; bring up figure three, if you would. Essentially you're advocating for pushing the pipeline, and all its various functions, into the lines of business, and abstracting the complexity of the underlying infrastructure, which you show here in this figure as a data infrastructure platform down below. And what I love about this, Zhamak, is that to me it underscores that data is not the new oil, because I can put oil in my car or I can put it in my house, but I can't put the same quart in both places. You call it polyglot data, really different forms, batch or whatever, but the same data. Data doesn't follow the laws of scarcity; I can use the same data for many, many uses, and that's what this graphic shows. And then you brought in the really important sticking point, which is governance, which is now not command and control; it's federated governance. So maybe you could add some thoughts on that. >> Sure, absolutely. I keep referring to data mesh as a paradigm shift, and that's not just to make it sound grand and exciting. It's really because I want to point out that we need to question ourselves at every moment when we make a decision around how we're going to design security or governance or the modeling of the data: am I applying cognitive biases from how things have worked for the last 40 years, where I have seen them work, or do I really need to question them? And we do need to question the way we have applied governance. At the end of the day, the role and objective of data governance remain the same: we all want quality data accessible to a diverse set of users.
And these users now have different personas: data analyst, data scientist, data application user, very diverse personas. So at the end of the day, we want quality data accessible to them, trustworthy, and easily consumable. However, how we get there looks very different. As you mentioned, the governance model in the old world has been very command and control, very centralized: the governance team was responsible for quality, responsible for certification of the data, for making sure the data complies with regulations, and for making sure data gets discovered and made available. In the world of the data mesh, the job of data governance as a function becomes finding the equilibrium between what decisions need to be made and enforced globally and what decisions need to be made locally, so that we can have an interoperable mesh of data sets that can move fast and change fast. Instead of putting those systems in a straitjacket of staying constant, embrace change and the continuous change of the landscape, because that's just the reality we can't escape. So the governance model is what I call federated and computational. By that I mean every domain needs to have a representative in the governance team. The domain data product owner, a role that really understands the data of that domain but also wears the hat of a product owner, has to have representation in the governance. So it's a federation of domains coming together, plus the SMEs, the subject matter experts who understand the regulations of that environment and understand the data security concerns. But instead of trying to enforce and do all of this as a central team.
They make decisions about what needs to be standardized and what needs to be enforced, and we push that, computationally and in an automated fashion, into the platform itself. For example, instead of being part of the data quality pipeline and injecting ourselves as people into that process, let's actually, as a group, define what constitutes quality and how we measure it, and then automate that and codify it into the platform, so that every data product has a CI/CD pipeline, and as part of that pipeline those quality metrics get validated, and every data product publishes its SLOs, or service level objectives. Whatever we choose as a measure of quality, maybe the integrity of the data, the delay in the data, the liveliness of it, whatever the decisions are that we're making, let's codify them. So the objectives of the governance team stay the same, but how they meet them is very, very different. I wrote a new article recently trying to explain the logical architecture that would emerge from applying these principles, and I put in a light table to compare and contrast how we do governance today with how we would do it differently, just to give people a flavor of what it means to embrace decentralization and to embrace change and continuous change. So hopefully that can be helpful. >> Yes, very helpful. So many questions. But on the point you make about data quality: sometimes I feel like quality is treated as the end game, whereas the end game should be how fast you can go from idea to monetization with a data service. You sort of addressed this, but what happens to the underlying infrastructure? I mean, spinning up EC2s and S3 buckets and my PyTorches and TensorFlows.
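As a rough sketch of what "codify quality and validate it in every data product's CI/CD pipeline" might look like, here is a small Python example. The SLO names and thresholds are illustrative assumptions, not the API of any real platform; a real pipeline step would fail the build when a check returns False.

```python
from datetime import datetime, timedelta, timezone

# Illustrative SLOs a data product might publish (names are invented):
SLOS = {
    "max_staleness_hours": 24,  # the "delay in the data"
    "min_completeness": 0.9,    # share of rows with the key field populated
}

def check_slos(last_updated, rows, key_field, now=None):
    """Validate the published SLOs against a snapshot of the data product."""
    now = now or datetime.now(timezone.utc)
    fresh = (now - last_updated) <= timedelta(hours=SLOS["max_staleness_hours"])
    populated = sum(1 for r in rows if r.get(key_field) is not None)
    completeness = populated / len(rows) if rows else 0.0
    return {
        "staleness_ok": fresh,
        "completeness_ok": completeness >= SLOS["min_completeness"],
        "completeness": completeness,
    }

# A snapshot that is 2 hours old and 95% complete passes both checks.
rows = [{"order_id": i} for i in range(19)] + [{"order_id": None}]
result = check_slos(
    last_updated=datetime.now(timezone.utc) - timedelta(hours=2),
    rows=rows,
    key_field="order_id",
)
print(result)
```

Because the checks are code rather than a review step, no central quality team has to sit in the pipeline; the governance group only chooses the metrics and thresholds.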
Where does all of that live in the business, and who's responsible for it? >> Yeah, I'm glad you're asking this question, because I truly believe we need to reimagine that world. I think there are many pieces we can use as utilities and foundational pieces, but I can see a five-to-seven-year roadmap ahead of us for building this new tooling. In terms of ownership, it would remain with the platform team, a domain-agnostic, technology-focused team that provides infrastructure as products. And the users of those products are the data product developers, the data domain teams, who now have really high expectations in terms of low friction and short lead time to create a new data product. So we need a new set of tooling, and I think the language needs to shift from "I need a storage bucket," "I need a storage account," "I need a cluster to run my Spark jobs" to: here's the declaration of my data product; this is where its data will come from; this is the data I want to serve; these are the policies I need applied, in terms of perhaps encryption or access control; platform, go make it happen, go provision everything, so that as a data product developer all I focus on is the data itself, the representation of its semantics and its syntax, and making sure the data meets the quality I have to assure and is available. The provisioning of everything that sits underneath will have to be taken care of by the platform. That's what I mean by requiring a reimagination. And in fact there will be a data platform team; the data platform teams that we set up for our clients in fact have a fair bit of complexity themselves.
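The shift in language described here, from requesting buckets and clusters to declaring a data product and letting the platform provision what sits underneath, might be sketched like this. The spec fields and the toy provisioning step are assumptions for illustration, not an actual platform API.

```python
from dataclasses import dataclass, field

@dataclass
class DataProductSpec:
    """What a data product developer declares (hypothetical fields)."""
    name: str
    domain: str
    inputs: list          # where the data will come from
    output_format: str    # the data this product serves
    policies: dict = field(default_factory=dict)  # e.g. encryption, access

def provision(spec):
    """Stand-in for the platform: turn a declaration into concrete resources."""
    resources = [f"storage:{spec.domain}/{spec.name}"]
    resources += [f"pipeline:{src}->{spec.name}" for src in spec.inputs]
    if spec.policies.get("encryption"):
        resources.append("kms-key")  # encryption was declared, not requested
    return resources

spec = DataProductSpec(
    name="monthly-orders",
    domain="order-management",
    inputs=["order-events"],
    output_format="parquet",
    policies={"encryption": True, "access": "analysts"},
)
print(provision(spec))
```

The developer never names a bucket or a cluster; the declaration carries the intent, and everything underneath is the platform team's concern.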
Internally, they divide into multiple teams, multiple planes. There would be a plane, as in a group of capabilities, that satisfies the data product developer experience. There would be a set of capabilities that deal with the underlying utilities; I call them utilities at this point, because to me the level of abstraction of the platform has to go higher than where it is today, so what we call a platform today is really a set of utilities: we will continue using object storage, we will continue using relational databases, and so on. So there will be a plane, and a group of people, responsible for that. And there will be a group of people responsible for the capabilities that enable the mesh-level functionality: for example, being able to correlate, connect and query data from multiple nodes is a mesh-level capability, and being able to discover and explore the mesh's data products is a mesh-level capability. So it would be a set of teams as part of the platform, with, again, strong platform product thinking and product ownership embedded into it, to satisfy the experience of these now business-oriented domain data teams. So we have a lot of work to do. >> I could go on, but unfortunately we're out of time. I want to tell people that there are two pieces you've put out so far: one is "How to move beyond a monolithic data lake to a distributed data mesh," you guys should read that, and "Data mesh principles and logical architecture" is kind of part two. I guess my last question, in the very limited time we have, is: are organizations ready for this? >> I think the desire is there. I've been overwhelmed with the number of large and medium and small, private and public and government organizations that have reached out to us globally. This is a global movement, and I'm humbled by the response of the industry. I think the desire is there.
The pains are real; people acknowledge that something needs to change, so that's the first step. That awareness is spreading, and organizations are more and more becoming aware. In fact, many technology providers reach out to us asking what they should do, because their clients are asking them; people are already saying, we need the data mesh vision and we need the tooling to support it. So that awareness is there. In terms of being ready, however, the ingredients of a successful transformation require both top-down and bottom-up support. It requires support from chief data and analytics officers or above; the most successful clients we have with data mesh are the ones where the CEOs have made a statement that we want to change the experience of every single customer using data, and we're going to commit to this. So the investment and support exist from the top through all layers, the engineers are excited, and perhaps the traditional data teams are open to change. So there are a lot of ingredients of a successful transformation that need to come together. Are we really ready for it? I think the pioneers are: if you think about the innovation adoption curve, the innovators, the leaders and the early adopters are making moves towards it. And hopefully, as the technology becomes more available, the organizations that are less engineering oriented, that don't have the capability in-house today but can buy it, will come next. Maybe those are the ones that aren't quite ready for it yet, because the technology is not readily available and requires internal investment today. >> I think you're right on. I think the leaders are going to lean in hard, and they're going to show us the path over the next several years.
And I think the end of this decade is going to be defined a lot differently than the beginning. Zhamak, thanks so much for coming on the Cube and participating in the program. >> My pleasure. >> All right, keep it right there, everybody. We'll be back right after this short break.

Published Date : Jan 22 2021
