From Zero to Search | Beyond.2020 Digital
>> Hello and welcome to Day Two at Beyond. I am so excited that you've chosen to join the Building a Vibrant Data Ecosystem track. I might be just a little bit biased, but I think it's going to be the best track of the day. My name is Mallory Lassen and I run Partner Marketing here at ThoughtSpot, and that might give you a little bit of a clue as to why I'm so excited about the four sessions we're about to hear from. We'll start off hearing from two ThoughtSpotters on how the power of Embrace can allow you to directly query the cloud data warehouse of your choice. Next up, and I shouldn't choose favorites, but I'm very excited to watch Cindy Howson moderate a panel of true industry experts. We'll hear from Deloitte, Snowflake, and Eagle Alpha as they describe how you can enrich your organization's data and better understand and benchmark it by using third-party data. They may even close off with a prediction or two about the future that could prove to be pretty thought-provoking, so I'd stick around for that. Next we'll hear from the cloud juggernaut themselves, AWS. We'll even get to see a live demo using TV show data, which I'm pretty sure is near and dear to our hearts at this point in time. And then last, I'm very excited to welcome our customer from T-Mobile. They're going to describe how they partnered with Wipro and developed a full solution, really modernizing their analytics and giving self-service to so many employees. We'll see what that's done for them. But first, let's go over to James Belsey and Anna Son for the Zero to Search session. James, take us away. >> Thanks, Mallory. I'm James Belsey and I look after the solutions engineering and customer success teams at ThoughtSpot here in Asia Pacific and Japan. Today I'm joined by my colleague Anna Son to give you a look at just how simple and quick it is to connect ThoughtSpot to your cloud data warehouse and extract value from the data within. In the demonstration, Anna will show you just how we can connect to data, make it simple for the business to search, and then search the data itself, all within this short session. And I want to point out that everything you're going to see in the demo is run live against the cloud data warehouse, in this case Snowflake, and there's no caching of data or summary tables in terms of what you're going to see. But before we jump into the demo itself, I'd just like to provide a very brief overview of the value proposition for ThoughtSpot. If you're already familiar with ThoughtSpot, this will come as no surprise. But for those new to the platform, it's all about empowering the business to answer their own questions about data in the most simple way possible: through search. The personalized user experience provides a familiar, search-based way for anyone to get answers to their questions about data, not just the analysts. The search indexing and ranking makes it easy to find the data you're looking for using business terms that you understand, while the smart ranking constantly adjusts the index to ensure the most relevant information is provided to you. The query engine removes the complexity of SQL and complex join paths while ensuring that users will always get to the correct answers to their questions. This is all backed up by an architecture that's designed to be consumed entirely through a browser, with flexibility on deployment methods. You can run ThoughtSpot through our ThoughtSpot Cloud offering, in your own cloud, or on premise.
The choice is yours. So I'm sure you're thinking that all sounds great, but how difficult is it to get this working? Well, I'm happy to tell you it's super easy. There are just four steps to unlock the value of your data stored in Snowflake, Redshift, Google BigQuery, or any of the other cloud data warehouses that we support. It's as simple as connecting to the cloud data warehouse, choosing what data you want to make available in ThoughtSpot, and making it user friendly. That column that's called cust_name in the database is great for data management, but when users are searching for it, they'll probably want to use customer, or customer name, or account, or even client. Also, the business shouldn't need to know that they need to get data from multiple tables, or the join paths needed to get the correct results. In ThoughtSpot, the worksheet allows you to make all of this simple for the users so they can simply concentrate on getting answers to their questions. And once the worksheet is ready, you can start asking those questions. By now, I'm sure you're itching to see this in action, so without further ado, I'm going to hand over to Anna to show you exactly how this works. Over to you, Anna. >> In this demo, I'm going to cover three areas. First, we'll start with how simple it is to get answers to your questions in ThoughtSpot. Then we'll have a look at how to create a new connection to a cloud data warehouse. And lastly, how to create a user-friendly data layer. To get started, I'm going to show you the ease of search with ThoughtSpot. As you can see, ThoughtSpot is web-based; I'm simply logging in via a browser. This means you don't need to install an application. Additionally, ThoughtSpot does not require you to move any data, so all your data stays in your cloud data warehouse and doesn't need to be moved around. ThoughtSpot's core differentiator is the user experience, and that is primarily search. As soon as we come into the search bar here, ThoughtSpot's suggestions guide users through to the answers. Let's say that I want to have a look at spending across the different product categories, we want to look at that for the last 12 months, and we also want to focus on the monthly trend. And just like that, we get our answer straight away, live from Snowflake. Now let's say we want to focus on one product category here; we want to have a look at the performance for finished goods. As I start partially typing my search term here, ThoughtSpot is already suggesting the data values that are available for me to use as a filter. The indexing behind the scenes actually indexes everything about the data, which allows me to get to my data easily and quickly as an end user. Now that I've got my data answer here, I can also go to the next level of detail. In ThoughtSpot, navigating to the next level of detail is simply one click away. There's no concept of a pre-defined drill path here. That means, with all the data that's available to me from Snowflake, I'm able to navigate to the level of detail that allows me to answer those questions. As you can see, as a business user I don't need to do any coding; there's no drag and drop to get to the answer that I need. And as you can see, the calculations are done on the fly. There are no summary tables, no cubes to build; I'm simply able to ask the questions
and follow my train of thought. This provides a better user experience, as anybody can search here. The more we interact with ThoughtSpot, the more it learns about my search patterns and makes suggestions based on that ranking, and the data returns on the fly from Snowflake. Now that you've seen an example of a search, let's go ahead and have a look at how we create a brand new connection to a cloud data warehouse. Here we are; let me add a new connection to the data warehouse by just clicking Add New Connection. Today we're going to connect to a retail apparel data set, so let's start with the name. As you can see, we can easily connect to all the popular data warehouses with just one single click. Today we're going to click Snowflake. It asks for some details here, so let me connect to my account and quickly enter those details, and this will determine what data is available to me. I can go ahead and specify a database to connect to as well, but I want to connect to all the tables and views, so let's go ahead and create the connection. Now the two systems are talking to each other, and I can see all the data that's available for me to connect to. Let's go ahead and connect to the retail apparel data source here, and expanding that, I can see all the data tables available to me. I can go ahead and click on any table here; there's a fact table containing all the sales information, and I also have the store and product information here. I can choose any data column that I want to include in my search and make available in ThoughtSpot, or I can go ahead and select the entire table, including all the data columns. I would like to point out that this is important, because if any given table that you have contains hundreds of columns, it may not be necessary for you to bring across all of those data columns, so ThoughtSpot allows you to select what's relevant for your analysis. Now that we've selected all the tables, let's go ahead and create the connection. ThoughtSpot confirms the data columns that we have selected, starts to read the metadata from Snowflake, and automatically builds that search index behind the scenes. Now, if your data does contain information such as personally identifiable information, you can choose to turn that indexing off, so none of it would be on the ThoughtSpot platform. Now that my tables are ready here, I can actually go ahead and search straight away. Let's go ahead and have a look at the tables here. I'm going to click on the fact table here; on the left-hand side it shows all the data columns that we've brought across from Snowflake, as well as the metadata that was also brought over, and a preview shows me the data that's available on my Snowflake platform. Let's take a look at the Joins tab here. The Joins tab shows me the relationships that have already been defined, the foreign and primary keys defined in Snowflake, which we simply inherit in ThoughtSpot. However, you don't have to define all of these relationships in Snowflake; adding a join here is also simple and easy. If I click on Add Join here, I simply select the tables I want to create a connection for, so I select the fact table on the left, then select the product table on the right, and then simply select the data column we wish to join those two tables on. Let's select Product ID and click Next, and that's all that's required to create a join between those two tables.
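As a rough illustration of what those connection details and the inherited join boil down to on the Snowflake side, here is a minimal sketch using the snowflake-connector-python package. The account, warehouse, table, and column names are hypothetical stand-ins for the retail apparel data set in the demo, not the actual names used.

```python
# Illustrative sketch only: roughly what the connection details entered in the
# Add Connection dialog amount to on the Snowflake side, using the
# snowflake-connector-python package. Account, warehouse, table, and column
# names are hypothetical stand-ins for the retail apparel data set.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",     # account identifier entered in Add Connection
    user="your_user",
    password="your_password",
    warehouse="DEMO_WH",
    database="RETAIL_APPAREL",  # optionally scope to a single database
    schema="PUBLIC",
)

# The kind of fact-to-product join ThoughtSpot inherits from the primary and
# foreign keys defined in Snowflake (joined here on a hypothetical PRODUCT_ID).
query = """
    SELECT p.product_type, SUM(f.sales) AS total_sales
    FROM fact_sales AS f
    JOIN dim_product AS p ON f.product_id = p.product_id
    GROUP BY p.product_type
    ORDER BY total_sales DESC
"""

with conn.cursor() as cur:
    for product_type, total_sales in cur.execute(query):
        print(product_type, total_sales)

conn.close()
```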
But since we already have those relationships brought over from Snowflake, I won't go ahead and do that now. Now that you have seen how the tables are brought over, let's go and have a look at how easy it is to search. Coming to search here, let's start with selecting the data tables we brought over. Expanding the tables, you can see all the data columns that we have previously seen from Snowflake. Let's say I want to have a look at sales in the last year. Even before I start to type anything in the search bar, ThoughtSpot is already showing me suggestions, guiding me to the answers that are relevant to my needs. Let's start with having a look at sales for 2019, and I want to see this monthly for my trend. And out of all of these product lines, I also want to focus on a product line called Jackets. As I start partially typing the product line jacket, ThoughtSpot is already proactively recommending all the matches it has, so all the data values are available for me to use as a filter. Let's go ahead and select jacket, and just like that, I get my answer straight away from Snowflake. Now, that was relatively simple; let's try something a little bit more complex. Let's say I want to have a look at sales compared across different regions in the US. So I want to compare West to Southwest, and then I want to compare it against Midwest as well, and I also want to see these trending monthly. Let's have a look at monthly. You can see that I can use keywords such as "monthly" to look at different time buckets. Now, all of this is out of the box. As you can see, I didn't have to do any indexing, I didn't have to write any formulas here. As long as there is a date column in the data set, ThoughtSpot is able to dynamically calculate those time buckets. So just by doing that search, I was able to create dynamic groupings and segment the different sales across the United States on the sales data here. Now that we're done with the search, you can see that searching across different tables here might not be the most user-friendly layer. We don't want users having to individually select tables and then pick different columns with cryptic names. We want to make this easy for users, and that's where a worksheet comes in. A ThoughtSpot worksheet encapsulates all of the data you want to make available for search, as well as formulas and the business terminology that users are familiar with for a specific business area. Let's start with adding the data columns we need for this worksheet. We want to select all of the tables that we just brought across from Snowflake. Expanding each of those tables: from the fact table we want sales as well as the date; then on the stores table we want store name as well as the state; and then expanding the product table, we want name and finally product type. Now that we've got our worksheet ready, let's go ahead and save it. Now, in order to provide the best experience for users to search, we want to optimize the worksheet. So coming to the worksheet here, you can see the data columns that we have selected. Let's start with changing the names to be more user friendly: let's call this one Sales Record; then we want to call this one simply Date; Store Name, call it Store; and then we also want State to be in lower case; Product Name
we'll simply call Product, and finally Product Type. We can also further optimize this worksheet by adding other things such as synonyms, to allow users to use terms they're familiar with when they search. So on Sales, let's add the synonym Revenue. We can also set up the geo configuration, so we want to identify State here as a US state. And finally, we want to add a more friendly display for currency, so let's change the currency type; I want to show it in US dollars. That's all we need, so let's save those changes and get started on our search. Now, coming back to search here, let's go ahead and select the worksheet that we have just created. If I don't select any specific tables or worksheets, ThoughtSpot simply searches across everything that's available to you. Expanding the worksheet, we can see all of the data columns that we've made available here, and clicking on the search bar, ThoughtSpot is already making those recommendations to start off. Let's have a look at the revenue across the different states for year to date, so let's use the synonym that we have defined, across the different states, and we want to see this for year to date. Now I also want to focus on the product line Jackets that we have seen before, so let's go ahead and select jacket. And just like that, I was able to get the answer straight away in ThoughtSpot. Let's also show some data labels here so we can see the exact amounts, as well as the states with the best performance across the US. Now that I've got information about the sales of jackets by state, I want to ask the next-level question: I want to drill down to the stores that have been selling these jackets. I right-click and choose drill down. As you can see, out of the box I didn't have to pre-define any drill paths or target reports; ThoughtSpot simply allows me to navigate to the next level of detail to answer my own questions, one click away. Now I see the sales for the jackets by store, year to date, and this is directly from Snowflake data, live. Now, that was a relatively simple question; let's go ahead and ask a question that's a little bit more complex. Imagine I want to have a look at sales this year, and I want to see that by month, month over month, so I want to see a monthly trend. I also want to focus on sales in the last week of the month, because that's where we see most sales come in, so let's focus on the last week of each month. And on top of that, I only want to look at the top-performing stores from last year, so I want to focus on the top five stores in sales for last year. And with that, we also want to focus just on the most popular product types as well, so product type. Now, this could be a very reasonable question that a business user would like to ask, but behind the scenes this could be quite complex. ThoughtSpot takes care of the complexity of the data and allows the user to focus on the answer they want to get to. If we quickly have a look at the query here, this shows how ThoughtSpot translates the search that we put in into queries that it can pass on to Snowflake. As you can see, the search uses all three tables, utilizing the joins and the metadata layer that we have created.
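As a purely hypothetical illustration of the kind of SQL a search like this might boil down to, here is a sketch in Snowflake's SQL dialect wrapped in a small Python module, so it could be sent through the same kind of connection as the earlier sketch. It is not the SQL ThoughtSpot actually generates, and the table and column names are the same illustrative stand-ins as before.

```python
# Hypothetical illustration of the kind of SQL a search like
# "sales monthly this year, last week of each month, top 5 stores by last
# year's sales, by product type" might translate to. This is NOT the exact SQL
# ThoughtSpot generates; table and column names are illustrative stand-ins.
GENERATED_SQL = """
WITH top_stores AS (
    SELECT s.store_name
    FROM fact_sales AS f
    JOIN dim_store AS s ON f.store_id = s.store_id
    WHERE YEAR(f.sale_date) = YEAR(CURRENT_DATE) - 1        -- last year
    GROUP BY s.store_name
    ORDER BY SUM(f.sales) DESC
    LIMIT 5                                                 -- top 5 stores in sales
)
SELECT
    DATE_TRUNC('MONTH', f.sale_date) AS sales_month,        -- monthly trend
    p.product_type,
    SUM(f.sales) AS total_sales
FROM fact_sales AS f
JOIN dim_store   AS s ON f.store_id   = s.store_id
JOIN dim_product AS p ON f.product_id = p.product_id
WHERE YEAR(f.sale_date) = YEAR(CURRENT_DATE)                -- this year
  AND f.sale_date > DATEADD(day, -7, LAST_DAY(f.sale_date)) -- last week of each month
  AND s.store_name IN (SELECT store_name FROM top_stores)
GROUP BY sales_month, p.product_type
ORDER BY sales_month, p.product_type
"""

if __name__ == "__main__":
    # Would be executed through the same snowflake.connector connection shown
    # in the earlier sketch, e.g. conn.cursor().execute(GENERATED_SQL).
    print(GENERATED_SQL)
```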
Switching over to the SQL here, this SQL is actually generated on the fly and passed on to Snowflake, in order for Snowflake to bring back the results to be presented in ThoughtSpot. I also want to mention that in the latest release of ThoughtSpot, version 6.3, SpotIQ is also coming to Embrace. That means one-click auto-analysis, which empowers users to monitor key metrics, find anomalies, identify leading indicators, and isolate trends, as you can see, in a matter of minutes. Using ThoughtSpot, we were able to connect to the most popular on-premise or cloud data warehouses, and we were able to get blazing fast answers to our searches, allowing us to transform raw data into insight at the speed of thought. I'll pass it back to you, James. >> Thanks, Anna. Wow, that was awesome. It's incredible to see how much can be achieved in such a short amount of time. I want to close this session by referring to a customer example, Hulu. For those of you in the US, I'm sure you're familiar with Hulu, but for our international audience, Hulu is a media streaming service similar to Netflix or Disney Plus. As you can imagine, the amount of data created by a service like this is massive: with over 32 million subscribers, Hulu are asking questions of over 16 terabytes of data in Snowflake. Using regular BI tools on top of this size of data would usually mean using summary or aggregate-level data, but with ThoughtSpot, Hulu are able to get granular insights into the data, allowing them to understand what their subscribers are watching, how their campaigns are performing, and how their programming is being received, and take advantage of that data to reduce churn and increase revenue. So thank you for your time today. Through the session, you've seen just how simple it is to get ThoughtSpot up and running on your cloud data warehouse to unlock the value of your data in minutes. If you're interested in trying this on your own data, you can sign up for a free 14-day trial of ThoughtSpot Cloud right now. Thanks again to Anna for such an awesome demo, and if you have any questions, please feel free to let us know. >> Awesome. Thank you, James and Anna. That was incredible to see it in action and how it all came together. And James, we do actually have a couple of questions in our last few minutes here. Anna, the first one will be for you, please. This will be a two-part question. One, what cloud data warehouses does Embrace support today? And two, can we use Embrace to connect to multiple data warehouses? >> Thank you, Mallory. Today Embrace supports Snowflake, Google BigQuery, Redshift, Azure Synapse, Teradata Vantage, and SAP HANA, with more sources to come in the future. And yes, you can connect and live-query from multiple data warehouses. Most of our enterprise customers have data spread across several data warehouses, like transactional data in Redshift and sales data in Snowflake. >> Excellent. And James, we'll have the final question go to you, please. Are there any size restrictions for how much data ThoughtSpot can handle? And does one need to optimize their database for performance, for example with aggregations? >> Yeah, that's a great question. So, you know, as we've just heard from our customer Hulu, there's really no limit in terms of the amount of data that you can bring into ThoughtSpot and connect to.
We have many customers that have in excess of 10 terabytes of data that they're connecting to in those cloud data warehouses, and there's no need to pre-aggregate or anything. ThoughtSpot works best with that transactional-level data, being able to get right down into the details behind it and surface those answers to the business users. >> Excellent. Well, thank you both so much. And for everyone at home watching, thank you for joining us for that session. You have a few minutes to get up, get some water, get a bite of food, but you won't want to miss this next panel. In it, we have our Chief Data Strategy Officer, Cindy Howson, speaking to experts in the field from Deloitte, Snowflake, and Eagle Alpha, all on best practices for leveraging external data sources. See you there.
SUMMARY :
I might be just a little bit biased, but I think it's going to be the best track of the day. to give you a look at just how simple and quick it is to connect ThoughtSpot to your cloud data warehouse and extract adjust the index to ensure the most relevant information is provided to you. source here and expanding that I can see all the data tables as available to me. Hulu are able to get granular insights into the data, We do actually have a couple of questions in our last few sources to come in the future. of data that they're connecting to in those cloud data warehouses. And for everyone at home watching thank you for joining
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
James | PERSON | 0.99+ |
Anna | PERSON | 0.99+ |
2019 | DATE | 0.99+ |
two tables | QUANTITY | 0.99+ |
T Mobile | ORGANIZATION | 0.99+ |
Asia Pacific | LOCATION | 0.99+ |
US | LOCATION | 0.99+ |
14 day | QUANTITY | 0.99+ |
Mallory | PERSON | 0.99+ |
two systems | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
yesterday | DATE | 0.99+ |
last year | DATE | 0.99+ |
today | DATE | 0.99+ |
Japan | LOCATION | 0.99+ |
Ana Son | PERSON | 0.99+ |
Deloitte Snowflake | ORGANIZATION | 0.99+ |
Eagle Alfa | ORGANIZATION | 0.99+ |
First | QUANTITY | 0.99+ |
United States | LOCATION | 0.99+ |
Mallory Lassen | PERSON | 0.99+ |
Today | DATE | 0.99+ |
Netflix | ORGANIZATION | 0.99+ |
last week | DATE | 0.99+ |
two | QUANTITY | 0.99+ |
U. S. | LOCATION | 0.99+ |
four sessions | QUANTITY | 0.99+ |
each month | QUANTITY | 0.99+ |
SQL | TITLE | 0.99+ |
Google | ORGANIZATION | 0.99+ |
one click | QUANTITY | 0.99+ |
Eagle Alfa | ORGANIZATION | 0.99+ |
first | QUANTITY | 0.98+ |
Day two | QUANTITY | 0.98+ |
First part | QUANTITY | 0.98+ |
10 terabytes | QUANTITY | 0.98+ |
11 product | QUANTITY | 0.98+ |
over 32 million subscribers | QUANTITY | 0.98+ |
over 16 terabytes | QUANTITY | 0.98+ |
this year | DATE | 0.98+ |
Cindy | PERSON | 0.98+ |
One | QUANTITY | 0.98+ |
each | QUANTITY | 0.97+ |
Disney Plus | ORGANIZATION | 0.97+ |
both | QUANTITY | 0.96+ |
first one | QUANTITY | 0.96+ |
Teradata | ORGANIZATION | 0.95+ |
One Click | QUANTITY | 0.94+ |
two analysis | QUANTITY | 0.92+ |
five stores | QUANTITY | 0.91+ |
Beyond | ORGANIZATION | 0.89+ |
one single | QUANTITY | 0.89+ |
two part question | QUANTITY | 0.87+ |
two thought spotters | QUANTITY | 0.87+ |
6.3 | QUANTITY | 0.86+ |
three tables | QUANTITY | 0.85+ |
last 12 months | DATE | 0.85+ |
James Belsey | PERSON | 0.8+ |
Snowflake | TITLE | 0.79+ |
five | QUANTITY | 0.77+ |
Midwest | LOCATION | 0.75+ |
three | QUANTITY | 0.75+ |
hundreds of columns | QUANTITY | 0.75+ |
Day Zero Analysis | Cisco Live EU Barcelona 2020
>> Announcer: Live from Barcelona Spain, it's theCUBE. Covering Cisco Live 2020. Brought to you by Cisco and its ecosystem partners. >> Hello everyone, welcome to theCUBE here live in Barcelona Spain for Cisco Live 2020. This is our first CUBE event of the year. We look back on 10 years of CUBE history since we've been around, and we have another 10 more we're looking forward to. And this is the first event for 2020, Cisco Live at Barcelona. I'm John Furrier, your host, with Dave Vellante, Stuart Miniman, extracting the signal from the noise. The cloud business is noisy, the networking business is under siege and changing. Dave and Stu, we were pregaming Cisco Live, kicking off the show at the end of the first kind of pre-day. Tomorrow's the big keynotes. Cisco exec David Goeckeler is preparing to announce, rumor has it, some insights into what Cisco's position will be vis-a-vis cloudification; that's going to change their portfolio and probably identify some opportunities, and also some potential gaps in their strategy and what they can do to be competitive. The number one leader in networking, they've got a great market position. But cloud is changing the game with networking. >> Yeah, John, it's funny, I heard you talking about the 10 years and everything. 10 years ago, if I thought about Cisco, I'd be looking at getting the jitter out of the network and trying to tweak everything. And today, what are we talking about with Cisco? We're talking about software, we're talking about cloud. We're talking about developers. Yeah, they're a networking company at its core, but Cisco has been going through a significant transformation; it's been an interesting one to watch. Dave, you wrote a little bit about how Cisco is one of the four horsemen of the internet era. Of course, they were one of the ones that actually survived and thrived after the dotcom burst, but Cisco is a very different company today, far from the $500 billion market cap that they had a few years back; they were at about $200 billion, but still dominant in switching and routing. But there are threats from a number of environments and a lot of changes as to what you need to think about when it comes to Cisco. >> Well, sometimes it's instructive to look back and see how we got here. Cisco made three big bets during its ascendancy. The first one was it bet on IP; I mean, John, you've talked about this a lot, it decimated the minicomputer industry by connecting distributed computing and client server, the underlying plumbing there. The second big bet it made was it trained a bunch of engineers, the Cisco certified engineers, CCIEs, and they used that as a lever and created a whole army of people that were Cisco advocates, and that was just a brilliant move. And the third was, under the leadership of John Chambers, they did about 180 acquisitions, and they were quite good at acquisitions, and what that did for them is it continued to fuel growth, it filled in gaps and it kept them relevant with customers. Now, part of that, too, was Chambers had dozens and dozens of adjacent businesses; remember, he said they were all going to be a billion dollars. Well, most of them didn't pan out. So they had to cut and burn, but now, under the leadership of Robbins, they're a much more focused company, kind of getting back to basics, trying to bet on sure things, and so let's talk about what some of those sure things are and how Cisco's performing.
Well, it's clear, you said lever, they've got to pull a lever at some point and turn the boat that is Cisco, aircraft carrier, whatever you want to call it, in the right direction. We've been covering Cisco for decades, Stu, as you just pointed out, and while we've been close to all the action, I think Cisco knows what's going on. It's clear to me that they kind of understand the landscape. They understand their opportunities in the future, but they're a massive business, Dave, you pointed it out, the combination of all those mergers. The thing that got my attention was, just as they understood unification many years ago on the compute side, you see Cisco clearly understands the unification happening now. They know cloud is here; they know that if they do not make a move that's cloud friendly, they're going to get swept away and be adrift with the next wave, which is cloud 3.0, whatever we call it. So to me, that's the big story with Cisco. What is the impact on the company when you cloudify business? That's not just public cloud, that's hybrid and public; the economics are changing, the compute capabilities are changing, the network capabilities are changing, you've got the edge. I think Cisco will be defined by their actions over the next two to three years: what they announce, how they position it, and what value they bring to the customers. Because you've got the Silicon One chip, good move, got cloud position, got AppD on the top of the stack, you got CloudCenter, they're trying to get to the cloud, but you can't do that until you have the subscription business; you can't do pricing by usage unless you have that model. So I think it's brick by brick, but slowly they're doing it. We have to hear some things next year on Cisco, on how they're going to be truly cloud enabled. >> Well, software is a huge play for them, right? I mean, they've got it, because Cisco's been the dominant player in networking with two thirds of the market; I mean, they've sustained that for a decade plus, and it has allowed them to drive 60-plus percent gross margins for years and years and years, huge operating margin. So how are they going to continue that? Software is the key. And as you say, John, subscriptions is the cloud model that is critical for Cisco. Now they talk about 70% of their software business being subscriptions and annual recurring revenue. It's unclear really how big their software business is, they give hints; I'd peg it at about seven to eight billion last year, maybe growing to 10, 12 billion this year. So pretty sizable, but that's critical in terms of them driving the margins that they need to throw off free cash flow so they can invest in things like stock buybacks and dividends which prop up the stock. >> Well, the problem is, if you start chasing your tail on the stock price and/or product TAMs and product revenue, you might actually miss the boat on the new product. So it's a balance between cannibalizing your own before you can bring in the new, and this is going to be the challenge with Cisco: when do they bite the bullet and say, "Okay, we've got to get a position on this piece here or that piece there." Ultimately, it's going to be about customers. And what do we know? Public cloud succeeded, hybrid cloud is a reality, and people are executing specific technologies to do an operating model that's cloud. And to me, the big wave for Cisco, in my opinion, is multicloud, because that's not a technology. That's a value proposition, it's not so much a technology.
Yeah, Dave, you mentioned a lot of the acquisitions that Cisco has done. In many ways, though, some of the areas where Cisco can be defined are the acquisitions that they didn't do. Cisco did not buy VMware, and they were behind in the virtualization wave. And then they created UCS, and that actually was a great tailwind for them; it created their data center business. They did not end up buying Nicira, and yet Nicira's done very well. But if you talk to most customers, well, even if you're deploying NSX, whose hardware do you tend to have? Well, yes, sure, it might be Arista, might be somebody else's, but Cisco's still doing well. So there hasn't been a silver bullet to kill Cisco's dominance, but how are they going to do with cloudification? The data center group has gone through a lot of challenges. If you look at it, they fumbled along with OpenStack, like many other companies did; they went through, just as VMware really failed with vCloud Air, the cloud group inside of Cisco had this large cloud offering where for a couple of years everybody was looking and saying, I don't know, are you enabling service providers? What are you doing? Now they have management pieces, they're partnering with Google, Amazon, Azure across the environments, they are heavily involved in Kubernetes and the service meshes. So it remains to be seen where Cisco will find that next TAM expansion to kind of take them to the next wave. >> But Stu, acquisition is a good piece. I think they've got to do some M&A, clearly, and organic, but the question is, would Nicira have been successful at Cisco versus VMware? Look at the timing of that. I think VMware being bought would have been a home run. But Nicira, I don't think that succeeds at Cisco. I think that would have been a bunch of knife fights internally. And Nicira would have been shifted, because what it was then and what it is now in VMware are two different things, because VMware took it and shaped it. I don't think Cisco could have done that at that time. >> The success would have been a defensive move to keep VMware out. That would have been the nature of the success, but I think you're right, the infighting would have been brutal, but VMware wouldn't have Nicira. >> VMware, what they did when they bought Nicira is they spent the first three or four years just making it an extension of VMware. Now it's starting to become their multicloud interconnect. And that's where we need to see Cisco be involved. Cisco's bought many companies that have promised to be multicloud management or that interconnecting fabric, and they have not yet panned out. >> Well, security is the linchpin here; they've made a bunch of acquisitions in security. And I've always said that they've got a position: their networking is the most cost effective, the highest performance, and the most secure way to connect multiple clouds to hybrid on-prem. And they're in a good position to do that. >> Well, I think I've always said this from day one, you guys know I'm harping on this, Stu and I, we high-five each other all the time when we say this, but back in the IT heyday, if you were a network operator, network designer, network architect, you were the king, king or queen. So you had the keys to the kingdom. VMware is a legitimate threat to Cisco. They compete, and they talk about that all the time. But the question is, which community has the keys to the kingdom? Rhetorical question.
Yeah, well John, one point I made earlier, (John laughing) >> Okay, go ahead. >> I remember Pat Gelsinger got on stage and he's like, "Hey, here's the largest collection of network admins," and everybody's looking at him, what are you talking about, Pat? When I talk to customers that are deploying NSX, it is mostly not the network team, it is the virtualization team, and they're still often fighting with the network team. But to your point, where I've seen some of the really smart network architects and people building stuff, Amazon, Azure, Google all have phenomenal people, and they're building environments, so Cisco needs to make sure they partner and are embedded there. >> Dave mentioned the leverage. Cisco's got to pull that lever, or turn the boat around in one swift move now, otherwise they'll lose that leverage. They have more power than they think, in my opinion, they probably do know, but they have the network. And I think the network guys trump the operating guys, because you can always swap in operating staff, but you've got the network, and the network runs the business. No one could swap out Cisco boxes for a SynOptics years ago, or Bay, whatever it turned into, so they have that nested position. If they lose that, they're done. >> Yeah, and I agree with you, John. There's a lot of, Stu, you pointed this out, people buy NSX and Cisco ACI, but my question is, okay, how long will that redundancy last? I think, to your rhetorical question, Cisco is sitting in the catbird seat, and they know networking, they're investing in it. I don't think they're going to lose sight of that. Yeah, Arista's coming at them, and Juniper, but Cisco, they know how to manage that business and maintain its leadership. I guess my question is, have they lost that acquisition formula? Are they as good at acquisitions as they used to be? >> I think their old model's flawed for the modern era. I think the acquisitions have got to come in and integrate, and I think VMware has proven that they can do acquisitions right. I think that comes from the EMC kind of concept where it's got to fit in beautifully and have synergies right away. I think what Pat Gelsinger is doing, I think he's smart, and I think that's why VMware is so successful. They've got great technical talent, they know the right waves to be on, and they execute. So I think Cisco has got to get out of these siloed acquisitions, this business unit mindset, and have things come in that work in line with the strategy and the execution. It has to fit from day one. It's got to be fitting perfectly in. >> The portfolio is still pretty complicated. You got the core networking. You got things like WebEx, right? I mean, would you want to be going up against Microsoft Teams? But they're in it, Cisco's in it to win it, and they've got to talk about-- >> Don't count out Zoom. >> No, Zoom's right there too in the mix. And so Cisco's got some work to do, expect some enhancements coming there. In HCI, they've got to walk a fine line, Stu, you made this point. On the one hand, they've got IBM and NetApp with UCS and converged infrastructure, but then they buy Springpath, which is designed to replace converged infrastructure. So they've got to walk that fine line. >> All right, what are you guys going to hear this week?
Let's just wrap this up by going down the line on thoughts and predictions as the keynote kicks off tomorrow. I took some notes, I was going around the floor trying to get inside people's heads and ask them probing questions, and here's what I got out of it. I think Cisco is going to recognize cloud and absolutely throw the holy water on the fact that it's part of their strategy. I think we'll hear a little bit about Silicon One and how it relates to the portfolio, but I think the big story will be tying the application environment together with networking, not end to end, but really as one seamless solution for customers. I think it's going to be a top story; that's been teased out by some of the booths that I saw, connecting things as one holistic thing, with an application development focus, with DevOps. >> Yeah, so John, ACI was application centric infrastructure, and it was critical back in the day, but it was like, well, the application owner really doesn't have much connection there. If you look at what Cisco has been doing the last few years, it is tying together more of that application owner, the DevNet group, and we're sitting here in the DevNet zone; that connection between the developer and enabling them as part of the business absolutely is a wave that Cisco needs to drive. I don't think we're going to see a ton of the Silicon One, 5G and that kind of stuff, if for no other reason than in about a month they're going to be sitting here with 100,000 people for Mobile World Congress, and that's where they keep their dry powder to make sure they push that piece of it. But that is super important. >> I think software and security. I mean, as you were talking about, Zoom, Teams, they'd better focus on collaboration, and I want to hear some stuff there; security, IoT and the edge, they've got a very strong position there. Their security business grew 22% last quarter, it's really doing well, so I want to hear more about that. And data center: what they're doing in the data center, what they're doing with their switching business, their HCI stuff and converged infrastructure, hyperconverged. Three important areas that we'll hear about this week. >> And Dave, I'll emphasize what you were saying. Edge, edge, edge, absolutely. If Cisco is going to remain a dominant player in the network, they need to deliver on that edge. And I've heard a couple of messaging strategies in the past, there was fog computing and all this other stuff, but I think Cisco is in a position today, between Meraki that they have and their core product-- >> Dave: DevNet. >> To really be able to enable-- >> And those are really-- >> Well, I want to see more progress. I'm looking forward to it, and I'm going to drill them on the interviews we do here. They've spent millions, billions of dollars creating a subscription model with the cloud. We're going to dig into it, we're going to extract the signal from the noise. theCUBE coverage here in Barcelona, Spain. First show of 2020, Cisco Live 2020. I'm John Furrier, with Stuart Miniman, Dave Vellante. We'll be right back. (upbeat music)
SUMMARY :
Brought to you by Cisco But cloud is changing the game with networking. and a lot of changes as to what you need to think about So they had to cut and burn, So to me, that's the big story with Cisco. driving the margins that they need to throw off Well, the problem is you start chasing your tail but how are they going to do without cloudification? but the question is would Nicira have been successful to keep VMware out. Cisco's bought many companies that have promised to be And they're in a good position to do that. but back in the days in IT days, the heyday, But to your point, where I've seen some of the really smart Cisco's got to pull that lever or, turn the boat around I don't think they're going to lose sight of that. I think the acquisition's got to come in and integrate and they got to they got to talk about-- On the one hand, they've got, IBM and NetApp with UCS I think it's going to be a top story that's been teased out in about a month, they're going to be sitting here in the data center, what they're doing with their they need to deliver on that edge. We're going to dig into it, we're going to extract the signal
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Dave | PERSON | 0.99+ |
Dave Vellante | PERSON | 0.99+ |
David Geckler | PERSON | 0.99+ |
Stuart Miniman | PERSON | 0.99+ |
Cisco | ORGANIZATION | 0.99+ |
John | PERSON | 0.99+ |
VMware | ORGANIZATION | 0.99+ |
John Furrier | PERSON | 0.99+ |
Verizon | ORGANIZATION | 0.99+ |
Pat Gelsinger | PERSON | 0.99+ |
$500 billion | QUANTITY | 0.99+ |
Google | ORGANIZATION | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
millions | QUANTITY | 0.99+ |
10, 12 billion | QUANTITY | 0.99+ |
22% | QUANTITY | 0.99+ |
Barcelona | LOCATION | 0.99+ |
Stu | PERSON | 0.99+ |
NSX | ORGANIZATION | 0.99+ |
10 years | QUANTITY | 0.99+ |
Barcelona, Spain | LOCATION | 0.99+ |
next year | DATE | 0.99+ |
ACI | ORGANIZATION | 0.99+ |
Pat | PERSON | 0.99+ |
Barcelona Spain | LOCATION | 0.99+ |
UCS | ORGANIZATION | 0.99+ |
Nicira | ORGANIZATION | 0.99+ |
Horizon3.ai Signal | Horizon3.ai Partner Program Expands Internationally
Hello, I'm John Furrier with theCUBE, and welcome to this special presentation of theCUBE and Horizon3.ai. They're announcing a global partner-first approach, expanding their successful pen testing product NodeZero. You're going to hear from leading experts on their staff and their CEO, positioning themselves for a successful channel distribution expansion internationally in Europe, Middle East, Africa and Asia Pacific. In this CUBE special presentation you'll hear about the expanded partner program, giving partners a unique opportunity to offer NodeZero to their customers. Innovation in pen testing is going international with Horizon3.ai. Enjoy the program. [Music] Welcome back everyone to theCUBE and Horizon3.ai special presentation. I'm John Furrier, host of theCUBE. We're here with Jennifer Lee, head of channel sales at Horizon3.ai. Jennifer, welcome to theCUBE, thanks for coming on. >> Great, well thank you for having me. >> So, big news around Horizon3.ai driving a channel-first commitment. You guys are expanding the channel partner program to include all kinds of new rewards, incentives, training programs, help educate, you know, partners, really drive more recurring revenue. Certainly cloud and cloud scale have done that. You've got a great product that fits into that kind of channel model, great services you can wrap around it, good stuff. So let's get into it. What are you guys doing with this news? Why is this so important? >> Yeah, for sure. So, like you said, we recently expanded our channel partner program. The driving force behind it was really just to align with our channel-first commitment and create awareness around the importance of our partner ecosystem. That's really how we go to market, is through the channel. >> And a great international focus. I've talked with the CEO, so you know, about the solution, and he broke down all the action on why it's important on the product side, but why now on the go-to-market change? What's the why behind this big news on the channel? >> Yeah, for sure. So we are doing this now really to align with our business strategy, which is built on the concept of enabling our partners to create a high-value, high-margin business on top of our platform. We offer a solution called NodeZero. It provides autonomous pen testing as a service, and it allows organizations to continuously verify their security posture. Our company vision, we have this tagline that states that our pen testing enables organizations to see themselves through the eyes of an attacker, and we use the attacker's perspective to identify exploitable weaknesses and vulnerabilities. So we created this partner program from the perspective of the partner; we've built it through the eyes of our partner, right? We're prioritizing really what the partner is looking for, and that will ensure mutual success for us. >> Yeah, the partners always want to get in front of the customers and bring new stuff to them. Pen tests have traditionally been really expensive, and so bringing it down to a service level that's affordable and has flexibility to it allows a lot of capability, so I imagine people are getting excited by it. So I have to ask you about the program. What specifically are you guys doing? Can you share any details around what it means for the partners, what they get, what's in it for them? Can you just break down some of the mechanics and mechanisms and details?
>> Yeah, yep. You know, we're really looking to create business alignment and, like I said, establish mutual success with our partners. So we've got two key elements that we're really focused on that we bring to the partners. The opportunity for profit margin expansion is one of them, and a way for our partners to really differentiate themselves and stay relevant in the market is the other. So we've restructured our discount model, really highlighting and maximizing profitability, and this includes our deal registration. We've created a deal registration program, we've increased discounts for partners who take part in our partner certification trainings, and we have some other partner incentives that we've created that are going to help out there. We've also recently gone live with our partner portal. It's a consolidated experience for our partners where they can access our sales tools, and we really view our partners as an extension of our sales and technical teams, so we've extended all of the training material that we use internally and made it available to our partners through our partner portal. Thinking back on what else is in that partner portal: we've got our partner certification information, so all the content that's delivered during that training can be found in the portal; we've got deal registration, co-branded marketing materials, pipeline management. So this portal gives our partners a one-stop place to go to find all that information. And then, just really quickly, on the second part that I mentioned: our technology really is disruptive to the market. Like you said, autonomous pen testing is still a relatively new topic for security practitioners, and it's proven to be really disruptive. On top of that, we recently found an article by MarketsandMarkets that reports that the global pen testing market is really expanding, and it's expected to grow to around 2.7 billion by 2027.
So the market's there, right? The market's expanding, it's growing, and so for our partners it really allows them to grow their revenue across their customer base, expand their customer base, and offer this high profit margin while getting in early to market on this disruptive technology. >> Big market, a lot of opportunities to make some money. People love to put more margin on those deals, especially when you can bring a great solution that everyone knows is hard to do, so I think that's going to provide a lot of value. Is there a type of partner that you guys see emerging or are aligning with? You mentioned the alignment with the partners; I can see how the training and the incentives are all there, sounds like it's all going well. Is there a type of partner that's resonating the most, or are there categories of partners that can take advantage of this? >> Yeah, absolutely. So we work with all different kinds of partners. We work with our traditional resale partners, we're working with systems integrators, we have a really strong MSP and MSSP program, and we've got consulting partners; for the consulting partners, especially the ones that offer pen test services, we act as a force multiplier, really offering them a profit margin expansion opportunity there. We've got some technology partners that we work with for co-sell opportunities, and then we've got our cloud partners. You'd mentioned that earlier, and so we are in AWS Marketplace, with CPPO for our partners, and we're part of the ISV Accelerate program, so we're doing a lot there with our cloud partners, and of course we go to market with distribution partners as well. >> Gotta love the opportunity for more margin expansion; every kind of partner wants to put more gross profit on their deals. Is there a certification involved? I have to ask, do people get certified, or is it just training? Is it self-paced training, is it in person? How are you guys doing the whole training and certification thing, and is that a requirement? >> Yeah, absolutely, so we do offer a certification program and it's been very popular. This includes a seller's portion and an operator portion, and this is at no cost to our partners. We run it virtually, and it's virtual but live, it's not self-paced; we also have in-person sessions as well, and we can customize these for any partner that has a large group of people, so we can do one in person or virtually just for that partner. >> Well, any kind of incentive opportunities and marketing opportunities, everyone loves to get the deals just kind of rolling in, leads, and from what we can see in our early reporting, this looks like a hot product, price-wise, service-level-wise. What incentives are you guys thinking about, and joint marketing? You mentioned co-sell earlier, and pipeline, so I was kind of honing in on that piece. >> Sure, yes. To follow along with our partner certification program, we do incentivize our partners there: if they have a certain number certified, their discount increases. So that's part of it. We have our deal registration program that increases discount as well, and then we do have some partner incentives that are wrapped around meeting setting and moving opportunities along to proof of value. >> Gotta love the education driving value. I have to ask you, you've been around the industry,
you've seen the channel relationships out there, you're seeing companies old school and new school. Horizon3.ai is kind of that new school, very cloud-specific, a lot of leverage with, as we mentioned, AWS and all the clouds. Why is the company so hot right now? Why did you join them, and why are people attracted to this company? What's the attraction, what's the vibe, what did you see in this company? >> Well, like I said, it's very disruptive, and it's really in high demand right now, just because it's new to market and a newer technology. We can collaborate with a manual pen tester, we can allow our customers to run their pen tests with no specialty teams, and, like I said, we allow our partners to actually build profitable businesses: they can use our product to increase their services revenue and build their business model around our services. >> What's interesting about the pen test thing is that it's very expensive and time-consuming, and the people who do them are very talented people who could be working on bigger things for the customers. So bringing this into the channel allows them, if you look at the price delta between a pen test and what you guys are offering, I mean, that's a huge margin gap between the street price of, say, today's pen test and what you guys offer. When you show people that, do they say it's too good to be true? I mean, what are some of the things people say when you show them that? Are they scratching their heads, like, come on, what's the catch here? >> Right, so the cost savings is huge for us, and then also, like I said, we work as a force multiplier with a pen testing company that offers the services: they can do their annual manual pen tests that may be required around compliance regulations, and then we can act as the continuous verification of their security that they can run weekly. So it's just an addition to what they're offering already, and an expansion. >> So Jennifer, thanks for coming on theCUBE, really appreciate you coming on and sharing the insights on the channel. What's next, what can we expect from the channel group, what are you thinking, what's going on? >> Right, so we're really looking to expand our channel footprint, and very strategically. We've got some big plans for Horizon3.ai. >> Awesome, well thanks for coming on, really appreciate it. You're watching theCUBE, the leader in high tech enterprise coverage. [Music] >> Hello and welcome to theCUBE's special presentation with Horizon3.ai, with Rainer Richter, vice president of EMEA (Europe, Middle East and Africa) and Asia Pacific (APAC) for Horizon3. Welcome to this special CUBE presentation, thanks for joining us. >> Thank you for the invitation. >> So, Horizon3.ai is driving global expansion, big international news with a partner-first approach. You guys are expanding internationally, so let's get into it. You're driving this new expanded partner program to new heights; tell us about it. What are you seeing in the momentum, why the expansion, what's all the news about? >> Well, I would say, internationally we have a similar situation to the US: there is a global shortage of well-educated
penetration testers on the one hand; on the other side, we have a rising demand for network and infrastructure security, and with our approach of autonomous penetration testing, I believe we are totally on top of the game. Especially as we are now also starting with an international instance. That means, for example, if a customer in Europe is using our service NodeZero, he will be connected to a NodeZero instance which is located inside the European Union, and therefore he doesn't have to worry about the conflict between the European GDPR regulations and the US CLOUD Act. And I would say there we have a really good package for our partners, so that they can provide differentiators to their customers. >> You know, we've had great conversations here on theCUBE with the CEO and the founder of the company around the leverage of the cloud and how successful that's been for the company, and honestly I can just connect the dots here, but I'd like you to weigh in more on how that translates into the go-to-market here, because you've got great cloud scale with the security product, and you're having success with great leverage there. What's the momentum on the channel partner program internationally? Why is it so important to you? Is it just the regional segmentation, is it the economics, why the momentum? >> Well, there are multiple issues. First of all, there is a rising demand for penetration testing, and don't forget that internationally we have a much higher percentage of SMB and mid-market customers. These customers typically, most of them, didn't even have a pen test done once a year, so for them pen testing was just too expensive. Now, with our offering, together with our partners, we can provide different ways customers can get an autonomous pen test done more than once a year, at an even lower cost than they had with a traditional manual pen test. And that is because we have our Consulting Plus package, which is typically for pen testers: they can go out and do a much faster, much quicker pen test at many customers, one after another, so they can do more pen tests at a lower, more attractive price. On the other side, there are others, or even the same ones, who are providing NodeZero as an MSSP service, so they can go after SMB customers, saying, okay, you only have a couple of hundred IP addresses, no worries, we have the perfect package for you. And then you have, let's say, the mid-market, thousands and more employees; they might even have an annual subscription, very traditional. But for all of them it's the same: the customer or the service provider doesn't need a piece of hardware, they only need to install a small Docker container, and that's it. And that makes it so smooth to go in and say, okay, Mr. Customer, we just put this virtual attacker into your network, and that's it, all the rest is done, and within three clicks they can act like a pen tester with 20 years of experience. >> And that's going to be very channel-friendly and partner-friendly, I can almost imagine. So I have to ask you, and thank you for calling out that breakdown and segmentation, that was very helpful for me to understand, but I want to follow up, if you don't mind: what type of partners are you seeing the most traction with, and why?
>>Well, I would say at the beginning you typically have the innovators, the early adopters — typically boutique-sized partners. They start because they are always looking for innovation, so those are the ones who start in the beginning. We have a wide range of partners, mostly managed by the owner of the company, so they immediately understand: okay, there is the value, and they can change their offering. They're changing their offering in terms of penetration testing because they can do more pen tests and then add other ones. Or we have those who offered pen test services but did not have their own pen testers, so they had to go out on the open market and source pen testing experts to get the pen test at a particular customer done. Now, with node zero, they're totally independent: they can go out and say, okay, Mr. Customer, here's the service, that's it, we turn it on, and within an hour you're up and running.
>>Totally. And those pen tests are usually expensive and hard to do; now it's right in line with the sales delivery. Pretty interesting for a partner.
>>Absolutely, but on the other hand we are not killing the pen testers' business. With node zero we provide something I would call the foundational work: the ongoing penetration testing of the infrastructure and the operating systems. The pen testers themselves can concentrate in the future on things like application pen testing, for example — services which we are not touching. So we're not killing the pen tester market; we're just taking away the ongoing, let's say, foundation work, call it that way.
>>Yeah, that was one of my questions I was going to ask. There's a lot of interest in this autonomous pen testing, one because it's expensive to do, and because those skills that are required are in need and they're expensive, so you kind of cover the entry level and the blockers that are in there. I've seen people say to me that the pen test becomes a blocker for getting things done. So there's been a lot of interest in autonomous pen testing and in organizations having that posture, and it's an ongoing thing too, because now you have that continuous coverage. Can you explain that particular benefit for an organization of continuously verifying its posture?
>>Yep, certainly. Typically you have to do your patches, you have to bring in new versions of operating systems, of different services, of components, and they are always bringing new vulnerabilities. The difference here is that with node zero we are telling the customer, or the partner, which are the executable vulnerabilities. Previously they might have had a vulnerability scanner, and this vulnerability scanner brought up hundreds or even thousands of CVEs but didn't say anything about which of them are really executable. Then you need an expert digging into one CVE after the other, finding out whether it is really executable, yes or no, and that is where you need highly paid experts, which we have a shortage of. With node zero we can now say: okay, we tell you exactly which ones you should work on, because those are the ones that are executable, and we rank them according to the risk level and how easily they can be used. And the good thing is, in contrast to the traditional penetration test, they don't have to wait a year for the next pen test to find out if the fix was effective; they just run the next scan and see: yes, closed, the vulnerability is gone.
>>The time is really valuable, and if you're doing any devops or cloud native work you're always pushing new things, so ongoing pen testing is actually a benefit just in general, as a kind of hygiene. Really interesting solution, and bringing that global scale is going to be a new coverage area for us, for sure. I have to ask you, if you don't mind: what particular region are you focused on, or plan to target, for this next phase of growth?
>>At this moment we are concentrating on the countries inside the European Union plus the United Kingdom. I'm based in the Frankfurt area, so logically we cover more or less the countries just around: the DACH region of Germany, Switzerland and Austria, plus the Netherlands. But we also already have partners in the Nordics, in Finland and Sweden, and partners in the UK, and it's rapidly growing. For example, we are now starting some activities in Singapore and also in the Middle East area. Very importantly, depending on the way we do business, we currently try to concentrate on those countries where we can have at least English as an accepted business language.
>>Great. Is there any particular region you're having the most success with right now? It sounds like the European Union is kind of the first wave.
>>Yes, that's definitely the first wave, and now we're also getting the European instance up and running. It's clearly our commitment to the market, saying: okay, we know there are certain dedicated requirements and we take care of this, and we're just launching it. We're building up the instance in the AWS service center here in Frankfurt, also with some dedicated hardware in a data center in Frankfurt, where, with DE-CIX, we have by the way the highest internet interconnection bandwidth on the planet, so we have very short latency to wherever you are on the globe.
>>That's a great call out, and a great benefit too — I was going to ask that. What are some of the benefits your partners are seeing in EMEA and Asia Pacific?
>>I would say the benefit for them is clearly that they can talk with customers and offer them penetration testing which those customers didn't even think about before, because penetration testing in the traditional way was simply too expensive for them, too complex, the preparation time was too long, and they didn't even have the capacity to support an external pen tester. Now with this service you can go in and say: Mr. Customer, we can do a test with you in a couple of minutes — within 10 minutes of installing the Docker container we have the pen test started, that's it, and then we just wait. And I would say we are seeing so many aha moments now, because on the partner side, when they see node zero working for the first time, it's like: wow, that is great. Then they go out to customers and show it — typically, at the beginning, mostly to the friendly customers — and it's: wow, that's great, I need that. The feedback from the partners is that this is a service where I do not have to evangelize the customer: everybody understands penetration testing, I don't have to describe what it is, they understand it.
The customer understands immediately: yes, penetration testing, good, I know I should do it, but it's too complex, too expensive. Now, with node zero — provided, for example, as an MSSP service from one of our partners — it's getting easy.
>>Yeah, it's great, and it's a great benefit there. I've got to say, I'm a huge fan of what you guys are doing. I like this continuous automation; that's a major benefit to anyone doing devops or any kind of modern application development. This is just a godsend for them, this is really good. And like you said, the pen testers that are doing it were kind of coming down from their expertise to do things that should have been automated; they get to focus on the bigger ticket items. That's a really big point.
>>So we free the pen testers for the higher-level elements of the penetration testing segment, and that is typically the application testing, which is currently far away from being automated.
>>Yeah, and that's where the most critical workloads are, and I think this is the nice balance. Congratulations on the international expansion of the program, and thanks for coming on this special presentation. I really appreciate it.
>>Thank you, you're welcome.
>>Okay, this is theCUBE special presentation. Check out pen test automation, international expansion, Horizon3.ai — a really innovative solution. In our next segment, Chris Hill, Sector Head for Strategic Accounts, will discuss the power of Horizon3.ai and Splunk in action. You're watching theCUBE, the leader in high tech enterprise coverage.
>>Welcome back everyone to theCUBE and Horizon3.ai special presentation. I'm John Furrier, host of theCUBE. We're with Chris Hill, Sector Head for Strategic Accounts and Federal at Horizon3.ai, a great innovative company. Chris, great to see you, thanks for coming on theCUBE.
>>Yeah, like I said, great to meet you, John — long time listener, first time caller — so excited to be here with you guys.
>>We were talking before camera: you were at Splunk back in 2013, and I think 2012 was our first splunk.conf, and boy, talk about being in the right place at the right time. Now we're at another inflection point, and Splunk continues to be relevant, continuing to have that data driving security and that interplay. And your CEO, a former CTO at Splunk as well, here at Horizon, who's been on before — you have a really innovative product. But don't wait for a breach to find out if you're logging the right data: this is the topic of this thread. Splunk is very much part of this new international expansion announcement with you guys. Tell us, what are some of the challenges that you see where this is relevant for Splunk and Horizon3.ai as you expand node zero internationally?
>>Yeah, well, so my role within Splunk was working with our most strategic accounts, and I look back to 2013 and I think about the sales process, working with our smaller customers. It was still very siloed back then: I was selling to an IT team that was using this for IT operations, and we would generally even say, yeah, although we do security, we weren't really designed for it — we're a log management tool. And I'm sure you remember back then, John, we were sort of stepping into the security space, and in the public sector domain that I was in, security was 70 percent of what we did.
When I look back at the transformation I was witnessing in that digital transformation, from 2019 to today, you look at how the IT team and the security teams have been forced to break down those barriers. They used to be siloed away and wouldn't communicate: the security guys would be like, oh, this is my box, IT, you're not allowed in. Today you can't get away with that, and I think that's the value that we bring. Of course, Splunk has been a huge leader in that space and continues to do innovation across the board, but I think what we're seeing in the space — and I was talking with Patrick Coughlin, the SVP of security markets, about this — is that what we've been able to do with Splunk is build a purpose-built solution that allows Splunk to eat more data. Splunk itself, you know, is an ingest engine: the great reason people bought it was that you could build these really fast dashboards and grab intelligence out of it, but without data it doesn't do anything. So how do you drive and bring more data in, and most importantly, from a customer perspective, how do you bring the right data in? If you think about what node zero is and what we're doing at Horizon3: sure, we do pen testing, but because we're an autonomous pen testing tool, we do it continuously. This whole thought of, oh crud, we've got a pen test coming up, it's going to be six weeks, everyone's going to sit on their hands, call me back in two months, Chris, and we'll talk to you then — that's not a real efficient way to test your environment. And shoot, we saw that with Uber this week; that's a case where we could have helped.
>>Oh, just explain the Uber thing, because it was a contractor. Give a quick highlight of what happened so you can connect the dots.
>>Yeah, no problem. It was one of those games where they try to test an environment, and what the attacker did was keep calling the MFA help desk, saying, I need to reset my password, I need to reset my password, and eventually the customer service guy said, okay, I'm resetting it. Once he had reset and bypassed the multi-factor authentication, he was then able to get in and gain access to part of that network. He then moved laterally over to what I would assume was a VMware or some virtual machine that had notes with all of the credentials for logging into various domains, and so within minutes they had access. And that's the sort of stuff that we do. A lot of these tools — you think about the cacophony of tools that are out there in a zero trust architecture: I'm going to get a Zscaler, I'm going to have an Okta, I have a Splunk, and the rest of the solar system of tools; I don't mean to name names, but we have a CrowdStrike or a SentinelOne in there. It's a cacophony of things that don't work together; they weren't designed to work together. And we have seen so many times in our business, through our customer support and just working with customers when we do their pen tests, that there will be 5,000 servers out there and three are misconfigured, and those three misconfigurations will create the open door. Because remember: the hacker only needs to be right once, the defender needs to be right all the time, and that's the challenge.
And so that's what I'm really passionate about with what we're doing here at Horizon3. I see this digital transformation migration and security going on, and we're at the tip of the spear. It's why I joined Snehal on this journey, and I'm just super excited about where the path is going and super excited about the relationship with Splunk; I'll get into more details on some of the specifics of that.
>>Well, you're nailing it. I mean, we've been doing a lot of things on supercloud and this next-gen environment — we're calling it next gen. You're really seeing devops; obviously devsecops has already won; the IT role has moved to the developer; shift left is an indicator of that, it's one of the many examples: higher velocity code, software supply chain. That means it is now in the developer's hands; it's replaced by the new Ops, the data Ops teams, and security, where there's a lot of horizontal thinking. To your point about access, there's no more perimeter, so the attacker only has to be right one time: once you're in, then you can hang out, move around, move laterally. Big problem. Okay, so we get that. Now, the challenge for these teams as they are transitioning organizationally: how do they figure out what to do? This is the next step. They already have Splunk, so now they're kind of in transition while protecting for a hundred percent success ratio. How would you look at that and describe the challenge? What do they do, what are the teams facing with their data, and what action do they take?
>>So let's use some vernacular that folks will know. If I think about devsecops, we both know what that means: I'm going to build security into the app. SecDevOps talks about how I'm building security around the perimeter of what's going on inside my ecosystem. And so if you think about what we're able to do with somebody like Splunk: we can pen test the entire environment from soup to nuts. I'm going to test the endpoints all the way through, I'm going to look for misconfigurations, I'm going to look for exposed credentials, I'm going to look for anything I can in the environment — and again, I'm going to do it at light speed. And what we're doing for that SecDevOps space is to ask: did you detect that we were in your environment? Did we alert Splunk or the SIEM that there's someone in the environment laterally moving around? More importantly, did they log us in their environment, and when they detected that log, did they trigger on it, did they alert on us? And then finally, most importantly for every CISO out there, did they stop us? So that's how we do this, and in speaking with Snehal before, we've come up with what we call find, fix, verify. So what we do is we go in and act as the attacker, in a production environment. We're not a passive scanner: we go in uncredentialed, without agents, under an assumed breach model, which means we're going to put a Docker container in your environment and then fingerprint the environment. We're going to go out and do an asset survey — and that's not something that Splunk does super well: can Splunk see all the assets, do the same assets marry up? — and we're going to log all of that data and load it into Splunk or the SIEM or whatever logging tools you have, just to have it in the enterprise. That's an immediate value-add for them right there.
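To make that "log it into Splunk" step concrete, here is a minimal sketch of pushing pen-test findings into Splunk over the HTTP Event Collector (HEC). The endpoint path and `Authorization: Splunk <token>` header are standard HEC conventions; the host name, token, index, sourcetype, and the shape of the `finding` record are illustrative assumptions, not Horizon3's actual schema.

```python
# Minimal sketch: forward pen-test findings to Splunk via the HTTP Event Collector (HEC).
# Assumptions: HEC is enabled on port 8088; the token, index, and field names are placeholders.
import json
import requests

SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # hypothetical token

def send_finding(finding: dict) -> None:
    """Send one finding as a HEC event so it can be searched alongside other security data."""
    payload = {
        "sourcetype": "pentest:finding",   # illustrative sourcetype
        "index": "security",               # illustrative index
        "event": finding,
    }
    resp = requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps(payload),
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    send_finding({
        "host": "10.0.4.17",                # placeholder values for illustration only
        "technique": "credential_dump",
        "severity": "critical",
        "detail": "LSASS memory read succeeded; domain credentials recovered",
    })
```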
Then we've got the fix. Once we've completed our pen test, we generate a report — we can talk about these in a little bit — and the reports will show an executive summary, the assets that we found (which is the asset discovery aspect), and a fix report. The fix report, I think, is probably the most important one: it goes down and identifies what we did, how we did it, and then how to fix it. From that, the pen tester or the organization should fix those issues, then go back and run another test and validate, like a change-detection environment, to see: hey, did those fixes actually take place? And Snehal, when he was the CTO of JSOC, shared with me a number of times that there would be 15 more items on next week's punch sheet that they didn't know about, and it has to do with how they were prioritizing the CVEs, because they would take every CVE as simply critical or non-critical. We are able to create context in that environment that feeds better information into Splunk, and that brings up the efficiency for Splunk and, specifically, the teams out there.
>>By the way, the burnout thing is real. This whole "I just finished my list and I got 15 more" — the list just keeps growing. So how does node zero specifically help Splunk teams be more efficient? That's the question I want to get at, because this seems like a very scalable way for Splunk customers and service teams to be more efficient.
>>So today, in our early interactions with customers, we've seen five things, and I'll start with identifying the blind spots — kind of what I just talked about: did we detect, did we log, did we alert, did we stop node zero? To put that in layman's terms: we can be the sparring partner for a Splunk Enterprise customer, a Splunk Essentials customer, someone using Splunk SOAR, or even just an enterprise Splunk customer that may be a small shop with three people and just wants to know, where am I exposed? By creating and generating these reports, and then having the API that actually generates the dashboard, they can take all of these events that we've logged and log them in. Then number two is how we prioritize those logs: how do we create visibility into logs that have critical impact? As I mentioned earlier, not all CVEs are high impact, and not all are low: if you daisy-chain a bunch of low CVEs together, boom, I've got a mission-critical issue that needs to be fixed now — such as a credential path leading to an NT box that's got a text file with a bunch of passwords on it; that would be very bad. Third would be verifying that you have all of the hosts. One of the things Splunk is not particularly great at — and they'll tell you that themselves — is asset discovery: what assets do we see, and what are they logging from? And then, for every event that they are able to identify, one of the cool things that we can do is actually create this low-code, no-code environment.
Splunk customers can use Splunk SOAR to actually triage and prioritize those events, so they're being routed to optimize the SOC team's time to triage any given event — obviously reducing mean time to respond. And then finally, I think one of the neatest things that you'll see us develop is our ability to build glass tables. Behind me you'll see one of our triage events and how we build a Lockheed Martin kill chain on it with a glass table, which is very familiar to the community. We're going to have the ability, in the not too distant future, to allow people to search and observe on those IOCs — and if people aren't familiar with the term, an IOC is an indicator of compromise. That's a vector we want to drill into, and of course, who's better at drilling into data than Splunk?
>>Yeah, this is an awesome synergy. I can see a Splunk customer going, man, this just gives me so much more capability, actionability, and real understanding, and I think this is what I want to dig into, if you don't mind: understanding that critical impact. You've got the data, data ingest — now data's data — but the question is what not to log, and where are things misconfigured. These are critical questions. So can you talk about what it means to understand critical impact?
>>Yeah, so going back to the things I just spoke about: a lot of those CVEs, you'll see low, low, low, and then you daisy-chain them together and they're suddenly, oh, this is high now. But then there's the other kind of impact: if you're a Splunk customer — and I had several of them — I had one customer with terabytes of McAfee data being brought in, and there was a lot of other data they probably also wanted to bring in, but they could only afford, or wanted, to ingest certain data sets, and they didn't know how to prioritize or filter those data sets. So we provide that opportunity to say, hey, these are the critical ones to bring in, and there are also the ones you don't necessarily need to bring in, because a low CVE in this case really does mean low. Take an iLO interface or a print server where your admin credentials are sitting on the device: there will be credentials on that which a hacker might go in to look at, so although the CVE on it is low, if you daisy-chain it with somebody who's able to get into that box, you might say, ah, that's high — and we would then potentially rank it, given our AI logic, as a moderate. So we put it on the scale and prioritize those, versus all of these scanners that are just going to give you a bunch of CVEs and good luck.
>>And translating that, if I can — and tell me if I'm wrong — that kind of speaks to that whole lateral movement challenge, right? The print server is a great example: it looks stupid, low end, who's going to want to deal with the print server? Oh, but it's connected into a critical system; there's a path. Is that kind of what you're getting at?
>>Yeah. I use "daisy chain" — I think that comes from the community I came from — but it's just lateral movement; it's exactly what they're doing. Those low-level, low-criticality lateral movements are where the hackers are getting in. That's the beautiful thing about the Uber example: who would have thought? I've got my multi-factor authentication going, and a human made a mistake. We can't expect humans not to make mistakes; we're fallible.
The reality is that once they were in the environment, they could have protected themselves by running enough pen tests to know that they had certain exposed credentials that would have stopped the breach, and they had not done that in their environment — and I'm not poking at them.
>>Yeah, but it's an interesting trend, though. Sometimes those low-end items are also not protected well, so they're easy to get at from a hacker standpoint, but also the people in charge of them can be phished easily or spearphished, because they're not paying attention — because they don't have to; no one ever told them, hey, be careful.
>>Yeah, for the community that I came from, John, that's exactly how they would do it: they would meet you at an international event, introduce themselves as a graduate student — these are nation-state actors — and ask, would you mind reviewing my thesis on such and such? I was at Adobe at the time I was working on this. Instead of having the PDF sent over, the target opens the PDF, and whatever that payload was launches. And I don't know if you remember, back in the 2008 time frame there were a lot of issues around IP being stolen from the United States by nation states, and that's exactly how they did it. Or it's LinkedIn: hey, we want to hire you, double the salary.
>>Oh, I'm going to click on that for sure, you know?
>>Yeah, right, exactly. The one thing I would say to you is, when we look at it — because I think we did 10,000 pen tests last year, and it's probably over that now — we have this sort of top 10 ways that we find people coming into the environment, and the funniest thing is that only one of them is a CVE-related vulnerability. It's something like two percent of the attacks that are occurring through CVEs, but there's all that attention spent on that, and very little attention spent on this pen testing side, which is this continuous threat monitoring space and vulnerability space where I think we play such an important role. And I'm so excited to be a part of the tip of the spear on this one.
>>Yeah, I'm old enough to know the movie Sneakers, which I loved: professional hackers testing, testing, always testing the environment. I love this. I've got to ask you as we kind of wrap up here, Chris, if you don't mind: the benefits to professional services from this alliance — big news, Splunk and you guys work well together, we see that clearly — what other benefits do professional services teams see from the Splunk and Horizon3.ai alliance?
>>So I think, for both of our sets of partners, as we bring these guys together — and many of them are already the same partner — first off, the licensing model is probably one of the key areas that we really excel at. If you're an end user, you can buy for the enterprise by the number of IP addresses you're using. But if you're a partner working with this, there are solution ways you can go in: we'll license to MSPs, and there's what that business model for MSPs looks like. The unique thing that we do here, though, is the Consulting Plus license. The Consulting Plus license allows anyone from a small-to-midsize firm up to some very large, Fortune 100 consulting firms — and they use this — to buy into a license where they can have unlimited access to as many IPs as they want, but you can only run one test at a time.
And as you can imagine, when we're going in and cracking passwords, checking hashes, and decrypting hashes, that can take a while. But for the right customer it's a perfect tool, and so I'm excited about our ability to go to market with our partners, so that we understand ourselves how not just to sell to them, or sell through them, but how to sell with them as a good vendor partner. I think that's one thing that we've done a really good job of building and bringing to the market.
>>Yeah, and I think also Splunk has had great success with how they've enabled partners and professional services. The services that layer on top of Splunk are manifold — tons of great benefits — so you guys vector right into that and ride that wave with little friction.
>>And the cool thing is that one of our reports, which can be totally customized with someone else's logo, we're going to generate automatically. I used to work in another organization — it wasn't Splunk — where we did pen testing for customers, and my pen testers would come on site, do the engagement, and leave. Then there'd be another release, and someone would be like, oh shoot, we got another sector that was breached, and they'd call you back four weeks later. By August our entire pen testing team would be sold out, and it would be, well, maybe in March, and they're like, no, no, I've got a breach now. And when they do go in, they go through, do the pen test, hand over a PDF, pat them on the back, and say, there's where your problems are, you need to fix them. The reality is that what we're going to generate, completely autonomously with no human interaction, is all the permutations of anything we found and the fix for those permutations, and then once you've fixed everything, you just go back and run another pen test. For what people pay for one pen test, they can have a tool that does that every Patch Tuesday — and then on Wednesday, you know, triage throughout the week: green, yellow, red.
>>I want to see the colors. Show me green — green is good, right? Not red.
>>And what CIO doesn't want that dashboard? It's exactly it, and we can help bring it. I'm really excited about helping drive this with the Splunk team, because they get that: they understand that it's the green-yellow-red dashboard, and how do we help them find more green so that the other guys are in the red.
>>Yeah, and get in the data and do the right thing and be efficient with how you use the data, know what to look at — so many things to pay attention to. The combination of both, and then the go-to-market strategy: really brilliant. Congratulations, Chris, thanks for coming on and sharing this news with the detail around Splunk in action around the alliance. Thanks for sharing.
>>John, my pleasure, thanks. Look forward to seeing you soon.
>>All right, great. We'll follow up and do another segment on devops and IT and security teams as the new Ops, and supercloud, and a bunch of other stuff, so thanks for coming on. And in our next segment, the CEO of Horizon3.ai will break down all the new news for us here on theCUBE. You're watching theCUBE, the leader in high tech enterprise coverage.
>>Yeah, the partner program for us has been fantastic. I think prior to that, most organizations — most firms, most MSSPs — might not necessarily have a bench at all for penetration testing.
Maybe they subcontract this work out, or maybe they do it themselves, but trying to staff that kind of position can be incredibly difficult. For us this was a differentiator — a new partnership that allowed us not only to perform services for our customers but to provide a product by which they can do it themselves. So we work with our customers in a variety of ways: some of them want more routine testing and perform this themselves, but we're also a certified service provider of Horizon3, able to perform penetration tests, help review the data, and provide color and analysis for our customers in a broader sense — not necessarily just the black-and-white elements of what's critical, what's high, what's medium, what's low, and what you need to fix, but whether there are systemic issues. This has allowed us to onboard new customers, and it has allowed us to migrate some penetration testing services to us from competitors in the marketplace. But ultimately this is occurring because the product and the outcome are special: they're unique and they're effective. Our customers like what they're seeing, they like the routineness of it, and many of them, again, like doing this themselves — being able to pen test parts of their networks. And there are the new use cases: I'm a large organization, I have eight to ten acquisitions per year — wouldn't it be great to have a tool to perform a penetration test, both internal and external, of that acquisition before we integrate the two companies and maybe bring on some risk? It's a very effective partnership, one that has really taken our engineers and our account executives by storm. This is a partnership that's been very valuable to us.
>>A key part of the value and business model at Horizon3 is enabling partners to leverage node zero to make more revenue for themselves. Our goal is that sixty percent of our revenue this year will be originated by partners, and that 95 percent of our revenue next year will be originated by partners, so a key to that strategy is making us an integral part of your business models as a partner. A key quote from one of our partners is that we enable every one of their business units to generate revenue. So let's talk about that in a little more detail. First, if you have a pen test consulting business — take Deloitte as an example — what was six weeks of human labor at Deloitte per pen test has been cut down to four days of labor, using node zero to conduct reconnaissance, find all the juicy, interesting areas of the enterprise that are exploitable, and assess the entire organization, with all of those details then served up to the human to look at, understand, and determine where to probe deeper. So what you see in that pen test consulting business is that node zero becomes a force multiplier: those consulting teams are able to cover way more accounts, and way more IPs within those accounts, with the same or fewer consultants, and that directly leads to profit margin expansion for the pen testing business itself, because node zero is a force multiplier. The second business model is if you're an MSSP. As an MSSP you're already making money providing defensive cybersecurity operations for a large volume of customers, so what they do is license node zero and use us as an upsell to their MSSP business.
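To make the force-multiplier point above concrete, here is a tiny back-of-the-envelope calculation using only the figures quoted in the example (six weeks of labor cut to four days); the five-day work week and the flat day rate are invented purely for illustration.

```python
# Back-of-the-envelope: capacity gain when a 6-week manual pen test drops to 4 days of labor.
# The 5-day work week and the day rate are illustrative assumptions, not quoted figures.
MANUAL_DAYS = 6 * 5        # six weeks of human labor, assuming 5 working days per week
ASSISTED_DAYS = 4          # labor per pen test with autonomous recon/assessment
DAY_RATE = 2000            # hypothetical consultant day rate in dollars

engagements_per_quarter_manual = 60 // MANUAL_DAYS      # ~60 working days per quarter
engagements_per_quarter_assisted = 60 // ASSISTED_DAYS

print(f"Manual: {engagements_per_quarter_manual} engagements/quarter per consultant")
print(f"Assisted: {engagements_per_quarter_assisted} engagements/quarter per consultant")
print(f"Labor cost per engagement: ${MANUAL_DAYS * DAY_RATE:,} -> ${ASSISTED_DAYS * DAY_RATE:,}")
```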
They start to deliver either continuous red teaming, continuous verification, or purple teaming as a service, and in that particular business model they've got an additional line of revenue where they can increase the spend of their existing customers by bolting on node zero as a purple-team-as-a-service offering. The third business model, or customer type, is the IT services provider. As an IT services provider you make money installing and configuring security products like Splunk or CrowdStrike or Humio; you also make money reselling those products, and you make money generating follow-on services to continue to harden your customer environments. So what those IT service providers will do is use us to verify that they've installed Splunk correctly, prove to their customer that Splunk or CrowdStrike was installed correctly using our results, and then use our results to drive follow-on services and revenue. And then finally we've got the value-added reseller, which is just a straight-up reseller. Because of how fast our sales cycles are, these VARs are able to go from cold email to deal close in six to eight weeks. At Horizon3, a single sales engineer is able to run 30 to 50 POCs concurrently, because our POCs are very lightweight and don't require any on-prem customization or heavy pre-sales and post-sales activity; as a result we're able to have a small number of sellers driving a lot of revenue and volume for us. The same thing applies to VARs: there isn't a lot of effort to sell the product or prove its value, so VARs are able to sell a lot more Horizon3 node zero product without having to build up a huge specialist sales organization. So what I'm going to do is talk through scenario three, the IT service provider, and just how powerful node zero can be in driving additional revenue. Think of it this way: for every one dollar of node zero license purchased by the IT service provider to do their business, it'll generate ten dollars of additional revenue for that partner. In this example, Kidney Group uses node zero to verify that they have installed and deployed Splunk correctly. Kidney Group is a Splunk partner: they sell IT services to install, configure, deploy, and maintain Splunk, and as they deploy Splunk they're going to use node zero to attack the environment and make sure that the right logs and alerts and monitoring are being handled within the Splunk deployment. It's a way of doing QA, or verifying, that Splunk has been configured correctly, and that's going to be used internally by Kidney Group to prove the quality of the services they've just delivered. Then what they're going to do is show, and leave behind, that node zero report with their client, and that creates a resell opportunity for Kidney Group to resell node zero to their client, because the client is seeing the reports and the results and saying, wow, this is pretty amazing. And those reports can be co-branded: it's a pen testing report branded with Kidney Group, but it says "powered by Horizon3" under it. From there, Kidney Group is able to take the fix actions report that's automatically generated with every pen test through node zero and use it as the starting point for a statement of work to sell follow-on services to fix all of the problems that node zero identified: fixing LLMNR misconfigurations, fixing or patching VMware, updating credential policies, and so on.
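As a sketch of the "did the right alerts actually fire?" QA step described above, the snippet below runs a one-shot search against Splunk's REST search API for alerts raised during a test window. The `/services/search/jobs` endpoint, `exec_mode=oneshot`, and `output_mode=json` are standard Splunk REST parameters; the host, credentials, index, and the search string itself are placeholder assumptions, not a Horizon3 integration.

```python
# Minimal sketch: verify via Splunk's REST API that detections fired during a pen-test window.
# Host, credentials, index, and the search string are illustrative placeholders.
import requests

SPLUNK_API = "https://splunk.example.com:8089"
AUTH = ("svc_verify", "placeholder-password")   # hypothetical service account

def detections_in_window(earliest: str, latest: str) -> list:
    """Run a one-shot search and return the alert events seen in the window."""
    search = (
        'search index=security sourcetype="alert" '
        f'earliest="{earliest}" latest="{latest}" | stats count by search_name'
    )
    resp = requests.post(
        f"{SPLUNK_API}/services/search/jobs",
        auth=AUTH,
        data={"search": search, "exec_mode": "oneshot", "output_mode": "json"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

if __name__ == "__main__":
    results = detections_in_window("-24h@h", "now")
    if not results:
        print("No detections fired during the test window -- a logging/alerting blind spot.")
    for row in results:
        print(row["search_name"], row["count"])
```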
Node zero has found a bunch of problems that the client often lacks the capacity to fix, so Kidney Group can use that lack of capacity as a follow-on sales opportunity for follow-on services. And finally, based on the findings from node zero, Kidney Group can look at that report and say to the customer: you know, customer, if you bought CrowdStrike you'd be able to prevent node zero from attacking and succeeding in the way that it did; or if you bought Humio, or Palo Alto Networks, or some privileged access management solution — because of what node zero was able to do with credential harvesting and attacks. As a result, Kidney Group is able to resell other security products within their portfolio — CrowdStrike Falcon, Humio, Palo Alto Networks, Demisto, Phantom, and so on — based on the gaps that were identified by node zero in that pen test. And what that creates is another feedback loop, where Kidney Group will then go use node zero to verify that the CrowdStrike product has actually been installed and configured correctly, and this becomes the cycle of using node zero to verify a deployment, using that verification to drive a bunch of follow-on services and resell opportunities, which then further drives more usage of the product. Now, the way that we license is a usage-based licensing model, so the partner will grow their node zero Consulting Plus license as they grow their business. For example, if you're Kidney Group, in week one you're going to use node zero to verify your Splunk install; in week two, if you have a pen testing business, you're going to go off and use node zero to be a force multiplier for your pen testing client opportunity; and then if you have an MSSP business, in week three you're going to use node zero to go execute a purple team MSSP offering for your clients. And not necessarily Kidney Group, but if you're a Deloitte or an AT&T — these larger companies with multiple lines of business — or if you're an Optiv, for instance, all you have to do is buy one Consulting Plus license and you're going to be able to run as many pen tests as you want, sequentially. So you can buy a single license and use that one license to meet your week-one client commitments, then meet your week two, and then meet your week three. And as you grow your business, you start to run multiple pen tests concurrently: if in week one you've got to verify a Splunk install, run a pen test, and do a purple team opportunity, you simply expand the number of Consulting Plus licenses from one license to three licenses. So now, as you systematically grow your business, you're able to grow your node zero capacity with you, giving you predictable COGS, predictable margins, and, once again, a 10x additional revenue opportunity for that investment in the node zero Consulting Plus license.
>>My name is Snehal, and I'm the co-founder and CEO here at Horizon3.
I'm going to talk to you today about why it's important to look at your enterprise through the eyes of an attacker. The challenge I had when I was a CIO in banking, the CTO at Splunk, and serving within the Department of Defense, is that I had no idea whether I was secure until the bad guys showed up. Am I logging the right data? Am I fixing the right vulnerabilities? Are the security tools that I've paid millions of dollars for actually working together to defend me? The answer is: I don't know. Does my team actually know how to respond to a breach in the middle of an incident? I don't know — I've got to wait for the bad guys to show up. And so the challenge I had was: how do we proactively verify our security posture? I tried a variety of techniques. The first was the use of vulnerability scanners, and the challenge with vulnerability scanners is that being vulnerable doesn't mean you're exploitable: I might have a hundred thousand findings from my scanner, of which maybe five or ten can actually be exploited in my environment. The other big problem with scanners is that they can't chain weaknesses together from machine to machine. If you've got a thousand machines in your environment, or more, a vulnerability scanner will tell you that you have a problem on machine one and, separately, a problem on machine two; but what it can't tell you is that an attacker could use a low from machine one plus a low from machine two to equal a critical in your environment. And what attackers do in their tactics is chain together misconfigurations, dangerous product defaults, harvested credentials, and exploitable vulnerabilities into attack paths across different machines. So, to address the attack paths across different machines, I tried layering in consulting-based pen testing, and the issue is that when you've got thousands of hosts, or hundreds of thousands of hosts, in your environment, human-based pen testing simply doesn't scale to test an infrastructure of that size. Moreover, when they actually do execute a pen test and you get the report, oftentimes you lack the expertise within your team to quickly retest to verify that you've actually fixed the problem, and so you end up with pen test reports that are incomplete snapshots and quickly going stale. Then, to mitigate that problem, I tried using breach and attack simulation tools, and the struggle with those tools is: one, I had to install credentialed agents everywhere; two, I had to write my own custom attack scripts, which I didn't have much talent for but also had to maintain as my environment changed; and three, these types of tools were not safe to run against production systems, which were the majority of my attack surface. So that's why we went off to start Horizon3.
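The "low plus low equals critical" point above is easiest to see as a graph problem. Here is a deliberately toy sketch — not Horizon3's algorithm: each machine is a node, each individually "low" weakness is an edge an attacker could traverse, and a simple search shows how a chain of lows reaches a critical asset. All hosts, weaknesses, and severities are invented for the example.

```python
# Toy illustration of chaining individually low-severity weaknesses into a critical attack path.
# The hosts, weaknesses, and severities are invented for the example.
from collections import deque

# edge: (from_host, to_host, weakness, standalone_severity)
weaknesses = [
    ("workstation-7", "print-server", "default admin credentials", "low"),
    ("print-server", "file-server", "cached domain credential reuse", "low"),
    ("file-server", "domain-controller", "over-permissive admin group", "low"),
]
CROWN_JEWEL = "domain-controller"

graph = {}
for src, dst, weakness, sev in weaknesses:
    graph.setdefault(src, []).append((dst, weakness, sev))

def attack_path(start: str, target: str):
    """Breadth-first search for a chain of weaknesses from start to target."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        host, path = queue.popleft()
        if host == target:
            return path
        for nxt, weakness, sev in graph.get(host, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [f"{host} -> {nxt} ({weakness}, rated {sev})"]))
    return None

path = attack_path("workstation-7", CROWN_JEWEL)
if path:
    print("Individually 'low' findings chain into a critical path:")
    print("\n".join(path))
```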
Tony and I met when we were in Special Operations together, and the challenge we wanted to solve was: how do we do infrastructure security testing at scale, by giving the power of a 20-year pen testing veteran into the hands of an IT admin or a network engineer in just three clicks? The whole idea is that we enable these fixers, the blue team, to run node zero, our pen testing product, to quickly find problems in their environment; that blue team will then go off and fix the issues that were found, and then they can quickly rerun the attack to verify that they fixed the problem. And the whole idea is delivering this without requiring custom scripts to be developed, without requiring credentialed agents to be installed, and without requiring the use of external third-party consulting services or professional services: self-service pen testing to quickly drive find, fix, verify. There are three primary use cases that our customers use us for. The first is the SOC manager, who uses us to verify that their security tools are actually effective: to verify that they're logging the right data in Splunk or in their SIEM; to verify that their managed security services provider is able to quickly detect and respond to an attack, and to hold them accountable for their SLAs; or that the SOC understands how to quickly detect and respond, and to measure and verify that; or that the variety of tools you have in your stack — most organizations have 130-plus cybersecurity tools, none of which are designed to work together — are actually working together. The second primary use case is proactively hardening and verifying your systems. This is when the IT admin or the network engineer runs self-service pen tests to verify that their Cisco environment is installed, hardened, and configured correctly, or that their credential policies are set up right, or that their vCenter or WebSphere or Kubernetes environments are actually designed to be secure. What this allows the IT admins and network engineers to do is shift from running one or two pen tests a year to 30, 40, or more pen tests a month, and you can actually wire those pen tests into your devops process, or into your detection engineering and change management processes, to automatically trigger pen tests every time there's a change in your environment (a sketch of that kind of wiring follows below). The third primary use case is for those organizations lucky enough to have their own internal red team: they'll use node zero to do reconnaissance and exploitation at scale, and then use the output as a starting point for the humans to step in and focus on the really hard, juicy stuff that gets them on stage at DEF CON. So those are the three primary use cases.
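Here is a minimal sketch of the change-triggered idea mentioned in the second use case: a post-deploy hook that kicks off a scoped test and fails the pipeline if new critical attack paths appear. The `pentest.example.com` endpoint, token, and JSON fields below are hypothetical placeholders, not Horizon3's actual API.

```python
# Minimal sketch: trigger a scoped pen test from a CI/CD or change-management hook.
# The API endpoint, token, and payload/response fields are hypothetical placeholders.
import os
import sys
import time
import requests

API = "https://pentest.example.com/api/v1"            # hypothetical service
TOKEN = os.environ.get("PENTEST_API_TOKEN", "dummy")  # hypothetical credential
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def run_scoped_test(scope_cidr: str) -> dict:
    """Start a test limited to the subnet that just changed and poll until it finishes."""
    op = requests.post(f"{API}/tests", json={"scope": scope_cidr}, headers=HEADERS, timeout=30)
    op.raise_for_status()
    test_id = op.json()["id"]
    while True:
        status = requests.get(f"{API}/tests/{test_id}", headers=HEADERS, timeout=30).json()
        if status["state"] in ("done", "failed"):
            return status
        time.sleep(60)

if __name__ == "__main__":
    result = run_scoped_test("10.20.0.0/24")  # subnet touched by this deployment
    criticals = [f for f in result.get("findings", []) if f.get("severity") == "critical"]
    if criticals:
        print(f"{len(criticals)} critical attack paths introduced by this change")
        sys.exit(1)  # fail the pipeline so the change is reviewed before rollout
    print("No critical paths found for this change window")
```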
Now let's zoom into the find-fix-verify loop, because what I've found in my experience is that find, fix, verify is the future operating model for cybersecurity organizations. What I mean here is: in the find step, using continuous pen testing, you want to enable on-demand, self-service pen tests, and you want those pen tests to find attack paths at scale, spanning your on-prem infrastructure, your cloud infrastructure, and your perimeter — because attackers don't only stay in one place; they will find ways to chain together a perimeter breach and a credential from your on-prem environment to gain access to your cloud, or some other permutation. And the third part of continuous pen testing is that attackers don't focus on critical vulnerabilities anymore: they know we've built vulnerability management programs to reduce those vulnerabilities, so attackers have adapted, and what they do is chain together misconfigurations in your infrastructure, software, and applications with dangerous product defaults, with exploitable vulnerabilities, and with credentials collected through a mix of techniques, at scale. Once you've found those problems, the next question is what to do about it. You want to be able to prioritize fixing the problems that are actually exploitable in your environment and that truly matter — meaning they're going to lead to domain compromise, or domain user compromise, or access to your sensitive data. The second thing you want to fix is making sure you understand what risk your crown-jewels data is exposed to. Where is your crown-jewels data? Is it in the cloud, is it on-prem, has it been copied to a share drive that you weren't aware of? If a domain user were compromised, could they access that crown-jewels data? You want to be able to use the attacker's perspective to secure the critical data you have in your infrastructure. And then finally, as you fix these problems, you want to quickly remediate and retest to confirm that you've actually fixed the issue, and this find-fix-verify cycle becomes the accelerator that drives purple team culture. The third part is verify, and what you want to be able to do in the verify step is confirm that your security tools, processes, and people can effectively detect and respond to a breach. You want to integrate that into your detection engineering processes, so that you know you're catching with the right security rules, or that you've deployed the right configurations. You also want to make sure that your environment is adhering to the best practices around systems hardening and cyber resilience, and finally you want to be able to prove your security posture over time to your board, to your leadership, and to your regulators. So what I'll do now is zoom into each of these three steps. When we zoom into find, here's a first example using node zero and autonomous pen testing. What an attacker will do is find a way to break through the perimeter; in this example, it's very easy to misconfigure Kubernetes in a way that allows an attacker to gain remote code execution in your on-prem Kubernetes environment and break through the perimeter. From there, the attacker is going to conduct network reconnaissance and then find ways to gain code execution on other machines in the environment, and as they get code execution they start to dump credentials, collect a bunch of NTLM hashes, crack those hashes using open source and dark-web-available data, and then reuse those credentials to log in and laterally maneuver throughout the environment. As they laterally maneuver, they can reuse those credentials, use credential-spraying techniques, and so on, to compromise your business email or to log in as admin into your cloud. This is a very common attack, and rarely is a CVE actually needed to execute it: often it's just a misconfiguration in Kubernetes, with a bad credential policy or password policy, combined with bad practices of credential reuse across the organization.
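As one concrete, defensive illustration of the Kubernetes misconfiguration class mentioned above, the sketch below checks whether a node's kubelet API answers unauthenticated requests (anonymous auth combined with permissive authorization), which is one common way remote code execution into a cluster becomes possible. The node address is a placeholder; this is a hardening check to run against systems you own, not an exploitation tool, and it is not Horizon3's implementation.

```python
# Defensive hardening check (against systems you own): does the kubelet on this node
# answer unauthenticated requests? A 200 response to /pods without credentials suggests
# anonymous auth is enabled with permissive authorization -- a misconfiguration to fix.
import requests
import urllib3

urllib3.disable_warnings()  # kubelets often use self-signed certs in lab setups

NODE = "10.0.3.21"          # placeholder node address
KUBELET_PORT = 10250        # default kubelet API port

def kubelet_allows_anonymous(node: str) -> bool:
    try:
        resp = requests.get(f"https://{node}:{KUBELET_PORT}/pods", verify=False, timeout=5)
    except requests.RequestException:
        return False  # unreachable or refused; treat as not confirmed
    return resp.status_code == 200

if __name__ == "__main__":
    if kubelet_allows_anonymous(NODE):
        print(f"{NODE}: kubelet served /pods without authentication -- review anonymous-auth and authorization-mode")
    else:
        print(f"{NODE}: kubelet did not answer anonymously")
```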
Here's another example of an internal pen test, and this is from an actual customer. They had 5,000 hosts within their environment, they had EDR and UBA tools installed, and they initiated an internal pen test from a single machine. From that single initial access point, node zero enumerated the network, conducted reconnaissance, and found that five thousand hosts were accessible. What node zero does under the covers is organize all of that reconnaissance data into a knowledge graph that we call the cyber terrain map, and that cyber terrain map becomes the key data structure that we use to efficiently maneuver, attack, and compromise your environment. So node zero will try to find ways to get code execution, reuse credentials, and so on. In this customer example they had Fortinet installed as their EDR, but node zero was still able to get code execution on a Windows machine; from there it was able to successfully dump credentials, including sensitive credentials from the LSASS process on the Windows box, and then reuse those credentials to log in as domain admin in the network. And once an attacker becomes domain admin, they have the keys to the kingdom: they can do anything they want. So what happened here? Well, it turns out Fortinet was misconfigured on three out of 5,000 machines — bad automation — and the customer had no idea this had happened; they would have had to wait for an attacker to show up to realize it was misconfigured. The second question is: why didn't Fortinet stop the credential pivot and the lateral movement? It turned out the customer didn't buy the right modules or turn on the right services within that particular product, and we see this not only with Fortinet but with Trend Micro and all the other defensive tools, where it's very easy to miss a checkbox in the configuration that would do things like prevent credential dumping. The next story I'll tell you is: attackers don't have to hack in, they log in. In another infrastructure pen test, a typical technique attackers will take is man-in-the-middle attacks that collect hashes. What an attacker will do is leverage a tool or technique called Responder to collect NTLM hashes that are being passed around the network — there are a variety of reasons why these hashes are passed around, and it's a pretty common misconfiguration — and as an attacker collects those hashes, they start to apply techniques to crack them. They'll pass the hash, and from there they will use open source intelligence, common password structures and patterns, and other types of techniques to try to crack those hashes into clear-text passwords. Here, node zero automatically collected hashes, automatically passed the hashes and cracked the credentials, and then started to take the domain user IDs and passwords it had collected and try to access different services and systems in the enterprise. In this case, node zero was able to successfully gain access to the Office 365 email environment because three employees didn't have MFA configured. So now node zero has placement and access in the business email system, which sets up the conditions for fraud, lateral phishing, and other techniques. But what's especially insightful here is that 80 percent of the hashes collected in this pen test were cracked in 15 minutes or less. Eighty percent. Twenty-six percent of the user accounts had a password that followed a pretty obvious pattern: first initial, last initial, and four random digits. The other interesting thing is that 10 percent of service accounts had their user ID the same as their password: VMware admin/VMware admin, WebSphere admin/WebSphere admin, and so on and so forth. So attackers don't have to hack in; they just log in with credentials that they've collected.
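A defender can look for exactly the weak patterns called out above before an attacker does. This is a minimal audit sketch over a list of recovered or test passwords; the account data is invented, and the regex encodes just the "first initial + last initial + four digits" pattern and the username-equals-password case quoted in the example.

```python
# Minimal password-hygiene audit for the two weak patterns described above.
# Accounts and passwords here are invented sample data.
import re

accounts = [
    {"user": "jsmith",       "first": "Jane",  "last": "Smith", "password": "js4821"},
    {"user": "vmware-admin", "first": "",      "last": "",      "password": "vmware-admin"},
    {"user": "pkumar",       "first": "Priya", "last": "Kumar", "password": "Winter2024!"},
]

def weak_reasons(acct: dict) -> list:
    reasons = []
    if acct["password"].lower() == acct["user"].lower():
        reasons.append("password equals username (common on service accounts)")
    if acct["first"] and acct["last"]:
        pattern = rf"^{acct['first'][0]}{acct['last'][0]}\d{{4}}$"
        if re.match(pattern, acct["password"], re.IGNORECASE):
            reasons.append("first initial + last initial + 4 digits")
    return reasons

for acct in accounts:
    for reason in weak_reasons(acct):
        print(f"{acct['user']}: weak pattern -> {reason}")
```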
The next story is becoming AWS admin. In this example, once again an internal pen test, node zero gets initial access and discovers 2,000 hosts are network reachable from that environment. It fingerprints and organizes all of that data into the cyber terrain map, and from there it fingerprints that HP iLO, the integrated lights-out service, is running on a subset of hosts. iLO is a service that is often not instrumented or observed by security teams, nor is it easy to patch; as a result, attackers know this and immediately go after those types of services. In this case that iLO service was exploitable, and we were able to get code execution on it. iLO stores all the user IDs and passwords in clear text in a particular set of processes, so once we gained code execution we were able to dump all of the credentials, and then laterally maneuver to log in to the Windows box next door as admin. On that admin box we were able to gain access to the share drives, and we found a credentials file saved on a share drive. It turned out that credentials file was the AWS admin credentials file, giving us full admin authority to their AWS accounts. Not a single security alert was triggered in this attack, because the customer wasn't observing the iLO service, and every step thereafter was a valid login in the environment. So what do you do? Step one, patch the server. Step two, delete the credentials file from the share drive. And step three, get better instrumentation on privileged access users and logins. The final story I'll tell is a typical pattern that we see across the board, one that combines the various techniques I've described. An attacker is going to go off and use open source intelligence to find all of the employees that work at your company; from there they're going to look up those employees in dark web breach databases and other sources of information, and then use that as a starting point to password spray and compromise a domain user. All it takes is one employee to reuse a breached password for their corporate email, or all it takes is a single employee to have a weak password that's easily guessable — all it takes is one. And once the attacker is able to gain domain user access, in most shops the domain user is also the local admin on their laptop, and once you're local admin you can dump SAM and get local NTLM hashes; you can use those to reuse credentials and become local admin on neighboring machines, and attackers will rinse and repeat. Eventually they're able to get to a point where they can dump LSASS — by unhooking the antivirus, defeating the EDR, or finding a misconfigured EDR, as we've talked about earlier — to compromise the domain. And what's consistent is that the fundamentals are broken at these shops: they have poor password policies; they don't have least-privilege access implemented; Active Directory groups are too permissive, where the domain admin or domain user is also the local admin; AV or EDR solutions are misconfigured or easily unhooked; and so on. And what we found across 10,000 pen tests is that user behavior analytics tools never caught us in that lateral movement — in part because those tools require pristine logging data in order to work, and also because it becomes very difficult to find the baseline of normal versus abnormal credential login usage. Another interesting insight: there were several marquee, brand-name MSSPs defending our customers' environments, and for them it took seven hours to detect and respond to the pen test. Seven hours — and the pen test was over in less than two hours. So what you had was an egregious violation of the service level agreements that the MSSP had in place.
The customer was able to use us to get service credit and drive accountability of their SOC and of their provider. The third interesting thing is, in one case it took us seven minutes to become domain admin in a bank. That bank had every Gucci security tool you could buy, yet in 7 minutes and 19 seconds NodeZero started as an unauthenticated member of the network and was able to escalate privileges, through chaining misconfigurations, lateral movement and so on, to become domain admin. If it's seven minutes today, we should assume it'll be less than a minute a year or two from now, making it very difficult for humans to be able to detect and respond to that type of blitzkrieg attack. So that's the find. It's not just about finding problems, though; the bulk of the effort should be what to do about it, the fix and the verify. So as you find those problems, back to Kubernetes as an example, we will show you the path: here is the kill chain we took to compromise that environment. We'll show you the impact: here is the impact, or here's the proof of exploitation that we were able to use to compromise it, and there's the actual command that we executed, so you could copy and paste that command and compromise that kubelet yourself if you want. And then the impact is we got code execution, and we'll actually show you: here is the impact, this is a critical, here's why, it enabled perimeter breach. Affected applications will tell you the specific IPs where you've got the problem, how it maps to the MITRE ATT&CK framework, and then we'll tell you exactly how to fix it. We'll also show you what this problem enabled, so you can accurately prioritize why this is important or why it's not important. The next part is accurate prioritization. The hardest part of my job as a CIO was deciding what not to fix. So if you take SMB signing not required as an example, by default that CVSS score is a one out of 10, but this misconfiguration, and it's not a CVE, it's a misconfig, enabled an attacker to gain access to 19 credentials, including one domain admin, two local admins, and access to a ton of data. Because of that context, this is really a 10 out of 10.
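The SMB-signing example is really a small scoring function: a low base severity gets promoted once you account for what the weakness actually enabled downstream. The Python below is a hedged illustration of that idea, not Horizon3.ai's actual scoring model; the weights, field names, and thresholds are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    base_score: float                 # a CVSS-like 0-10 baseline
    creds_gained: int = 0             # credentials harvested downstream of this weakness
    domain_admin_reached: bool = False
    hosts_compromised: int = 0

def contextual_score(f: Finding) -> float:
    """Promote a finding's severity based on proven downstream impact."""
    score = f.base_score
    if f.creds_gained:
        score = max(score, 7.0)       # any credential exposure is at least "high"
    if f.hosts_compromised >= 3:
        score = max(score, 8.5)
    if f.domain_admin_reached:
        score = 10.0                  # a proven path to domain admin trumps everything
    return score

smb = Finding("SMB signing not required", base_score=1.0,
              creds_gained=19, domain_admin_reached=True)
print(contextual_score(smb))          # 10.0, despite the 1.0 baseline
```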
You'd better fix this as soon as possible. However, of the seven occurrences that we found, it's only a critical in three out of the seven, and these are the three specific machines, and we'll tell you the exact way to fix it, and you'd better fix these as soon as possible. For these four machines over here, these didn't allow us to do anything of consequence. So, because the hardest part is deciding what not to fix, you can justifiably choose not to fix these four issues right now and just add them to your backlog, and surge your team to fix these three as quickly as possible. And then once you fix these three, you don't have to re-run the entire pen test; you can select these three and then one-click verify and run a very narrowly scoped pen test that is only testing this specific issue, and what that creates is a much faster cycle of finding and fixing problems. The other part of fixing is verifying that you don't have sensitive data at risk. So once we become a domain user, we're able to use those domain user credentials and try to gain access to databases, file shares, S3 buckets, git repos and so on, and help you understand what sensitive data you have at risk. So in this example a green checkbox means we logged in as a valid domain user, we were able to get read-write access on the database, this is how many records we could have accessed, and we don't actually look at the values in the database, but we'll show you the schema so you can quickly characterize that PII data was at risk here. And we'll do that for your file shares and other sources of data, so now you can accurately articulate the data you have at risk and prioritize cleaning that data up, especially data that will lead to a fine or a big news issue. So that's the find, that's the fix; now we're going to talk about the verify. The key part in verify is embracing and integrating with detection engineering practices. So when you think about your layers of security tools, you've got lots of tools in place, on average 130 tools at any given customer, but these tools were not designed to work together. So when you run a pen test, what you want to do is say: did you detect us, did you log us, did you alert on us, did you stop us? And from there what you want to see is, okay, what are the techniques that are commonly used to defeat an environment, to actually compromise it? If you look at the top 10 techniques we use, and there's far more than just these 10, but these are the most often executed, nine out of ten have nothing to do with CVEs. It has to do with misconfigurations, dangerous product defaults, bad credential policies, and it's how we chain those together to become a domain admin or compromise a host. So what customers will do is, every single attacker command we executed is provided to you as an attack activity log, so you can actually see every single attacker command we ran, the timestamp it was executed, the hosts it executed on, and how it maps to the MITRE ATT&CK tactics. So our customers will have these attacker logs on one screen, and then they'll go look into Splunk or Exabeam or SentinelOne or CrowdStrike and say, did you detect us, did you log us, did you alert on us or not? And to make that even easier, take this example: hey Splunk, what logs did you see at this time on the VMware host? Because that's when NodeZero was able to dump credentials, and that allows you to identify and fix your logging blind spots.
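That question, "what did you see at this time on this host," can be asked mechanically for every attacker action. As a rough sketch of the idea, assuming you have exported the attacker log and your SIEM events into simple Python lists, the function below flags actions with no corresponding defensive telemetry in a surrounding window; the field names and the window size are illustrative assumptions.

```python
from datetime import timedelta

def logging_blind_spots(attacker_actions, siem_events, window=timedelta(minutes=5)):
    """Return attacker actions that have no SIEM event from the same host nearby in time.

    attacker_actions: [{"ts": datetime, "host": str, "command": str}, ...]
    siem_events:      [{"ts": datetime, "host": str}, ...]
    """
    # Index SIEM events by host so each lookup only scans that host's events.
    by_host = {}
    for ev in siem_events:
        by_host.setdefault(ev["host"], []).append(ev["ts"])

    blind_spots = []
    for action in attacker_actions:
        nearby = [
            ts for ts in by_host.get(action["host"], [])
            if abs(ts - action["ts"]) <= window
        ]
        if not nearby:                  # nothing was logged around this attacker step
            blind_spots.append(action)
    return blind_spots
```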
To make that easier we've got app integration. This is an actual Splunk app in the Splunk App Store, and what you can do is, inside the Splunk console itself, fire up the Horizon 3 NodeZero app. All of the pen test results are there, so you can see all of the results in one place and you don't have to jump out of the tool. And what we'll show you, as I skip forward, is: hey, there's a pen test, here are the critical issues that we've identified, for that weaker default issue here are the exact commands we executed, and then we will automatically query into Splunk all terms between these times on that endpoint that relate to this attack. So you can now quickly, within the Splunk environment itself, figure out that you're missing logs or that you're appropriately catching this issue, and that becomes incredibly important in that detection engineering cycle that I mentioned earlier. So how do our customers end up using us? They shift from running one pen test a year to 30 or 40 pen tests a month, oftentimes wiring us into their deployment automation to automatically run pen tests. The other part that they'll do is, as they run more pen tests they find more issues, but eventually they hit this inflection point where they're able to rapidly clean up their environment, and that inflection point is because the red and the blue teams start working together in a purple team culture, and now they're working together to proactively harden their environment. The other thing our customers will do is run us from different perspectives. They'll first start running an RFC 1918 scope to see, once the attacker gained initial access in a part of the network that had wide access, what could they do. And then from there they'll run us within a specific network segment: okay, from within that segment, could the attacker break out and gain access to another segment? Then they'll run us from their work-from-home environment: could they traverse the VPN and do something damaging, and once they're in, could they traverse the VPN and get into my cloud? Then they'll break in from the outside. All of these perspectives are available to you in Horizon 3 and NodeZero as a single SKU, and you can run as many pen tests as you want. If you run a phishing campaign and find that an intern in the finance department had the worst phishing behavior, you can then inject their credentials and actually show the end-to-end story of how an attacker phished, gained credentials of an intern, and used that to gain access to sensitive financial data. So what our customers end up doing is running multiple attacks from multiple perspectives and looking at those results over time. I'll leave you with two things. One is, what is the AI in Horizon3.ai? Those knowledge graphs are the heart and soul of everything that we do, and we use machine learning, reinforcement learning techniques, Markov decision models and so on to be able to efficiently maneuver and analyze the paths in those really large graphs. We also use context-based scoring to prioritize weaknesses, and we're also able to drive collective intelligence across all of the operations, so the more pen tests we run, the smarter we get, and all of that is based on our knowledge graph analytics infrastructure that we have. Finally, I'll leave you with this: this was my decision criteria when I was a buyer for my security testing strategy. What I cared about was coverage. I wanted to be able to assess my on-prem, cloud, perimeter and work-from-home environments, and be safe to run in production. I wanted to be able to do that as often as I wanted. I wanted to be able to run pen tests in hours or days, not weeks or months, so I could accelerate that find-fix-verify loop.
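The knowledge-graph framing above boils down to path-finding: hosts, credentials, and misconfigurations are nodes, and an attack is a walk through them toward a goal such as domain admin. The real engine is far more sophisticated, using reinforcement learning and Markov decision models, but as a toy illustration of the underlying graph idea, here is a breadth-first search over a hand-made attack graph; the graph contents are invented for the example.

```python
from collections import deque

# Toy attack graph: an edge u -> v means "from u an attacker can reach v".
attack_graph = {
    "phished intern creds": ["workstation-12"],
    "workstation-12": ["local admin (SAM dump)"],
    "local admin (SAM dump)": ["fileserver-03", "workstation-15"],
    "fileserver-03": ["svc_backup creds on share"],
    "svc_backup creds on share": ["domain admin"],
    "workstation-15": [],
}

def shortest_attack_path(graph, start, goal):
    """Breadth-first search returning one shortest path from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_attack_path(attack_graph, "phished intern creds", "domain admin"))
```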
I wanted my IT admins and network engineers with limited offensive experience to be able to run a pen test in a few clicks through a self-service experience, and not have to install agents and not have to write custom scripts. And finally, I didn't want to get nickel-and-dimed on having to buy different types of attack modules or different types of attacks; I wanted a single annual subscription that allowed me to run any type of attack as often as I wanted, so I could look at my trends and directions over time. So I hope you found this talk valuable. We're easy to find, and I look forward to seeing you use the product and letting our results do the talking. When you look at, you know, kind of the way our pen testing algorithms work, we dynamically select how to compromise an environment based on what we've discovered, and the goal is to become a domain admin, compromise a host, compromise domain users, find ways to encrypt data, steal sensitive data and so on. But when you look at the top 10 techniques that we ended up using to compromise environments, the first nine have nothing to do with CVEs, and that's the reality. CVEs are, yes, a vector, but less than two percent of CVEs are actually used in a compromise. Oftentimes it's some sort of credential collection, credential cracking, credential pivoting, and using that to become an admin and then compromising environments from that point on. So I'll leave this up for you to kind of read through, and you'll have the slides available for you, but I found it very insightful that organizations, and ourselves when I was at GE included, invested heavily in just standard vulnerability management programs. When I was at DOD, that's all DISA cared about asking us about, our CVE posture. But the attackers have adapted to not rely on CVEs to get in, because they know that organizations are actively looking at and patching those CVEs, and instead they're chaining together credentials from one place with misconfigurations and dangerous product defaults in another to take over an environment. A concrete example: by default vCenter backups are not encrypted, and so if an attacker finds vCenter, what they'll do is find the backup location, and there are specific vCenter MTD files where the admin credentials are stored in the binaries. So you can actually, as an attacker, find the right MTD file, parse out the binary, and now you've got the admin credentials for the vCenter environment and can start to log in as admin. There's a bad habit by signal officers and signal practitioners in the Army and elsewhere where the VM notes section of a virtual image has the password for the VM. Well, those VM notes are not stored encrypted, and attackers know this, and they're able to go off and find the VMs that are unencrypted, find the notes section, and pull out the passwords for those images and then reuse those credentials across the board. So I'll pause here, and, you know, Patrick, I'd love to get some commentary on these techniques and other things that you've seen, and what we'll do in the last, say, 10 to 15 minutes is roll through a little bit more on what do you do about it.
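The VM-notes habit described above is easy to audit defensively once you have exported each VM's annotation text, for example via the vSphere API. The sketch below is a rough illustration of that audit in plain Python over an already-exported mapping of VM name to notes; the regexes and the export step are assumptions, not a complete tool.

```python
import re

# Assume the notes have already been exported, e.g. {"vm-name": "annotation text", ...}.
PASSWORD_HINTS = [
    re.compile(r"pass(word|wd)?\s*[:=]\s*\S+", re.IGNORECASE),
    re.compile(r"pwd\s*[:=]\s*\S+", re.IGNORECASE),
    re.compile(r"root\s*/\s*\S+"),          # "root/SomeSecret" style shorthand
]

def flag_vms_with_credentials(vm_notes: dict) -> dict:
    """Return {vm_name: [matching snippets]} for notes that look like they embed credentials."""
    findings = {}
    for vm, notes in vm_notes.items():
        hits = []
        for pattern in PASSWORD_HINTS:
            hits.extend(m.group(0) for m in pattern.finditer(notes or ""))
        if hits:
            findings[vm] = hits
    return findings

example = {
    "db-prod-01": "Rebuilt 2021. password: Winter2021!",
    "web-03": "Owned by app team, no secrets here.",
}
print(flag_vms_with_credentials(example))   # flags db-prod-01
```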
Yeah, no, I love it. I think this is pretty exhaustive. What I like about what you've done here is, you know, we've seen double-digit increases in the number of organizations that are reporting actual breaches year over year for the last three years, and often, kind of in the zeitgeist, we pegged that on ransomware, which of course is incredibly important and very top of mind. But what I like about what you have here is we're reminding the audience that the attack surface area, the vectors that matter, have to be more comprehensive than just thinking about ransomware scenarios. Yeah, right on. So let's build on this. When you think about your defense in depth, you've got multiple security controls that you've purchased and integrated, and you've got that redundancy if a control fails, but the reality is that these security tools aren't designed to work together. So when you run a pen test, what you want to ask yourself is: did you detect NodeZero, did you log NodeZero, did you alert on NodeZero, and did you stop NodeZero? And when you think about how to do that, every single attacker command executed by NodeZero is available in an attacker log, so you can now see, you know, at the bottom here, a vCenter exploit at that time on that IP and how it aligns to MITRE ATT&CK. What you want to be able to do is go figure out, did your security tools catch this or not? And that becomes very important in using the attacker's perspective to improve your defensive security controls. And so the way we've tried to make this easier, back to, you know, I bleed green in many ways still from my Splunk background, is what our customers do: hey, we'll look at the attacker logs on one screen, and they'll look at what Splunk saw or missed in another screen, and then they'll use that to figure out what their logging blind spots are. And where that becomes really interesting is we've actually built out an integration into Splunk, where there's a Splunk app you can download off of Splunkbase, and you'll get all of the pen test results right there in the Splunk console. And from that Splunk console you're going to be able to see these are all the pen tests that were run, these are the issues that were found, so you can look at that particular pen test, here are all of the weaknesses that were identified for that particular pen test and how they categorize out. For each of those weaknesses you can click on any one of them, the criticals in this case, and then we'll tell you for that weakness, and this is where the punch line comes in, so I'll pause the video here: for that weakness, these are the commands that were executed on these endpoints at this time, and then we'll actually query Splunk for that IP address, or containing that IP, and these are the source types that surfaced any sort of activity. So what we try to do is help you as quickly and efficiently as possible identify the logging blind spots in your Splunk environment based on the attacker's perspective. So as this video kind of plays through, you can see it. Patrick, I'd love to get your thoughts, just seeing so many Splunk deployments and the effectiveness of those deployments, and how this is going to help really elevate the effectiveness of all of your Splunk customers. Yeah, I'm super excited about this. I mean, I think these kinds of purpose-built integrations really move the needle for our customers. I mean, at the end of the day, when I think about the power of Splunk, I think about a product I was first introduced to 12 years ago that was an on-prem piece of software, you know, and at the time it sold on sort of perpetual and term licenses, but what made it special was that it could eat data at a speed that nothing else I'd ever seen could. You can ingest massively scalable amounts of data,
it did cool things like schema-on-read which facilitated that, there was this language called SPL that you could nerd out about, and you went to a conference once a year and you talked about all the cool things you were splunking, right? But now, as we think about the next phase of our growth, we live in a heterogeneous environment where our customers have so many different tools and data sources that are ever expanding, and as you look at the role of the CISO, it's mind-blowing to me the amount of sources, services, and apps that are coming into the CISO's, let's just call it, span of influence in the last three years. You know, we're seeing things like infrastructure-service-level visibility, application performance monitoring, stuff that just never made sense for the security team to have visibility into, at least not at the size and scale which we're demanding today. And that's different, and this is why it's so important that we have these joint purpose-built integrations that really provide more prescription to our customers about how they walk on that journey towards maturity. What does zero to one look like? What does one to two look like? Whereas, you know, 10 years ago customers were happy with platforms, today they want integration, they want solutions, and they want to drive outcomes, and I think this is a great example of how together we are stepping up to the evolving nature of the market and also the ever-evolving nature of the threat landscape and, what I would say is, the maturing needs of the customer in that environment. Yeah, for sure. I think especially if we all anticipate budget pressure over the next 18 months due to the economy and elsewhere, while the security budgets are not going to get cut, I don't think, they're not going to grow as fast, and there's a lot more pressure on organizations to extract more value from their existing investments as well as extracting more value and more impact from their existing teams. And so security effectiveness, fierce prioritization, and automation I think become the three key themes of security over the next 18 months. So what I'll do very quickly is run through a few other use cases. Every host that we identified in the pen test we're able to score and say: this host allowed us to do something significant, therefore it's really critical, you should be increasing your logging here; hey, these hosts down here, we couldn't really do anything with as an attacker, so if you do have to make trade-offs, you can make some trade-offs of your logging resolution at the lower end in order to increase logging resolution on the upper end. So you've got that level of justification for where to increase or adjust your logging resolution. Another example is every host we've discovered as an attacker we expose, and you can export, and what we want to make sure is that every host we found as an attacker is being ingested from a Splunk standpoint. A big issue I had as a CIO and user of Splunk and other tools is I had no idea if there were rogue Raspberry Pis on the network, or if a new box was installed and whether Splunk was installed on it or not. So now you can quickly start to correlate what hosts did we see and how does that reconcile with what you're logging from.
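That reconciliation between "hosts the attacker could see" and "hosts the SIEM is actually ingesting from" is, at its simplest, a set difference. As a rough sketch, assuming you can export both lists, the Python below flags the gaps; the host identifiers are invented for the example.

```python
def reconcile_hosts(discovered_hosts, logging_hosts):
    """Compare hosts found during the pen test against hosts the SIEM is ingesting from."""
    discovered = {h.lower() for h in discovered_hosts}
    logging = {h.lower() for h in logging_hosts}
    return {
        "not_logging": sorted(discovered - logging),      # visible to an attacker, invisible to the SIEM
        "unknown_to_pentest": sorted(logging - discovered),
    }

# Example: a rogue Raspberry Pi shows up in discovery but never in the index.
print(reconcile_hosts(
    discovered_hosts={"web-01", "db-02", "raspberrypi-lab"},
    logging_hosts={"web-01", "db-02", "hr-laptop-7"},
))
```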
Finally, or second-to-last use case here on the Splunk integration side: for every single problem we've found, we give multiple options for how to fix it. This becomes a great way to prioritize what fix actions to automate in your SOAR platform, and what we want to get to eventually is being able to automatically trigger SOAR actions to fix well-known problems, like automatically invalidating poor passwords in our credential findings, amongst a whole bunch of other things we could go off and do. And then finally, if there is a well-known kill chain or attack path, one of the things I really wish I could have done when I was a Splunk customer was take this type of kill chain, one that actually shows a path to domain admin that I'm sincerely worried about, and use it as a glass table over which I could start to layer possible indicators of compromise. And now you've got a great starting point for glass tables and IOCs for actual kill chains that we know are exploitable in your environment, and that becomes some super cool integrations that we've got on the roadmap between us and the Splunk security side of the house. So what I'll leave with, actually, Patrick, before I do that, you know, I'd love to get your comments, and then I'll kind of leave with one last slide on this wartime security mindset, assuming there are no other questions. No, I love it. I mean, I think this kind of glass tables approach to how you sort of visualize these workflows, and then use things like SOAR and orchestration and automation to operationalize them, is exactly where we see all of our customers going, and getting away from, I think, an over-engineered approach to SOAR, where it has to be super technical-heavy with, you know, Python programmers, and getting more to this visual view of workflow creation that really demystifies the power of automation and also democratizes it, so you don't have to have these programming languages in your resume in order to start really moving the needle on workflow creation, policy enforcement, and ultimately driving automation coverage across more and more of the workflows that your team is seeing. Yeah, I think that between us being able to visualize the actual kill chain or attack path, and, you know, the SOAR market I think going towards this no-code, low-code, configurable SOAR versus coded SOAR, that's going to really be a game changer in giving security teams a force multiplier. So what I'll leave you with is this: a peacetime mindset of security is no longer sustainable. We really have to get out of checking the box and then waiting for the bad guys to show up to verify that security tools are working or not. And the reason why we've got to really do that quickly is there are over a thousand companies that withdrew from the Russian economy over the past nine months due to the Ukrainian war. You should expect every one of them to be punished by the Russians for leaving, and punished from a cyber standpoint. And this is no longer about financial extortion, that is ransomware; this is about punishing and destroying companies. And you can punish any one of these companies by going after them directly or by going after their suppliers and their distributors. So suddenly your attack surface is no longer just your own enterprise, it's how you bring your goods to market and it's how you get your goods created, because while I may not be able to disrupt your ability to harvest fruit, if I can get those trucks stuck at the border I can increase spoilage and have the same effect. And what we should expect to see is this idea of cyber-enabled economic warfare where, if we issue a sanction like banning the Russians from traveling, there is a cyber-enabled
counterpunch, which is: corrupt and destroy the American Airlines database. That is below the threshold of war, that's not going to trigger the 82nd Airborne to be mobilized, but it's going to achieve the right effect. Ban the sale of luxury goods? Disrupt the supply chain and create shortages. Ban Russian oil and gas? Attack refineries to cause a 10x spike in gas prices three days before the election. This is the future, and therefore I think what we have to do is shift towards a wartime mindset, which is: don't trust your security posture, verify it; see yourself through the eyes of the attacker; build that incident response muscle memory; and drive better collaboration between the red and the blue teams, your suppliers and distributors, and the information sharing organizations you have in place. And what was really valuable for me as a Splunk customer was, when a router crashes, at that moment you don't know if it's due to an IT administration problem or an attacker, and what you want to have are different people asking different questions of the same data. You want to have that integrated triage process of an IT lens to that problem and a security lens to that problem, and then from there figuring out, is this an IT workflow to execute or a security incident to execute, and you want to have all of that as an integrated team, integrated process, integrated technology stack. And this is something that I cared very deeply about as both a Splunk customer and a Splunk CTO, and that I see time and time again across the board. So Patrick, I'll leave you with the last word, the final three minutes here, and I don't see any open questions, so please take us home. Oh man. See, you'd think, we spent hours and hours prepping for this together, and that last 40 seconds of your talk track is probably one of the things I'm most passionate about in this industry right now. And I think NIST has done some really interesting work here around building cyber resilient organizations that has really, I think, helped the industry see that incidents can come from adverse conditions, you know, stress or performance taxation in the infrastructure, service, or app layer, and they can come from malicious compromises, insider threats, external threat actors. And the more that we look at this from the perspective of a broader cyber resilience mission, in a wartime mindset, I think we're going to be much better off. And as you talk about with operationally minded ISACs, information sharing and intelligence sharing become so important in these wartime situations, and you know, we know not all ISACs are created equal, but we're also seeing a lot more ad hoc information sharing groups popping up. So look, I think you framed it really, really well. I love the concept of the wartime mindset, and I like the idea of applying a cyber resilience lens, like if you have one more layer on top of that bottom-right cake, you know, I think the IT lens and the security lens roll up to this concept of cyber resilience, and I think NIST has done some great work there for us. Yeah, you're spot on, and that is apt, and that's going to, I think, be the next terrain that you're going to see vendors try to get after, but one that I think Splunk is best positioned to win. Okay, that's a wrap for this special Cube presentation. You heard all about the global expansion of Horizon3.ai's partner program, where their partners have a unique opportunity to take advantage of their NodeZero product,
international go-to-market expansion, North America channel partnerships, and just overall relationships with companies like Splunk to make things more comprehensive in this disruptive cybersecurity world we live in. Hope you enjoyed this program. All the videos are available on thecube.net, and check out Horizon3.ai for their pen test automation and ultimately their defense system that they use for always testing the environment that you're in. Great innovative product, and I hope you enjoyed the program. Again, I'm John Furrier, host of theCube. Thanks for watching.
SUMMARY :
that's the sort of stuff that we do you
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Patrick Coughlin | PERSON | 0.99+ |
Jennifer Lee | PERSON | 0.99+ |
Chris | PERSON | 0.99+ |
Tony | PERSON | 0.99+ |
2013 | DATE | 0.99+ |
Raina Richter | PERSON | 0.99+ |
Singapore | LOCATION | 0.99+ |
Europe | LOCATION | 0.99+ |
Patrick | PERSON | 0.99+ |
Frankfurt | LOCATION | 0.99+ |
John | PERSON | 0.99+ |
20-year | QUANTITY | 0.99+ |
hundreds | QUANTITY | 0.99+ |
AWS | ORGANIZATION | 0.99+ |
20 years | QUANTITY | 0.99+ |
seven minutes | QUANTITY | 0.99+ |
95 | QUANTITY | 0.99+ |
Ford | ORGANIZATION | 0.99+ |
2.7 billion | QUANTITY | 0.99+ |
March | DATE | 0.99+ |
Finland | LOCATION | 0.99+ |
seven hours | QUANTITY | 0.99+ |
sixty percent | QUANTITY | 0.99+ |
John Furrier | PERSON | 0.99+ |
Sweden | LOCATION | 0.99+ |
John Furrier | PERSON | 0.99+ |
six weeks | QUANTITY | 0.99+ |
seven hours | QUANTITY | 0.99+ |
19 credentials | QUANTITY | 0.99+ |
ten dollars | QUANTITY | 0.99+ |
Jennifer | PERSON | 0.99+ |
5 000 hosts | QUANTITY | 0.99+ |
Horizon 3 | TITLE | 0.99+ |
Wednesday | DATE | 0.99+ |
30 | QUANTITY | 0.99+ |
eight | QUANTITY | 0.99+ |
Asia Pacific | LOCATION | 0.99+ |
American Airlines | ORGANIZATION | 0.99+ |
Deloitte | ORGANIZATION | 0.99+ |
three licenses | QUANTITY | 0.99+ |
two companies | QUANTITY | 0.99+ |
2019 | DATE | 0.99+ |
European Union | ORGANIZATION | 0.99+ |
six | QUANTITY | 0.99+ |
seven occurrences | QUANTITY | 0.99+ |
70 | QUANTITY | 0.99+ |
three people | QUANTITY | 0.99+ |
Horizon 3.ai | TITLE | 0.99+ |
ATT | ORGANIZATION | 0.99+ |
Net Zero | ORGANIZATION | 0.99+ |
Splunk | ORGANIZATION | 0.99+ |
Uber | ORGANIZATION | 0.99+ |
five | QUANTITY | 0.99+ |
less than two percent | QUANTITY | 0.99+ |
less than two hours | QUANTITY | 0.99+ |
2012 | DATE | 0.99+ |
UK | LOCATION | 0.99+ |
Adobe | ORGANIZATION | 0.99+ |
four issues | QUANTITY | 0.99+ |
Department of Defense | ORGANIZATION | 0.99+ |
next year | DATE | 0.99+ |
three steps | QUANTITY | 0.99+ |
node 0 | TITLE | 0.99+ |
15 minutes | QUANTITY | 0.99+ |
hundred percent | QUANTITY | 0.99+ |
node zero | TITLE | 0.99+ |
10x | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
7 minutes | QUANTITY | 0.99+ |
one license | QUANTITY | 0.99+ |
second thing | QUANTITY | 0.99+ |
thousands of hosts | QUANTITY | 0.99+ |
five thousand hosts | QUANTITY | 0.99+ |
next week | DATE | 0.99+ |
Jen Huffstetler, Intel | HPE Discover 2022
>> Announcer: theCube presents HPE Discover 2022 brought to you by HPE. >> Hello and welcome back to theCube's continuous coverage HPE Discover 2022 and from Las Vegas the formerly Sands Convention Center now Venetian, John Furrier and Dave Vellante here were excited to welcome in Jen Huffstetler. Who's the Chief product Sustainability Officer at Intel Jen, welcome to theCube thanks for coming on. >> Thank you very much for having me. >> You're really welcome. So you dial back I don't know, the last decade and nobody really cared about it but some people gave it lip service but corporations generally weren't as in tune, what's changed? Why has it become so top of mind? >> I think in the last year we've noticed as we all were working from home that we had a greater appreciation for the balance in our lives and the impact that climate change was having on the world. So I think across the globe there's regulations industry and even personally, everyone is really starting to think about this a little more and corporations specifically are trying to figure out how are they going to continue to do business in these new regulated environments. >> And IT leaders generally weren't in tune cause they weren't paying the power bill for years it was the facilities people, but then they started to come together. How should leaders in technology, business tech leaders, IT leaders, CIOs, how should they be thinking about their sustainability goals? >> Yeah, I think for IT leaders specifically they really want to be looking at the footprint of their overall infrastructure. So whether that is their on-prem data center, their cloud instances, what can they do to maximize the resources and lower the footprint that they contribute to their company's overall footprint. So IT really has a critical role to play I think because as you'll find in IT, the carbon footprint of the data center of those products in use is actually it's fairly significant. So having a focus there will be key. >> You know compute has always been one of those things where, you know Intel's been makes chips so that, you know heat is important in compute. What is Intel's current goals? Give us an update on where you guys are at. What's the ideal goal in the long term? Where are you now? You guys always had a focus on this for a long, long time. Where are we now? Cause I won't say the goalpost of changed, they're changing the definitions of what this means. What's the current state of Intel's carbon footprint and overall goals? >> Yeah, no thanks for asking. As you mentioned, we've been invested in lowering our environmental footprint for decades in fact, without action otherwise, you know we've already lowered our carbon footprint by 75%. So we're really in that last mile. And that is why when we recently announced a very ambitious goal Net-Zero 2040 for our scope one and two for manufacturing operations, this is really an industry leading goal. And partly because the technology doesn't even exist, right? For the chemistries and for making the silicon into the sand into, you know, computer chips yet. And so by taking this bold goal, we're going to be able to lead the industry, partner with academia, partner with consortia, and that drive is going to have ripple effects across the industry and all of the components in semiconductors. >> Is there a changing definition of Net-Zero? 
What that means, cause some people say they're Net-Zero and maybe in one area they might be but maybe holistically across the company as it becomes more of a broader mandate society, employees, partners, Wall Street are all putting pressure on companies. Has the Net-Zero conversation changed a little bit or what's your view on that? >> I think we definitely see it changing with changing regulations like those coming forth from the SEC here in the US and in Europe. Net-Zero can't just be lip service anymore right? It really has to be real reductions in your footprint. And we say "than otherwise," even including in our supply chain goals where we've taken new goals to reduce, but our operations are growing. So I think everybody is going through this realization that you know, with the growth, how do we keep it lower than it would've been otherwise, keep focusing on those reductions and have not just renewable credits that could have been bought in one location and applied to a different geographical location but real credible offsets for where the product's manufactured or the compute's deployed. >> Jen, when you talk about you've reduced already by 75% you're on that last mile. We listened to Pat Gelsinger very closely up until recently he was the number one most frequently had on theCube guest. He's been busy I guess. But as you apply that discipline to where you've been, your existing business and now Pat's laid out this plan to increase the Foundry business how does that affect your... Are you able to carry through that reduction to, you know, the new foundries? Do you have to rethink that? How does that play in? >> Certainly, well, the Foundry expansion of our business with IDM 2.0 is going to include the existing factories that already have the benefit of those decades of investment and focus. And then, you know we have clear goals for our new factories in Ohio, in Europe to achieve goals as well. That's part of the overall plan for Net-Zero 2040. It's inclusive of our expansion into Foundry which means that many, many more customers are going to be able to benefit from the leadership that Intel has here. And then as we onboard acquisitions as any company does we need to look at the footprint of the acquisition and see what we can do to align it with our overall goals. >> Yeah so sustainable IT I don't know for some reason was always an area of interest to me. And when we first started, even before I met you, John we worked with PG&E to help companies get rebates for installing technologies that would reduce their carbon footprint. >> Jen: Very forward thinking. >> And it was a hard thing to get, you know, but compute was the big deal. And there were technologies and I remember virtualization at the time was one and we would go in and explain to the PG&E engineers how that all worked. Cause they had metrics and that they wanted to see, but anyway, so virtualization was clearly one factor. What are the technologies today that people should be paying attention to? Flash storage was another one. >> John: AI's going to have a big impact. >> Reduce the spinning disk, but what are the ones today that are going to have an impact? >> Yeah, no, that's a great question. We like to think of the built in acceleration that we have including some of the early acceleration for virtualization technologies as foundational. So built in accelerated compute is green compute and it allows you to maximize the utilization of the transistors that you already have deployed in your data center.
This compute is sitting there and it is ready to be used. What matters most is what you were talking about, John, that real-world workload performance. And it's not just, you know, a lot of specsmanship around synthetic benchmarks, but AI performance: with the built in acceleration that we have in Xeon processors with Intel DL Boost, we're able to achieve 4x the AI performance per watt without, you know, doing that otherwise. You think about the consolidation you were talking about that happened with virtualization. You're basically effectively doing the same thing with these built in accelerators that we have continued to add over time and have even more coming in our Sapphire generation. >> And you call that green compute? Or what does that mean, green compute? >> Well, you are greening your compute. >> John: Okay got it. >> By increasing utilization of your resources. If you're able to deploy AI, utilize the telemetry within the CPU that already exists. We have customers: KDDI in Japan has a great proof point that they already announced on their 5G data center, they lowered their data center power by 20%. That is real bottom line impact as well as carbon footprint impact by utilizing all of those built in capabilities. So, yeah. >> We've heard some stories earlier in the event here at Discover where there were some cooling innovations moving the heat to power towns and cities. So you start to see, and you guys have been following this data center and been part of the whole, okay and hot climates, you have cold climates, but there's new ways to recycle energy where's that cause that sounds very Sci-Fi to me that oh yeah, the whole town runs on the data center exhaust. So there's now systems thinking around compute. What's your reaction to that? What's the current view on re-engineering a system to take advantage of that energy or recycling? >> I think when we look at our vision of sustainable compute over this horizon it's going to be required, right? We know that compute helps to solve society's challenges and the demand for it is not going away. So how do we take new innovations looking at a systems level as compute gets further deployed at the edge, how do we make it efficient? How do we ensure that that compute can be deployed where there is air pollution, right? So some of these technologies that you have, they not only enable reuse but they also enable some, you know, closing in of the solution to make it more robust for edge deployments. It'll allow you to place your data center wherever you need it. It no longer needs to reside in one place. And then that's going to allow you to have those energy reuse benefits either into district heating if you're in, you know, Northern Europe, or there's examples with folks putting greenhouses right next to a data center to start growing food in what were previously food deserts. So I don't think it's science fiction. It is how we need to rethink as a society to utilize everything we have, the tools at our hand.
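The point about using telemetry the CPU already exposes can be made concrete: on many Linux hosts, Intel's RAPL energy counters are surfaced through the powercap sysfs interface, and a few lines of Python can estimate package power draw from them. This is a rough sketch assuming a Linux machine with the intel-rapl powercap driver present and permission to read the counters; the sysfs path layout can vary by platform.

```python
import time
from pathlib import Path

# Package 0 energy counter in microjoules (often requires root to read).
RAPL_ENERGY = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def package_power_watts(interval_s: float = 1.0) -> float:
    """Estimate average package power over an interval from the RAPL energy counter."""
    e0 = int(RAPL_ENERGY.read_text())
    time.sleep(interval_s)
    e1 = int(RAPL_ENERGY.read_text())
    delta_uj = e1 - e0
    if delta_uj < 0:                      # the counter wrapped around its maximum value
        max_uj = int((RAPL_ENERGY.parent / "max_energy_range_uj").read_text())
        delta_uj += max_uj
    return delta_uj / 1e6 / interval_s    # microjoules -> joules -> watts

if __name__ == "__main__":
    print(f"~{package_power_watts():.1f} W package power")
```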
>> I think if I was going to pick the one most impactful thing that they could do in their infrastructure is it's back to John's comment. It's imagine if the world deployed AI, all the benefits not only in business outcomes, you know the revenue, lowering the TCO, but also lowering the footprint. So I think that's the one thing they could do. If I could throw in a baby second, it would be really consider how you get renewable energy into your computing ecosystem. And then you know, at Intel, when we're 80% renewable power, our processors are inherently low carbon because of all the work that we've done others have less than 10% renewable energy. So you want to look for products that have low carbon by design, any Intel based system and where you can get renewables from your grid to ask for it, run your workload there. And even the next step to get to sustainable computing it's going to take everyone, including every enterprise to think differently and really you know, consider what would it look like to bring renewables onto my site? If I don't have access through my local utility and many customers are really starting to evaluate that. >> Well Jen its great to have you on theCube. Great insight into the current state of the art of sustainability and carbon footprint. My final question for you is more about the talent out there. The younger generation coming in I'll say the pressure, people want to work for a company that's mission driven we know that, the Wall Street impact is going to be financial business model and then save the planet kind of pressure. So there's a lot of talent coming in. Is there awareness at the university level? Is there a course where can, do people get degrees in sustainability? There's a lot of people who want to come into this field what are some of the talent backgrounds of people learning or who might want to be in this field? What would you recommend? How would you describe how to onboard into the career if they want to contribute? What are some of those factors? Cause it's not new, new, but it's going to be globally aware. >> Yeah well there certainly are degrees with focuses on sustainability maybe to look at holistically at the enterprise, but where I think the globe is really going to benefit, we didn't really talk about the software inefficiency. And as we delivered more and more compute over the last few decades, basically the programming languages got more inefficient. So there's at least 35% inefficiency in the software. So being a software engineer, even if you're not an AI engineer. So AI would probably be the highest impact being a software engineer to focus on building new applications that are going to be efficient applications that they're well utilizing the transistor that they're not leaving zombie you know, services running that aren't being utilized. So I actually think-- >> So we got a program in assembly? (all laughing) >> (indistinct), would get really offended. >> Get machine language. I have to throw that in sorry. >> Maybe not that bad. (all laughing) >> That's funny, just a joke. But the question is what's my career path. What's a hot career in this area? Sustainability, AI totally see that. Anything else, any other career opportunities you see or hot jobs or hot areas to work on? 
>> Yeah, I mean, just really, I think it takes every architect, every engineer to think differently about their design, whether it's the design of a building or the design of a processor or a motherboard we have a whole low carbon architecture, you know, set of actions that are we're underway that will take to the ecosystem. So it could really span from any engineering discipline I think. But it's a mindset with which you approach that customer problem. >> John: That system thinking, yeah. >> Yeah sustainability designed in. Jen thanks so much for coming back in theCube, coming on theCube. It's great to have you. >> Thank you. >> All right. Dave Vellante for John Furrier, we're sustaining theCube. We're winding down day three, HPE Discover 2022. We'll be right back. (upbeat music)
SUMMARY :
brought to you by HPE. the formerly Sands Convention I don't know, the last decade and the impact that climate but then they started to come together. and lower the footprint What's the ideal goal in the long term? into the sand into, you but maybe holistically across the company that you know, with the growth, to where you've been, that already have the benefit to help companies get rebates at the time was one and it is ready to be used. the CPU that already exists. and been part of the whole, And then that's going to allow you And the gentleman comes on, The one thing to affect And even the next step to to have you on theCube. that are going to be would get really offended. I have to throw that in sorry. Maybe not that bad. But the question is what's my career path. or the design of a It's great to have you. Dave Vellante for John Furrier,
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Jen Huffstetler | PERSON | 0.99+ |
John | PERSON | 0.99+ |
Dave Vellante | PERSON | 0.99+ |
Dave | PERSON | 0.99+ |
Ohio | LOCATION | 0.99+ |
Europe | LOCATION | 0.99+ |
PG&E | ORGANIZATION | 0.99+ |
US | LOCATION | 0.99+ |
80% | QUANTITY | 0.99+ |
Japan | LOCATION | 0.99+ |
Pat Gelsinger | PERSON | 0.99+ |
Las Vegas | LOCATION | 0.99+ |
Jen | PERSON | 0.99+ |
SEC | ORGANIZATION | 0.99+ |
75% | QUANTITY | 0.99+ |
last year | DATE | 0.99+ |
Two | QUANTITY | 0.99+ |
John Furrier | PERSON | 0.99+ |
three | QUANTITY | 0.99+ |
Northern Europe | LOCATION | 0.99+ |
one factor | QUANTITY | 0.99+ |
HPE | ORGANIZATION | 0.98+ |
Pat | PERSON | 0.98+ |
Intel | ORGANIZATION | 0.98+ |
one | QUANTITY | 0.98+ |
one location | QUANTITY | 0.98+ |
20% | QUANTITY | 0.98+ |
two | QUANTITY | 0.98+ |
one thing | QUANTITY | 0.97+ |
first | QUANTITY | 0.97+ |
Net-Zero | ORGANIZATION | 0.96+ |
one place | QUANTITY | 0.96+ |
DL Boost | COMMERCIAL_ITEM | 0.96+ |
last decade | DATE | 0.95+ |
today | DATE | 0.93+ |
decades | QUANTITY | 0.92+ |
day three | QUANTITY | 0.9+ |
one area | QUANTITY | 0.9+ |
East Coast | LOCATION | 0.9+ |
KDDI | ORGANIZATION | 0.89+ |
Discover | ORGANIZATION | 0.88+ |
less than 10% renewable | QUANTITY | 0.86+ |
Wall Street | LOCATION | 0.86+ |
Sands Convention Center | LOCATION | 0.84+ |
theCube | ORGANIZATION | 0.83+ |
four X | QUANTITY | 0.82+ |
Wall | ORGANIZATION | 0.82+ |
least 35% | QUANTITY | 0.75+ |
Chief | PERSON | 0.75+ |
IBM 2.0 | ORGANIZATION | 0.74+ |
Sustainability Officer | PERSON | 0.72+ |
last few decades | DATE | 0.69+ |
second | QUANTITY | 0.63+ |
Net-Zero 2040 | TITLE | 0.62+ |
Generation | COMMERCIAL_ITEM | 0.6+ |
HPE Discover 2022 | COMMERCIAL_ITEM | 0.55+ |
2022 | COMMERCIAL_ITEM | 0.55+ |
every engineer | QUANTITY | 0.54+ |
5G | QUANTITY | 0.54+ |
-Zero | OTHER | 0.54+ |
HPE | COMMERCIAL_ITEM | 0.48+ |
Street | LOCATION | 0.47+ |
Kriss Dieglmeier, Splunk | Splunk .conf21
okay welcome back to thecube's coverage at splunk.com 2021 virtual i'm john furrier with thecube we're here live in the studios of splunk's event here we're all together broadcasting out all over the world here with chris dieglemeyer chief social impact officer for splunk great to see you thanks for coming on great thanks for having me today i love the title chief social impact officer because we're bringing in data unlocks value well you know that and yes it's the theme of the show society has really been impacted by misinformation what context we've seen examples of how data has been good and been bad yes so there's a divide there so you're this is a big part of your talk yes it it's a big part of me and it's going to be even a bigger part of splunk going forward so as many people know they've heard of the digital divide right and that was about access to information communication technologies and it was coined 20 years ago 2001 and we've made progress on that digital divide but now we have all that infrastructure or a lot of it and so on top of that we have the data divide and that's the increasing and expanding use of data and the gap between using that to solve commercial and provide commercial value in contrast to solving our social and environmental challenges and so the the important thing about it is we're early enough that with urgent action we can try to close that gap um and really make a difference in the world so let's get started let's define the data divide and give some specific examples where you see it in action on the pro side and where there's some work needed yeah so all so the definition is again that that gap between using we we have all this data being used for commercial value and a relatively weak use of data being used to solve our social and environmental challenges and we've got four kind of key barriers that we've identified that need to be addressed which will get to you know the questions and how we solve it one is access so think about it think of the data that google has and where that is in access compared to probably the department of education in any country around the world so access is big second is capacity we need both financial resources investing in solving our social and environmental problems and we need data scientists data stewards great data people working to solve our social and environmental problems just as we are in the corporate sector and then the third one is investment choices and this one is a little bit of a be in my bonnet and this happens mostly in the private sector so we all know you know every year it's like what what hits the return on investment criteria and solving social and environmental challenges often does not uh doesn't have that quite time frame return on investment and think about if we'd identified this data divide 20 years ago for climate because companies are doing phenomenal work now about climate what if we had been doing that work 20 years ago around sustainability around efficiency and then the last piece is actionable solutions that we can replicate so those are kind of the four barriers um and again i think we've got a lot of potential and examples there isn't one issue i can think of where more data isn't going to help us you know this is so important i feel very strongly about this because i've seen examples where i've seen really strong people start ngos or non-profits or just building an app and they abandon it because they can't get there fast enough so the idea that cloud and data accessibility can be 
there you get to see some success and you can double down on that's the cloud way yes so i think this is something that people want to know the playbook so you know where where are people being successful what can people do yeah to take advantage of it yeah so i think that's a really good important point um is transitioning to the cloud so think of the nonprofit sector it's barely there yet so all of us who are investors philanthropists we need to be supporting the nonprofit sector be cloud enabled and cloud forward similarly with government i i you know there's example after example where you know whether it's health whether it's child and human services their data is in file cabinets think about that think of prime so we need to digitize those then we need to data enable that so that we can see those insights that are coming out around those solutions you know it's always the you know it's always a discussion in the industry inside the ropes and now on mainstream but getting data to the right place at the right time yeah is a really important thing it's a technical latency all these things but practically it has societal impact where would you rank the progress bar in terms of where we are on the digital divide because i can see healthcare for instance having access to the right information or it could be something on the government side where it could be related to climate change or hey get this involved where are we on this so i i would say on the digital divide which is the infrastructure piece um for most definitely high-income countries mid-income countries we've actually made progress and so they have that they're all you know network they're cloud but now they have all this data they don't know what to do with right and so what we need to kind of now build on that infrastructure to solve for that data and i'll just you know a splunk example one of our customers the netherlands um in their court system right with using splunk they were able to enable real-time data to inform court decisions so historically the judge would ask you know this happened in covid where are we on bankruptcy cases right and historically somebody would call somebody they'd call somebody they go dig the files and they get the information three months real time this is what's happening with bankruptcy in real time with covid is going to change those decisions that impact people's lives so you add that on top i mean we have environmental examples working with net zero schools we have it and we worked with the healthcare coalition with mitre to enable real-time data with a number of other companies so um where so i would say we're further along on the digital divide we're at step one on the data divide yeah doug merritt was talking earlier today about how you know this data plan that splunk has evolved into this catch basin for all the data and then it becomes useful and really taking us through the journal now security and it's this control plane that's enabling yeah i think to me that's a real key thing here so i have to ask do you see envision a future where we have a data commons where um citizens and could tap into the data and in the gov 2.0 is kind of on that vision yeah what do you where do you see this what do you say well i i think and i i know doug has talked about this before too from a values standpoint of especially with government moving to open data and then what we have to do is we have to protect privacy which actually splunk is really good at doing uh so you've got to take that individual 
data out of there but then once you get these big data pools into these big data lakes you'll be able to see insights that you couldn't see before you know it's interesting that i remember when the internet came around and how the u.s government's very active it seems now that that tech policy has always been kind of like oh yeah we're kind of involved in dc but now tech is so important and with all the backlash on the facebooks of the world of you know how democracy was broken there's an opportunity yeah and the lawmakers and the people who make the laws are kind of lawyers they're not really techies so so like policy's got to change how do we do that yeah oh gosh if i could solve that one on policy change but but i want to make a comment because i think it's really important because you reference and the situation facebook is in is common knowledge i give a lot of credit to splunk as you know a data platform company saying we see this data divide coming and we're going to step to the table now and do something about it because there's a lot of other companies that knew these challenges if they looked out three five years and they made personal or company choices not to do something about it so transparency is super important getting that out there and and being again in data and just saying it's not all roses right and and so take being a purpose-driven company is about making those decisions as a company to have an impact so then to answer your question on policy um i would say i think it's really complicated and tricky because data moves at the speed of sound and policy moves kind of like a turtle and so i think what we need to have happen is companies going to sometimes have to lead the way and hold themselves accountable and then work in partnership with policy to make you know policy changes that impact everybody so again we're strong advocates of open data you know we we can't make the government do it but we can be a voice for it in service of bridging the state this data divide is a great conversation i wish we had more time for the last minute just give a quick plug for what splunk's doing specifically and how people could get involved and participate yeah so i'll kind of i'd say three things one is at this early stage we're kind of raising the flag to governments out there to philanthropy to nonprofits like we all need to be paying attention to this we're going to be investing in more research on it because it is at such an early stage we've identified these barriers but we've got to go much deeper and build collaborations around the solution so we're going to be mobilizing our partners and our customers we have a 100 million dollar pledge where we donate our product nonprofits we and the equally important thing as i talked about it's our talent right it's getting the talent to help these organizations it's our strategic giving so we're mobilizing you know all of our assets around this pledge we have a 50 million dollar impact fund which is around four purpose data enabled companies so we're trying to do it across a multitude of platforms is that investment fund deploying now or has it been making investments in companies already yeah we've made um three investments refrain ai is one about using machine learning and ai around the jobs of the future and retraining so it's still or it was launched just a couple years ago so we're still early in the 50 million dollar fund so we'll be doing more of that sounds like a great opportunity for people out there watching enable enable the 
people to change the world yeah that's what splunk's all about right now exactly chris thanks for coming on appreciate great thank you okay the data divide we're bringing you all the data here from the cube live here in the splunk studios i'm john furrier with thecube thanks for watching thank you
SUMMARY :
the facebooks of the world of you know
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
chris dieglemeyer | PERSON | 0.99+ |
Kriss Dieglmeier | PERSON | 0.99+ |
splunk | ORGANIZATION | 0.99+ |
three months | QUANTITY | 0.99+ |
john furrier | PERSON | 0.99+ |
chris | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
20 years ago | DATE | 0.98+ |
50 million dollar | QUANTITY | 0.98+ |
20 years ago 2001 | DATE | 0.97+ |
100 million dollar | QUANTITY | 0.97+ |
u.s | ORGANIZATION | 0.97+ |
ORGANIZATION | 0.97+ | |
2021 | DATE | 0.96+ |
both | QUANTITY | 0.96+ |
today | DATE | 0.96+ |
one issue | QUANTITY | 0.93+ |
step one | QUANTITY | 0.93+ |
net zero schools | ORGANIZATION | 0.92+ |
one | QUANTITY | 0.92+ |
merritt | PERSON | 0.9+ |
three five years | QUANTITY | 0.88+ |
third one | QUANTITY | 0.88+ |
Splunk | OTHER | 0.87+ |
earlier today | DATE | 0.84+ |
Splunk | ORGANIZATION | 0.84+ |
chief | PERSON | 0.83+ |
second | QUANTITY | 0.82+ |
a couple years ago | DATE | 0.76+ |
three things | QUANTITY | 0.74+ |
three investments | QUANTITY | 0.73+ |
a lot of other companies | QUANTITY | 0.72+ |
prime | COMMERCIAL_ITEM | 0.71+ |
barriers | QUANTITY | 0.69+ |
chief social impact | PERSON | 0.65+ |
splunk.com | OTHER | 0.63+ |
facebooks | ORGANIZATION | 0.63+ |
four purpose | QUANTITY | 0.62+ |
netherlands | LOCATION | 0.62+ |
doug | PERSON | 0.6+ |
covid | ORGANIZATION | 0.6+ |
covid | TITLE | 0.49+ |
playbook | TITLE | 0.47+ |