

Steve Carefull, PA Consulting Group, and Graham Allen, Hampshire County | AWS PS Partner Awards 2021


 

>> Narrator: From theCUBE studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a theCUBE conversation. >> Hello and welcome to the 2021 AWS Global Public Sector Partner Awards. I'm your host, Natalie Erlich. Today we're going to highlight the most valuable Amazon Connect deployment. And we are now joined by Steve Carefull, adult social care expert at PA Consulting Group, and Graham Allen, the director of adults' health and care at Hampshire County Council. Welcome, gentlemen, to today's session. >> Thank you, Natalie. >> Lovely to be here, Natalie. >> Well, by now we are all really familiar with the call to shelter in place and how it especially affected the most vulnerable people. Give us some insight into your experience with that, especially in light of some of the technology that was deployed. Let's start with you, Graham. >> Yeah, thank you. So just by way of context, Hampshire County Council is one of the largest areas of local government in England, with a population of 1.4 million people. When a lockdown was imposed by the national government of England on the 23rd of March 2020, the evidence on vulnerabilities around COVID-19 quickly identified that people with a range of clinical conditions were most vulnerable and needed to shield and self-isolate. For the size of our population, we were advised that roughly 30,000 people in the initial cohort needed to shield and receive a variety of support because of clinical vulnerabilities. Shortly after that, through the summer of 2020, that number increased to some 50,000. And then by January of this year that number further increased, based on the scientific and medical evidence, to 83,000 people in total.
So that represented a huge challenge for us in terms of offering support: making sure not only that practical tasks were covered, such as obtaining shopping, food and medications, but also addressing the real risks of self-isolation. Many of the people that we needed to support were hitherto not known to us as a social care provider; they were being identified through clinical medical evidence, and many of those people lived alone. So the real risk of self-isolation, of potentially not seeing anyone for an extended period of time, and the risk to their wellbeing, was something very significant to us. We needed very rapidly to develop a solution for making contact and being able to offer that support. >> Yeah, and I'd love now to get your take, Steve, on how PA Consulting Group helped deliver on that need. >> Sure, so we have an existing relationship with Graham and the council; we've been working together for a number of years, delivering care technology solutions to service users around the county. We were obviously aware there was a major issue as COVID and lockdown began, so we sat down with Graham and his colleagues to ask what we could do to help. We used our relationship with AWS and our knowledge of the Connect platform to suggest a mechanism for making outbound calls at scale. And that was the beginning of the process. We were very quickly in a position where we were able to get that service running live. In fact, we had a working prototype within four days and a live service in seven days. And from that point on, of those many thousands of people that Graham's alluded to, we were calling up to two and a half thousand a day to ask them: were they okay, and did they need any help? If they responded yes to that question, we were then able to put them through to a conventional call handler in our call center, where a conversation could take place about what their needs were.
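The outbound-calling flow Steve describes (work through a large contact list in daily batches, place calls, and hand anyone who answers "yes" to a human call handler) can be sketched as follows. This is an illustrative sketch, not the actual PA/AWS implementation: the `dial` method is a stub standing in for a real telephony call such as Amazon Connect's `StartOutboundVoiceContact` API, and the daily cap simply mirrors the "two and a half thousand a day" figure mentioned above.

```python
from dataclasses import dataclass, field

DAILY_CALL_CAP = 2500  # "up to two and a half thousand a day"

@dataclass
class CallCampaign:
    contacts: list                                   # phone numbers still to call
    needs_help: list = field(default_factory=list)   # callers routed to a human handler

    def dial(self, number: str) -> str:
        # Stub: a real deployment would place the call through a telephony
        # service (e.g. Amazon Connect) and capture the caller's response.
        return "no"

    def run_day(self) -> int:
        """Call up to DAILY_CALL_CAP contacts; route 'yes' answers to handlers."""
        batch, self.contacts = self.contacts[:DAILY_CALL_CAP], self.contacts[DAILY_CALL_CAP:]
        for number in batch:
            if self.dial(number) == "yes":
                self.needs_help.append(number)
        return len(batch)
```

With 83,000 people in the final cohort, a cap of 2,500 calls a day implies a full pass over the list takes roughly 34 days, which is why the batching and the hand-off to human handlers both matter.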
And as Graham said, in many cases that was people who couldn't get out to get food shopping, people who were running short of clinical medical supplies, and some interesting things: pet care came up quite often, people who couldn't leave the home to look after their dog and just needed some help locally. So we had to integrate with local voluntary services to get those kinds of results and support delivered to them across the whole of Hampshire, and ultimately throughout the whole of the COVID experience, coming right up until March of this year. >> Right. Well, as the COVID pandemic progressed and evolved in different stages, with variants and a variety of different issues that came up over the last year or so, how did the technology develop, and how did the relationship develop? Tell us about that process that you had with each other. >> So the base service remained very consistent. At different points in the year, when there were different issues that needed to be communicated to the service users we were calling, we would change and update the script. We would improve the logistics of the service, make it simpler for colleagues in the council to get the data into the system to make the calls. And basically we did that through a constant series of meetings and checkpoints, staying in touch and really treating this as a very collaborative exercise. I think for all of us COVID was a constant stream of surprises; nobody could really predict what was going to happen in a week or a month. So we just had to all stay on our toes, keep in touch and be flexible. And I think that's where our preferred way of working, and that of AWS and the Hampshire team we were working with, meant we really were able to do something special, very fleet of foot and responsive to needs. >> Right, and I'd also love to get Graham's insight on this as well.
What kind of results have you seen? Do you have any statistics on the impact that it made on people? Did you receive any qualitative feedback from the people that used the service? >> Yeah, absolutely, we did. And one of the things we were very conscious of from day one was that we were using a system which may have been unfamiliar to people in the first instance in terms of receiving calls; the fact that we were able to use a human voice within the call technology really, really assisted. We also did a huge amount of work within Hampshire County Council. Clearly, in terms of the work we do day in, day out, we're well known to our local population. We have a huge range of different responsibilities, ranging from maintenance of the roads through to the provision of local services like libraries, and also social care support. So we were able to draw on all of that and, as Steve has said, work very collaboratively together: a trusted brand in Hampshire County Council working with new technology. And the feedback that we received was both very much data-driven in real time, in terms of successful calls, those going through to call handlers, and then the outcomes being delivered through those call handlers by live services out and about around the county, and also qualitative. Across Hampshire County Council we have some 76 elected members, and believe me, they were very active. They were very interested in the work that we were doing in supporting our most vulnerable residents. And they were receiving literally dozens of phone calls thanking us and our partners: PA, our district council partners, and also the voluntary community sector, for the very real support that was being offered to residents. So we had a very fully resolved picture of precisely what was happening, literally minute by minute, on a live dashboard.
In terms of outgoing calls, calls going through to the call handlers, and then successful call completion in terms of the outcomes that were being delivered on the ground around the county of Hampshire. So a phenomenally successful approach, well appreciated and, I think, applauded by all those receiving calls. >> Terrific insight. Well, Steve, I'd love to hear from you more about the technology and how you put the focus on the person, really made it more people-focused; obviously that's so critical in such a time of need. >> Yeah, you're absolutely right, Natalie. I think what we were able to do, because I myself and my immediate team have worked with Hampshire and other local authorities on the social care side for so long, was understand the need to be very person-focused. I think sometimes technology comes in with a particular way of operating that isn't necessarily sensitive to the audience, and we knew we had to get this right from day one. So, Graham's already mentioned the use of a human voice in voicing the bulk call; that was very, very important. We selected a voice actress who had a very reassuring, clear tone, recognizing that many of the individuals we were calling would have been older people, maybe a little hard of hearing. We needed to get the volume in the call right; simple things like this were very important. One of the debates I remember having very early on was the choice as to whether the response somebody would give to the question "do you need this or that?" could be by pressing a digit on the phone. We understood that, potentially because of frailty or a little lack of dexterity amongst some of the people we'd be calling, it might be a bit awkward for them to take the phone away from their face and find and press the button in time. So we pursued the idea of an oral response.
So if you want this, say yes; if you don't want it, say no. Those kinds of small choices around how the technology was deployed, I think, made a really big difference in terms of acceptance, adoption and success in the way the service ran. >> Terrific. Well, Graham, I'd like to shift to you. Could you give us some insight on the lessons that you learned as a result of this pandemic, and also trying to move quickly to help people in your community? >> Yeah, I think some of the lessons that we've learned through our response to the pandemic are lessons that to a degree have traveled with us over a number of years, in terms of the way that we've used technology over that period working with PA, which is: be outcome-focused. It's sometimes very easy to get caught up in a brilliant new piece of technology. But as Steve has just said, if it's not meeting the need, if we're not thinking about that human perspective, and thinking about the humanity and the outcomes that we're seeking to deliver, then to some degree it's going to fail. And this certainly did not fail in any way, shape or form, because of the thoughtfulness that was brought forward. I think what we learned from it is how we can apply that as we go forward to the kinds of work that we do. So, as I've already said, we've got a large population of 1.4 million people. We are moving from some really quite traditional ways of responding to that population, accelerated through our response to COVID, towards using AI technologies. We're thinking about how we embed that more generally within a service offer, not only in terms of supporting people with social care needs but at the interface between ourselves and colleagues within the health sector, the NHS, to make sure that we're thinking about outcomes and becoming much more intuitive in how we engage with our population. It's also, I think, about thinking across wider sectors in terms of meeting people's needs.
One of the probably unrealized things pre-COVID was that using virtual platforms of various kinds actually increased engagement with people. We always thought, in very traditional ways, that in order to properly support our population we must go out and meet them face to face. What COVID has taught us is that actually, for many people, the virtual world, connecting online, having a variety of different technologies made available to support them in their daily living, is something that they've absolutely welcomed and actually feel much safer with, because the access is much more instant. You're not waiting for somebody to call; you're able to engage with a trusted partner face-to-face over a virtual platform and get an answer more or less then and there. So I think there's a whole range of opportunities that we've learned from, some of which we're already embedding into our usual practice, if I can describe anything over the last 15 months as usual, but we're taking it forward and we hope to expand upon it at scale and at pace. >> Yeah, that's a really excellent point about the rise of hybrid care, in both the virtual and physical worlds. What can we expect to see now, moving forward? I'd like to shift over to our other guest: what do you see next for technology as a result of the pandemic? >> Well, there's certainly been an uptake in the extent to which people are comfortable using these technologies. And again, if you think about the kind of target group that Graham and his colleagues in the social care world are dealing with, these are often older people, people with perhaps mobility issues, people with access issues when it comes to getting in to see their GP or getting into hospital services. The ability for those services to go out to them and interact with them in a much more immediate way, in a way that isn't as intrusive or as time-consuming.
It doesn't involve leaving the house and finding your way on public transport to go and see a person for five minutes in an unfamiliar building. I think that, in a sense, COVID has accelerated the acceptance that that's actually pretty good for some people. It won't suit everybody and it doesn't work in every context, but where it's really worked well, and this is a great example of it, is in triaging and prioritizing. Ultimately the kinds of resources Graham's talked about, the GPs and the nurses and the care professionals that people need to access, are in short supply. Demand will outstrip supply. Therefore being able to triage and prioritize in that first interaction, using a technology route, enables you to ensure you're focusing your efforts on those who've got the most urgent or the greatest need. So it's a kind of win all around. I think there's definitely been a sea change, and it's hard to see people going back. Just as with the debate about whether everybody will eventually go back to offices, having spent a year working at home, I think the answer is invariably going to be no; some will, but many won't. And it's the same with technology. Some will continue to interact through a technology channel; they won't go back to the face-to-face option that they had previously. >> Terrific. Well, thank you both very much, Steve Carefull of PA Consulting Group and Graham Allen of Hampshire County Council. I really appreciate your insights on how this important technology helped people who were suffering in the midst of the pandemic. Thank you. >> Steve: You're welcome. >> Graham: Thank you. >> Well, that's all for this session. Thank you so much for watching. (upbeat music)
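The oral-response handling Steve described earlier (say yes or no rather than press a digit) comes down to normalising a spoken reply and deciding where to route the call. A minimal sketch follows; the keyword lists and outcome labels are illustrative assumptions, not the deployed Amazon Connect configuration:

```python
# Normalise a spoken response and decide the next step in the call flow.
AFFIRMATIVE = {"yes", "yeah", "yes please", "i do"}
NEGATIVE = {"no", "no thank you", "no thanks"}

def route_response(utterance: str) -> str:
    """Map a caller's spoken answer to a call-flow action."""
    text = utterance.strip().lower()
    if text in AFFIRMATIVE:
        return "transfer_to_handler"   # hand off to a human call handler
    if text in NEGATIVE:
        return "end_call"              # caller is okay, no help needed
    return "repeat_question"           # unclear answer: re-ask rather than guess
```

The "repeat rather than guess" fallback reflects the person-focused design discussed above: with a frail or hard-of-hearing audience, mis-hearing a reply is more likely, and re-asking is safer than silently dropping someone who needed help.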

Published Date : Jun 30 2021



Ann-Christel Graham and Chris Degnan | The Data Cloud Summit 2020


 

>> Hello everyone, and welcome back to The Data Cloud Summit 2020. We're going to dig into the all-important ecosystem, and focus a little bit on the intersection of the data cloud and trust. And with me are Ann-Christel Graham, AKA A.C., the CRO of Talend, and Chris Degnan, the CRO of Snowflake. We have the go-to-market heavies on this section, folks. Welcome to theCUBE. >> Thank you. >> Thanks for having us. >> Yeah, it's our pleasure. And so let's talk about digital transformation, right? Everybody loves to talk about it. It's an overused term, I know, but what does it mean? Let's talk about the vision of the data cloud for Snowflake and digital transformation. A.C., we've been hearing a lot about digital transformation over the past few years. It means a lot of things to a lot of people. What are you hearing from customers? How are they thinking about what I sometimes call DX? What's important to them? And maybe address some of the challenges that they're facing. >> Dave, that's a great question. To our customers, digital transformation literally means staying in business or not. It's that simple. The reality is, most agree on the opportunity to modernize data management infrastructure, and that they need to do so to create the speed, efficiency, and cost savings that digital transformation promises. But now it's beyond that. What's become front and center for our customers is the need for trusted data, supported by an agile infrastructure that can allow a company to pivot operations as they need. Let me give you an example of that. One of our customers, a medical device company, was on their digital journey when COVID hit. They started last year, in 2019.
And as the pandemic hit, in the earlier part of this year, they really needed to take a closer look at their supply chain, and they went through an entire supply chain optimization, having been completely disrupted: think about the logistics, the transportation, the locations where they needed to get parts, all those things, when they were actually facing a need to increase production by about 20 times in order to meet the demand. And so you can imagine what that required them to do, and how reliant they were on clean, compliant, accurate data that they could use to make extremely critical decisions for their business. And in that situation, not just decisions for their business, but decisions that would be about saving lives. So the stakes have gotten a lot higher, and that's just one industry; it's really across all industries. So when you talk to any of our customers, digital transformation really means having the confidence in data to support the business at critical times with accurate, trusted information. >> I mean, if you're not a digital business today, you're kind of out of business. Chris, I've always said a key part of digital transformation is really putting data at the core of everything. You know, not the manufacturing plant at the core and the data around it, but putting data at the center. And it seems like that's what Snowflake is bringing to the table. Can you comment? >> Yeah, I mean, I think if I look across what's happening, especially as A.C. said, you know, through COVID, customers are bringing more and more data sets. They want to make smarter, data-driven business decisions. And we are seeing an acceleration of data moving to the cloud, because there's just an abundance of data, and it's challenging to actually manage that data on-premises. And as we see those customers move those large data sets, I think what A.C.
said is spot on: customers don't just want to have their data in the cloud, they actually want to understand what the data is and who has access to that data, making sure that they're actually making smart business decisions based on that data set. And I think that's where the partnership between Talend and Snowflake is really tremendous, where, you know, we're helping our customers bring their data assets to the cloud, really landing it, and allowing them to do multiple different types of workloads on top of this data cloud platform in Snowflake. And then I think, again, what Talend is bringing to the table is really helping the customer make sure that they trust the data that they're actually seeing. And I think that's a really important aspect of digital transformation today. >> Awesome, and I want to get into the partnership, but I don't want to leave the pandemic just yet. A.C., I want to ask you how it's affected customer priorities and timelines with regard to modernizing their data operations. And what I mean by that is, I think about the end-to-end life cycle of going from raw data to insights, and how they're approaching those life cycles. Data quality is a key part of it. If you don't have good data quality, I mean, obviously you want to iterate and you want to move fast, but if it's garbage out, then you've got to start all over again. So what are you seeing in terms of the effect of the pandemic and the urgency of modernizing those data operations? >> Yeah, well, like Chris just said, it accelerated things. For those companies that hadn't quite started their digital journey, maybe it was something that they had budgeted for but hadn't quite resourced completely, for many of them this is what it took to really get them off the dime, because there was no longer the opportunity to wait. They needed to go and take care of this really critical component within their business.
So, you know, what COVID I think has taught companies, taught all of us, is how vulnerable even the largest companies and most robust enterprises can be. Those companies that had already begun their digital transformation, maybe even years ago, were in a great position in their journey. They fared a lot better and were able to be agile, able to, you know, shift priorities, able to go after what they needed to do to run their businesses better, and to do so with real clarity and confidence. And I think the second piece of it is that for the last six months, people's lives have really depended on the data. People's lives have really depended on certainty. The pandemic has highlighted the importance of reliable and trustworthy information, not just the proliferation of data or, as Chris mentioned, data simply being available. It's really about making sure that you can use that data as an asset. The greatest weapon we all have is information: good information to make great business decisions. >> And, of course, Chris, the other thing we've seen is the acceleration to the cloud, which is obviously, you (indistinct) born in the cloud. It's been a real tailwind. What are you seeing in that regard from your, I was going to say in the field, but from your Zoom vantage point? >> (laughs) Yeah, well, I think, you know, A.C. talked about supply chain analytics in her previous example. And I think one of the things that we did is we hosted a dataset, the COVID-19 dataset, within Snowflake's data marketplace. And we saw customers that were, you know, initially hesitant to move to the cloud really accelerate their usage of Snowflake in the cloud with this COVID data set.
And then we had other customers, in the retail space, for example, who used the COVID data set to do supply chain analytics; it helped them make smarter business decisions on that. So I'd say that COVID has made customers that were maybe hesitant to start their journey in the cloud move faster. And I've seen that, you know, really go at a blistering pace right now. >> You know, A.C., you just talked about value, 'cause it's all about value, but you know, in the old days of data quality and the early days of the chief data officer, all the focus was on risk avoidance: how do I get rid of data, how long do I have to keep it? And that has flipped dramatically, you know, sometime during the last decade. I wonder if you could talk about that a little bit. 'Cause I know you talk to a lot of CDOs out there; have you seen that flip, where the value piece is really dwarfing that risk piece? And not that you can ignore the risk, but that's almost table stakes. What are your thoughts? >> You know, that's interesting, saying it's almost table stakes. I think you can't get away too much from the need for quality data and governed data. I think that's the first step; you can't really get to trusted data without those components. But to your point, the chief data officer's role, I would say, has changed pretty significantly. And in the round tables that I've participated in over the last, you know, several months, it's certainly a topic that they bring to the table, that they'd like to chat with their peers about: how they're navigating the balance. They still need to manage the quality, they still need to manage the governance, they still need to ensure that they're delivering that trusted information to the business.
But now, on the flip side as well, they're being relied upon to bring new insights, and it's really requiring them to work more cross-functionally than they may have needed to in the past. A big part of their job has become being the evangelist for data and for those insights, and being able to bring in new ideas for how the business can operate, identifying not just operational efficiencies but revenue opportunities, ways that they can shift. All you need to do is take a look at, for example, retail. You know, retail was heavily impacted by the pandemic this year, and it shows how easily an industry can be thrown off its course simply by a significant change like that. And they need to be able to adjust. And this is where, when I've talked to some of the CDOs of the retail customers that we work with, they've had to really take a deep look at how they can leverage the data at their fingertips to identify new and different ways in which they can respond to customer demands. So it's a whole different dynamic, for sure. It doesn't mean that you walk away from the original part of the role, the areas in which they were maybe more defined a few years ago when the role of the chief data officer became very popular. I do believe it's more of a balance at this point, and really being able to deliver great value to the organization with the insights that they can bring. >> Well, A.C., stay on that for a second. So you have this concept of data health, and I guess what I'm getting at is that in the early days of big data, Hadoop, it was just a lot of rogue efforts going on. People realized, wow, there's no governance. And what it seems like Snowflake and Talend are trying to do is make it so the business doesn't have to worry about it: build that in, don't bolt it on. But what's this notion of data health that you talk about? >> Well, it's interesting.
Companies can measure, and do measure, just about every aspect of their business health. Except, what's interesting is they don't have a great way to measure the health of their data, an asset that they truly rely on and that their future depends on. So if we take a little bit of a step back, let's take a look at an example of a customer experience, just to draw a bit of a delineation between data quality, data trust, and what data health truly is. We work with a lot of hotel chains, and like all companies today, hotels collect a ton of information, mountains of private information about their customers, through loyalty clubs and all the information collected at the front desk and in the systems that store their data. You can start to imagine the amount of information that a hotel chain has about an individual. And frequently that information has errors in it, such as duplicate entries: you know, is it A.C. Graham, or is it Ann-Christel Graham? Same person, slightly different, depending on how I might've booked or how I might've checked in at the time. And sometimes the data is also mismanaged, where, because it's in so many different locations, it could be accessed by the wrong person, someone who wasn't necessarily intended to have that kind of visibility. And so these are examples where you start to get into privacy regulations and other kinds of things that can be really impactful to a business if data's in the wrong hands, or if the wrong data is in the wrong hands. So, you know, in a world of misinformation and mistrust, which is around us every single day, Talend has really invented a way for businesses to verify the veracity, the accuracy, of their data. And that's where data health really comes in: being able to use a trust score to measure the health of the data.
And that's what we've recently introduced: this concept of a trust score, something that can actually measure the accuracy and the health of the data, all the way down to an individual report. And we believe that truly provides the explainable trust and issue resolution, the kinds of things that companies are looking for in that next stage of overall data management. >> Thank you. Chris, bring us home. So one of the key aspects of what Snowflake is doing is building out the ecosystem; it's very, very important. Maybe talk about how you guys are partnering and adding value, in particular things that you're seeing customers do today within the ecosystem, or with the help of the ecosystem and Snowflake, that they weren't able to do previously. >> Yeah, I mean, I think, you know, A.C. mentioned it, you mentioned it. I spend a lot of my Zoom days talking to chief data officers. And as I'm talking to these chief data officers, they are so concerned about their responsibility for making sure that the business users are getting accurate data. So they view data governance as one aspect of it, but the other aspect is the circumference of the data: where it sits, who has access to that data, and making sure it's super secure. And I think, you know, Snowflake is a tremendous landing spot, being a data warehouse or a cloud data platform as a service; we take care of all the securing of that data. And I think where Talend really helps our customer base is doing exactly what A.C. talked about: making sure that a business user, someone like myself, who's looking at data all the time, trying to make decisions on how many salespeople I want to hire, how's my forecast coming, how's the product working, all that stuff, is actually looking at good data.
And I think the combination of it all sitting in a single repository like Snowflake, and then layering a tool like Talend on top of it, where I can actually say, yeah, that is good data, it helps me make smarter decisions faster. And ultimately, I think that's really where the ecosystem plays an incredibly important role for Snowflake and our customers. >> Guys, two great guests. I wish we had more time, but we got to go. And so thank you so much for sharing your perspectives, a great conversation. >> Thank you for having us, Dave. >> Thanks Dave. >> All right, and thank you for watching. Keep it right there. We'll be back with more from The Data Cloud Summit 2020.

Published Date : Oct 22 2020



Ann Christel Graham and Chris Degnan V1


 

>> Hello everyone, and welcome back to the Data Cloud Summit 2020. We're going to dig into the all-important ecosystem, focusing a little bit on the intersection of the data cloud and trust. And with me are Ann-Christel Graham, aka A.C., she's the CRO of Talend, and Chris Degnan is the CRO of Snowflake. We have two go-to-market heavies on this section, folks. Welcome to theCUBE. >> Thank you. >> Thanks for having us. >> That's our pleasure. And so let's talk about digital transformation, right? Everybody loves to talk about it. It's an overused term, I know, but what does it mean? Let's talk about the vision of the data cloud for Snowflake and digital transformation. A.C., we've been hearing a lot about digital transformation over the past few years. It means a lot of things to a lot of people. What are you hearing from customers? How are they thinking about what is sometimes called DX, and what's important to them? Maybe address some of the challenges even that they're facing. >> Sure, absolutely. Dave, you know, digital transformation means literally staying in business or not. That's what you hear from customers these days. So, you know, most still agree on the opportunity to modernize data management, bringing efficiencies and scale and cost savings through digital transformation. But now it's really beyond that. What's become front and center is the need for trusted data. And you know, we could talk a little bit about that, so let me give you an example of what that means. It's really having an agile infrastructure that will allow a company to pivot operations as they need. There's a company I recently spoke with, one of our customers in the medical device industry, and they had started their digital journey last year. So they were in a really good position when COVID hit in February. 
At the time, they needed to take a complete look at their supply chain and optimize it. In fact, they needed to find ways that they could really change their production to a degree of about 20 times more in a given day than what they had been doing prior to COVID. And so when you think about what that required them to do, they really needed to rely on trusted, clean, compliant data, in an agile, ready-to-adapt infrastructure. And that's really what digital transformation was to them. And I think that's an example of, when you talk to customers today and they define what digital transformation means to them, you'll find examples like that, that demonstrate that at times it's really about the difference between, you know, life or death, saving lives, staying in business. >> Right, well, thank you for that. And you're right on. I mean, if you're not a digital business today, you're kind of out of business. And Chris, I've always said a key part of digital transformation is really putting data at the core of everything. You know, not the manufacturing plant at the core and the data around it, but putting data at the center. It seems like that's what Snowflake is bringing to the table. Can you comment? >> Yeah, I mean, I think if I look across what's happening, and especially as A.C. said, you know, through COVID, customers are bringing more and more data sets. They want to make smarter business decisions based on data, making data-driven decisions. And we're seeing an acceleration of data moving to the cloud, because there's just an abundance of data and it's challenging to actually manage that data on premise. And as we see those customers move those large data sets, I think what A.C. said is spot on, is that customers don't just want to have their data in the cloud, but they actually want to understand what the data is, understand who has access to that data, making sure that they're actually making smart business decisions based on that data. 
And I think that's where the partnership between both Talend and Snowflake is really tremendous, where, you know, we're helping our customers bring their data assets to the cloud, really landing it, and allowing them to do multiple different types of workloads on top of this data cloud platform in Snowflake. And then I think, again, what Talend is bringing to the table is really helping the customer make sure that they trust the data that they're actually seeing. And I think that's a really important aspect of digital transformation today. >> Awesome, and I want to get into the partnership, but I don't want to leave the pandemic just yet. I want to ask you how it's affected customer priorities and timelines with regard to modernizing their data operations. And what I mean by that, I think about the end-to-end life cycle of going from raw data to insights and how they're approaching those life cycles. Data quality is a key part of it. If you have good data quality, you're going to, I mean, obviously you want to iterate and you want to move fast. But if it's garbage out, then you've got to start all over again. So what are you seeing in terms of the effect of the pandemic and the urgency of modernizing those data operations? >> Yeah, well, like Chris just said, it accelerated things for those companies that hadn't quite started their digital journey. Maybe it was something that they had budgeted for but hadn't quite resourced completely. For many of them, this is what it took to really get them off the dime from that perspective, because there was no longer the opportunity to wait. They needed to go and take care of this really critical component within their business. So, you know, what COVID I think has taught companies, taught all of us, is how vulnerable even the largest companies and most robust enterprises could be. Those companies that had already begun their digital transformation, maybe even years ago, had already started that process. 
They were in a great position in their journey. They fared a lot better and were able to be agile, were able to shift priorities, were able to go after what they needed to do to run their businesses better, and be able to do so with real clarity and confidence. And I think that's really the second piece of it. For the last six months, people's lives have really depended on the data, people's livelihoods have really depended on it. The pandemic has highlighted the importance of reliable and trustworthy information, not just the proliferation of data. And as Chris mentioned, this data being available, it's really about making sure that you can use that data as an asset, and that the greatest weapon we all have, really, is the information, and good information to make great business decisions. >> Of course. Chris, the other thing we've seen is the acceleration to the cloud, which, obviously you're born in the cloud, it's been a real tailwind. What are you seeing in that regard from your, I was going to say in the field, but from your Zoom vantage point? >> Yeah, well, I think, you know, A.C. talked about supply chain analytics in her previous example. And I think one of the things that we did is we hosted a data set, the COVID-19 data set, within Snowflake's data marketplace. And we saw customers that were, you know, initially hesitant to move to the cloud really accelerate their use of Snowflake in the cloud with this COVID data set. And then we had other customers that are, you know, in the retail space, for example, that used the COVID data set to do supply chain analytics, and it accelerated, you know, it helped them make smarter business decisions on that. So I'd say that, you know, COVID has, you know, made customers that maybe were hesitant to start their journey in the cloud move faster. 
And I've seen that, you know, really go at a blistering pace right now. >>You know, you just talked about value because it's all about value. But the old days of data quality in the early days of Chief Data officer, all the focus was on risk avoidance. How do I get rid of data? How long do I have to keep it? And that has flipped dramatically. You know, sometime during the last decade, I wonder if you could talk about that a little bit because I know you talked to a lot of CDOs out there. And have you seen that that flip, that where the value pieces really dwarfing that risk, peace. And not that you can. You can ignore the risk that. But that's almost table stakes. What are your thoughts? >>Um, you know, that's interesting. Saying it's it's almost table stakes. I think we can't get away too much from the need for quality data and and govern data. I think that's the first step. You can't really get to, um, you know, to trust the data without those components. And but to your point, the chief data officers role, I would say, has changed pretty significantly and in the round tables that I've participated in over the last, you know, several months. It's certainly a topic that they bring to the table that they'd like Thio chat with their peers about in terms of how they're navigating through the balance that they still need. Thio manage to the quality they still need to manage to the governance. They still need toe to ensure that that they're delivering, um, that trusted information to the business. But now, on the flip side as well, they're being relied upon to bring new insights. And that's, um, it Z really require them to work more cross functionally than they may have needed to in the past. Where that's been become a big part of their job is being that evangelist for data the evangelist. 
For that, those insights and being able to bring in new ideas for how the business can operate and identified, you know, not just not just operational efficiencies, but revenue opportunities, ways that they can shift. All you need to do is take a look at, for example, retail. Um, you know, retail was heavily impacted by the pandemic this year on git shows how easily an industry could be could be just kind of thrown off its course simply by by a just a significant change like that. They need to be able to to adjust. And this is where, um, when I've talked to some of the CEOs of the retail customers that we work with, they've had to really take a deep look at how they can leverage their the data at their fingertips to identify new in different ways in which they can respond to customer demands. So it's a it's a whole different dynamic for sure. It doesn't mean that that you walk away from the other and the original part of the role of the or the areas in which they were maybe more defined a few years ago when the role of the chief data officer became very popular. I do believe it's more of a balance at this point and really being able to deliver great value to the organization with the insights that they could bring >>Well, a C stay on that for a second. So you have this concept of data health, and I guess what kind of getting tad is that the early days of big data Hadoop. It was a lot of rogue efforts going on. People realize, Wow, there's no governance And what what seems like what snowflake and talent are trying to do is to make that the business doesn't have to worry about it. Build that in, don't bolted on. But what's what's this notion of of data health that you talk about? >>Well, the companies you know, it's it's It's interesting Cos can measure and do measure just about everything, every aspect of their business. Health. Um, except what's interesting is they don't have a great way to measure the health of their data. 
And this is an asset that they truly rely on; their future depends on that health of their data. And so if we take a little bit of a step back, maybe let's take a look at an example of a customer experience just to kind of make a little bit of a delineation between the differences of data quality, data trust, and what data health truly is. We work with a lot of hotel chains, and like all companies today, hotels collect a ton of information. There's mountains of information, private information about their customers, through the loyalty clubs and all the information that they collect from the front desk, the systems that store their data. You can start to imagine the amount of information that a hotel chain has about an individual. And frequently, that information has, you know, errors in it, such as duplicate entries. You know, is it A.C. Graham, or is it Ann-Christel Graham? Same person, slightly different, depending on how I might have looked or how I might have checked in at the time. And sometimes the data is also mismanaged, where because it's in so many different locations, it could be accessed by the wrong person, someone that wasn't necessarily intended to have that kind of visibility. And so these are examples of, when you look at something like that, now you're starting to get into, you know, privacy regulations and other kinds of things that could be really impactful to a business if data is in the wrong hands, or the wrong data is in the wrong hands. So you know, in a world of misinformation and mistrust, which is around us every single day, Talend has really invented a way for businesses to verify the veracity, the accuracy of their data. And that's where data health really comes in, being able to use a trust score to measure the data health. And that's what we have recently introduced. 
It's this concept of the trust score, something that can actually provide and measure the accuracy and the health of the data, all the way down to an individual report. And we believe that that truly, you know, provides the explainable trust, issue resolution, the kinds of things that companies are looking for in that next stage of overall data management. >> Thank you. Chris, bring us home. So one of the key aspects of what Snowflake is doing is building out the ecosystem; it's very, very important. Maybe talk about how you guys are partnering and adding value, in particular things that you're seeing customers do today within the ecosystem, or with the help of the ecosystem and Snowflake, that they weren't able to do previously. >> Yeah, I mean, I think, you know, A.C. mentioned it, you mentioned it. You know, I spend a lot of my Zoom days talking to chief data officers. And as I'm talking to these chief data officers, they are so concerned, their responsibility on making sure that the business users are getting accurate data, so they view that as data governance, as one aspect of it. But the other aspect is the circumference of the data, of where it sits, and who has access to that data, and making sure it's super secure. And I think, you know, Snowflake is a tremendous landing spot, being a data warehouse or cloud data platform as a service; you know, we take care of all of securing that data. And I think where Talend really helps our customer base is helps them exactly what A.C. talked about, is making sure that, you know, myself as a business user, someone like myself who's looking at data all the time, trying to make decisions on how many salespeople I want to hire, how's my forecast coming, how's the product working, all that stuff. 
I need to make sure that I'm actually looking at good data. And I think the combination of it all sitting in a single repository like Snowflake, and then layering a tool like Talend on top of it, where I can actually say, yeah, that is good data, it helps me make smarter decisions faster. And ultimately, I think that's really where the ecosystem plays an incredibly important role for Snowflake and our customers. >> Guys, two great guests. I wish we had more time, but we got to go. And so thank you so much for sharing your perspectives, a great conversation. >> Thank you for having us, Dave. >> All right, and thank you for watching. Keep it right there. We'll be back with more from the Data Cloud Summit 2020.

Published Date : Oct 16 2020



Justin Graham, Docker | DockerCon 2020


 

>> Announcer: From around the globe, it's theCUBE, with digital coverage of DockerCon Live 2020. Brought to you by Docker and its ecosystem partners. >> Welcome back to theCUBE coverage here at the DockerCon virtual headquarters, anchor desk here in the Palo Alto studios, where we're quarantined in this virtual event of DockerCon. I'm John Furrier, host, along with Jenny Bertuccio, John Kreisa, Peter McKee, and other folks who are moderating and weaving in and out of the sessions. But here we have a live session with Justin Graham, Vice President of the Products group at Docker. Justin, thanks for coming to DockerCon virtual '20. >> Absolutely, happy to be here from my home office in Seattle, Washington, where it is almost sunny. >> You've got a great backdrop, and people are saying in the chat you've got bandwidth, a lot of bandwidth there. Looking good, some island. What a day for a Docker global event, 77,000 people registered. It's just been an awesome party. >> It's been great, I could hardly sleep last night. I was up at 5:00 this morning. I was telling my son about it at breakfast, I interrupted his Zoom school, and he talked a little bit about it, so it's been awesome. I've been waiting for this interview slot for most of the day. >> So yeah, I've got to tell the kids to get off, download those gigabytes of new game updates and get off Netflix, I hear you. But you've got good bandwidth. Let's get into it, I love your position: VP of Product at a company that's super technical, a lot of software, a lot of cloud. You've got a good view of the landscape of what the current situation is relative to the product, the deals that are going on, like the newly announced Snyk and Microsoft expansions, multiple clouds, as well as the roadmap and community interaction. So you've got a lot going on, you've got your fingers in all the action. When you get the keys to the kingdom, as we say on the product side of things, what's the story today from your perspective around DockerCon? 
What's the most important thing people should know about what's going on with this new Docker? Obviously, ease of use, we've heard a lot about. What's going on? >> So I'll start with people. We are hyper-focused on helping developers and development teams build and ship applications. That's what we're focused on. That's what we wake up every day thinking about. And to double-click on that for a minute in terms of what that means: if you think about where source control ends on one end, and having a running application on some production compute in the cloud on the other end, there's a whole lot that needs to happen in the middle of those two things. And we hear from our development community, and we see from those folks, there's a lot of complexity and choices and options and things in the middle there. And we really want to help streamline the creation of those pipelines to get those apps moving to production as quickly as possible. >> And you can see it in some of the results and some of the sessions, one session coming up at around four, on how pipelining with Docker helped increase the problem solving around curing cancer, really solving, saving people's lives, to the front lines with COVID-19, to business value. So you're seeing, again, Docker coming back into the fold relative to the simple value proposition of making things super easy for developers, but on top of the mega trend of microservices. So, outside of some of these awesome sessions with this learning, the hardcore sessions here at DockerCon around microservices, from monitoring, you name it, not a trivial thing, 'cause you've got stateless and stateful, all kinds of new things are going on with multiple clouds. So not an easy-- >> No. >> road to kind of grok or understand, and you have to manage that. What are people paying attention to? What is happening? 
>> I think, first off I'll say, one of the things that I'm super passionate about is increasing access to technology, so the greatest and best ideas can get bubbled up to the top and exposed, no matter where they come from, whom they come from, et cetera. And I think one of the things that makes that harder, that makes that complex, is just how much developers, or even emerging developers, need to understand just to even get started. Languages, IDEs, packaging, building, where do you ship to? If you pick a certain cloud endpoint, you have to understand networking and storage and identity models; it's just so much you have to absorb. So we're hyper-focused on how we can make that complexity super easy. And these are all the things that we get asked questions on, and we get interacted with on our public roadmap and in other places to help with. So that's the biggest thing that you're going to see coming out of Docker starting now and moving forward. We'll be serving that end. >> Let's talk about some of the new execution successes you guys have had. Honestly, Snyk is security shifting left; that's a major, I think a killer win for Snyk, obviously, getting access to the millions of developers who use Docker, and vice versa. And with shifting left, you get security in that workflow piece. The Microsoft expanding relationship is interesting as well, because Microsoft's got a robust tech developer ecosystem, they have their own tools. So you see these symbiotic relationships with Docker, again, coming into the fold where there's a lot of working together going on. Explain what that means. >> So on the back of the refocused Docker and our hyper-focus on developers and development teams, one of the core tenets of the how, so before, that was the what, this is the how we're going to go do it, is by partnering with the ecosystem as much as possible and bringing the best of breed in front of developers in a way that they can most easily consume. 
So if you take the Snyk partnership, that was just a match, a match made in developer dopamine, as Sean Connolly would say. We're hyper-focused on developers and development teams, and Snyk is also hyper-focused on making it as easy as possible for developers and development teams to ship fast and stay secure. So it really just matched up super well. And then if you think, "Well, how did we even get there in the first place?" Well, we launched our public roadmap a few months ago, which was a first that Docker has ever done. And one of the first things that came onto that public roadmap was image vulnerability scanning. For Docker, at that time, it was really just focused on Docker Hub in terms of how it came through the roadmap. It got upvoted a bunch, there was some interaction, and then we thought, "Well, just checking that box isn't enough," right? It's just checking the box. What can we do that really brings sort of the promise of the Docker experience to something like this? And Snyk was an immediate thought in that respect. And we just really got in touch with them, and we just saw eye to eye almost immediately. And then off the rest went. The second piece of it was really around, well, why just do it in Docker Hub? What about Docker Desktop? It's downloaded 80,000 times a week, and it's got 2.2 million active installations on a weekly basis. What about those folks? So we decided to raise the bar again and say, "Hey, let's make sure that this partnership includes not only Docker Hub but Docker Desktop," so you'll be able, when we launch this, to scan your images locally on Docker Desktop. >> Awesome. I see you're getting some phone calls and you had to hit the end button real quick, I saw that in there. I've got an interesting chat, I want to just kind of lighten things up a little bit, from Brian Stevenson. He says, "Justin, what glasses are those?" (Justin laughing) So he wants to know what kind of glasses you're wearing. >> They're glasses that I think signal that I turned 40 last year. >> (laughs) I'd say they're for your gaming environments, the blue light glasses. >> But I'm not going to say where they came from, because it's probably not going to engender a bunch of positive good. But they're nice glasses. They help me see the computer screen and make sure that I'm not bad-fingering my CLI commands. >> Well, us old guys need the glasses, certainly I do. Speaking of old and young, this brought up a conversation, and since that came up I'll just quickly riff into this, 'cause I think it's interesting. Kelsey Hightower, during the innovation panel, talked about how developers and people want to just do applications, and some want to get under the hood, up and down the stack. I was riffing with John Kreisa around kind of the new generation, the kids coming in, the young guns. They have all this goodness at their disposal. They didn't have to load Linux on a desktop and rack and stack servers, all that good stuff. So it's so much more capable today. And so this speaks to the modern era and the expansion overall of open source and the expansion of the people involved; new expectations and new experiences are required. So as a product person, how do you think about that? Because you don't want to just build for the old, you've got to build for the new as well, as the experience changes and expectations are different. What are your thoughts around that? >> Yeah, I think about sort of my start in this industry as a really good answer to that. I mean, I remember as a kid, I think I asked for a computer for every birthday and Christmas from when I was six, until I got one given to me by a friend's parents in 1994, on my way off to boarding school. And so it took that long just for me to get a computer into my hands. And then when I was in school, there weren't really any sort of Computer Science or coding courses until my senior year. 
>> Yeah, I hope my son never has to understand what a service mesh is or a proxy is. Right? >> Yeah. >> I just hope he just learns the language, and just learns how to bring an idea to life, and all the rest of it is just behind the scenes. >> When he said he had a parenting moment, I thought he was going to say something like that. Like, "Oh, my kid did it." No, it was whether to describe a low-level data structure or (laughs) just use Serverless. Shifting gears onto the product roadmap for Docker, can you share how folks can learn about it, and can you give some commentary on what you're thinking right now? I know you guys put it on GitHub. Is there a link available-- >> Absolutely, available. Github.com/docker/roadmap. We tried to be very, very deliberate about how we named that, so it was as easy as possible. We launched it a few months ago. It was a first in terms of Docker publicly sharing its roadmap and what we're thinking and what we're working on. And you'll find very clear instructions on how to post issues and get started, and what our code of conduct is. And then you can just get started, and we even have a template for you to submit an issue and talk to us about it. And internally, my team and many of our engineers as well, we triage what we see changing and coming into the public roadmap two to three times a week, for half an hour to 45 minutes at a time. And then we're on Slack, batting around ideas that are coming in and seeing how we can improve those. So for everyone out there, we really do pay attention to this very frequently, and we iterate on it, and the image vulnerability scanning is one of those great examples; you can see some other things that we're working on up there. So I will say this, though: there has been some continual ask for a Linux version of Docker Desktop. So I will commit that if we get 500 upvotes, we will triage it and figure out how to get that done over a period of time.
>> You heard it, 500 upvotes to triage-- >> 500. >> You guys get that. And is there a shipping date on that if they get the 500 upvotes? >> No, no, (John laughs) you won't get a shipping date yet, but it's on the public roadmap. So you'll know when we're working on it and when we're getting there. >> Before I get into the session you had with the captains, which is a very geeky session getting under the hood, I'm more on the business side. The tailwind obviously for Docker is the microservices trend. What containers have enabled is just going to continue to get more awesome and complex, but also bring a lot of value and agility and all the things you guys are talking about. So that obviously is going to be a tailwind for you. But as you guys look at that piece of it, specifically the business value: a lot of the use cases are, no one really starts out with microservices from a clean sheet of paper. We heard it in some talks here at DockerCon, where the financial services company said, "Hey, it was a simple stack," and then it became feature creep, which became a monolith. And then they had to move that technical debt into a much more polyglot system, where you have multiple tools and there's a lot of things going on. That seems to be the trend, and it also speaks to the legacy environment that most enterprises have. Could you share your view on how Docker fits into those worlds? Because you're either coming from a simple stack that, more often than not, got successful and you're going to go microservices, or you have legacy, and then you want to decouple and make it highly cohesive. So, your thoughts? >> So the simple answer is, Docker can help on both ends.
So I think as these new technologies sort of gain momentum, and get talked about a bunch, and get rapid adoption and rapid hype, there's almost this wall that builds up where people start to think, "Well, maybe my thing isn't modern enough," or, "Maybe my team's not modern enough," or, "Maybe I'm not modern enough to use this." So there's too much of a hurdle to get over. And we don't see that at all. There's always a way to get started. Even thinking about the other thing, I'd say, one, we can help; let us know, ping us, we'll be happy to chat with you. But start small, right? If you're in a large enterprise and you have a long legacy stack and a bunch of legacy apps, think about the smallest thing that you can start with, that you can begin to break off of that. And as a proof of concept, even by just downloading Docker Desktop and Visual Studio Code and getting started with breaking off a small piece, you can prove the model. And I think that's where Docker can be really helpful, introducing you to this paradigm and pattern shift of containers and containerized packaging and microservices and production runtime. >> And certainly any company coming out of this post-pandemic period is going to need a growth strategy that's based on apps, that's based on the projects they're currently working on: double down on those, and kind of sunset the ones that aren't working, or fix the legacy. That seems to be a major tailwind. >> The second bit is, as a company, you're also going to have to start something new, or many new things, to innovate for your customers and keep up with the times and the latest technology. So start to think about how you can ensure that the new things you're doing are starting off in a containerized way, using Docker to help you get there.
If the legacy pieces may not be able to move as quickly, or there's more required there, just think about the new things you're going to do and start new in that respect. >> Well, let's bring some customer scenarios to the table. Pretend I'm a customer, we're talking, "Hey Justin, you're looking good. "Hey, I love Docker. I love the polyglot, blah, blah, blah." Hey, you know what? And I want to get your response to this. And I say, "DevOps won't work here where we are, "it's just not a good fit." What do you say when you hear things like that? >> See my previous comment about the wall that builds up. So the answer is, and I remember hearing this, by the way, about Agile years ago, when Agile development and Agile processes began to come in, take hold, and take over from sort of waterfall processes, right? What I hear customers really saying is, "Man, this is really hard, this is super hard. "I don't know where to start, it's very hard. "How can you help? "Help me figure out where to start." And that is one of the things we're very, very clearly working on. So first off, our docs team, who do great work, just made an unbelievable update to the Docker documentation homepage, docs.docker.com. Before, you were sort of met with a wall of text and a long left navigation, and if you didn't know what you were doing, you wouldn't know where to go. Now you can go there and there's six very clear paths for you to follow. Do you want to get started? Are you looking for a product manual? Et cetera. So if you're just looking for where to get started, just click on that. That'll give you a great start. When you download Docker Desktop, there's now an onboarding tutorial that will walk you through getting your first application started. So there are ways for us to help you get started.
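That "first application," or the small piece broken off a monolith, can be as little as one service and one Dockerfile. As a purely illustrative sketch (the base image, file names, and port here are assumptions, not anything Docker prescribes), containerizing a small Python web service might look like this:

```dockerfile
# Illustrative only: containerize one small service carved off a larger app.
FROM python:3.8-slim

WORKDIR /app

# Install only this service's dependencies, not the whole monolith's.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy just the extracted service code.
COPY app.py .

EXPOSE 8080
CMD ["python", "app.py"]
```

From there, `docker build -t my-service .` and `docker run -p 8080:8080 my-service` are enough to prove the model locally before breaking off anything bigger.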
And then we have a great group of Docker captains, Bret Fisher and many others, who are also instructors; we can absolutely put you in touch with them, or with some online coursework that they deliver as well. So there's many resources available to you. Let us help you just get over the hump of getting started. >> And Jenny on the community side, and Peter McKee, were talking about some libraries coming out, some educational stuff coming around the corner as well. So we'll keep an eye out for that. Question for you, a personal question: can you share a proud DevOps Docker moment with the audience? >> Oh wow, so many to go through. So I think a few things come to mind over the past few weeks. So for everyone that doesn't know, we launched some exciting new pricing plans last week for Docker. So you can now get quite a bit of value for $7 a month in our Pro plan. But the amount of work that the team had to do to get there was just an incredible thing. Just watching how the team operated and how the team got there, and just how they were turning on a dime with decisions that were being made. And I'm seeing the same thing through some of our teams that are building the image vulnerability scanning feature. I won't quote the number, but there's a very small number of people working on that feature who are creating an incredible thing for customers. So it's just how we think every day, because we're actually almost trying to productize how we work, right? And bring that to the customer. >> Awesome. And your take on DockerCon virtual? Obviously, we're all in this situation. The content's been rich on the site. You were just on the captains program earlier in the day. >> Yes. >> Docker captain Bret's crew taught like a marathon session. Did they grill you hard, or what was your experience on the captain's feed? >> I love the captain's feed. We did a run of that for the Docker birthday a few months ago with my co-worker Justin Cormack.
So yes, there are two Justins that work at Docker. I got the internal Justin Slack handle. He got the external, the community Slack Justin handle. So we split the goods there. But lots of questions about how to get started. I mean, I think there was one really good question there. Someone was asking for advice on just how to get started as someone who wants to be a new engineer or get into coding. And I think we're seeing a lot of this. I even have a good friend whose wife was, and still is, a very successful person in the marketing field, and is learning how to code and wants to do a career switch. Right? >> Yeah. >> So it's really exciting. >> DockerCon is virtual. We heard Kelsey Hightower, we heard James Governor, talk about events becoming more about groups getting together, whether they're small, medium, or large. What's your take on DockerCon virtual, or in general, what makes a great conference these days? 'Cause we'll soon get back to the physical space. But I think the genie's out of the bottle; the digital space has no boundaries. It's limitless, and on creativity, we're just scratching the surface. What makes a great event in your mind? >> I go back to thinking, I've probably flown 600,000 miles in the past three years. Lots of time away from my family, lots of time away from my son. And now we're all in this situation together, in terms of being sheltered in place in the global pandemic, and we're executing an event that has 10 times more participation from attendees than we had in our in-person event. And I sat back in my chair this morning and I was thinking, "Did I really need to fly that 600,000 miles "in the past three years?" And I think James Governor brought it up earlier: I really think the world has changed underneath us. It's just going to be really hard to... This will all be over eventually. Hopefully we'll get to a vaccine really soon.
And then folks will start to feel like the world's a little bit more back to "normal," but man, I'm going to really have to ask myself, like, "Do I really need to get on this airplane "and fly wherever it is? "Why can't I just do it from my home office "and give my son breakfast and take him to school, "and then see him in the evening?" Plus, second, like I mentioned before in terms of access, no in-person event will ever be able to compete with the type of access that this type of platform provides. There are just, fairly or unfairly, lots of people who cannot travel to certain places, for lots of different reasons, monetary probably being primary. And it's not their job to figure out how to get to the thing. It's our job to figure out how to get the tech and the access and the learning to them. Right? >> Yeah (murmurs) >> So I'm super committed to that, and I'll be asking the question continually. I think my internal colleagues are probably laughing now because I've been beating the drum of, like, "Why do we ever have to do anything in person anymore?" Like, "Let's expand the access." >> Yeah, expand the access. And what's great too is the CEO was in multiple chat streams. So you could literally, it's almost like beaming in there, like Star Trek. And you can be in more places; it doesn't require those spatial limitations. >> Yeah. >> I think face to face will be good for the intimate, more party-like environments, more bonding, where social, face-to-face contact is more impactful. >> We do have to figure out how to have the attendee party virtually. So we have to figure out how to get some great electronica, or a band, or something to play a virtual show, and, like, ship everybody a beverage, I don't know. >> We'll co-create with Docker theCUBE pub and have beer for everybody if need be at some point (laughs). Justin, great insight. Thank you for coming on and sharing the roadmap update on the product and your insights into the tech as well as events.
Appreciate it, thank you. >> Absolutely, thank you so much. And thanks everyone for attending. >> Congratulations on all the work on the product, Docker going to the next level. Microservices is a tailwind, but it's about productivity, simplicity. Justin, head of product for Docker, VP of Product, here on theCUBE, DockerCon 2020. I'm John Furrier. Stay with us for more continuous coverage on theCUBE track we're on now; we're streaming live. These sessions are immediately on demand. Check out the calendar. There's 43 sessions submitted by the community. Jump in there, it's its own container of content. Get in there, pun intended, and chat, and meet people, and learn. Thanks for watching. Stay with us for more after this break. (upbeat music)

Published Date : May 29 2020

Graham Breeze & Mario Blandini, Tintri by DDN | VMworld 2019


 

>> Narrator: Live from San Francisco, celebrating 10 years of high-tech coverage, it's theCUBE, covering VMworld 2019. Brought to you by VMware and its ecosystem partners. >> Welcome back to San Francisco, everybody. My name is Dave Vellante. I'm here with my co-host John Troyer. This is day three of VMworld 2019, two sets. This is our 10th year at VMworld. theCUBE is the leader in live enterprise tech coverage. Mario Blandini is here. He's the CMO and chief evangelist at Tintri by DDN. He's joined by Graham Breeze, the field CTO at Tintri, also by DDN, a recent acquisition. Gents, great to see you. >> Likewise. As they say, we're back. I like to call it a hibernation, in the sense that people may not have known where Tintri, or Tintri by DDN, went. As the name implies, we were acquired a year ago at VMworld, August 31st of 2018. And in the year since, we've been able to invest in engineering, support, and, with my joining the company, in marketing, to take this solution, which has saved thousands of customers millions of man-hours, and bring it to a larger number of users. >> When we first saw Tintri, we said, "Wow, this is all about simplification." And John, of course you remember that; when you go back to the early, early theCUBE days of VMworld, very complex storage was a major challenge. Tintri was all about simplifying that. Of course, we know DDN as well as the high-performance specialist, and we've worked with those guys for a number of years. But take us back, Mario, to the original vision of Tintri. Is that original vision still alive? How has it evolved? >> Well, I'd say that it's the number one reason why we're a part of the DDN family of brands, because as a portfolio company, they're looking to bring in good technologies. I'm the marketing guy for our enterprise virtualization audience, and the product sets that cover high-performance computing have their own audience. So for me, I'm focused on that.
Graham's also focused on that, and, uh, really what continues to make us different today is the fact that we were designed from the beginning to learn, to understand how virtual machines work with infrastructure, end to end. And that's really the foundation of what makes us different today. The same thing, right? >> So from the very beginning, we were built to understand the workloads that we service in the data center, and that was virtual machines. We service those on multiple hypervisors today. Being able to understand those workloads intrinsically gives us a tremendous capability to place I/O. Again, understanding the infrastructure, the network, storage, hypervisor, we can view that end to end in terms of a latency graph and give customers insight into the infrastructure and how it's performing. I would say we're actually extending that further in terms of additional workloads that we're going to be able to take on later this year. >> So I know a lot of storage admins, although I only play one on TV, but, uh, no, consistently throughout the years, right, the Tintri user experience is at the forefront there. And in fact, some people have said, "You know what? When I really want to get something done, I grab my Tintri box." So can you talk, maybe with an example, about why the user experience, how the user experience is different? >> I'll start off by saying that I had a chance, being new to the company, just two weeks in, to meet a lot of Tintri users. And prior to taking the job, I talked to some folks behind the scenes, and they all told me the same thing. But what I was so interested to hear is that if they didn't have Tintri, they'd otherwise not have the time to do the automation work, the research work, the strategy work, or even the firefighting that's vital to their everyday operations. Right? So it's like, of course, I don't need to manage it.
If I did, I wouldn't be able to do all these other things. And I think that's it. It rings true, right, that it's hard to quantify that time savings, because people say, "Oh, half of it, see, that's really not much in the greater scheme of things." I don't know, half an admin working on a strategic program is a huge opportunity. >> We see the value of Tintri to our end users, and we've heard from a lot of them this week. Actually, it's been a fantastic event, hearing from many of our passionate customers. From the very beginning, we wanted to build a product that customers ultimately care about, and we've seen that this week in droves. But I would say, going back to what they get out of it, it's the value in what they don't have to do. So they don't have to carve up LUNs, they don't have to carve up volumes. All they have to do is work with the units of infrastructure that are native to their environment: VMs. They deal with everything in their environment from a virtual machine perspective; virtual machines are one thing across the infrastructure. Again, they can add those virtual machines seamlessly, they can add those in seconds. They don't have to size and add anything in terms of, how am I going to divide up the storage, how am I going to provision the I/O, how am I going to get the technical pieces right? They basically just place VMs, and we have a very simple way to give them a visualization into that, because we understand that virtual machine and what it takes to service it. It comes right back to them in terms of time savings that are tremendous. >> So let's deal with the elephant in the room. So, Tintri: we've talked about all the great stuff in the original founding vision, but then it ran into some troubles, right? And so what happened? How do you deal with that with customers, in terms of just their perception of what occurred, you guys did the IPO, et cetera? Take us through how you're making sure customers are cool with you guys.
>> I'm naturally a glass-half-full kind of guy, from previous, uh, times on theCUBE. The interesting thing is, not a lot of people actually knew. Maybe we didn't create enough brand recognition in the past for people to even know that there was a transition. There were even some of our customers, and Graham, you can pile on this, who, because they don't manage the product every day, because they don't have to, it's kind of so easy, had even forgotten a lot about it and don't spend a lot of time on it. I'd say that the reason we are able to continue to invest today, a year after the acquisition, is because retaining existing customers was something that was very successful. And for a lot of them, you can add comments here, it wasn't easy to switch to something else. They couldn't just switch to something else, because there's no other product that does these automatic things and provides the predictive modeling that they're used to. So it's like, what would we switch to? So they just kept going, and they've given us a lot of great feedback. Being owned by the largest private storage company on planet Earth has the advantages of a strong source of supply, great leverage, reverse logistics, partnerships with suppliers, as a bigger company able to service them long term. >> It wasn't broke, so you didn't need to fix it. And you were able to maintain, obviously, a large portion of that customer base. And what has the service experience been like? How is that evolving? And what does DDN bring to the table? >> So, uh, boy, DDN brings so many resources. In terms of bringing this from the point when they bought us last year, a year ago today, I think we transitioned with about 40 people in the company. We're up to about 200 now, so a serious investment. Obviously, that's been a pretty heavy job in terms of building that thing back up. On service and support, we've put in all of the resources; the stated goal coming across the acquisition was that Tintri support, Tintri by DDN, would be better than where Tintri support was. We had them at-- >> Great scores, too. So it's hard to go up from there, right? >> And I would say, with what we've been doing on that today, I mean, in terms of the SLAs, I think those are as good as they've ever been, from that perspective. So we have a big team behind us that's working really hard to make sure that the customer experience is exactly what we want a Tintri experience to be. >> So, big messages at this show, of course: multi-cloud, Kubernetes, solving climate change, fixing the homeless problem in San Francisco. I'm not hearing that from you guys. What's your key message to VMworld? >> Well, I personally believe that there's a lot of opportunity to invest in improving operations that are already pretty darn stable, operating these environments, talking to folks here on the floor. These new technologies you're talking about are certainly going to change the way we deploy things, but there's going to be a lot of time left still operating virtualized server infrastructure and accelerating VDI deployments, to just operationalize things better. We're hoping that folks choose some new technologies out there. I mean, there was a lot of hype in past years about what technology to choose, like all-flash infrastructure. Well, I'd like to, for the record, say we're intelligent infrastructure. We have 10 and 40 gig ports, we're all flash, but that's not why you choose this. You choose this because you're able to tame your operations and spend more of your time on the apps, because you're not messing around with that low-level infrastructure. I think there's a renaissance of investment and opportunity to innovate in that space, and to Graham's point about going further up the stack.
We now have database technology that, we can show, gives database administrators the direct ability to self-service their own cloning, their own staging, their own operations, which would otherwise be a complex set of internal trouble tickets to provision the environment. Everyone loves self-service. That's really big. I think our customers love the self-service aspect. >> I see the self-service, and the ability to, again, not have to worry about all the things they don't have to do, not having to get into those details. As Mario mentioned, on the database side, that's a workload; the workload intelligence that we've already had for virtual machines, we can now use to service that database object natively. We're going to do SQL Server later this year. Being able to see whether they've got a host or a network or a storage problem, being able to see that at the unit they're serving, having that insight, is tremendously powerful. Also being able to snapshot, to clone, to manage and protect that database in a native way, not having to worry about, you know, going into a console, worrying about the underlying infrastructure, the LUNs, the volumes, all the pieces that people would have to get involved with, maybe moving from, like, production to test, and those kinds of things. So it's the simplicity, it's all the things that you really don't have to do, getting down in terms of the LUNs, the volumes, the sizing exercises. One of our customers put it best: he says the Tintri box is the best employee he has. >> I've seen that, too. Reinvest, reinvest. I haven't heard a customer yet that talks about reducing staff. Their IT staff is really, really critical. They want to invest up the stack. I'll throw a buzzword out there: DevOps.
You didn't mention that; it's all about DevOps, right? And one thing that's interesting here is we're a technology that supports virtual environments, and many software developers use virtual environments to write, test, and basically develop programs. Being able to give those developers the ability to create new machines and be very agile in the way they do their testing is awesome. And in terms of just taking big amounts of data from an app, a functioning app, which is these virtual machines, being able to look at that on the infrastructure and work with copy data so that I can do stuff with that data, all on the underlying virtualization. We think of DevOps as being very much a cloud thing. I'd say that virtualization, specifically server virtualization, is the perfect foundation for DevOps-like functionality. And what we've been able to do is provide that user experience directly to those folks up the stack, so the infrastructure guy doesn't have to touch it. >> I wanted to pull a couple of threads together. We talked about the original vision, kind of VMware-centric, VM-centric; multiple hypervisors now; multi-cloud here at VMworld. So what are you seeing in the customers? Is it a multi-cloud portfolio? What are you seeing your customers going to in the future, with both on-premise, hybrid cloud, public? So where does Tintri fit into the storage portfolio? >> They kind of fit all over the map. I think most of the customers that we have ultimately have infrastructure on site and in their own control. We do have some that ultimately put those out in places that are, quote-unquote, clouds, if you will, but they're not in the service vendor clouds. We actually have a couple of folks that are cloud providers, so they're building their own clouds to service customers using our kit.
What differentiates their services, say for better DR offerings, is that they can offer something that's very end-to-end for that customer, and so there's more they can monetize. >> Yeah, and I think those types of customers are like the more regional provider, or more of a specialty service provider, rather than the roll-your-own stuff. I'd say that, generally speaking, folks want to have a level of abstraction as they go into new architectures. So multi-cloud, and from a past life I wrote a lot about this, is this idea that I don't have to worry about which cloud I'm on to do what I'm doing. I want to be able to do it, and then, regardless of which cloud I'm on, it just works. And so I think that our philosophy is how we can continue to move up the stack and provide not just access to our analytics, because all that analytics stuff we do in machine learning is available via API, and we have a vRO plugin and all that sort of stuff to allow that to happen. But when we're talking now about apps, and how those apps work across multiple, you know, pieces of infrastructure, multiple VMs, we can now build a composite view of what those analytics mean in a way that really gives them new insights. So how can I move it over here? Can I move it over here? What's going to happen if I move it over here or over there? And I think that's the part that should at least delineate your average garden-variety infrastructure from what we like to call intelligent infrastructure: something that can actually do stuff to give you that data, because there's always a way you could do it the long way. Just nobody has time to do it the long way, huh? >> No. And I would actually say that what you just touched on, going back to a fundamental Tintri differentiator: getting that level of abstraction right is absolutely the key to what we do. We understand that workload. That virtual machine is the level of abstraction.
It's the unit of infrastructure within a virtual environment. In terms of somebody who's running databases, databases are the unit of infrastructure they want to manage. So we align exactly to the fundamental building blocks that they're working in. And containers are certainly another piece we're looking at moving forward. We've actually been looking pretty hard at containers for about three years now; we've been waiting to see where customers were at. Obviously VMware put some things on the map this week in terms of that, which we're pretty excited about, in terms of looking at how we would support it. >> Well, it certainly makes it more interesting if you're going to lean into it with someone like VMware behind it. I mean, I still think there are some questions, but I actually like the strategy, because if I understand it correctly, the vSphere admin is going to see vSphere, but a developer is going to see Kubernetes. >> Yeah, that's kind of cool. And we just want to give people an experience that allows them to self-service, under the control of the IT department, so that they can spend less time on infrastructure. At the end of the day, I haven't met a developer that even likes infrastructure; they love to not have to deal with it at all, and they only do it out of necessity. Even database folks, they loathe infrastructure because they have to think about it. They want to avoid the pitfalls of bad infrastructure. Infrastructure as code? Yeah, we believe in that. >> A question on go-to-market. You preserved the Tintri name, so that says a lot. What's the go-to-market like? How are you guys structuring the organization? >> In terms of the parent company perspective, we're a wholly owned subsidiary of DDN, so Tintri by DDN. Our go-to-market model is channel-centric, in the sense that a vast majority of people who procure IT infrastructure still prefer to use an integrator or reseller of some sort.
As far as that goes, what you'll see from us, probably more than you did historically, is more work with some of the folks in the ecosystem. Let's say in the data protection space, we see Rubrik as an example, and I think you can talk to some of that scene, where historically Tintri hadn't really done as much collaboration. But I think now, given the overall stability of the segment and people knowing exactly where value can be added, we have a really cool joint story. And you can talk about that, because your team does that. >> Yeah, so I would certainly say, in terms of the go-to-market side, we've been very much channel-led. Actually, it's been very interesting to go through this with the channel folks. There are also a couple of other pieces; you mentioned some of the cloud providers. Some of those certainly cross lines between whether they're MSPs or whether they're resellers, especially as we go to our friends across the pond. Maybe that's the VMworld Barcelona discussion, but some of those are all three, right? So they're a customer, they're a service provider, and they're a channel partner, if you want, in terms of a reseller. So it's been pretty interesting from that perspective; I think there's a lot of opportunity in there. Certainly, where we're at, we understand customers have ecosystems, in the backup space, right? Customers are doing new and different things in there, and they want us to fit into those pieces.
And I'd certainly say, in the world that we're in, we're not trying to go solve and boil the ocean in terms of all the problems ourselves. We're trying to figure out the things that we can bring to the table that make it easier for them to integrate with us, maybe in some new and novel ways. >> So, a question: what's the number one customer problem where, when you guys hear it, you say, that's our wheelhouse, we're going to crush the competition? >> I'll let you go first. >> So I'd say, if they have a virtualized environment, we belong there. Actually, somebody said it best earlier today in the booth: the person who doesn't have Tintri is the person who doesn't know about Tintri. If they have a virtual environment... I would say this week has been pretty interesting. Lots of customer meetings, so it's been pretty awesome getting a lot of feedback. But I would say the things that they're asking us to solve are not impossible things. They're looking for evolutions. They're looking for better insights into their environment, maybe deeper insights. One of the things we're looking to do is with the tremendous amount of data we've got coming back: we've got almost a million machines reporting back to us in terms of auto-support data every single night, about 2.3 trillion data points over the last three years. So we're looking to turn that data into meaningful, consumable information for them that's actionable. So again, what can we see in a virtual environment? Not just Tintri things in terms of storage and those kinds of things, but maybe what patches they have installed that might be affecting a network driver, which might affect a certain configuration, and being able to expose that and give them some actionable ways to go take care of those problems. >> All right, we've got to wrap, Mario. I'll give you
the last word. >> Stated simply: if you are using virtualization to abstract infrastructure as a way to accelerate your operations, and you run VMware, if you have 100 virtual machines, 150 virtual machines, you could really benefit from maybe choosing a different way to do that infrastructure. I can't say the competition doesn't work; of course the products work. We just hope that folks can see that doing it differently may produce a different outcome, and different outcomes can be good. >> All right, Mario, Graham, thanks very much for coming to theCUBE. >> Great. Thank you so much. >> All right, thank you for watching. For John Troyer and Dave Vellante, we'll be back with our next guest right after this short break. You're watching theCUBE.

Published Date : Aug 29 2019


David Graham, Dell Technologies | CUBEConversation, August 2019


 

>> From the Silicon Angle Media office in Boston, Massachusetts, it's theCUBE. (upbeat music) Now, here's your host, Stu Miniman. >> Hi, I'm Stu Miniman, and this is theCUBE's Boston area studio, actually our brand-new studio, and I'm really excited to have who I believe is a first-time guest; a long-time caller, you know, a long-time listener. >> Yeah, yep, first-time caller. >> A good buddy of mine, Dave Graham, who is a director of emerging technologies messaging at Dell Technologies. Disclaimer: Dave and I worked together at a company some of you might have heard of in the past, EMC Corporation, which was a local company. Dave and I both left EMC, and Dave went back after Dell had bought EMC. So Dave, thanks so much for joining; it is your first time on theCUBE, yes? >> It is the first time on theCUBE. >> Yeah, so some of the first times that I actually interacted with this team here, you and I were bloggers doing lots of stuff back in the industry, so it's great to be able to talk to you on camera. >> Yeah, same here. >> All right, so Dave, I mentioned you were a returning former EMC-er, now Dell Tech person, and you spent some time at Juniper and at some startups, but give our audience a little bit about your background and your passions. >> Oh, so background-wise: I started my career in technology, if you will, at EMC, in inside sales of all places. I worked my way into a consulting/engineer type position within ECS, which was obviously a pretty hardcore product inside of EMC, or Dell Technologies now. Then I left and went to a startup; everybody's got to do a startup at some point in their life, right? Take the risk, make the leap. That was awesome; it was actually one of those cloud brokers that's out there, like Nasuni, a company called Sertis. It had a little bit of trouble about eight months in, so it kind of fell apart. >> Yeah, the company did, not you. >> The company did!
(men laughing) I was fine, you know, but yeah, the company had some problems. I ended up leaving there, going to Symantec of all places. So I worked on the Veritas side, kind of the enterprise side, which just recently got bought out by Avago, evidently. >> Broadcom. >> Broadcom, Broadcom, part of the grand whole of Avago. >> Dave, Dave, you know we're getting up there in years when we keep talking about this, 'cause I was just reading about it, right: Broadcom, which of course Avago bought in the second-largest tech acquisition in history, but when they acquired Broadcom they took on the name, because most people know Broadcom; not as many people know Avago, even those of us with backgrounds in the chip and semiconductor space and all those pieces. I mean, you've got Brocade in there, you've got some of the software companies that they've bought over time, so some of those go together. But yeah, Veritas and Symantec, those of us especially with some storage and networking background know those brands well. >> Absolutely, PLX being the PCIe switches as well; it's actually Broadcom, those things. So yeah, I went from Symantec after a short period of time there to Juniper Networks, ran part of their Center of Excellence, kind of a data center overlay team; the only non-networking guy in a networking company, it felt like. Can't say that I learned a ton about the networking side, but I definitely saw a huge expansion in the data center space with Juniper, which was awesome to see. And then the opportunity came to come back to Dell Technologies. Kind of everything old becoming new again, right? Going and revisiting a whole bunch of folks that I had worked with 10, 13 years ago. >> Dave, it's interesting; I think about somebody like Broadcom, and Avago, and things like that.
I remember reading blog posts of yours where you'd get down to that nitty-gritty level; you and I would be the ones talking about the product: all right, now pull the board out, let me look at all the components, let me understand the spacing, and the cooling, and all the things there. But here it's 2019, Dave. Don't you know software is eating the world? So tell us a little bit about what you're working on these days, because the high-level things definitely don't bring to mind the low-level board pieces that we used to talk about many years ago. >> Exactly, yeah, it's no longer thermals and processing power as much, right? There are still aspects of that, but a lot of what we're focused on now, or what I'm focused on now, is within what we call the emerging technology space. Or horizon 2, horizon 3, I guess. >> Sounds like something some analyst firm came up with, Dave. (Dave laughing) >> Yeah, like Industry 4.0, 5.0 type stuff. It's all exciting stuff, but when you look at technologies like 5G, fifth-generation wireless, both millimeter wave and sub-six gigahertz, and AI, it's everything old becoming new again, right? Stuff from the fifties and sixties that's now starting to permeate everything that we do; you're not opening your mouth and breathing unless you're talking about AI at some point. >> Yeah, and you bring up a great point. We've spent some time with the Dell team understanding AI, but help connect it for our audience: when we talk about AI, we're talking about data at the center of everything, and it's those applications. Are you working on some of those solutions, or is it the infrastructure that's going to enable that, and what needs to be done at that level for things to work right? >> I think it's all of the above. The beauty of Dell Technologies is that you sit across both infrastructure and software.
You look at the efforts and the energies, stuff like VMware buying Bitfusion, right, as a mechanism to tap into some of that low-level hardware stuff, into what the infrastructure guys have always been doing. When you bring that kind of capability up the stack, now you can start to develop, within the software mindset, how you're going to access this. Infrastructure still plays a huge part of it; you've got to run it on something, right? You can't really do serverless AI at this point. Am I allowed to say that? (man laughing) >> Well, you could say that, but I might disagree with you, because there absolutely is AI running on it. You know, Dave, for my serverless 101 article I actually had Ashley Gorakhpurwalla, who is the General Manager of Dell servers, holding the t-shirt that says "there is no serverless, it's just a function where you only pay for the piece that you need, when you need it." The point of the humor was that even the largest server manufacturer in the world knows that underneath the serverless discussion there is absolutely still infrastructure in play; today it just tends to primarily be in AWS with all of their services. But with serverless, we're letting the developers be developers and not have to think about that stuff. And, Dave, with the background we've had, we want to get rid of silos and make things simpler; these are the things we've been talking about for decades. For me it was interesting to look at: it is very much a developer- and application-driven piece, top-down, as opposed to virtualization and infrastructure-as-a-service, which are more bottom-up, let me try to change this construct so that we can then provide what you need above it. It's just a slightly different way of looking at things.
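The "there is no serverless, just a function" point above is easy to see from the shape of the code: an AWS Lambda-style handler is just a plain function taking an event and a context, and everything underneath it is still servers. A minimal sketch; the event fields here are my own assumptions for illustration.

```python
# Minimal AWS Lambda-style handler: the "serverless" unit is just a
# function, and the infrastructure that runs it is somebody else's problem.
# The event shape below is an assumption for illustration.
def handler(event, context=None):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"hello, {name}"}

# Locally you can invoke it the way the platform would:
print(handler({"name": "Stu"}))
```

That is the whole developer-facing surface: the top-down, application-driven view, with the bottom-up infrastructure abstracted entirely out of sight.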
>> Yeah, and I think we're really trying to push for that stuff, so you can bundle together hardware that makes the development platform easy, right? But a lot of the effort and energy is in our partnerships; Dell has engaged in a lot of partnerships within the industry: NVIDIA, Intel, AMD, Graphcore, you name it. We're out in that space working along with those folks, but a lot of that is driven by software. You write to a library, like CUDA or, you know, PyTorch; you're using those types of elements and moving toward that. But then it has to run on something, right? So we want to be in both ends of that space. We want to enable that kind of flexibility and capability, and obviously not prevent it, but we also want to expose that platform to as many people within the industry as possible so they can start to develop on it. You're becoming a platform company, really, when it comes down to it. >> I don't want to get into the semantic arguments of AI, if you will, but what are you hearing from customers, and what's driving some of the discussions lately: the reality of AI, as opposed to just the buzzy hype that everybody talks about? >> Well, I still think there's some ambiguity in the market around AI versus automation, even. So people come and ask us, "well, I believe in this thing called artificial intelligence, and I want to do X, Y, and Z." And these particular workloads could be better handled, not to distill it down to the barest minimum, but by simple cron jobs: go back in history, look at the things that matter, things you could do very simply that don't require a large library or an understanding of more advanced algorithms. In the reverse, you still have real capability now, in everything that we're doing within industry; you use chat-bots.
With some of the intelligence that goes into those, people are starting to recognize: this is a better way that I could serve my customers. Really, it's that business-out kind of viewpoint. How do I access these customers? They may not have the knowledge set here, but they're coming to us and saying, "it's more than just a call, an IVR system," like an electronic quick-response thing, right? I need some context, I need to be able to do this and transform my data into something that's useful for my customers. >> Yeah, this is such a great point, Dave. The thing I've asked many times is: my entire career we've talked about intelligence and we've talked about automation, so what's different about it today? And the reality is, it used to be that I was scripting things, or I would have some Bash processes, or I would put these things together. At the order of magnitude and scale of what we're talking about today, I couldn't do it manually if I wanted to. And that automation can be really cool these days; to set all of it up, there is more intelligence built into it, whether it's AI or just machine learning underneath. Across that spectrum there are real use cases, a lot of things happening, and it is orders of magnitude improved over what we were talking about, say, back when we were both at EMC and the latest generation of Symmetrix was much more intelligent than the last generation. If you look at that 10 years later, boy, it is night and day; how could we ever have used those terms before, compared to where we are today?
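Dave's distinction between AI and plain automation is worth making concrete: many of the "AI" requests he mentions are really a scheduled rule. A threshold check like the one below needs cron, not machine learning; the metric and threshold are invented purely for illustration.

```python
# Rule-based automation of the kind that beats "AI" for simple jobs:
# a threshold check you could run from cron. The metric and threshold
# are invented for illustration.
def check_disk(usage_pct, threshold=90):
    """Return an alert string when usage crosses the threshold, else None."""
    if usage_pct >= threshold:
        return f"ALERT: disk at {usage_pct}% (threshold {threshold}%)"
    return None

print(check_disk(95))   # fires an alert
print(check_disk(42))   # prints None: nothing to do
```

From cron you might run a script like this every five minutes, e.g. `*/5 * * * * /usr/bin/python3 check_disk.py` (the path and schedule are illustrative). Only when the rules stop fitting in a function like this does the machine-learning end of the spectrum earn its keep.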
Yeah, the development in the last 10 years, both in computing horsepower, and GPU/GPGPU horsepower, you know, the innovation around, you know FPGAs are back in a big way now, right? All that brainpower that used to be in these systems now, you now can benefit even more from the flexibility of the systems in order to get specific workloads done. It's not for everybody, we all know that, but it's there. >> I'm glad you brought up FPGAs because those of us that are hardware geeks, I mean, some reason I studied mechanical engineering, not realizing that software would be a software world that we live in. I did a video with Amy Lewis and she's like, "what was your software-defined moments?" I'm like, "gosh, I'm the frog sitting in the pot, and, would love to, if I can't network-diagram it, or put these things together, networking guy, it's my background! So, the software world, but it is a real renaissance in hardware these days. Everything from the FPGAs you mentioned, you look at NVIDIA and all of their partners, and the competitors there. Anything you geeking out on the hardware side? >> I, yeah, a lot of the stuff, I mean, the era of GPU showed up in a big way, all right? We have NVIDIA to thank for that whole, I mean, the kudos to them for developing a software ecosystem alongside a hardware. I think that's really what sold that and made that work. >> Well, you know, you have to be able to solve that Bitcoin mining problem, so. >> Well, you know, depending on which cryptocurrency you did, EMD kind of snuck in there with their stuff and they did some of that stuff better. But you have that kind of competing architecture stuff, which is always good, competition you want. I think now that what we're seeing is that specific workloads now benefit from different styles of compute. And so you have the companies like Graphcore, or the chip that was just launched out of China this past week that's configurable to any type of network, enteral network underneath the covers. 
You see that kind of evolution in capability now, where general purpose is good, but now you start to go into reconfigurable elements so, I'll, FPGAs are some of these more advanced chips. The neuromorphic hardware, which is always, given my background in psychology, is always interesting to me, so anything that is biomorphic or neuromorphic to me is pinging around up here like, "oh, you're going to emulate the brain?" And Intel's done stuff, BraincChip's done stuff, Netspace, it's amazing. I just, the workloads that are coming along the way, I think are starting to demand different types or more effectiveness within that hardware now, so you're starting to see a lot of interesting developments, IPUs, TPUs, Teslas getting into the inferencing bit now, with their own hardware, so you see a lot of effort and energy being poured in there. Again, there's not going to be one ring to rule them all, to cop Tolkien there for a moment, but there's going to be, I think you're going to start to see the disparation of workloads into those specific hardware platforms. Again, software, it's going to start to drive the applications for how you see these things going, and it's going to be the people that can service the most amount of platforms, or the most amount of capability from a single platform even, I think are the people who are going to come out ahead. And whether it'll be us or any of our August competitors, it remains to be seen, but we want to be in that space we want to be playing hard in that space as well. >> All right Dave, last thing I want to ask you about is just career. 
So it's interesting: at VMworld I'm actually sitting on a panel for Opening Acts, which is run by the VMunderground people on the Sunday before VMworld really starts, talking about jobs. There are actually three panels: careers, financial, and some of those things. >> I'm going to be there, so come on by. >> Maybe I should join. Starting at 1 o'clock Monday evening, I'm participating in a Career Cafe, talking about people and everything like that, so all that stuff's online if you want to check it out. But, right, you said psychology is what you studied, yet you worked in engineering, you were a systems engineer, and now you do messaging. There's always that boundary between the hardcore techies and the marketing folks, but I think it's obvious to our audience, when they hear you geeking out on the TPUs and all the things there, that you're quite knowledgeable when it comes to the technology. The good technical marketers, I find, tend to come from that kind of background. So give us a little bit, looking back at where you've been and where you're going, and some of those dynamics.
I had the education, was very, you know, that was my family, was very hard on the education stuff. You're going to do this. But being able to follow that passion, a lot of things fell into place with that, it's been a huge blessing. But even in grad school when I was getting my Masters in clinical counseling, I ran my own consulting business as well, just buying and selling hardware. And a lot of what I've done is just I read and ask a ton of questions. I'm out on Twitter, I'm not the brightest bulb in the, of the bunch, but I've learned to ask a lot of questions and the amount of community support in that has gotten me a lot of where I am as well. But yeah, being able to come out on this side, marketing is, like you're saying, it's kind of an anathema to the technical guys, "oh those are the guys that kind of shine the, shine the turd, so to speak," right? But being able to come in and being able to kind of influence the way and make sure that we're technically sound in what we're saying, but you have to translate some of the harder stuff, the more hardcore engineering terms into layman's terms, because not everybody's going to approach that. A CIO with a double E, or an MS in electrical engineering are going on down that road are very few and far between. A lot of these folks have grown up or developed their careers in understanding things, but being able to kind of go in and translate through that, it's been a huge blessing, it's nice. But always following the areas where, networking for me was never a strong point, but jumping in, going, "hey, I'm here to learn," and being willing to learn has been one of the biggest, biggest things I think that's kind of reinforced that career process. 
>> Yeah, definitely Dave, that intellectual curiosity is something that serves anyone in the tech industry quite well, 'cause, you know, nobody is going to be an expert on everything, and I've spoken to some of the brightest people in the industry, and even they realize nobody can keep up with all of it, so that being able to ask questions, participate, and Dave, thank you so much for helping me, come have this conversation, great as always to have a chat. >> Ah, great to be here Stu, thanks. >> Alright, so be sure to check out the theCUBE.net, which is where all of our content always is, what shows we will be at, all the history of where we've been. This studio is actually in Marlborough, Massachusetts, so not too far outside of Boston, right on the 495 loop, we're going to be doing lot more videos here, myself and Dave Vellante are located here, we have a good team here, so look for more content out of here, and of course our big studio out of Palo Alto, California. So if we can be of help, please feel free to reach out, I'm Stu Miniman, and as always, thanks for watching theCUBE. (upbeat electronic music)

Published Date : Aug 9 2019


Graham Stringer & Kevin Johnston, DXC Technology | Dell Technologies World 2019


 

>> Live from Las Vegas, it's theCUBE, covering Dell Technologies World 2019. Brought to you by Dell Technologies and its ecosystem partners. >> Welcome to Vegas! Lisa Martin with John Furrier. You're watching us on theCUBE live. The end of Day One of our three days of coverage of Dell Technologies World. Can you hear the music? The party's already getting started. We have more content to bring you. Please welcome a couple of guests from DXC Technology, Kevin Johnston, Chief Sales and Revenue Officer, Cloud and Platform Services. Kevin, it's great to have you. >> Thank you very much. Glad to be here. >> Our pleasure. We've got Graham Stringer, Managing Director of Workplace and Mobility for DXC Americas. >> Thank you. Good to be here as well. >> Yeah, you made it just in time for the concert, guys! >> We did. >> Just in time. Here we go. >> All right, so, Kevin, let's go ahead and start with you. Give our audience an understanding of DXC. What you guys do, who you are, all that good stuff. >> Yeah, okay. That's great. So DXC was formed two years ago as a result of the merger of the legacy HP Enterprise Services business and CSC. DXC was formed really for the purpose of helping our large enterprise clients accelerate their digital transformation. So we're about a $22 billion IT services company, really aligned with our partners, helping our clients transform digitally. >> And you guys were on the cloud early, too. There's a lot of devops going on. >> Yep. >> You guys had your hands in all the clouds. >> We have. >> What's your take on, here at Dell Technologies World, Microsoft's partnering with VMware? >> Yeah, so we would share a lot of beliefs with Dell Technologies and VMware in particular, in that multi-cloud is a real thing. And we see multi-cloud, especially for the large enterprise clients, really being an answer for quite some number of years to come. We also believe that a large percentage of application portfolios will migrate to cloud.
Whether it's private clouds or public clouds, and that there's a lot of work to be done to transform those applications to really take advantage of cloud native features. >> So last year's theme of Dell Technologies World was Make It Real, 'It' being digital transformation, security transformation, IT transformation, and workforce workplace automation. Graham, I'd love to get your perspectives on workplace mobility and some of the things that were announced this morning with Unified Workspace, Workspace ONE, and recognizing, hey, for our customers to transform digitally successfully, we've got to make sure that their people are successful, and their people are highly distributed. What are some of the things that you heard this morning that are exciting, aligning with some of the trends that you're seeing in the workplace? >> Well the big trend that we're seeing is the role that HR is now playing in digital transformation of the workplace. If you go back two, three, four years, it was very IT centric. Conversations were predominantly with the CIO. We're now seeing 30, 40% of organizations or more engaging at the HR level. We did a recent project with one of the big retailers in the industry and right off the bat, this chief HR officer was engaged right from the get-go. They want to know that their employees are going to experience work very differently. So that's one of the big trends we're seeing emerging.
And you can't stick with a very traditional, legacy way of delivering IT where everything was shift left and you got to a point where everybody hated each other. >> That's a problem for productivity. >> Yes, a very big problem for productivity, absolutely. >> Talk about some of the challenges that customers have overcome with digital transformation, as it starts to become less of a buzz word and actually more of a reality and strategic imperative that has some visibility at the unit economics and value. >> Yeah, I think every large enterprise client we talk to has a digital transformation agenda of some sort and at some varying place along the path to trying to adopt a new business model or adapt to a different business process, so the challenges that we see with these clients in general is how do we scale? So I have legacy IT that won't disappear overnight and I have all the possibilities of digitally enabling or bringing new digital technologies that enable these processes or models. So this is a challenge: how to enable digital at scale where traditional and digital have to live together for some period of time. >> And it's not just a tech challenge, it's culture, too. How far has tech come because you've mentioned containers with legacy? That has been a great message to IT is I can put a container around it and hold onto it for a little while longer, I don't have to kill it, and make the changes to cloud-native. >> For the tech guys, there's been a lot of fun things and containers probably is the bridge for legacy apps into cloud for sure. For the rest of the folks, for the normal people, the way work gets done and the way to rethink how to do work in the mix of IT or technology into business is just different. >> Graham's point is beautiful because the expectation of the employee or the worker whether they're in the firm or outside the firm, outside in or inside out, however they look at it, is the new experience they want. So the expectations are changing. 
What's the biggest thing, we saw some stats on stage about remote working, three places, two places, I mean, hell, I'm always on the road. What are some of the expectations that you're seeing? Obviously millennials and some of the older folks. >> They want to see IT delivered in the way they want to receive it. That's one of the biggest trends we're seeing. So for Millennials, my son's kind of in that age category, right, they love to text. To pick up a phone for a younger generation is a little bit foreign. You go and deal with baby boomers, they want to be dealt with in a much different manner. So you've got that whole change, and then you've got the whole notion now of work is changing; where do I work, the ability to basically work 24/7, wherever I want, however I want, using whatever device that I want. And that of course is now creating a whole new set of challenges for IT, particularly around security. >> But employee experience is absolutely fundamental to a business' success; their ability to delight customers, their ability to deliver outcomes, so it's really pretty core. Talk to us about those conversations that you're having with customers. Are they understanding how significant that employee experience is to bottom line business outcomes differentiation? >> Very much so. We're working right now with a large manufacturing firm and they're doing not just an inside out, but outside in, so they're actually coming to watch. It's part of a workplace strategy to look at it from the outside as well. In other words, how can our client take innovation to their suppliers, their customers, to demonstrate that they understand it? So that's extremely exciting when we see that they're not just focused on their own employees and the experience germane to them. >> One thing I might add is that maybe less so from a user experience per se, but the individuals as an employee.
So the shift to digital and the skill shift that's required to go with that is really probably the most monumental change that all of us technology companies and the business part of our large enterprise clients are dealing with. Whether it's a skills gap or whether it's a culture gap, this idea of just simply waterfall to agile and the way to think about that, or silo versus end-to-end, as just simple ways to think differently about how to go faster. So the experience, how you recruit, who's going to make it, who can be trained, and then where you need to be able to source the new talent from as well. >> I totally agree with you. We do hundreds of shows a year, this is our tenth year doing theCUBE, and that is the number one thing that we hear over and over again from practitioners and customers and from people working. It's not the tech, you can always get a tech solution, it's the cultural and the skills gap. Both are huge problems. >> And this is part of the digital at scale point. So we'll hire something in the neighborhood of six to eight thousand digital skills people. We're just about to close on our acquisition of Luxoft, an agile devops digital company. We'll bring another 13,000 in. But if you think about the normal large enterprise and what you need to do to be able to have the university networks and to be able to really source that scale in order to effect the transformations that businesses need to make to stay competitive. >> And the other point, the engagements have changed too. I'm sure you guys have seen it on your end but every IT or CIO we talk to says, "I outsourced everything decades ago and now I've got a couple guys running the show. Now I need to have a hundred x more people coding and building core competency." That's still going to need to engage people in the channel or our service providers but they need to build core talent in house. It's swinging back and they don't know what to do. (laughter) Is that why they call you guys?
Is that how you guys get involved? >> We'll help train. We'll help clients think through what does an IT or business organization need to look like profile wise, skill wise, operating model wise, and in many cases it's I have my digital model but I still have my traditional model that needs to coexist with it and then here's where the opportunities are for people to develop career paths and progress. >> Kevin, talk about the sweet spot of your engagements that you're doing right now. Where's the heart of your business? Is it someone who's really hurting, needs an aspirin, they've got a headache, is it a problem? Is it an opportunity? Is it a growth issue? Where do you see the spectrum of your engagements? >> We kind of find clients in one of three spots normally. "Hey, I know I need to do something but I'm not sure what it is, can you help me figure out how to get started?" So more design thinking, problem solving. We have other clients at the other end of the spectrum who are, "Hey, I've got this figured out. I need a partner to help me execute at scale. And I know the model that I want to do, I know the business reason for doing it." And then we have a lot of folks that are in the middle, which is, "I've started, I've got a few hundred AWS accounts. I got private clouds sitting idle. Someone help me." Or, "I've got security issues, compliance issues." >> So they're in the middle of the journey and they just need a little reboot or a kickstart. >> They need help scaling. >> They ran out of gas. (laughter) >> And how are you working with Dell Technologies and their companies, Dell EMC, if they were to do that? >> The partnership with Dell Technologies, VMware, is really central to how we go to market. DXC is one of the largest partners in the ecosystem. The breadth of our portfolios is extremely complementary, whether it's things like device as a service or multi- and hybrid cloud, or Pivotal and devops.
So the breadth of the portfolios match up really well, which makes the impact potential for our clients even more important. Dell Technologies broadly is really one of the few partners that we're going shoulder-to-shoulder to market with as well. >> Awesome. Great stuff. What are the biggest learnings you guys can share with the audience that you've gathered over your multiple engagements holistically across your client base? That's learnings, that could be a best practice, or just either some scar tissue or revelations or epiphanies. Share some experience here. >> I think one of the big learnings we're seeing is the shift now to very much business outcome driven decision making. If you go back to your point about the big ITO outsourcing days, that was all about just strictly driving cost out, and that's why you got to that point where everybody was left hating each other. Now it's about business outcomes. You've got the impact of Millennials, you've got organizations wanting to create a new and better experience for the employees and they're coming to us to say, "How do we accomplish that?" We've got an organization we're working with right now, they're trying to elevate themselves to be one of the top 50 best places to work for in the US. How do they arrive at that? For them, that's their barometer and so it's not about driving costs out, it's really achieving that overall experience and an enhanced business outcome. >> So they're betting on productivity gains from morale and happy workers. >> Right. And also they're recognizing the downstream impact on their customers, productivity, the level of employee engagement, right? I mean those are the things that the organization knows that if they hit on those, I mean the sky's the limit. >> Right. Anything on your end? Learnings? >> Yeah, I would say don't underestimate the talent challenge. The ability to pivot from here's the way we all know and are familiar with doing things to the new way.
There will be a big talent challenge. The other thing is the operating model from an IT standpoint. A traditional IT operating model operates at a particular speed, cloud operates at a different speed. And the tools, the talents, the skills that go with that are just completely different. And then I think the last thing is just, it seems maybe surprising, but compliance at scale and at speed. So security and regulatory compliance, we see that falling over all the time. >> Great practice you guys. I've been following you guys for many years, you've got a great organization, lots of smart people there we've interviewed many times. My final question is a tech question: what technologies do you guys like that you think are ready for prime time or almost ready for prime time, worth having customers keep focusing on, and which ones are a little more over hyped and out of reach at the moment? >> I'll take a stab at that. If you look at today's Wall Street Journal, Deloitte talks about, I believe the figure they quoted was roughly 25% of organizations doing AI in some form already, PoC or at least committing to it in terms of strategy. We're seeing that inside DXC as well. AI is now being incorporated into our workplace offerings. The potential for that is enormous, it's real. The technology in the last couple of years, particularly with cloud computing, has really enabled it. When you look at platforms like Watson, these are capabilities that just weren't there 10, 12, 15 years ago, and now the impact that it can have on the workplace, help lines, chats, chatbots, and so forth, is enormous and it's real. Five, 10 years ago it definitely was not in its maturity. >> Okay, over hyped. >> What's over hyped? I don't know, what comes to mind for you? >> Or maybe I'll rephrase it differently: not yet ready for prime time, but looks good on the fairway but not yet known... >> I think for me through workplace, IoT has still got a ways to go. AI and analytics is definitely there.
IoT I would say is a little bit behind. I'm sure that Kevin has cloud and platform thoughts. >> Yeah, I would say from an over hyped standpoint, we've seen a lot of companies, large enterprises with legacy application portfolios, think they're going to refactor all their applications and cloud native everything. So it feels that people are now kind of getting past that point, but we still see that idea a lot. I think the opportunity that is really in front of us, and you kind of called out, is containers. Legacy applications into cloud feel like a remaining frontier for the large enterprise. We think containers and the idea of autonomous, continuous optimization of financial performance is a way to make apps run in cloud, financially and performance wise, in a way that we don't see a lot of companies fully solving for yet.

Published Date : Apr 30 2019

Jocelyn Degance Graham, CloudNOW | 7th Annual CloudNOW Awards


 

[Narrator] From the heart of Silicon Valley, it's theCUBE. Covering CloudNOW's seventh annual Top Women Entrepreneurs in Cloud Innovation Awards. (techno music) >> Hi, Lisa Martin on the ground with theCUBE at Facebook headquarters at the seventh annual CloudNOW Top Women in Cloud Innovation Awards. We are here for our third time with the founder of CloudNOW, Jocelyn Degance Graham. Jocelyn, it is great to have you back. Great to be back here for your seventh annual CloudNOW. >> We are just so delighted to be here with you Lisa and theCUBE and all of the support and wonderful help that you've given us through the years for this event. >> So you have a lot of firsts that I wanted to cover and I know we've just got a few minutes of your time. Seventh annual, as I mentioned. >> Jocelyn: That's right. >> Your CloudNOW community now boasts over 1500 members. There's over 300 attendees here tonight. >> Jocelyn: That's right. >> And tell us what was really unique about how easy it was to attract this audience. >> Well, we've never had such a great response for this event Lisa. And some of that could just be the timing. It is finally an idea whose time has come, right? And so there just seems to be such a groundswell of understanding the importance of inclusion and diversity. And beyond that actually creating belonging, right? So more and more, I feel like there's such an actual enthusiasm that we hadn't seen before. So this year, we didn't actually publish tickets or let people know that tickets were available. Everything was essentially sold through word of mouth. And so, we never even published any tickets. And we sold out of the event. And that was definitely a first for us. >> Another first is being here. >> Yeah. >> Not only being here at Facebook headquarters in Menlo Park, California, but also having Sheryl Sandberg as one of the keynote speakers this evening. >> It is such an honor.
You know she is one of the women who has just been so important in terms of the seminal movement of women in tech. Like many women, I read her book, Lean In, years ago. And the fact that she's here with us tonight at the event, after having inspired an entire movement, is really significant and we're just thrilled. >> Another thing that's really interesting and a unique first for CloudNOW this year is you're recognizing 10 female tech entrepreneurs who are technical founders. >> Jocelyn: Technical founders, venture backed. >> Of venture backed businesses. >> Jocelyn: That's absolutely right. >> Tell us about how you've been able to achieve that because their backgrounds are diverse and the technologies that they're designing and driving are really incredible. >> This is one of the most, I think, exciting firsts about the event this year. In past events, we were recognizing women that had made major contributions in a technical field. And we were recognizing women regardless of the level or role or responsibility in the organization. Now we had largely done that because there were so few female founders of venture funded startups. This year was an absolute breakthrough year for many different reasons. There are organizations now like Allraise.org that are supported by women VCs. And there just seems to be an entire groundswell of female founders and we were able to, this year for the first time, align the criteria around female technical founders. And I'm really hoping that moving forward we'll be able to continue with that as more and more women realize that they should be starting businesses and they can get venture backing. >> And we're excited to talk to those winners tonight and ask how did you go about doing that? What were your inspirations and how do you kind of combat those fears and just the history of the challenge of getting funding there?
Another thing that I noticed on the CloudNOW website is one of your taglines is "together we can make a difference." In, you know, just the last minute or so give me some examples of how you're helping to make a difference that really resonate with you and that give you inspiration for your 2019 goals. >> That's such a great question. So for me, really one of the most heartening things about the organization is the work we're doing together through our scholarship program. So we're identifying the next generation of both female and minority leaders in tech and we're investing in them through our STEM scholarship fund. This year, we have funders. Our funders include Google, Intel and Facebook. And we're really hoping to be able to expand that scope next year, Lisa, to increase the number of students we're helping. This year we're also, in addition to women, we are helping minority students as well. And for next year, we're wanting to expand those categories even further and being able to support people with disabilities. So we're really hoping to create this kind of very strong fabric of the community coming together and really giving each other support. >> Jocelyn, thank you so much for having theCUBE back for the third year in a row and congratulations on the groundswell that you're capitalizing on and that you're helping to create. We congratulate you and we appreciate your time. >> Lisa, it's always a pleasure. I love speaking with you. Thanks so much for coming. >> Likewise, we want to thank you for watching theCUBE. Lisa Martin on the ground at Facebook headquarters at the CloudNOW Top Women Entrepreneurs in Cloud Innovation Awards. Thanks for watching. (techno music)

Published Date : Jan 29 2019

Jocelyn Degance Graham, CloudNOW | CloudNOW Awards 2017


 

(digital clicking noise) >> Hi. Lisa Martin with theCUBE. On the ground at Google for the 6th annual CloudNOW Top Women in Cloud Awards event. We're very excited to be here. And I'm now joined by the founder of CloudNOW, Jocelyn Degance Graham. Welcome back to theCUBE. >> Lisa, we are so happy to have you and theCUBE back for the second year. So our 6th annual event and the second year that you've been broadcasting. We're just really delighted to have your team be able to shine a spotlight on the incredible accomplishments of these women in tech. >> It's always so inspiring, Jocelyn, I was telling you before we went live, that I love reading about the people that you're honoring. But you yourself have been awarded a number of times. So you're quite the woman in technology as well. >> (laughs) >> I wanted to talk a little bit about CloudNOW and what you guys have done. Two really big announcements this year. Tell us about that. >> So the big things we've really been working on for 2017 are the scholarships, Lisa. I have to say of all the professional things this year, I really am the most heartened by the work in the scholarships. It is what is most important to me. And so we start by identifying two exceptional academic partners. We had looked at a number of ... We had read the research, we've been looking at how do you most make impact. And have more women join tech, join technical ranks, right? And so there's been a lot of debate and a lot of research about that. And what we have found is that it's very important for women to have a role model in an organization. It does not necessarily even have to be a mentor. It needs to be a role model. The other piece of the equation is the ambition gap. So it's not just about getting tons of women in the pipeline. It's also about getting women that really want to take it the whole way.
So this kind of combination factor of that next generation of leader that's really going to be able to get to that upper echelon of office. So the academic partners that we selected, we feel like they've really done a great job of identifying those future leaders. For us to be able to place our investments with them. To gather corporate partnerships that are willing to be able to fund that next generation of leaders. So we have exceptional partners. We have exceptional academic institutions. If I can, I'd love to tell you just a little bit about the academic partners that we've selected. >> Yes, absolutely, please do. >> Yeah, so the first one is Holberton School. And Holberton is in San Francisco. They have a really unique model. They don't charge students any kind of tuition up front. What they do is once the student has gotten their first full-time job, then they start paying back what they would have paid in tuition. And so, it's a remarkably equitable kind of format for education. >> Lisa: It is. >> It's very different than what most people are seeing for colleges and universities. The problem is how expensive it is to live in San Francisco. >> Lisa: Right. >> So the scholarships are actually a living wage stipend. Because the school is too intensive for the students to actually be able to work. It's a very compact program. Instead of four years, the students are done in two. So that's our first academic partner. The students are getting jobs at fantastic companies like LinkedIn and NASA. And they are actually out-competing MIT and Stanford grads for those jobs. >> That's phenomenal. >> It is phenomenal. So we are more than happy to suggest to our corporate funders that they put their money on those bets. >> Lisa: Excellent. >> So we've got Google and we've got Accenture that are funding those Holberton scholarships. And then the second academic partner is in Bangalore, India. And it's Shanti Bhavan.
You might have seen this with the Netflix documentary, "Daughters of Destiny." >> Lisa: It was incredible. >> Absolutely incredible and absolutely moving. The Shanti Bhavan school, for your viewers that are unfamiliar with it, they take children from the poorest of the poor backgrounds in rural India. They commit to educating these children from the age of four all the way through the university level. The scholarships we put together with the help of Intel and Apcera and CB Technologies are to fund girls studying STEM at the university level in Bangalore. And this is just the beginning, Lisa. We really hope that in 2018 we can increase the number of scholarships, and we really hope that we'll be able to increase the number of corporate partnerships as well. Because these students are doing phenomenal things, and we really believe that they're going to be taking their place alongside any of what the Ivy League graduates would be doing. >> I love that. And in our last minute, talk to us about Google and Google's involvement with you. Because that's pretty remarkable what you've been able to achieve for CloudNOW with Google. >> Thank you. The Google involvement has definitely been an evolving partnership. And the funding from Google actually happened ... It was a happy circumstance that I ran into Vint Cerf at a party and got introduced to him. I gave him a quick 30-second overview of what CloudNOW had been doing, and he handed me his business card and said, "It sounds really interesting, send me an email." >> Wow, from one of the fathers of the internet. That's pretty amazing. >> I couldn't believe how accessible and easy-going he was. But I went ahead and I emailed him. I said, "What I'm looking for is some money for a scholarship fund. I'm not asking you for it, I just know if you were to endorse this, the money would very easily be found." So I went to sleep. Woke up, and the very next morning there was a response from Vint, and he had sent me the money. 
>> Oh my goodness. >> And we were done. The fund was closed, we were on our way. >> Wow. >> And what he said in response, it was so beautiful, Lisa. He said, "One does what one can to be of service." That message, I've been really holding it with me for the last several months. "One does what one can to be of service." Because I think it's just a very inspiring message, especially as we all go into 2018 and think about what we're grateful for. I hope there are people in your audience that feel like they can do what they can and will join us in this very heartfelt mission. >> Wow. You are so inspiring, Jocelyn, with what you and your partners have created with CloudNOW. We thank you so much for asking us to be here. Our second year with the CUBE. It's a great event to cover. But be proud of what you've accomplished. >> Thank you, Lisa. >> Because it's incredible. >> Thank you for all of your support, it really means a lot to me. >> Excellent. We want to thank you for watching the CUBE, I'm Lisa Martin on the ground at Google for the 6th annual CloudNOW Top Women in Cloud event. Thanks for watching. (digital beat music)

Published Date : Dec 7 2017



Axel Streichardt, Pure Storage & Todd Graham, ScanSource - Pure Accelerate 2017 - #PureAccelerate


 

>> Announcer: Live from San Francisco, it's the CUBE covering Pure Accelerate 2017. (upbeat music) Brought to you by Pure Storage. (sparse percussion fading) >> Welcome back to San Francisco. We're at Pier 70, and this is Pure Accelerate. And this is the CUBE, the leader in live tech coverage. I'm Dave Vellante with my co-host David Floyer. First segment of the day. Welcome! >> Thank you. >> Dave: Todd Graham is here. He's the Vice President of IT Infrastructure at ScanSource, Inc. >> Thank you. >> Dave: Axel Streichardt, who's the Director of Business Applications Solutions at Pure Storage. Gentlemen, welcome to the CUBE. >> Thank you. >> Thanks. >> Okay, so let's get right into it. Well, if we start with ScanSource, what does ScanSource do? Set up the interview with just a little background. >> Sure, so we are an international technology distribution company. We have been around since 1994, public since 1994. Today we're in the US and North America, we're in Europe and Latin America, and we are quickly growing to 45 to 47 locations around the globe. We're very vertically focused on technologies such as telecommunications. Recently we bought a telecommunications services master agency, so we can deal with service and connectivity. Point of sale and barcode is our original business unit. And we do Voice over IP phone systems, videoconferencing, and those types of technologies today. >> You said you started in '94 and you've been public since '94. So you started with an IPO? (panelists laughing) >> It was very early. That's correct. (panelists laughing) >> Wow, that's amazing. I'd love, I got to talk to you afterwards. (panelists laughing) >> That's right. That's right. >> That's like Bitcoin or something. Okay, and then maybe we could set up the segment here. Axel, I saw you speaking here earlier to an audience. >> Axel: Right. >> Maybe describe the discussion that we're going to have here about cloud. 
>> We, of course, focusing a lot on the different flavors of cloud and the different deployment models that SAP customers are considering today, right? So it could be on premise. Do you want to do it in a hybrid cloud? Do you want it in a public cloud? And we see that, initially, a lot of customers were thinking and considering public cloud as the solution for SAP workloads. And it is interesting that, in recent months, we actually see that from this initial, let's say, movement we see a lot of customers actually reconsidering and coming back, right? And they're seeing that the economics, the flexibility, the agility that they were thinking about when moving certain SAP workloads to the cloud is actually not really the reality. And the reality caught up with them. And they see that the value that they get from Pure Storage actually to run SAP workloads on Pure Storage make way more sense from an economical and also from an agility perspective, right? And we also see that IDC and some other analysts, even SAP themselves, they are actually saying that probably 60%-70% of all SAP workloads will stay on premise. They will not go into a public cloud or cloud deployment. >> Okay, so, Todd. So tell us about, so you're a ERP customer, SAP customer. You decide to move into the cloud. Maybe tell us about that journey. You moved in, and the pendulum swung back. So add some color to. >> Yeah, we were migrating away from our legacy ERP environment and moving to SAP. It was a greenfield opportunity, so we felt like it was the right time to move into the cloud. We looked very heavily at our internal expertise from an applications standpoint as well as an infrastructure standpoint and felt that this would be the right opportunity to move to that infrastructure as a service, application as a service model. 
And then we could take time to take our center of excellence team around SAP and do knowledge transfer between the cloud organization, the managed organization, and use it as a ramp for us to educate ourselves more around SAP. Some of the other driving factors were simply. Why do we want to go to the cloud? The elasticity, the ease of deployment, the things that we firmly believed at the time were the right decision. And we felt like it could be done quicker by moving to the cloud to do that. >> Okay, so you moved to the cloud, and then it wasn't the experience that you thought it would be. It was >> Todd: Correct. >> Axel mentioned a bunch of factors. The agility wasn't there. The cost wasn't there. Maybe add some color to that as well. >> Yeah, absolutely, we felt like, with the growth of our company through acquisitions, that speed of deployment was going to be key in the future. And we quickly learned that that was not necessarily the case. Everything became request-driven, SLA-driven, versus actually worrying about what was happening within our application itself. And so we just became another customer that was submitting tickets, if you will, in that environment. Stability and performance, we saw some real impacts to the environment that were actually end-user-affecting, which really began to force us to look for some different solutions. >> Okay. So, David, you just participated in a study. We call it the True Private Cloud. >> David: Right. >> So what was happening was it was a lot of cloud washing going on. >> Right. >> And with Private Cloud, we said, "Well, you know, essentially what people want is "to be able to substantially mimic "the public cloud on private." So they can get back that control and address some of the problems. >> That's right. >> So maybe pick it up from there and talk a little bit about. 
>> Sure, so yes, this, this is reports that we've done on the amount of spend that'll go to hyper-converged types of products and bring it back in-house and offer the same sort of facilities to the end users as you get from a public cloud but in a private cloud itself. So is that how you've done it? Did you take a package, or how did you go, how did you take your work from the public cloud back into the private cloud? >> So part of that was, we did the initial cost analysis of where we were at. And that was one of the main drivers behind, we really can do this in-house ourselves. That's when we began looking at partners that could help us. It was a perfect time that it had set up within our refresh strategy around our traditional storage and compute environment for us to really look at what the cost factors were. Could we improve the performance and the stability of that environment and improve that service to our end users? And so those are the decisions that we made, right? And then we said, "It's time for us to bring that back in." We can have control. And one of the biggest things, and it was really more than control, it was that we understood our environment. And that was the biggest thing that we saw a challenge with, was trying to convey the importance of what was happening within our deployment of SAP to the managed services provider. >> So what led you to the Pure decision? Like David said, you got some kind of converged infrastructure, whatever, the metaphor for mimicking public cloud. What led you to Pure? And we could talk about what the solution was. >> Yeah, one of the things was just the simplicity of Pure. At first, when we heard the story, we weren't sure we really believed it. We were like, "This is, this is entirely too simple." 
The evergreen model was very intriguing to us at the time, because we had been in that traditional storage and compute environment where, every three years, we had a massive project and do a forklift upgrade with choose any of the providers. And it was, is what we were doing. We were looking to set ourselves up for SAP HANA in the future. We wanted to build an infrastructure that would allow us to get there. And in all of the due diligence that we did, Pure came out on top with that, with a lot of the story around their compression and dedupe capabilities. Performance around IO was just extremely compelling at the time. >> So you got to love this story. >> Absolutely. >> I mean, you hear this a lot from customers? Is this a unique situation maybe? >> Yeah, we see this a lot from customers. Actually by moving SAP workloads, mission-critical workloads, now to Pure Storage. And what really, it's not just about the evergreen and the simplicity, right? What also resonates very well with customers today is our story around the data platform, right? So that's not about storage anymore. It's really about providing a foundation for certain SAP workloads, and you can seamlessly go from, let's say, typical Oracle SAP deployment, and you can start with HANA deployments. Actually, by using our solution, you can actually reducing the cost by up to 75%, right? So these are all compelling reasons, and this all without any configuration changes or any setups that you need specifically for SAP workloads, right? It is so simple that you can run various SAP workloads on the same platform. And to move this, actually, to another angle is, What if in the future you want to do analytics, big data, internet of thing? Again, it's the same platform, it's the same foundation that you can run all these various SAP workloads on. And I think this is a very compelling story. >> And it's interesting for us. It's not just SAP workloads that are running in that environment. >> Oh, really? 
>> We're, it's, it's a mixed environment, so we're running everything else on top of that FlashStack today. >> Dave: Well, you've done a lot of work. >> Axel: Sure, yes. >> Well, I've got one other question I'd like to ask you about landscapes. See, you're a big international set of companies that you are servicing. So from a landscape point of view, did you want to centralize that onto one landscape or multiple landscapes? And I would have thought that's an area as well where using Flash was a great advantage. >> It is centralized today. And then as we grow, we are giving consideration to, Will we have multiple instances across the globe? But today it is centralized and will be so probably for the next 24 months. >> But what you described earlier, Todd, was this horizontal infrastructure layer that could support mixed workloads. But there's got to be some kind of software, something in the middle that supports that as well. Did you have to write something to >> Orchestrate >> To support that >> Was it, yeah, some kind of orchestration or management stack? >> No, today, everything that we're doing today is within the Pure UI or within VMware and UCS Manager today. >> Dave: Okay, well that'll get you pretty far. >> Yeah, yeah. Yeah. >> So where do you, what do you take away from this in terms of where this market's going? You talked about analysts generally say that most SAP workloads are going to stay on prem. I think we would generally agree with that. >> Yes. Yeah. >> It's going to be a long slog before they're ready for the cloud. At least the core, mission-critical stuff, right? Okay, so that says there's real pressure on IT organizations to mimic substantially that public cloud experience. Are we there today? With a lot more work to be done? I'd like both of your inputs on that. >> Right, and that's the beauty of it. We're actually providing it, at Pure, the various flavors of cloud. 
So if customers want to actually go from physical to virtual, we are supporting this, because you can actually run your virtual SAP workloads seamlessly on our storage array. At the same time, if you're already then moving to the next level and you want to have a private cloud environment, right? So we have all the components and capabilities actually built into our product that you can do things like self-service, right? You can have chargeback. You can have all the deployment, right? So all of these features that actually make up a private cloud environment, so we have them in our mix already, right? So we more or less have everything ready for customers today. And if they want to actually go to a hybrid cloud, that's why I'm saying. 30%, maybe, to 40% of SAP workloads might go into a cloud, into a public cloud or a hybrid cloud environment. And we're actually also providing this hybrid cloud capability that you can move workloads seamlessly to an Azure, to an AWS, or to Google Cloud. So we just heard this morning we have this capability to move certain workloads seamlessly from on premise, from on premise Pure, onto AWS, for instance. So we have all the ingredients, so throughout this entire journey that the customer wants to go through, that they can actually move along with this one data platform, and that makes it. >> So, Todd, how do you decide now, knowing what you know, what goes where, what to put in the public cloud, what to put on prem, what's eventually going to be hybrid? >> Well, and we have adopted a strategy of Cloud First, which means, Will the workload or will the application fit in that as-a-service model? Does it necessarily mean that we're going to put everything there? We still believe that most mission-critical, anything around the RP, will most likely remain in-house. 
And one of the main differences that we saw was the availability and uptime that the Pure system gives us over what we could see that the managed services providers could provide. And downtime is really not tolerated, and it's one of those things that we need. And when it's down, we've got to have things back up, and we need the availability to our end users. And as we expand across the globe, we're becoming more of a 7-by, maybe today we're a 6 by 20. We're not a fully 7-by-24 shop yet. But we're getting to that, and so we're looking at the infrastructure that will help us achieve that goal. >> So you're looking at cloud as an operating model more so than a destination. Is that right? >> Todd: That's correct. That's correct. >> And of course, there's the destination aspect of it, which is a function of, what, performance and cost. What do you look at? What are the determinants there? >> Yeah, so performance is obviously key for us. Cost is always an important factor, but it's probably number 3 or 4 on the list, right? Availability, uptime, and performance are key. And if we can get those, we can get the support and the availability that we need, then maybe it makes sense, right? If it's a web application, if it's something that's very straightforward, again, one of the biggest reasons that we go back to bringing it in-house is we truly understood the environment and how things fit together. Whereas in that managed services environment, it was very difficult to do that. >> And what about security? We haven't talked much about security today. But where does that fit in in your cloud decision? >> David: Especially internationally, the different rules in different countries, for example. >> Yeah, internationally, it's a challenge with all of the data privacy laws and the things that are country-specific, and we're learning a lot of that in Latin America as well (David chuckling) as we begin to move into those markets. But security is absolutely top of mind. 
We will work with those cloud services providers, but we've talked to a lot of folks along the AWS and the Azure route. And we're comfortable with where the security around the cloud is going. We're talking to a lot of new cloud security brokers to understand what they can bring to the table as well. And it's not just an IT discussion. It's a legal discussion, right >> Right. >> We're having those legal teams come back to us and say, "Well, what does this mean?" Right? Where is the data going to live? And is it going to fit within our retention models and all of the things that we have in place today? >> Alright, good. Okay, we got to leave it there. But Axel, I'll give you the last word. >> The last word? Pure Accelerate. Give me the bumper sticker. >> So we are really excited to have, actually, a confirmation from a customer side to see that the strategy and the direction that we're going here at Pure is exactly on par with what customers are actually demanding and what they want when it comes to SAP or mission-critical workloads. So I'm really glad that we're hearing this now from a customer and get the confirmation from a customer. So I'm just really super duper excited to have Todd here with us to hear from, directly from a customer. >> Excellent. Alright, Cloud First. The CUBE, we hope you're first, we're first on your playlist. Gentlemen, thank you very much for coming on the CUBE. >> Thank you. Thank you. >> I appreciate it. Alright, keep it right there, buddy. We'll be back with our next guest right after this short break. (upbeat percussion music)

Published Date : Jun 13 2017



Breaking Analysis: Enterprise Technology Predictions 2023


 

(upbeat music beginning) >> From the Cube Studios in Palo Alto and Boston, bringing you data-driven insights from the Cube and ETR, this is "Breaking Analysis" with Dave Vellante. >> Making predictions about the future of enterprise tech is more challenging if you strive to lay down forecasts that are measurable. In other words, if you make a prediction, you should be able to look back a year later and say, with some degree of certainty, whether the prediction came true or not, with evidence to back that up. Hello and welcome to this week's Wikibon Cube Insights, powered by ETR. In this breaking analysis, we aim to do just that, with predictions about the macro IT spending environment, cost optimization, security, lots to talk about there, generative AI, cloud, and of course supercloud, blockchain adoption, data platforms, including commentary on Databricks, Snowflake, and other key players, automation, events, and we may even have some bonus predictions around quantum computing, and perhaps some other areas. To make all this happen, we welcome back, for the third year in a row, my colleague and friend Eric Bradley from ETR. Eric, thanks for all you do for the community, and thanks for being part of this program. Again. >> I wouldn't miss it for the world. I always enjoy this one. Dave, good to see you. >> Yeah, so let me bring up this next slide and show you, actually come back to me if you would. I got to show the audience this. These are the inbounds that we got from PR firms starting in October around predictions. They know we do prediction posts. And so they'll send literally thousands and thousands of predictions from hundreds of experts in the industry, technologists, consultants, et cetera. And if you bring up the slide I can show you sort of the pattern that developed here. 40% of these thousands of predictions were from cyber. You had AI and data. If you combine those, it's still not close to cyber. Cost optimization was a big thing. 
Of course, cloud, some on DevOps, and software. Digital... Digital transformation got, you know, some lip service and SaaS. And then there was other, it's kind of around 2%. So quite remarkable, when you think about the focus on cyber, Eric. >> Yeah, there's two reasons why I think it makes sense, though. One, the cybersecurity companies have a lot of cash, so therefore the PR firms might be working a little bit harder for them than some of their other clients. (laughs) And then secondly, as you know, for multiple years now, when we do our macro survey, we ask, "What's your number one spending priority?" And again, it's security. It just isn't going anywhere. It just stays at the top. So I'm actually not that surprised by that little pie chart there, but I was shocked that SaaS was only 5%. You know, going back 10 years ago, that would've been the only thing anyone was talking about. >> Yeah. So true. All right, let's get into it. First prediction, we always start with kind of tech spending. Number one is tech spending increases between four and 5%. ETR has currently got it at 4.6% coming into 2023. This has been a consistently downward trend all year. We started, you know, much, much higher as we've been reporting. Bottom line is the fed is still in control. They're going to ease up on tightening, is the expectation, they're going to shoot for a soft landing. But you know, my feeling is this slingshot economy is going to continue, and it's going to continue to confound, whether it's supply chains or spending. The, the interesting thing about the ETR data, Eric, and I want you to comment on this, the largest companies are the most aggressive to cut. They're laying off, smaller firms are spending faster. They're actually growing at a much larger, faster rate as are companies in EMEA. And that's a surprise. That's outpacing the US and APAC. Chime in on this, Eric. >> Yeah, I was surprised on all of that. 
First, on the higher-level spending, we are definitely seeing it coming down, but the interesting thing here is headlines are making it worse. A huge research shop recently said 0% growth. We're coming in at 4.6%. And just so everyone knows, this is not us guessing, we asked 1,525 IT decision-makers what their budget growth will be, and they came in at 4.6%. Now there's a huge disparity, as you mentioned. The Fortune 500, global 2000, barely at 2% growth, but small, it's at 7%. So we're at a situation right now where the smaller companies are still playing a little bit of catch-up on digital transformation, and they're spending money. The largest companies that have the most to lose from a recession are being more trepidatious, obviously. So they're playing a "Wait and see." And I hope we don't talk ourselves into a recession. Certainly the headlines and some of these research shops are helping it along. But another interesting comment here is, you know, energy and utilities used to be called an orphan and widow stock group, right? They are spending more than anyone, more than financials, insurance, more than retail consumer. So right now it's being driven by mid, small, and energy and utilities. They're all spending like gangbusters, like nothing's happening. And it's the rest of everyone else that's being very cautious. >> Yeah, so very unpredictable right now. All right, let's go to number two. Cost optimization remains a major theme in 2023. We've been reporting on this. You've, we've shown a chart here. What's the primary method that your organization plans to use? You asked this question of those individuals that cited that they were going to reduce their spend and- >> Mhm. >> consolidating redundant vendors, you know, still leads the way, you know, far behind, cloud optimization is second, but it, but cloud continues to outpace legacy on-prem spending, no doubt. 
Somebody, it was, the guy's name was Alexander Feiglstorfer from Storyblok, sent in a prediction, said "All in one becomes extinct." Now, generally I would say I disagree with that because, you know, as we know over the years, suites tend to win out over, you know, individual, you know, point products. But I think what's going to happen is all in one is going to remain the norm for these larger companies that are cutting back. They want to consolidate redundant vendors, and the smaller companies are going to stick with that best of breed and be more aggressive and try to compete more effectively. What's your take on that? >> Yeah, I'm seeing much more consolidation in vendors, but also consolidation in functionality. We're seeing people building out new functionality, whether it's, we're going to talk about this later, so I don't want to steal too much of our thunder right now, but data and security also, we're seeing a functionality creep. So I think there's further consolidation happening here. I think niche solutions are going to be less likely, and platform solutions are going to be more likely in a spending environment where you want to reduce your vendors. You want to have one bill to pay, not 10. Another thing on this slide, real quick if I can before I move on, is we had a bunch of people write in and some of the answer options that aren't on this graph but did get cited a lot, unfortunately, is the obvious reduction in staff, hiring freezes, and delaying hardware, were three of the top write-ins. And another one was offshore outsourcing. So in addition to what we're seeing here, there were a lot of write-in options, and I just thought it would be important to state that, but essentially the cost optimization is by and far the highest one, and it's growing. So it's actually increased in our citations over the last year. >> And yeah, specifically consolidating redundant vendors. 
And so I actually thank you for bringing that up, 'cause I had asked you, Eric, is there any evidence that repatriation is going on? And we don't see it in the numbers, we don't see it even in the other, there was, I think, very little or no mention of cloud repatriation, even though it might be happening in a smattering. >> Not a single mention, not one single mention. I went through it for you. Yep. Not one write-in. >> All right, let's move on. Number three, security leads M&A in 2023. Now you might say, "Oh, well that's a layup," but let me set this up, Eric, because I didn't really do a great job with the slide. I hid what you've done, because you basically took, this is from the emerging technology survey with 1,181 responses from November. And what we did is we took Palo Alto and looked at the overlap in Palo Alto Networks accounts with these vendors that were showing on this chart. And Eric, I'm going to ask you to explain why we put a circle around OneTrust, but let me just set it up, and then have you comment on the slide and give us more detail. We're seeing private company valuations are off, you know, 10 to 40%. We saw Snyk do a down round, but pretty good actually, only down 12%. We've seen much higher down rounds. Palo Alto Networks we think is going to get busy. Again, they're an inquisitive company, they've been sort of quiet lately, and we think CrowdStrike, Cisco, Microsoft, Zscaler, we're predicting all of those will make some acquisitions, and we're thinking that the targets are somewhere in this mess of security taxonomy. Other thing we're predicting: AI meets cyber big time in 2023, and we're probably going to see some acquisitions of those companies that are leaning into AI. We've seen some of that with Palo Alto. 
And then, you know, your comment to me, Eric, was "The RSA conference is going to be insane, hopping mad, "crazy this April," (Eric laughing) but give us your take on this data, and why the red circle around OneTrust? Take us back to that slide if you would, Alex. >> Sure. There's a few things here. First, let me explain what we're looking at. So because we separate the public companies and the private companies into two separate surveys, this allows us the ability to cross-reference that data. So what we're doing here is in our public survey, the TSIS, everyone who cited some spending with Palo Alto, meaning they're a Palo Alto customer, we then cross-reference that with the private tech companies. Who also are they spending with? So what you're seeing here is an overlap. These companies that we have circled are doing the best in Palo Alto's accounts. Now, Palo Alto went and bought Twistlock a few years ago, which this data slide predicted, to be quite honest. And so I don't know if they necessarily are going to go after Snyk. Snyk, sorry. They already have something in that space. What they do need, however, is more in the authentication space. So I'm looking at OneTrust, with a 45% overlap in their overall net sentiment. That is a company that's already existing in their accounts and could be very synergistic to them. BeyondTrust as well, authentication identity. This is something that Palo needs to do to move more down that zero trust path. Now why did I pick Palo first? Because usually they're very inquisitive. They've been a little quiet lately. Secondly, if you look at the backdrop in the markets, the IPO freeze isn't going to last forever. Sooner or later, the IPO markets are going to open up, and some of these private companies are going to tap into public equity. In the meantime, however, cash funding on the private side is drying up. 
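The cross-referencing Eric describes, finding which private-survey vendors show up inside a public vendor's customer accounts, reduces to a set intersection over respondent IDs. A hedged sketch with invented account IDs (this is an illustration, not ETR's actual data model):

```python
# Sketch of the public/private survey cross-reference: what share of one
# vendor's customer accounts also cite a second vendor? The respondent IDs
# below are invented; this is not ETR's actual data model.
def account_overlap(customers_a: set, customers_b: set) -> float:
    """Fraction of vendor A's accounts that also spend with vendor B."""
    if not customers_a:
        return 0.0
    return len(customers_a & customers_b) / len(customers_a)

# 100 hypothetical Palo Alto accounts from the public survey...
palo_accounts = {f"acct{i}" for i in range(100)}
# ...and 100 hypothetical OneTrust accounts from the private survey.
onetrust_accounts = {f"acct{i}" for i in range(55, 155)}

overlap = account_overlap(palo_accounts, onetrust_accounts)
print(f"Overlap: {overlap:.0%}")  # prints "Overlap: 45%"
```

The made-up IDs are chosen so the result echoes the 45% overlap figure cited for OneTrust above; the real metric comes from actual survey respondents.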
If they need another round, they're not going to get it, and they're certainly not going to get it at the valuations they were getting. So we're seeing valuations maybe come down to where they're a touch more attractive, and Palo knows this isn't going to last forever. Cisco knows that, CrowdStrike, Zscaler, all these companies that are trying to make a push to become that vendor that you're consolidating around, they have a chance now, they have a window where they need to go make some acquisitions. And that's why I believe leading up to RSA, we're going to see some movement. I think it's going to be a really exciting time in security right now. >> Awesome. Thank you. Great explanation. All right, let's go on to the next one. Number four, it relates to security. Let's stay there. Zero trust moves from hype to reality in 2023. Now again, you might say, "Oh yeah, that's a layup." A lot of these inbounds that we got are very, you know, kind of self-serving, but we always try to put some meat on the bone. So first thing we do is we pull out some commentary from, Eric, your roundtable, your insights roundtable. And we have a CISO from a global hospitality firm who says, "For me that's the highest priority." He's talking about zero trust because it's the best ROI, it's the most forward-looking, and it enables a lot of the business transformation activities that we want to do. CISOs tell me that they actually can drive forward transformation projects that have zero trust, because they can accelerate them, because they don't have to go through the hurdle of, you know, making sure that it's secure. Second comment: zero trust closes that last mile where, once you're authenticated, they open up the resource to you in a zero trust way. That's a CISO and managing director of a cyber risk services enterprise. Your thoughts on this? >> I can be here all day, so I'm going to try to be quick on this one. This is not a fluff piece on this one. 
There's a couple of other reasons this is happening. One, the board finally gets it. Zero trust at first was just a marketing hype term. Now the board understands it, and that's why CISOs are able to push through it. And what they finally did was redefine what it means. Zero trust simply means moving away from hardware security, moving towards software-defined security, with authentication as its base. The board finally gets that, and now they understand that this is necessary and it's being moved forward. The other reason it's happening now is hybrid work is here to stay. We weren't really sure at first, large companies were still trying to push people back to the office, and it's going to happen. The pendulum will swing back, but hybrid work's not going anywhere. By basically on our own data, we're seeing that 69% of companies expect remote and hybrid to be permanent, with only 30% permanent in office. Zero trust works for a hybrid environment. So all of that is the reason why this is happening right now. And going back to our previous prediction, this is why we're picking Palo, this is why we're picking Zscaler to make these acquisitions. Palo Alto needs to be better on the authentication side, and so does Zscaler. They're both fantastic on zero trust network access, but they need the authentication software defined aspect, and that's why we think this is going to happen. One last thing, in that CISO round table, I also had somebody say, "Listen, Zscaler is incredible. "They're doing incredibly well pervading the enterprise, "but their pricing's getting a little high," and they actually think Palo Alto is well-suited to start taking some of that share, if Palo can make one move. >> Yeah, Palo Alto's consolidation story is very strong. Here's my question and challenge. Do you and me, so I'm always hardcore about, okay, you've got to have evidence. I want to look back at these things a year from now and say, "Did we get it right? Yes or no?" 
If we got it wrong, we'll tell you we got it wrong. So how are we going to measure this? I'd say a couple things, and you can chime in. One is just the number of vendors talking about it. That's one, but marketing always leads the reality. So the second part of that is we got to get evidence from the buying community. Can you help us with that? >> (laughs) Luckily, that's what I do. I have a data company that asks thousands of IT decision-makers what they're adopting and what they're increasing spend on, as well as what they're decreasing spend on and what they're replacing. So I have snapshots in time over the last 11 years where I can go ahead and compare and contrast whether this adoption is happening or not. So come back to me in 12 months and I'll let you know. >> Now, you know, I will. Okay, let's bring up the next one. Number five, generative AI hits where the Metaverse missed. Of course everybody's talking about ChatGPT, we just wrote last week in a breaking analysis with John Furrier and Sarbjeet Johal our take on that. We think 2023 does mark a pivot point as natural language processing really infiltrates enterprise tech just as Amazon turned the data center into an API. We think going forward, you're going to be interacting with technology through natural language, through English commands or other, you know, foreign language commands, and investors are lining up, all the VCs are getting excited about creating something competitive to ChatGPT, according to (indistinct) a hundred million dollars gets you a seat at the table, gets you into the game. (laughing) That's before you have to start doing promotion. But he thinks that's what it takes to actually create a clone or something equivalent. We've seen stuff from, you know, the head of Facebook's, you know, AI saying, "Oh, it's really not that sophisticated, ChatGPT, "it's kind of like IBM Watson, it's great engineering, "but you know, we've got more advanced technology." 
We know Google's working on some really interesting stuff. But here's the thing. ETR just launched the February survey. It's in the field now. We circled OpenAI in this category. They weren't even in the survey, Eric, last quarter. So 52% of the ETR survey respondents indicated a positive sentiment toward OpenAI. I added up all the sort of different bars, we could double click on that. And then I got this inbound from Scott Stevenson of Deepgram. He said "AI is recession-proof." I don't know if that's the case, but it's a good quote. So bring this back up and take us through this. Explain this chart for us, if you would. >> First of all, I like Scott's quote better than the Facebook one. I think that's some sour grapes. Meta just spent an insane amount of money on the Metaverse and that's a dud. Microsoft just spent money on OpenAI and it is hot, undoubtedly hot. We've only been in the field with our current ETS survey for a week. So my caveat is it's preliminary data, but I don't care if it's preliminary data. (laughing) We're getting a sneak peek here at what is the number one net sentiment and mindshare leader in the entire machine-learning AI sector within a week. It's beating Data- >> 600. 600 in. >> It's beating Databricks. And we all know Databricks is a huge established enterprise company, not only in machine-learning AI, but it's in the top 10 in the entire survey. We have over 400 vendors in this survey. It's number eight overall, already. In a week. This is not hype. This is real. And I could go on the NLP stuff for a while. Not only are we seeing it here in OpenAI and machine-learning AI, but we're seeing NLP in security. It's huge in email security. It's completely transforming that area. It's one of the reasons I thought Palo might take Abnormal out. They're doing such a great job with NLP on the email side, and also in the data prep tools. NLP is going to take out data prep tools. If we have time, I'll discuss that later. 
But yeah, this is, to me this is a no-brainer, and we're already seeing it in the data. >> Yeah, John Furrier called, you know, the ChatGPT introduction. He said it reminded him of the Netscape moment, when we all first saw Netscape Navigator and went, "Wow, it really could be transformative." All right, number six, the cloud expands to supercloud as edge computing accelerates and CloudFlare is a big winner in 2023. We've reported obviously on cloud, multi-cloud, supercloud and CloudFlare, basically saying what multi-cloud should have been. We pulled this quote from Atif Khan, who is the founder and CTO of Alkira, thanks, one of the inbounds, thank you. "In 2023, highly distributed IT environments "will become more the norm "as organizations increasingly deploy hybrid cloud, "multi-cloud and edge settings..." Eric, from one of your round tables, "If my sources from edge computing are coming "from the cloud, that means I have my workloads "running in the cloud. "There is no one better than CloudFlare," That's a senior director of IT architecture at a huge financial firm. And then your analysis shows CloudFlare really growing in pervasion, that sort of market presence in the dataset, dramatically, to near 20%, leading, I think you had told me that they're even ahead of Google Cloud in terms of momentum right now. >> That was probably the biggest shock to me in our January 2023 TSIS, which covers the public companies in the cloud computing sector. CloudFlare has now overtaken GCP in overall spending, and I was shocked by that. It's already extremely pervasive in networking, of course, for the edge networking side, and also in security. 
This is the number one leader in SASE, web application firewall, DDoS, bot protection, by your definition of supercloud, which we just did a couple of weeks ago, and I really enjoyed that by the way Dave, I think CloudFlare is the one that fits your definition best, because it's bringing all of these aspects together, and most importantly, it's cloud agnostic. It does not need to rely on Azure or AWS to do this. It has its own cloud. So I just think it's, when we look at your definition of supercloud, CloudFlare is the poster child. >> You know, what's interesting about that too, is a lot of people are poo-pooing CloudFlare, "Ah, it's, you know, really kind of not that sophisticated." "You don't have as many tools," but to your point, you can have those tools in the cloud, Cloudflare's doing serverless on steroids, trying to keep things really simple, doing a phenomenal job at, you know, various locations around the world. And they're definitely one to watch. Somebody put them on my radar (laughing) a while ago and said, "Dave, you got to do a breaking analysis on CloudFlare." And so I want to thank that person. I can't really name them, 'cause they work inside of a giant hyperscaler. But- (Eric laughing) (Dave chuckling) >> Real quickly, if I can from a competitive perspective too, who else is there? They've already taken share from Akamai, and Fastly is really their only other direct comp, and they're not there. And these guys are in pole position and they're the only game in town right now. I just, I don't see it slowing down. >> I thought one of your comments from your roundtable I was reading, one of the folks said, you know, CloudFlare, if my workloads are in the cloud, they are, you know, dominant, they said not as strong with on-prem. And so Akamai is doing better there. I'm like, "Okay, where would you want to be?" (laughing) >> Yeah, which one of those two would you rather be? >> Right? Anyway, all right, let's move on. 
Number seven, blockchain continues to look for a home in the enterprise, but devs will slowly begin to adopt in 2023. You know, blockchains have got a lot of buzz, obviously crypto is, you know, the killer app for blockchain. Senior IT architect in financial services from your, one of your insight roundtables said quote, "For enterprises to adopt a new technology, "there have to be proven turnkey solutions. "My experience in talking with my peers are, "blockchain is still an open-source component "where you have to build around it." Now I want to thank Ravi Mayuram, who's the CTO of Couchbase sent in, you know, one of the predictions, he said, "DevOps will adopt blockchain, specifically Ethereum." And he referenced actually in his email to me, Solidity, which is the programming language for Ethereum, "will be in every DevOps pro's playbook, "mirroring the boom in machine-learning. "Newer programming languages like Solidity "will enter the toolkits of devs." His point there, you know, Solidity for those of you don't know, you know, Bitcoin is not programmable. Solidity, you know, came out and that was their whole shtick, and they've been improving that, and so forth. But it, Eric, it's true, it really hasn't found its home despite, you know, the potential for smart contracts. IBM's pushing it, VMware has had announcements, and others, really hasn't found its way in the enterprise yet. >> Yeah, and I got to be honest, I don't think it's going to, either. So when we did our top trends series, this was basically chosen as an anti-prediction, I would guess, that it just continues to not gain hold. And the reason why was that first comment, right? It's very much a niche solution that requires a ton of custom work around it. You can't just plug and play it. And at the end of the day, let's be very real what this technology is, it's a database ledger, and we already have database ledgers in the enterprise. So why is this a priority to move to a different database ledger? 
It's going to be very niche cases. I like the CTO comment from Couchbase about it being adopted by DevOps. I agree with that, but it has to be a DevOps in a very specific use case, and a very sophisticated use case in financial services, most likely. And that's not across the entire enterprise. So I just think it's still going to struggle to get its foothold for a little bit longer, if ever. >> Great, thanks. Okay, let's move on. Number eight: AWS, Databricks, Google, and Snowflake lead the data charge, with Microsoft keeping it simple. So let's unpack this a little bit. This is the shared accounts peer position for, I pulled data platforms in for analytics, machine-learning and AI and database. So I could grab all these accounts or these vendors and see how they compare in those three sectors. Analytics, machine-learning and database. Snowflake and Databricks, you know, they're on a crash course, as you and I have talked about. They're battling to be the single source of truth in analytics. There's going to be a big focus, it's already started, and it's going to be accelerated in 2023, on open formats. Iceberg, Python, you know, they're all the rage. We heard about Iceberg at Snowflake Summit, last summer or last June. Not a lot of people had heard of it, but of course the Databricks crowd, who knows it well. A lot of other open source tooling. There's a company called dbt Labs, which you're going to talk about in a minute. George Gilbert put them on our radar. We just had Tristan Handy, the CEO of dbt Labs, on at Supercloud last week. They are a new disruptor in data; they're essentially API-ifying, if you will, KPIs inside the data warehouse and dramatically simplifying that whole data pipeline. So really, you know, the ETL guys should be shaking in their boots with them. Coming back to the slide. Google really remains focused on BigQuery adoption. 
Customers have complained to me that they would like to use Snowflake with Google's AI tools, but they're being forced to go to BigQuery. I got to ask Google about that. AWS continues to stitch together its bespoke data stores, that's gone down that "Right tool for the right job" path. David Floyer two years ago said, "AWS absolutely is going to have to solve that problem." We saw them start to do it at Reinvent, bringing zero ETL to Aurora and Redshift, and really trying to simplify those worlds. There's going to be more of that. And then Microsoft, they're just making it cheap and easy to use their stuff, you know, despite some of the complaints that we hear in the community, you know, about things like Cosmos, but Eric, your take? >> Yeah, my concern here is that Snowflake and Databricks are fighting each other, and it's allowing AWS and Microsoft to kind of catch up against them, and I don't know if that's the right move for either of those two companies individually, Azure and AWS are building out functionality. Are they as good? No they're not. The other thing to remember too is that AWS and Azure get paid anyway, because both Databricks and Snowflake run on top of 'em. So (laughing) they're basically collecting their toll, while these two fight it out with each other, and they build out functionality. I think they need to stop focusing on each other, a little bit, and think about the overall strategy. Now for Databricks, we know they came out first as a machine-learning AI tool. They were known better for that spot, and now they're really trying to play catch-up on that data storage compute spot, and inversely for Snowflake, they were killing it with the compute separation from storage, and now they're trying to get into the MLAI spot. I actually wouldn't be surprised to see them make some sort of acquisition. Frank Slootman has been a little bit quiet, in my opinion there. The other thing to mention is your comment about dbt Labs. 
If we look at our emerging technology survey, last survey when this came out, dbt Labs, number one leader in that data integration space, I'm going to just pull it up real quickly. It looks like they had a 33% overall net sentiment to lead data analytics integration. So they are clearly growing, it's the fourth straight survey consecutively that they've grown. The other name we're seeing there a little bit is Cribl, but dbt Labs is by far the number one player in this space. >> All right. Okay, cool. Moving on, let's go to number nine: automation makes a resurgence in 2023. We're showing, again, data. The x axis is overlap or presence in the dataset, and the vertical axis is shared net score. Net score is a measure of spending momentum. As always, you've seen UiPath and Microsoft Power Automate up and to the right, and that red line, that 40% line, is generally considered elevated. UiPath is really separating, creating some distance from Automation Anywhere, they, you know, previous quarters they were much closer. Microsoft Power Automate came on the scene in a big way, they loom large with this "Good enough" approach. I will say this, I, somebody sent me the results of an (indistinct) survey, which showed UiPath actually had more mentions than Power Automate, which was surprising, but I think that's not been the case in the ETR data set. We're definitely seeing a shift from back office to front office kind of workloads. Having said that, software testing is emerging as a mainstream use case, we're seeing ML and AI become embedded in end-to-end automations, and low-code is serving the line of business. And so this, we think, is going to increasingly have appeal to organizations in the coming year, who want to automate as much as possible and not necessarily, we've seen a lot of layoffs in tech, and people... You're going to have to fill the gaps with automation. That's a trend that's going to continue. >> Yep, agreed. 
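Net score, the vertical axis Dave describes, is ETR's spending-momentum metric: broadly, the share of a vendor's citations that are adoptions or spend increases minus the share that are decreases or replacements. A simplified sketch with invented response counts (an approximation of the commonly described methodology, not necessarily ETR's exact formula):

```python
from collections import Counter

def net_score(responses):
    """Simplified net score: percentage of respondents adopting or increasing
    spend minus the percentage decreasing or replacing. An approximation of
    the methodology as commonly described; the exact formula may differ."""
    counts = Counter(responses)
    n = sum(counts.values())
    positive = counts["adopt"] + counts["increase"]
    negative = counts["decrease"] + counts["replace"]
    return 100.0 * (positive - negative) / n

# Invented example: 100 respondents citing one vendor.
responses = (["adopt"] * 20 + ["increase"] * 45 + ["flat"] * 25
             + ["decrease"] * 7 + ["replace"] * 3)
print(f"Net score: {net_score(responses):.0f}%")  # 20 + 45 - 7 - 3 = 55
```

Flat spend counts toward the denominator but neither adds nor subtracts, which is why a vendor with many "flat" citations can have high presence on the x axis but a modest net score on the y axis.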
First, that comment about Microsoft Power Automate having fewer citations than UiPath, that's shocking to me. I'm looking at my chart right here, where Microsoft Power Automate was cited by over 60% of our entire survey takers, and UiPath at around 38%. Now don't get me wrong, 38% pervasion's fantastic, but you know you're not going to beat an entrenched Microsoft. So I don't really know where that comment came from. So UiPath, looking at it alone, it's doing incredibly well. It had a huge rebound in its net score this last survey. It had dropped going through the back half of 2022, but we saw a big spike in the last one. So it's got a net score of over 55%. A lot of people citing adoption and increasing. So that's really what you want to see for a name like this. The problem is just that Microsoft is doing its playbook. At the end of the day, I'm going to do a POC, why am I going to pay more for UiPath, or even take on another separate bill, when we know everyone's consolidating vendors, if my license already includes Microsoft Power Automate? It might not be perfect, it might not be as good, but what I'm hearing all the time is it's good enough, and I really don't want another invoice. >> Right. So how does UiPath, you know, and Automation Anywhere, how do they compete with that? Well, the way they compete with it is they got to have a better product. They got a product that's 10 times better. You know, they- >> Right. >> they're not going to compete based on being the lowest cost, Microsoft's got that locked up, or on being the easiest to use, you know, Microsoft basically gives it away for free, and that's their playbook. So that's, you know, up to UiPath. UiPath brought on Rob Ensslin, I've interviewed him. Very, very capable individual, is now Co-CEO. So he's kind of bringing that adult supervision in, and really tightening up the go to market. 
So, you know, we know this company has been a rocket ship, and so getting some control on that and really getting focused like a laser, you know, could be good things ahead there for that company. Okay. >> One of the problems, if I could real quick Dave, is what the use cases are. When we first came out with RPA, everyone was super excited about like, "No, UiPath is going to be great for super powerful "projects, use cases." That's not what RPA is being used for. As you mentioned, it's being used for mundane tasks, so it's not automating complex things, which I think UiPath was built for. So if you were going to get UiPath, and choose that over Microsoft, it's going to be 'cause you're doing it for more powerful use case, where it is better. But the problem is that's not where the enterprise is using it. The enterprise are using this for base rote tasks, and simply, Microsoft Power Automate can do that. >> Yeah, it's interesting. I've had people on theCube that are both Microsoft Power Automate customers and UiPath customers, and I've asked them, "Well you know, "how do you differentiate between the two?" And they've said to me, "Look, our users and personal productivity users, "they like Power Automate, "they can use it themselves, and you know, "it doesn't take a lot of, you know, support on our end." The flip side is you could do that with UiPath, but like you said, there's more of a focus now on end-to-end enterprise automation and building out those capabilities. So it's increasingly a value play, and that's going to be obviously the challenge going forward. Okay, my last one, and then I think you've got some bonus ones. Number 10, hybrid events are the new category. Look it, if I can get a thousand inbounds that are largely self-serving, I can do my own here, 'cause we're in the events business. (Eric chuckling) Here's the prediction though, and this is a trend we're seeing, the number of physical events is going to dramatically increase. 
That might surprise people, but most of the big giant events are going to get smaller. The exception is AWS with Reinvent, I think Snowflake's going to continue to grow. So there are examples of physical events that are growing, but generally, most of the big ones are getting smaller, and there's going to be many more smaller intimate regional events and road shows. These micro-events, they're going to be stitched together. Digital is becoming a first class citizen, so people really got to get their digital acts together, and brands are prioritizing earned media, and they're beginning to build their own news networks, going direct to their customers. And so that's a trend we see, and I, you know, we're right in the middle of it, Eric, so you know we're going to, you mentioned RSA, I think that's perhaps going to be one of those crazy ones that continues to grow. It's shrunk, and then it, you know, 'cause last year- >> Yeah, it did shrink. >> right, it was the last one before the pandemic, and then they sort of made another run at it last year. It was smaller but it was very vibrant, and I think this year's going to be huge. Mobile World Congress is another one, we're going to be there end of Feb. That's obviously a big, big show, but in general, the brands and the technology vendors, even Oracle is going to scale down. I don't know about Salesforce. We'll see. You had a couple of bonus predictions. Quantum and maybe some others? Bring us home. >> Yeah, sure. I got a few more. I think we touched upon one, but I definitely think the data prep tools are facing extinction, unfortunately, you know, Talend, Informatica, some of those names. The problem there is that the BI tools are kind of including data prep into it already. You know, an example of that is Tableau Prep Builder, and then in addition, advanced NLP is being worked in as well. 
ThoughtSpot, Tellius, both often say that as their selling point, Tableau has Ask Data, Qlik has Insight Bot, so you don't have to really be intelligent on data prep anymore. A regular business user can just self-query, using either the search bar, or even just speaking into what it needs, and these tools are kind of doing the data prep for it. I don't think that's a, you know, an out in left field type of prediction, but the time is nigh. The other one I would also state is that I think knowledge graphs are going to break through this year. Neo4j in our survey is growing in pervasion and mindshare. So more and more people are citing it, AWS Neptune's getting its act together, and we're seeing that spending intentions are growing there. TigerGraph is also growing in our survey sample. I just think that the time is now for knowledge graphs to break through, and if I had to do one more, I'd say real-time streaming analytics moves from the very, very rich big enterprises downstream, to more people actually moving towards real-time streaming, again, because the data prep tools and the data pipelines have gotten easier to use, and I think the ROI on real-time streaming is obviously there. So those are three that didn't make the cut, but I thought they deserved an honorable mention. >> Yeah, I'm glad you did. Several weeks ago, we did an analyst prediction roundtable, if you will, a cube session power panel with a number of data analysts, and that, you know, streaming, real-time streaming was top of mind. So glad you brought that up. Eric, as always, thank you very much. I appreciate the time you put in beforehand. I know it's been crazy, because you guys are wrapping up, you know, the last quarter survey in- >> Been a nuts three weeks for us. (laughing) >> job. I love the fact that you're doing, you know, the ETS survey now, I think it's quarterly now, right? Is that right? >> Yep. >> Yep. So that's phenomenal. >> Four times a year. 
I'll be happy to jump on with you when we get that done. I know you were really impressed with that last time. >> It's unbelievable. There's so much data at ETR. Okay. Hey, that's a wrap. Thanks again. >> Take care Dave. Good seeing you. >> All right, many thanks to our team here. Alex Myerson is on production, and he manages the podcast for us. Ken Schiffman as well is a critical component of our East Coast studio. Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief. He's at siliconangle.com. He does just great editing for us. Thank you all. Remember, all these episodes are available as podcasts, wherever you listen. The podcast is doing great. Just search "Breaking analysis podcast." Really appreciate you guys listening. I publish each week on wikibon.com and siliconangle.com, or you can email me directly if you want to get in touch, david.vellante@siliconangle.com. That's how I got all these. I really appreciate it. I went through every single one with a yellow highlighter. It took some time, (laughing) but I appreciate it. You could DM me at dvellante, or comment on our LinkedIn post, and please check out etr.ai. Its data is amazing. Best survey data in the enterprise tech business. This is Dave Vellante for theCube Insights, powered by ETR. Thanks for watching, and we'll see you next time on "Breaking Analysis." (upbeat music beginning) (upbeat music ending)

Published Date : Jan 29 2023

Lea Purcell, Foursquare | AWS Marketplace Seller Conference 2022


 

>>Welcome back everyone to theCube's coverage here in Seattle, Washington for AWS's marketplace seller conference. The big news here is that the Amazon partner network and marketplace are coming together and reorganizing into one organization, the AWS partner organization, APO, bringing together the best of the partnership and the marketplace to sell through. It's a seller's conference. This is the second year, but technically with COVID, I call it a year and a half. This is theCube. I'm John Furrier, your host. Got a great guest, Lea Purcell, vice president of business development at Foursquare. Lea, thanks for coming on theCube. You look great. >>Hey, thanks. Thanks for having me here. >>So Foursquare, everyone that has internet history knows you. You'd check in, you'd become the mayor of a place, right? Back in the day, all fun. It was a great app, and I think its competitor Gowalla sold to Facebook, but that was the beginning of location data. Now you got Uber apps, you got all apps, location everywhere. Data is big here in the marketplace. They sell data, they got a data exchange. Chris, head of marketplace, is like, we have all these things, we're gonna bring 'em together, make it simpler. So you're on the data side. I'm assuming you're selling data and you're participating in the data exchange. What is Foursquare doing right now? >>Yeah, exactly. So we are part of the data exchange. And you mentioned checking in. So we are really proud of our roots, the Foursquare app, and that's kind of the basis still of our business. We have a hundred million data points, which are actually places of interest across the world, 200 countries. And we are in the business of understanding where places are and how people move through those places over time. >>What's the value proposition for that data? You're selling the data. >>We are selling the data, and we're selling it. You can think about use cases.
Like, how can I improve the engagement with my app through location data? So for example, Nextdoor is a customer of ours; everyone knows Nextdoor. When a new business comes online, they wanna make sure that business is a real business. So they use our Places data to ensure that the address of that business is accurate. >>So how did you, how do you guys get your data? Because if you only have the first party app, you probably don't have critical mass of data. But then do other people use your data and then recontribute back in, kinda like what Stripe is for financial? You guys are plugging in to >>Apps. A great question. So we still do have our consumer apps. We're still proud of those. It's still a basis of our company, really. But we take that data, our first party data, and also, from across the web, we have some partners who integrate our SDK. And so we're pulling in all that data from various sources and then scrubbing it and making sure we have the most unique, accurate records. >>So you guys still have a business where the app's working. Yep. Okay. But also, let's just say I wanna have a Cube app, and I want to do a check-in button. Yep. So rather than build check-in, could I OEM Foursquare? Is that something you >>Could, and we could help you understand where people are checking in. So we know someone's here at the Hilton in Bellevue; we know exactly where that place is. You're building the Cube app. You could say, I'm gonna check in here, and we've verified it. We know that that's the >>Right place. So that's good for a developer if they're building an app. >>Absolutely. So we have an SDK that any developer can integrate. >>Great. Okay. So what's the relationship with the marketplace? Take us through how Foursquare works with AWS marketplace. >>Sure.
So we are primarily integrated with ADX, which is the piece of marketplace for data specifically. We have both of our main products there: Places, the POI database, and Visits, which is how people move through those places over time. So we're able to say these are the top chains in the country, and here's how people move throughout those. And both those products are listed on ADX. >>So if I'm in Palo Alto and I go to Joe & The Juice, yeah, you know that I kind of hang in one spot, or is there privacy there? I mean, how do you know, like, what goes on? >>Well, we know somebody does that. We don't >>Know that you do that. So >>We ensure, you know, we're very privacy centric and privacy focused. We don't tell anybody that it's you. >>It's pattern data. >>It is. >>Okay. So it's normalized data, right? Over time, groups of people. >>How are people using the data to improve processes, user experience? What are some of the use cases? >>So that example, Nextdoor, that's really a use case that we see a lot, and that's improving their application, that Nextdoor app, to ensure that the data's accurate and that as you, as a user, know that that business is real, 'cause it's verified by Foursquare. Another one is you can use our data to make business decisions around where you're gonna place your next location, you know, your next QSR. So Yum Brands is a customer of ours; those guys are Pizza Hut, KFC. They work with us to figure out where they should put their next KFC. >>I mean, retail: location, location, location. >>Right. Yeah. People are still, even though e-commerce, right, people still go into stores. >>And still are. Yeah. There's probably a lot of math involved in knowing demographics, patterns, volume. >>Yeah. Some of our key customers are really data scientists. Like, think about customers, businesses that have true data science teams. They're really looking at that. >>Yeah.
I mean, In-N-Out's on the exit for a reason. Right. They want In-N-Out. Yeah. So they wanna put it inland. >>Right. And we can actually tell you, for that customer from In-N-Out, where they go next. Right. So then, you know, oh, they go to this park or they go somewhere, and we can help you place your next In-N-Out based on that visitation. >>Yeah. And so it's real science involved. So take us through the customers. You said data scientists. >>Mostly data scientists is kind of a key customer, data science at a large corporation, like a QSR. That's >>Somebody. Okay. So how is the procurement process on the marketplace? What does the buyer get? >>So where we see the real value is, because they're already a customer of Amazon, that procurement is really easy, right? All the fulfillment goes through Amazon, through ADX. And what you're buying is either an API, and that API can make real-time calls, or you're buying a flat file, like an actual database of those hundred million points of interest. >>And then they integrate it into their tool set. Right. They can do it. So it's pretty data friendly in terms of format. >>You can kind of do whatever you want with it. We're gonna give you that, as long as you're smart enough to figure out what to do. >>Do we have a lot of, so what's your experience with AWS marketplace? I mean, obviously we see a lot of changes. They had a reorg, the partner network merging with marketplace. You've been more on the data exchange; Chris kind of called that out. It's kind of a new thing. And he was hinting at a lot of confusion, but simplifying things. What's your take on the current AWS marketplace >>Reorg? I actually think ADX, because our experience has primarily been ADX, I think they've done a really good job. They've really focused on the data and they understand how, you know, people like us sell our data. It hasn't been super confusing. We've had a lot of support. I think that's what Amazon gives you.
You have to put a lot of effort into it, but they also give you a lot of support. >>Yeah. And I think data exchange is pretty significant to the strategic >>Mission. It is. We feel that. You know, we feel like they really value us as a partner. >>What's the big thing you're seeing out there right now in data? Because you're seeing a lot more data exchanges going on. There's always been data exchange, but you're seeing a lot more exchanges between companies. So let's just take partners: you're seeing a lot more people handle the front end of a supply chain, and you've got more data exchanges. What's the future of data exchanges, if you had to, you know, guess, given your history in the industry? What's the next around-the-corner trend? >>I think, well, I think there has to be consolidation. I know everyone's building one, but there's probably too many. I know from our experience, we can't support all of them. We're not a huge company. We can't support Amazon and X and Y and Z; it's just too many. So we kind of put all of our eggs in a couple baskets. So I think there'll be consolidation. I think there also has to be some innovation on what data products are. You know, for us, we have these two: it's an API and a flat file. I think as exchanges think about expanding, what are the other types of data products that they can help us build?
You know, I think we haven't quite figured out how we wanna allocate resources against it, but you know, it's definitely a really interesting space to be in. Like, I don't think data's going anywhere, and I think it's really just gonna grow, and how people use it is >>Gonna expand. Okay. So if I'm a customer, I go to the marketplace, I wanna buy Foursquare data. What's the pitch? >>We can help you improve your business decisions or your applications through location data. We know where places are and how people move through the world over time. So we can tell you, we're sure that this is the Hilton in Bellevue, we know how many people are moving through here, and that's really the pitch. >>And they use that for whatever their needs are: business improvement, user experience. >>Those are really the primary ones. I mean, we also have some financial use cases. So hedge funds, maybe they're thinking about how they wanna invest their money. They're gonna look at visits over time to understand what people are doing. Right. The pandemic made that super important. >>Yeah. That's awesome. Well, this is great. Great success story. Congratulations. And thanks for sharing on theCube. Really appreciate you coming on. >>Thank you. >>My final question is more about kind of the future. I wanna get your thoughts, because you're a seasoned pro, on the confluence of physical and digital coming together. You know, I was just talking with a friend about FedEx's earnings, comparing that to, say, AWS; Amazon has a fleet of delivery too, right? So, in the physical world of products, location matters. But then what about the person when they're walking around the real world? What happens when they get to the metaverses, or, you know, they get to digital, they attend an event? How do you see that crossroad? 'Cuz you have a foot in both camps. We do: you got the app, and you got the physical world; it's gonna come together.
Are there thoughts around that? You can take your Foursquare hat off and put your industry hat on. You wanna answer that? Not officially on behalf of Foursquare, but I'm just curious; this is the confluence, like, the blending of physical and digital. >>Yeah. I know. Wow. I admittedly haven't thought a whole lot about that. I think it would be really weird if I could track myself over time in the metaverse. I mean, I think, yeah, as you said, it's >>It's, by the way, I'm not bullish on the metaverse when it's block diagrams, when you have gaming platforms that are, like, the best visual experience possible, right? >>Yeah. I mean, I think we'll see. I don't know that I have a >>Prediction. Well, hybrid: we've been seeing a lot of hybrid events. Like, this event is still intimate, VIP, but next year I guarantee it's gonna be larger, much larger, and it's gonna be physical and face to face, but digital as well. Now people are experiencing both: that first party, physical, digital hybrid. And it's interesting; it's something that we track a lot >>Of. Yeah, for sure. I think there's something there for us. I think there's a play there as we watch kind >>Of things change. All right, Lea, thank you for coming on theCube, appreciate it so much. All right, John Furrier here, checking in with Foursquare here on theCube at the Amazon Web Services marketplace seller conference. Second year back from the pandemic in person. More coverage after this break.
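As an illustration of the flat-file use case discussed in this interview (a developer verifying a user-submitted business against a downloaded POI file, as in the Nextdoor example), here is a minimal Python sketch. The record fields, sample data, and matching logic are illustrative assumptions, not Foursquare's actual schema, Places API, or ADX product format.

```python
# Hypothetical sketch: verifying a submitted business against a flat file
# of places-of-interest (POI) records. Field names ("name", "address") and
# the loose string-matching rule are assumptions for illustration only.

def normalize(s):
    """Lowercase and drop punctuation so small formatting differences match."""
    return "".join(ch for ch in s.lower() if ch.isalnum() or ch == " ").strip()

def verify_business(name, address, poi_records):
    """Return True if any POI record matches the submitted name and address."""
    n, a = normalize(name), normalize(address)
    return any(
        normalize(rec["name"]) == n and normalize(rec["address"]) == a
        for rec in poi_records
    )

# Stand-in for a flat file of verified places (sample data, not real records).
poi_file = [
    {"name": "Joe & The Juice", "address": "180 El Camino Real, Palo Alto"},
    {"name": "Hilton Bellevue", "address": "300 112th Ave SE, Bellevue"},
]

print(verify_business("Joe & the Juice", "180 El Camino Real, Palo Alto", poi_file))  # True
print(verify_business("Fake Biz", "1 Nowhere St", poi_file))  # False
```

In practice a buyer would load the actual flat file delivered through ADX, or call the real-time API instead; this sketch only shows the shape of an address-verification check.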

Published Date : Sep 21 2022


Jim Richberg & Kenny Holmes, Fortinet | AWS re:Invent 2020 Public Sector Day


 

>> Narrator: From around the globe, it's theCube. With digital coverage of AWS re:Invent 2020. Special coverage sponsored by AWS worldwide public sector. >> Hello and welcome to theCube virtual and our coverage of AWS re:Invent 2020 with special coverage of public sector. We are theCube virtual, and I'm your host, Justin Warren, and today I'm joined by two people. We have Jim Richberg, the CISO for Public Sector from Fortinet, who comes to us from Washington DC. Jim, welcome. >> Thank you. Thank you, Justin. >> And we also have Kenny Holmes, who's the head of worldwide Public Sector Go-to-market from Fortinet as well, and he comes to us from Chicago, Illinois. Kenny, thanks. >> Yes, thank you. Thank you, Justin. >> Gentlemen, welcome to theCube. Now, this year has been pretty dramatic for a lot of us, as I'm sure you're very well aware, and it's been a bit of an accelerator for people's interest in public cloud, in particular for the public sector. So what have you seen? Kenny... sorry, Jim, we'll start with you, around the federal government's interest in cloud. What have you noticed in their adoption of public cloud and AWS this year? >> So, we used to joke in the federal government in my 34 years: they'll never let a good crisis go to waste. You can make an upside out of any situation. And as you noted, Justin, this has been a dramatic accelerator to the federal government's adoption of cloud. Three quarters of the agencies were already moving in the direction of the cloud and planning to spend roughly $8 billion on it this year, and that was pre-COVID. And the pace certainly picked up. We had the guidance that came out of DHS, the interim guidance, that facilitated the ability, as of mid-March, to let remote teleworkers connect directly to the cloud without having to connect back through their agency infrastructure. So they issued very quick guidance to say, look, you've got to get the job done. You've got to get it done in the cloud.
So they did that as a way to accelerate it in the short term. And then they put out the guidance later this year for trusted internet connection access, which had a use case that was built around, again, facilitating the ability to say you can connect directly to the cloud with security in that direct-line stack. You no longer have to haul your data back to the enterprise edge, to the data center on-premise, to then go straight out to the cloud. So the federal government said, we will give you the ability to move in the direction of cloud, and the agencies have been using this at scale. That's why roughly half of the federal workforce is now working from home, and many of them are using cloud-based applications and services. So, a dramatic impact on the federal government. >> Yeah, we've seen it here in my home of Australia; the federal government is very keen on that. But there's other levels of government, as I'm sure we're all aware, particularly state and even local governments. So Kenny, maybe you could give us a bit of a flavor for local and more regional government: have they been doing it basically the same as federal government, or is there something unique to the way that they've had to adapt? >> Well, state and local governments are certainly facing really the perfect storm of rising demand and declining resources. The pandemic has certainly driven a lower tax base and lower revenues, and as a result of that, we've seen adjustments in budgets, et cetera. But we're also uniquely in a position where it's also driving digital innovation at the same time. So we're seeing the two of those, and they're not necessarily kind of diametrically opposed, if you think about it. So the two of those are coming together: they're doing more with less, and they're using digital transformation to get there, where in the commercial world a lot of folks have been doing digital transformation for a long time.
Now, government is being forced into doing it, and they're really embracing that, from our perspective. So we've traditionally seen security be at the top of their demands from a CIO perspective and their most important initiatives. Now we're seeing digital transformation, and more specifically we're seeing cloud, right, be a key part of that. So they've done things initially, obviously moving email and some of those things, but today we're seeing an increasing amount of workloads that they're moving, maybe from a previous provider, over to AWS, et cetera. So those are some of the things that we're seeing from our state and local perspective. >> To build on Kenny's point, I think the key differentiator, Justin, between the federal and the state and local experience has been the resources. The federal government, with COVID, runs a deficit. We've seen the deficit balloon; federal spending is up 17 to 20%, not what was passed out in the stimulus money, but simply what government is spending at the federal level. So we are using cloud at the federal level to do more, as Kenny noted. State governments and local governments, because they're funded exclusively by taxes, can't run a deficit. They have had to say, we need to spend smarter because we can't spend more. We can't even spend as much, and oh my goodness, we have to deliver more digital services at the same time. So for them it has been a matter of having to eke greater efficiencies out of every dollar, which has pointed them in the direction of AWS and the cloud in a different sense. In the federal government, that said, there's greater efficiencies because we need our remote telework people to get the job done; in state government, it's the perfect storm. And if they don't do this, they're literally going to have to curtail vital services.
I mean, there's some variation in exactly what sort of threat you might have as a federal government compared to local, but broadly speaking, the malware and ransomware and things of that nature are pretty much just a miasma that we have to wade through. So how is Fortinet helping these customers, particularly as, you mentioned, they're moving a lot of things into AWS? What is Fortinet's role there in helping customers make better use of public cloud? >> So I think one of the things that Fortinet really has brought to this equation is they really are a very broad-based cybersecurity provider. The biggest problem that organizations typically have in the cloud, of course, is misconfiguration by the customer. It's not AWS that's making the mistake; 99% plus of the time it's misconfiguration by the customer. So having the ability to say, if you know how to do your security in an on-premise environment, and you've got controls, capabilities, and settings that you're comfortable with, you can migrate those intact, if they work for you, into your cloud environment. So the fact that we are soup to nuts, that we have things at the edge and offer that same suite of capabilities in AWS, allows us to help users that have configured things right not have to go back and start from scratch and say, well, now that I'm in AWS, I need to reconfigure, beyond what you have to do because it's a different platform. If you've got the policies in place that are managing security and managing risk well for your enterprise, carry them forward to a different environment. >> I think, Kenny, is that a particular opportunity for local government? As you mentioned, constrained resources mean that it's much more difficult for them to correctly configure their environments, but also to make this level of change; they have a lot of other responsibilities, and it's difficult to become cybersecurity experts.
Is that where you see Fortinet helping a great deal in local government? >> Yes, it is one of the key areas. The best way you can think of it is the ability to do what Jim was saying in a single pane of glass, and the fact that we can do that. That's something you don't hear a lot about anymore, but Fortinet actually is one of the largest security providers in the world. It has a single pane of glass across being able to manage your on-prem infrastructure, being able to manage if someone's migrating away from another cloud over to AWS, et cetera, and being able to look at these holistically. It's just a fantastic way for them to be efficient, as well as around training and certifications, helping our customers to be able to take advantage of the products without additional costs or other things. I've been throwing down the gauntlet for other providers to say, hey, security training shouldn't be something else that customers have to invest in. They're going to invest in your technology; you should provide them with the training, provide them with security awareness, provide them with certifications around your product. That should be table stakes. >> And we do see a lot of that. The structure of how to do this and provide that training tends to be the same regardless of where you are. Is that something that we see getting defined at the federal government level with some of the standards, and then that sort of trickles down into more local government? Kenny, is that something that you see happening at all? Or are we seeing things defined at local government that are actually going back the other way? >> Yeah, well, compliance runs across both. I mean, there's probably more compliance on the federal side, and Jim could speak to that, but certainly compliance is always a major factor. And it can't be that we just do one-off solutions for a particular compliance issue. It needs to be holistic, as we're talking about it.
If I have to pick solutions based on what and where they're protecting, then I also have to think about the compliance for each of those. That's yet another thing to think about, and I don't see our customers thinking that way. They don't have the skillsets to continue to evolve that way; that's an expanded use of what they're doing, and they just don't have those resources. So they have to be able to do more with less, as we've been talking about, and being able to take a platform like the fabric that Fortinet offers really gives that to them.
So the federal government has been turning the printing press, turning the crank faster and faster that will change, and this is one where can say you're spending smarter by moving in the direction of AWS and in accelerating that growth into the cloud, because my prediction as a former intelligence analyst is probably this time next year, a lot of federal agencies will be having the discussion about how to live in a much tightened budgetary environment because we went through something called sequestration 10 years ago that made for very tight zero sum budgeting. That's going to be a coming attraction and that's going to push federal government even more, so with the saying, I got to get the data off of Graham. I've got to continue to telework, Hey, and look we can follow the best practices of state and local government in this case. >> Well, it certainly sounds like we'll be able to learn from each other and adapt it. It's not going away. We're certainly going to have cybersecurity issues for the foreseeable future, but it sounds like there's a lot of work happening and there is room for happiness about how things are generally going. So, gentlemen, thank you so much for joining us here and please thank you to my guest Jim Richberg and Kenny Holmes from Fortinet. You've been watching theCube virtual and our coverage of AWS re:Invent 2020 with special coverage of the public sector. Make sure you check out all the rest of our coverage on your desktop laptop or phone wherever you might be. I've been your host, Justin Warren. I look forward to seeing you again soon. (soft upbeat music)

Published Date : Dec 9 2020



Talend Drives Data Health for Business Decisions


 

>> With me are Ann-Christel Graham, a.k.a. AC, the CRO of Talend, and Chris Degnan, the CRO of Snowflake. We have two go-to-market heavies on this section, folks. Welcome to theCUBE. >> Thank you. >> Thanks for having us. >> That's our pleasure. And so let's talk about digital transformation, right? Everybody loves to talk about it. It's an overused term, I know, but what does it mean? Let's talk about the vision of the data cloud for Snowflake and digital transformation. AC, we've been hearing a lot about digital transformation over the past few years. It means a lot of things to a lot of people. What are you hearing from customers? How are they thinking about what's sometimes called DX, and what's important to them? Maybe address some of the challenges they're facing. >> Dave, that's a great question. To our customers, digital transformation literally means staying in business or not. It's that simple. The reality is most agree on the opportunity to modernize data management infrastructure; they need to do that to create the speed and efficiency and cost savings that digital transformation promises. But now it's beyond that. What's become front and center for our customers is the need for trusted data, supported by an agile infrastructure that allows a company to pivot operations as they need. Let me give you an example of that. One of our customers, a medical device company, was on their digital journey when COVID hit. They started last year, in 2019, and as the pandemic hit in the earlier part of this year, they really needed to take a closer look at their supply chain, and they went through an entire supply chain optimization, having been completely disrupted. Think about the logistics, the transportation, the location of where they needed to get parts, all those things, when they were actually facing a need to increase production by about 20 times.
In order to meet the demand, you can imagine what that required them to do and how reliant they were on clean, compliant, accurate data that they could use to make extremely critical decisions for their business. And in that situation, not just decisions for their business, but decisions that would be about saving lives. So the stakes have gotten a lot higher, and that's just one industry; it's really across all industries. So when you talk to any of our customers, digital transformation really means having the confidence in data to support the business at critical times with accurate, trusted information. >> Chris, I've always said a key part of digital transformation is really putting data at the core of everything, you know? Not the manufacturing plant at the core with the data around it, but putting data at the center. It seems like that's what Snowflake is bringing to the table. Can you comment? >> Yeah, I mean, I think if I look across what's happening, and especially, as AC said, through COVID, customers are bringing more and more data sets. They want to make smarter business decisions based on data, making data-driven decisions. And we're seeing an acceleration of data moving to the cloud, because there's just an abundance of data, and it's challenging to actually manage that data on premise. And as we see those customers move those large data sets, I think what AC said is spot on: customers don't just want to have their data in the cloud, they actually want to understand what the data is, understand who has access to that data, and make sure that they're actually making smart business decisions based on that data.
And I think that's where the partnership between Talend and Snowflake is really tremendous, where we're helping our customers bring their data assets to the cloud, really landing it and allowing them to do multiple different types of workloads on top of this data cloud platform in Snowflake. And then I think what Talend is bringing to the table is really helping the customer make sure that they trust the data that they're actually seeing. And I think that's a really important aspect of digital transformation today. >> Awesome, and I want to get into the partnership, but I don't want to leave the pandemic just yet. AC, I want to ask you how it's affected customer priorities and timelines with regard to modernizing their data operations, and what I mean by that is the end-to-end life cycle of going from raw data to insights and how they're approaching those life cycles. Data quality is a key part of it; obviously you want to iterate and you want to move fast, but if it's garbage out, then you've got to start all over again. So what are you seeing in terms of the effect of the pandemic and the urgency of modernizing those data operations? >> Yeah, like Chris just said, it accelerated things for those companies that hadn't quite started their digital journey. Maybe it was something that they had budgeted for but hadn't quite resourced completely. For many of them, this is what it took to really get them off the dime, from that perspective, because there was no longer the opportunity to wait. They needed to go and take care of this really critical component within their business. So, you know, what COVID has taught companies, has taught all of us, is how vulnerable even the largest
companies and most robust enterprises could be. Those companies that had already begun their digital transformation, maybe even years ago, had already started that process and were in a great position in their journey. They fared a lot better and were able to be agile, were able to shift priorities, were able to go after what they needed to do to run their businesses better, and be able to do so with real clarity and confidence. And I think that's really the second piece of it. Over the last six months, people's lives have really depended on the data, people's lives have really depended on it, and certainly the pandemic has highlighted the importance of reliable and trustworthy information, not just the proliferation of data. And as Chris mentioned, with this data being available, it's really about making sure that you can use that data as an asset, and the greatest weapon we all have there is information, good information, to make great business decisions. >> Of course, Chris, the other thing we've seen is the acceleration to the cloud, which, since you're born in the cloud, has been a real tailwind. What are you seeing in that regard from your, I was going to say in the field, but from your Zoom vantage?
So So I'd say that you know, Cove, it has, you know, made customers that maybe we're may be hesitant to to start their journey in the cloud, move faster. And I've seen that, you know, really go at a blistering pace right now. >>You know, you just talked about, you know, value because it's all about value. But the old days of data quality in the early days of Chief Data, Officer all the focus was on risk avoidance. How do I get rid of data? How long do I have to keep it? And that has flipped dramatically. You know, sometime during the last decade, >>you can't get away too much from the need for quality data and and govern data. I think that's the first step. You can't really get to, um, you know, to trust the data without those components. And but to your point, the chief Data officers role, I would say, has changed pretty significantly. And in the round tables that I've participated in over the last, you know, several months. It's certainly a topic that they bring to the table that they'd like Thio chat with their peers about in terms of how they're navigating through the balance, that they still need toe to manage to the quality they still need to manage to the governance they still need. Thio ensure that that they're delivering that trusted information to the business. But now, on the flip side as well, they're being relied upon to bring new insights. And that's on bit's, um, really requiring them to work more cross functionally than they may have needed to in the past where that's been become a big part of their job is being that evangelist for data the evangelist. For that, those insights and being able to bring in new ideas for how the business can operate and identified, you know, not just not just operational efficiencies, but revenue opportunities, ways that they can shift. All you need to do is take a look at, for example, retail. 
You know, retail was heavily impacted by the pandemic this year on git shows how easily an industry could be could be just kind of thrown off its course simply by by a just a significant change like that. Andi need to be able to to adjust. And this is where, um when I've talked to some of the CEOs of the retail customers that we work with, they've had to really take a deep look at how they can leverage their the data at their fingertips to identify new in different ways in which they can respond to customer demands. So it's a it's a whole different dynamic. For sure, I it doesn't mean that that you walk away from the other and the original part of the role of the or the areas in which they were maybe more defined a few years ago when the role of the chief data officer became very popular. I do believe it's more of a balance at this point and really being able to deliver great value to the organization with the insights that they could bring >>well, is he stayed on that for a second. So you have this concept of data health, and I guess what kind of getting tad is that In the early days of Big Data Hadoop, it was just a lot of rogue efforts going on. People realize, Wow, there's no governance And what what seems like what snowflake and talent are trying to do is to make that the business doesn't have to worry about it. Build, build that in, don't bolt it on. But what's what's this notion of data health that you talk about? >>Companies can measure and do measure just about everything, every aspect of their business health. Um, except what's interesting is they don't have a great way to measure the health of their data, and this is an asset that they truly rely on. Their future depends on is that health of their data. And so if we take a little bit of a step back, maybe let's take a look at an example of a customer experiences to kind of make a little bit of a delineation between the differences of data, data, quality, data trust in what data health truly is. 
We work with a lot of health, a lot of hotel chains. And like all companies today, hotels collect a ton of information. There's mountains of information, private information about their customers through the loyalty clubs and all the information that they collect from there, the front desk, the systems that store their data. You can start to imagine the amount of information that a hotel chain has about an individual, and frequently that information has, you know, errors in it, such as duplicate entries, you know. Is it a Seagram, or is it in Chris Telegram? Same person, Slightly different, depending on how I might have looked or how I might have checked in at the time. And sometimes the data is also mismanaged, where because it's in so many different locations, it could be accessed by the wrong person of someone that wasn't necessarily intended to have that kind of visibility. And so these are examples of when you look at something like that. Now you're starting to get into, you know, privacy regulations and other kinds of things that could be really impactful to a business if data is in the wrong hands or the wrong data is in the wrong hands. So, you know, in a world of misinformation and mistrust, which is around us every single day, um, talent has really invented a way for businesses to verify the veracity, the accuracy of their data. And that's where data health really comes in Is being able to use a trust score to measure the data health on. That's what we have recently introduced is this concept of the trust score, something that can actually provide and measure, um, at the accuracy and the health of the data all the way down to an individual report. We believe that that that truly, you know, provides the explainable trust issue resolution, the kinds of things that companies are looking for in that next stage of overall data management. >>Thank you, Chris. Bring us home. 
So one of the key aspects of what Snowflake is doing is building out the ecosystem, which is very, very important. Talk about how you guys are partnering and adding value, in particular things that you're seeing customers do today within the ecosystem, or with the help of the ecosystem, that they weren't able to do previously. >> Yeah, I mean, I think, you know, AC mentioned it, you mentioned it: I spend a lot of my Zoom days talking to chief data officers, and as I'm talking to the chief data officers, they are so concerned about their responsibility for making sure that the business users are getting accurate data, so they view data governance as one aspect of it. But the other aspect is the circumference of the data, where it sits and who has access to that data, and making sure it's super secure. And I think, you know, Snowflake is a tremendous landing spot; being a data warehouse, or data cloud, data platform as a service, we take care of all of the securing of that data. And I think where Talend really helps our customer base is exactly what AC talked about: making sure that a business user, someone like myself who's looking at data all the time, trying to make decisions on how many salespeople I want to hire, how's my forecast coming, how's the product working, all that stuff, is actually looking at good data. And I think the combination of it all sitting in a single repository like Snowflake, and then layering a tool like Talend on top of it, where I can actually say, yeah, that is good data, helps me make smarter decisions faster. And ultimately, I think that's really where the ecosystem plays an incredibly important role for Snowflake and our customers. >> Guys, two great guests. I wish we had more time, but we've got to go, so thank you so much for sharing your perspectives.
A great conversation
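AC's trust score is a Talend product feature whose internals aren't described in the conversation above. Purely as a toy illustration of the idea of scoring data health, here is a sketch that averages two simple checks, completeness of required fields and uniqueness of records; the function name, the field names, and the equal weighting are all invented for the sketch and are not Talend's actual method:

```python
def data_health_score(rows, required_fields):
    """Toy data-health score (NOT Talend's trust score): the average of
    completeness (share of rows with all required fields filled in) and
    uniqueness (share of distinct rows), scaled to 0-100."""
    if not rows:
        return 0.0
    complete = sum(all(r.get(f) not in (None, "") for f in required_fields)
                   for r in rows) / len(rows)
    unique = len({tuple(sorted(r.items())) for r in rows}) / len(rows)
    return round(100 * (complete + unique) / 2, 1)

# Hypothetical hotel-guest records: one near-duplicate name, one missing name.
guests = [
    {"name": "A. Graham", "email": "ag@example.com"},
    {"name": "Ann-Christel Graham", "email": "ag@example.com"},
    {"name": "", "email": "x@example.com"},
]
score = data_health_score(guests, ["name", "email"])
```

A real trust score would also weigh validity, freshness, and usage signals; this sketch only shows why a single per-dataset number is a convenient way to track data health over time.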

Published Date : Nov 19 2020



4-video test


 

>> Okay, this is my presentation on coherent nonlinear dynamics and combinatorial optimization. This is going to be a talk to introduce an approach we're taking to the analysis of the performance of coherent Ising machines. So let me start with a brief introduction to Ising optimization. The Ising model represents a set of interacting magnetic moments, or spins, with the total energy given by the expression shown at the bottom left of this slide. Here the spin variables take binary values; the matrix element J_ij represents the interaction strength and sign between any pair of spins i and j, and h_i represents a possible local magnetic field acting on each spin. The Ising ground-state problem is to find an assignment of binary spin values that achieves the lowest possible value of the total energy, and an instance of the Ising problem is specified by giving numerical values for the matrix J and the vector h. Although the Ising model originates in physics, we understand the ground-state problem to correspond to what would be called quadratic binary optimization in the field of operations research, and in fact, in terms of computational complexity theory, it can be established that the Ising ground-state problem is NP-complete. Qualitatively speaking, this makes the Ising problem a representative sort of hard optimization problem, for which it is expected that the runtime required by any computational algorithm to find exact solutions should asymptotically scale exponentially with the number of spins n, for worst-case instances. Of course, there's no reason to believe that the problem instances that actually arise in practical optimization scenarios are going to be worst-case instances, and it's also not generally the case in practical optimization scenarios that we demand absolute optimum solutions.
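The slide's energy expression isn't reproduced in this transcript. As a minimal sketch, the standard form is E(s) = -Σ_{i<j} J_ij s_i s_j - Σ_i h_i s_i, and the ground-state problem can be stated as an exhaustive search; the sign convention and helper names below are my assumptions, not taken from the talk:

```python
import itertools

def ising_energy(J, h, s):
    """Ising energy E(s) = -sum_{i<j} J[i][j]*s[i]*s[j] - sum_i h[i]*s[i]."""
    n = len(s)
    e = -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    e -= sum(h[i] * s[i] for i in range(n))
    return e

def ground_state(J, h):
    """Exhaustive search over all 2^n spin assignments (tractable only for small n)."""
    n = len(h)
    return min(itertools.product([-1, 1], repeat=n),
               key=lambda s: ising_energy(J, h, s))

# Ferromagnetic pair with no field: the two aligned configurations are degenerate
# ground states, illustrating that J_ij > 0 favors equal spins in this convention.
J = [[0, 1], [1, 0]]
h = [0, 0]
s = ground_state(J, h)
```

The 2^n enumeration is exactly the exponential worst-case scaling the talk refers to, which is why heuristics and hardware approaches are of interest for large n.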
Usually we're more interested in just getting the best solution we can within an affordable cost, where cost may be measured in terms of time, service fees, and/or energy required for a computation. This focuses great interest on so-called heuristic algorithms for the Ising problem and other NP-complete problems, which generally get very good but not guaranteed-optimum solutions and run much faster than algorithms that are designed to find absolute optima. To get some feeling for present-day numbers, we can consider the famous traveling salesman problem, for which extensive compilations of benchmarking data may be found online. A recent study found that the best known TSP solver required median run times, across the library of problem instances, that scaled as a very steep root exponential for n up to approximately 4,500. This gives some indication of the change in runtime scaling for generic, as opposed to worst-case, problem instances. Some of the instances considered in this study were taken from a public library of TSPs derived from real-world VLSI design data. This VLSI TSP library includes instances with n ranging from 131 to 744,710. Instances from this library with n between 6,880 and 13,584 were first solved just a few years ago, in 2017, requiring days of run time on a 48-core 2-GHz cluster, while instances with n greater than or equal to 14,233 remain unsolved exactly by any means. Approximate solutions, however, have been found by heuristic methods for all instances in the VLSI TSP library, with, for example, a solution within 0.14% of a known lower bound having been discovered for an instance with n equal to 19,289, requiring approximately two days of run time on a single core at 2.4 GHz.
Now, if we simple-mindedly extrapolate the root-exponential scaling from the study up to n equal to 4,500, we might expect that an exact solver would require something more like a year of run time on the 48-core cluster used for the n = 13,584 instance, which shows how much a very small concession on the quality of the solution makes it possible to tackle much larger instances at much lower cost. At the extreme end, the largest TSP ever solved exactly has n = 85,900. This is an instance derived from a 1980s VLSI design, and it required 136 CPU-years of computation, normalized to a single core at 2.4 GHz. But the much larger so-called World TSP benchmark instance, with n = 1,904,711, has been solved approximately, with an optimality gap bounded below 0.474%. Coming back to the general practical concerns of applied optimization, we may note that a recent meta-study analyzed the performance of no fewer than 37 heuristic algorithms for Max-Cut and quadratic binary optimization problems, and found that different heuristics work best for different problem instances selected from a large-scale heterogeneous test bed, with some evidence of cryptic structure in terms of what types of problem instances were best solved by any given heuristic. Indeed, there are reasons to believe that these results on Max-Cut and quadratic binary optimization reflect a general principle of performance complementarity among heuristic optimization algorithms. In the practice of solving hard optimization problems, there thus arises a critical pre-processing issue of trying to guess which of a number of available good heuristic algorithms should be chosen to tackle a given problem instance. Assuming that any one of them would incur a high cost to run on a large problem instance, making an astute choice of heuristic is a crucial part of maximizing overall performance.
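As a concrete illustration of the exact-versus-heuristic trade-off just described, here is a toy TSP heuristic of the general kind benchmarked in such studies: nearest-neighbor tour construction followed by 2-opt local improvement. This is a generic sketch, not the solver from the study:

```python
import math

def tour_length(pts, tour):
    """Total length of a closed tour over 2-D points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor(pts):
    """Greedy construction: always visit the closest unvisited city next."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(pts[last], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(pts, tour):
    """Local improvement: reverse any segment whose reversal shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(pts, cand) < tour_length(pts, tour) - 1e-12:
                    tour, improved = cand, True
    return tour

# Unit square: the optimal tour visits the corners in order, length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = two_opt(pts, nearest_neighbor(pts))
```

Heuristics like this carry no optimality guarantee, but on generic instances they typically land within a small percentage of the optimum in polynomial time, which is exactly the concession the talk quantifies with the 0.474% World TSP gap.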
Unfortunately, we still have very little conceptual insight about what makes a specific problem instance good or bad for any given heuristic optimization algorithm. This has certainly been pinpointed by researchers in the field as a circumstance that must be addressed. So, adding this all up, we see that a critical frontier for cutting-edge academic research involves both the development of novel heuristic algorithms that deliver better performance with lower cost on classes of problem instances that are underserved by existing approaches, as well as fundamental research to provide deep conceptual insight into what makes a given problem instance easy or hard for such algorithms. In fact, these days, as we talk about the end of Moore's law and speculate about a so-called second quantum revolution, it's natural to talk not only about novel algorithms for conventional CPUs but also about highly customized special-purpose hardware architectures on which we may run entirely unconventional algorithms for combinatorial optimization, such as the Ising problem. So against that backdrop, I'd like to use my remaining time to introduce our work on the analysis of coherent Ising machine architectures and associated optimization algorithms. These machines, in general, are a novel class of information-processing architectures for solving combinatorial optimization problems by embedding them in the dynamics of analog, physical, or cyber-physical systems, in contrast both to more traditional engineering approaches that build Ising machines using conventional electronics and to more radical proposals that would require large-scale quantum entanglement. The emerging paradigm of coherent Ising machines leverages coherent nonlinear dynamics in photonic or opto-electronic platforms to enable near-term construction of large-scale prototypes that leverage post-CMOS information dynamics. The general structure of current CIM systems is shown in the figure on the right.
The role of the Ising spins is played by a train of optical pulses circulating around a fiber-optic storage ring. A beam splitter inserted in the ring is used to periodically sample the amplitude of every optical pulse, and the measurement results are continually read into an FPGA, which uses them to compute perturbations to be applied to each pulse by synchronized optical injections. These perturbations are engineered to implement the spin-spin coupling and local magnetic field terms of the Ising Hamiltonian, corresponding to a linear part of the CIM dynamics. A synchronously pumped parametric amplifier, denoted here as a PPLN waveguide, adds a crucial nonlinear component to the CIM dynamics as well. In the basic CIM algorithm, the pump power starts very low and is gradually increased. At low pump powers, the amplitudes of the Ising spin pulses behave as continuous complex variables, whose real parts, which can be positive or negative, play the role of soft, or perhaps mean-field, spins. Once the pump power crosses the threshold for parametric self-oscillation in the optical fiber ring, however, the amplitudes of the Ising spin pulses become effectively quantized into binary values. While the pump power is being ramped up, the FPGA subsystem continuously applies its measurement-based feedback implementation of the Ising Hamiltonian terms. The interplay of the linearized Ising dynamics implemented by the FPGA and the threshold quantization dynamics provided by the synchronously pumped parametric amplifier results in a final state of the optical pulse amplitudes, at the end of the pump ramp, that can be read out as a binary string, giving a proposed solution of the Ising ground-state problem.
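A crude classical caricature of this pump-ramp algorithm can be written as a mean-field ODE for the soft spin amplitudes. The specific equation form, ramp schedule, and coupling strength below are illustrative assumptions for the sketch, not the talk's exact model:

```python
import random

def cim_solve(J, steps=4000, dt=0.01, eps=0.1, seed=0):
    """Mean-field sketch of a CIM pump ramp: soft amplitudes x_i evolve under
    dx_i/dt = (p(t) - 1) x_i - x_i**3 + eps * sum_j J[i][j] x_j,
    with normalized loss 1, pump p(t) ramped linearly from 0 to 2 through the
    self-oscillation threshold, and spins read out at the end as sign(x_i)."""
    n = len(J)
    rng = random.Random(seed)
    x = [1e-3 * rng.uniform(-1, 1) for _ in range(n)]  # near-vacuum noise
    for t in range(steps):
        p = 2.0 * t / steps  # linear pump ramp through threshold p = 1
        x = [xi + dt * ((p - 1.0) * xi - xi ** 3
                        + eps * sum(J[i][j] * x[j] for j in range(n)))
             for i, xi in enumerate(x)]  # synchronous Euler update
    return [1 if xi >= 0 else -1 for xi in x]

# Two ferromagnetically coupled spins: the aligned mode has the lowest
# threshold, so the readout should give equal spins.
J = [[0, 1], [1, 0]]
spins = cim_solve(J)
```

In the real device the cubic saturation is supplied by the parametric amplifier and the J-coupling by the FPGA feedback; here both appear in one discretized ODE purely to show how threshold quantization turns continuous amplitudes into a binary answer.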
This method of solving the Ising problem seems quite different from a conventional algorithm that runs entirely on a digital computer, as a crucial aspect of the computation is performed physically by the analog, continuous, coherent nonlinear dynamics of the optical degrees of freedom. In our efforts to analyze CIM performance, we have therefore turned to the tools of dynamical systems theory: namely, a study of bifurcations, the evolution of critical points, and topologies of heteroclinic orbits and basins of attraction. We conjecture that such analysis can provide fundamental insight into what makes certain optimization instances hard or easy for coherent Ising machines, and hope that our approach can lead both to improvements of the core CIM algorithm and to a pre-processing rubric for rapidly assessing the CIM suitability of new instances. Okay, to provide a bit of intuition about how this all works, it may help to consider the threshold dynamics of just one or two optical parametric oscillators in the CIM architecture just described. We can think of each of the pulse time slots circulating around the fiber ring as representing an independent OPO. We can think of a single OPO degree of freedom as a single resonant optical mode that experiences linear dissipation, due to out-coupling loss, and gain in a pumped nonlinear crystal, as shown in the diagram on the upper left of this slide. As the pump power is increased from zero, as in the CIM algorithm, the nonlinear gain is initially too low to overcome linear dissipation, and the OPO field remains in a near-vacuum state. At a critical threshold value, where gain equals dissipation, the OPO undergoes a sort of lasing transition, and the steady states of the OPO above this threshold are essentially coherent states.
There are actually two possible values of the OPO coherent amplitude at any given above-threshold pump power, which are equal in magnitude but opposite in phase. When the OPO crosses this threshold, it basically chooses one of the two possible phases randomly, resulting in the generation of a single bit of information. If we consider two uncoupled OPOs, as shown in the upper-right diagram, pumped at exactly the same power at all times, then as the pump power is increased through threshold, each OPO will independently choose a phase, and thus two random bits are generated. For any number of uncoupled OPOs, the threshold power per OPO is unchanged from the single-OPO case. Now, however, consider a scenario in which the two OPOs are coupled to each other by a mutual injection of their out-coupled fields, as shown in the diagram on the lower right. One can imagine that, depending on the sign of the coupling parameter alpha, when one OPO is lasing it will inject a perturbation into the other that may interfere either constructively or destructively with the field that it is trying to generate by its own lasing process. As a result, it can easily be shown that for alpha positive there is an effective ferromagnetic coupling between the two OPO fields, and their collective oscillation threshold is lowered from that of the independent-OPO case, but only for the two collective oscillation modes in which the two OPO phases are the same. For alpha negative, the collective oscillation threshold is lowered only for the configurations in which the OPO phases are opposite. So then, looking at how alpha is related to the J_ij matrix of the Ising spin-coupling Hamiltonian, it follows that we could use this simplistic two-OPO CIM to solve the ground-state problem of a ferromagnetic or antiferromagnetic two-spin Ising model, simply by increasing the pump power from zero and observing what phase relation occurs as the two OPOs first start to lase.
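The threshold shifts for the two coupled OPOs can be read off from the linearized dynamics; a small sketch, where the normalized loss and the dictionary labels are my own conventions:

```python
def collective_thresholds(alpha, loss=1.0):
    """Pump thresholds of the two collective modes of two linearly coupled
    degenerate OPOs, from the linearized dynamics
        dx/dt = (p - loss) x + alpha * M x,  M = [[0, 1], [1, 0]].
    M's eigenmodes (1, 1) and (1, -1) have eigenvalues +1 and -1, so the
    in-phase mode oscillates at p = loss - alpha and the out-of-phase mode
    at p = loss + alpha."""
    return {"in_phase": loss - alpha, "out_of_phase": loss + alpha}

ferro = collective_thresholds(0.2)       # alpha > 0: in-phase mode lases first
antiferro = collective_thresholds(-0.2)  # alpha < 0: out-of-phase mode lases first
```

Reading out which mode lases first as the pump is ramped from zero is exactly the two-spin ground-state computation described above: alpha > 0 selects aligned phases (ferromagnetic), alpha < 0 selects opposite phases (antiferromagnetic).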
Clearly, we can imagine generalizing this story to larger n; however, the story doesn't stay as clean and simple for all larger problem instances, and to find a more complicated example we only need to go to n equals four. For some choices of J_ij at n equals four, the story remains simple, like the n-equals-two case. The figure on the upper left of this slide shows the energy of various critical points for a non-frustrated n-equals-four instance, in which the first bifurcated critical point, that is, the one that bifurcates at the lowest pump value a, flows asymptotically into the lowest-energy Ising solution. In the figure on the upper right, however, the first bifurcated critical point flows to a very good but suboptimal minimum at large pump power; the global minimum is actually given by a distinct critical point that first appears at a higher pump power and is not adiabatically connected to the origin. The basic CIM algorithm is thus not able to find this global minimum. Such non-ideal behaviors tend to become more common at larger n. For the n-equals-twenty instance shown in the lower plots, where the lower-right plot is just a zoom into a region of the lower-left plot, it can be seen that the global minimum corresponds to a critical point that first appears at a pump parameter a around 0.16, at some distance from the adiabatic trajectory of the origin. It's curious to note that in both of these small-n examples, however, the critical point corresponding to the global minimum appears relatively close to the adiabatic trajectory of the origin, as compared to most of the other local minima that appear. We're currently working to characterize the phase-portrait topology between the global minimum and the adiabatic trajectory of the origin, taking clues as to how the basic CIM algorithm could be generalized to search for non-adiabatic trajectories that jump to the global minimum during the pump ramp. 
Of course, n equals 20 is still too small to be of interest for practical optimization applications, but the advantage of beginning with the study of small instances is that we're able reliably to determine their global minima and to see how they relate to the adiabatic trajectory of the origin in the basic CIM algorithm. In the small-n limit, we can also analyze fully quantum-mechanical models of CIM dynamics, but that's a topic for future talks. Existing large-scale prototypes are pushing into the range of n equals 10^4, 10^5, or 10^6, so our ultimate objective in theoretical analysis really has to be to try to say something about CIM dynamics in the regime of much larger n. Our initial approach to characterizing CIM behavior in the large-n regime relies on the use of random matrix theory, and this connects to prior research on spin glasses, SK models, the TAP equations, et cetera. At present, we're focusing on statistical characterization of the CIM gradient-descent landscape, including the evolution of critical points and their eigenvalue spectra as the pump power is gradually increased. We're investigating, for example, whether there could be some way to exploit differences in the relative stability of the global minimum versus other local minima. We're also working to understand the deleterious, or potentially beneficial, effects of nonidealities, such as asymmetry in the implemented Ising couplings. Looking one step ahead, we plan to move next in the direction of considering more realistic classes of problem instances, such as quadratic binary optimization with constraints. So in closing, I should acknowledge the people who did the hard work on the things I've shown. 
My group, including graduate students Edwin Ng, Daniel Wennberg, Tatsuya Nagamoto and Atsushi Yamamura, has been working in close collaboration with Surya Ganguli, Marty Fejer and Amir Safavi-Naeini, all of us within the Department of Applied Physics at Stanford University, and also in collaboration with Yamamoto-san over at NTT PHI research labs. I should acknowledge funding support from the NSF through the Coherent Ising Machines Expedition in Computing, and also from NTT PHI research labs, the Army Research Office, and ExxonMobil. That's it, thanks very much. >> I'd like to thank NTT Research and Yoshi for putting together this program, and also for the opportunity to speak here. My name is Alireza Marandi and I'm from Caltech, and today I'm going to tell you about the work that we have been doing on networks of optical parametric oscillators: how we have been using them for Ising machines, and how we're pushing them toward quantum photonics. I'd like to acknowledge my team at Caltech, which is now eight graduate students and five researchers and postdocs, as well as collaborators from all over the world, including NTT Research, and also the funding from different places, including NTT. So this talk is primarily about networks of resonators, and these networks are everywhere, from nature, for instance the brain, which is a network of oscillators, all the way to optics and photonics. Some of the biggest examples are metamaterials, which are arrays of small resonators, and more recently the field of topological photonics, which is trying to implement a lot of the topological behaviors of condensed-matter models in photonics. And if you want to extend it even further, some of the implementations of quantum computing are technically networks of quantum oscillators. 
So we started thinking about these things in the context of Ising machines, which are based on the Ising problem, which is based on the Ising model: a simple summation over spins, where the spins can be either up or down and the couplings are given by the J_ij. The Ising problem is: if you know the J_ij, what is the spin configuration that gives you the ground state? This problem is shown to be NP-hard, so it's computationally important because it's a representative of the NP problems, and NP problems are important because, first, they're hard on standard computers if you use a brute-force algorithm, and second, they're everywhere on the application side. That's why there is this demand for making a machine that can target these problems and hopefully provide some meaningful computational benefit compared to standard digital computers. So we've been building these Ising machines based on this building block, which is a degenerate optical parametric oscillator. What it is is a resonator with nonlinearity in it: we pump these resonators and we generate a signal at half the frequency of the pump. One photon of pump splits into two identical photons of signal, and they have some very interesting phase- and frequency-locking behaviors. If you look at the phase-locking behavior, you realize that you can actually have two possible phase states as the oscillation result of these OPOs, which are off by pi, and that's one of their important characteristics. So I want to emphasize that a little more, and I have this mechanical analogy, which is basically two simple pendulums. But they are parametric oscillators, because I'm going to modulate a parameter of them in this video, which is the length of the string, and that modulation will act as a pump: I'm going to make them oscillate, and that will make a signal which is at half the frequency of the pump. 
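To make the problem statement concrete, here is a minimal illustrative sketch, not part of the talk's hardware, of the Ising energy and the brute-force search whose exponential cost is what motivates physical Ising machines:

```python
from itertools import product

def ising_energy(J, spins):
    """Ising energy H = -sum_{i<j} J[i][j] * s_i * s_j for spins s_i = +/-1."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def brute_force_ground_state(J):
    """Exhaustively search all 2^n spin configurations; this is exponential
    in n, which is why hard instances motivate physical Ising machines."""
    n = len(J)
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(J, s))

# Toy 4-spin ferromagnet: every pair coupled with J = +1 (upper triangle).
J = [[0, 1, 1, 1], [0, 0, 1, 1], [0, 0, 0, 1], [0, 0, 0, 0]]
gs = brute_force_ground_state(J)  # all spins aligned, energy -6
```

For a ferromagnet the ground state is all spins aligned; for a general J_ij, nothing better than this kind of enumeration is known to be guaranteed in the worst case.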
And I have two of them, to show you that they can acquire these phase states: they're still phase- and frequency-locked to the pump, but each can land in either the zero or the pi phase state. The idea is to use this binary phase to represent the binary Ising spin, so each OPO is going to represent a spin, which can be either zero or pi, or up or down. To implement the network of these resonators, we use the time-multiplexing scheme, and the idea is that we put pulses in the cavity. These pulses are separated by the repetition period that you put in, or T_R, and you can think about these pulses in one resonator as temporally separated synthetic resonators. If you want to couple these resonators to each other, you can introduce delays, each of which is a multiple of T_R. If you look at the shortest delay, it couples resonator 1 to 2, 2 to 3 and so on. If you look at the second delay, which is two times the repetition period, it couples 1 to 3, 2 to 4 and so on. And if you have n minus one delay lines, then you can have any potential couplings among these synthetic resonators. If I can introduce modulators in those delay lines, so that I can control the strength and the phase of these couplings at the right time, then I can have a programmable all-to-all connected network in this time-multiplexed scheme, and the whole physical size of the system scales linearly with the number of pulses. So the idea of the OPO-based Ising machine is this: having these OPOs, each of which can be either zero or pi, and being able to arbitrarily connect them to each other, I start by programming this machine to a given Ising problem, just by setting the couplings, by setting the controllers in each of those delay lines. So now I have a network which represents an Ising problem, and the Ising problem maps to finding the phase state that satisfies the maximum number of coupling constraints. 
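The delay-line bookkeeping can be sketched in a few lines of code. This is a hypothetical toy model of the addressing scheme only (delay line k, of length k times T_R, feeds pulse i forward into pulse i + k, with a per-slot modulator strength), not a simulation of the optics:

```python
import numpy as np

def coupling_matrix_from_delays(n, delay_strengths):
    """Toy model of time-multiplexed coupling: delay line k injects pulse i
    into pulse i + k, and a modulator in that line can apply an independent
    strength for every pulse slot.  delay_strengths[k][i] is the modulated
    strength seen by pulse i in delay line k (k = 1 .. n-1)."""
    J = np.zeros((n, n))
    for k, strengths in delay_strengths.items():
        for i, a in enumerate(strengths):
            j = i + k
            if j < n:
                J[j, i] = a   # pulse i feeds forward into pulse j = i + k
    return J

# With n - 1 = 3 delay lines, any feed-forward coupling pattern over n = 4
# pulses is programmable (hypothetical strengths shown).
J = coupling_matrix_from_delays(4, {1: [0.5, -0.2, 0.7],
                                    2: [0.1, 0.3],
                                    3: [-0.4]})
```

The point of the sketch is the counting argument from the talk: one delay line per offset, n - 1 offsets in total, gives access to every pair of synthetic resonators while the hardware grows only linearly with the number of pulses.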
And the way it happens is that the Ising Hamiltonian maps to the linear loss of the network, and if I start adding gain by putting pump into the network, then the OPOs are expected to oscillate in the lowest-loss state. We have been doing this over the past six or seven years, and I'm just going to quickly show you the transitions: especially what happened in the first implementation, which was using a free-space optical system, then the guided-wave implementation in 2016, and the measurement-feedback idea, which led to increasing the size and doing actual computation with these machines. I just want to make the distinction here that the first implementation was an all-optical interaction; we also had an n-equals-16 all-optical implementation, and then we transitioned to this measurement-feedback idea, which I'll tell you about quickly. There's still a lot of ongoing work, especially on the NTT side, to make larger machines using measurement feedback, but I'm going to mostly focus on the all-optical networks: how we're using all-optical networks to go beyond simulation of the Ising Hamiltonian, both on the linear and the nonlinear side, and also how we're working on miniaturization of these OPO networks. So the first experiment, which was the four-OPO machine, was a free-space implementation, and this is the actual picture of the machine. We implemented a small n-equals-four MAX-CUT problem on the machine, so one problem for one experiment, and we ran the machine 1000 times; we looked at the state, and we always saw it oscillate in one of the ground states of the Ising Hamiltonian. Then the measurement-feedback idea was to replace those couplings and the controller with a simulator: we basically simulated all those coherent interactions on an FPGA, and we replicated the coherent pulse with respect to all those measurements. Then we injected it back into the cavity, and the nonlinearity still remains. 
So it is still a nonlinear dynamical system, but the linear side is all simulated. There are lots of questions about whether this system preserves the important information or not, or whether it's going to behave better computation-wise, and that's still a lot of ongoing study. But nevertheless, the reason this implementation was very interesting is that you don't need the n minus one delay lines, so you can just use one; then you can implement a large machine, run several thousands of problems on the machine, and compare the performance from the computational perspective. So I'm going to split this idea of the OPO-based Ising machine into two parts. One is the linear part: if you take the nonlinearity out of the resonator and just think about the connections, you can think about this as a simple matrix-multiplication scheme, and that's basically what gives you the Ising Hamiltonian modeling: the optical loss of this network corresponds to the Ising Hamiltonian. If I just show you the example of the n-equals-four experiment, with all those phase states and the histogram that we saw, you can actually calculate the loss of each of those states, because all those interferences in the beam splitters and the delay lines are going to give you different losses, and then you will see that the ground states correspond to the lowest loss of the actual optical network. If you add the nonlinearity, the simple way of thinking about what the nonlinearity does is that it provides the gain; then you start bringing up the gain so that it hits the loss, and you go through the gain saturation, or the threshold, which is going to give you this phase bifurcation. So you go to either the zero or the pi phase state, and the expectation is that the network oscillates in the lowest possible loss state. 
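The lowest-loss-mode intuition can be sketched with a toy gradient-like model, assuming a pitchfork nonlinearity per OPO and a slow pump ramp. This is an illustrative caricature of the dynamics described above, with made-up parameter values, not the machine's actual equations of motion:

```python
import numpy as np

def run_cim(J, coupling=0.2, p_max=2.0, dt=0.01, steps=40000, seed=1):
    """Toy network dynamics with a linear pump ramp:
        dx_i/dt = (p(t) - 1) * x_i - x_i**3 + coupling * sum_j J[i, j] * x_j
    The lowest-loss collective mode reaches threshold first, so the signs of
    x at the end encode a low-energy Ising configuration."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    x = rng.normal(scale=1e-3, size=n)
    for t in range(steps):
        p = p_max * t / steps          # slow pump ramp from 0 to p_max
        x = x + dt * ((p - 1.0) * x - x**3 + coupling * (J @ x))
    return np.sign(x)

# Antiferromagnetic 4-cycle (bipartite, unfrustrated): the ground state
# alternates spins around the ring.
J = -np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)
spins = run_cim(J)
```

For this unfrustrated instance the first mode to cross threshold is the alternating one, so the readout matches the Ising ground state; the earlier n-equals-four and n-equals-twenty discussion in this transcript is precisely about when that simple picture breaks down.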
There are some challenges associated with this intensity-driven phase transition, which I'm going to briefly talk about, and I'm also going to tell you about other types of nonlinear dynamics that we're looking at on the nonlinear side of these networks. If you just think about the linear network, we're actually interested in looking at some topological behaviors in these networks. The difference between looking at topological behaviors and the Ising machine is that now, first of all, we're looking at types of Hamiltonians that are a little different from the Ising Hamiltonian, and one of the biggest differences is that most of these topological Hamiltonians require breaking time-reversal symmetry, meaning that if you go from one spin on one side to another side you get one phase, and if you go back you get a different phase. The other thing is that we're not just interested in finding the ground state: we're now interested in looking at all sorts of states, and at the dynamics and behaviors of all these states in the network. So we started with the simplest implementation, of course, which is a one-dimensional chain of these resonators, which corresponds to the so-called SSH model in the topological work. We get the similar energy-to-loss mapping, and now we can actually look at the band structure. This is an actual measurement that we get with this SSH model, and you see how reasonably well it actually follows the prediction and the theory. One of the interesting things about the time-multiplexing implementation is that now you have the flexibility of changing the network as you are running the machine, and that's something unique about this time-multiplexed implementation, so that we can actually look at the dynamics. And one example that we have looked at is that we can actually go through the transition from the topological to the standard nontrivial, 
I'm sorry, to the trivial behavior of the network. You can then look at the edge states, and you can see both the trivial end states and the topological edge states actually showing up in this network. We have just recently implemented a two-dimensional network with the Harper-Hofstadter model; we don't have the results here, but one of the other important characteristics of time multiplexing is that you can go to higher and higher dimensions while keeping that flexibility and those dynamics. We can also think about adding nonlinearity, both in the classical and quantum regimes, which is going to give us a lot of exotic nonclassical and quantum nonlinear behaviors in these networks. So I told you mostly about the linear side; let me just switch gears and talk about the nonlinear side of the network. The biggest thing that I talked about so far in the Ising machine is this phase transition at threshold. Below threshold we have squeezed states in these OPOs; if you increase the pump, we go through this intensity-driven phase transition and then we get the phase states above threshold. This is basically the mechanism of the computation in these OPOs: this phase transition from below to above threshold. One of the characteristics of this phase transition is that below threshold you expect to see quantum states, and above threshold you expect to see more classical states, or coherent states, and that basically corresponds to the intensity of the driving pump. So it's really hard to imagine that you can go above threshold, or have this phase transition happen, all in the quantum regime. There are also some challenges associated with the intensity homogeneity of the network: for example, if one OPO starts oscillating and its intensity goes really high, it's going to ruin the collective decision-making of the network, because of the intensity-driven nature of the phase transition. 
So the question is: can we look at other phase transitions, can we utilize them for computing, and can we bring them to the quantum regime? I'm going to specifically talk about a phase transition in the spectral domain, which is the transition from the so-called degenerate regime, which is what I mostly talked about, to the non-degenerate regime, which happens by just tuning the phase of the cavity. What is interesting is that this phase transition corresponds to a distinct phase-noise behavior. In the degenerate regime, which we call the ordered state, the phase is locked to the phase of the pump, as I talked about. In the non-degenerate regime, however, the phase is mostly dominated by quantum diffusion of the phase, which is limited by the so-called Schawlow-Townes limit, and you can see that transition from the degenerate to the non-degenerate regime, which also has distinct symmetry differences. This transition corresponds to a symmetry breaking: in the non-degenerate case the signal can acquire any of the phases on the circle, so it has a U(1) symmetry, and if you go to the degenerate case, that symmetry is broken and you only have the zero and pi phase states. So now the question is: can we utilize this phase transition, which is a phase-driven phase transition, and can we use it for a similar computational scheme? That's one of the questions that we're also thinking about. And this phase transition is not just important for computing; it's also interesting from the sensing perspective, and you can easily bring it below threshold and operate in the quantum regime, either Gaussian or non-Gaussian. If you make a network of OPOs, we can now see all sorts of more complicated and more interesting phase transitions in the spectral domain. 
One of them is a first-order phase transition, which you get by just coupling two OPOs, and that's a very abrupt phase transition compared to the single-OPO phase transition. If you do the couplings right, you can actually get a lot of non-Hermitian dynamics and exceptional points, which are very interesting to explore, both in the classical and quantum regimes. I should also mention that you can think about the couplings being nonlinear couplings as well, and that's another behavior that you can see, especially in the non-degenerate regime. So with that, I've basically told you about these OPO networks: how we can think about the linear scheme and the linear behaviors, and how we can think about the rich nonlinear dynamics and nonlinear behaviors, both in the classical and quantum regimes. I want to switch gears and tell you a little bit about the miniaturization of these OPO networks. Of course, the motivation is that if you look at electronics and what we had 60 or 70 years ago with vacuum tubes, and how we transitioned from relatively small-scale computers on the order of thousands of nonlinear elements to the billions of nonlinear elements where we are now, then optics today is probably very similar to 70 years ago: a tabletop implementation. And the question is, how can we utilize nanophotonics? I'm going to just briefly show you the two directions that we're working on. One is based on lithium niobate, and the other is based on even smaller resonators. So the work on nanophotonic lithium niobate was started in collaboration with Marko Loncar at Harvard, and also Marty Fejer at Stanford. We could show that you can do the periodic poling in thin-film lithium niobate and get all sorts of very highly nonlinear processes happening in this nanophotonic periodically poled lithium niobate, and now we're working on building OPOs based on that kind of photonic thin-film lithium niobate. 
And these are some examples of the devices that we have been building in the past few months, which I'm not going to tell you more about, but the OPOs and the OPO networks are in the works. And that's not the only way of making large networks. I also want to point out that the reason these nanophotonic platforms are exciting is not just because you can make large networks and make them compact in a small footprint; they also provide some opportunities in terms of the operation regime. One of them is about making cat states in an OPO, which is: can we have the quantum superposition of the zero and pi states that I talked about? The nanophotonic lithium niobate provides some opportunities to actually get closer to that regime, because of the spatio-temporal confinement that you can get in these waveguides. We're doing some theory on that, and we're confident that the nonlinearity-to-loss ratio that you can get with these platforms is actually much higher than what you can get with the existing platforms. To go even smaller, we have been asking the question: what is the smallest possible OPO that you can make? You can think about really wavelength-scale resonators, adding the chi-2 nonlinearity, and seeing how and when you can get the OPO to operate. Recently, in collaboration with USC and CREOL, we have demonstrated that you can use nanolasers and get some spin-Hamiltonian implementations on those networks, so if you can build the OPOs, we know that there is a path for implementing OPO networks on such a nanoscale. We have looked at these calculations and tried to estimate the threshold of such an OPO, say for a wavelength-scale resonator, and it turns out that it can actually be even lower than the type of bulk PPLN OPOs that we have been building in the past 50 years or so. 
So we're working on the experiments, and we're hoping that we can actually make larger and larger scale OPO networks. Let me summarize the talk: I told you about the OPO networks and our work that has been going on with Ising machines and measurement feedback; I told you about the ongoing work on the all-optical implementations, both on the linear side and on the nonlinear behaviors; and I also told you a little bit about the efforts on miniaturization and going to the nanoscale. So with that, I would like to conclude. >> Hello, I'm Timothée Leleu from the University of Tokyo. Before I start, I would like to thank Yoshi and all the staff of NTT for the invitation and the organization of this online meeting, and I would also like to say that it has been very exciting to see the growth of this new PHI Lab. I'm happy to share with you today some of the recent works that have been done either by me or by colleagues. The title of my talk is "A neuromorphic in-silico simulator for the coherent Ising machine," and here is the outline. I would like to make the case that simulation of the CIM in digital electronics can be useful for better understanding or improving its function principles, by introducing some ideas from neural networks. This is what I will discuss in the first part; then I will show some proof of concept of the gain in performance that can be obtained using this simulation in the second part, and a projection of the performance that can be achieved using a very large-scale simulator in the third part, and finally I will talk about future plans. So first, let me start by comparing recently proposed Ising machines, using this table adapted from a recent Nature Electronics paper, and this comparison shows that there is always a trade-off between energy efficiency, speed and scalability that depends on the physical implementation. 
So in red here are the limitations of each of these hardware platforms, and, interestingly, the FPGA-based systems, such as the Toshiba bifurcation machine or a recently proposed restricted Boltzmann machine FPGA implementation by a group in Berkeley, offer a good compromise between speed and scalability. This is why, despite the unique advantages that some of these other hardware platforms have, such as the coherent dynamics of optical CIMs or the energy efficiency of memristors, FPGAs are still an attractive platform for building large Ising machines in the near future. The reason for the good performance of FPGAs is not so much that they operate at high frequency, nor that they are particularly energy efficient, but rather that the physical wiring of their elements can be reconfigured in a way that limits the von Neumann bottleneck, large fan-in and fan-out, and the long propagation of information within the system. In this respect, FPGAs are interesting from the perspective of the physics of complex systems, rather than the physics of electrons and photons. To put the performance of these various hardware platforms in perspective, we can look at the computation in the brain: the brain computes using billions of neurons, using only about 20 watts of power, and operating at very low frequencies. These impressive characteristics motivate us to investigate what kind of neuro-inspired principles could be useful for designing better Ising machines. The idea of this research project, within this collaboration, is to temporarily alleviate the limitations that are intrinsic to the realization of an optical coherent Ising machine, shown in the top panel here, 
by designing a large-scale simulator in silicon, shown in the bottom panel here, that can be used for testing better organization principles for the CIM. In this talk I will discuss three neuro-inspired principles: the asymmetry of connections, which induces neural dynamics that are often chaotic; the microstructure of connectivity, meaning that neural networks are not composed of the repetition of always the same type of neuron, but rather have a local structure that is repeated, as in this schematic of a micro-column in the cortex; and lastly, the hierarchical organization of connectivity, which is organized in a tree structure in the brain, and here you see a representation of the hierarchical organization of the monkey cerebral cortex. So how can these principles be used to improve the performance of Ising machines and their in-silico simulation? First, about the two principles of asymmetry and microstructure. We know that the classical approximation of the coherent Ising machine is analogous to rate-based neural networks; in the case of the Ising machine, the classical approximation can be obtained using the truncated Wigner approximation, for example. The dynamics of the system can then be described by the following ordinary differential equations, in which, in the case of the CIM, the x_i represent the in-phase components of the DOPOs, the function f represents the nonlinear optical part, the degenerate optical parametric amplification, and the sum over J_ij x_j represents the coupling, which is done in the case of the measurement-feedback CIM using homodyne detection and an FPGA, and then injection of the computed feedback. These dynamics, in both the cases of the CIM and of neural networks, can be written as gradient descent of a potential function V, which is written here, and this potential function includes the Ising Hamiltonian. 
So this is why it's natural to use this type of dynamics to solve the Ising problem, in which the Ω_ij are the Ising couplings and H is the extension of the Ising Hamiltonian to the analog variables x. Note that this potential function can only be defined if the Ω_ij are symmetric. The well-known problem of this approach is that the potential function V that we obtain is very nonconvex at low temperature, and one strategy is to gradually deform this landscape using an annealing process, but there is, unfortunately, no theorem that guarantees convergence to the global minimum of the Ising Hamiltonian using this approach. And so this is why we propose to introduce a microstructure into the system, where one analog spin, or one DOPO, is replaced by a pair of one analog spin and one error-correction variable. The addition of this microstructure introduces asymmetry into the system, which in turn induces chaotic dynamics: a chaotic search, rather than a gradient descent, for the ground state of the Ising Hamiltonian. Within this microstructure, the role of the error variables is to control the amplitudes of the analog spins, to force the amplitudes of the x_i to become equal to a certain target amplitude a, and this is done by modulating the strength of the Ising couplings: you see the error variable e_i multiplying the Ising coupling term here in the dynamics of each DOPO. The whole dynamics is then described by these coupled equations, and because the e_i do not necessarily take the same value for different i, this introduces asymmetry into the system, which in turn creates chaotic dynamics, which I show here for solving a certain size of SK problem, in which the x_i are shown here, the e_i are shown here, and the value of the Ising energy is shown in the bottom plots. 
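A minimal numerical sketch of this amplitude-control idea, under simplifying assumptions (a tiny ferromagnetic instance, illustrative parameter values, simple Euler integration), might look like the following; it is a caricature of the coupled equations just described, not the FPGA implementation:

```python
import numpy as np

def run_cim_with_error_correction(J, p=1.1, target=0.5, beta=1.0,
                                  coupling=0.5, dt=0.01, steps=20000, seed=2):
    """Each analog spin x_i is paired with an error variable e_i that
    rescales its Ising couplings and pushes x_i**2 toward the target:
        dx_i/dt = (p - 1) * x_i - x_i**3 + coupling * e_i * sum_j J[i,j] x_j
        de_i/dt = -beta * (x_i**2 - target) * e_i
    """
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    x = rng.normal(scale=0.1, size=n)
    e = np.ones(n)
    for _ in range(steps):
        dx = (p - 1.0) * x - x**3 + coupling * e * (J @ x)
        de = -beta * (x**2 - target) * e
        x, e = x + dt * dx, e + dt * de
    return x, e

# Tiny 3-spin ferromagnet: ground state has all spins aligned.
J = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
x, e = run_cim_with_error_correction(J)
```

For this easy instance the dynamics settle onto the aligned ground state with all amplitudes at the target, so the e_i converge to a common stationary value; on hard instances the amplitude heterogeneity instead keeps the e_i unequal, which is exactly the asymmetry that drives the chaotic search.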
You see this Celtics search that visit various local minima of the as Newtonian and eventually finds the global minimum? Um, it can be shown that this modulation off the target opportunity can be used to destabilize all the local minima off the icing evertonians so that we're gonna do not get stuck in any of them. On more over the other types of attractors I can eventually appear, such as limits I contractors, Okot contractors. They can also be destabilized using the motivation of the target and Batuta. And so we have proposed in the past two different moderation of the target amateur. The first one is a modulation that ensure the uh 100 reproduction rate of the system to become positive on this forbids the creation off any nontrivial tractors. And but in this work, I will talk about another moderation or arrested moderation which is given here. That works, uh, as well as this first uh, moderation, but is easy to be implemented on refugee. So this couple of the question that represent becoming the stimulation of the cortex in machine with some error correction they can be implemented especially efficiently on an F B. G. And here I show the time that it takes to simulate three system and also in red. You see, at the time that it takes to simulate the X I term the EI term, the dot product and the rising Hamiltonian for a system with 500 spins and Iraq Spain's equivalent to 500 g. O. P. S. So >>in >>f b d a. The nonlinear dynamics which, according to the digital optical Parametric amplification that the Opa off the CME can be computed in only 13 clock cycles at 300 yards. So which corresponds to about 0.1 microseconds. And this is Toby, uh, compared to what can be achieved in the measurements back O C. M. In which, if we want to get 500 timer chip Xia Pios with the one she got repetition rate through the obstacle nine narrative. Uh, then way would require 0.5 microseconds toe do this so the submission in F B J can be at least as fast as ah one g repression. 
repetition-rate pulsed-laser CIM. Then the dot product that appears in this differential equation can be computed in 43 clock cycles, that is to say, one microsecond. So for problem sizes that are larger than 500 spins, the dot product clearly becomes the bottleneck, and this can be seen by looking at the scaling of the time, the number of clock cycles it takes to compute either the nonlinear optical part or the dot product, with respect to the problem size. If we had an infinite amount of resources on the FPGA to simulate the dynamics, then the nonlinear optical part could be done in O(1), and the matrix-vector product could be done in O(log N); it scales as the logarithm of N, because computing the dot product involves summing all the terms in the product, which is done on an FPGA by an adder tree, whose height scales logarithmically with the size of the system. But this is in the case where we had an infinite amount of resources on the FPGA. For dealing with larger problems, of more than 100 spins, we usually need to decompose the matrix into smaller blocks, with a block size that I denote ν here, and then the scaling becomes, for the nonlinear part, linear in N over ν, and for the dot products, (N over ν) squared. Typically, for a low-end FPGA, the block size ν of this matrix is about 100. So clearly we want to make ν as large as possible in order to maintain this O(log N) scaling for the number of clock cycles needed to compute the dot product, rather than the squared scaling that occurs if we decompose the matrix into smaller blocks. But the difficulty in having these larger blocks is that having a very large adder tree introduces large fan-in and fan-out and long-distance data paths within the FPGA.
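The clock-cycle argument in this passage can be sketched with a toy cost model (a deliberate simplification that ignores pipelining and I/O, so it captures only the scaling): an adder tree of height ceil(log2 N) sums N products, and tiling an N x N matrix into blocks of size ν costs roughly (N/ν)² sequential block operations.

```python
from math import ceil, log2

def matvec_cycles(n, block=None):
    """Toy clock-cycle model for an n x n matrix-vector product on an FPGA.

    With unlimited multipliers, one adder tree of height ceil(log2(n))
    sums each row of products, so the product takes ~log2(n) cycles.
    With a tile of size `block`, the matrix is decomposed into
    (n/block)^2 tiles processed sequentially, each summed by a smaller
    tree of height ceil(log2(block)).  Pipelining and I/O are ignored.
    """
    if block is None or block >= n:
        return ceil(log2(n))
    tiles = ceil(n / block) ** 2
    return tiles * ceil(log2(block))

full = matvec_cycles(512)              # unlimited resources: O(log n)
tiled = matvec_cycles(500, block=100)  # dominated by the (n/block)^2 factor
```

The comparison between `full` and `tiled` is exactly the motivation given in the talk for making ν as large as the fan-in/fan-out constraints allow.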
So the solution to get higher performance for a simulator of the coherent Ising machine is to get rid of this bottleneck for the dot product by increasing the size of this adder tree. And this can be done by organizing hierarchically the electrical components within the FPGA, in a way which is shown here in this right panel, in order to minimize the fan-in and fan-out of the system and to minimize the long-distance data paths in the FPGA. I'm not going into the details of how this is implemented on the FPGA, but just to give you an idea of why the hierarchical organization of the system becomes extremely important to get good performance for a simulator of the Ising machine. So instead of getting into the details of the FPGA implementation, I would like to give a few benchmark results for this simulator, which was used as a proof of concept for this idea, and which can be found in this arXiv paper here. Here I show results for solving SK problems: fully connected, randomly chosen plus-or-minus-one spin-glass problems. As the metric we use the number of matrix-vector products, since it's the bottleneck of the computation, needed to get the optimal solution of these SK problems with 99% success probability, against the problem size. In red here is this proposed FPGA implementation, in blue is the number of matrix-vector products that are necessary for the CIM without error correction to solve these SK problems, and in green here is noisy mean-field annealing, whose behavior is similar to the coherent Ising machine. And so clearly you see that the number of matrix-vector products necessary to solve this problem scales with a better exponent than these other approaches.
So that's an interesting feature of the system, and next we can see the real time to solution to solve these SK instances. So on this next slide, the time to solution in seconds to find the ground state of SK instances with 99% success probability is shown for different state-of-the-art hardware. In red is the FPGA implementation proposed in this paper, and the other curves represent breakout local search, in orange, and simulated annealing, in purple, for example. And so you see that the scaling of this proposed simulator is rather good, and that for larger problem sizes we can get orders of magnitude faster than the state-of-the-art approaches. Moreover, the relatively good scaling of the time to solution with respect to problem size indicates that the FPGA implementation would be faster than other recently proposed Ising machines, such as the Hopfield neural network implemented on memristors, shown in blue here, which is very fast for small problem sizes but whose scaling is not good, and the same thing for the restricted Boltzmann machine implemented on FPGA, proposed recently by another group, which is again very fast for small problem sizes but whose scaling is bad, so that it is worse than the proposed approach. So we can expect that for problem sizes larger than 1000 spins, the proposed approach would be the faster one. Let me jump to this other slide. Another confirmation that the scheme scales well is that we can find maximum-cut values on the benchmark set, the G-set, better than the ones that have been previously found by any other algorithms, so they are the best-known cut values to the best of our knowledge.
This is shown in the table in this paper here. In particular, for instances 14 and 15 of this G-set, we can find better cut values than previously known, and we can find these cut values 100 times faster than the state-of-the-art algorithm used to do this, which is a very competitive heuristic. Note that getting these good results on the G-set does not require any particularly hard tuning of the parameters. The tuning used here is very simple: it just depends on the degree of connectivity within each graph. And so these good results on the G-set indicate that the proposed approach would be good not only at solving SK problems, but all types of graph Ising problems, and max-cut problems in particular. So, given that the performance of the design depends on the height of this adder tree, we can try to maximize the height of this adder tree on a large FPGA, carefully routing the components within the FPGA, and we can draw some projections of what type of performance we can achieve in the near future based on the implementation that we are currently working on. So here you see projections of the time to solution, with 99% success probability, for solving these SK problems with respect to the problem size, compared to different state-of-the-art Ising machines, in particular the Digital Annealer, shown by the green line without dots. And we show two different hypotheses for these projections: either that the time to solution scales as an exponential of N, or that the time to solution scales as an exponential of the square root of N.
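Since the talk moves from SK energies to G-set cut values, it may help to recall how a ±1 spin assignment yields a cut: the cut value is the total weight of edges whose endpoints have opposite spins, and it relates linearly to the Ising energy term. The tiny graph below is an illustrative toy, not a G-set instance.

```python
import numpy as np

def cut_value(w, s):
    # total weight of edges whose endpoints get opposite signs
    n = len(s)
    return sum(w[i][j] for i in range(n) for j in range(i + 1, n)
               if s[i] != s[j])

# Toy 4-node weighted graph (symmetric adjacency matrix).
w = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])
s = [1, -1, 1, -1]

# Identity: cut = (W_total - sum_{i<j} w_ij s_i s_j) / 2,
# so maximizing the cut minimizes the Ising sum over edges.
w_total = w[np.triu_indices(4, 1)].sum()
ising_sum = sum(w[i][j] * s[i] * s[j]
                for i in range(4) for j in range(i + 1, 4))
```

This identity is why an Ising solver's spin configurations can be scored directly as max-cut values on the G-set.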
So it seems, according to the data, that the time to solution scales more as an exponential of the square root of N, and these projections show that we probably can solve an SK problem of size 2000 spins, that is, find the real ground state of the problem with 99% success probability, in about 10 seconds, which is much faster than all the other proposed approaches. So, some of the future plans for this coherent Ising machine simulator. The first thing is that we would like to make the simulation closer to the real DOPO optical system, in particular, as a first step, to get closer to the system of the measurement-feedback CIM. And to do this, what is simulatable on the FPGA is the quantum Gaussian model that is described in this paper and proposed by people in the NTT group. The idea of this model is that, instead of having the very simple ODEs I have shown previously, it includes paired ODEs that take into account not only the mean of the in-phase and quadrature components but also their variance, so that we can take into account more quantum effects of the DOPO, such as the squeezing. And then we plan to make the simulator open access for members to run their instances on the system. There will be a first version in September that will be based on a simple command-line access to the simulator, and which will have just the classical approximation of the system, with a noise term and binary weights. Then we will propose a second version that will extend the coherent Ising machine to a rack of FPGAs, in which we will add the more refined model, the truncated quantum Gaussian model I just talked about, and which will support real-valued weights for the Ising problems.
So we will announce later when this is available, and everyone is working hard on it. >>I come from the University of Notre Dame, physics department, and I'd like to thank the organizers for their kind invitation to participate in this very interesting and promising workshop. I would also like to say that I look forward to collaborations with the PHI Lab and Yoshi and collaborators on the topics of this workshop. So today I'll briefly talk about our attempt to understand the fundamental limits of analog continuous-time computing, at least from the point of view of Boolean satisfiability problem solving, using ordinary differential equations. But I think the issues that we raise during this investigation actually apply to other analog approaches as well, and to other problems as well. I think everyone here knows what Boolean satisfiability problems are: you have N Boolean variables and M clauses, each a disjunction of literals; a literal is a variable or its negation. And the goal is to find an assignment to the variables such that all clauses are true. This is a decision-type problem from the NP class, which means you can check in polynomial time the satisfiability of any assignment. And 3-SAT is NP-complete, for k equal to three or larger, which means an efficient 3-SAT solver implies an efficient solver for all the problems in the NP class, because all the problems in the NP class can be reduced in polynomial time to 3-SAT. As a matter of fact, you can reduce the NP-complete problems into each other: you can go from 3-SAT to set packing, or to maximum independent set, which is set packing in graph-theoretic terms, to the decision version of the Ising spin-glass problem. This is useful when you're comparing different approaches working on different kinds of problems. When not all the clauses can be satisfied, you're looking at the optimization version of SAT, called MaxSAT.
And the goal there is to find an assignment that satisfies the maximum number of clauses; this is from the NP-hard class. In terms of applications: if we had an efficient SAT solver, or NP-complete problem solver, it would literally, positively influence thousands of problems and applications in industry and in science. I'm not going to read this, but it of course gives a strong motivation to work on this kind of problem. Now, our approach to SAT solving involves embedding the problem in a continuous space, and we use ODEs to do that. So instead of working with zeros and ones, we work with minus ones and plus ones, and we allow the corresponding variables to change continuously between the two bounds. We formulate the problem with the help of a clause matrix: if a clause does not contain a variable or its negation, the corresponding matrix element is zero; if it contains the variable in positive form, it's one; if it contains the variable in negated form, it's negative one. And then we use this to formulate these products, called clause violation functions, one for every clause, which vary continuously between zero and one, and they are zero if and only if the clause itself is true. Then, in order to define dynamics in this N-dimensional hypercube, where the search happens and where, if they exist, solutions are sitting in some of the corners of this hypercube, we define this energy potential, or landscape function, shown here, in a way that it is zero if and only if all the clause functions K_m are zero, that is, all the clauses are satisfied, keeping these auxiliary variables a_m always positive. And therefore, what you have here is a dynamics that is essentially a gradient descent on this potential energy landscape. If you were to keep all the a_m constant, it would get stuck in some local minimum.
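As a concrete illustration of the construction just described (clause matrix, clause violation functions K_m, auxiliary weights a_m, gradient descent on V = Σ a_m K_m²), here is a minimal Euler-integration sketch on a tiny, easy 3-SAT instance. The instance, step size, weight cap, and stopping rule are illustrative assumptions; the published solver uses the exact exponential dynamics for a_m and more careful integration.

```python
import numpy as np

# Clause matrix C: C[m, i] = +1 if clause m contains x_i, -1 if it
# contains its negation, 0 if x_i does not appear (hypothetical instance).
C = np.array([[ 1,  1, -1,  0],    # (x1 or x2 or not-x3)
              [-1,  1,  0,  1],    # (not-x1 or x2 or x4)
              [ 1, -1,  0, -1],    # (x1 or not-x2 or not-x4)
              [ 0, -1,  1,  1]],   # (not-x2 or x3 or x4)
             dtype=float)
M, N = C.shape
k = (C != 0).sum(axis=1)           # literals per clause

def K(s):
    # clause violation functions: in [0, 1], zero iff the clause is true
    terms = np.where(C != 0, 1.0 - C * s, 1.0)
    return 2.0 ** -k * terms.prod(axis=1)

rng = np.random.default_rng(1)
s = rng.uniform(-0.5, 0.5, N)      # continuous "spins" in (-1, 1)
a = np.ones(M)                     # auxiliary clause weights
dt, solved = 0.05, False
for _ in range(4000):
    Km = K(s)
    terms = np.where(C != 0, 1.0 - C * s, 1.0)
    Kmi = np.where(C != 0, Km[:, None] / terms, 0.0)  # factor i removed
    ds = 2.0 * (a[:, None] * C * Kmi * Km[:, None]).sum(axis=0)  # -dV/ds
    a = np.minimum(a * np.exp(dt * Km), 1e6)  # da/dt = a*K, capped here
    s = np.clip(s + dt * ds, -0.999, 0.999)   # keep clause factors nonzero
    if ((C * np.sign(s)).max(axis=1) > 0).all():
        solved = True
        break
```

A clause is satisfied when at least one of its literals matches the sign of the corresponding variable, which is the stopping check in the loop.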
However, what we do here is couple it with the dynamics of the auxiliary variables: we couple it with the clause violation functions, as shown here. And if we didn't have this a_m here, just the K_m, you would essentially have positive feedback, you would have increasing variables, but in that case you would still get stuck; it behaves better than the constant version, but it would still get stuck. Only when you put here this a_m, which makes the dynamics in this variable exponential-like, only then does it keep searching until it finds a solution, and there is a reason for that which I'm not going to talk about here, but it essentially boils down to performing a gradient descent on a globally time-varying landscape. And this is what works. Now I'm going to talk about the good, the bad, and maybe the ugly. What's good is that it's a hyperbolic dynamical system, which means that if you take any domain in the search space that doesn't have a solution in it, then the number of trajectories in it decays exponentially quickly, and the decay rate is an invariant characteristic of the dynamics itself; in dynamical systems it's called the escape rate. The inverse of that is the time scale on which you find solutions with this dynamical system. And you can see here some sample trajectories that are chaotic, because it's nonlinear, but it's transient chaos, of course, because eventually it converges to the solution. Now, in terms of performance: here is what we show for a bunch of constraint densities, defined by M over N, the ratio between clauses and variables, for random 3-SAT problems, as a function of N. We monitor the wall-clock time, and it behaves quite well, it behaves polynomially, until you actually reach the SAT/UNSAT transition, where the hardest problems are found.
But what's more interesting is if you monitor the continuous time t, the performance in terms of the analog continuous time t, because that seems to be polynomial. And the way we show that is, we consider random 3-SAT for a fixed constraint density, to the right of the threshold, where it's really hard, and we monitor the fraction of problems that we have not been able to solve. We select thousands of problems at that constraint ratio and we solve them with our algorithm, and we monitor the fraction of problems that have not yet been solved by continuous time t. And this, as you see, decays exponentially, with different decay rates for different system sizes, and this plot shows that the decay rate behaves polynomially, actually as a power law. So if you combine these two, you find that the time needed to solve all problems, except maybe a vanishing fraction of them, scales polynomially with the problem size. So you have polynomial continuous-time complexity. And this is also true for other types of very hard constraint satisfaction problems, such as exact cover, because you can always transform them into 3-SAT as we discussed before, Ramsey numbers, coloring, and so on; on these problems, even algorithms like survey propagation will fail. But this doesn't mean that P equals NP, because of what you have: first of all, if you were to implement these equations in a device whose behavior is described by these ODEs, then of course t, the continuous-time variable, becomes physical wall-clock time, and that would be polynomially scaling; but you have the other variables, the auxiliary variables, which can grow in an exponential manner. So if they represent currents or voltages in your realization, there would be an exponential cost. But this is some kind of trade-off between time and energy: I don't know how to generate time,
but I know how to generate energy, so we could use that. But there are other issues as well, especially if you're trying to do this on a digital machine; other problems appear in physical devices too, as we discuss later. So if you implement this on GPUs, you can get an order of magnitude or two of speedup, and you can also modify this to solve MaxSAT problems quite efficiently; we are competitive with the best heuristic solvers on the problems of the 2016 MaxSAT competition. So this definitely seems like a good approach, but there are of course interesting limitations. I would say interesting, because they kind of make you think about what it means, and how you can exploit these observations, in understanding better analog continuous-time complexity. If you monitor the number of discrete steps taken by the Runge-Kutta integrator when you solve this on a digital machine, you're using some kind of integrator, and you use the same approach, but now you measure the number of problems you haven't solved within a given number of discrete steps taken by the integrator, you find out that you have exponential discrete-time complexity, and of course this is a problem. And if you look closely at what happens, even though the analog mathematical trajectory changes very little, like in the third or fourth decimal position, the integrator fluctuates like crazy; it really is like the integration freezes out. And this is because of the phenomenon of stiffness, which I'll talk a little bit more about a little bit later. It might look like an integration issue on digital machines that you could improve, and you could definitely improve, but actually the issue is bigger than that.
It's deeper than that, because on a digital machine there is no time-energy conversion: the auxiliary variables are efficiently represented on a digital machine; there is no exponentially fluctuating current or voltage in your computer when you do this. So if P is not equal to NP, then the exponential time complexity, or exponential cost complexity, has to hit you somewhere. And one would be tempted to think maybe this wouldn't be an issue in an analog device, and to some extent that is true; analog devices can be orders of magnitude faster, but they also suffer from their own problems, because precision affects analog solvers as well. Indeed, if you look at other systems, like the measurement-feedback Ising machine or oscillator networks, they all hinge on some kind of ability to control your variables to arbitrarily high precision. In oscillator networks you want to read out phases or frequencies; in the case of CIMs, you require identical pulses, which are hard to keep identical, and they fluctuate and shift away from one another. And if you could control that, of course, you could control the performance. So one can actually ask whether or not this is a universal bottleneck, and it seems so, as I will argue next. We can recall a fundamental result by Schönhage from 1978, who showed, in a purely computer-science proof, that if you are able to compute the addition, multiplication, and division of real variables with infinite precision, then you could solve NP-complete problems in polynomial time. He doesn't actually propose a solver; he just shows mathematically that this would be the case. Now, of course, in the real world you have loss of precision. So the next question is, how does that affect the computation of such problems? This is what we're after. Loss of precision means information loss, or entropy production.
So what you're really looking at is the relationship between the hardness and the cost of computing a problem. And according to Schönhage, there's this left branch, which in principle could be polynomial time, but the question is whether or not this is achievable. If that is not achievable, then we are on the right-hand side: there's always going to be some information loss, some entropy generation, that could keep you away from polynomial time. So this is what we'd like to understand, and the source of this information loss, I will argue, is not just noise in any physical system, but is also of algorithmic nature. Now, Schönhage's result is purely theoretical; no actual solver is proposed. So we can ask, just theoretically, out of curiosity, whether there would in principle be such solvers, because he is not proposing a solver with such properties. In principle, if you look mathematically, precisely, at what the solver does, would it have the right properties? And I argue yes. I don't have a mathematical proof, but I have some arguments that that would be the case. And this is the case for our 3-SAT solver: if you could calculate its trajectory in a lossless way, then it would solve NP-complete problems in polynomial continuous time. Now, as a matter of fact, this is a bit more difficult question, because time in ODEs can be rescaled however you want. So what one actually has to do is measure the length of the trajectory, which is an invariant of the dynamical system, a property of the dynamical system, not of its parametrization. And we did that. So Suba Corral, my student, did that first, improving on the stiffness of the integration problem, using implicit solvers and some smart tricks such that you actually stay closer to the actual trajectory, and, using the same approach, you monitor what fraction of problems you can solve
within a given length of the trajectory. You find that it scales polynomially with the problem size, so we have polynomial-length complexity. That means that our solver is both poly-length and, as it is defined, also a poly-time analog solver. But if you look at it as a discrete algorithm, if you measure the discrete steps on a digital machine, it is an exponential solver. And the reason is this stiffness: every integrator has to truncate, digitizing means truncating the equations, and what it has to do is keep the integration within the so-called stability region for that scheme; you have to keep the product of the eigenvalues of the Jacobian and the step size delta-t within this region. If you use explicit methods, you want to stay within this region. But what happens is that some of the eigenvalues grow fast for stiff problems, and then you're forced to reduce the delta-t so the product stays in this bounded domain, which means that now you're forced to take smaller and smaller time steps, so you're freezing out the integration, and what I showed you is that that's the case. Now, you can move to implicit solvers, which are A-stable; in that case the stability domain is actually on the outside. But what happens then is that some of the eigenvalues of the Jacobian, also for stiff systems, start to move towards zero. As they're moving towards zero, they're going to enter this instability region, so your solver is going to try to keep them out, so it's going to increase the delta-t. But if you increase the delta-t, you increase the truncation errors, so you get randomized in the large search space; so it's really not going to work out. Now, one can sort of introduce a theory, or a language, to discuss analog computational complexity using the language of dynamical systems theory. I don't have time to go into this, but basically, for hard problems, there is a geometric object, a chaotic saddle,
in the middle of the search space somewhere, and that dictates how the dynamics happens, and the invariant properties of the dynamics of that saddle are what dictate the performance and many other things. So a new, important measure that we find, which is also helpful in describing this analog complexity, is the so-called Kolmogorov, or metric, entropy. Basically, what this does, in an intuitive way, is describe the rate at which the uncertainty contained in the insignificant digits of a trajectory flows towards the significant ones, as you lose information because errors grow, or develop into larger errors, at an exponential rate, because you have positive Lyapunov exponents. But this is an invariant property: it's a property of the dynamics itself, not of how you compute it, and it's really an interesting way to characterize a dynamical system. As I said, in such a high-dimensional system you have both positive and negative Lyapunov exponents, as many in total as the dimension of the space; u is the number of unstable manifold directions and s the number of stable manifold directions. And there's an interesting and, I think, important equality, called the Pesin equality, that connects the information-theoretic aspect, the rate of information loss, with the geometric rate at which trajectories separate, minus kappa, which is the escape rate that I already talked about. Now, one can actually prove simple theorems, like back-of-the-envelope calculations. The idea here is that the largest Lyapunov exponent gives the rate at which closely started trajectories separate from one another. So now you can say that that is fine, as long as my trajectory finds the solution before the trajectories separate too much.
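The explicit-versus-implicit stability behavior described above can be seen on the simplest stiff test equation, dy/dt = λy with λ large and negative: forward Euler only decays when |1 + λΔt| ≤ 1, which forces tiny steps, while backward (implicit, A-stable) Euler decays for any positive Δt. This is a textbook illustration of the stiffness point, not the solver from the talk.

```python
def euler_explicit(lam, dt, steps, y0=1.0):
    # forward Euler on dy/dt = lam * y:  y_{n+1} = (1 + lam*dt) * y_n
    y = y0
    for _ in range(steps):
        y = (1.0 + lam * dt) * y
    return y

def euler_implicit(lam, dt, steps, y0=1.0):
    # backward Euler:  y_{n+1} = y_n / (1 - lam*dt)  (A-stable)
    y = y0
    for _ in range(steps):
        y = y / (1.0 - lam * dt)
    return y

lam = -1000.0                                        # stiff, fast-decaying mode
blow_up = euler_explicit(lam, dt=0.01, steps=100)    # |1 + lam*dt| = 9 > 1
frozen = euler_explicit(lam, dt=0.001, steps=100)    # step shrunk 10x: stable
stable = euler_implicit(lam, dt=0.01, steps=100)     # same large step, decays
```

The forced shrinking of the step size in the `frozen` case is exactly the "freezing out" of the integration that the talk attributes to stiff eigenvalues of the Jacobian.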
In that case, I can have the hope that if I start from some region of the phase space, several closely started trajectories kind of go into the same solution often, and that's this upper bound, this limit, and it is really showing that it has to be an exponentially small number. It depends on the N-dependence of the exponent right here, which combines the information-loss rate and the solution-time performance. So if this exponent has a large N-dependence, or even a linear N-dependence, then you really have to start trajectories exponentially close to one another in order to end up in the same solution. So this is sort of the direction that we are going in, and this formulation is applicable to all deterministic dynamical systems. And I think we can expand this further, because there is a way of getting an expression for the escape rate in terms of N, the number of variables, from cycle expansions, which I don't have time to talk about. It's kind of a program that one can try to pursue, and this is it. So the conclusions, I think, are self-explanatory. I think there is a lot of future in analog continuous-time computing. These systems can be more efficient by orders of magnitude than digital ones in solving NP-hard problems, because, first of all, many of these systems avoid the von Neumann bottleneck, there is parallelism involved, and you can also have a larger spectrum of continuous-time dynamical algorithms than discrete ones. But we also have to be mindful of what the possibilities and the limits are, and one very important open question is: what are these limits? Is there some kind of no-go theorem that tells you that you can never perform better than this limit or that limit? And I think that's the exciting part, to derive these limits.

Published Date : Sep 27 2020

Networks of Optical Parametric Oscillators


 

>>Good morning, good afternoon, good evening, everyone. I should thank NTT Research and the organizers for putting together this program, and also for the opportunity to speak here. My name is Alireza Marandi and I'm from Caltech, and today I'm going to tell you about the work that we have been doing on networks of optical parametric oscillators, how we have been using them for Ising machines, and how we're pushing them toward quantum photonics. I should acknowledge my team at Caltech, which is now eight graduate students and five researchers and postdocs, as well as collaborators from all over the world, including NTT Research, and also the funding from different places, including NTT. So this talk is primarily about networks of resonators, and these networks are everywhere, from nature, for instance the brain, which is a network of oscillators, all the way to optics and photonics; some of the biggest examples are metamaterials, which are arrays of small resonators, and more recently the field of topological photonics, which is trying to implement a lot of the topological behaviors of models from condensed-matter physics in photonics. And if you want to extend it even further, some of the implementations of quantum computing are technically networks of quantum oscillators. So we started thinking about these things in the context of Ising machines, which are based on the Ising problem, which is based on the Ising model, which is this simple summation over the spins; spins can be either up or down, and the couplings are given by the J_ij. The Ising problem is: if you know the J_ij, what is the spin configuration that gives you the ground state? And this problem is shown to be an NP-hard problem, so it's computationally important because it's a representative of NP problems, and NP problems are important because, first, they're hard on standard computers if you use a brute-force algorithm, and they're everywhere on the application side.
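For readers who want the Ising problem statement as code: the energy of a spin configuration and a brute-force ground-state search look like the following. The sign convention is assumed, and the frustrated antiferromagnetic triangle is a standard toy example, not an instance from the talk.

```python
from itertools import product

def ising_energy(J, s):
    # H = -sum_{i<j} J_ij s_i s_j  (one common sign convention)
    n = len(s)
    return -sum(J[i][j] * s[i] * s[j]
                for i in range(n) for j in range(i + 1, n))

def ground_state(J):
    # brute force: 2^n configurations, feasible only for small n --
    # this exhaustive blow-up is what Ising machines try to avoid
    n = len(J)
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(J, s))

# Antiferromagnetic triangle: all three pairs want to anti-align,
# but only two pairs can, so the instance is frustrated.
J = [[0, -1, -1],
     [-1, 0, -1],
     [-1, -1, 0]]
s_best = ground_state(J)
```

Even in this 3-spin case the ground state cannot satisfy all couplings at once, which is the frustration that makes larger spin-glass instances hard.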
That is why there is demand for a machine that can target these problems and hopefully provide some meaningful computational benefit compared to standard digital computers. I have been building these Ising machines from a building block called the degenerate optical parametric oscillator (OPO), which is a resonator with nonlinearity in it. We pump these resonators and generate a signal at half the frequency of the pump: one photon of the pump splits into two identical signal photons, and the signal shows some very interesting phase- and frequency-locking behaviors. If you look at the phase-locking behavior, you realize that the oscillation of such an OPO can take one of two possible phase states, which are offset by pi, and that is one of their important characteristics. To emphasize this point, I have a mechanical analogy: two simple pendulums. They are parametric oscillators because in this video I modulate a parameter of them, the length of the string, and that modulation acts as the pump, producing an oscillation, the signal, at half the frequency of the pump. I show two of them so you can see that they acquire these phase states: they are still phase- and frequency-locked to the pump, but each can end up in either the zero or the pi phase state. The idea is to use this binary phase to represent a binary Ising spin, so each OPO represents a spin, which can be either zero or pi, up or down. To implement a network of these resonators, we use a time-multiplexing scheme: the idea is that we put pulses in the cavity, separated by the repetition period T_R.
You can think of these pulses in one resonator as temporally separated synthetic resonators. To couple these resonators to each other, you introduce delay lines, each of which is a multiple of T_R. The shortest delay couples resonator 1 to 2, 2 to 3, and so on; the second delay, which is two repetition periods long, couples 1 to 3, 2 to 4, and so on. If you have N-1 delay lines, you can realize any potential coupling among these synthetic resonators, and if you introduce modulators into those delay lines so that you can control the strength and the phase of the couplings at the right times, then you have a programmable, all-to-all connected network in this time-multiplexed scheme. The whole physical size of the system scales linearly with the number of pulses. So the idea of an OPO-based Ising machine is this: you have these OPOs, each of which can be either zero or pi, and you can arbitrarily connect them to each other. You program the machine for a given Ising problem by just setting the couplings with the controllers in each of those delay lines. Now you have a network that represents an Ising problem, and the Ising problem maps to finding the phase state that satisfies the maximum number of coupling constraints. The way this happens is that the Ising Hamiltonian maps onto the linear loss of the network, and when I start adding gain by putting pump into the network, the OPOs are expected to oscillate in the lowest-loss state. We have been doing this for the past six or seven years, and I will quickly show you the progression, especially what happened in the first implementation, which used a free-space optical system, then the guided-wave implementation in 2016, and then the measurement-feedback idea, which led to increasing the size and doing actual computation with these machines.
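The delay-line bookkeeping described above can be sketched in a few lines: a delay of k round trips couples time-multiplexed pulse i to pulse i+k, and N-1 distinct delays reach every pair. This is an illustrative model of the indexing only, not of the optics.

```python
def delay_line_couplings(n_pulses, delays):
    """Which pulse pairs (i, j) a set of delay lines couples, for a cavity
    holding n_pulses time-multiplexed pulses.  A delay of k round-trip
    periods T_R couples pulse i to pulse i + k."""
    pairs = set()
    for k in delays:
        for i in range(n_pulses - k):
            pairs.add((i, i + k))
    return sorted(pairs)

# One delay line of 1*T_R couples nearest neighbours only.
print(delay_line_couplings(4, [1]))          # [(0, 1), (1, 2), (2, 3)]

# All N-1 delays reach every pair: a fully programmable all-to-all network.
print(delay_line_couplings(4, [1, 2, 3]))
```

With modulators in each delay line setting the strength and phase per pair, this index map is what lets one physical cavity emulate an arbitrarily connected network.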
So I just want to make this distinction: the first implementation was all-optical; we also had an N equals 16 implementation, and then we transitioned to this measurement-feedback idea, which I will describe briefly. There is still a lot of ongoing work, especially on the NTT side, to make larger machines using measurement feedback, but I am going to focus mostly on the all-optical networks, how we are using them to go beyond simulating Ising Hamiltonians on both the linear and nonlinear sides, and also how we are working on miniaturization of these OPO networks. So the first experiment, the four-OPO machine, was a free-space implementation, and this is a picture of the actual machine. We implemented a small N equals 4 MAX-CUT problem on the machine, one problem for one experiment; we ran the machine 1000 times, looked at the states, and always saw it oscillate in one of the ground states of the Ising Hamiltonian. The measurement-feedback idea was then to replace those couplings and the controllers with a simulator: we basically simulate all those coherent interactions on an FPGA, prepare a feedback pulse according to all those measurements, and inject it back into the cavity. The nonlinearity still remains, so it is still a nonlinear dynamical system, but the linear side is all simulated. So there are lots of questions about whether this system preserves the important information, and whether it behaves better computationally; these are still ongoing studies. Nevertheless, the reason this implementation is very interesting is that you do not need the N-1 delay lines; you can use just one, implement a large machine, run several thousands of problems on the machine, and then compare the performance from a computational perspective.
So I am going to split this idea of the OPO-based Ising machine into two parts. One is the linear part: if you take the nonlinearity out of the resonator and just think about the connections, you can think of the network as a simple matrix-multiplication scheme, and that is basically what gives you the Ising Hamiltonian mapping, where the optical loss of this network corresponds to the Ising Hamiltonian. To show you the example of the N equals 4 experiment: for all those phase states in the histogram that we saw, you can actually calculate the loss of each of those states, because all the interferences in the beam splitters and the delay lines give each state a different loss, and you will then see that the ground states correspond to the lowest loss of the actual optical network. If you add the nonlinearity, the simple way to think about what it does is that it provides the gain; you then start bringing up the gain so that it hits the loss, and you go through gain saturation at the threshold, which gives you the phase bifurcation: each OPO goes to either the zero or the pi phase state, and the expectation is that the network oscillates in the lowest possible loss state. There are some challenges associated with this intensity-driven phase transition, which I will briefly discuss; I am also going to tell you about other types of nonlinear dynamics that we are looking at on the nonlinear side of these networks. If you just think about the linear network, we are actually interested in looking at some topological behaviors in these networks, and the difference between looking at topological behaviors and at the Ising machine is that, first of all, we are now looking at types of Hamiltonians that are a little different from the Ising Hamiltonian.
One of the biggest differences is that most of these topological Hamiltonians require breaking time-reversal symmetry, meaning that if you go from one spin on one side to the other side you pick up one phase, and if you go back you pick up a different phase. The other thing is that we are no longer interested only in finding the ground state; we are now interested in looking at all sorts of states, and at the dynamics and behaviors of all these states in the network. So we started with the simplest implementation, of course, which is a one-dimensional chain of these resonators, corresponding to the so-called SSH model in the topological literature. We get the same energy-to-loss mapping, and now we can actually look at the band structure. This is an actual measurement that we get with this SSH model, and you can see how well it follows the prediction of the theory. One of the interesting things about the time-multiplexed implementation is that you have the flexibility of changing the network while the machine is running; that is something unique to this implementation, so we can actually look at the dynamics. One example we have looked at is the transition from the topological to the trivial behavior of the network: you can look at the edge states, and you can see both the trivial states and the topological edge states actually showing up in this network. We have also just recently implemented a two-dimensional network with the Harper-Hofstadter model; I do not have those results here, but another important characteristic of time multiplexing is that you can go to higher and higher dimensions while keeping that flexibility and those dynamics.
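As a hedged aside on the SSH model mentioned above: its band structure can be computed from the standard two-band Bloch Hamiltonian, E(k) = ±|v + w e^{ik}|, where v and w are the alternating hoppings (standing in here for the alternating couplings in the resonator chain; the values below are illustrative, not the measured ones).

```python
import cmath

def ssh_bands(v, w, k):
    """Energies of the two SSH bands at Bloch momentum k, for intra-cell
    hopping v and inter-cell hopping w: E(k) = +/- |v + w * e^{ik}|."""
    e = abs(v + w * cmath.exp(1j * k))
    return -e, e

# The gap at k = pi is 2*|v - w|; it closes at v = w, the topological
# transition between the trivial (v > w) and topological (v < w) phases.
lower, upper = ssh_bands(1.0, 0.5, cmath.pi)
print(upper - lower)
```

Sweeping v and w over time is the kind of "change the network while it runs" experiment that the time-multiplexed implementation makes easy, since both are just modulator settings.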
We can also think about adding nonlinearity, in both the classical and quantum regimes, which is going to give us a lot of exotic classical and quantum nonlinear behaviors in these networks. So I have mostly told you about the linear side; let me switch gears and talk about the nonlinear side of the network. The biggest thing I have talked about so far in the Ising machine is the phase transition at threshold. Below threshold we have squeezed states in these OPOs; if you increase the pump, we go through this intensity-driven phase transition, and we get the phase states above threshold. This is basically the mechanism of the computation in these OPOs: the phase transition from below to above threshold. One of the characteristics of this phase transition is that below threshold you expect to see quantum states, and above threshold you expect to see more classical, or coherent, states, basically corresponding to the intensity of the driving pump. So it is really hard to imagine going above threshold, or having this phase transition happen entirely in the quantum regime. There are also some challenges associated with the intensity homogeneity of the network: for example, if one OPO starts oscillating and its intensity goes really high, it is going to ruin the collective decision-making of the network, because of the intensity-driven nature of the phase transition. So the question is: can we look at other phase transitions, can we utilize them for computing, and can we bring them to the quantum regime? I am going to specifically talk about a phase transition in the spectral domain, which is the transition from the so-called degenerate regime, which is what I have mostly talked about, to the non-degenerate regime, which happens by just tuning the phase of the cavity.
What is interesting is that this phase transition corresponds to a distinct phase-noise behavior. In the degenerate regime, which we call the ordered state, the phase is locked to the phase of the pump, as I discussed. In the non-degenerate regime, however, the phase is mostly dominated by the quantum diffusion of the phase, which is limited by the so-called Schawlow-Townes limit, and you can see that transition from the degenerate to the non-degenerate regime. The transition also has distinct symmetry differences: it corresponds to a symmetry breaking. In the non-degenerate case the signal can acquire any phase on the circle, so it has a U(1) symmetry, and if you go to the degenerate case, that symmetry is broken and you have only the zero and pi phase states. So now the question is: can we utilize this phase transition, which is a phase-driven phase transition, for a similar computational scheme? That is one of the questions we are also thinking about. And this phase transition is not just important for computing; it is also interesting for its sensing potential, and you can easily bring it below threshold and operate in the quantum regime, either Gaussian or non-Gaussian. If you make a network of OPOs, you can see all sorts of more complicated and more interesting phase transitions in the spectral domain. One of them is a first-order phase transition, which you get by just coupling two OPOs; it is a very abrupt phase transition compared to the single-OPO phase transition. And if you do the couplings right, you can actually get a lot of non-Hermitian dynamics and exceptional points, which are very interesting to explore in both the classical and quantum regimes. I should also mention that you can think about the couplings as being nonlinear couplings as well.
That is another behavior that you can see, especially in the non-degenerate regime. So with that, I have told you about these OPO networks, how we can think about the linear scheme and the linear behaviors, and how we can think about the rich nonlinear dynamics and nonlinear behaviors in both the classical and quantum regimes. I want to switch gears and tell you a little bit about the miniaturization of these OPO networks. The motivation, of course, comes from electronics: look at what we had 60 or 70 years ago with vacuum tubes, and how we transitioned from relatively small-scale computers on the order of thousands of nonlinear elements to the billions of nonlinear elements we have now. Where optics is today is probably very similar to where electronics was 70 years ago: a tabletop implementation. The question is how we can utilize nanophotonics, and I will briefly show you the two directions we are working on. One is based on lithium niobate, and the other on even smaller resonators. The work on nanophotonic lithium niobate started in collaboration with Marko Loncar at Harvard and Marty Fejer at Stanford, and we could show that you can do periodic poling in thin-film lithium niobate and get all sorts of very highly efficient nonlinear processes happening in nanophotonic periodically poled lithium niobate. Now we are working on building OPOs based on that kind of photonic lithium niobate, and these are some examples of the devices we have been building in the past few months, which I am not going to tell you more about, but the OPOs and the OPO networks are in the works. And that is not the only way of making large networks.
But I also want to point out that the reason these nanophotonic platforms are exciting is not just that you can make large networks in a compact, small footprint; they also provide some opportunities in terms of the operating regime. One of them is about making cat states in OPOs: can we have a quantum superposition of the zero and pi phase states that I talked about? The nanophotonics would provide some opportunities to actually get closer to that regime, because of the spatio-temporal confinement you can get in these waveguides. We are doing some theory on that, and we are confident that the ratio of nonlinearity to loss that you can get with these platforms is actually much higher than with other existing platforms. To go even smaller, we have been asking the question of what the smallest possible OPO is that you can make. You can think about truly wavelength-scale resonators, add the chi(2) nonlinearity, and see how and when you can get the OPO to operate. Recently, in collaboration with USC and CREOL, we have demonstrated that you can use nanolasers and get some spin Hamiltonian implementations on those networks. So if you can build OPOs, we know there is a path for implementing OPO networks on such a nanoscale. We have looked at these calculations and tried to estimate the threshold of such an OPO, say for a wavelength-scale resonator, and it turns out it can actually be even lower than that of the type of bulk PPLN OPOs we have been building for the past 50 years or so. So we are working on the experiments, and we are hoping that we can actually make even larger and larger scale OPO networks.
So let me summarize the talk. I told you about the OPO networks and our ongoing work on Ising machines and measurement feedback; about the ongoing work on the all-optical implementations, both on the linear side and on the nonlinear behaviors; and a little bit about the efforts on miniaturization, going down to the nanoscale. With that, I would like to stop here, and thank you for your attention.

Published Date : Sep 21 2020



Towards Understanding the Fundamental Limits of Analog, Continuous Time Computing


 

>> Hello everyone. My name is Zoltan Toroczkai. I am from the University of Notre Dame, Department of Physics, and I'd like to thank the organizers for their kind invitation to participate in this very interesting and promising workshop. I would also like to say that I look forward to collaborations with the PHI Lab and its collaborators on the topics of this work. So today I'll briefly talk about our attempt to understand the fundamental limits of analog, continuous-time computing, at least from the point of view of Boolean satisfiability (SAT) problem solving using ordinary differential equations. But I think the issues we raise on this occasion actually apply to other analog approaches as well, and to other problems as well. I think everyone here knows what Boolean satisfiability problems are: you have N Boolean variables and M clauses, each a disjunction of K literals, where a literal is a variable or its negation, and the goal is to find an assignment to the variables such that all the clauses are true. This is a decision-type problem from the NP class, which means you can check the satisfiability of any assignment in polynomial time, and 3-SAT is NP-complete for K of 3 or larger, which means an efficient 3-SAT solver implies an efficient solver for all the problems in the NP class, because all the problems in NP can be reduced in polynomial time to 3-SAT. As a matter of fact, you can reduce the NP-complete problems into each other: you can go from 3-SAT to Set Packing, or to Maximum Independent Set, which is set packing in graph-theoretic notions or terms, or to the decision version of the Ising SAT problem. This is useful when you are comparing different approaches or working on different kinds of problems.
When not all the clauses can be satisfied, you are looking at the optimization version of SAT, called Max-SAT, where the goal is to find the assignment that satisfies the maximum number of clauses; this is from the NP-hard class. In terms of applications, if we had an efficient SAT solver, or NP-complete problem solver, it would literally, positively influence thousands of problems in applications in industry and science. I am not going to read the list, but this of course gives us some motivation to work on these kinds of problems. Now, our approach to SAT solving involves embedding the problem in a continuous space, and we use ODEs to do that. So instead of working with zeros and ones, we work with minus one and plus one, and we allow the corresponding variables to change continuously between the two bounds. We formulate the problem with the help of a clause matrix: if a clause does not contain a variable or its negation, the corresponding matrix element is zero; if it contains the variable in positive form, it is one; if it contains the variable in negated form, it is minus one. And now we use this to formulate products, called clause violation functions, one for every clause, which vary continuously between zero and one and are zero if and only if the clause itself is true. Then we define the search dynamics in the N-dimensional hypercube, where the search happens; if solutions exist, they sit at some of the corners of this hypercube. So we define this energy, potential, or landscape function, as shown here, in such a way that it is zero if and only if all the clause violation functions K_m are zero, that is, all the clauses are satisfied, keeping the auxiliary variables a_m always positive. And therefore what we do here is a dynamics that is essentially a gradient descent on this potential energy landscape. If you were to keep all the a_m constant, it would get stuck in some local minimum.
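The clause matrix and clause violation functions described above can be sketched directly. This follows the standard product construction K_m(s) = 2^{-k_m} * prod_i (1 - c_mi * s_i) over the literals of clause m, which is one common form of such a function and is assumed here rather than quoted from the talk; the example formula is illustrative.

```python
def clause_violation(C, s):
    """K_m(s) = 2^{-k_m} * prod over literals (1 - c_mi * s_i), where c_mi is
    the clause-matrix entry (0, +1, or -1) and k_m the number of literals in
    clause m.  K_m is 0 iff clause m is satisfied, and lies in [0, 1] for
    s in the [-1, 1]^n hypercube."""
    K = []
    for row in C:
        k, prod = 0, 1.0
        for c, si in zip(row, s):
            if c != 0:
                k += 1
                prod *= (1.0 - c * si)
        K.append(prod / 2 ** k)
    return K

# (x1 or x2) and (not x1 or x3), encoded as a clause matrix over 0, +1, -1.
C = [[ 1, 1, 0],
     [-1, 0, 1]]
print(clause_violation(C, [1, -1, 1]))    # both clauses satisfied
print(clause_violation(C, [-1, -1, -1]))  # first clause violated
```

At a hypercube corner every K_m is exactly 0 or 1, so the potential built from these functions vanishes precisely at satisfying assignments.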
However, what we do here is couple it with the dynamics of the clause violation functions, as shown here. If you did not have these a_m, just the K_m, you would still have a positive feedback; you would still find solutions better than the constant version, but you would still get stuck. Only when we put in this a_m, which makes the dynamics in this variable exponential-like, does it keep searching until it finds a solution. And there is a reason for that, which I am not going to go into here, but it essentially boils down to performing a gradient descent on a globally time-varying landscape. And this is what works. Now I am going to talk about the good, the bad, and maybe the ugly. What is good is that it is a hyperbolic dynamical system, which means that if you take any domain in the search space that does not contain a solution, the number of trajectories in it decays exponentially quickly, and the decay rate is an invariant characteristic of the dynamics itself, called in dynamical systems the escape rate. The inverse of the escape rate is the timescale on which this dynamical system finds solutions. You can see some trajectories here; they are curved because the system is not linear, but transiently curved, and if there are solutions, of course, we can see that it does eventually lead to the solutions. Now, in terms of performance, here is what we show for a range of constraint densities, defined by M over N, the ratio of clauses to variables, for random 3-SAT problems.
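A toy version of the coupled dynamics just described, as a hedged sketch: gradient descent on V(s) = sum_m a_m * K_m(s)^2 over the hypercube, with each auxiliary variable growing as da_m/dt = a_m * K_m so the landscape keeps deforming until every clause is satisfied. Forward Euler and a numerical gradient are used for brevity; the published solver uses analytic gradients and more careful integration, so treat this only as an illustration of the mechanism.

```python
def clause_violation(C, s):
    # K_m(s) = 2^{-k_m} * prod over literals (1 - c_mi * s_i); 0 iff satisfied.
    K = []
    for row in C:
        k, prod = 0, 1.0
        for c, si in zip(row, s):
            if c != 0:
                k += 1
                prod *= (1.0 - c * si)
        K.append(prod / 2 ** k)
    return K

def solve_sat(C, s0, dt=0.05, steps=4000):
    n = len(C[0])
    s, a, eps = list(s0), [1.0] * len(C), 1e-6
    for _ in range(steps):
        K = clause_violation(C, s)
        if max(K) < 1e-9:                       # every clause satisfied
            break
        grad = []
        for i in range(n):                      # numerical d/ds_i of V
            sp = list(s)
            sp[i] += eps
            Kp = clause_violation(C, sp)
            grad.append(sum(am * (kp * kp - km * km) / eps
                            for am, kp, km in zip(a, Kp, K)))
        for i in range(n):                      # descend, stay in [-1, 1]^n
            s[i] = max(-1.0, min(1.0, s[i] - dt * grad[i]))
        a = [am * (1.0 + dt * km) for am, km in zip(a, K)]  # da/dt = a * K
    return [1 if x > 0 else -1 for x in s]      # round to the nearest corner

# (x1 or x2) and (not x1 or x3): the dynamics drifts into a satisfying corner.
C = [[1, 1, 0], [-1, 0, 1]]
assignment = solve_sat(C, [0.1, -0.2, 0.05])
print(assignment, clause_violation(C, assignment))
```

With constant a_m this descent can stall in a local minimum; the exponential-like growth of a_m on violated clauses is what keeps reshaping the landscape, as the talk explains.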
As a function of N, we monitor the wall-clock time, and it behaves quite well: polynomially, until you actually reach the SAT/UNSAT transition, where the hardest problems are found. But what is more interesting is the performance in terms of the analog continuous time t, because that seems to be polynomial. The way we show that: consider random 3-SAT for a fixed constraint density, and here what we show is right at the threshold, where it is really hard. We select thousands of problems at that clause-to-variable ratio, solve them with our algorithm, and monitor the fraction of problems that have not yet been solved by continuous time t. As you see, this decays exponentially, with different decay rates for different system sizes, and this plot shows that the decay rate behaves polynomially, or actually as a power law. So if you combine these two, you find that the time needed to solve all problems, except maybe a vanishing fraction of them, scales polynomially with problem size. So you have polynomial continuous-time complexity. This is also true for other types of very hard constraint-satisfaction problems, such as exact cover, because you can always transform them into 3-SAT as we discussed before, and Ramsey coloring; on these problems even algorithms like survey propagation will fail. But this does not mean that P equals NP, because, first of all, if you were to implement these equations in a device whose behavior is described by these ODEs, then t, the continuous-time variable, becomes a physical wall-clock time. That would scale polynomially, but you have other variables, the auxiliary variables, which fluctuate in an exponential manner.
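The combination argued above, exponential decay of the unsolved fraction with a decay rate that shrinks only polynomially in N, can be checked with one line of arithmetic. The constants c and alpha below are illustrative placeholders, not values fitted in the talk.

```python
import math

def time_to_solve(n, eps=1e-3, c=1.0, alpha=1.5):
    """If the fraction of unsolved instances decays as p(t) = exp(-r(n) * t)
    with a decay rate that is only polynomially small, r(n) = c * n**(-alpha),
    then solving all but a fraction eps takes t = ln(1/eps) / r(n), which is
    polynomial in n."""
    r = c * n ** (-alpha)
    return math.log(1.0 / eps) / r

# Growing n by 10x grows the continuous solve time by only 10**alpha.
print(time_to_solve(100) / time_to_solve(10))
```

This is the sense in which the analog time t is "polynomial" even though each individual run only finishes with high probability, not with certainty.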
So if they represent currents or voltages in your realization, it would be an exponential-cost algorithm. But this is some kind of trade between time and energy: I do not know how to generate time, but I know how to generate energy, so it could be useful. There are other issues as well, especially if you are trying to do this on a digital machine, but problems also appear in physical devices, as we discuss later. If you implement this on a GPU, you can get a speedup of two orders of magnitude, and you can also modify it to solve Max-SAT problems quite efficiently; we are competitive with the best heuristic solvers on the problems of the 2016 Max-SAT competition. So this is definitely a good approach, but there are of course interesting limitations; I would say interesting, because they make you think about what is needed and how you can explore these observations to better understand analog continuous-time complexity. If you monitor the number of discrete steps taken by the Runge-Kutta integrator when you solve this on a digital machine, using the same approach but now measuring the number of problems you have not solved within a given number of discrete integrator steps, you find that you have exponential discrete-time complexity. And of course this is a problem. And if you look closely at what happens: even though the analog mathematical trajectory, the red curve here, fluctuates very little in discrete time, to something like third- or fourth-digit precision, the integrator's step size fluctuates like crazy. So the integration effectively freezes out, and this is because of the phenomenon of stiffness that I will talk a little bit more about later.
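The freezing-out the speaker describes is the classic stiffness phenomenon; here is a minimal illustration with explicit Euler on a stiff linear test equation, where stability, not accuracy, dictates the step size. The test problem is the textbook one, not the solver's actual ODEs.

```python
def euler(f, y0, dt, steps):
    """Forward (explicit) Euler integration of dy/dt = f(y)."""
    y = y0
    for _ in range(steps):
        y += dt * f(y)
    return y

# Stiff linear test problem dy/dt = -1000*y; the exact solution decays to 0.
# Explicit Euler is stable only while |1 - 1000*dt| <= 1, i.e. dt <= 2/1000:
# the fastest eigenvalue dictates the step size, not the accuracy you need.
f = lambda y: -1000.0 * y
ok  = euler(f, 1.0, 0.001, 1000)   # inside the stability region: decays
bad = euler(f, 1.0, 0.003, 1000)   # outside it: the amplification factor is -2
print(abs(ok), abs(bad))
```

When some Jacobian eigenvalues are huge, the integrator is forced to tiny dt just to stay stable, which is exactly the "exponential number of discrete steps" observed on the digital machine.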
So it may look like an integration issue on your digital machine that you could improve, and you could definitely improve it, but actually the issue is bigger than that. It is deeper than that, because on a digital machine there is no time-energy conversion: the auxiliary variables are efficiently represented on a digital machine, so there is no exponentially fluctuating current or voltage in your computer when you do this. So if P is not equal to NP, then the exponential time complexity, or exponential cost complexity, has to hit you somewhere, and this is how. But one would be tempted to think that maybe this would not be an issue in an analog device, and to some extent that is true. Analog devices can be orders of magnitude faster, but they also suffer from their own problems, because P not equal to NP affects that class of solvers as well. Indeed, if you look at other systems, like the Coherent Ising Machine with measurement feedback, polariton condensate graphs, or oscillator networks, they all hinge on some ability to control real variables with arbitrarily high precision. In oscillator networks, you want to read out arbitrarily close frequencies; in the case of CIMs, we require identical analog amplitudes, which are hard to keep, as they fluctuate and shift away from one another, and if you control that, of course, then you can control the performance. So one can actually ask whether or not this is a universal bottleneck, and it seems so, as I will argue next. We can recall a fundamental result by A. Schönhage from 1978, a purely computer-science proof, which says that if you are able to compute the addition, multiplication, and division of real variables with infinite precision, then you could solve NP-complete problems in polynomial time. He does not actually propose a solver; he just shows mathematically that this would be the case.
Now, of course, in the real world you have loss of precision. So the next question is, "How does that affect the computation of hard problems?" This is what we are after. Loss of precision means information loss, or entropy production. So what we are really looking at is the relationship between the hardness and the cost of computing a problem. (clears throat) There is this left branch, which in principle could be polynomial time, but the question is whether that is achievable. It may not be, but something more achievable sits on the right-hand side: there is always going to be some information loss, some entropy generation, that could keep you away from polynomial time. So this is what we'd like to understand. And the source of this information loss is not just noise, as I will argue, in any physical system; it is also of algorithmic nature. Schönhage's result is purely theoretical; no actual solver is proposed. So we can ask, just theoretically, out of curiosity, "Could such solvers exist in principle?" If you look mathematically and precisely at what such a solver would have to do, would anything have the right properties? And I argue yes. I don't have a mathematical proof, but I have arguments that this would be the case, and that it holds for our own continuous-time solver: if you could compute its trajectory losslessly, then it would solve NP-complete problems in polynomial continuous time. Now, as a matter of fact, this is a more subtle question, because time in all these systems can be rescaled however you want. So what Bournez says is that you actually have to measure the length of the trajectory, which is an invariant of the dynamical system, a property of the dynamical system and not of its parametrization. And we did that.
So Shubha Kharel, my student, did that, first improving on the stiffness of the integration by using implicit solvers and some smart tricks, so that you stay closer to the actual trajectory, and then using the same approach: measuring what fraction of problems you can solve within a given length of trajectory. You find that the length scales polynomially with the problem size, so we have polynomial length complexity. That means our solver is both a poly-length solver and, as it is defined, a poly-time analog solver. But viewed as a discrete algorithm, measured by discrete steps on a digital machine, it is an exponential solver, and the reason is all this stiffness. Every integrator has to digitize and truncate the equations, and what it has to do is keep the integration within the so-called stability domain for that scheme. You have to keep the product of the eigenvalues of the Jacobian and the step size within this region; if you use explicit methods, you must stay inside this region. But what happens is that some of the eigenvalues grow fast for stiff problems, and then you are forced to reduce delta t so the product stays in the bounded domain, which means you are forced to take smaller and smaller time steps, so you are freezing out the integration, and I will show you that this is the case. Now, you can move to implicit solvers, which is the new trick; in that case your stability domain is actually on the outside. But what happens then is that some of the eigenvalues of the Jacobian, also for these instances, start to move toward zero, and as they move toward zero they enter the instability region. So your solver tries to keep them out: it increases delta t, but if you increase delta t, you increase the truncation errors, so you get randomized in the large search space.
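The stability-domain trade-off just described can be seen on the classic stiff test equation y' = -λy (a minimal illustration of my own): explicit Euler is stable only while |1 - λΔt| ≤ 1, i.e. Δt ≤ 2/λ, while backward (implicit) Euler is stable for any step size.

```python
# Stiff test equation y' = -lam * y with a large eigenvalue lam.
lam = 1000.0
h = 0.01        # deliberately above the explicit stability limit 2/lam = 0.002
y_exp = 1.0
y_imp = 1.0
for _ in range(50):
    y_exp = y_exp + h * (-lam * y_exp)   # explicit Euler: y_{n+1} = (1 - lam*h) y_n
    y_imp = y_imp / (1.0 + lam * h)      # implicit Euler: y_{n+1} = y_n / (1 + lam*h)
print(abs(y_exp), abs(y_imp))            # explicit blows up, implicit decays
```

The explicit iterate multiplies by (1 - λh) = -9 each step and explodes, so the only cure is shrinking h toward the stiff limit, which is exactly the freezing-out effect; the implicit iterate decays for any h, at the price of the truncation-error problem described above.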
So it's really not going to work out either way. Now, one can introduce a theory, a language, to discuss analog computational complexity using the language of dynamical systems theory. I don't have time to go into this, but basically, for hard problems you have a chaotic object, a chaotic saddle, somewhere in the middle of the search space, and that dictates how the dynamics happens; the invariant properties of that saddle are what determine performance and many other things. An important measure that we find helpful in describing this analog complexity is the so-called Kolmogorov, or metric, entropy. Intuitively, it describes the rate at which the uncertainty contained in the insignificant digits of a trajectory flows toward the significant ones, as small errors grow into larger errors at an exponential rate, because you have positive Lyapunov exponents. This is an invariant property: it is a property of the set itself, not of how you compute it. It is really the intrinsic rate of accuracy loss of a dynamical system. As I said, in such a high-dimensional dynamical system you have positive and negative Lyapunov exponents, as many in total as the dimension of the space; their numbers count the unstable manifold dimensions and the stable manifold dimensions. And there is an interesting and, I think, important equality, called the Pesin equality, that connects the information-theoretic rate of information loss, the metric entropy, with the geometric rate at which trajectories separate, the sum of the positive Lyapunov exponents, minus the escape rate that I already talked about. Now, one can actually prove a simple theorem with a back-of-the-envelope calculation.
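As a concrete toy for this metric-entropy picture (my example, not the talk's): the chaotic logistic map x → 4x(1-x) has largest Lyapunov exponent ln 2, so it loses roughly one bit of trajectory accuracy per iteration, and two nearby trajectories separate at about that exponential rate.

```python
import math

def step(x):
    return 4.0 * x * (1.0 - x)

# Estimate the Lyapunov exponent by averaging ln|f'(x)| = ln|4 - 8x|
# along a long trajectory; the known value is ln 2 ~ 0.693.
x = 0.3
for _ in range(100):              # discard the transient
    x = step(x)
total, n = 0.0, 100000
for _ in range(n):
    total += math.log(abs(4.0 - 8.0 * x))
    x = step(x)
lam = total / n
print(lam)

# Two trajectories started 1e-12 apart: indistinguishable at first,
# then separated by order one once the initial closeness is eaten up.
a, b = 0.3, 0.3 + 1e-12
seps = []
for _ in range(100):
    a, b = step(a), step(b)
    seps.append(abs(a - b))
print(seps[4], max(seps[50:]))
```

Since the separation grows like δ·e^(λn), to keep two trajectories together for n more steps you must start them e^(λn) closer: precisely the "intrinsic rate of accuracy loss" the metric entropy quantifies.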
The idea here is that you know the largest rate at which closely started trajectories separate from one another. So you can say: that is fine, as long as my trajectory finds the solution before the trajectories separate too quickly. In that case I can hope that if I start several closely started trajectories from some region of the phase space, they all go to the same solution over time, and that is what this upper bound expresses. And it really shows that this has to be an exponentially small number, but it depends on the N-dependence of the exponent here, which combines the information loss rate and the solution-time performance. If this exponent has a strong N-dependence, even a linear N-dependence, then you really have to start trajectories exponentially closer to one another in order to end up at the same solution. So this is the direction we are going in, and this formulation is applicable to all deterministic dynamical systems. (clears throat) And I think we can expand this further, because there is a way of getting an expression for the escape rate in terms of N, the number of variables, from cycle expansions, which I don't have time to talk about, but it is a program one can try to pursue. The conclusions, I think, are self-explanatory. I think there is a lot of future in analog continuous-time computing. These systems can be more efficient by orders of magnitude than digital ones in solving NP-hard problems, because many of them lack the von Neumann bottleneck, there is parallelism involved, and you also have a larger spectrum of continuous-time dynamical algorithms than discrete ones. But we also have to be mindful of what the possibilities and the limits are.
And one important open question is, "What are these limits? Is there some kind of no-go theorem that tells you that you can never perform better than this limit or that limit?" I think that's the exciting part: to derive these limits and to reach an understanding of what is possible in this area. Thank you.

Published Date : Sep 21 2020


Paula D'Amico, Webster Bank | Io Tahoe | Enterprise Data Automation


 

>> Narrator: From around the globe, it's theCube, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Everybody, we're back. This is Dave Vellante, and we're covering the whole notion of automated data in the enterprise. I'm really excited to have Paula D'Amico here, Senior Vice President of Enterprise Data Architecture at Webster Bank. Paula, good to see you. Thanks for coming on. >> Hi, nice to see you, too. >> Let's start with Webster Bank. You guys are kind of a regional bank, I think New York, New England, I believe it's headquartered out of Connecticut. But tell us a little bit about the bank. >> Webster Bank is a regional bank covering Boston, Connecticut, and New York, very focused on Westchester and Fairfield County. It's a really highly rated regional bank for this area. They hold quite a few awards for being supportive of the community, and they are really moving forward technology-wise; they really want to be a data-driven bank, and they want to move into a more robust group. >> We've got a lot to talk about. Data-driven is an interesting topic, and your role as Senior Vice President of Enterprise Data Architecture gives you a big responsibility as it relates to transitioning to this digital, data-driven bank. But tell us a little bit about your role in your organization. >> Currently we have a small group that is working toward moving into a more futuristic, more data-driven data warehouse. That's our first item. The other item is to drive new revenue by anticipating what customers do when they go to the bank or log in to their account, to be able to give them the best offer. And the only way to do that is to have timely, accurate, complete data on the customer and something of real value to offer them: a new product, or a way to help them continue to grow their savings or grow their investments.
>> Okay, and I really want to get into that. But before we do, and I know you're partway through your journey with a lot still to do, I want to ask you about Covid. How did you guys handle that? You had the government coming down with small-business loans and PPP, a huge volume of business, and data was at the heart of it. How did you manage through that? >> We were extremely successful, because we have a big, dedicated team that understands where their data is, and we were able to pivot much faster than a larger bank and offer the PPP loans to our customers at lightning speed. Part of that is that we adapted Salesforce very quickly; we've had Salesforce in house for over 15 years. That was pretty much the driving vehicle to get our PPP loans in, and then we developed the logic quickly. It was a 24/7 development role, getting the data moving and helping our customers fill out the forms. A lot of that was manual, but it was a large community effort. >> Think about that, too: the volume was probably much higher than the volume of loans to small businesses that you're used to granting, and the initial guidelines were very opaque. You really didn't know what the rules were, but you were expected to enforce them, and then finally you got more clarity. So you had to essentially code that logic into the system in real time. >> I wasn't directly involved, but part of my data movement team was, and we had to change the logic overnight. It was released on a Friday night, we pushed our first set of loans through, and then the logic coming from the government changed, and we had to redevelop our data movement pieces again, redesign them, and send them back through. So it was definitely kind of scary, but we were completely successful. We hit a very high peak.
Again, I don't know the exact number, but it was in the thousands of loans, from little loans to very large loans, and every customer who applied and filled things out through the right process got what they needed. >> Well, that is an amazing story, and really great support for the region, your Connecticut and the Boston area. So that's fantastic. I want to get into the rest of your story now. Let's start with some of the business drivers in banking. Obviously online: a lot of people have joked that many of the older people who shunned online banking and would love to go into the branch and see their friendly teller had no choice during this pandemic but to go online. So that's obviously a big trend. You mentioned the data-driven data warehouse, and I want to understand that, but at the top level, what are some of the key business drivers catalyzing your desire for change? >> The ability to give a customer what they need at the time they need it. What I mean by that is that we have customer interactions in multiple ways, and I want the customer to be able to walk into a bank or go online and see the same format, have the same feel, the same love, and be offered the next best thing for them, whether they're looking for a new mortgage, looking to refinance, or whatever it is. We have the data, and they should feel comfortable with us using it. And that's an untethered banker: whatever my banker is holding and whatever the person is holding on their phone is the same, and it's comfortable, so they don't feel that they've walked into the bank and have to fill out different paperwork compared to filling it out on their phone. >> You actually do want the experience to be better. And it is, in many cases.
Now, you weren't able to do this with your existing, I guess mainframe-based, enterprise data warehouses. Is that right? Maybe talk about that a little bit. >> We were definitely able to do it with what we have today and the technology we're using, but one of the issues is that it's not timely. You need a timely process to be able to get customers to understand what's happening, and you need a timely process so we can enhance our risk management and respond to fraud issues and things like that. >> Yeah, so you're trying to get more real-time. The traditional EDW is sort of a science project: there are a few experts who know how to get at it, so you line up, the demand is tremendous, and oftentimes by the time you get the answer, it's outdated. So you're trying to address that problem. Part of it is really the end-to-end cycle time that you're compressing, and then, if I understand it, there are residual benefits that are pretty substantial from a revenue standpoint: other offers that you can make to the right customer, whom you know through your data. Is that right? >> Exactly. It's to drive new customers to new opportunities, to enhance risk management, to optimize the banking process, and then obviously to create new business. And the only way we're going to be able to do that is if we have the ability to look at the data right when the customer walks in the door, or right when they open up their app. And by creating near-real-time data, the data warehouse team is giving the lines of business the ability to work on the next best offer for that customer as well. >> But Paula, we're inundated with data sources these days. Are there other data sources that you maybe had access to before, but the backlog of ingesting, cleaning, cataloging, and analyzing was so great that you couldn't tap them?
Do you see the potential to increase the data sources, and hence the quality of the data, or is that premature? >> Oh no, exactly right. Right now we ingest a lot of flat files from our mainframe-type front-end system, which we've had for quite a few years. But now that we're moving off-prem, into something like an S3 bucket, we can process that data and get it faster by using real-time tools to move it into a place where Snowflake can utilize it, or we can give it out to our market. Right now we still work in batch mode, so we're doing 24 hours. >> Okay. So when I think about the data pipeline and the people involved, maybe you could talk a little bit about the organization. You've got, I don't know if you have data scientists or statisticians, I'm sure you do, data architects, data engineers, quality engineers, developers, etc. And oftentimes practitioners like yourself will stress about, hey, the data is in silos, the data quality is not where we want it to be, we have to manually categorize the data. These are all common data pipeline problems, if you will. Sometimes we use the term DataOps, which is sort of a play on DevOps applied to the data pipeline. Can you describe your situation in that context? >> Yeah, so we have a very large data ops team, and everyone working on the data part of Webster's Bank has been there 13 to 14 years. So they get the data, they understand it, they understand the lines of business. We have data quality issues, just like everybody else does, but we have places where that gets cleansed, and we're moving forward; historically the data was very much siloed.
The data scientists are out in the lines of business right now, which is great, because I think that's where data science belongs. What we're working toward now is giving them more self-service, giving them the ability to access the data in a more robust way, from a single source of truth, so they're not pulling the data down into their own Tableau dashboards and then pushing the data back out. They're going to, I don't want to say a central repository, but a more robust repository that's controlled across multiple avenues, where multiple lines of business can access that data. Does that help? >> Got it, yes. And I think one of the key things I'm taking away from your last comment is the cultural aspect of this: by having the data scientists in the line of business, the lines of business will feel ownership of that data, as opposed to pointing fingers and criticizing the data quality. They really own that problem, as opposed to saying, well, it's Paula's problem. >> Well, part of my problem is that I have data engineers, data architects, database administrators, traditional data reporting people, and some business customers in the lines of business who just want to subscribe to a report. They don't want to go out and do any data science work, and we still have to provide that. So we still provide them a regimen where they wake up in the morning, open up their email, and there's the report they subscribed to, which is great, and it works out really well. And one of the reasons we purchased Io-Tahoe was to have the ability to give the lines of business the ability to do search within the data. It reads the data flows and data redundancy and things like that, and helps me clean up the data, and it also helps the data analysts when someone asks them for a certain report.
It used to take four weeks: we're going to go look at the data, and then we'll come back and tell you what we can do. But now, with Io-Tahoe, they can look at the data and, in one or two days, go back and say: yes, we have the data, this is where it is, and these are the data flows we found. It's what I call the birth of a column: where the column was created, where it went to live as a teenager (laughs), and then where it went to die, where we archive it. It's the cycle of life for a column, and Io-Tahoe helps us do that. We do data lineage all the time, and it just takes a very long time, which is why we're using something with AI and machine learning in it. It's accurate, and it does it the same way over and over again; if an analyst leaves, you're able to use something like Io-Tahoe to do that work for you. Does that help? >> Yeah, got it. So a couple of things there. In researching Io-Tahoe, it seems like one of the strengths of their platform is the ability to visualize data, the data structure, to actually dig into it but also see it, and that speeds things up and gives everybody additional confidence. And the other piece is essentially infusing AI, or machine intelligence, into the data pipeline; that's really how you're attacking automation, and because it's repeatable, that helps the data quality, and you have this virtuous cycle. Maybe you could affirm that and add some color, perhaps. >> Exactly. So, let's say that I have seven lines of business that are asking me questions, and one of the questions they'll ask is: we want to know if this customer is okay to contact. And there are different avenues: you can go online and say do not contact me, or you can go to the bank and say, I don't want email, but I'll take texts, and I want no phone calls. All that information.
So, seven different lines of business asked me that question in different ways. One said "okay to contact," another one says "customer 123," all these variations. Before I got there, each project used to be siloed, so one customer request would be 100 hours of analytical work for one analyst, and then another analyst would do another 100 hours on the other project. Well, now I can do that all at once. I can do those types of searches and say: yes, we already have that documentation, here it is, and this is where you can find what the customer has said, "No, I don't want to be contacted by email," or "I've subscribed to get emails from you." >> Got it, okay. And then I want to go back to the cloud a little bit. You mentioned S3 buckets, so you're moving to the Amazon cloud, at least; I'm sure you're going to have a hybrid situation there. And you mentioned Snowflake. What was the decision to move to the cloud? Obviously Snowflake is cloud-only; there's no on-prem version. So what precipitated that? >> Alright, so I've been in the data and IT field for the last 35 years. I started in the US Air Force and have moved on since then. My experience with Bob Graham was with Snowflake, working with GE Capital, and that's where I met up with the team from Io-Tahoe as well. So there are a couple of things: one is Informatica, which is known worldwide for moving data. They have two products, on-prem and off-prem; I've used both, they're both great, very stable, and I'm comfortable with them, as are other people. So we picked that as our batch data movement. We're probably moving toward HVR, it's not a final decision yet, but HVR for real-time, change-data-capture data that moves into the cloud. And then, so you can envision this:
You're in S3, and you have all the data that you could possibly want, JSON and everything, sitting in S3, ready to be moved through into Snowflake. And Snowflake has proven stability; you only need to learn and train your team on one thing, and AWS is completely stable at this point too. So if you think about it, all these avenues go from your data lake, which I would consider the S3, even though it's not a traditional data lake like a Hadoop that you can touch, into Snowflake, and then from Snowflake into sandboxes, so your lines of business and your data scientists can just dive right in. That makes a big win. And then, using Io-Tahoe with the data automation and their search engine, I have the ability to give the data scientists and data analysts a way to get accurate, complete information about the structure without needing to talk to IT. >> Yeah, so talking about Snowflake and getting up to speed quickly: I know from talking to customers you can get from zero to Snowflake very fast, and then it sounds like Io-Tahoe is sort of the automation layer for your data pipeline within the cloud. Is that the right way to think about it? >> I think so. Right now I have Io-Tahoe attached to my on-prem, and I want to attach it to my off-prem eventually. So I'm using Io-Tahoe data automation right now to bring in the data and to start analyzing the data flows, to make sure that I'm not missing anything and that I'm not bringing over redundant data. The data warehouse that I'm working off of is on-prem, an Oracle database, and it's 15 years old, so it has extra data in it, things that we don't need anymore, and Io-Tahoe is helping me shake out the extra data that does not need to be moved into my S3. So it's saving me money as I move from on-prem to off-prem.
>> And so that was a challenge prior, because you couldn't get the lines of business to agree on what to delete? Or what was the issue there? >> Oh, it was more than that. Each line of business had its own structure within the warehouse, and they were copying data between each other and duplicating it. So there could be three tables with the same data in them, each used by a different line of business. Using Io-Tahoe, we have identified over seven terabytes in the last two months of data that is just repetitive: the same exact data sitting in a different schema. And that's not easy to find if you only understand the one schema that does the reporting for your line of business. >> More bad news for the storage companies out there. (both laugh) So far. >> It's cheap. That's what we were telling people. >> And it's true, but you'd still rather not waste it; you'd like to apply it to drive more revenue. So I guess let's close on where you see this thing going. Again, I know you're partway through the journey; maybe you could describe where you see the phases going and what you want to get out of this down the road, mid-term and longer term. What's your vision for your data-driven organization? >> I want the bankers to be able to walk around with an iPad in their hand and access data for a customer really fast, and be able to give them the best deal they can get. I want Webster to be right up there on top, able to add new customers and to serve our existing customers, who have had bank accounts since they were 12 years old and are now multi-whatever. I want them to have the best experience with our bankers. >> That's awesome. That's really what I want as a banking customer: I want my bank to know who I am, anticipate my needs, create a great experience for me, and then let me go on with my life.
Great story. I love your experience, your background, and your knowledge. I can't thank you enough for coming on theCube. >> Thank you very much. And you guys have a great day. >> All right, take care. And thank you for watching, everybody. Keep right there, we'll take a short break and be right back. (gentle music)
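As an aside, the schema-redundancy detection Paula describes, the same data sitting under different names in different schemas, can be sketched in a few lines. This is not Io-Tahoe's actual method, and every table and column name below is invented; it only illustrates the idea of fingerprinting column contents to flag candidate duplicates across lines of business.

```python
import hashlib

# Toy "schemas": each maps schema.table.column -> sampled values. In a real
# warehouse these would be pulled from the databases; names are made up.
columns = {
    "retail.customer_contact.email_opt_out": ["Y", "N", "N", "Y"],
    "lending.cust_prefs.no_email":           ["Y", "N", "N", "Y"],  # same data, new name
    "retail.customer_contact.phone_opt_out": ["N", "Y", "N", "N"],
}

def fingerprint(values):
    """Order-insensitive hash of a column's contents."""
    h = hashlib.sha256()
    for v in sorted(str(v) for v in values):
        h.update(v.encode())
        h.update(b"\x00")   # separator so ["ab"] != ["a", "b"]
    return h.hexdigest()

groups = {}
for col, values in columns.items():
    groups.setdefault(fingerprint(values), []).append(col)

duplicates = [cols for cols in groups.values() if len(cols) > 1]
print(duplicates)
```

In practice you would fingerprint samples rather than whole columns and treat matches as candidates for review, but even this sketch shows how repetitive copies across schemas can be surfaced mechanically rather than by weeks of analyst time.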

Published Date : Jun 23 2020


Paula D'Amico, Webster Bank


 

>> Narrator: From around the globe, it's theCUBE, with digital coverage of Enterprise Data Automation, an event series brought to you by Io-Tahoe. >> Everybody, we're back. And this is Dave Vellante, and we're covering the whole notion of Automated Data in the Enterprise. And I'm really excited to have Paula D'Amico here, Senior Vice President of Enterprise Data Architecture at Webster Bank. Paula, good to see you. Thanks for coming on. >> Hi, nice to see you, too. >> Let's start with Webster Bank. You guys are kind of a regional bank, I think, New York, New England, I believe it's headquartered out of Connecticut. But tell us a little bit about the bank. >> Webster Bank is regional, Boston, Connecticut, and New York, very focused on Westchester and Fairfield County. They are a really highly rated regional bank for this area. They hold quite a few awards for being supportive of the community, and they are really moving forward technology-wise. They really want to be a data-driven bank, and they want to move into a more robust group. >> We got a lot to talk about. So data-driven is an interesting topic, and your role as Senior Vice President of Data Architecture, you've got a big responsibility as it relates to kind of transitioning to this digital, data-driven bank. But tell us a little bit about your role in your organization. >> Currently, today, we have a small group that is working toward moving into a more futuristic, more data-driven data warehouse. That's our first item. And then the other item is to drive new revenue by anticipating what customers do when they go to the bank or when they log in to their account, to be able to give them the best offer. And the only way to do that is if you have timely, accurate, complete data on the customer and what's really of great value to them: something to offer them, or a new product, or to help them continue to grow their savings or grow their investments.
>> Okay, and I really want to get into that. But before we do, and I know you're sort of partway through your journey, you've got a lot to do. But I want to ask you about Covid, how are you guys handling that? You had the government coming down with small business loans and PPP, and a huge volume of business, and data was at the heart of that. How did you manage through that? >> We were extremely successful, because we have a big, dedicated team that understands where their data is and was able to switch much faster than a larger bank, to be able to offer the PPP loans out to our customers at lightning speed. And part of that is that we adapted Salesforce very fast, for we've had Salesforce in house for over 15 years. Pretty much that was the driving vehicle to get our PPP loans in, and then we were developing logic quickly, but it was a 24/7 development role to get the data moving and to help our customers fill out the forms. And a lot of that was manual, but it was a large community effort. >> Think about that too. The volume was probably much higher than the volume of loans to small businesses that you're used to granting, and then also the initial guidelines were very opaque. You really didn't know what the rules were, but you were expected to enforce them. And then finally, you got more clarity. So you had to essentially code that logic into the system in real time. >> I wasn't directly involved, but part of my data movement team was, and we had to change the logic overnight. So it was on a Friday night it was released, we pushed our first set of loans through, and then the logic coming from the government changed, and we had to redevelop our data movement pieces again, and we redesigned them and sent them back through. So it was definitely kind of scary, but we were completely successful. We hit a very high peak.
Again, I don't know the exact number, but it was in the thousands of loans, from little loans to very large loans, and not one customer who applied, followed the right process, and filled out the right amounts failed to get what they needed. >> Well, that is an amazing story, and really great support for the region, your Connecticut, the Boston area. So that's fantastic. I want to get into the rest of your story now. Let's start with some of the business drivers in banking. I mean, obviously online. A lot of people have sort of joked that many of the older people, who kind of shunned online banking and would love to go into the branch and see their friendly teller, had no choice during this pandemic but to go online. So that's obviously a big trend. You mentioned the data-driven data warehouse, I want to understand that, but at the top level, what are some of the key business drivers that are catalyzing your desire for change? >> The ability to give a customer what they need at the time when they need it. And what I mean by that is that we have customer interactions in multiple ways. And I want the customer to be able to walk into a bank or go online and see the same format, and to have the same feel, the same love, and also to be offered the next best offer for them, whether they're looking for a new mortgage or looking to refinance, or whatever it is. We have that data, and they should feel comfortable with us using it. And that's an untethered banker. The attitude is, whatever my banker is holding and whatever the person is holding in their phone, that is the same, and it's comfortable. So they don't feel that they've walked into the bank and have to fill out different paperwork compared to just doing it on their phone. >> You actually do want the experience to be better. And it is in many cases.
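The "next best offer" idea Paula describes can be pictured as a simple rule-scoring step over a customer record. A toy sketch in Python; the offers, predicates, and customer fields here are entirely hypothetical, for illustration only, and are not Webster Bank's actual logic:

```python
# Toy "next best offer" scorer. All product rules and customer fields are
# invented for illustration -- a real bank would use trained models.

OFFERS = [
    # (offer name, predicate over a customer record, base score)
    ("mortgage_refi",  lambda c: c["has_mortgage"] and c["rate"] > 4.0, 0.9),
    ("savings_boost",  lambda c: c["balance"] > 10_000,                 0.6),
    ("first_mortgage", lambda c: not c["has_mortgage"],                 0.4),
]

def next_best_offer(customer):
    """Return the highest-scoring offer whose rule matches, or None."""
    matches = [(score, name) for name, rule, score in OFFERS if rule(customer)]
    return max(matches)[1] if matches else None

customer = {"has_mortgage": True, "rate": 4.5, "balance": 22_000}
print(next_best_offer(customer))  # mortgage_refi outranks savings_boost
```

The point of the timeliness discussion that follows is that a scorer like this is only useful if the customer record it sees is current when the customer walks in or opens the app.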
Now you weren't able to do this with your existing, I guess mainframe-based, Enterprise Data Warehouses. Is that right? Maybe talk about that a little bit? >> Yeah, we were definitely able to do it with what we have today, the technology we're using. But one of the issues is that it's not timely. And you need a timely process to be able to get the customers to understand what's happening. You need a timely process so we can enhance our risk management, so we can act on fraud issues and things like that. >> Yeah, so you're trying to get more real time. The traditional EDW, it's sort of a science project. There's a few experts that know how to get at it, so the queue lines up and the demand is tremendous. And then oftentimes, by the time you get the answer, it's outdated. So you're trying to address that problem. So part of it is really the cycle time, the end-to-end cycle time, that you're compressing. And then there's, if I understand it, residual benefits that are pretty substantial from a revenue opportunity: other offers that you can make to the right customer, that you maybe know through your data. Is that right? >> Exactly. It's driving new customers to new opportunities. It's enhancing risk management, it's optimizing the banking process, and then, obviously, creating new business. And the only way we're going to be able to do that is if we have the ability to look at the data right when the customer walks in the door or right when they open up their app. And by creating near real time data, the data warehouse team is giving the lines of business the ability to work on the next best offer for that customer as well. >> But Paula, we're inundated with data sources these days. Are there other data sources that maybe you had access to before, but perhaps the backlog of ingesting and cleaning and cataloging and analyzing, maybe the backlog was so great that you couldn't perhaps tap some of those data sources.
Do you see the potential to increase the data sources and hence the quality of the data, or is that sort of premature? >> Oh no, exactly right. So right now, we ingest a lot of flat files from our mainframe type of front-end system that we've had for quite a few years. But now we're moving to the cloud, moving off-prem into something like an S3 bucket, where we can process that data and get it faster by using real-time tools to move it into a place where Snowflake could utilize it, or we can give it out to our market. Right now we still work in batch mode, so we're doing 24 hours. >> Okay. So when I think about the data pipeline, and the people involved, maybe you could talk a little bit about the organization. You've got, I don't know if you have data scientists or statisticians, I'm sure you do. You've got data architects, data engineers, quality engineers, developers, etc. And oftentimes, practitioners like yourself will stress about, hey, the data is in silos, the data quality is not where we want it to be, we have to manually categorize the data. These are all sort of common data pipeline problems, if you will. Sometimes we use the term DataOps, which is sort of a play on DevOps applied to the data pipeline. Can you just sort of describe your situation in that context? >> Yeah, so we have a very large data ops team. And everyone who is working on the data part of Webster's Bank has been there 13 to 14 years. So they get the data, they understand it, they understand the lines of business. We have data quality issues, just like everybody else does, but we have places where that gets cleansed. And we're moving toward fixing the fact that there was very much siloed data.
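The move Paula describes, from a 24-hour batch cycle toward timelier feeds, usually starts with incremental (watermark-based) extraction rather than full reloads: each run only picks up rows changed since the last run. A minimal sketch, with invented row and column names, and not Webster's actual pipeline:

```python
# Sketch of incremental (watermark-based) extraction, the usual first step
# away from a full nightly batch. Row shape and column names are made up.

from datetime import datetime

def extract_incremental(rows, last_watermark):
    """Return only rows changed since the previous run, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2020, 6, 1, 2, 0)},
    {"id": 2, "updated_at": datetime(2020, 6, 1, 9, 30)},
]
fresh, wm = extract_incremental(rows, datetime(2020, 6, 1, 8, 0))
# Only id=2 changed since 08:00, so one row moves instead of the full extract.
```

The change-data-capture tooling mentioned later in the conversation automates the same idea at the database-log level instead of comparing timestamps.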
The data scientists are out in the lines of business right now, which is great, because I think that's where data science belongs. What we're working towards now is giving them more self-service, giving them the ability to access the data in a more robust way. And it's a single source of truth, so they're not pulling the data down into their own Tableau dashboards and then pushing the data back out. They're going to more of, I don't want to say a central repository, but more of a robust repository that's controlled across multiple avenues, where multiple lines of business can access that data. Does that help? >> Got it, yes. And I think that one of the key things that I'm taking away from your last comment is the cultural aspect of this. By having the data scientists in the line of business, the lines of business will feel ownership of that data, as opposed to pointing fingers and criticizing the data quality. They really own that problem, as opposed to saying, well, it's Paula's problem. >> Well, I have data engineers, data architects, database administrators, traditional data reporting people. And some customers that I have, business customers in the lines of business, want to just subscribe to a report. They don't want to go out and do any data science work, and we still have to provide that. So we still want to provide them some kind of regimen where they wake up in the morning, open up their email, and there's the report that they subscribed to, which is great, and it works out really well. And one of the reasons we purchased Io-Tahoe was to have the ability to give the lines of business the ability to do search within the data. It reads the data flows and data redundancy and things like that, and helps me clean up the data. And it also helps the data analysts who say, all right, they just asked me, they want this certain report.
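The data-flow reading Paula mentions amounts to walking a lineage graph: for any column, find everywhere its data ends up. A toy illustration with invented column names; this shows the idea, not Io-Tahoe's actual model or API:

```python
# Toy column-lineage trace: where a column's data flows downstream.
# Flow edges are invented; lineage tools discover these automatically.

FLOWS = {
    # source column                  -> columns it feeds downstream
    "mainframe.cust.okay_to_contact": ["staging.contact_pref"],
    "staging.contact_pref":           ["warehouse.dim_customer.contact_ok",
                                       "archive.contact_pref_hist"],
}

def downstream(column, flows=FLOWS):
    """Depth-first walk from a column to every place its data ends up."""
    seen, stack = [], [column]
    while stack:
        col = stack.pop()
        for nxt in flows.get(col, []):
            if nxt not in seen:
                seen.append(nxt)
                stack.append(nxt)
    return seen

print(downstream("mainframe.cust.okay_to_contact"))
```

With the graph in hand, "where was this column created and where is it archived" becomes a lookup instead of weeks of manual analysis, which is the speed-up described next.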
It used to be, okay, four weeks, we're going to go and look at the data, and then we'll come back and tell you what we can do. But now with Io-Tahoe, they're able to look at the data, and then in one or two days they'll be able to go back and say, yes, we have the data, this is where it is, this is where we found it. This is the data flow that we found also, which is what I call the birth of a column: where the column was created, where it went to live as a teenager (laughs), and then where it went to die, where we archive it. And yeah, it's this cycle of life for a column, and Io-Tahoe helps us do that. Data lineage is done all the time, and it just takes a very long time, and that's why we're using something that has AI and machine learning in it. It's accurate, and it does it the same way over and over again. If an analyst leaves, you're able to utilize something like Io-Tahoe to do that work for you. Does that help? >> Yeah, got it. So a couple things there. In researching Io-Tahoe, it seems like one of the strengths of their platform is the ability to visualize data, the data structure, and actually dig into it, but also see it. And that speeds things up and gives everybody additional confidence. And then the other piece is essentially infusing AI or machine intelligence into the data pipeline, which is really how you're attacking automation. And you're saying it's repeatable, and then that helps the data quality, and you have this virtuous cycle. Maybe you could sort of affirm that and add some color, perhaps.
So, seven different lines of business asked me that question in different ways. One said, "No okay to contact" the other one says, "Customer 123." All these. In each project before I got there used to be siloed. So one customer would be 100 hours for them to do that analytical work, and then another analyst would do another 100 hours on the other project. Well, now I can do that all at once. And I can do those types of searches and say, Yes, we already have that documentation. Here it is, and this is where you can find where the customer has said, "No, I don't want to get access from you by email or I've subscribed to get emails from you." >> Got it. Okay. Yeah Okay. And then I want to go back to the cloud a little bit. So you mentioned S3 Buckets. So you're moving to the Amazon cloud, at least, I'm sure you're going to get a hybrid situation there. You mentioned snowflake. What was sort of the decision to move to the cloud? Obviously, snowflake is cloud only. There's not an on-prem, version there. So what precipitated that? >> Alright, so from I've been in the data IT information field for the last 35 years. I started in the US Air Force, and have moved on from since then. And my experience with Bob Graham, was with snowflake with working with GE Capital. And that's where I met up with the team from Io-Tahoe as well. And so it's a proven so there's a couple of things one is Informatica, is worldwide known to move data. They have two products, they have the on-prem and the off-prem. I've used the on-prem and off-prem, they're both great. And it's very stable, and I'm comfortable with it. Other people are very comfortable with it. So we picked that as our batch data movement. We're moving toward probably HVR. It's not a total decision yet. But we're moving to HVR for real time data, which is changed capture data, moves it into the cloud. And then, so you're envisioning this right now. 
The vision is you're in S3, and you have all the data that you could possibly want. It's JSON, everything is sitting in S3, ready to move through into Snowflake. And Snowflake has proven stability. You only need to learn and train your team on one thing, and AWS is completely stable at this point too. So if you think about all these avenues, it goes through from, this is your data lake, which I would consider your S3, even though it's not a traditional data lake like a Hadoop cluster you can touch, and then into Snowflake, and then from Snowflake into sandboxes, so your lines of business and your data scientists can just dive right in. That makes a big win. And then we're using Io-Tahoe with the data automation and also their search engine. I have the ability to give the data scientists and data analysts a way to get accurate, complete information about the structure without needing to talk to IT. >> Yeah, so talking about Snowflake and getting up to speed quickly, I know from talking to customers you can get from zero to Snowflake very fast, and then it sounds like Io-Tahoe is sort of the automation cloud for your data pipeline within the cloud. Is that the right way to think about it? >> I think so. Right now I have Io-Tahoe attached to my on-prem, and I want to attach it to my off-prem eventually. So I'm using Io-Tahoe data automation right now to bring in the data, and to start analyzing the data flows, to make sure that I'm not missing anything and that I'm not bringing over redundant data. The data warehouse that I'm working off of is on-prem. It's an Oracle database, and it's 15 years old, so it has extra data in it. It has things that we don't need anymore, and Io-Tahoe's helping me shake out that extra data that does not need to be moved into my S3. So it's saving me money as I move from on-prem to off-prem.
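The earlier "okay to contact" example, where seven lines of business encoded the same answer differently, comes down to normalizing each encoding into one canonical schema before anyone queries it. A sketch with invented field names and encodings, purely illustrative:

```python
# Sketch of consolidating differently-encoded contact preferences from
# multiple lines of business. Field names and encodings are invented.

def normalize_contact_pref(record):
    """Map each line of business's encoding onto one canonical schema."""
    if "no_okay_to_contact" in record:              # LOB 1 style: inverted flag
        ok = not record["no_okay_to_contact"]
    elif record.get("contact_flag") in ("Y", "N"):  # LOB 2 style: Y/N flag
        ok = record["contact_flag"] == "Y"
    else:                                           # unknown: assume opted out
        ok = False
    return {"customer_id": record["customer_id"], "okay_to_contact": ok}

lob1 = {"customer_id": 123, "no_okay_to_contact": True}
lob2 = {"customer_id": 123, "contact_flag": "Y"}
merged = [normalize_contact_pref(r) for r in (lob1, lob2)]

# Conflicting answers across silos now surface in one place; the most
# restrictive answer (do not contact) typically wins.
final_ok = all(r["okay_to_contact"] for r in merged)
```

Once every silo maps into the same canonical record, the "100 hours per project" analysis collapses into a single search, which is the win described above.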
>> And so that was a challenge prior, because you couldn't get the lines of business to agree on what to delete, or what was the issue there? >> Oh, it was more than that. Each line of business had their own structure within the warehouse, and they were copying data between each other, duplicating the data, and using it. So there could possibly be three tables that have the same data in them, but used for different lines of business. Using Io-Tahoe, we have identified over seven terabytes in the last two months of data that has just been repetitive. It's the same exact data, just sitting in a different schema. And that's not easy to find if you only understand one schema, the reporting for that one line of business. >> More bad news for the storage companies out there. (both laugh) >> It's cheap. That's what we were telling people. >> And it's true, but you still would rather not waste it, you'd like to apply it to drive more revenue. And so, I guess, let's close on where you see this thing going. Again, I know you're sort of partway through the journey. Maybe you could describe where you see the phases going and really what you want to get out of this thing, down the road, mid-term, longer term. What's your vision for your data-driven organization? >> I want the bankers to be able to walk around with an iPad in their hand, and be able to access data for that customer really fast, and be able to give them the best deal that they can get. I want Webster to be right there on top, able to add new customers, and able to serve our existing customers, who have had bank accounts there since they were 12 years old and now are multi-, whatever. I want them to have the best experience with our bankers. >> That's awesome. That's really what I want as a banking customer. I want my bank to know who I am, anticipate my needs, and create a great experience for me. And then let me go on with my life.
Great story. I love your experience, your background, and your knowledge. I can't thank you enough for coming on theCUBE. >> Thank you very much. And you guys have a great day. >> All right, take care. And thank you for watching, everybody. Keep right there, we'll take a short break and be right back. (gentle music)

Published Date : Jun 4 2020


Yuvi Kochar, GameStop | Mayfield People First Network


 

>> Announcer: From Sand Hill Road in the heart of Silicon Valley, it's theCUBE, presenting the People First Network, insights from entrepreneurs and tech leaders. (bright electronic music) >> Everyone, welcome to this special CUBE conversation. We're here at Sand Hill Road at the Mayfield Fund. This is theCUBE, a co-creation of the People First Network content series. I'm John Furrier, host of theCUBE. Our next guest is Yuvi Kochar, the Data-centric Digital Transformation Strategist at GameStop, with a variety of stints in the industry working on cutting-edge problems around data: the Washington Post, comScore, among others, and you've got your own practice. From Washington, DC, thanks for joining us. >> Thank you, thanks for hosting me. >> This is an awesome conversation. We were just talking before we came on camera about data, and the roles you've had over your career have been very interesting. This seems to be the theme for some of the innovators that I've been interviewing on People First: they see an advantage with technology, and they help companies, they grow companies, and they assist. You did a lot of different things, most notably, to me, the Washington Post, which is in the mainstream conversation now as a rebooted media company with a storied, historic legacy from the Graham family. Jeff Bezos purchased them for a song, in my opinion, and they're still growing, with monetization, with the subscriber base growing. I believe they're number one in subscribers. Interesting time for media and data. How many years were you at the Washington Post? >> I spent about 13 years in the corporate office. The Washington Post company was a conglomerate. They owned a lot of businesses. It's not very well known that they owned Kaplan, the education company. We owned Slate, we owned Newsweek, we owned TV stations, and now they're into buying all kinds of stuff.
So I was involved with a lot of varied businesses, but obviously, we were in the same building as the Washington Post, and I had a front-row seat to see the digital transformation of the media industry. >> John: Yeah, we-- >> And how we responded. >> Yeah, I want to dig into that, because I think that illustrates a lot of what's happening now with cloud computing. Obviously, Cloud 1.0 and the rise of the Amazon public cloud: clearly, check, done that, a lot of companies and startups go there. Why would you provision a data center? You're a startup, you're crazy, though at some point you can have a data center. Now hybrid cloud's important, and DevOps, the application development market, building your own stack, is shifting. It's like the old days, but upside down. It's flipped around: applications are in charge, data's critical for the application, infrastructure's now elastic. Unlike the old days of, here's your infrastructure, and you're limited to what you can run on it. What are your thoughts on that? >> My thoughts are that, as my title suggests, I'm a very data-centric person. So I think about everything data-first. We were in a time when cloud-first is becoming old, and we are now moving into data-first, because what's happening in the marketplace is that the capability of data analytics has reached a point where prediction, in any aspect of a business, has become really inexpensive. So empowering employees with prediction machines, whether you call them bots, or analytics, or machine learning, or AI, has become really inexpensive. And so I'm thinking more of applications which are built data-out instead of data-in. Data-in is: you build process and you capture data, and then you decide, oh, maybe I should build some reporting. That's what we used to do. Now, you need to start with: what's the data I have got? What's the data I need? What's the data I can get?
We were just talking about how everybody needs a data monetization strategy. People don't realize how much of an asset is sitting in their data, where to monetize it, and how to use it. >> It's interesting. I mean, I got my computer science degree in the 80s, and one of the tracks I got a degree in was databases, though let's just say my main one was operating systems. Database was kind of the throwaway at that time. It wasn't considered a big field, and it wasn't sexy at all. Now, if you're a database person, you're a data guru, you're a rock star. The world has changed, but databases are also changing. It used to be one centralized database ruled the world; Oracle made a lot of money with that and bought all their competitors. Then open source came into the realm. So the world of data is also limited by where the data's stored, how the data is retrieved, how the data moves around the network. This is a new dynamic. How do you look at that? Because, again, lag in business has a lot to do with the data. Whether it's in an application, that's one thing, but there's also having data available. Not necessarily in real time, but if I'm going to work on something, I want the data set handy, which means I can download it or maybe get it in real time. What are your thoughts on data as an element in all that moving around? >> So I think what you're talking about is still data analytics: how do I get insights about my business, how do I make decisions using data in a better way, what flexibility do I need? You talk about open source, and you think about MongoDB and those kinds of databases. They give you a lot of flexibility, and you can develop interesting insights very quickly, but I think that is still thinking about data in an old-school kind of way. I think what's happening now is we're teaching algorithms with data. So data is actually the software, right? You get an open source algorithm.
I mean, Google and everybody else is happy to open source their algorithms; they're all available for free. But the asset is now the data, which means how you train your algorithm with your data. And then, moving towards deploying it on the edge, you take an algorithm, you train it, then you deploy it on the edge in an IoT kind of environment, and now you're doing decision-making. Self-driving cars are great examples, but I think it's going into very interesting spaces in the enterprise. So we all have to think about software differently, because, actually, data is the software. >> That's an interesting take on it, and I love that. I wrote a blog post in 2007, when we first started looking at the network effects on social media platforms, called Data is the New Development Kit. A development kit was what people used back then: they would download it and then code. But the idea was that data has to be part of the runtime and the compilation; as software acts, data needs to be resident, not just, here's a database, access it, pull it out, use it, present it. Data is much more of a key ingredient in the development. Is that kind of what you're getting at? >> Yes. >> Notion of-- >> And I think we're moving from the age of arithmetic-based machines, where we put arithmetic onto chips and made general-purpose chips that were used to solve a huge number of problems in the world, to prediction machines on a chip. So you think about algorithms that are trained using data, which are going to be available on chips.
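The "data is the software" point can be made concrete with a toy: the algorithm below is open and fixed, yet its behavior (the learned weights) comes entirely from the data that trains it. Train it on different data and you get a different "program." A pure-Python sketch, not any production model:

```python
# "Data is the software": a fixed, open algorithm (a perceptron) whose
# behavior is determined entirely by its training data.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), label) with label in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(model, x):
    (w, b), (x1, x2) = model, x
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Train on an AND-like dataset; the weights (the learned "software")
# exist only because of this data.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
model = train_perceptron(data)
```

The trained weights are also what would be shipped to an edge device in the IoT scenario described above: the algorithm is generic, the data-derived parameters are the asset.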
And now you can do very interesting algorithmic work right on the edge devices. And I've seen this recently at GameStop: I think a lot of business leaders have a hard time understanding the change, because we have moved from process-centric thinking, process automation, how can I do it better, how can I be more productive, how can I make better decisions? We have trained our business partners on that kind of thinking, and now we are starting to say, no, no, no, we've got something that's going to help you make those decisions. >> It's interesting, you mentioned GameStop. Obviously well-known, my sons are all gamers. I used to be a gamer back before I had kids, but I can't keep up anymore. GameStop was a retail giant in gaming, back when they had physical displays, but now, with online, they're under pressure. And I had interviewed, again at an Amazon event, the Best Buy CIO, and he said, "We don't compete on price anymore. If they want to buy from Amazon, no problem, but our store traffic is off the charts. We personalize 50,000 emails a day." So personalization became their strategy, and it was a data strategy. This is a user experience, not a purchase decision. Is this how you guys are thinking about it at GameStop? >> I think in retail, if you look at the segment per se, personalization, which Amazon obviously led the way on, is key to attracting the customer. If I don't know what games you play, or what video you watched a little while ago about which game, then I'm not offering you the product that you are most prone to buy or are looking for. And I think that's why personalization is key. I think that's-- >> John: And data drives that, and data drives that. >> Data drives that. And for personalization, if you look at retail, there's customer information. You need to know the customer.
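The kind of personalization being described, matching what a customer already plays against the product catalog, can be sketched as a simple overlap score. Titles, genres, and the scoring itself are invented for illustration; real retail personalization uses far richer signals:

```python
# Toy personalization: rank catalog items by genre overlap with what a
# customer already plays. All titles and genres are invented.

CATALOG = {
    "Space Raiders 2": {"shooter", "sci-fi"},
    "Farm Story":      {"sim", "casual"},
    "Star Frontier":   {"sci-fi", "rpg"},
}

def recommend(played_genres, catalog=CATALOG):
    """Rank items by genre overlap with the customer's play history."""
    scored = sorted(
        catalog.items(),
        key=lambda kv: len(kv[1] & played_genres),
        reverse=True,
    )
    return [title for title, _ in scored]

history = {"sci-fi", "shooter"}
print(recommend(history)[0])  # the sci-fi shooter ranks first
```

As the conversation goes on to say, the score needs two sides: the customer's history and the product's attributes, married together.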
You need to know, understand the customer preferences, but then there's the product, and you need to marry the two. And that's where personalization comes into play. >> So I'll get your thoughts. You have, obviously, a great perspective on how tech has been built and now working on some real cutting-edge, clear view on what the future looks like. Totally agree with you, by the way, on the data. There's kind of an old guard/new guard, kind of two sides of the street, the winners and the losers, but hey, look, I think the old guard, if they don't innovate and become fresh and new and adopt the modern things that need to attract the new expectations and new experiences from their customers, are going to die. That being said, what is the success formula, because some people might say, hey, I'm data-driven. I'm doing it, look at me, I'm data. Well, not really. Well, how do you tell if someone's really data-driven or data-centric? What's the difference? Is there a tell sign? >> I think when you say the old guard, you're talking about companies that have large assets, that have been very successful in a business model that maybe they even innovated, like GameStop came up with pre-owned games, and for the longest of times, we've made huge amount of revenue and profit from that segment of our business. So yes, that's becoming old now, but I think the most important thing for large enterprises at least, to battle the incumbent, the new upstarts, is to develop strategies which are leveraging the new technologies, but are building on their existing capability, and that's what I drive at GameStop. >> And also the startups too, that they were here in a venture capital firm, we're at Mayfield Fund, doing this program, startups want to come and take a big market down, or come in on a narrow entry and get a position and then eat away at an incumbent. They could do it fast if they're data-centric. >> And I think it's speed is what you're talking about. 
I think the biggest challenge large companies have is an ability to play the field at the speed of the new upstarts and the firms that Mayfield and others are investing in. That's the big challenge because you see this, you see an opportunity, but you're, and I saw that at the Washington Post. Everybody went to meetings and said, yes, we need to be digital, but they went-- >> They were talking. >> They went back to their desk and they had to print a paper, and so yes, so we'll be digital tomorrow, and that's very hard because, finally, the paper had to come out. >> Let's take us through the journey. You were the CTO, VP of Technology, Graham Holdings, Washington Post, they sold it to Jeff Bezos, well-documented, historic moment, but what a storied company, Washington Post, local paper, there was the movie about it, all the historic things they've done from a reporting and journalism standpoint. We admire that. Then they hit, the media business starts changing, gets bloated, not making any money, online classifieds are dying, search engine marketing is growing, they have to adjust. You were there. What was the big, take us through that journey. >> I think the transformation was occurring really fast. The new opportunities were coming up fast. We were one of the first companies to set up a website, but we were not allowed to use the brand on the website because there was a lot of concern in the newsroom that we were going to put the brand on this misunderstood opportunity. So I think it started there, and then-- >> John: This is classic old guard mentality. >> Yes, and it continued down because people had seen downturns. It's not like media companies hadn't been through downturns. They had, because the market crashes and we have a recession and there's a downturn, but it always came back because-- >> But this was a wave. I mean the thing is, downturns are economic and there's business that happens there, advertisers, consumption changes.
This was a shift in their user base based upon a technology wave, and they didn't see it coming. >> And they hadn't ever experienced it. So they were experiencing it as it was happening, and I think it's very hard to respond to a transformation of that kind in a very old-- >> As a leader, how did you handle that? Give us an example of what you did, how you make your mark, how do you get them to move? What were some of the things that were notable moments? >> I think the main thing that happened there was that we spun out washingtonpost.com. So it became an independent business. It was actually running across the river. It moved out of the corporate offices. It went to a separate place. >> The renegades. >> And they were given-- >> John: Like Steve Jobs and the Macintosh team, they go into a separate building. >> And we were given, I was the CTO of the dotcom for some time while we were turning over our CTO there, and we were given a lot of flexibility. We were not held accountable to the same level. We used the, obviously, we used-- >> John: You were running fast and loose. >> And we were, yes, we had a lot of flexibility and we were doing things differently. We were giving away the content in some way. On the online side, there was no pay wall. We started with a pay wall, but advertising kind of was so much more lucrative in the beginning, that the pay wall was shut down, and so I think we experimented a lot, and I think where we missed, and a lot of large companies miss, is that you need to leave your existing business behind and scale your new business, and I think that's very hard to do, which is, okay, we're going to, it's happening at GameStop. We no longer completely have control of the market, where we are the primary source of where, you talk about your kids, where they go to get their games.
They can get the games online and I think-- >> It's interesting, people are afraid to let go because they're so used to operating their business, and now it has to pivot to a new operating model and grow. Two different dynamics, growth, operation, operating and growing. Not all managers have that growth mindset. >> And I think there's also an experience thing. So most people who are in these businesses, who've been running these businesses very successfully, have not been watching what's happening in technology. And so the technology team comes out and says, look, let me show you what we can do. I think there has to be this open and very, very candid discussion around how we are going to transform-- >> How would you talk about your peers out there, your peers and other CIOs, and even CISOs on the security side, who have been dealing with the same suppliers over and over, and in fact, on the security side, the supplier base is getting larger. There's more tools coming out. I mean who wants another tool? So platform, tool, these are big decisions being made around companies, that if you want to be data-centric, you want to be a data-centric model, you got to understand platforms, not just buying tools. If you buy a hammer, everything will look like a nail, and you have so many hammers, what version, so platform discussions come in. What's your thoughts on this? Because this is a cutting-edge topic we've been talking about with a lot of senior engineering leaders around Platform 2.0 coming, not like a classic platform to... >> Right, I think that each organization has to leverage or build their own stack on top of commodity platforms. You talked about AWS or Azure or whatever cloud you use, and you take all their platform capability and services that they offer, but then on top of that, you structure your own platform with your vertical capabilities, which become your differentiators, which is what you take to market.
You enable those for all your product lines, so that now you are building capability, which is a layer on top of, and the commodity platforms will continue to bite into your platform because they will start offering capabilities that earlier, I remember, I started at this company called BrassRing, recruitment automation. One of the first software-as-a-service companies, and I, we bought a little company, and the CTO there had built a web server. It was called, it was his name, it was called Barrett's Engine. (chuckles) And so-- >> Probably Apache with something built around it. >> So, in those days, we used to build our own web servers. But now today, you can't even find an engineer who will build a web server. >> I mean the web stack and these notions of just simple Web 1.0 building blocks of change. We've been calling it Cloud 2.0, and I want to get your thoughts on this because one of the things I've been riffing on lately is this, I remember Marc Andreessen wrote the famous article in the Wall Street Journal, Software is Eating the World, which I agree with in general, no debate there, but also the 10x Engineer, you go into any forum online, talking about 10x Engineers, you get five different opinions, meaning, a 10x Engineer's an engineer who can do 10 times more work than an old school, old classical engineer. I bring this up because the notion of full stack developer used to be a real premium, but what you're talking about here with cloud is a horizontally scalable commodity layer with differentiation at the application level. That's not full stack, that's half stack. So you think the world's kind of changing. If you're going to be data-centric, the control plane is data. The software that's domain-specific is on top. That's what you're essentially laying out.
>> That's what I'm talking about, but I think that also, what I'm beginning to find, and we've been working on a couple of projects, is you put the data scientists in the same room with engineers who write code, write software, and it's fascinating to see them communicate and collaborate. They do not talk the same language at all. >> John: What's it like? Give us a mental picture. >> So a data scientist-- >> Are they throwing rocks at each other? >> Well, nearly, because the data scientists come from the math side of the house. They're very math-oriented, they're very algorithm-oriented. Mathematical algorithms, whereas software engineers are much more logic-oriented, and they're thinking about scalability and a whole lot of other things, and if you think about, a data scientist develops an algorithm, it rarely scales. You have to actually then hand it to an engineer to rewrite it in a scalable form. >> I want to ask you a question on that. This is why I got you and you're an awesome guest. Thanks for your insights here, and we'll take a detour into machine learning. Machine learning really is what AI is about. AI is really nothing more than just, I love AI, it gets people excited about computer science, which is great. I mean my kids talk about AI, they don't talk about IoT, which is good that AI does that, but it's really machine learning. So there's two schools of thought on machine learning. I call it the Berkeley school on one end, not Berkeley per se but Berkeley talks about math, machine learning, math, math, math, and then you have other schools of thought that are on cognition, that machine learning should be more cognitive, less math-driven, spectrum of full math, full cognition, and everything in between. What's your thoughts on the relationship between math and cognition? >> Yeah, so it's interesting. You get gray hair and you kind of move up the stack, and I'm much more business-focused. These are tools.
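Yuvi's point that a data scientist's algorithm "rarely scales" until an engineer rewrites it in a scalable form can be illustrated with a toy example. This is a hypothetical sketch, nothing from GameStop's actual systems: the same z-score standardization written first as a naive pure-Python prototype and then as a vectorized NumPy rewrite.

```python
# Hypothetical illustration of the "hand it to an engineer" rewrite:
# the same standardization (z-score) computed two ways.
import numpy as np

def zscore_naive(values):
    # Prototype-style: pure-Python loops, fine in a notebook,
    # slow on millions of rows.
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5
    return [(v - mean) / std for v in values]

def zscore_vectorized(values):
    # Engineer's rewrite: one pass through optimized array code.
    arr = np.asarray(values, dtype=float)
    return (arr - arr.mean()) / arr.std()

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
assert np.allclose(zscore_naive(data), zscore_vectorized(data))
```

The two functions agree numerically; the vectorized version simply pushes the per-element work into optimized array operations, which is typically what the rewrite-for-scale handoff amounts to.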
You can get passionate about either school of thought, but I think that what that does is you lose sight of what the business needs, and I think it's most important to start with what are we here trying to do, and what is the best tool? What is the approach that we should utilize to meet that need? Like the other day, we were looking at product data from GameStop, and we know that the quality of data should be better, but we found a simple algorithm that we could utilize to create product affinity. Now whether it's cognition or math, it doesn't matter. >> John: The outcome's the outcome. >> The outcome is the outcome, and so-- >> They're not mutually exclusive, and that's a good debate, but it really gets to your point of does it really matter as long as it's accurate and the data drives that, and this is where I think data is interesting. If you look at folks who are thinking about data, back to the cloud as an example, it's only as good as what you can get access to, and cybersecurity, the transparency issue around sharing data becomes a big thing. Having access to the data's super important. How do you view that for CIOs as they start to think about re-architecting their organizations for these digital transformations? Is there a school of thought there? >> Yes, so I think data is now getting consolidated. For the longest time, we were building data warehouses, departmental data warehouses. You can go do your own analytics and just take your data and add whatever else you want to do, and so the part of data that's interesting to you becomes much more clean, much more reliable, but the rest, you don't care much about. I think given the new technologies that are available and the opportunity of the data, data is coming back together, and it's being put into a single place. >> (mumbles) Well, that's certainly a honeypot for a hacker, but we'll get to that in a second.
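The "simple algorithm" Yuvi says they used to create product affinity is not specified in the conversation. One common minimal approach is co-occurrence counting over order baskets; the sketch below uses that technique with entirely made-up SKUs and orders.

```python
from collections import Counter
from itertools import combinations

# Hypothetical order baskets (sets of SKUs bought together).
orders = [
    {"A", "B"},
    {"A", "B", "C"},
    {"B", "C"},
    {"A", "C"},
]

# Count how often each unordered pair of products appears in the same order.
affinity = Counter()
for basket in orders:
    for pair in combinations(sorted(basket), 2):
        affinity[pair] += 1

def related(sku, top=2):
    # Products most often bought alongside `sku`, by co-occurrence count.
    scores = Counter()
    for (a, b), n in affinity.items():
        if a == sku:
            scores[b] += n
        elif b == sku:
            scores[a] += n
    return [p for p, _ in scores.most_common(top)]

print(related("A"))  # the SKUs most co-purchased with "A"
```

Products that frequently land in the same basket get high affinity, and `related(sku)` surfaces the most co-purchased items, which is enough to power a basic "customers also bought" list.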
If you and I were doing a startup, we say, hey, let's, we've got a great idea, we're going to build something. How would we want to think about the data in terms of having data be a competitive advantage, being native to the architecture of the system. I'll say we use cloud unless we need some scale on premise for privacy reasons or whatever, but we would, how would we go to market, and we have an app, as apps defined, great use case, but I want to have extensibility around the data, I don't want to foreclose any future options. How should I think about my, how should we think about our data strategy? >> Yes, so there was a very interesting conversation I had just a month ago with a friend of mine who's working at a startup in New York, and they're going to build a solution, take it to market, and he said, "I want to try it only in a small market and learn from it," and he's going very old school, focus groups, analytics, analysis, and I sat down, we sat at Grand Central Station, and we talked about how, today, he should be thinking about capturing the data and letting the data tell him what's working and what's not working, instead of trying to find focus groups and find very small data points to make big decisions. He should actually utilize the target, the POC market, to capture data and get ready for scale because if you want to go national after having run a test in... >> Des Moines, Iowa. >> Part of New York or wherever, then you need to already have built the data capability to scale that business in today's-- >> John: Is it a SaaS business? >> No, it's a service and-- >> So he can instrument it, just watch the data.
>> And yes, but he's not thinking like that because most business people are still thinking the old way, and if you look at Uber and others, they have gone global at such a rapid pace because they're very data-centric, and they scale with data, and they don't scale with just let's go to that market and then let's try-- >> Yeah, ship often, get the data, then think of it as part of the life cycle of development. Don't think of it as the old school, craft it, launch it, and then see how it goes and watch it fail or succeed, and know six months later what happened, know immediately. >> And if you go data-centric, then you can turn the R&D crank really fast. Learn, test and learn, test and learn, test and learn at a very rapid pace. That changes the game, and I think people are beginning to realize that data needs to be thought about as the application and the service is being developed, because the data will help scale the service really fast. >> Data comes into applications. I love your line of data is the new software. That's better than the new oil, which has been said before, but data comes into the app. You also mentioned that the app throws off data. >> Yuvi: Yes. >> We know that humans have personal data exhaust all the time. Facebook made billions of dollars on our exhaust and our data. The role of data in and out of the application, the I/O of the application, is a new concept, you brought that up. I like that and I see that happening. How should we capture that data? This used to be log files. Now you got observability, all kinds of new words kind of coming into this cloud equation. How should people think about this? >> I think that has to be part of the design of your applications, because data is the application, and you need to design the application with data in mind, and that needs to be thought of upfront, and not later. >> Yuvi, what's next for you?
We're here on Sand Hill Road, VC firm, they're doing a lot of investments, you've got a great project with GameStop, you're advising startups, what's going on in your world? >> Yes, so I'm totally focused, as you probably are beginning to sense, on the opportunity that data is enabling, especially in the enterprise. I'm very interested in helping businesses understand how to leverage data, because this is another major shift that's occurring in the marketplace. Opportunities have opened up, prediction is becoming cheap and at scale, and I think any business runs on their capability to predict, what is the shirt I should buy? How many should I buy? What color should I buy? I think data is going to drive that prediction at scale. >> This is a legit wave that everyone should pay attention to. All businesses, not just one-- >> All businesses, everything, because prediction is becoming cheap and automated and granular. That means you need to be able to not just, you need to empower your people with low-level prediction that comes out of the machines. >> Data is the new software. Yuvi, thanks so much for great insight. This is theCUBE conversation. I'm John Furrier here at Sand Hill Road at the Mayfield Fund, for the People First Network series. Thanks for watching. >> Yuvi: Thank you. (bright electronic music)

Published Date : Sep 11 2019



Rik Tamm-Daniels, Informatica & Tarik Dwiek, Snowflake | Informatica World 2019


 

>> Live from Las Vegas, it's theCUBE. Covering Informatica World 2019. Brought to you by Informatica. >> Hey welcome back everyone, you're here live in Las Vegas for theCUBE, for Informatica World 2019. I'm John Furrier, co-host of theCUBE. We've got two great guests here from Snowflake. We've got Tarik Dwiek who's the Director of Technology Alliances at Snowflake, and Rik Tamm-Daniels, Vice President of Strategic Ecosystems and Technology at Informatica. Welcome back to theCUBE, good to see you guys. >> Good to see you as well. >> Thanks for coming on, Snowflake. Congratulations, you guys are doing really well. >> Thank you. >> Big growth, new CEO, Frank Slootman, Informatica, the data czar, neutral third party, Switzerland, cloud, you've got Switzerland, what's the relationship, explain. >> Well, I think you know, it's funny that comment comes up a fair amount and yeah, I look at it this way. It's not so much that you know, with Switzerland what we're focused on though is where customers are choosing to go in their journey, we want to provide them the best experience possible, right. So we end up going very deep in our strategic ecosystems, and Snowflake is one of those partners that we've seen tremendous growth with, and customers are adopting. So, very excited about the partnership. >> How about your relationship with Informatica? Why are you here? What's the story? >> Yeah definitely, so at Snowflake, we put customers first, right? And as Rik mentioned, it's all about having a diverse ecosystem in the enterprise. Informatica is a leader. When you look at where customers are going with data, right? Obviously data integration is key. Data quality is key, data governance. All the areas that Informatica has been the best of breed in, it just makes sense to continue to make traction in these enterprise customers. >> Take a bit to explain the business model of Snowflake, what you guys do, quick one minute.
Sure, so Snowflake's a data warehouse solution built from the ground up for the cloud. The distinction is important because we're the only data warehouse born in the cloud. If you look at how the other solutions are doing it today, they're taking an architecture, an architecture created a decade ago for an on-premise world and they're just shifting it into cloud. And the challenge that you have there is that you can't take full advantage of things like instant and infinite resources, both compute and storage, right? Independent scaling of compute and storage. Elasticity, right, the ability to scale up and down and out with a click of a button. And then even being able to support massive concurrency. Things like loading data at the same time that you're querying data. This is what Snowflake was built for. >> How about datasets from other people? That's one of the benefits of having data in the cloud. >> Correct, so our architecture is key. That's the key to our business and our product and what we've done is we separated compute from storage and we become a centralized database. And what we found by creating additional views, you can actually share your data with yourself and you can share with other customers. We've created this concept of data sharing. Data sharing has been around for decades, but it's been very painful. What we've done is created an online, performant, secure way for customers to share the data. >> Rik this really highlights the value proposition for Informatica. I always say, you know, data is always, beauty of the data is in the eye of the beholder. Depending on where you're sitting. You could be on-premises, you have legacy, you could be born in the cloud and taking advantage of all that cloud stuff. Graham Thompson was on earlier and he said, "Hey, if you've got data in the cloud, why move it on premise?" So you know, there should be a choice of what's best. And that's where you guys come in.
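Tarik's two architectural points, compute separated from a single storage layer, and data "shared" through live views rather than copies, can be caricatured in a few lines. This is purely a conceptual sketch with invented class names, not a description of Snowflake's internals:

```python
# Conceptual sketch only: a single storage layer, multiple independent
# "virtual warehouses" reading it, and a zero-copy read-only "share".
# Nothing here reflects how Snowflake is actually implemented.

class Storage:
    """The one storage layer all compute reads from."""
    def __init__(self):
        self.tables = {}  # table name -> list of row dicts

class VirtualWarehouse:
    """Independent compute: each instance could be sized and scaled
    separately, without touching storage."""
    def __init__(self, storage):
        self.storage = storage

    def query(self, table, predicate):
        return [row for row in self.storage.tables[table] if predicate(row)]

def create_share(storage, table):
    # A "share" grants live, read-only access to the provider's table.
    # No rows are copied, so a consumer always sees current data.
    return lambda predicate: [row for row in storage.tables[table] if predicate(row)]

store = Storage()
store.tables["sales"] = [{"region": "EU", "amt": 10}, {"region": "US", "amt": 7}]

etl_wh = VirtualWarehouse(store)         # one compute cluster loading data
bi_wh = VirtualWarehouse(store)          # another, sized independently
consumer = create_share(store, "sales")  # "shared" with another account

# The provider loads a new row while the consumer holds the share...
store.tables["sales"].append({"region": "EU", "amt": 3})

# ...and the consumer sees it immediately, with zero copying.
eu_rows = consumer(lambda r: r["region"] == "EU")
assert len(eu_rows) == 2
assert sum(r["amt"] for r in bi_wh.query("sales", lambda r: True)) == 20
```

The point of the sketch is that the consumer's view reads the provider's storage directly: nothing is exported or duplicated, so new rows are visible as soon as they land.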
What specifically are you guys tying together with data warehouse in the cloud, and maybe a customer may want to choose to have, for compliance reasons or a variety of other reasons, on prem or another location. >> I think one of the big things about cloud data warehouses in particular, it's not all things being equal with the on-premise world, right? The level of agility you get with Snowflake where it's infinite scale out, up in a few minutes. That empowers so much transformation in the organization. That's why it's so compelling, and so many folks are adopting it. And so what we're doing is we're helping customers on that journey though. Because they've got a very complex data environment and they got to first of all understand how's this all put together to be able to start modernizing, moving to the cloud. >> I'm sorry if I asked the question where should a customer store their data, on the cloud or on-premise? I know where you'll come in on that. It's cloud all the way, because that's what you do. But this is something that architects in the enterprise have been dealing with because they do have legacy stuff. And we've seen with the SaaS business models, data has been really key for their success because it gives them risk-taking ability, meaning they can do things, maybe test certain features on certain users. Basically use the data to create value. And then the upside of taking that risk is reward. You have more revenue, hockey stick growth and the numbers are pretty clear. Enterprises want that. >> They do. >> But they're not really set up for it. How do they get there? >> The best part with a SaaS model is customers can de-risk by putting some of their data in, for instance, Snowflake, right? We work across AWS and Azure. So customers that maybe aren't all in yet on either cloud provider can start using Snowflake and put data in Snowflake and test it out. Test out the performance and the security of cloud.
And if for whatever reason it doesn't work out they haven't risked very much if anything. And if it does work out then they've got a great proving ground for that. So the SaaS model opens up a lot of possibilities for enterprise customers. >> I brought this up with Graeme Connelly. You know, he's from Scotland so I understand his perspective. I'm from Silicon Valley so I took my perspective. I said you know, when I hear regulation I see you know, anti-innovation, right? Like when I hear governments getting involved, putting, you know, regulation on things. We're seeing a very active regulatory environment on tech companies around data. GDPR one-year anniversary. This is a real issue. How do you turn those regulatory constraints around data, because what it means is more complexity around how to deal with the data. How do you turn that into an advantage? Obviously software abstraction certainly helps in tech, but customers are trying to move faster with cloud. They can do that for all those reasons we talked about earlier. But now you got complexity around regulation. >> I think first off, from a data warehouse perspective we were built with security and compliance in mind from day one, right? So you build in things like encryption, always-on encryption. You build things like role-based access controls. Things like key management, right? And then when you think of Informatica within the data pipeline getting data from sources in and out of Snowflake, then you build additional data quality, data governance tools on top of that. Things like data catalog, right? Where you can now just go discover what data you have out there, what data are you moving into the cloud, and what is the lineage of that data. >> Talk about this migration and movement because that becomes, people are generally skeptical when they hear migration like, oh my god, migration. If they know it's going to cost some money or potentially technical risk.
What's, how do you guys handle the migration in a way that's risk-free? >> I'll take that one. I'd say one of the things that we really put in front of all of our migration approaches for customers is the enterprise data catalog. And using the machine learning capabilities in the catalog to take what is a very complex landscape and make it very understandable and accessible to the business. But then also understand how it's all put together. Where data's coming from, where it's going, who's consuming it. And once you have that view and that clarity of how things are put together it actually means you can take a use-case-based approach to adoption of the cloud and moving data. So you're actually realizing business value incrementally as you're moving. Which I think is really key, right? If you do these massive multi-year projects and it takes a year to get any results it's not going to fly anymore, right? This is a much more agile world and so we're really empowering that with the intelligence around data. >> Digital transformation has got three kinds of categories we find when we poll people and do the research. You got the early adopters who have a full team, they're cloud native, they're jammin' and they're DevOps rockstars. They're kicking ass taking names. Then on the other end of the spectrum you got you know, fear, oh my god, like I don't really have the talent. I'm going to do some, study it, spec it out, we got to figure it out. Then you have people who are kind of like, you know, the fast followers, influenced kind of like focused. They tend to break down in the middle of projects. This seems to be the pattern. They get going and they get stuck in the mud. This is a real issue around culture and people. So I got to ask you, you know, a lot of these challenges around people and culture is a huge skills gap.
What is the biggest hiring skills gap that needs to be filled so that people can be successful, whether they've got a rockstar team or a smart team that just has to re-skill? Or how do you take a project that's stuck in the mud and reboot it? These are challenges. >> I think one of the nice things about Informatica is that there are 100,000 folks out there who are familiar with Informatica's approach to implementations. So by bringing our technologies and embracing these journeys, we're actually empowering customers to not have to go get coders and data scientists. They're using some of those same data engineers, but now they're bringing data to the cloud. >> And I think along the same lines, we usually think of practitioners, right? I need data scientists, I need more data engineers. I think a valuable asset that's becoming clearer now is to have a new breed of data analyst, right? One who understands how to put AI and machine learning together, how to start to grab all of the data that's out there for customers: structured data, semi-structured data, and make sure they've got a single strategy for how to become data-driven. >> Give an example of some customers you guys are working with together, using Snowflake and Informatica. What are they doing? What are some of the use cases? What are some of the applications? >> Yeah, so I think one of the biggest use cases is data warehouse modernization, right? So you have the existing on-premise data warehouses. And when I talk to customers, I always like to ask: realistically, when you have a new use case on your on-premise warehouse, how long is it going to take you to actually see your first piece of data? I don't know a lot of people who have extra capacity just hanging around in their warehouse, right? When you think about it, they have to make business cases, they have to get new hardware, new licenses.
It could take six months to see their first piece of data. So I think it's a tremendous accelerator for them to go to the cloud. >> So the main thing there is agility. >> Yes, absolutely. >> Fast time to value. How's business with Snowflake? What's going on with you guys? What other use cases are you seeing besides the modern data warehouse? >> Sure, John, I can start with business in general. These are very exciting times at Snowflake right now. Late last year we closed a funding round of $450 million in growth funding. That brings our total funding to just over $920 million. Our valuation doubled to $3.9 billion, which puts us in the top 25 highest-valued private U.S. tech firms. Like I mentioned before, we tripled the number of employees to over a thousand, across nine countries globally, and we're going to expand to 20 or more in the next 12 months. And then in terms of my favorite part-- >> What's been driving that traction? Why this success? What's been the aha moment for customers with Snowflake? >> Yeah, when I think about what customers try to do in their data journey, there are probably three key things. Number one, they want to get access to all their data, right? And they want to do that in a very fast and economic way. They want to be able to get all the different varieties of data that are out there. All the modern data types, right? Both the structured data, their ERP and CRM systems, things about customers and products and sales transactions, and then all this modern data from web and social, behavioral data, machine-generated data from IoT. They want to put it all together. They don't want to have different, disparate systems to go and process this and then try to bring it back together. That's been the challenge: the complexity and the cost. And what we've done is start to remove those barriers. >> You know, I love the term now, because I hated it when it came out: Data Lake. During the Hadoop days we heard Data Lake.
And then it turned into a data swamp. You're starting to see that get fixed a little bit, because what people are afraid of is throwing all that data into a data swamp. They really want to get value out of it. This was a hard thing in the early days of Hadoop. It was cool technically to be putting Hadoop clusters together and standing them up, but then it's like, where's the value? >> I think the Data Lake concept in essence makes a lot of sense, because you want to get all your data in one central place so you can ask questions across all the different data types and all the different data sources. The challenge we had was that the traditional data warehouse couldn't support the new data types, the diversity, the sheer volume. And then you had newer NoSQL-like systems like Hadoop that could start to address the sheer mass of data, but they were so complex that you needed an army, and you still do need an army, and there are limitations around performance and other issues, and so few data projects were making it into production. I think we still have a very small success rate when you think about data projects that actually make it to production. This is where Snowflake comes in. Because we had the luxury of building it from the ground up, we saw the need to use a relational SQL database, because SQL is still an amazingly expressive language and people have invested in skill sets and tools, and then to support the new semi-structured data types, all within the same system, right? All within a SaaS model, so you can start to remove complexity. We have a self-managed SaaS offering, so customers don't have to worry about all the operational lifting. They can go and get insight from the data. And then because of the cloud, they can take advantage of the elasticity and the scale, and pay for what they use. >> What was the big bet on Snowflake that paid off? You had to kind of hone it down.
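The point above about querying structured and semi-structured data in one system can be illustrated with a toy sketch. Snowflake does this in SQL over a variant-style column; here plain Python plays that role, with invented sample rows, purely to show the idea of one query surface over both shapes of data.

```python
# Toy version of querying structured and nested semi-structured rows
# side by side: a dotted-path extractor stands in for SQL path syntax.
# The sample rows and field names are hypothetical.

rows = [
    {"order_id": 1, "amount": 120,
     "meta": {"channel": "web", "device": {"os": "ios"}}},
    {"order_id": 2, "amount": 80,
     "meta": {"channel": "store"}},
]

def extract(row, path, default=None):
    """Follow a dotted path into nested dicts, e.g. 'meta.device.os'."""
    cur = row
    for key in path.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return default
        cur = cur[key]
    return cur

# One "query" spans the flat column and the nested one.
web_total = sum(r["amount"] for r in rows
                if extract(r, "meta.channel") == "web")
print(web_total)                                   # 120
print(extract(rows[1], "meta.device.os", "unknown"))  # unknown
```

The value of doing this inside one engine, as described above, is that missing nested fields degrade gracefully instead of forcing a separate system and a join.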
>> The biggest bet, John, was architecting a database from scratch. Because if you look at all the other solutions out there, the fastest time to market is to take an architecture that's existed for a decade or so and wrap it on a cloud. And that gets you some benefits of the cloud. For instance, no upfront costs for implementing hardware in the data center. You can offload some of the management and some of the maintenance to the cloud providers. But like I mentioned before, you can't scale automatically. You can't take advantage of infinite scale, right? Because these systems were designed in an on-premise world that thought in terms of finite resources. So I think our big bet was creating a new architecture. That was a big risk, but luckily it's paid off well. >> Big risk, big payoff. Rik, talk about the ecosystem. You guys have a big partner strategy. You have to. >> Yep. >> You guys are all about integration points, and I don't mean that in a bad way. Slack is going public, so I'll use them as an example. Slack is cloud-based software, but what made them really big, besides reinventing the message board and IRC chat, is that they have huge integration points with all the key players, and that really fed their growth. As a metaphor it's not directed at you guys specifically, but you are very integration-partner oriented. >> Yeah. >> How is that playing out? I'm sure the strategy hasn't changed; it's still continuing. Give us the update, how's that going? Snowflake here on theCUBE is a great example. This is core to Informatica. Take a minute to explain that strategy. >> Well, I think the beginning of the journey with any of our ecosystem partners does start with the connectivity layer. But honestly, moving data from point A to point B, that's the tip of the iceberg, right?
And so we've really focused on addressing all the challenges in the entire data journey. So first of all, how do I even find the data to bring there? Once I've found it, can I connect to it? Do I have access to the data? Can I bring it to the right targets where the customer wants it consumed? But then once the data is there, is it usable, is it clean? If I'm doing customer 360, do I need to get my golden records? Or, you mentioned GDPR: our whole data protection focus, trying to create a perimeter between different parts of the enterprise, automatically applying masking, encryption, those sorts of things. So we're really focused on integrating that as tightly as we can and making it seamless for customers to tap into those capabilities when they need them. >> I mean, feeding data to machine learning and then powering AI is a great example. If you don't have the right data at the right time for the machine learning, the AI doesn't work well. And applications that are going to use machine learning need access to data as fast as possible. Lag really hurts everything. This is a huge issue. >> Yeah, we're looking at complete acceleration of that whole data discovery phase, to build your models and train them. But to your point, garbage in, garbage out, right? The old adage is still applicable today. And you've also got security issues. What happens if your training data includes sensitive code names that suddenly show up in your models, right? There are all these issues. But then you take those models and operationalize them as well. Again, the inputs need to be clean. >> Cloud or on-premise, final word. I'll get both your takes on it. Obviously you're a data warehouse in the cloud. For the customers that have an on-premise dynamic, whether it's legacy or whatever: I've got to move to the cloud.
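The automatic masking mentioned above can be sketched simply: replace sensitive fields with a one-way hash before data leaves a protected zone, so pipelines can still join on the value without seeing it. This is a hedged illustration, not Informatica's actual mechanism; the field names and policy are invented.

```python
# Masking sketch: sensitive fields are replaced with a truncated
# SHA-256 digest. Deterministic, so two records with the same email
# still match after masking. Field list is hypothetical.

import hashlib

SENSITIVE = {"email", "phone"}

def mask(record, fields=SENSITIVE):
    out = dict(record)
    for f in fields:
        if f in out and out[f] is not None:
            out[f] = hashlib.sha256(str(out[f]).encode()).hexdigest()[:12]
    return out

rec = {"id": 7, "email": "a@example.com", "amount": 40}
masked = mask(rec)
print(masked["id"], masked["amount"])    # unchanged: 7 40
print(masked["email"] != rec["email"])   # True: email is masked
```

Note that deterministic hashing preserves joinability but is weaker than salted or tokenized masking; a production perimeter would add key management on top.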
I'm eventually going to have some cloud, and it's going to look a certain way. What do they do? What's the state of the union for dealing with data that's not just in the cloud? >> Yeah. >> Yeah. >> You were first, go ahead. >> Yeah, sure. I think, again, going back to having a SaaS model: customers can pick specific projects, specific data sets, to go and try out, right? Snowflake gives them a perfect example of not even having to directly engage the cloud partner yet. They want to see if data can be ingested into the cloud in a very fast, performant way. They want to see if the security meets their needs. They want to test out all the different things around management and ease of use. They can do that with Snowflake, again, in a very low-risk way. Because we are a SaaS platform, we've got a great model of elasticity; customers can pay as they go just to try it out. So for me, when I think of these customers that are stuck and trying to make a decision, I say: look, try Snowflake. It's a very risk-free way to start to analyze some data sets, and if it works for you, then you've got a proof point for starting to move more and more workloads into the cloud. >> Rik, digital transformation. What are customers doing? What's the playbook? >> Yeah, I think the recipe is, one, laser focus on value, right? Keep your eyes on how you're going to get value as quickly as you can from this transformation. The second thing is, understand what you have. Understand your existing landscape. The third piece is: go. Get started, because the case for the cloud is so compelling for customers. I don't know a single customer I talk with who is not already on the cloud journey. So it's really about making sure you get business value as you proceed down that journey. >> Get the proof points up front. >> Absolutely. >> Take smaller steps. >> Yep, incremental. >> Show the value. Sounds like agile DevOps. Guys, thanks for coming on. Good to see you.
It's theCUBE coverage here in Las Vegas. I'm John Furrier, your host for theCUBE, along with Rebecca Knight. Two days of wall-to-wall coverage. We'll be back with more after this short break. (dramatic music)

Published Date : May 21 2019

Ken Ringdahl, Veeam, & Mark Nijmeijer, Nutanix | Nutanix .NEXT Conference 2019


 

>> Live from Anaheim, California, it's theCUBE, covering Nutanix .NEXT 2019. Brought to you by Nutanix. >> Welcome back, everyone, to theCUBE's live coverage of Nutanix .NEXT here in Anaheim, California. I'm your host, Rebecca Knight, along with my co-host, John Furrier. We have two guests for this segment. We have Ken Ringdahl, the Vice President of Global Alliance Architecture at Veeam. Thanks so much for coming on. You're a Cube alum, returning to the show. >> Great to be here again. >> And we have Mark Nijmeijer. He is the director of product management for data protection at Nutanix. Thank you for coming on theCUBE. So one of the big announcements today is Nutanix Mine. I want to talk to you and ask you, Ken: what brings Nutanix and Veeam together to create Nutanix Mine? >> Yeah, sure. We're super excited. We've been partners for many years. We actually brought a product to market together last year, Veeam Availability for Nutanix AHV, which added support for primary workloads. But we hadn't been working together on the secondary side, right where our backups land. And it became very clear from our customers that they really wanted that seamless, turnkey experience. So we started talking together, and really, this is over a year in the making, right? We came together, we started brainstorming, and it became very clear there were a lot of synergies between the companies and in what we could deliver to our customers. So it became obvious: hey, let's bring this together. It was never a question of whether; it was how do we do it. >> And what were the problems you were trying to solve here? What were the issues you were hearing from customers?
So when we talk to customers, a lot of the complaints our customers voice are around the complexity in their backup infrastructure, right? Nutanix is known for providing simplicity in the primary infrastructure, reducing the complexity you typically have in your three-tier architecture. Nutanix Mine provides the same kind of simplicity for your backup infrastructure: a hyperconverged solution that includes the Veeam software, providing data protection services for any workload running in your data center. >> Integration is a big part of the modernized hybrid cloud, with on-premises private cloud. As you guys know, integrating is not always that easy. This is pretty important. You guys have been very successful with your partnering; your product has been successful. Revenues actually show that. As the cloud comes into the picture, a lot of people have been tweaking their game a little bit on the product side because of the unique differences with cloud. So with multi-cloud, private cloud, and hybrid, what's changing in the customer's mind right now? Because they've got their on-premises environment pretty solid, but operationally it feels like cloud. How does it affect DR? Because this is going to be one of the big conversations. >> Yeah, no question. When we talk to our customers about how they're protecting their data, what we hear from a lot of them is: hey, we want to leverage the cloud for a number of things. And I think the cloud has gone through an evolution, right? It's just like anything: there's the hype, hey, it can do all these things, and then people come back to reality. What we see a lot of our customers doing is using the cloud for long-term data retention, and using it as a secondary DR site. You go back five years, and customers, especially large customers, all had two physical data centers.
So now what we're seeing is a lot of our customers have one physical primary data center, but they're leveraging the cloud as their DR site, right? They're moving their data there, and with our recovery capabilities, you can actually get a cloud workload recovered in a disaster scenario quite rapidly. That's been a major change, especially over the last couple of years. >> And if you really look at integration, the Nutanix Mine solution and platform provide integration in six different areas. Integration in sizing, making it very easy to size; we've identified some form factors we're building into Nutanix hardware. It's very easy to buy: a single SKU that provides the hardware, hardware support, and software from Nutanix, plus the software from Veeam. It's easy to deploy: a very automated installer turns the Nutanix appliance into a Mine appliance in a matter of minutes. It's easy to manage, with integrated dashboards. It's easy to scale, scaling out for capacity but also for increased performance. And then there's integrated support, where we have a joint support model between the two companies to really help our customers in case there are issues. >> So why did you choose each other? What was the courtship like, and how did the relationship evolve? >> So if you look at Veeam and Nutanix, we both really focus on quality and on providing simplicity for our customers. That was very apparent from the beginning: we have the same viewpoints and the same mantras, basically, around simplicity and quality. Both of our NPS scores are definitely among the highest in the industry, something that is practically unheard of. So it was very natural, I think, for the companies to come together and provide value together.
Yeah, I mean, we're maniacal about customer success, customer support, and customer satisfaction. That was very clear early on. Veeam is a pure software company, in a way, and we need a partner in order to deliver a full-stack solution. With Nutanix there are just a lot of synergies: the culture of the companies, the size of the companies, the age of the companies. It's just a great partnership and a great fit, where we're both moving in the same direction, in concert. >> Both hard-charging cultures, too: entrepreneurial, high quality, focused on the customer, but hard charging. You guys move fast. Well, I've got the two experts here on data protection, so I've got to ask you about my favorite topic: ransomware. People want to get rid of that tape; I've got to get stuff back faster on recoveries. Ransomware really highlights the data protection scenario, because attackers target departments that may be understaffed or vulnerable, or that just don't fix their problem. They go back to the well every time: make some cash and come back. This is where software can solve a lot of the problem. What's your view of the whole ransomware thing? Because it's become huge. >> Yeah, no question. We hear this from a lot of our customers, and of course we can't name names, but we've had many customers come to us, and unfortunately it's after the fact: I had a ransomware attack, and I lost all this, and now I can't let it happen again. From a backup strategy perspective, it's still important to keep an air gap. The folks building these ransomware attacks are very intelligent. They've gotten extremely intelligent in how they move from one system to another, and they even hide out.
So you eliminate a ransomware attack, and that thing can come right back. You restore a backup that's a month old, and the malware is sitting in it, waiting. So having a solution that can actually test your backups before you put them in production, having an air gap, having immutability on some of your backup data: these are all things we talk to our customers about. >> You make a good point about the air gap, because I was just talking to a customer about this. They paid the ransom but didn't fix the problem. So at the end of the month, when the attackers need some cash, they shake them down again. The well is there; they keep coming back. So there's a community of data protection professionals getting together to try to get ahead of this problem. >> And then the other aspect is basically being able to recover quickly: performance, right? The Nutanix platform provides high performance and throughput, so you can very quickly restore your workloads as well. >> Yeah, that's a great example of simplifying. Exactly. >> So what are the next steps for this alliance? Where do we go from here? >> So we've just finished a round of beta testing. We are going to be maniacally focused on the first hundred customers: really understanding how they put Mine in their data centers, how they use it to protect their workloads and their applications. We have a lot of plans, very interesting plans, on the roadmap, where we can build even tighter integration from a management perspective, but also from a data fabric perspective.
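The "test your backups before you trust them" idea above can be sketched simply: record a checksum when a backup is written, and verify it before restoring, so silent tampering, for example by lingering ransomware, is caught. This is purely illustrative and not Veeam's or Nutanix's actual mechanism; real products also verify restorability, not just integrity.

```python
# Backup verification sketch: a digest recorded at write time is
# re-checked before restore. Hypothetical payloads for illustration.

import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def write_backup(data: bytes) -> dict:
    """Return the backup payload plus the digest recorded at write time."""
    return {"payload": data, "digest": checksum(data)}

def verify(backup: dict) -> bool:
    """True only if the payload still matches its recorded digest."""
    return checksum(backup["payload"]) == backup["digest"]

b = write_backup(b"critical database pages")
print(verify(b))          # True: backup is intact

b["payload"] = b"encrypted-by-ransomware"
print(verify(b))          # False: restore should be refused
```

In practice the recorded digests would live on immutable or air-gapped storage, so an attacker who can rewrite the backup payload still cannot rewrite the evidence of tampering.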
Whether that's on-prem, or whether it goes into the Xi Cloud, the Nutanix Xi Cloud, there are a lot of interesting areas that Ken and I have been brainstorming and whiteboarding, and you'll see them coming out in the next two versions of the products. >> What's the big customer request? What's the big feature request? What's the big ask from customers for you guys together? >> At the end of the day, our customers are really asking for simplicity. They want to simplify their environment. IT is moving from specialists to generalists, and they want a system that works well together, that's going to lower their costs, and they want peace of mind. They want to know their backups are protected; they want to know they can restore. And that's really what we're focused on providing to our customers. >> And reliability: making sure it works a hundred percent. Any new things emerging out of the multi-cloud trend that you see coming around the corner, that you're getting ready for to help customers simplify? Any signals from this multi-cloud equation? >> So one of the things I see is that the lines between primary, secondary, and tertiary are really blurring. Also, the lines between on-prem and cloud are blurring as well, and you can replicate data and replicate backups really, really efficiently to wherever they need to be. So I really see that as a core strength, enabling value that plays into multi-cloud. >> A true operational model across whatever environment, and you can still do the tiering and the things you need to do. >> Yeah, no doubt. Flexibility, and being able to support multiple environments: that's absolutely what we're after. What we leverage as part of the Nutanix ecosystem is that breadth of coverage, but also giving customers choice. >> Just talking with Rebecca about this: we love data protection.
This is a huge conversation that used to be a bolt-on conversation in the old days. Now data protection, backup and recovery, and disaster planning are all part of an operating model, a holistic picture. How is that going? Are we one hundred percent there yet? Or do customers still kind of forget to design it in?
You know, you've got a you gotto sort of bring them along for the journey to knowing that they're gonna they're gonna trail behind. But for the for the early adopters and the innovators way also have to serve them as well. >> And they got there. They gotta level up themselves to it, son. Them too. They had they had the level of >> So speaking of innovation, you are two different companies. You already talked about this, its energies and the similarities in culture. But you are two companies coming together to build a product. How does that work? I mean, do you do get in the same room? Do you watch the same movies? Do you have a happy you? >> So >> get one brain working on this >> female. Vamos a distributed company. We are distributed company. So it's it's It's a lot of calls and so on. But it's it's really fun to really see it. She had come together and becoming really right. Yes, there's a lot of hard engineering problems that we have to solve in some very deep discussions around layout and things like that. But then doubling it up, working on the joint value prop and working on the joint marketing it really is a very nice wide set of off capabilities and skills that we've been working >> on. And when I went out, I mean, it is hard. It is hard to bring to two things together and work on them jointly. And we've, you know, so far been fairly successful. What I would tell you is it it brings some some advantages to us as well Because we have a best of breed platform. We have a best to breed data protection platform. You know, bringing those together bring some advantages that maybe someone that does all that together on their own don't have because it's not a focus area for them. Right? So, you know, it's our job to make sure we take advantage of that and provide some additional things for our customers that maybe they won't get out of some of those other platforms. >> Well, Mark and Ken, thank you both. So much for coming on the Cube. It was a pleasure having you. 
>> Thank you very much. >> Thanks for having us. >> I'm Rebecca Knight for John Furrier. We will have Ah, we'Ll have more from nutanix dot Next coming up just a little bit. Stay with us.

Published Date : May 9 2019



Eric Brewer, Google Cloud | Google Cloud Next 2019


 

>> Narrator: Live from San Francisco, it's theCUBE, covering Google Cloud Next 2019, brought to you by Google Cloud and its ecosystem partners.
>> Welcome back. This is day three of Google Cloud Next, and you're watching theCUBE, the leader in live tech coverage. theCUBE goes out to the events; we extract the signal from the noise. My name is Dave Volante, and I'm here with my co-host Stu Miniman. John Furrier has been here
>> all week.
>> Wall-to-wall coverage, three days. Check out cube.net for all the videos and siliconangle.com for all the news. Eric Brewer is here; he is the vice president of infrastructure and a Google Fellow. Dr. Brewer, thanks for coming on theCUBE.
>> Happy to be here.
>> Good to see you. So tell us the story of infrastructure and its evolution at Google, and then we'll talk about how you're taking what you've learned inside of Google and helping customers apply it.
>> Yeah, one of the things about Google is that it essentially makes no use of virtual machines internally. That's because Google started in 1998, which is the same year that VMware started, and VMware kind of brought the modern virtual machine to bear. So Google's infrastructure tends to be built on kind of classic Unix processes and communication, and scaling that up, you get a system that works a lot with just processes and containers. So when I saw containers come along with Docker, I said: well, that's a good model for us. We can take what we know internally, which was called Borg, a big scheduler, and we can turn that into Kubernetes and open-source it. And suddenly we have kind of a cloud version of Google that works the way we would like it to work, a bit more about the containers and APIs and services rather than the low-level infrastructure.
>> Would you infer from that comment that you essentially had a cleaner sheet of paper when containers started to ascend?
>> I
But Google influenced Lena Lennox's use of containers right, which influenced doctors use of containers, and we kind of merged the two concepts on. It became a good way to deploy applications that separates the application from the underlying machine instead of playing a machine and OS and application together, we'd actually like to separate those and say we'LL manage the Western machine and let's just deploy applications independent of machines. Now we can have lots of applications for machine improved realization. Improve your productivity. That's kind of way we're already doing internally what was not common in the traditional cloud. But it's actually a more productive way to work, >> Eric. My backgrounds and infrastructure. And, you know, I was actually at the first doctor. Calm back in twenty fourteen, only a few hundred of us, you know, right across the street from where we were here. And I saw the Google presentation. I was like, Oh, my gosh, I lived through that wave of virtual ization, and the nirvana we want is I want to just be able to build my application, not worry about all of those underlying pieces of infrastructure we're making progress for. We're not there. How are we doing as an industry as a whole? And, you know, get Teo, say it's where are we? And what Google looking that Cooper, Netease and all these other pieces to improve that. What do you still see is the the the room for growth. >> Well, it's pretty clear that you Burnett is one in the sense that if you're building new applications for enterprise, that's currently the way you would build them now. But it doesn't help you move your legacy stuff on it for, say, help you move to the cloud. It may be that you have worth loads on Crim that you would like to modernize their on V EMS or bare metal, their traditional kind of eighties APS in Java or whatever. And how does Cooper Netease affect those? That's that's actually still place where I think things are evolving. 
The good news now is that it's much easier to mix in additional services and new services using Istio and other things, and people can containerize workloads. But actually, I would say most people just do the new stuff in Kubernetes and wrap the old stuff to make it look like a service; that gets you pretty far. And then, over time, you can containerize the workloads that you really care about and want to invest in. And what's new with Anthos is that you can make some of those transitions on-prem, if you'd like, separate from moving to the cloud, and then you can decide: oh, this workload goes in the cloud; this workload I need to keep on-prem for a while, but I still want to modernize it. So you have a lot more flexibility.
>> Can you just parse that a little bit for us? You're talking about the migration service that's coming out, or is it part of
>> the Velostrata work, which can take a VM and convert it to a container? It's a newer version of that, which really gives you a manifest, essentially, for the container, so you know what's inside it and can actually use it in the modern way. It's a migration tool, and it's super useful. But I kind of feel like even just being able to run high-quality Kubernetes on-prem is a pretty useful step, because you get developer velocity, you get release frequency, you get more decoupling of operations and development, so you get a lot of benefits on-prem. But also, when you move to cloud, you can go to GKE and get, you know, a great Kubernetes experience whenever you're ready to make that transition.
>> So it sounds like what you described with Anthos, and particularly the on-prem pieces, is like an elixir to help people more easily get to a cloud-native environment and then, ultimately, a bridge to the cloud.
>> That's right: we're helping people get cloud-native benefits where they are right now, and they, on their own time, can decide
You know, not only when to move a workload, but even, frankly, which cloud to move it to, right? We'd prefer, obviously, that they move to Google Cloud, and we'll take our chances, because I think we're particularly good at these cloud-native applications. But it's more important that they are moving to this kind of modern platform; that helps them, and it increases our impact on the industry to have this happen.
>> Help us understand the nuance there, because there are obvious benefits of being in the public cloud: being able to rent infrastructure, opex versus capex, managed services, etcetera. But to the extent that you can bring that cloud experience to your premises, to your data, that's why many people want that hybrid experience, for sure. Other than those obvious benefits, what are the other nuances of actually moving into the public cloud, from an experience standpoint and a business-value perspective?
>> Well, one question is how much rewriting you have to do, because it's a big transition to move to cloud, and it's also a big transition to rewrite some of your applications. So in this model we're actually separating those two steps, and you can do them in either order. You can lift and shift to move to cloud and then modernize, but it's also perfectly fine to say: I'm going to modernize on-prem, do my rewrites in a safe, controlled environment that I understand and that is low-risk for me, and then I'm going to move it to the cloud, because now I have something that's really ready for the cloud and has been thought through carefully. Having those two options is actually an important change with Anthos.
>> You offered some stats, I think Thomas mentioned them, that eighty percent of the workloads are still on-prem. We hear that all the time. And some portion of those workloads are mission-critical workloads with a lot of custom code that people really don't want to necessarily freeze.
And a lot of times, if you're going to migrate, you have to freeze. So my question is: can I bring some of those Anthos and other Google benefits on-prem and not have to freeze the code, not have to rewrite, essentially leave those there, and take my other stuff and move it into the cloud? Is that what people are doing? And can I mix?
>> Things mix. But I would say the beachhead is having well-managed Kubernetes clusters on-prem that you can use for new development, or as a place to do your rewrites or partial rewrites. You can mix VMs and mainframes and Kubernetes; they're all mixable. It's not a big problem, especially with Istio, where it can make them look like they're part of the same service
>> or framework, right?
>> So I think it's more about having the ability to execute modern development on-prem and feel like you're really able to change those apps the way you want, on a good timeline.
>> Okay, so I've heard several times this week that Anthos is a game changer; that's how Google, I think, is looking at this, and you guys are super excited about it. So one would presume that that eighty percent on-prem is going to really start to move. What are your thoughts on that?
>> I think the way to think about it is that all the customers you talk to actually do want to move their workloads to cloud; that's not really the discussion point anymore. It's more about the reasons they can't, which could be: they already have a data center they've fully paid for; there are regulatory issues they have to get resolved; this workload is too messy and they don't want to touch it at all; the people that wrote it aren't here anymore. There are all kinds of reasons. So I feel like the essence of it is: let's just interact with the customer right now, before they make a decision about their cloud, and help them. And in exchange for that, I believe we have a much better chance to be their future cloud, right?
Right, because we're helping them. But also, they're starting to use frameworks that we're really good at. If they're betting on Kubernetes and containers, I like our chances for winning their business down the road.
>> You're earning their trust by providing those capabilities.
>> That's really the difference: we can interact with those eighty percent of workloads right now and make them better.
>> Alright, so Eric, a term we've heard a bunch this week is "we're meeting customers where they are." Now, Dave and I are analysts, so we can tell customers the stuff you can't: you should listen to Google, they're really smart, they know how to do these things, right? Open up: tell us some of those gaps, the learnings you've had. And we understand migration and modernization are really challenging things. What are some of those things that customers can do?
>> On the basic issues, I would say one thing you notice when using GKE is that, huh, the OS has been patched for me magically. We had these huge security issues in the past year, and no one on GKE had to do anything, right? They didn't restart their servers; we didn't tell them, oh, you get downtime because we have to deal with these massive security attacks. All of that was magically handled. Then you say, oh, I want to upgrade Kubernetes; well, you can do that yourself. Guess what: it's not that easy to do. Kubernetes is a beast, and it's changing quickly, every quarter. That's good in terms of velocity and trajectory, and it's the reason that so many people can participate, but at the same time, if you're a group trying to run Kubernetes on-prem, it's not that easy to do right. So there's a lot of benefit in us just saying: we update clusters all the time.
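As a rough illustration of the progressive, test-first cluster rollout described in this exchange, here is a hypothetical sketch. None of these function names are GKE APIs; the cluster fields and callables are invented for illustration only.

```python
# Hypothetical sketch of a progressive cluster rollout: upgrade the least
# critical clusters first, run the customer's tests after each one, and
# halt the moment a test fails. Not a real GKE API; all names are invented.

def rollout_order(clusters):
    """Least-critical clusters first, so a bad upgrade is caught early."""
    return sorted(clusters, key=lambda c: c["criticality"])

def progressive_upgrade(clusters, upgrade, run_tests):
    """Apply `upgrade` cluster by cluster; stop at the first test failure.

    Returns (names of clusters upgraded successfully, failing cluster or None).
    """
    upgraded = []
    for cluster in rollout_order(clusters):
        upgrade(cluster)
        if not run_tests(cluster):
            return upgraded, cluster  # halt the rollout here
        upgraded.append(cluster["name"])
    return upgraded, None
```

The same loop applies to policy changes (quota, access control): the `upgrade` callable is simply whatever change is being rolled out progressively.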
We're experts at this: we will update your clusters, including the OS and the Kubernetes version, and we can give you monitoring data and tell you how your clusters are doing. This stuff honestly is not core to these customers, right? They want to focus on their advertising campaign, or their oil and gas workloads; they don't want to focus on cluster management. So that's really the second thing.
>> So they get that operating model. If I do Anthos in my own data center, in the same kind of environment, how do I deal with things like, well, I need to worry about change management, testing, and all my other pieces?
>> For the most part, the general answer to that is: you use many clusters. You could have a thousand clusters if you want; there's good reason to do that. One reason is that we'll upgrade the clusters individually, so you could say: let's make this cluster a test cluster; we'll upgrade it first, and we'll tell you what broke, if anything. If you give us tests, we can run the tests, and then, once we're comfortable that the upgrade is working, we'll roll it out to all your clusters. It's the same thing with policy changes: if you want to change your quota management or access control, we can roll out that change in a progressive way, so that we do it first on clusters that are not so critical.
>> So I've got to ask a question. You're a software guy, and you're approaching this problem from a real software perspective. There's no box; I don't see a box. There are three examples in the marketplace, Azure Stack, Oracle Cloud at Customer, and Amazon Outposts, where there's a box. No box from Google, pure software. Why no box? Do you need a box? The box guys say you've got to have the box. Do you have a box?
>> It's more like, I would say, you don't have to have a box,
>> but there is a box.
>> Okay, that's
If they're going to buy new hardware, they might as well move to cloud the police for some of the customers. And it turns out we can run on. Most of their hardware were leveraging VM wear for that with the partnership we announced here. So that's generally works. But that being said, we also now partnerships with Dell and others about if you want a box Cisco, Dell, HP. You can Actually, we'LL have offerings that way as well, and there's certainly good reason to do that. You can get up that infrastructure will know it works well. It's been tested, but the bottom line is, uh, we're going to do both models. >> Yeah, okay. So I could get a full stack from hardware through software. Yet through the partnerships on there's Your stack, >> Right And it'll always come from Partners were really working with a partner model for a lot of these things because we honestly don't have enough people to do all the things we would like to do with these customers. >> And how important is it that that on Prem Stack is identical from homogeneous with what's in the public cloud? Is it really? It sounds like you're cooking growing, but their philosophies well, the software components have to be >> really at least the core pieces to be the same, like Uber Netease studio on a policy management. If youse open source things like my sequel or Kafka or elastic, those auto operate the same way as well, right? So that when you're in different environments, you really kind of get the feeling of one environment one stroll plane used. Now that being said, if you want to use a special feature like I want to use big query that's only available on Google Cloud right, you can call it but that stuff won't be portable. Likewise is something you want to use on Amazon. You can use it, and that part will be portable. But at least you'LL get the most. Your infrastructure will be consistent across the platforms. >> How should we think about the future? 
You guys, I mean, just without giving away, you know, confidential information, obviously you're not going to do that, but just philosophically: where are you going? When you talk to customers, what should their mindset be? How should they be preparing for the future?
>> Well, I think there are a few bets we're making. You know, we're happy to work on traditional cloud things, with virtual machines and disks and lots of classic stuff that's still important, still needed. But I would say a few things are interesting that we're pushing on pretty hard. One, in general, is this move to a higher-level stack of containers and APIs and services, and that's Kubernetes nowadays, and Istio, and that genre. But then the other thing I think is interesting is that we're making a pretty fundamental bet on open source, and it's a deeper bet than others are making, right, with partnerships with open-source companies where they're helping us build the managed version of their product. I think that's really going to lead to the best experience for each of those packages, because the people that developed the package are working on it, and we will share revenue with them. So Kubernetes is open source, TensorFlow is open source; this is the way we're going to approach this thing, especially for hybrid and multi-cloud, where really, in my mind, there is no other way to do multi-cloud other than open source, because the space is too fast-moving. You're not going to say, oh, here's a standard API for multi-cloud, because whatever API you define is going to be obsolete in a quarter or two, right? What we're saying is that the standard is not a particular standard per se; it's the collection of open-source software that evolves together, and that's how you get consistency across the environments: because the code is the same. And in fact there is a standard, but we don't even know exactly what it is, right?
It's implicit in the code.
>> Okay, but suppose other competitors say: okay, we love open source too, we'll embrace open source. What's different about Google's philosophy?
>> Well, first of all, you can just look at our very high level of contribution back into the open-source packages, not just the ones that we're doing. You can see we've contributed things like the Kubernetes trademark, so that means it's actually not a Google thing anymore; it belongs to the Cloud Native Computing Foundation. But also, the way we're trying to partner with open-source projects is really to give them a path to revenue, alright, give them a long-term future. The expectation is that makes the products better. And it also means that we're implicitly a preferred partner, because we're the ones helping them.
>> Alright, Eric, one of the things that caught our attention this week is really extending containers with things like Cloud Code and Cloud Run. Can you speak a little bit to that, and to where that's going directionally?
>> Yeah, Cloud Run is one of my favorite releases of this week. Cloud Code is great also, especially its VS Code integration, which is really nice for developers. But I would say Cloud Run says: we can take any container that has a stateless thing inside and an HTTP interface, and make it something we can run for you in a very clean way. What I mean by that is you pay per call. In particular, we'll listen twenty-four-seven in case a call comes, but if no call comes, we're going to charge you zero, right? So we'll eat the cost of listening for your packet to arrive, but if a packet arrives for you, we will magically make sure you're there in time to execute it. And if you get a ton of connections, we'll scale you up; we could have a thousand servers running your Cloud Run containers. And so what you get is a very easy deployment model that is a generalization.
Frankly, of functions: you can run a function, but you can also run not only a container with a managed runtime, App Engine style, but any arbitrary container with your own custom Python and image-processing libraries, whatever you want.
>> You're our last guest at Google Cloud Next 2019, so thank you. Put a bow on the show this year. You've obviously got the bigger, better, shinier Moscone Center, it's awesome, and definitely a bigger crowd; you see the growth here. So tie a bow on it: tell us what you think, take us home.
>> I have to say it's been really gratifying to see the reception that Anthos is getting. I do think it is a big shift for Google and a big shift for the industry, and, you know, we actually have people using it, so I feel like we're at the starting line of this change. But it's really resonated well this week, and it's been great to watch the reaction.
>> Everybody wants their infrastructure to be like Google's, and this is one of the people who made it happen. Eric, thanks very much for coming on theCUBE.
>> A pleasure.
>> Alright, keep it right there, everybody. We'll be back to wrap up Google Cloud Next 2019. My name is Dave Volante; Stu Miniman and John Furrier will be back on set. You're watching theCUBE. We'll be right back.
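The Cloud Run model Brewer describes above, any stateless container with an HTTP interface, billed per call and scaled from zero, can be sketched as a minimal service like the following. This is an illustrative example, not Google's code; the one real Cloud Run convention shown is that the container listens on the port given in the `PORT` environment variable.

```python
# A minimal stateless HTTP service of the kind Cloud Run hosts: it keeps no
# local state, so the platform can run zero copies or a thousand copies of it
# and route any request to any instance. Illustrative sketch only.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle(path: str) -> str:
    """Pure request logic: everything the response needs arrives with the request."""
    name = path.strip("/") or "world"
    return f"Hello, {name}!"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = handle(self.path).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve():
    # Cloud Run-style contract: bind the port the platform hands the container.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```

Because `handle` keeps no state between calls, any instance can answer any request, which is what makes per-request billing and scale-to-zero possible.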

Published Date : Apr 11 2019



Syamla Bandla, Facebook | 7th Annual CloudNOW Awards


 

>> From the heart of Silicon Valley, it's the Cube, covering CloudNOW's seventh annual Top Women Entrepreneurs in Cloud Innovation Awards. (upbeat music)
>> Lisa Martin, on the ground with the Cube at Facebook headquarters. We are at the seventh annual CloudNOW Top Women Entrepreneurs in Cloud Innovation Awards event, joined by one of the 2016 winners. Welcoming you back to the Cube, Syamala Bandla; you are now at Facebook, a Director of Production Engineering. Welcome back to the Cube.
>> Thank you, Lisa.
>> So we are at Facebook headquarters, and we were talking with Jocelyn DeGance Graham a little bit ago, who is the founder of CloudNOW. It's their 7th annual event, the first time at Facebook, and you, a past winner, are largely responsible for getting Facebook to say yes. Tell us a little bit about how you are paying it forward as a winner and enabling this year's awards to have such a boost.
>> So I attended the CloudNOW event at the Google campus in 2016, and when I walked out of the event, from not just being an award recipient but meeting the other award winners as well as the speakers, I was completely pumped up and charged. When I joined Facebook last year, I saw how deeply Facebook actually cares about diversity and inclusion, and I know that cloud computing and converging technologies are an area where women are underrepresented. So when I pitched to my leadership team that, as something we care so much about, we should be hosting this year's event, they jumped on board immediately.
>> So when we decided finally to host it and as we were planning and all the line up of, great line up of speakers and the winners. We couldn't have thought about anybody else to do the opening remarks than Sheryl Sandberg. I know we had, she had, a very very tough schedule but my leadership team and I, we were persistent, and it's an honor to have her here to do the opening remarks. >> Absolutely yes. So talk to us a little bit about your tenure here at Facebook, you mentioned joining about eight or nine months or so ago and being a culture that fosters diversity, gender diversity, thought diversity. Tell us a little bit about your team in production engineering and how that culture, how are you helping to grow that? >> That's a great question. So definitely I'll be very honest, we have a lot more to do. Production engineering predominately in the industry is male-dominated. But just this year, just in the teams around me, we have hired quite a bit of female managers as well as individual contributors. And the support we get from our peers, the open thoughts, the collaboration, it's just great to be in an environment where we can foster that culture. >> And one of the things too, tell me about your background. Is your education background in a STEM field? Your engineering background? >> Yes. >> Yes, so talk to us about one of the things that's also challenging that we're all very familiar with, with women in technical roles, is the under representation, but it's also being able to retain women. You are establishing a great tech career yourself, what's your advice for inspiring your generation, and then the younger generation that you're helping to hire here at Facebook, to stay in technology. >> So cloud computing or technology, I mean we all have to pay it forward. I think we as women who are in influential positions and can make an impact on the younger generation I think need to absolutely do a lot more to pay it forward. 
It is not only about awareness: wherever and whenever you get opportunities, try to mentor students. Early in their careers, encourage them to believe in themselves, to reach out for mentors and sponsors, and to network, which I think girls and women in general shy away from. I would say that by networking and meeting people in the industry, they would learn a lot more early in their careers.
>> Great advice. One of the things that's also fantastic, and a first for this 7th annual CloudNOW event, is that not only is it sold out, with over 300 attendees expected here tonight, both men and women, but there was also no advertising for ticket sales. This was all word of mouth from the sponsors, Facebook, Google, Intel, and past winners like yourself. So Jocelyn talked about that groundswell, that momentum that we're all feeling. What are your expectations for the event tonight?
>> First of all, we are super thrilled and excited. When I look at the guest list, when I look at the speakers, when I look at the winners... it was just the word of mouth, as we started telling people who the speakers would be, that we would have a VC panel, and who the winners were. I think the word of mouth really paid it forward, and we're super thrilled to have close to 300 people attending this evening.
>> And there's a really nice, diverse set of winners, as you mentioned. I was chatting with Jocelyn earlier, and this is the first year that they've been able to recognize venture-backed female technical founders. And there's a variety of technologies; we're going to be speaking with all the winners tonight, from smart-home and smart-apartment technology to intelligence on blockchain. So there's diversity there, and not just in the technologies but also in the backgrounds of some of these entrepreneurs: one of them is a lawyer who was a practicing attorney for 17 years before founding Digitory Legal.
Just really interesting backgrounds; what are your thoughts on that? >> So when we looked, we had more than 100 nominees. It was very, very hard, and I was also part of the committee as we were going through and choosing the winners. It was very hard. But one of the things we really wanted to make sure of was that we had a diverse set of winners. Not only in their backgrounds, but also in the technology domains they were representing, which is very, very important. And as we were going through the planning deck and looking at the presentations, I can't wait to hear what they have to present. It is so thrilling to see the accomplishments and what they have achieved in their respective fields. >> And we're excited as well. Syamala, thanks for taking time to stop by and join us on the program tonight, and it was good to see you again. >> Thank you so much Lisa, it's been a pleasure being here. >> Excellent. We want to thank you for watching. Lisa Martin, on the ground at Facebook for the Cube. Thanks for watching. (upbeat music)

Published Date : Jan 29 2019

SUMMARY :

Lisa Martin talks with Syamala Bandla of Facebook at the 7th annual CloudNOW Top Women Entrepreneurs in Cloud Innovation Awards. They discuss Sheryl Sandberg's opening remarks, building a more diverse production engineering culture at Facebook, advice on mentoring, sponsorship, and networking for women in technology, and the sold-out event's diverse slate of winners drawn from more than 100 nominees.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Jocelyn | PERSON | 0.99+
Sheryl | PERSON | 0.99+
Sheryl Sandberg | PERSON | 0.99+
Google | ORGANIZATION | 0.99+
Lisa | PERSON | 0.99+
Facebook | ORGANIZATION | 0.99+
Lisa Martin | PERSON | 0.99+
Syamala | PERSON | 0.99+
2016 | DATE | 0.99+
17 years | QUANTITY | 0.99+
Jocelyn DeGance Graham | PERSON | 0.99+
Syamla Bandla | PERSON | 0.99+
Intel | ORGANIZATION | 0.99+
last year | DATE | 0.99+
Silicon Valley | LOCATION | 0.99+
first time | QUANTITY | 0.99+
first year | QUANTITY | 0.98+
One | QUANTITY | 0.98+
more than 100 nominees | QUANTITY | 0.98+
CloudNOW | EVENT | 0.98+
over 300 attendees | QUANTITY | 0.98+
one | QUANTITY | 0.98+
tonight | DATE | 0.98+
CloudNOW | ORGANIZATION | 0.97+
this year | DATE | 0.96+
7th Annual CloudNOW Awards | EVENT | 0.96+
both | QUANTITY | 0.95+
CloudNOW Top Women Entrepreneurs in Cloud Innovations Awards | EVENT | 0.94+
First | QUANTITY | 0.94+
first | QUANTITY | 0.93+
7th annual event | QUANTITY | 0.9+
months | DATE | 0.88+
7th annual | QUANTITY | 0.87+
Digitory Legal | ORGANIZATION | 0.84+
seventh annual Top Women Entrepreneurs in Cloud Innovation Awards | EVENT | 0.82+
Syamala Bandla | PERSON | 0.79+
Google | LOCATION | 0.78+
Cube | ORGANIZATION | 0.77+
Facebook Headquarters | LOCATION | 0.7+
seventh annual | EVENT | 0.67+
close to 300 people | QUANTITY | 0.64+
nine | QUANTITY | 0.6+
about eight | DATE | 0.59+
them | QUANTITY | 0.54+

Vaughn Stewart, Pure Storage | VMworld 2018


 

>> Live from Las Vegas, it's the CUBE, covering VMworld 2018. Brought to you by VMware and its ecosystem partners. >> Hey, welcome back to Las Vegas, Mandalay Bay. Lisa Martin with Dave Vellante at VMworld 2018, day one. Dave, this has been an awesome day. >> Yeah, jam-packed, and we're almost a third of the way through, 94 guests, I think our biggest show ever. >> I think I'm going to make up a word and say it's going to get awesomer, because we have one of our distinguished alumni, Vaughn Stewart, >> Wow. VP of Technology Alliances and Strategy at Pure Storage. Vaughn, great to have you here. >> Lisa, Dave, thanks for having us back. >> Great to see you again. >> Yes, ditto. >> We had a blast hosting the CUBE at Pure//Accelerate just a couple months ago. >> We got T-shirts. >> But we were sporting our, yeah, in the context of the Bill Graham Civic, I feel too dressed-up actually for talking to Pure Storage. So some great momentum you guys had when we were there a few months ago, and that momentum continues: quarterly revenue earnings just announced, 37% year-over-year growth, almost 400 new customers. Gartner, fifth year in a row, you guys are a leader in the Magic Quadrant for Solid-State Arrays, wow! >> Yeah, a lot was shared last week with the financial results, right? A couple more points of color commentary, if you will: 309 million dollars, 27% growth quarter-over-quarter, 35% penetration of the Fortune 500, roughly 30% of the revenue coming from the cloud providers, say clouds number eight through 500. On the Magic Quadrant, right, five years in a row being in that upper-right quadrant. And if you look back on it historically, just the players that have come and gone and whose positions have changed while we've been the foundational element in that corner, I think that speaks to how well we know this market. On top of all that, right, Pure Storage's first acquisition, right, StorReduce. 
>> Congratulations.
>> For those of you who maybe haven't heard of StorReduce: it's a start-up whose focus is on providing data deduplication across object stores. Born in the cloud, a pure software play. I think we're going to continue to leverage that within its current focus and market area, as well as expand it as part of our cloud strategy, and even maybe bring some of it into the current on-prem product portfolio. Lots of opportunities available to us with that IP. >> So, you know, when you look back at the sort of, well, first of all, flash, Solid-State, upper right. But there's life beyond flash arrays, right. So if you look at some of the early guys, you remember Astec, if that's even how you say it, Fusion, and a lot of people predicted, oh you know, same thing, everybody's going to catch up to Pure. But you guys kept innovating, and cloud is now a flywheel for you guys, you really went hard after it. So I wonder if you could talk about the evolution and sort of phases of the company as you guys see them? >> Yeah, so for your audience, one way to look at this with a start-up is that when your founders have an idea of bringing a product to market, you have to be very laser-focused, which means there's trade-offs, right; there's a lot of things that you can't do so that you can bring your technology to bear, your product, and you've got to, you know, be able to gain market share, right. Customer revenue is kind of like the lifeblood early on. 
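The conversation doesn't go into how StorReduce's deduplication works, but the general technique, content-addressed chunk storage, can be sketched. Everything below (the class, the fixed 4 KB chunking, the in-memory dict) is an illustrative assumption for this transcript, not StorReduce's actual design; real systems typically use variable, content-defined chunks and a durable object store.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking, kept small for illustration


class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self):
        self.chunks = {}   # sha256 digest -> chunk bytes (each stored once)
        self.objects = {}  # object key -> ordered list of chunk digests

    def put(self, key, data):
        digests = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # store only if unseen
            digests.append(digest)
        self.objects[key] = digests

    def get(self, key):
        # Reassemble the object from its chunk digests.
        return b"".join(self.chunks[d] for d in self.objects[key])

    def stored_bytes(self):
        # Physical footprint: unique chunks only, however many objects share them.
        return sum(len(c) for c in self.chunks.values())


store = DedupStore()
blob = b"A" * 4096 + b"B" * 4096
store.put("backup-1", blob)
store.put("backup-2", blob)  # duplicate object costs no extra chunk storage
```

Two 8 KB objects round-trip correctly while only 8 KB of unique chunks is physically kept, which is the property a dedup layer over an object store is after.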
And we've evolved past that, right. There was the passing of the torch last year with our change in CEO, from Scott, who moved on to be chairman of the board, to bringing in Charlie, and I think we're really at the beginning of what I call Act II. Along the way, FlashArray, which is our flagship and our initial product, has helped customers adopt technologies through different business models, right: the Evergreen Storage play, us introducing non-volatile memory express into all of our products; you know, half of our customer shipments in Q2 were all NVMe, right. Allowing customers to adopt technologies in new models that they didn't have before, that aren't rip-and-replacements, has been a key to our success beyond the tech. FlashBlade is opening up net new areas of business opportunity for us, like AI and ML. And now you've got StorReduce, right, this cloud component. I would say that the legacy of Pure, that Act I that Scott built, is going to continue to run for the next couple years kind of on auto-pilot. And that's not to be dismissive of the field that's got to go out and execute every, you know, every day, every quarter, but with Charlie's vision about what we're going to evolve into, I mean, we're really, to use a baseball analogy because someone was talking Sox before the cameras went on >> Who could that have been? >> Yeah, yeah. (laughs) You know, we're in the beginning of the first inning. You know, StorReduce is just, I think, the tip of what we're going to do. We've got 1.1 billion in the bank, you know, we've got a little bit of capital to continue to invest in the portfolio. So right now the focus is still, I think there's two ways to look at this. What I find most enterprise customers want to talk about is: how do I merge three modern technologies, right? All-flash, hyper-convergence, and cloud? Give me a strategy that unifies them, not one that divides. And we can have a whole conversation on that. 
Then there's this whole other segment around analytics and AI, which you heard here in the keynote this morning with Pat. Focus area, you know, for VMware; AI is the modern version of what analytics were six years ago. And so this is something that I don't think all the practitioners here are aware of. It comes from the data science or application side into the infrastructure, and we're trying to help people make a turnkey AI-ready infrastructure through the AIRI product with NVIDIA, so there's just a lot to talk about. >> And you can see those worlds coming together with, take cloud, take AI, take data, which is what you're all about. >> Yup. >> That's kind of the innovation sandwich of the next 10 years. It ain't Moore's Law anymore, right? It's AI, applying machine intelligence to the data, and scaling with cloud. >> You know, one of the things, SiliconANGLE I think may have been at least the largest analyst firm that I saw jump on this early, around the notion of bringing your data to the infrastructure. >> Yeah, absolutely. >> And then you guys pushed in, you guys leaned in really hard about three or four years ago on the idea that the world is a hybrid model. It's not one or the other, it's all hybrid. And you even talked about the differences in the types of data sets and their computational requirements and where they may or may not be placed, as well as really leaning in on the interop requirements across the different silos. >> Yeah, that's right. >> So kudos to your research. >> Yeah, thank you, and we've quantified that. It's actually that whole idea of bringing the compute to the data, for example, wherever it resides. I mean, that's a big, big business. If you look at the size of the market of those folks trying to replicate substantially cloud on-prem, it's a 30 billion dollar business growing very, very rapidly. And you guys play in both sides of that, I mean, that's what's impressive, you're not just on-prem, you're not just in cloud, you're hybrid. 
>> Here's a good example of how cloud evolves. We're really proud of our net promoter score, right: it's 86.6, top 1% of B2B businesses, right. I look at external points of validation, whether it's a net promoter score or where an analyst firm ranks you in their Magic Quadrant or others, as: are you delivering to your customers on your promise to them, right, your marketing material? Part of why our score is so high is that the product's reliability is there and it delivers. Underpinning that is a predictive analytics technology that helps the arrays achieve greater than six-nines availability, right? That component, that's Pure1, and that was born in the cloud. That was born in AWS, and we talked about this in a session at our Accelerate conference: we've got greater than nine petabytes up there. Every time we're working on a new algorithm, for AI to make our product better for our customers, we have to download a year's worth of historical data. That takes 45 days to download and stage. So we're moving it to a hybrid model. And what's that going to allow us to do? It's instantly going to help us reduce our cost and accelerate our AI initiatives by 6x. And it's just a bridging of the technologies. Regardless of what you have, whether you have an all-flash array, you're a cloud provider, or you're hyper-converged, sometimes your product teams look at the world like, I've got a hammer and that's a nail, and what really provides sophisticated outcomes is when you can bridge the technologies based on results. >> Speaking of marketing messaging, some people, some companies, like to say they are data-driven, or that they will enable their customers to become data-driven. At Accelerate a few months ago, Pure Storage talked about data-centric architecture. Now, we all hear data is the lifeblood, data is power, data is currency, but it's none of those things unless an organization can harness it, extract the insights, and act on them immediately. >> Right. 
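A quick back-of-envelope check on the transfer Vaughn describes a moment earlier: treating the full nine petabytes in Pure1 as the download (an assumption; he may mean only a subset) and 45 days as the window implies a sustained rate of roughly 2.3 GB/s.

```python
# Rough throughput implied by the quoted figures. Both numbers come from the
# conversation; using the entire dataset size as the transfer is an assumption.
PETABYTE = 10**15
dataset_bytes = 9 * PETABYTE          # "greater than nine petabytes" in Pure1
seconds = 45 * 86400                  # 45 days of wall-clock transfer time
throughput = dataset_bytes / seconds  # sustained bytes per second

print(f"{throughput / 10**9:.1f} GB/s sustained")  # about 2.3 GB/s
```

At that scale, shipping the training and staging work to where the telemetry already lives, rather than repeatedly pulling a year of history out, is what the hybrid move buys.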
>> Talk to us about the data-centric architecture. What is that, and how have you seen it, we'll say, accelerate in momentum in the last few months? >> Great question, so thank you for bringing that up. I think on the surface, one may look at a data-centric architecture message as being, oh, that's what you would expect a storage vendor to say, right? It sounds like something that aligns to your products. And I think there was some inside baseball being shared, if you will, in that message, right? There was some telegraphing going on. Because at the core of the message, what we're trying to say is that your traditional applications tend to be more stovepiped and siloed, right? And I'll take this through two levels. What you see with taking traditional applications, or legacy apps, and virtualizing them is that now you want this mobility where you can move the application around anywhere: all-flash, or on-prem, or into the cloud. That's one form of movement. Modern applications are distributed, right? They're a collection of processes and different data sets, and the application is much more like a pipeline. And so when you look at data from the view of a pipeline, you have to stop thinking about the silo that's wrapped around the one tool that you as a developer may have responsibility for in the product or the code. >> Your God box, as it were, right. >> You've got to figure out how it works in a pipeline with others: how are you going to ingest data and hand off data? So in a data-centric architecture, we're trying to advocate that there's value in shared architectures. And in addition to this, there's been this whole market that's grown up over the last decade, initially around analytics, whose architectures were designed around DAS. 
And you have to look back a little bit to get an understanding of where we are today, which is: if you go back ten or twelve years, it was really easy, with the power of Intel, to bury a disk-based storage array, no matter what size it was and which vendor put it out. You could saturate the I/O bandwidth. Now we're at a day and age of shared accelerated storage and fast network interconnects with non-volatile memory express over fabrics, whether we're talking Ethernet or Fibre Channel. I now have latency that's within ten microseconds of direct-attached storage. I get all the benefits of shared, and I get some new architectural models that may help me with costs and efficiencies. And so you're starting to see vendors in the software space follow suit. For example, you've got Vertica releasing support for S3 on-prem. You've got VMware adding more fuel around vVols and interoperability between vVols, vSAN, and VMware Cloud. There are more partners with more activity going on that I can't share because they've got announcements coming through the second half of the year, but vCloud Air just published a new paper in July on HDFS on remote storage, regardless of the protocol. So you're seeing all these DAS-centric vendors start to say, alright, our customer base is telling us they need a shared model. So shared accelerated storage, flash, NVMe, NVMe over fabrics: it's going to fuel new architectures that are more flexible. >> So I want to follow up on that, because you're right, the data pipeline is elongating and it's getting quite complex. I mean, if you're an AWS customer, which we are actually, you use Kinesis, DynamoDB, EC2, S3, you know, Redshift, etc. Those are all sort of different proprietary APIs. Sometimes you don't know what you should do where until after you get the bill. >> Right. >> Can you help solve that problem for customers and simplify that, or are you just a piece of that chain? 
>> So we have a component within the chain, but we're working with our field and our field technologists to help advise customers, particularly around what I'd call a cloud-first strategy. So, if we look outside of storage and at cloud developers, it's function-as-a-service, for example. >> Right. 
>> So we use our own case study, right, Pure1. We got hooked into function-as-a-service within our provider. And what we found was that our ability to use multiple clouds, our ability to go hybrid cloud, and our ability to actually take our analytics, package them up, and deliver them to dark-site customers, because there's about a third of our customers that won't allow their units to phone home, okay? Three-letter acronyms that run in the federal space. Cloud-first meant that we just take that function-as-a-service and, instead of making the direct API call, put it in a container. Now, once you're containerized, I can run it on any cloud. Right, and now again, across public cloud, hybrid, into private, and it gives you a lot of flexibility. So we're working on architectures and educational conversations, not just about the data pipeline and how your data has to transform as it goes through these different phases, but also, at the higher level, really leaning in on containerization so that customers can have greater mobility, and again, we'll use our own use case, and the evolution of Pure1 is the front-and-center message there. >> I'd love to get your perspective, kind of changing the topic, on the ecosystem evolution. You've observed the VMware ecosystem. You remember well, I mean, it's just strange that EMC ended up with this asset, right? I mean, it's kind of unnatural, and all of a sudden, boom, it explodes, and you had this storage company somewhat controlling, you had the storage cartel, kind of, which VMware wanted to placate, so that was good, that sort of was a bulwark against EMC having too much control. 
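The portability move Vaughn just described, taking a function-as-a-service call and putting it in a container so the same logic runs on any cloud or at a dark site, comes down to keeping the core logic cloud-agnostic behind thin per-platform adapters. The sketch below is hypothetical (the function names and record shape are invented for illustration, not Pure1's actual code):

```python
# Cloud-agnostic core: plain Python with no provider SDK dependency, so the
# same function can ship inside a container image and run anywhere.
def summarize_telemetry(records):
    """Aggregate per-array telemetry records into a simple rollup."""
    total = sum(r["bytes_used"] for r in records)
    return {"arrays": len(records), "bytes_used": total}


# Thin adapters: only these know about a given runtime's invocation shape.
def aws_lambda_handler(event, context=None):
    """Adapter matching one provider's function-as-a-service calling style."""
    return summarize_telemetry(event["records"])


def container_entrypoint(records):
    """Adapter for running the identical logic from a container, whether on
    another public cloud, on-prem, or at a dark site that can't phone home."""
    return summarize_telemetry(records)


event = {"records": [{"bytes_used": 10}, {"bytes_used": 32}]}
```

Because the business logic never touches a provider API directly, moving between clouds means swapping a few-line adapter, not rewriting the analytics.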
Now you see Dell's ownership, you see the AWS relationship. As an ecosystem partner who's now reached escape velocity and beyond, what do you make of all this? >> I think you have to look across Pat's time, and before Pat, to Diane, right? Diane made it clear, right, when there were acquisitions in play for VMware: she said, we'll never be owned by a server vendor. And so a storage vendor, EMC, acquired them, and for all the blustering about EMC control, there was never anything that was proprietary toward EMC within VMware, right. >> Right. >> The focus was on the entire partner ecosystem. That's a good bet, right: let the hardware vendors battle out who's got best-in-class, and just deliver the VM software to the market. Allow VMware to innovate on a different timeframe than the storage layer. Now that Dell is in the ownership seat, you have the same answers from Pat. When he sits down with Charlie, it's like, look, we're going to be independent, we're going to be agnostic, we're going to take you as a partner to help us build frameworks. So for example, we're one of the lead design partners on NVMe over fabrics, and we're doing technology previews with vSphere in the booth. We're the fastest-growing vVols partner. So I know I'm making plugs here, but I don't think anything's changed, right. I think VMware's business model has been brilliant: not becoming tied to any hardware partner, and focusing on what you do better than anyone else, which has been delivering virtualization. And what I really like about this show, and tell me if you think so, right: AWS was shared last year, right? Containers have been shared at this show for about four years. This year there was a focus, right: it was AWS, it was containers, it was automate everything, and it brings security in as an inherent component of the products, right? These are really bold, strong investments that they've made that are new, right. 
So you see the evolution of VMware, and we're partnering with them on a number of these initiatives, and there's nothing to share now. That'll be next year. >> Well, and you're right, Vaughn, the picture's getting clearer. I thought Pat's keynote was very good this year, crisper and more cogent relative to the strategy than last year and previous years. It's really starting to come together. Now what about the AWS piece? Because that's also a company with whom you have a relationship. So does the VMware-AWS partnership, is that a tailwind for you guys? Or is it, hey, we're trying to get the attention of AWS, too? >> So we signed a formal VMware alliance relationship this year, and I would say it's progressing well. What we can share with the market right now is minuscule compared to what we'll be sharing, say, later in the year or the beginning of next year. But for right now, where we're at is: we're a direct-connect partner and a gold-level sponsor for their conference, re:Invent. With VMware and AWS and Pure as a three-way alliance and partnership, VMware Cloud, VMC, is going to add support for iSCSI; that's a second-half-of-the-year, or fourth-quarter, initiative, and we'll be there as a lead development partner supporting that framework when it comes online. It's going to open a lot more flexibility for us and our joint customers around adopting either your own on-prem hardware or running on the Amazon hardware. Make it fit your business model whichever way you want to roll, but make it fully interoperable, and move the data and the compute instances seamlessly and non-disruptively. >> Guys. >> It helps to be a hot company. >> I wish we had more time. I'm hearing accelerated momentum, and maybe some teasers that Vaughn dropped, >> Yes. >> that maybe the CUBE needs to be, yeah. >> We'll stay in touch. >> We'll get some more interviews. >> Yeah. 
(Laughs) Vaughn, thanks so much for joining Dave and me and sharing all this exciting news that's going on, and like I said, accelerated momentum, pun intended, by the way. >> Thank you, thanks guys. >> Great to see you. >> We want to thank you for watching the CUBE. For Dave Vellante, I'm Lisa Martin, with the CUBE at VMworld 2018 day one from Las Vegas. Stick around, we'll be right back. (funky music) >> Hi, I'm John Walls. I've been with the CUBE for a couple years.

Published Date : Aug 28 2018

SUMMARY :

Lisa Martin and Dave Vellante talk with Vaughn Stewart, VP of Technology Alliances and Strategy at Pure Storage, at VMworld 2018. They cover Pure's quarterly results and fifth consecutive year as a Gartner Magic Quadrant leader, the StorReduce acquisition, data-centric architecture and shared accelerated storage with NVMe over fabrics, moving Pure1's predictive analytics to a hybrid cloud model, containerization for cloud portability, and Pure's alliances with VMware and AWS, including planned iSCSI support in VMware Cloud.

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Lisa Martin | PERSON | 0.99+
Pat | PERSON | 0.99+
AWS | ORGANIZATION | 0.99+
Dave Vellante | PERSON | 0.99+
John Walls | PERSON | 0.99+
Dave Vellonte | PERSON | 0.99+
Lisa Martin | PERSON | 0.99+
45 days | QUANTITY | 0.99+
Charlie | PERSON | 0.99+
July | DATE | 0.99+
Dave | PERSON | 0.99+
Diane | PERSON | 0.99+
last year | DATE | 0.99+
Las Vegas | LOCATION | 0.99+
1.1 billion | QUANTITY | 0.99+
27% | QUANTITY | 0.99+
35% | QUANTITY | 0.99+
Vaughn | PERSON | 0.99+
next year | DATE | 0.99+
Accelerate | ORGANIZATION | 0.99+
94 guests | QUANTITY | 0.99+
last year | DATE | 0.99+
Vaughn Stewart | PERSON | 0.99+
five years | QUANTITY | 0.99+
VM Ware | ORGANIZATION | 0.99+
fifth year | QUANTITY | 0.99+
37% | QUANTITY | 0.99+
Amazon | ORGANIZATION | 0.99+
86.6 | QUANTITY | 0.99+
Pure Storage | ORGANIZATION | 0.99+
This year | DATE | 0.99+
last week | DATE | 0.99+
EMC | ORGANIZATION | 0.99+
Lisa | PERSON | 0.99+
Three-letter | QUANTITY | 0.99+
this year | DATE | 0.99+
Astec | ORGANIZATION | 0.99+
both sides | QUANTITY | 0.99+
StorReduce | ORGANIZATION | 0.99+
two levels | QUANTITY | 0.98+
1% | QUANTITY | 0.98+
two ways | QUANTITY | 0.98+
309 million dollars | QUANTITY | 0.98+
three-way | QUANTITY | 0.98+
first acquisition | QUANTITY | 0.98+
greater than nine petabytes | QUANTITY | 0.98+
CUBE | ORGANIZATION | 0.98+
Las Vegas Mandalay Bay | LOCATION | 0.98+
second-half | QUANTITY | 0.97+
six years ago | DATE | 0.97+
VM World Day One | EVENT | 0.97+
VMWorld 2018 | EVENT | 0.97+
one tool | QUANTITY | 0.97+
Pure Storage | ORGANIZATION | 0.97+
Vertica | ORGANIZATION | 0.97+
Del | PERSON | 0.97+
about four years | QUANTITY | 0.96+
first inning | QUANTITY | 0.96+
one | QUANTITY | 0.96+
Q2 | DATE | 0.96+
VMworld 2018 | EVENT | 0.95+
almost 400 new customers | QUANTITY | 0.95+
VM Ware | TITLE | 0.95+
30% | QUANTITY | 0.95+
a year | QUANTITY | 0.95+
couple months ago | DATE | 0.95+
ten | DATE | 0.94+
Scott | PERSON | 0.94+