Closing Panel | Generative AI: Riding the Wave | AWS Startup Showcase S3 E1


 

(mellow music) >> Hello everyone, welcome to theCUBE's coverage of AWS Startup Showcase. This is the closing panel session on AI machine learning, the top startups generating generative AI on AWS. It's a great panel. This is going to be the experts talking about riding the wave in generative AI. We got Ankur Mehrotra, who's the director and general manager of AI and machine learning at AWS, and Clem Delangue, co-founder and CEO of Hugging Face, and Ori Goshen, who's the co-founder and CEO of AI21 Labs. Ori from Tel Aviv dialing in, and the rest coming in here on theCUBE. Appreciate you coming on for this closing session for the Startup Showcase. >> Thanks for having us. >> Thank you for having us. >> Thank you. >> I'm super excited to have you all on. Hugging Face was recently in the news with the AWS relationship, so congratulations. Open source, open science, really driving the machine learning. And we got the AI21 Labs access to the LLMs, generating huge scale live applications, commercial applications, coming to the market, all powered by AWS. So everyone, congratulations on all your success, and thank you for headlining this panel. Let's get right into it. AWS is powering this wave here. We're seeing a lot of push here from applications. Ankur, set the table for us on the AI machine learning. It's not new, it's been goin' on for a while. The past three years have seen significant advancements, but there's been a lot of work done in AI machine learning. Now it's released to the public. Everybody's super excited and now says, "Oh, the future's here!" It's kind of been going on for a while and baking. Now it's kind of coming out. What's your view here? Let's get it started. >> Yes, thank you. So, yeah, as you may be aware, Amazon has been investing in machine learning research and development for quite some time now. And we've used machine learning to innovate and improve user experiences across different Amazon products, whether it's Alexa or Amazon.com. 
But we've also brought in our expertise to extend what we are doing in the space and add more generative AI technology to our AWS products and services, starting with CodeWhisperer, which is an AWS service that we announced a few months ago, which is, you can think of it as a coding companion as a service, which uses generative AI models underneath. And so this is a service that customers who have no machine learning expertise can just use. And we also are talking to customers, and we see a lot of excitement about generative AI, and customers who want to build these models themselves, who have the talent and the expertise and resources. For them, AWS has a number of different options and capabilities they can leverage, such as our custom silicon, such as Trainium and Inferentia, as well as distributed machine learning capabilities that we offer as part of SageMaker, which is an end-to-end machine learning development service. At the same time, many of our customers tell us that they're interested in not training and building these generative AI models from scratch, given they can be expensive and can require specialized talent and skills to build. And so for those customers, we are also making it super easy to bring in existing generative AI models into their machine learning development environment within SageMaker for them to use. So we recently announced our partnership with Hugging Face, where we are making it super easy for customers to bring in those models into their SageMaker development environment for fine tuning and deployment. And then we are also partnering with other proprietary model providers such as AI21 and others, where we are making these generative AI models available within SageMaker for our customers to use. So our approach here is to really provide customers options and choices and help them accelerate their generative AI journey. >> Ankur, thank you for setting the table there. 
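The workflow Ankur describes, bringing an existing Hugging Face model into SageMaker for deployment, looks roughly like the sketch below. This is a hedged illustration, not a tested configuration: it requires an AWS account, and the model ID, IAM role, framework versions, and instance type are all illustrative assumptions.

```python
# Sketch: deploy a Hugging Face Hub model to a SageMaker endpoint.
# Placeholder values throughout -- swap in your own role and model.
from sagemaker.huggingface import HuggingFaceModel

model = HuggingFaceModel(
    env={
        "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # any Hub model
        "HF_TASK": "text-classification",
    },
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    transformers_version="4.26",  # assumed supported version combo
    pytorch_version="1.13",
    py_version="py39",
)

# Deploy to a managed endpoint and run one inference.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "I love this product"}))
predictor.delete_endpoint()  # clean up to stop billing
```

The same `HuggingFaceModel` object can point at a fine-tuned artifact in S3 instead of a Hub model ID, which is the "fine tuning and deployment" path mentioned above.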
Clem and Ori, I want to get your take, because the riding the waves, the theme of this session, and to me being in California, I imagine the big surf, the big waves, the big talent out there. This is like alpha geeks, alpha coders, developers are really leaning into this. You're seeing massive uptake from the smartest people. Whether they're young or around, they're coming in with their kind of surfboards, (chuckles) if you will. These early adopters, they've been on this for a while; Now the waves are hitting. This is a big wave, everyone sees it. What are some of those early adopter devs doing? What are some of the use cases you're seeing right out of the gate? And what does this mean for the folks that are going to come in and get on this wave? Can you guys share your perspective on this? Because you're seeing the best talent now leaning into this. >> Yeah, absolutely. I mean, from Hugging Face's vantage point, it's not even a wave, it's a tidal wave, or maybe even the tide itself. Because actually what we are seeing is that AI and machine learning is not something that you add to your products. It's very much a new paradigm to do all technology. It's this idea that we had in the past 15, 20 years, one way to build software and to build technology, which was writing a million lines of code, very rule-based, and then you get your product. Now what we are seeing is that every single product, every single feature, every single company is starting to adopt AI to build the next generation of technology. And that works both to make the existing use cases better, if you think of search, if you think of social network, if you think of SaaS, but also it's creating completely new capabilities that weren't possible with the previous paradigm. Now AI can generate text, it can generate image, it can describe your image, it can do so many new things that weren't possible before. >> It's going to really make the developers really productive, right? 
I mean, you're seeing the developer uptake strong, right? >> Yes, we have over 15,000 companies using Hugging Face now, and it keeps accelerating. I really think that maybe in like three, five years, there's not going to be any company not using AI. It's going to be really kind of the default to build all technology. >> Ori, weigh in on this. APIs, the cloud. Now I'm a developer, I want to have live applications, I want the commercial applications on this. What's your take? Weigh in here. >> Yeah, first, I absolutely agree. I mean, we're in the midst of a technology shift here. I think not a lot of people realize how big this is going to be. Just the number of possibilities is endless, and I think hard to imagine. And I don't think it's just the use cases. I think we can think of it as two separate categories. We'll see companies and products enhancing their offerings with these new AI capabilities, but we'll also see new companies that are AI first, that kind of reimagine certain experiences. They build something that wasn't possible before. And that's why I think it's actually extremely exciting times. And maybe more philosophically, I think now these large language models and large transformer-based models are helping us to express our thoughts and kind of making the bridge from our thinking to a creative digital asset at a speed we've never imagined before. I can write something down and get a piece of text, or an image, or a code. So I'll start by saying it's hard to imagine all the possibilities right now, but it's certainly big. And if I had to bet, I would say it's probably at least as big as the mobile revolution we've seen in the last 20 years. >> Yeah, this is the biggest. I mean, it's been compared to the Enlightenment Age. I saw the Wall Street Journal had a recent story on this. We've been saying that this is probably going to be bigger than all inflection points combined in the tech industry, given what transformation is coming. 
I guess I want to ask you guys, on the early adopters, we've been hearing on these interviews and throughout the industry that there's already a set of big companies, a set of companies out there that have a lot of data and they're already there, they're kind of tinkering. Kind of reminds me of the old hyper scaler days where they were building their own scale, and they're eatin' glass, spittin' nails out, you know, they're hardcore. Then you got everybody else kind of saying board level, "Hey team, how do I leverage this?" How do you see those two things coming together? You got the fast followers coming in behind the early adopters. What's it like for the second wave coming in? What are those conversations for those developers like? >> I mean, I think for me, the important switch for companies is to change their mindset from being kind of like a traditional software company to being an AI or machine learning company. And that means investing, hiring machine learning engineers, machine learning scientists, infrastructure team members who are working on how to put these models in production, team members who are able to optimize models, specialized models, customized models for the company's specific use cases. So it's really changing this mindset of how you build technology and optimize your company building around that. Things are moving so fast that I think now it's kind of like too late for low-hanging fruit or small adjustments. I think it's important to realize that if you want to be good at that, and if you really want to surf this wave, you need massive investments. If there are like some surfers listening with this analogy of the wave, right, when there are waves, it's not enough just to stand and make a little bit of adjustments. You need to position yourself aggressively, paddle like crazy, and that's how you get into the waves. So that's what companies, in my opinion, need to do right now. 
>> Ori, what's your take on the generative models out there? We hear a lot about foundation models. What's your experience running end-to-end applications for large foundation models? Any insights you can share with the app developers out there who are looking to get in? >> Yeah, I think first of all, it starts to create an economy where it probably doesn't make sense for every company to create their own foundation models. You can basically start by using an existing foundation model, either open source or a proprietary one, and start deploying it for your needs. And then comes the second round when you are starting the optimization process. You bootstrap, whether it's a demo, or a small feature, or introducing new capability within your product, and then start collecting data. That data, and particularly the human feedback data, helps you to constantly improve the model, so you create this data flywheel. And I think we're now entering an era where customers have a lot of different choice of how they want to start their generative AI endeavor. And it's a good thing that there's a variety of choices. And the really amazing thing here is that every industry, any company you speak with, it could be something very traditional like industrial or financial, medical, really any company. I think people now start to imagine what are the possibilities, and seriously think what's their strategy for adopting this generative AI technology. And I think in that sense, the foundation models actually enabled this to become scalable. So the barrier to entry became lower; Now the adoption could actually accelerate. >> There's a lot of integration aspects here in this new wave that's a little bit different. Before it was like very monolithic, hardcore, very brittle. A lot more integration, you see a lot more data coming together. 
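The "data flywheel" Ori describes can be sketched in a few lines: ship a model-backed feature, log its outputs alongside human feedback, and keep the well-rated examples as fine-tuning data for the next model version. Everything below is illustrative (the class name, the 1-5 rating scheme, the rating threshold are all assumptions), but it shows the shape of the loop.

```python
# Minimal sketch of a human-feedback data flywheel.
from dataclasses import dataclass, field

@dataclass
class FeedbackLog:
    records: list = field(default_factory=list)

    def log(self, prompt: str, completion: str, rating: int) -> None:
        """Store one model interaction with a human rating (1-5)."""
        self.records.append(
            {"prompt": prompt, "completion": completion, "rating": rating}
        )

    def fine_tuning_set(self, min_rating: int = 4) -> list:
        """Select highly rated pairs as training data for the next model."""
        return [
            {"prompt": r["prompt"], "completion": r["completion"]}
            for r in self.records
            if r["rating"] >= min_rating
        ]

log = FeedbackLog()
log.log("Summarize our Q3 report", "Revenue grew 12%, driven by...", rating=5)
log.log("Summarize our Q3 report", "Lorem ipsum dolor...", rating=1)
print(len(log.fine_tuning_set()))  # only the well-rated example survives
```

Each fine-tune produces a better model, which produces better completions, which earn better ratings: that feedback loop is the flywheel.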
I have to ask you guys, as developers come in and grow, I mean, when I went to college and you were a software engineer, I mean, I got a degree in computer science, and software engineering, that's all you did was code, (chuckles) you coded. Now, isn't it like everyone's a machine learning engineer at this point? Because that will be ultimately the science. So, (chuckles) you got open source, you got open software, you got the communities. Swami called you guys the GitHub of machine learning, Hugging Face is the GitHub of machine learning, mainly because that's where people are going to code. So this is essentially, machine learning is computer science. What's your reaction to that? >> Yes, my co-founder Julien at Hugging Face has been saying this for quite a while now, for over three years, that actually software engineering as we know it today is a subset of machine learning, instead of the other way around. People would call us crazy a few years ago when we were saying that. But now we are realizing that you can actually code with machine learning. So machine learning is generating code. And we are starting to see that every software engineer can leverage machine learning through open models, through APIs, through different technology stacks. So yeah, it's not crazy anymore to think that maybe in a few years, there's going to be more people doing AI and machine learning. However you call it, right? Maybe you'll still call them software engineers, maybe you'll call them machine learning engineers. But there might be more of these people in a couple of years than there are software engineers today. >> I bring this up as more tongue in cheek as well, because Ankur, infrastructure as code is what made the cloud great, right? That's kind of the DevOps movement. But here the shift is so massive, there will be a game-changing philosophy around coding. 
Machine learning as code, you're starting to see CodeWhisperer, you guys have had coding companions for a while on AWS. So this is a paradigm shift. How is the cloud playing into this for you guys? Because to me, I've been riffing on some interviews where it's like, okay, you got the cloud going next level. This is an example of that, where there is a DevOps-like moment happening with machine learning, whether you call it coding or whatever. It's writing code on its own. Can you guys comment on what this means on top of the cloud? What comes out of the scale? What comes out of the benefit here? >> Absolutely, so- >> Well first- >> Oh, go ahead. >> Yeah, so I think as far as scale is concerned, I think customers are really relying on cloud to make sure that the applications that they build can scale along with the needs of their business. But there's another aspect to it, which is that until a few years ago, John, what we saw was that machine learning was a data scientist heavy activity. There were data scientists who were taking the data and training models. And then as machine learning found its way more and more into production and actual usage, we saw MLOps become a thing, and MLOps engineers become more involved in the process. And then we now are seeing, as machine learning is being used to solve more business critical problems, we're seeing even legal and compliance teams get involved. We are seeing business stakeholders more engaged. So, more and more machine learning is becoming an activity that's not just performed by data scientists, but is performed by a team and a group of people with different skills. And for them, we as AWS are focused on providing the best tools and services for these different personas to be able to do their job and really complete that end-to-end machine learning story. So that's where, whether it's tools related to MLOps or even for folks who cannot code or don't know any machine learning. 
For example, we launched SageMaker Canvas as a tool last year, which is a UI-based tool which data analysts and business analysts can use to build machine learning models. So overall, the spectrum in terms of persona and who can get involved in the machine learning process is expanding, and the cloud is playing a big role in that process. >> Ori, Clem, can you guys weigh in too? 'Cause this is just another abstraction layer of scale. What's it mean for you guys as you look forward to your customers and the use cases that you're enabling? >> Yes, I think what's important is that the AI companies and providers and the cloud kind of work together. That's how you make a seamless experience and you actually reduce the barrier to entry for this technology. So that's what we've been super happy to do with AWS for the past few years. We actually announced not too long ago that we are doubling down on our partnership with AWS. We're excited to have many, many customers on our shared product, the Hugging Face deep learning container on SageMaker. And we are working really closely with the Inferentia team and the Trainium team to release some more exciting stuff in the coming weeks and coming months. So I think when you have an ecosystem and a system where the AWS and the AI providers, AI startups can work hand in hand, it's to the benefit of the customers and the companies, because it makes it orders of magnitude easier for them to adopt this new paradigm of building technology with AI. >> Ori, this is a scale on reasoning too. The data's out there and making sense out of it, making it reason, getting comprehension, having it make decisions is next, isn't it? And you need scale for that. >> Yes. Just a comment about the infrastructure side. So I think really the purpose is to streamline and make these technologies much more accessible. And I think we'll see, I predict that we'll see in the next few years more and more tooling that make this technology much more simple to consume. 
And I think it plays a very important role. There are so many aspects, like monitoring the models and the kind of outputs they produce, and kind of containing and running them in a production environment. There's so much there to build on; the infrastructure side will play a very significant role. >> All right, that's awesome stuff. I'd love to change gears a little bit and get a little philosophy here around AI and how it's going to transform, if you guys don't mind. There's been a lot of conversations around, on theCUBE here as well as in some industry areas, where it's like, okay, all the heavy lifting is automated away with machine learning and AI, the complexity, there's some efficiencies, it's horizontal and scalable across all industries. Ankur, good point there. Everyone's going to use it for something. And a lot of stuff gets brought to the table with large language models and other things. But the key ingredient will be proprietary data or human input, or some sort of AI whisperer kind of role, or prompt engineering, people are saying. So with that being said, some are saying it's automating intelligence. And that creativity will be unleashed from this. If the heavy lifting goes away and AI can fill the void, that shifts the value to the intellect or the input. And so that means data's got to come together, interact, fuse, and understand each other. This is kind of new. I mean, old school AI was, okay, got a big model, I provisioned it long time, very expensive. Now it's all free flowing. Can you guys comment on where you see this going with this freeform, data flowing everywhere, heavy lifting, and then specialization? >> Yeah, I think- >> Go ahead. >> Yeah, I think, so what we are seeing with these large language models or generative models is that they're really good at creating stuff. But I think it's also important to recognize their limitations. They're not as good at reasoning and logic. 
And I think now we're seeing great enthusiasm, I think, which is justified. And the next phase would be how to make these systems more reliable. How to inject more reasoning capabilities into these models, or augment with other mechanisms that actually perform more reasoning so we can achieve more reliable results. And we can count on these models to perform for critical tasks, whether it's medical tasks, legal tasks. We really want to kind of offload a lot of the intelligence to these systems. And then we'll have to get back, we'll have to make sure these are reliable, we'll have to make sure we get some sort of explainability that we can understand the process behind the generated results that we received. So I think this is kind of the next phase of systems that are based on these generated models. >> Clem, what's your view on this? Obviously you're at open community, open source has been around, it's been a great track record, proven model. I'm assuming creativity's going to come out of the woodwork, and if we can automate open source contribution, and relationships, and onboarding more developers, there's going to be unleashing of creativity. >> Yes, it's been so exciting on the open source front. We all know BERT, BLOOM, GPT-J, T5, Stable Diffusion, and so on, the previous and current generation of open source models that are on Hugging Face. It has been accelerating in the past few months. So I'm super excited about ControlNet right now that is really having a lot of impact, which is kind of like a way to control the generation of images. Super excited about Flan-UL2, which is like a new model that has been recently released and is open source. So yeah, it's really fun to see the ecosystem coming together. Open source has been the basis for traditional software, with like open source programming languages, of course, but also all the great open source that we've gotten over the years. 
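One concrete form of Ori's point above, augmenting generation with mechanisms that actually perform reasoning, is tool routing: send the parts a language model is unreliable at (here, arithmetic) to a deterministic tool, and only pass the rest to the model. The sketch below uses a stub in place of a real LLM call; the routing logic and function names are illustrative assumptions.

```python
# Sketch: route arithmetic to a calculator tool instead of a language model.
import re

def calculator(expression: str) -> str:
    # Deterministic tool: only evaluate pure arithmetic expressions.
    if not re.fullmatch(r"[\d\s+\-*/().]+", expression):
        raise ValueError("not a pure arithmetic expression")
    return str(eval(expression))

def call_model(prompt: str) -> str:
    # Placeholder for a real LLM API call.
    return f"[model answer to: {prompt}]"

def answer(query: str) -> str:
    # If the query is (optionally prefixed) arithmetic, compute it exactly;
    # otherwise fall back to the model.
    math = re.fullmatch(r"(?:what is\s+)?([\d\s+\-*/().]+)\??", query.lower())
    if math:
        return calculator(math.group(1))
    return call_model(query)

print(answer("What is 12 * (3 + 4)?"))   # computed, not generated: "84"
print(answer("Summarize the contract"))  # handed to the model stub
```

The reliable path is exact by construction, which is the kind of guarantee Ori is pointing at for medical or legal tasks, where a plausible-but-wrong generation is not acceptable.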
So we're happy to see that the same thing is happening for machine learning and AI, and hopefully can help a lot of companies reduce the barrier to entry a little bit. So yeah, it's going to be exciting to see how it evolves in the next few years in that respect. >> I think the developer productivity angle that's been talked about a lot in the industry will be accelerated significantly. I think security will be enhanced by this. I think in general, applications are going to transform at a radical rate, accelerated, incredible rate. So I think it's not a big wave, it's the water, right? I mean, (chuckles) it's the new thing. My final question for you guys, if you don't mind, I'd love to get each of you to answer the question I'm going to ask you, which is, a lot of conversations around data. Data infrastructure's obviously involved in this. And the common thread that I'm hearing is that every company that looks at this is asking themselves, if we don't rebuild our company, start thinking about rebuilding our business model around AI, we might be dinosaurs, we might be extinct. And it reminds me of that scene in Moneyball when, at the end, it's like, if we're not building the model around your model, every company will be out of business. What's your advice to companies out there that are having those kind of moments where it's like, okay, this is real, this is next gen, this is happening. I better start thinking and putting into motion plans to refactor my business, 'cause it's happening, business transformation is happening on the cloud. This kind of puts an exclamation point on it, with AI as a next step function. Big increase in value. So it's an opportunity for leaders. Ankur, we'll start with you. What's your advice for folks out there thinking about this? Do they put their toe in the water? Do they jump right into the deep end? What's your advice? 
>> Yeah, John, so we talk to a lot of customers, and customers are excited about what's happening in the space, but they often ask us like, "Hey, where do we start?" So we always advise our customers to do a lot of proof of concepts, understand where they can drive the biggest ROI. And then also leverage existing tools and services to move fast and scale, and try and not reinvent the wheel where it doesn't need to be. That's basically our advice to customers. >> Got it. Ori, what's your advice to folks who are scratching their head going, "I better jump in here. "How do I get started?" What's your advice? >> So I actually think that you need to think about it really economically. Both on the opportunity side and the challenges. So there's a lot of opportunities for many companies to actually gain revenue upside by building these new generative features and capabilities. On the other hand, of course, incorporating these capabilities could probably affect the COGS. So I think we really need to think carefully about both of these sides, and also understand clearly if this is a project or an effort towards cost reduction, then the ROI is pretty clear, or a revenue amplifier, where there's, again, a lot of different opportunities. So I think once you think about this in a structured way, I think, and map the different initiatives, then it's probably a good way to start and a good way to start thinking about these endeavors. >> Awesome. Clem, what's your take on this? What's your advice, folks out there? >> Yes, all of these are very good advice already. Something that you said before, John, that I disagreed a little bit, a lot of people are talking about the data moat and proprietary data. Actually, when you look at some of the organizations that have been building the best models, they don't have specialized or unique access to data. So I'm not sure that's so important today. 
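Ori's economic framing, weighing the revenue upside of a generative feature against its inference COGS and build cost, can be made concrete with a back-of-envelope calculation. The function and all numbers below are made up for illustration only.

```python
# Back-of-envelope ROI for a proposed generative AI feature.
def feature_roi(monthly_revenue_uplift: float,
                monthly_inference_cost: float,
                build_cost: float,
                months: int = 12) -> float:
    """Simple ROI over a horizon: (net gains - investment) / investment."""
    gains = months * (monthly_revenue_uplift - monthly_inference_cost)
    return (gains - build_cost) / build_cost

# A feature adding $20k/month in revenue, costing $8k/month to serve
# and $60k to build, evaluated over one year:
print(round(feature_roi(20_000, 8_000, 60_000), 2))  # 1.4
```

The point of the exercise is the structure, not the numbers: a pure cost-reduction project makes the ROI easy to state, while a revenue amplifier forces you to estimate uplift honestly, and either way the serving cost (the COGS term) has to be in the equation.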
I think what's important for companies, and it's been the same for the previous generation of technology, is their ability to build better technology faster than others. And in this new paradigm, that means being able to build machine learning faster than others, and better. So that's how, in my opinion, you should approach this. And kind of like how can you evolve your company, your teams, your products, so that you are able in the long run to build machine learning better and faster than your competitors. And if you manage to put yourself in that situation, then that's when you'll be able to differentiate yourself to really kind of be impactful and get results. That's really hard to do. It's something really different, because machine learning and AI is a different paradigm than traditional software. So this is going to be challenging, but I think if you manage to nail that, then the future is going to be very interesting for your company. >> That's a great point. Thanks for calling that out. I think this all reminds me of the cloud days early on. If you went to the cloud early, you took advantage of it when the pandemic hit. If you weren't native in the cloud, you got hamstrung by that, you were flatfooted. So just get in there. (laughs) Get in the cloud, get into AI, you're going to be good. Thanks for calling that. Final parting comments, what's your most exciting thing going on right now for you guys? Ori, Clem, what's the most exciting thing on your plate right now that you'd like to share with folks? >> I mean, for me it's just the diversity of use cases and really creative ways of companies leveraging this technology. Every day I speak with about two, three customers, and I'm continuously being surprised by the creative ideas. And the future is really exciting of what can be achieved here. And also I'm amazed by the pace that things move in this industry. It's just, there's not a dull moment. So, definitely exciting times. 
>> Clem, what are you most excited about right now? >> For me, it's all the new open source models that have been released in the past few weeks, and that they'll keep being released in the next few weeks. I'm also super excited about more and more companies getting into this capability of chaining different models and different APIs. I think that's a very, very interesting development, because it creates new capabilities, new possibilities, new functionalities that weren't possible before. You can plug an API with an open source embedding model, with like a transcription model. So that's also very exciting. This capability of having more interoperable machine learning will also, I think, open a lot of interesting things in the future. >> Clem, congratulations on your success at Hugging Face. Please pass that on to your team. Ori, congratulations on your success, and continue to, just day one. I mean, it's just the beginning. It's not even scratching the surface. Ankur, I'll give you the last word. What are you excited for at AWS? More cloud goodness coming here with AI. Give you the final word. >> Yeah, so as both Clem and Ori said, I think the research in the space is moving really, really fast, so we are excited about that. But we are also excited to see the speed at which enterprises and other AWS customers are applying machine learning to solve real business problems, and the kind of results they're seeing. So when they come back to us and tell us the kind of improvement in their business metrics and overall customer experience that they're driving and they're seeing real business results, that's what keeps us going and inspires us to continue inventing on their behalf. >> Gentlemen, thank you so much for this awesome high impact panel. Ankur, Clem, Ori, congratulations on all your success. We'll see you around. Thanks for coming on. Generative AI, riding the wave, it's a tidal wave, it's the water, it's all happening. All great stuff. 
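The model-chaining pattern Clem highlights above, plugging a transcription model into an LLM into an embedding model, amounts to function composition: each stage sits behind a plain function, so stages are interchangeable. The stage implementations below are stubs standing in for real model calls; every name is illustrative.

```python
# Sketch: chaining interchangeable model stages into one pipeline.
from typing import Callable

def transcribe(audio: bytes) -> str:
    return "meeting notes: ship the beta friday"   # stub for a speech model

def summarize(text: str) -> str:
    return text.split(":", 1)[1].strip()           # stub for an LLM

def embed(text: str) -> list:
    return [float(ord(c) % 7) for c in text[:4]]   # stub for an embedding model

def chain(*stages: Callable):
    """Compose stages left to right: output of one feeds the next."""
    def run(x):
        for stage in stages:
            x = stage(x)
        return x
    return run

pipeline = chain(transcribe, summarize, embed)
print(pipeline(b"fake-audio"))  # audio -> transcript -> summary -> vector
```

Because each stage only agrees on its input and output types, swapping an open source model for a hosted API behind any one function leaves the rest of the chain untouched, which is the interoperability Clem is pointing at.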
This is season three, episode one of AWS Startup Showcase closing panel. This is the AI ML episode, the top startups building generative AI on AWS. I'm John Furrier, your host. Thanks for watching. (mellow music)

Published Date: Mar 9, 2023



SiliconANGLE News | AWS Responds to OpenAI with Hugging Face Expanded Partnership


 

(upbeat music) >> Hello everyone. Welcome to SiliconANGLE News, breaking story here. Amazon Web Services, expanding their relationship with Hugging Face, breaking news here on SiliconANGLE. I'm John Furrier, SiliconANGLE reporter, founder, and also co-host of theCUBE. And I have with me Swami from Amazon Web Services, vice president of database, analytics, and machine learning at AWS. Swami, great to have you on for this breaking news segment on AWS's big news. Thanks for coming on and taking the time. >> Hey John, pleasure to be here. >> We've had many conversations on theCUBE over the years. We've watched Amazon really move fast into large data modeling. Your SageMaker became a smashing success. Obviously you've been on this for a while. Now with ChatGPT and OpenAI, a lot of buzz going mainstream takes it from behind the curtain, inside the ropes, if you will, in the industry to a mainstream audience. And so this is a big moment, I think, in the industry. I want to get your perspective, because your news with Hugging Face, I think, is another tell sign that we're about to tip over into a new accelerated growth around making AI application aware, application centric, more programmable, with more API access. What's the big news with AWS and Hugging Face? You know, what's going on with this announcement? >> Yeah, first of all, we're very excited to announce our expanded collaboration with Hugging Face, because with this partnership, our goal, as you all know, I mean, Hugging Face, I consider them like the GitHub for machine learning. And with this partnership, Hugging Face and AWS will be able to democratize AI for a broad range of developers, not just specific deep AI startups. And now with this, we can accelerate the training, fine tuning, and deployment of these large language models and vision models from Hugging Face in the cloud.
So, in the broader context, when you step back and see what customer problem we are trying to solve with this announcement, essentially these foundation models are now used to create a huge number of applications, such as text summarization, question answering, search, image generation, creative work, and other things. And these are all things we are seeing in the likes of these ChatGPT-style applications. But there is a broad range of enterprise use cases that we don't even talk about. And it's because these kinds of transformative generative AI capabilities and models are not available to, I mean, millions of developers. Either training these models from scratch can be very expensive or time consuming and need deep expertise, or, more importantly, they don't need these generic models, they need them to be fine tuned for their specific use cases. And one of the biggest complaints we hear is that these models, when they try to use them for real production use cases, are incredibly expensive to train and incredibly expensive to run inference on at a production scale. And unlike web-search-style applications, where the margins can be really huge, here in production use cases and enterprises, you want efficiency at scale. That's where Hugging Face and AWS share a mission. And by integrating with Trainium and Inferentia, we're able to handle cost-efficient training and inference at scale; I'll deep dive on it. And by teaming up on the SageMaker front, the time it takes to build these models and fine tune them is also coming down. So that's what makes this partnership very unique as well. So I'm very excited. >> I want to get into the time savings and the cost savings as well on the training and inference. It's a huge issue. But before we get into that, just how long have you guys been working with Hugging Face? I know there's a previous relationship.
This is an expansion of that relationship. Can you comment on what's different about what happened before and then now? >> Yeah, so Hugging Face, we have had a great relationship in the past few years as well, where they have actually made their models available to run on AWS. In fact, their Bloom project was something many of our customers even used. The Bloom project, for context, is their open source project, which builds a GPT-3-style model. And now with this expanded collaboration, Hugging Face selected AWS for the next generation of this generative AI model, building on their highly successful Bloom project as well. And the nice thing is, now, by direct integration with Trainium and Inferentia, you get cost savings in a really significant way. Now, for instance, Trn1 can provide up to 50% cost-to-train savings, and Inferentia can deliver up to 60% better cost and 4x higher throughput. Now these models, especially as they train that next generation generative AI model, are going to be not only more accessible to all the developers who use them in the open, they'll be a lot cheaper as well. And that's what makes this moment really exciting, because we can't democratize AI unless we make it broadly accessible and cost efficient, and easy to program and use as well. >> Okay, thanks Swami. We really appreciate it. Swami's a CUBE alumni, and also vice president of database, analytics, and machine learning at AWS, breaking down the Hugging Face announcement. He called the relationship the GitHub of machine learning. This is the beginning of what we will see, a continuing competitive battle with Microsoft. Microsoft is backing OpenAI. Amazon's been doing it for years. They got Alexa, they know what they're doing. It's going to be very interesting to see how this all plays out. You're watching SiliconANGLE News, breaking here. I'm John Furrier, host of theCUBE. Thanks for watching. (ethereal music)
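The "up to 50% cost-to-train savings" and "up to 60% better cost" figures quoted above are easiest to appreciate as back-of-envelope arithmetic. A minimal sketch, with hypothetical instance counts and hourly rates (placeholders for illustration, not AWS pricing):

```python
# Back-of-envelope illustration of the quoted savings. All dollar figures,
# instance counts, and run lengths below are hypothetical placeholders.

def training_cost(instance_hourly_rate, num_instances, hours):
    """Total cost of one training run at a flat hourly on-demand rate."""
    return instance_hourly_rate * num_instances * hours

# A hypothetical 16-instance run for 200 hours at $30/hour per instance:
baseline = training_cost(instance_hourly_rate=30.0, num_instances=16, hours=200)

# "Up to 50% cost-to-train savings" would cut that roughly in half:
with_trn1 = baseline * (1 - 0.50)

print(f"baseline:  ${baseline:,.0f}")   # baseline:  $96,000
print(f"with Trn1: ${with_trn1:,.0f}")  # with Trn1: $48,000
```

Even at these made-up rates, halving cost-to-train on a long run is a difference of tens of thousands of dollars per run, which is the dynamic being described for production-scale training and inference.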

Published Date : Feb 23 2023


SiliconANGLE News | Swami Sivasubramanian Extended Version


 

(bright upbeat music) >> Hello, everyone. Welcome to SiliconANGLE News, breaking story here. Amazon Web Services expanding their relationship with Hugging Face, breaking news here on SiliconANGLE. I'm John Furrier, SiliconANGLE reporter, founder, and also co-host of theCUBE. And I have with me, Swami, from Amazon Web Services, vice president of database, analytics, and machine learning at AWS. Swami, great to have you on for this breaking news segment on AWS's big news. Thanks for coming on and taking the time. >> Hey, John, pleasure to be here. >> You know- >> Looking forward to it. >> We've had many conversations on theCUBE over the years, we've watched Amazon really move fast into large data modeling, SageMaker became a very smashing success, obviously you've been on this for a while. Now with ChatGPT and OpenAI, a lot of buzz going mainstream takes it from behind the curtain, inside the ropes, if you will, in the industry to a mainstream audience. And so this is a big moment, I think, in the industry. I want to get your perspective, because your news with Hugging Face, I think, is another tell sign that we're about to tip over into a new accelerated growth around making AI application aware, application centric, more programmable, with more API access. What's the big news about, with AWS and Hugging Face, you know, what's going on with this announcement? >> Yeah. First of all, we're very excited to announce our expanded collaboration with Hugging Face, because with this partnership, our goal, as you all know, I mean, Hugging Face, I consider them like the GitHub for machine learning. And with this partnership, Hugging Face and AWS will be able to democratize AI for a broad range of developers, not just specific deep AI startups. And now with this, we can accelerate the training, fine tuning and deployment of these large language models, and vision models from Hugging Face in the cloud.
And in the broader context, when you step back and see what customer problem we are trying to solve with this announcement, essentially these foundation models are now used to create a huge number of applications, such as text summarization, question answering, search, image generation, creative work, and other things. And these are all things we are seeing in the likes of these ChatGPT-style applications. But there is a broad range of enterprise use cases that we don't even talk about. And it's because these kinds of transformative, generative AI capabilities and models are not available to, I mean, millions of developers. Either training these models from scratch can be very expensive or time consuming and need deep expertise, or, more importantly, they don't need these generic models, they need them to be fine tuned for their specific use cases. And one of the biggest complaints we hear is that these models, when they try to use them for real production use cases, are incredibly expensive to train and incredibly expensive to run inference on at a production scale. And unlike web-search-style applications, where the margins can be really huge, here in production use cases and enterprises, you want efficiency at scale. That's where Hugging Face and AWS share a mission. And by integrating with Trainium and Inferentia, we're able to handle cost-efficient training and inference at scale; I'll deep dive on it. And by teaming up on the SageMaker front, now the time it takes to build these models and fine tune them is also coming down. So that's what makes this partnership very unique as well. So I'm very excited. >> I want to get into the time savings and the cost savings as well on the training and inference, it's a huge issue, but before we get into that, just how long have you guys been working with Hugging Face?
I know there's a previous relationship, this is an expansion of that relationship, can you comment on what's different about what happened before and then now? >> Yeah. So, Hugging Face, we have had a great relationship in the past few years as well, where they have actually made their models available to run on AWS. Even, in fact, their Bloom Project was something many of our customers used. The Bloom Project, for context, is their open source project which builds a GPT-3-style model. And now with this expanded collaboration, Hugging Face selected AWS for the next generation of its generative AI model, building on their highly successful Bloom Project as well. And the nice thing is, now, by direct integration with Trainium and Inferentia, you get cost savings in a really significant way. Now, for instance, Trn1 can provide up to 50% cost-to-train savings, and Inferentia can deliver up to 60% better cost, and 4x higher throughput than (indistinct). Now, these models, especially as they train that next generation of generative AI models, are going to be not only more accessible to all the developers who use them in the open, they'll be a lot cheaper as well. And that's what makes this moment really exciting, because we can't democratize AI unless we make it broadly accessible and cost efficient and easy to program and use as well. >> Yeah. >> So very exciting. >> I'll get into the SageMaker and CodeWhisperer angle in a second, but you hit on some good points there. One, accessibility, which is what I call the democratization, which is getting this in the hands of developers, and AI developers; we'll get into that in a second. So, access to coding and reasoning is a whole other wave.
But the three things I know you've been working on, I want to put in buckets here and comment on. One, I know you've, over the years, been working on saving time to train; that's a big point, and you mentioned some of those stats. Also cost, 'cause now cost is part of the equation, you know, whether you're bundling or uncoupling hardware and software; that's a big issue. Where do I find the GPUs? What's the horsepower cost? And then also sustainability. You've mentioned that in the past. Is there a sustainability angle here? Can you talk about those three things: time, cost, and sustainability? >> Certainly. So if you look at it from the AWS perspective, we have been supporting customers doing machine learning for the past years. Just for broader context, Amazon has been doing ML for the past two decades, right from the early days of ML-powered recommendations to actually also supporting all kinds of generative AI applications. If you look at even generative AI applications within Amazon, Amazon search, when you go search for a product and so forth, we have a team called MFi within Amazon search that helps bring these large language models into creating highly accurate search results. And these are created with really large models with tens of billions of parameters, scaled to thousands of training jobs every month and trained on large clusters of hardware. And this is an example of a really good large language foundation model application running at production scale, and also, of course, Alexa, which uses a large generative model as well. And they actually even had a research paper that showed that they do better in accuracy than other systems like GPT-3 and whatnot. And we also touched on things like CodeWhisperer, which uses generative AI to improve developer productivity, but in a responsible manner, because some studies show 40% of generated code had serious security flaws in it.
This is where we didn't just do generative AI; we combined it with automated reasoning capabilities, which is a very, very useful technique to identify these issues, and coupled them so that it produces highly secure code as well. Now, all these learnings taught us a few things, which is what you put in those three buckets. And yeah, more than 100,000 customers are using ML and AI services, including leading startups in the generative AI space, like Stability AI, AI21 Labs, or Hugging Face, or even Alexa, for that matter. They care about, I put them in three dimensions: one is around cost, which we touched on with Trainium and Inferentia, where Trainium provides up to 50% better cost savings, but the other aspect is, Trainium is a lot more power efficient as well compared to traditional alternatives. And Inferentia is also better in terms of throughput, when it comes to what it is capable of. It is able to deliver up to 3x higher compute performance and 4x higher throughput, compared to its previous generation, and it is extremely cost efficient and power efficient as well. >> Well. >> Now, the second element that really is important is, at the end of the day, developers deeply value the time it takes to build these models, and they don't want to build models from scratch. And this is where SageMaker, which, even going by Kaggle's surveys, is the number one enterprise ML platform, comes in. What it did to traditional machine learning, where tens of thousands of customers use SageMaker today, including the ones I mentioned, is that what used to take months to build these models has dropped down to now a matter of days, if not less. Now, in generative AI, the cost of building these models, if you look at the landscape, the model parameter size has jumped by more than a thousand x in the past three years, a thousand x. And that means the training is a really big distributed systems problem. How do you actually scale this model training?
How do you actually ensure that you utilize these machines efficiently? Because they are very expensive, let alone consume a lot of power. So, this is where SageMaker's capability to build, automatically train, tune, and deploy models really comes in, especially with its distributed training infrastructure, and those are some of the reasons why some of the leading generative AI startups are actually leveraging it, because they do not want a giant infrastructure team which is constantly tuning and fine tuning, and keeping these clusters alive. >> It sounds a lot like what startups did with the cloud in the early days: no data center, you move to the cloud. So, this is the trend we're seeing, right? You guys are making it easier for developers with Hugging Face, I get that. I love that GitHub for machine learning. Large language models are complex and expensive to build, but not anymore; you've got Trainium and Inferentia, developers can get faster time to value, but then you've got the transformers, data sets, token libraries, all that, optimized for generative AI. This is a perfect storm for startups. Jon Turow, a former AWS person, who used to work, I think, for you, is now a VC at Madrona Venture, and he and I were talking about the generative AI landscape; it's exploding with startups. Every alpha entrepreneur out there is seeing this as the next frontier; that's the 20-mile stare. The next 10 years are going to be huge. What is the big thing that's happened? 'Cause some people were saying, the founder of Yquem said, "Oh, the startups won't be real, because they don't all have AI experience." John Markoff, former New York Times writer, told me that with AI, there's so much work done, this is going to explode and accelerate really fast, because it's almost like it's been waiting for this moment. What's your reaction?
>> I actually think there is going to be an explosion of startups, not because they need to be AI startups, but because now, finally, AI is really accessible, or going to be accessible, so that they can create remarkable applications, either for enterprises or for disrupting how customer service is being done or how creative tools are being built. And I mean, this is going to change in many ways. When we think about generative AI, we always like to think of how it generates school homework or art or music or whatnot, but when you look at it on the practical side, generative AI is actually being used across various industries. I'll give an example, Autodesk. Autodesk is a customer who runs on AWS and SageMaker. They already have an offering that enables generative design, where designers can generate many structural designs for products, whereby you give a specific set of constraints and they actually can generate a structure accordingly. And we see a similar kind of trend across various industries, whether it's around creative media editing or various others. I have the strong sense that literally, in the next few years, just like now, where conventional machine learning is embedded in every application, every mobile app that we see, it is pervasive, and we don't even think twice about it, the same way, like almost all apps are built on cloud, generative AI is going to be part of every startup, and they are going to create remarkable experiences without actually needing deep generative AI scientists. But you won't get that until you actually make these models accessible. And I also don't think one model is going to rule the world; you want these developers to have access to a broad range of models. Just go back to the early days of deep learning. Everybody thought it was going to be one framework that would rule the world, and it has been changing, from Caffe to TensorFlow to PyTorch to various other things.
And I have a suspicion we have to enable developers where they are. >> You know, Dave Vellante and I have been riffing on this concept called supercloud, and a lot of people have co-opted it to mean multicloud, but we really were getting at this whole next layer on top of, say, AWS. You guys are the most comprehensive cloud, you guys are a supercloud, and even Adam and I are talking about ISVs evolving to ecosystem partners. I mean, your top customers have ecosystems building on top of it. This feels like a whole other AWS. How are you guys leveraging the history of AWS, which, by the way, had the same trajectory: startups came in, they didn't want to provision a data center, the heavy lifting, all the things that have made Amazon successful culturally. And day one thinking is, provide the heavy lifting, the undifferentiated heavy lifting, and make it faster for developers to program code. AI's got the same thing. How are you guys taking this to the next level, because now this is an opportunity for the competition to change the game and take it over? This is, I'm sure, a conversation; you guys have a lot of things going on in AWS that make you unique. What's the internal and external positioning around how you take it to the next level? >> I mean, so I agree with you that generative AI has a very, very strong potential in terms of what it can enable in terms of next generation applications. But this is where Amazon's experience and expertise in putting these foundation models to work internally really has helped us quite a bit. If you look at it, amazon.com search is a very, very important application in terms of the customer impact, the number of customers who use that application openly, and the amount of dollar impact it has for the organization. And we have been doing it silently for a while now.
And the same thing is true for Alexa too, which actually not only uses it for natural language understanding, but even leverages it for creating stories and various other examples. And now, our approach to it from AWS is, we actually look at it in terms of the same three tiers as we did in machine learning, because when you look at generative AI, we genuinely see three sets of customers. One is really deep technical expert practitioner startups. These are the startups that are creating the next generation models, the likes of Stability AI, or Hugging Face with Bloom, or AI21. And they generally want to build their own models, and they want the best price performance for their infrastructure for training and inference. That's where our investments in silicon and hardware and networking innovations, where Trainium and Inferentia really play a big role. And we can really do that, and that is one. The second, middle tier is where I do think developers don't want to spend time building their own models; they actually want the model to be useful with their data. They don't need their models to create high school homework or various other things. What they generally want is, hey, I have this data from my enterprise that I want to fine tune and make it really work only for this, and make it work remarkably, be it for text summarization, to generate a report, or for better Q&A, and so forth. This is where we are. Our investments in the middle tier with SageMaker, and our partnerships with Hugging Face and AI21 and Cohere, are all going to be very meaningful. And you'll see us investing, I mean, you already talked about CodeWhisperer, which is in open preview, but we are also partnering with a whole lot of top ISVs, and you'll see more on this front to enable the next wave of generative AI apps too, because this is an area where we do think a lot of innovation is yet to be done.
It's like day one for us in this space, and we want to enable that huge ecosystem to flourish. >> You know, one of the things Dave Vellante and I were talking about in our first podcast we just did on Friday, which we're going to do weekly, is we highlighted the AI ChatGPT example as a horizontal use case, because everyone loves it, people are using it in all their different verticals, and horizontally scalable cloud plays perfectly into it. So I have to ask you, as you look at what AWS is going to bring to the table, a lot's changed over the past 13 years with AWS, a lot more services are available. How should someone rebuild or re-platform and refactor their application or business with AI, with AWS? What are some of the tools that you see and recommend? Is it serverless, is it SageMaker, CodeWhisperer? What do you think is going to shine brightly within the AWS stack, if you will, or service list, that's going to be part of this? As you mentioned, CodeWhisperer and SageMaker; what else should people be looking at as they start tinkering and getting all these benefits, and scale up their apps? >> You know, if I were a startup, first, I would really work backwards from the customer problem I'm trying to solve, and pick and choose so that I don't need to deal with the undifferentiated heavy lifting. And that's where the answer is going to change. If you look at it, then, the answer is not going to be a one size fits all. I mean, granted, on the compute front, if you can actually go completely serverless, I will always recommend it, instead of running compute for running your apps, because it takes care of all the undifferentiated heavy lifting. But on the data front, that's where we provide a whole variety of databases, right from relational to non-relational like DynamoDB, and so forth. And of course, we also have a deep analytical stack, where data directly flows from our relational databases into data lakes and data warehouses.
And you can get value along with partnerships with various analytical providers. The area where I do think things are fundamentally changing, in what people can do, is with CodeWhisperer. I was literally trying to write code for sending a message through Twilio, and I was about to pull up and read the documentation, and in my IDE, I just wrote, let's try sending a message through Twilio, or let's actually update a Route 53 record. All I had to do was type in just a comment, and it actually started generating the subroutine. And it is going to be a huge time saver if I were a developer. And the goal is for us not to do it just for AWS developers, and not to just generate the code, but to make sure the code is actually highly secure and follows the best practices. So, it's not always about machine learning; it's augmenting with automated reasoning as well. And generative AI is going to change not just how people write code, but also how it actually gets built and used as well. You'll see a lot more stuff coming on this front. >> Swami, thank you for your time. I know you're super busy. Thank you for sharing the news and giving commentary. Again, I think this is an AWS moment and an industry moment: heavy lifting, accelerated value, agility. AIOps is probably going to be redefined here. Thanks for sharing your commentary. And we'll see you next time; I'm looking forward to doing more follow up on this. It's going to be a big wave. Thanks. >> Okay. Thanks again, John, always a pleasure. >> Okay. This is SiliconANGLE's breaking news commentary. I'm John Furrier with SiliconANGLE News, as well as host of theCUBE. Swami, who's a leader at AWS, has been on theCUBE multiple times. We've been tracking how Amazon's journey has just been exploding the past five years, in particular the past three. You heard the numbers: great performance, great reviews.
This is a watershed moment, I think, for the industry, and it's going to be a lot of fun for the next 10 years. Thanks for watching. (bright music)
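The "more than a thousand x" jump in parameter counts that Swami cites is what turns training into a distributed systems problem: the per-parameter training state alone outgrows any single accelerator. A rough sketch, assuming mixed-precision Adam training (a common setup; the byte counts and the 80 GB device size are illustrative assumptions, not a statement about any specific AWS instance):

```python
# Why thousand-x parameter growth forces distributed training: just holding
# the training state exceeds one device's memory. Assumes bf16 weights and
# gradients plus fp32 Adam state (master weights + two moments), a common
# mixed-precision accounting; real setups vary and activations add more.

BYTES_PER_PARAM = 2 + 2 + 12   # bf16 weights + bf16 grads + fp32 Adam state

def min_accelerators(num_params, device_memory_gb=80):
    """Minimum devices needed just to hold training state (ignores activations)."""
    total_gb = num_params * BYTES_PER_PARAM / 1e9
    return -(-total_gb // device_memory_gb)  # ceiling division

for params in (1e9, 100e9, 1000e9):   # 1B -> 1T: the "thousand x" jump
    print(f"{params / 1e9:>6.0f}B params -> at least {min_accelerators(params):.0f} devices")
```

At one billion parameters the state still fits on a single large accelerator; at a trillion it must be sharded across hundreds of devices, which is exactly the cluster-management burden SageMaker's distributed training is described as absorbing.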

Published Date : Feb 22 2023


Breaking Analysis: ChatGPT Won't Give OpenAI First Mover Advantage


 

>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> OpenAI, the company, and ChatGPT have taken the world by storm. Microsoft reportedly is investing an additional 10 billion dollars into the company. But in our view, while the hype around ChatGPT is justified, we don't believe OpenAI will lock up the market with its first mover advantage. Rather, we believe that success in this market will be directly proportional to the quality and quantity of data that a technology company has at its disposal, and the compute power that it can deploy to run its systems. Hello and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis, we unpack the excitement around ChatGPT, and debate the premise that the company's early entry into the space may not confer winner-take-all advantage to OpenAI. And to do so, we welcome CUBE collaborator and alum, Sarbjeet Johal, (chuckles) and John Furrier, co-host of theCUBE. Great to see you, Sarbjeet, John. Really appreciate you guys coming to the program. >> Great to be on. >> Okay, so what is ChatGPT? Well, actually, we asked ChatGPT, what is ChatGPT? So here's what it said. ChatGPT is a state-of-the-art language model developed by OpenAI that can generate human-like text. It could be fine tuned for a variety of language tasks, such as conversation, summarization, and language translation. So I asked it, give it to me in 50 words or less. How did it do? Anything to add? >> Yeah, I think it did well. It's a large language model, like previous models, but it applies the transformer mechanism to focus on the prompt you have given it, and also on the answer it gave you in the first sentence or two, and then it introspects on what it has already said to you and works on that. So it's a sort of self-focus, if you will. It does; the transformers help the large language models do that.
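The "self-focus" mechanism Sarbjeet is describing is scaled dot-product self-attention, the core operation in transformer models like GPT. A toy NumPy sketch (arbitrary shapes, no learned projections, no causal masking; purely illustrative, not how any production model is implemented):

```python
# Toy scaled dot-product self-attention: every token's output is a
# similarity-weighted mix of all token vectors, itself included. Real
# transformers add learned Q/K/V projections, multiple heads, and masking.
import numpy as np

def self_attention(x):
    """x: (seq_len, d_model) token embeddings -> (seq_len, d_model) outputs."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                   # pairwise token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ x                              # weighted mix of tokens

x = np.random.default_rng(0).normal(size=(4, 8))   # 4 tokens, 8-dim embeddings
out = self_attention(x)
print(out.shape)  # (4, 8): one contextualized vector per token
```

Each output row depends on the whole sequence, which is the "focus on what you gave it and what it already said" behavior described above.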
It does, the transformers help the large language models to do that. >> So to your point, it's a large language model, and GPT stands for generative pre-trained transformer. >> And if you put the definition back up there again, if you put it back up on the screen, let's see it back up. Okay, it actually missed the word large. So one of the problems with ChatGPT, it's not always accurate. It's actually a large language model, and it says state-of-the-art language model. And if you look at Google, Google has dominated AI for a long time and they're well known as being the best at this. And apparently Google has their own large language model, LLM, in play, and has been holding back its release because of backlash on the accuracy. Just that example you showed is a great point. It got it almost right, but it missed the key word. >> You know what's funny about that, John, is I had previously asked it in my prompt to give it to me in less than a hundred words, and it was too long, I said it was too long for Breaking Analysis, and there it went into the fact that it's a large language model. So it gave me a really different answer both times. But it's still pretty amazing for those of you who haven't played with it yet. And one of the best examples that I saw was Sam Charrington from the This Week in ML & AI podcast. And I stumbled on this thanks to Brian Gracely, who was listening to one of his Cloudcasts. Basically what Sam did is he prompted ChatGPT to interview ChatGPT, and he simply gave the system the prompts, and then he ran the questions and answers into this avatar builder and sped it up 2X so it didn't sound like a machine. And voila, it was amazing. So John, is ChatGPT going to take over as a CUBE host? >> Well, I was thinking, we get the questions in advance sometimes from PR people. We should actually just plug them into ChatGPT, add it to our notes, and say, "Is this good enough for you? Let's ask the real question."
So I think, you know, there's a lot of heavy lifting that gets done. I think ChatGPT is a phenomenal revolution. I think it highlights the use case. Like that example we showed earlier. It gets most of it right. So it's directionally correct and it feels like it's an answer, but it's not a hundred percent accurate. And I think that's where people are seeing value in it. Writing marketing copy, brainstorming, a guest list or gift list for somebody. Write me some lyrics to a song. Give me a thesis about healthcare policy in the United States. It'll do a bang-up job, and then you've got to go in and massage it. So it's going to do three quarters of the work. That's why schools are kind of freaking out about plagiarism. And that's why Microsoft put 10 billion in, because why wouldn't this be a feature of Word, or the OS, to help it do stuff on behalf of the user. So linguistically it's a beautiful thing. You can input a string and get a good answer. It's not a search result. >> And we're going to get your take on Microsoft, but it kind of levels the playing- but ChatGPT writes better than I do, Sarbjeet, and I know you have some good examples too. You mentioned the Reed Hastings example. >> Yeah, I was listening to the Reed Hastings fireside chat with ChatGPT, and the answers were coming in voice format. And it was amazing; he was having a very sort of philosophical kind of talk with ChatGPT, with longer sentences. Just like we are talking, he was talking for almost two minutes and then ChatGPT was answering. It was not a one-sentence question and then a lot of answers from ChatGPT, and yeah, you're right. I've been thinking deeply about this since yesterday, when we talked about wanting to do this segment. The data is fed into the data model. It can be current data as well, but I think that models like ChatGPT, other companies will have those too.
They're democratizing the intelligence, but they're not creating intelligence yet, I can definitely say that. They will give you all the finite answers. Like, okay, how do you do this for loop in Java versus, you know, C#, and as a programmer you can do that. But they can't tell you how to write a new algorithm, or write a new search algorithm for you. They cannot create proprietary code for you to- >> Not yet. >> Have competitive advantage. >> Not yet, not yet. >> but you- >> Can Google do that today? >> No one really can. The reasoning side of the data is, we talked about it at our Supercloud event, with Zhamak Dehghani, who's now CEO of Nextdata. This next wave of data intelligence is going to come from entrepreneurs that are probably cross-discipline, computer science and some other discipline. But there are going to be new things, for example, around data and metadata. It's hard to do reasoning like a human being, so that needs more data to train itself. So I think the first gen of training for the large language model they have is a corpus of text. A lot of that is blog posts, but the facts are wrong and sometimes out of context, because contextual reasoning takes time, it takes intelligence. So machines need to become intelligent, and so therefore they need to be trained. So you're going to start to see, I think, a lot of acceleration on training the data sets. And again, it's only as good as the data you can get. And again, proprietary data sets will be a huge winner. Anyone who's got a large corpus of content, proprietary content like theCUBE or SiliconANGLE as a publisher, will benefit from this. Large FinTech companies, anyone with large proprietary data will probably be a big winner on this generative AI wave, because it will eat that up, and turn that back into something better. So I think there's going to be a lot of interesting things to look at here.
And certainly productivity's going to be off the charts, and the internet is going to get swarmed with vanilla content. So if you're in the content business, and you're an original content producer of any kind, you're going to be not vanilla, so you're going to be better. So I think there's so much at play, Dave (indistinct). >> I think the playing field has been risen, so we- >> Risen and leveled? >> Yeah, and leveled to a certain extent. So it's not like only a few people, as consumers of AI, will have an advantage that others cannot have. It will be democratized, I'm sure about that. But if you take the example of the calculator, when the calculator came in, a lot of people were like, "Oh, people can't do math anymore because the calculator is there," right? So it's a similar sort of moment, just like a calculator for the next level. But, again- >> I see it more like open source, Sarbjeet, because if you think about what ChatGPT's doing, you do a query and it comes from somewhere. The value of a post from ChatGPT is just a reuse of AI. The original content will come from a human. So if I lay out a paragraph from ChatGPT, it did some heavy lifting on some facts, I check the facts, it saves me maybe- >> Yeah, it's productive. >> An hour of writing, and then I write a killer two, three sentences of, like, sharp original thinking or critical analysis. I then took that body of work, open source content, and then laid something on top of it. >> And Sarbjeet's example is a good one, because if the calculator kids don't do math as well anymore, like the slide rule, remember we had slide rules as kids. Remember when we first started using Waze? We were this minority and you had an advantage over other drivers. Now Waze is like, you know, social traffic navigation, everybody has it, you know- >> All the back roads are crowded. >> They're car crowded. (group laughs) Exactly. All right, let's move on.
What about this notion that futurist Roy Amara put forth, Amara's Law, that we're showing here: "We tend to overestimate the effect of technology in the short run and underestimate it in the long run." Is that the case, do you think, with ChatGPT? What do you think Sarbjeet? >> I think that's true actually. There's a lot of, >> We don't debate this. >> There's a lot of awe, like when people see the results from ChatGPT, they say what, what the heck? Like, it can do this? But then if you use it more and more and more, and I ask a set of similar questions, not the same question, it gives you, like, the same answer. It's like reading from the same bucket of text (indistinct) where the ChatGPT, you will see that in a couple of segments. It sounds so boring that ChatGPT is coming out with the same two sentences every time. So it is kind of good, but it's not as good as people think it is right now. But we will go through this, you know, hype sort of cycle and get realistic with it. And in the long term, I think it's a great thing; in the short term, it's not something which will (indistinct) >> What's your counterpoint? You're saying it's not. >> No, I think the premise was, it's hyped up in the short term and underestimated in the long term. That's what I think he said, quote. >> Yes, yeah. That's what he said. >> Okay, I think that's wrong in this case, because ChatGPT is a unique kind of impact and it's very generational. People have been comparing it, I have been comparing it to the internet, like the web browsers, Mosaic and Netscape Navigator, right. I mean, I clearly still remember the days seeing Navigator for the first time, wow. And there weren't many sites you could go to, everyone typed in, you know, cars.com, you know. >> That (indistinct) wasn't that overhyped at the beginning and underestimated?
>> No, it was, it was underestimated long run, people thought. >> But that's Amara's Law. >> That's what it is. >> No, they said overestimated? >> Overhyped near term, underestimated long term. I got it right, I mean? >> Well, yeah okay, so I would then agree, okay then- >> We were off the charts about the internet in the early days, and it actually exceeded our expectations. >> Well there were people who were, like, poo-pooing it early on. So when the browser came out, people were like, "Oh, the web's a toy for kids." I mean, in 1995 the web was a joke, right? So in '96, you had online populations growing, so you had structural changes going on around the browser, the internet population. And then that replaced other things, direct mail, other business activities that were once analog then went to the web, kind of read-only, as we always talk about. So I think that's a moment where, on the long term, the smart money and the smart industry experts all get the long term. And in this case, there's more poo-pooing in the short term. "Ah, it's not a big deal, it's just AI." I've heard many people poo-pooing ChatGPT, and a lot of smart people saying, "No, this is next gen, this is different and it's only going to get better." So I think people are estimating a big long game on this one. >> So you're saying it's bifurcated. There's those who say- >> Yes. >> Okay, all right, let's get to the heart of the premise, and possibly the debate for today's episode. Will OpenAI's early entry into the market confer sustainable competitive advantage for the company? And if you look at the history of the technology industry, it's kind of littered with first mover failures. Altair, IBM, Tandy, Commodore, and Apple even, they were really early in the PC game. They took a backseat to Dell, who came on the scene years later with a better business model.
Netscape, you were just talking about, was all the rage in Silicon Valley, with the first browser, drove up all the housing prices out here. AltaVista was the first search engine to really, you know, index full text. >> Owned by Dell, I mean DEC. >> Owned by Digital. >> Yeah, Digital Equipment. >> Compaq bought it. And of course as an aside, Digital, they wanted to showcase their hardware, right? Their supercomputer stuff. And then Friendster and MySpace, they came before Facebook. The iPhone certainly wasn't the first mobile device. So lots of failed examples, but there are some recent successes like AWS and cloud. >> You could say smartphone. So I mean. >> Well I know, and we can parse this, so we'll debate it. Now Twitter, you could argue, had first mover advantage. You kind of gave me that one, John. Bitcoin and crypto clearly had first mover advantage, and sustained it. Guys, will OpenAI make it to the list on the right with ChatGPT, what do you think? >> I think categorically as a company, it probably won't, but as a category, I think what they're doing will. So OpenAI as a company, they get funding, there's power dynamics involved. Microsoft put a billion dollars in early on, then they just ponied it up. Now they're reportedly putting in 10 billion more. So, like, with the browsers, Microsoft had competitive advantage over Netscape, used monopoly power, and was convicted by the Department of Justice for killing Netscape with their monopoly. Netscape should have won that battle, but Microsoft killed it. In this case, Microsoft's not killing it, they're buying into it. So I think the embrace-and-extend Microsoft power play here makes OpenAI vulnerable as a one-vendor solution. So OpenAI as a company might not make the list, but the category of what this is, large language model AI, will probably be on the right-hand side.
>> Okay, we're going to come back to the government intervention and maybe do some comparisons, but what are your thoughts on this premise here? That ChatGPT's early entry into the market will not confer competitive advantage to >> For OpenAI. >> To Open- Yeah, do you agree with that? >> I agree with that actually. Because Google has been at it, and they have been holding back, as John said, because of the scrutiny from the Fed, right, so- >> And privacy too. >> And the privacy and the accuracy as well. But I think Sam Altman and the company, those guys, right? They have put this out there in a hasty way, you know, because it makes mistakes, and there are a lot of questions around, sort of, where the content is coming from. You saw that in your example, it just stole the content, without your permission, you know? >> Yeah. So as a quick aside- >> And it writes code on people's behalf, and that code can be wrong. So there's a lot of, sort of, false information it's putting out there. So it's a very vulnerable thing to do what Sam Altman- >> So even though it'll get better, others will compete. >> So look, just a side note, a term which Reid Hoffman used a little bit. Like he said, it's an experimental launch, like, you know, it's- >> It's pretty damn good. >> It is clever because according to Sam- >> It's more than clever. It's good. >> It's awesome, if you haven't used it. I mean, you read what it writes and you go, "This thing writes so well, it writes so much better than you." >> The human emotion drives that too. I think that's a big thing. But- >> I want to add one more- >> Make your last point. >> Last one. Okay. So, but he's still holding back. He's conducting quite a few interviews. If you want to get the gist of it, there's a StrictlyVC interview from yesterday with Sam Altman. Listen to that one, it's eye-opening where they want to take it.
But the last point I want to make is that Satya Nadella yesterday did an interview with the Wall Street Journal. I think he was doing- >> You were not impressed. >> I was not impressed because he was pushing it too much. So Sam Altman's holding back so there's less backlash. >> Got 10 billion reasons to push. >> I think he's almost- >> Microsoft just laid off 10,000 people. Hey ChatGPT, find me a job. You know, like. (group laughs) >> He's overselling it to an extent that I think it will backfire on Microsoft. And he's over-promising a lot of stuff right now, I think. I don't know why he's so jittery about all these things. And he did the same thing during Ignite as well. So he said, "Oh, this AI will write code for you and this and that." Like you called him out- >> The hyperbole- >> During your- >> from Satya Nadella, he's got a lot of hyperbole. (group talks over each other) >> All right, let's, go ahead. >> Well, can I weigh in on the whole- >> Yeah, sure. >> Microsoft thing on whether OpenAI, here's the take on this. I think it's more like the browser moment to me, because I could relate to that experience with ChatGPT, personally, emotionally, when I saw that, and I remember vividly- >> You mean that aha moment (indistinct). >> Like this is obviously the future. Anything else in the old world is dead, websites are going to be everywhere. It was just instant dot connection for me. And a lot of other smart people saw this. A lot of people, by the way, didn't see it. Someone said the web's a toy. At the company I worked for at the time, Hewlett Packard, they could have been in, they had invented HTML, and so, like, all this stuff was just passed over, the web was just being passed over. But at that time, the browser got better, more websites came on board. So the structural advantage there was online web usage was growing, the online user population. So that was growing exponentially with the rise of the Netscape browser.
>> So OpenAI could stay on the right side of your list as durable, if they leverage the category that they're creating and can get the scale. And if they can get the scale, just like Twitter, which failed so many times but still hung around. It was a product that was always successful, right? So I mean, it should have- >> You're right, it was terrible, we kept coming back. >> The fail whale, but it still grew. So OpenAI has that moment. They could do it if Microsoft doesn't meddle too much with too much power as a vendor. They could be the Netscape Navigator, without the anti-competitive behavior of somebody else. So to me, they have the pole position. They have an opportunity. And if they don't execute, then there's opportunity for others. There's not a lot of barriers to entry, vis-a-vis, say, the CapEx of a cloud company like AWS. You can't replicate that, many have tried, but I think you can replicate OpenAI. >> And we're going to talk about that. Okay, so real quick, I want to bring in some ETR data. This isn't an ETR-heavy segment, only because this is so new, you know, they don't have coverage yet, but they do cover AI. So basically what we're seeing here is a slide where the vertical axis is net score, which is a measure of spending momentum, and the horizontal axis is presence in the dataset. Think of it as, like, market presence. And in the insert right there, you can see how the dots are plotted, the two columns. And the key point here that we want to make: there's a bunch of companies on the left, like, you know, DataRobot and C3 AI and some others, but the big whales, Google, AWS, Microsoft, are really dominant in this market. So that's really the key takeaway, that, can we- >> I notice IBM is way low. >> Yeah, IBM's low, and actually, bring that back up, but then you see Oracle, who actually is injecting.
So I guess the other point is, you're not necessarily going to go buy AI and, you know, build your own AI. It's going to be there, Salesforce is going to embed it into its platform, the SaaS companies will, and you're going to purchase AI. You're not necessarily going to build it. But some companies obviously are. >> I mean, to quote IBM's general manager Rob Thomas, "You can't have AI without IA," information architecture. >> You can't have AI without IA. >> Right, you can't have AI without IA. If you have an information architecture, you then can power AI. Yesterday David Flynn, with Hammerspace, was on our Supercloud. He was pointing out that the relationship of storage, where you store things, also impacts the data and its addressability, and Zhamak from Nextdata, she was pointing out that same thing. So the data problem factors into all this too, Dave. >> So you got the big cloud and internet giants, they're all poised to go after this opportunity. Microsoft is investing up to 10 billion. Google's code red, which was, you know, the headline in the New York Times. Of course Apple is there, and there are several alternatives in the market today, models like Chinchilla and BLOOM, and there's a company Jasper and several others. And then Lina Khan looms large, and the governments around the world, EU, US, China, are all taking notice before the market really has coalesced around a single player. You know, John, you mentioned Netscape, the US government was way late to that game. It was kind of game over. And Netscape, I remember Barksdale was like, "Eh, we're going to be selling software in the enterprise anyway," and then, pshew, the company just dissipated. But it looks like the US government, especially with Lina Khan, is changing the definition of antitrust and the cause to go after people, and they're much more aggressive. It's only what, two years ago that (indistinct).
>> Yeah, the problem I have with the federal oversight is this: they're always, like, late to the game, and they're slow to catch up. So in other words, they're working on stuff that should have been solved a year and a half, two years ago, around some of the social networks hiding behind some of the rules around the open web back in the day, and I think- >> But they're like 15 years late to that. >> Yeah, and now they got this new thing on top of it. So, like, I just worry about them getting their fingers in it. >> But it's only two years in, you know, for OpenAI. >> No, but the thing (indistinct). >> No, they're still fighting other battles. But the problem with government is that they're going to label Big Tech as, like, an evil thing, like Pharma, it's like smoke- >> You know Lina Khan wants to kill Big Tech, there's no question. >> So I think Big Tech is getting a very seriously bad rap. And I think anything that the government does that shades darkness on tech is politically motivated in most cases. You can almost look at everything, and my 80/20 rule is in play here. 80% of the government activity around tech is bullshit, it's politically motivated, and the 20% is probably relevant, but off the mark and not organized. >> Well, market forces have always been the determining factor of success. The governments, you know, have pretty much failed. I mean, you look at IBM's antitrust, what did that do? The market ultimately beat them. You look at Microsoft back in the day, right? Windows 95 was peaking, the government came in. But you know, like you said, they missed the web, right, and >> so they were hanging on- >> There's nobody in government >> to Windows. >> that actually knows- >> And so, I think you're right. It's market forces that are going to determine this. But Sarbjeet, what do you make of Microsoft's big bet here, you weren't impressed with Nadella. How do you think, where are they going to apply it?
Is this going to be a Hail Mary for Bing, or is it going to be applied elsewhere? What do you think? >> They are saying that they will, sort of, weave this into their products, office products, productivity, and also into writing code, developer productivity as well. That's a big play for them. But coming back to your antitrust sort of comments, right? I believe your comment was like, oh, the Fed was 10 or 15 years late earlier, but now they're two years. But things are moving very fast now as compared to how they used to move. >> So two years is like 10 years. >> Yeah, two years is like 10 years. Just want to make that point. (Dave laughs) This thing is going like wildfire. Any new tech which comes in, I think they're going after the distribution channels. Lina Khan has commented time and again that the marketplace model is something she wants to have some grip on. Cloud marketplaces are kind of monopolistic in a way. >> I don't, I don't see this, I don't see a Chat AI. >> You told me it's not Bing, you had an interesting comment. >> No, no. First of all, this is great for Microsoft. If you're Microsoft- >> Why? >> Because Microsoft doesn't have the AI chops that Google has, right? Google has got so much core competency in how they run their search, how they run their backends, their cloud. Even though they don't get a lot of cloud market share in the enterprise, they got a kick-ass cloud 'cause they needed one. >> Totally. >> They invented SRE. I mean, Google's development and engineering chops are off the scales, right? Amazon's got some good chops, but Google's got like 10 times more chops than AWS in my opinion. Cloud's a whole different story. Microsoft gets AI, they get a playbook, they get a product they can render it into, not only Bing, but productivity software, helping people write papers, PowerPoint. Also don't forget the cloud, AI can super-help there.
We had this conversation on our Supercloud event, where AI's going to do a lot of the heavy lifting, from understanding observability and managing service meshes, to managing microservices, to turning applications on and off, and maybe writing code in real time. So there's a plethora of use cases for Microsoft to deploy this. Combined with their R&D budgets, they can then turbocharge more research and build on it. So I think this gives them a card in the game. Google may have pole position with AI, but this puts Microsoft right in the game, and they already have a lot of stuff going on. But this just, I mean, everything gets lifted up. Security, cloud, productivity suite, everything. >> What's under the hood at Google, and why aren't they talking about it? I mean, they've got to be freaked out about this. No? Or do they have kind of a magic bullet? >> I think they have the chops, definitely. Magic bullet, I don't know where they are as compared to the GPT-3 or 4 models. But if you look at the online activity and the videos put out there from Google technology folks, those are the accounts you should look at. All these techniques that ChatGPT has used, they have been talking about for a while as well. So it's not like it's a secret thing that you cannot replicate. As you said earlier, in the beginning of this segment, anybody who has more data and the capacity to process that data will win this, and Google has both. >> Obviously living in Palo Alto, where the Google founders are, and Google's headquarters next town over, we have- >> We're so close to them. We have inside information on some of the thinking, and that hasn't been reported by any outlet yet. And that is, from what I'm hearing from my sources, Google has it, they don't want to release it for many reasons.
One is it might screw up their search monopoly. Two, they're worried about the accuracy, 'cause Google will get sued. 'Cause a lot of people are jamming on this ChatGPT, like, "Oh, it does everything for me," when it's clearly not a hundred percent accurate all the time. >> So Lina Khan is looming, and so Google's like, be careful. >> Yeah, so Google's just like, this could be a third rail. >> But the first thing you said is a concern. >> Well no. >> The disruptive (indistinct) >> What they will do is a Waymo kind of thing, where they spin out a separate company. >> They're doing that. >> The discussions are happening; they're going to spin out a separate company and put it over there, saying, "This is AI, we've got search over there, don't touch that search, 'cause that's where all the revenue is." (chuckles) >> So, okay, so that's how they deal with the Clay Christensen dilemma. What's the business model here? I mean, it's not advertising, right? Is it to charge you for a query? How do you make money at this? >> It's a good question. I mean, my thinking is, first of all, it's cool to type stuff in and see a paper get written, or write a blog post, or gimme a marketing slogan for this or that, or write some code. I think the API side of the business will be critical. And I think Howie Xu, I know you're going to reference some of his comments yesterday on Supercloud, I think this brings a whole 'nother user interface into technology consumption. The business model's not yet clear, but it will probably be either some sort of API and developer environment, or just a straight-up free consumer product with some sort of freemium backend thing for business. >> And he was saying too, natural language is the way in which you're going to interact with these systems. >> I think it's APIs, APIs, APIs, because for these people who are cooking up these models, it takes a lot of compute power to train them, and for inference as well.
Somebody did the analysis of how many cents a Google search costs Google, and how many cents a ChatGPT query costs. It's, you know, 100x or something like that. You can take a look at that. >> A 100x on which side? >> You're saying two orders of magnitude more expensive for ChatGPT >> Much more, yeah. >> Than for Google. >> It's very expensive. >> So Google's got the data, they got the infrastructure and they got, you're saying they got the cost (indistinct) >> No, actually it's a simple query as well, but they are trying to put together the answers, and they're going through a lot more data versus already-indexed data, you know. >> Let me clarify, you're saying that Google's version of ChatGPT is more efficient? >> No, I'm saying Google search results. >> Ah, search results. >> What we're used to today, but cheaper. >> But does that confer advantage to Google's large language (indistinct)? >> It will, because there's deep science (indistinct). >> Google, I don't think Google search is doing a large language model on their search, it's keyword search. You know, what's the weather in Santa Cruz? Or what's the weather going to be? Or, you know, how do I find this? Now they have done a smart job of doing some things with those queries, auto-complete, redirect navigation. But it's not entity-based. It's not like, "Hey, what's Dave Vellante thinking this week in Breaking Analysis?" ChatGPT might get that, because it'll get your Breaking Analysis, it'll synthesize it. There'll be some, maybe some clips. It'll be like, you know, I mean. >> Well, I've got to tell you, I asked ChatGPT, like, I said, I'm going to enter a transcript of a discussion I had with Nir Zuk, the CTO of Palo Alto Networks, and I want you to write a 750-word blog. I never input the transcript. It wrote a 750-word blog. It attributed quotes to him, and it just pulled a bunch of stuff and said, okay, here it is.
It talked about Supercloud, it defined Supercloud. >> It's made, it makes you- >> Wow. But it was a big lie. It was fraudulent, but still, it blew me away. >> Again, vanilla content and inaccurate content. So we are going to see a surge of misinformation on steroids, but I call it the vanilla content. Wow, that's just so boring, (indistinct). >> There's so many dangers. >> Make your point, 'cause we're almost out of time. >> Okay, so the consumption, like, how do you consume this thing. As humans, we are consuming it and we are, like, getting nicely, surprisingly shocked, you know, wow, that's cool. It's going to increase productivity and all that stuff, right? And on the danger side as well, the bad actors can take hold of it and create fake content, and we have this fake sort of intelligence out there. So that's one thing. The second thing is, we as humans are consuming this as language. Like, we read it, we listen to it, whatever format we consume it in, but the ultimate usage of that will be when the machines can take that output from the likes of ChatGPT, and do actions based on that. The robots can work, the robot can paint your house, we were talking about, right? Right now we can't do that. >> Data apps. >> So the data has to be ingested by the machines. It has to be digestible by the machines. And the machines cannot digest unorganized data right now; we will get better on the ingestion side as well. So we are getting better. >> Data, reasoning, insights, and action. >> I like that, paint my house. >> So, okay- >> By the way, that means drones will come in, spray-painting your house. >> Hey, it wasn't too long ago that robots couldn't climb stairs, as I like to point out. Okay, and of course it's no surprise the venture capitalists are lining up to eat at the trough, as I'd like to say.
Let's hear, you'd referenced this earlier, John, let's hear what AI expert Howie Xu said at the Supercloud event, about what it takes to clone ChatGPT. Please, play the clip. >> So one of the VCs actually asked me the other day, right? "Hey, how much money do I need to spend, invest to get a, you know, another shot at the OpenAI sort of level." You know, I did a (indistinct) >> Line up. >> A hundred million dollars is the order of magnitude that I came up with, right? You know, not a billion, not 10 million, right? So a hundred- >> Guys, a hundred million dollars, that's an astoundingly low figure. What do you make of it? >> I was in an interview with, I was interviewing, I think he said hundred million or so, but in the hundreds of millions, not a billion, right? >> You were trying to get him up, you were like "Hundreds of millions." >> Well I think, I- >> He's like, eh, not 10, not a billion. >> Well first of all, Howie Xu's an expert in machine learning. He's at Zscaler, he's a machine learning AI guy. But he comes from VMware, he's got his technology pedigree really off the chart. Great friend of theCUBE and kind of like a CUBE analyst for us. And he's smart. He's right. I think the barriers to entry from a dollar standpoint are lower than say the CapEx required to compete with AWS. Clearly, the CapEx spending to build all the tech to run a cloud. >> And you don't need a huge sales force. >> And in some cases apps too, it's the same thing. But I think it's not that hard. >> But am I right about that? You don't need a huge sales force either. It's, what, you know >> If the product's good, it will sell, this is a new era. The better mousetrap will win. This is the new economics in software, right? So- >> Because you look at the amount of money Lacework, and Snyk, Snowflake, Databricks. Look at the amount of money they've raised. I mean it's like a billion dollars before they get to IPO or more. 'Cause they need promotion, they need go-to-market.
You don't need (indistinct) >> OpenAI's been working on this for five-plus years, it wasn't born yesterday. Took a lot of years to get going. And Sam is depositioning all the success, because he's trying to manage expectations, to your point earlier, Sarbjeet. It's like, yeah, he's trying to "Whoa, whoa, settle down everybody, (Dave laughs) it's not that great," because he doesn't want to fall into that, you know, hero and then get taken down, so. >> It may take a hundred million or 150 or 200 million to train the model. But for the inference machine, it will take a lot more, I believe. >> Give it, so imagine, >> Because- >> Go ahead, sorry. >> Go ahead. But because it consumes a lot more compute cycles and a certain level of storage and everything, right, which they already have. So I think the compute is different. To train the model is a different cost. But to run the business is different, because I think 100 million can go into just fighting the Fed. >> Well there's a flywheel too. >> Oh that's (indistinct) >> (indistinct) >> We are running the business, right? >> It's an interesting number, but there's also context to it. So here, spend a hundred million and you get there, but you've got to factor in the fact that the way companies win these days is critical mass scale, hitting a flywheel. If they can keep that flywheel of the value that they got going on and get better, you can almost imagine a marketplace where, hey, we have proprietary data, we're SiliconANGLE and theCUBE. We have proprietary content, CUBE videos, transcripts. Well wouldn't it be great if someone in a marketplace could sell a module for us, right? We buy that, Amazon's thing and things like that. So if they can get a marketplace going where you can apply data sets that may be proprietary, you can start to see this become bigger. And so I think the key barrier to entry is going to be success. I'll give you an example, Reddit.
Reddit is successful and it's hard to copy, not because of the software. >> They built the moat. >> Because you can buy Reddit open source software and try to compete. >> They built the moat with their community. >> Their community, their scale, their user expectation. Twitter, we referenced earlier, that thing should have gone under the first two years, but there was such a great emotional product. People would tolerate the fail whale. And then, you know, well that was a whole 'nother thing. >> Then a plane landed in (John laughs) the Hudson and it was over. >> I think verticals, a lot of verticals will build applications using these models, like for lawyers, for doctors, for scientists, for content creators, for- >> So you'll have many hundreds of millions of dollars of investments that are going to be seeping out. If, all right, we got to wrap, if you had to put odds on it that OpenAI is going to be the leader, maybe not a winner-take-all leader, but like you look at like Amazon and cloud, they're not winner take all, these aren't necessarily winner-take-all markets. It's not necessarily a zero-sum game, but let's call it winner take most. What odds would you give that OpenAI 10 years from now will be in that position? >> If I'm 0 to 10 kind of thing? >> Yeah, it's like a horse race, 3 to 1, 2 to 1, even money, 10 to 1, 50 to 1. >> Maybe 2 to 1, >> 2 to 1, that's pretty low odds. That's basically saying they're the favorite, they're the front runner. Would you agree with that? >> I'd say 4 to 1. >> Yeah, I was going to say I'm like a 5 to 1, 7 to 1 type of person, 'cause I'm a skeptic with, you know, there's so much competition, but- >> I think they're definitely the leader. I mean you got to say, I mean. >> Oh there's no question. There's no question about it. >> The question is can they execute? >> They're not Friendster, is what you're saying. >> They're not Friendster and they're more like Twitter and Reddit where they have momentum.
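As an aside, the horse-race odds being tossed around here map to implied probabilities with simple arithmetic: odds of N-to-1 against imply a 1/(N+1) chance of winning. A minimal sketch of that conversion, using the odds values quoted in the conversation:

```python
# Convert "N to 1 against" betting odds into the implied probability of winning.
# 2-to-1 implies roughly a 33% chance; 7-to-1 implies 12.5%.

def implied_probability(n, m=1):
    """Probability implied by odds of n-to-m against: m / (n + m)."""
    return m / (n + m)

for odds in (2, 3, 4, 5, 7, 10, 50):
    print(f"{odds} to 1 -> {implied_probability(odds):.1%}")
```

So "2 to 1" really is front-runner territory, about a one-in-three chance in a crowded field, while the skeptics' 5-to-1 and 7-to-1 put OpenAI in the 12-17% range.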
If they can execute on the product side, and if they don't stumble on that, they will continue to have the lead. >> If they stay neutral, as Sam has been saying, that, hey, Microsoft is one of our partners, if you look at their company model, how they have structured the company, then they're going to pay back the investors, like Microsoft is the biggest one, up to a certain point, like by a certain number of years, they're going to pay back from all the money they make, and after that, they're going to give the money back to the public, to the, I don't know who they give it to, like a non-profit or something. (indistinct) >> Okay, the odds are dropping. (group talks over each other) That's a good point, though. >> Actually they might have done that to fend off the criticism of this. But it's really interesting to see the model they have adopted. >> The wildcard in all this, my last word on this is that, if there's a developer shift in how developers and data can come together again, we have conferences around the future of data, Supercloud and meshes versus, you know, how the data world, coding with data, how that evolves will also dictate, 'cause a wildcard could be a shift in the landscape around how developers are using either machine learning or AI-like techniques to code into their apps, so. >> That's fantastic insight. I can't thank you enough for your time, on the heels of Supercloud 2, really appreciate it. All right, thanks to John and Sarbjeet for the outstanding conversation today. Special thanks to the Palo Alto studio team. My goodness, Anderson, this great backdrop. You guys got it all out here, I'm jealous. And Noah, really appreciate it, Chuck, Andrew Frick and Cameron, Andrew Frick switching, Cameron on the video lake, great job. And Alex Myerson, he's on production, manages the podcast for us, Ken Schiffman as well. Kristen Martin and Cheryl Knight help get the word out on social media and our newsletters.
Rob Hof is our editor-in-chief over at SiliconANGLE, does some great editing, thanks to all. Remember, all these episodes are available as podcasts. All you got to do is search Breaking Analysis podcast, wherever you listen. Published each week on wikibon.com and siliconangle.com. Want to get in touch, email me directly, david.vellante@siliconangle.com or DM me at dvellante, or comment on our LinkedIn post. And by all means, check out etr.ai. They got really great survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, we'll see you next time on Breaking Analysis. (electronic music)

Published Date : Jan 20 2023


Jeff Bloom & Keith McClellan


 

(upbeat techno music) >> Hello, wonderful cloud community, and welcome to theCUBE's continuing coverage of AWS re:Invent. My name is Savannah Peterson, and I am very excited to be joined by two brilliant gentlemen today. Please welcome Keith from Cockroach Labs and Jeff from AMD. Thank you both for tuning in, coming in from the East Coast. How you doing? >> Not too bad. A little cold, but we're going. >> Doing great. >> Love that, and I love the enthusiasm, Keith, you're definitely bringing the heat in the green room before we got on, so I'm going to open this up with you. Cockroach Labs puts out a pretty infamous and useful cloud report each year. Can you tell us a little bit about that, the approach and the data that you report on? >> Yeah, so Cockroach Labs builds a distributed SQL database that we are able to run across multiple cloud regions, multiple sites, multiple data centers. It's frequently running a hybrid kind of use case, and it's important for our customers to be able to compare the performance of configurations when they don't have exactly the same hardware available to them in every single location. So since we were already doing this internally for ourselves and for our customers, we decided to turn it into something we shared with the greater community. And it's been a great experience for us. A lot of people come and ask us every year, "Hey, when's the new cloud report coming out?" Because they want to read it. It's been a great win for us. >> How many different things are you looking at? I mean, when you're comparing configurations I imagine there's a lot of different complex variables there. Just how much are you taking into consideration when you publish this report? >> Yeah, so we look at micro benchmarks around CPU, network and storage. And then our flagship benchmark is we use the database itself, where we have the most expertise, to create a real-world benchmark across all of these instances.
This year I think we tested over 150 different discrete configurations and it's a bit of a labor of love for us, because then not only do we consume it for best practices for our own as-a-service offering, but we share it with our customers. We use it internally to make all kinds of different decisions. >> Yeah, 150 different comparisons is not a small number. And Jeff, I know that AMD's position in this cloud report is really important. Where do you fit into all of this and what does it mean for you? >> Right, so what it means for us and for our customers is, there's a good breadth and depth of testing that has gone on in the lab. And you look at this cloud report and it helps them traverse this landscape of why to go with instance A, B, or C on certain workloads. And it really is very meaningful, because they now have the real data across all those dimensional kinds of tests. So this definitely helps not only the customers but also ourselves. So we can now look at ourselves more independently for feedback loops and say, "Hey, here's where we're doing well, here's where we're doing okay, here's where we need to improve on." All those things are important for us. So love seeing the lab present such a great report, very comprehensive, so I very much appreciate it. >> And specifically I love that you're both fans of each other, obviously, specifically digging in there, what does it mean that AMD had the best performance ratio tested on AWS instances? >> Yeah, so when we're looking at instances, we're not just looking at how fast something is, we're also looking at how much it costs to get that level of performance, because CockroachDB as a distributed system has the opportunity to scale up and out.
And so rather than necessarily wanting the fastest single-instance performance, which is an important metric for certain use cases for sure, the comparison of price for performance when you can add nodes to get more performance can be a much more economical thing for a lot of our customers. And so AMD has had a great showing on the price performance ratio for I think two years now. And it makes it hard to justify other instance types in a lot of circumstances, simply because it's cheaper to get, for each transaction per second that you need, it's cheaper to use an AMD instance than it would be a competitive instance from another vendor. >> I mean, everyone I think no matter their sector wants to do things faster and cheaper, and you're able to achieve both, it's easy to see why it's a choice that many folks would like to make. So what do these results mean for CIOs and CTOs? I can imagine there's a lot of value here in the FinOps world. >> Yep. Oh, I'll start a few of 'em. So from the C-suite, when they're really looking at the problem statement, think of it as less granular, but higher level. So they're really looking at CapEx, OpEx, sustainability, security, sort of the ecosystem there. And then as Keith pointed out, hey, there's this TCO conversation that has to happen. In other words, as they're moving from sort of this lift and shift from their on-prem into the cloud, what does that mean to them for spend? So now if you're looking at the consistency around sort of the performance and the total cost of running this, to their insights, to the conclusions, less time, more money in their pocket, and maybe a reduction for their own customers so they can provide better for the customer side. What you're actually seeing is the challenge that they're facing in that landscape they're driving towards, and they need guidance and help with that.
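The price-for-performance framing Keith describes, dollars per unit of throughput rather than raw speed, can be sketched in a few lines. The instance names, hourly prices, and throughput figures below are invented placeholders, not numbers from the cloud report:

```python
# Hedged sketch: compare hypothetical instance types by cost per transaction/second.
# All names and numbers here are illustrative assumptions, not benchmark results.

instances = {
    "vendor_a.xlarge": {"price_per_hour": 0.17, "txns_per_sec": 1800},
    "vendor_b.xlarge": {"price_per_hour": 0.19, "txns_per_sec": 1750},
}

def cost_per_txn_per_sec(price_per_hour, txns_per_sec):
    """Dollars per hour paid for each transaction/second of sustained throughput."""
    return price_per_hour / txns_per_sec

best = min(instances, key=lambda n: cost_per_txn_per_sec(**instances[n]))
for name, spec in sorted(instances.items()):
    print(f"{name}: ${cost_per_txn_per_sec(**spec):.6f} per txn/sec-hour")
print("best price/performance:", best)
```

Because a distributed database can add nodes, the instance with the lower cost per transaction/second wins even when a pricier instance posts higher single-node throughput, which is the argument being made above.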
And we find AMD lends itself well to that scale-out architecture that connects so well with how cloud microservices are run today. >> It's not surprising to hear that. Keith, what other tips and tricks do you have for CIOs and CTOs trying to reduce FinOps and continue to excel as they're building out? >> Yeah, so there were a couple of other insights that we learned this year. One of those two insights that I'd like to mention is that it's not always obvious what size and shape infrastructure you need to acquire to maximize your cost reductions, right? So we found that smaller instance types by and large had a better TCO than larger instances, even across the exact same configurations, when we kept everything else the same. Smaller instances had a better price performance ratio than the larger instances. The other thing that we discovered this year that was really interesting, we did a bit of a cost analysis on networking. And largely because we're a distributed system, we can span across availability zones, we can span across regions, right? And one of the things we discovered this year is the amount of cost for transferring data between availability zones and the amount of cost for transferring data across regions, at least in the United States, was the same. So you could potentially get more resiliency by spanning your infrastructure across regions than you would necessarily get just spanning across availability zones. So you could be across multiple regions at the same cost as you were across availability zones, which, for something like CockroachDB, we were designed to support those workloads, is a really big and important thing for us. Now you have to be very particular about where you're purchasing your infrastructure and where those regions are. Because those data transfer rates change depending on what the source and the target is.
But at least within the United States, we found that there was a strong correlation to being more survivable if you were in a multi-region deployment, and the cost stayed pretty flat. >> That's interesting. So it's interesting to see what the correlation is between things, and when you think there may be a relationship between variables and when there maybe isn't. So on that note, since it seems like you're both always learning, I can imagine, what are you excited to test or learn about looking forward? Jeff, let's start with you actually. >> For sort of future testing, one of those things is certainly those more scale-out sort of workloads with respect to showing scale. Meaning as I'm increasing the working set, as I'm increasing the number of connections, variability is another big thing of showing that minimization from run to run, because performance is interesting but consistency is better. And on the lower side, with the instance sizes as I was talking about earlier, a (indistinct) architecture lends itself so well to it, because they have the local caching and the CCDs, so that you can now put a number of vCPUs that will benefit from that delivery of the local caching, and drive better performance at the lower side for that scale-out sort of architecture, which is so consistent with the microservices. So I would be looking for more of those dimensional testings, variability across a variety of workloads, where you can go from memory-intense workloads to database persistence store, as well as a blend of the two, Kafka, et cetera.
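Keith's cross-AZ versus cross-region cost observation a moment ago reduces to simple arithmetic: if the per-GB rates match, replication traffic costs the same either way, so the multi-region layout buys extra survivability at roughly flat network cost. The rates and volume below are hypothetical placeholders, not published pricing:

```python
# Hedged sketch: equal per-GB rates mean equal transfer bills, so going
# multi-region adds resilience without adding network cost. Rates are assumptions.

RATE_CROSS_AZ_PER_GB = 0.01      # hypothetical $/GB between availability zones
RATE_CROSS_REGION_PER_GB = 0.01  # hypothetical $/GB between two U.S. regions

def monthly_transfer_cost(gb_per_month, rate_per_gb):
    return gb_per_month * rate_per_gb

replication_gb = 5000  # assumed monthly replication volume
cross_az = monthly_transfer_cost(replication_gb, RATE_CROSS_AZ_PER_GB)
cross_region = monthly_transfer_cost(replication_gb, RATE_CROSS_REGION_PER_GB)
print(f"cross-AZ: ${cross_az:.2f}/mo, cross-region: ${cross_region:.2f}/mo")
```

As the conversation notes, real rates vary by source and target region pair, so the comparison has to be redone per deployment.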
So there's a great breadth and depth of testing that I am looking for, and to connect more with sort of the CTOs and CIOs, the higher level, that really shows them that CapEx, OpEx, sustainability, and provide a bit more around that side of it, because those are the big things that they're focused on, as well as security, the fact that, based on working sets et cetera, AMD has the ability with confidential compute around those kinds of offerings that can start to drive to those outcomes and help from what the CTOs and CIOs are looking for from compliance as well. So set them out (indistinct). >> So you're excited about a lot. No, that's great. That means you're very excited about the future. >> It's a journey that continues, as Keith knows, there's always something new. >> Yeah, absolutely. What about you, Keith? What are you most excited about on the journey? >> Yeah, there are a couple of things I'd like to see us test next year. One of those is to test a multi-region CockroachDB config. We have a lot of customers running in that configuration in production, but we haven't scaled that testing up to the same breadth that we do with our single-region testing, which is what we've based the cloud report on for the past four years. The other thing that I'd really love to see us do, I'm a Kubernetes SME, at least that's kind of my technical background. I would love to see us get to a spot where we're comparing the performance of raw EC2 instances to using that same infrastructure running CockroachDB via EKS, and kind of see what the differences are there. The vast majority of CockroachDB customers are running at least a portion of their infrastructure in Kubernetes. So I feel like that would be a real great value add to the report for the next time that we go about publishing it. >> If you don't mind my adding to that, just to volley it back for a moment.
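Jeff's earlier point that "performance is interesting but consistency is better" can be quantified with run-to-run variability; a common metric is the coefficient of variation (standard deviation divided by mean). The sample throughput numbers below are invented for illustration:

```python
# Hedged sketch: rank hypothetical instances by run-to-run consistency using
# the coefficient of variation (CV). Lower CV = steadier performance.
import statistics

runs = {
    "instance_a": [1900, 1500, 2100, 1700, 1850],  # faster on average, noisy
    "instance_b": [1780, 1800, 1790, 1810, 1795],  # slightly slower, steady
}

def coefficient_of_variation(samples):
    return statistics.stdev(samples) / statistics.mean(samples)

for name, samples in runs.items():
    print(name,
          f"mean={statistics.mean(samples):.0f}",
          f"cv={coefficient_of_variation(samples):.4f}")
```

On these made-up samples, the steadier instance ranks better on CV even though its mean is slightly lower, which is exactly the trade-off being described.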
And also, as I was saying about the scale-out and how it leverages our AMD architecture so well with EKS, specifically around the spin up, spin down. So you think of a whole development life cycle. As they grow and shrink the resources over time, the times of those spin ups to spin downs are expensive. So that has to be reduced as much as possible. And I think they'll see a lot of benefits in AMD's architecture with EKS running on it as well. >> The future is bright. There's a lot of hype about many of the technologies that you both just mentioned, so I'm very curious to see what the next cloud report looks like. Thank you, Keith, and the team for the labor of love that you put into that every year. And Jeff, I hope that you continue to be as well positioned as everyone's innovation journey continues. Keith and Jeff, thank you so much for being on the show with us today. As you know, this is a continuation of our coverage of AWS re:Invent here on theCUBE. My name's Savannah Peterson and we'll see you for our next fascinating segment. (upbeat music)

Published Date : Nov 19 2022


John Chambers, JC2 Ventures & Umesh Sachdev, Uniphore | CUBE Conversation, April 2020


 

>> Announcer: From theCUBE Studios in Palo Alto and Boston, connecting with thought leaders all around the world, this is a Cube Conversation. >> Hey welcome back everybody, Jeff Frick here with theCUBE. We're in our Palo Alto Studios today, having a Cube Conversation, you know, with the COVID situation going on we've had to change our business and go pretty much 100% digital. And as part of that process, we wanted to reach out to our community, and talk to some of the leaders out there, because I think leadership in troubling times is even more amplified in its importance. So we're excited to be joined today by two leaders in our community. First one being John Chambers, a very familiar face from many, many years at Cisco, who's now the founder and CEO of JC2 Ventures. John, great to see you. >> Jeff, it's a pleasure to be with you again. >> Absolutely. And joining him is Umesh Sachdev, he's the co-founder and CEO of Uniphore. First time on theCUBE, Umesh, great to meet you. >> Jeff, thank you for having me, it's great to be with you. >> You as well, and I had one of your great people on the other day, talking about CX, and I think CX is the whole solution. Why did Uber beat cabs, do you want to stand on a corner and raise your hand in the rain? Or do you want to know when the guy's going to come pick you up, in just a couple minutes? So anyway, welcome. So let's jump into it. John, one of your things, that you talked about last time we talked, I think it was in October, wow, how the world has changed.
So I wonder if you could share with the viewers kind of what is your playbook, you've been through a couple of these bumps. Not necessarily like COVID-19, but you've seen a couple bumps over your career. >> So it's my pleasure, Jeff. What I'll do is kind of outline how I believe you use an innovation playbook on everything from acquisitions, to digitizing a company, to dealing with crisis. Let's focus on the playbook for crisis. You are right, and I'm not talking about my age, (John laughing) but this is my sixth financial crisis, and I've been through the late 1990s with the Asian financial crisis, came out of it even stronger at Cisco. Like everybody else we got knocked down in the 2001 tech bubble, came back from it even stronger. Then in 2008, 2009, the Great Recession. We came through that one very, very strong, and we saw that one coming. It's my fourth major health crisis. Some of them turned out to be pretty small. I was in Mexico when the bird flu pandemic hit, with the President of Mexico, when we thought it was going to be terrible. We literally had to cancel the meetings that evening. That's why Cisco built TelePresence. I was in Brazil for the issue with the Zika virus, that never really developed much, and the Olympics went on there, and I only saw one mosquito during the event. It bit me. But what I'm sharing with you is I've seen this movie again and again. And then, with supply chain, which not many people were talking about yet, supply chain crisis, like we saw in Japan with the tsunami. What's happening this time is you're seeing all three at one time, and they're occurring even faster. So the playbook is pretty simple in crisis management, and then it would be fun to put Umesh on the spot and say how closely did you follow it? Did you agree with issues, or did you disagree, et cetera, on it. Now I won't mention, Umesh, that you've got a review coming up shortly from your board, so that should not affect your answer at all.
But the first playbook is being realistic: how much was self-inflicted, how much was market. This one's largely market, but if you had problems before, you've got to address them at the same time. The second thing is what are the five to seven things that are material, what you're going to do to lead through this crisis. That's everything from expense management, to cash preservation. It's about how do you interface to your employees, and how do you build on culture. It's about how do you interface to your customers as they change from their top priority being growth and innovation, to a top priority being cost savings, and the ability to really keep their current revenue streams from churning and moving. And it's about literally, how do you make your big bets for what you want to look like as you move out of this market. Then it's how do you communicate that to your employees, to your shareholders, to your customers, to your partners. Painting the picture of what you look like as you come out. As basic as that sounds, that's what crisis management is all about. Don't hide, be visible, CEOs should take the role on implementing that playbook. Umesh, to you, do you agree? And have fun with it a little bit, I like the give and take. >> I want to see the playbook, do you have it there, just below the camera? (Jeff laughing) >> I have it right here by my side. I will tell you, Jeff, in crisis times and difficult times like these, you count all the things that go right for you, you count your blessings. And one of the blessings that I have, as a CEO, is to have John Chambers as my mentor, by my side, sharing not just the learnings that he had through the crises, but talking through this with me on a regular basis. I've read John's book more than a few times, I bet more than anybody in the world, I've read it over and over. And that, to me, is preparation going into this mode. One of the things that John has always taught me is when times get difficult, you get calmer than usual.
It's one thing when you're cruising on the freeway and you're asked to put on the brakes, but it's quite another when you're in a rocket ship, and accelerating, which is what my company's situation was in the month of January. We were coming out of a year of 300% growth, we were driving towards another 300% growth, hiring tremendously, at a high pace. Winning customers at a high pace, and then this hit us. And so what I had to do, from a playbook perspective, is, you know, take a deep breath, and just for a couple of days, just slow down, and calmly look at the situation. My first few steps were, I reached out to 15 of our top customers, the CEOs, and gave them calls, and said let's just talk about what you're seeing, and what we are observing in our business. We got a sense of where they are in their businesses. We had the benefit, my co-founder works out of Singapore, and runs our Asia business. We had the benefit of picking up the signs probably a month before everyone else did in the U.S. I was with John in Australia, and I was telling John that "John, something unusual is happening, "a couple of our customers in these countries in Asia "are starting to tell us they would do the deal "a quarter later." And it's one thing when one of them says it, it's another when six of them say it together. And John obviously has seen this movie, he could connect the dots early. He told me to prepare, he told the rest of the portfolio companies that are in his investment group to start preparing. We then went to the playbook that John spoke of, being visible. For me, culture and communication take front seat. We have employees in ten different countries, we have offices, and very quickly, even before the governments mandated it, we had all of them, you know, go work from home, and be remote, because employee safety and health was the number one priority. We did our first virtual all-hands meeting on Zoom. We had about 240 people join in from around the world.
And my job as CEO: usually at our all-hands meetings, different functional leaders, different people in the group talked to the team about their initiatives. This all-hands was almost entirely run by me, addressing the whole company about what's going to be the situation from my lens, what have we learned. Be very factual. At the same time, communicating to the team that because of the fact that we raised our funding last year, it was a good amount of money, we still have a lot of that in the bank, so we're going to be very secure. At the same time, our customers are probably going to need us more than ever. Call centers are in more demand than ever; people can't walk up to a bank branch, they can't go to a hospital without taking an appointment. So the first thing everyone is doing is trying to reach call centers. There aren't enough people, and anyways the workforce that call centers have around the world is 50% working from home, so the capacity has dropped. So our responsibility, almost, is to step up, and have our AI and automation products available to as many call centers as we can. So as we are planning our own business continuity, and making sure every single employee is safe, the message to my team was we also have to be aggressive in making sure we are more out there, and more available, to our customers; that would also mean business growth for us. But first and foremost is for us to be responsible citizens, and just make it available where it's needed. As we did that, I quickly went back to my leadership team, and again, the learning from John is usually it's more of a consensus-driven approach, we go around the table, talk about a topic for a couple of hours, get the consensus, and move out of the room. My leadership meetings have become more frequent, we get together once a week, on video call with my executive leaders, and it's largely these days run by me. I broke down the team into five different war rooms, with different objectives.
One of them we called the preservation war room; we said one leader, supported by others, will take the responsibility of making sure every single employee, their families, and our current customers, are addressed, taken care of. So we made somebody lead that group. Another group was made responsible for growth. Business needs to, you know, in a company that's growing at 300%, and we still have the opportunity, because call centers need us more than ever, we wanted to make sure we are responding to growth, and not just hunkering down, and, you know, ignoring the opportunity. So we had a second war room take care of the growth. And a third war room, led by the head of finance, to look at all the financial scenarios, do the stress tests, and see if we are going to be ready for any eventuality that's going to come. Because, you know, we have a huge number of people who work at Uniphore around the world, and we wanted to make sure their well-being is taken care of. So from being over-communicative to the team and customers, and being out there personally, to making sure we break down the teams. We have tremendous talent, and we let different sets of people run different sets of priorities, and report back to me more frequently. And now, as we have settled into this rhythm, Jeff, you know, at least in the Bay Area here, we've been sheltered in place for about a month now. As we are in the rhythm, we are beginning to do virtual happy hours, every Thursday evening. Right after this call, I get together with my team with a glass of wine, and we talk everything but work, and every employee joins, it's not divided by functions, or leadership, and we are getting the rhythm back into the organization. So we've gone and adjusted in the crisis, I would say very well. And the business is just humming along, as we had anticipated going into this crisis.
But I would say, if I didn't have John by my side, if I hadn't read his book the number of times that I have; every plane ride we've done together, every place we've gone together, John has spoken about war stories. About 2001, about 2008, and until you face the first one of your own, just like I did right now, you don't appreciate it when John says leadership is lonely. But having him by our side makes it easier. >> Well I'm sure he's told you the Jack Welch story, right? That you've quoted before, John, where Jack told you that you're not really a good leader yet, until you've been tested, right. So you go through some tough stuff; it's not that hard to lead on an upward-to-the-right curve, it's when things get a little challenging that the real leadership shines through. >> Completely agree, and Jack said it the best. We were on our way to becoming the most valuable company in the world, he looked me in the eye and said "John, you have a very good company." And I knew he was about to give me a teaching moment, and I said "What does it take to have a great one?" He said a near-death experience. And I thought I did that in '97, and some of the other management did, and he said, "No, it's when you went through something "like we went through in 2001, "which many of our peers did die in." And we were knocked down really hard. When we came back from it, you get better. But what you see in Umesh is a very humble, young CEO. I have to remember he's only 34 years old, because his maturity is like he's 50, and he's seen it before. As you can tell, he's like a sponge on learning, and he doesn't mind challenging. And what he didn't say, in his humbleness, is they had the best month in March ever. And again, well over 300% versus the same quarter a year ago. So it shows you, if you're in the right spot, i.e. artificial intelligence, i.e. cost savings, i.e.
customer relationship with their customers, how you can grow even during the tough times, and perhaps set a bold vision, based upon facts and an execution plan that very few companies will be able to deliver on today. So off to a great start, and you can see why I'm so honored and proud to be his strategic partner, and his coach. >> Well it's interesting, right, the human toll of this crisis is horrible, there's a lot of people getting sick, and a lot of people are dying, and all the estimations are a lot more are going to die this month, as hopefully we get over the hump of some of these curves. So that aside, you know, we're here talking more about, kind of, the business side of this thing. And it's really interesting what a catalyst COVID has become, in terms of digital transformation. You know, we've been talking about new ways to work for years, and years, and years, and digital transformation, and all these kinds of things. You mentioned the Cisco telepresence was out years, and decades, ago. I mean I worked at Mitsubishi, we had a phone camera in 1986, I looked it up today, it was ridiculous, didn't work. But now, it's here, right. Now working from home is here. Umesh mentioned, you know, these huge call centers, now everybody's got to go home. Do they have infrastructure to go home? Do they have a place to work at home? Do they have support to go home? Teachers are now being forced, from K-12, and I know it's a hot topic for you, John, to teach from home. Teach on Zoom, with no time to prep, no time to really think it through. It's just like, the kids aren't coming back, we've got to learn it. You know I think this is such a transformational moment, and to your point, if this goes on for weeks, and weeks, and months, and months, which I think we all are in agreement that it will. I think you said, John, you know, many, many quarters.
As people get new habits, and get into this new flow, I don't think they're going to go back to the old ways. So I think it's a real, you know, kind of forcing function for digital transformation. And you can't sit on the sidelines, 'cause your people can't come to the office anymore. >> So you've raised a number of questions, and I'll let Umesh handle the tough part of it. I will answer the easy part, which is I think this is the new normal. And I think it's here now, and the question is are you ready for it. And as you think about it, what we're really saying is that video sessions will become such an integral part of our daily lives that we will not go back to having to do 90% of our work physically. Today alone I've done seven major group meetings, on Zoom, and Google Hangouts, and Cisco Webex. I've done six meetings with individuals, or the key CEOs of my portfolio. So that part is here to stay. Now what's going to be fascinating is does that also lead into digitization of our company, or do companies make the mistake of saying I'm going to use this piece, because it's so obvious, and I get it, in terms of effectiveness, but I'm not going to change the other things in my normal work, in my normal business. This is why, unfortunately, I think you will see, we originally said, Jeff, you remember, 40%, maybe as high as 45%, of the Fortune 500 wouldn't exist in a decade. And perhaps 70% of the start-ups that are venture capital backed wouldn't exist in a decade. I now think, unfortunately, you're going to see 20-35% of the start-ups not exist in 2 years, and I think it's going to shock you with the number of Fortune 500 companies that do not make this transition. So where you're leading this, which I completely agree with, is the ability to take this terrible event, with all of the issues, and again thank our healthcare workers for what they've been able to do to help so many people, and deal with the world the way it is.
As my parents, who are doctors, taught me to do, not the way we wish it was. And then get your facts, prepare for the changes, and get ready for the future. The key will be how many companies do this. In the area Umesh has responsibility for, customer experience, I think you're going to see almost all companies focus on that. So it can be an example of perhaps how large companies learn to use the new technology, not just video capability, but AI, assistance for the agents, and then once they get the feel for it, just like we got the feel for these meetings, change their rhythm entirely. It was a dinner in New York, done virtually. When we stopped traveling, six weeks ago, that was supposed to be a bunch of board meetings, customer meetings, that was easy. But we were supposed to have a dinner with Shake Shack's CEO, and we were supposed to have him come out and show how he does cool innovation. We had a bunch of enterprise companies, and a bunch of media, and subject matter experts; we ended up canceling it, and then we said why not do it virtually? And to your point, we did it in 24 different locations. Half the people, remember, six weeks ago had never even used Zoom. We had milkshakes, and hamburgers, and french fries delivered to their homes. And it was one of the best two-hour meetings I've seen. The future is this now. It's going to change dramatically, and Umesh, I think, is going to be at the front edge of how enterprise companies understand how their relationship with their customers is going to completely transform, using AI, conversational AI capability, speech recognition, et cetera. >> Yeah, I mean, Umesh, we haven't even really got into Uniphore, or what you guys are all about.
But, you know, you're supporting call centers, you're using natural language technology, both on the inbound and all that, give us the overview, but you're playing in so many kinds of innovation spaces, you know. The main interaction now with customers, and a brand, is either through the mobile phone, or through a call center, right. And that's becoming more, and increasingly, digitized. The ability to have a voice interaction with a machine. Fascinating, and really, I think, revolutionary, and kind of, you know, getting us away from these stupid qwerty keyboards, which are supposed to slow us down on purpose. It's still the funniest thing ever, that we're still using these qwerty keyboards. So I wonder if you can share with us a little bit about, you know, kind of your vision of natural language, and how that changes the interaction between people and machines. I think your TED Talk was really powerful, and I couldn't help but think of, you know, kind of mobile versus land lines, in terms of transformation. Transforming telecommunications in rural and hard-to-serve areas, and then actually adding the AI piece, to not only make it better for the front end person, but actually make it better for the person servicing the account. >> Absolutely Jeff, so Uniphore is the company that I founded in 2008. We were talking about what a coincidence it is that I founded the company in 2008, the year of the Great Recession, and here we are again, talking in the midst of the impact that we all have because of COVID. Uniphore does artificial intelligence and automation products for the customer service industry. Call centers, as we know them, have fundamentally, for the last 20, 30 years, not had a major technology disruption. We've seen a couple of waves of business model disruption, where call centers, you know, started to move offshore, to locations in Asia, India, and Mexico.
Our calls started to get routed around the world internationally, but fundamentally, the core technology in call centers, up until very recently, hadn't seen a major shift. With artificial intelligence, with natural language processing, speech recognition available in over 100 languages, and, you know, in the last year or so, automation and RPA sort of adding to that mix, there's a whole new opportunity to rethink what customer service will mean to us in the future. As I think about the next five to seven years, with 5G happening, with 15 billion connected devices, you know, my five-year-old daughter, the first thing she does when she enters the house from the playground, she goes to talk to her friend called Alexa. She speaks to Alexa. So, you know, this next generation of users, and technology users, will grow up with AI, and voice, and NLP all around them. And so their expectation of customer service and customer experience is going to be quantum times higher than what some of us have from our brands. I mean, today when a microwave or a TV doesn't work in our homes, our instinct could be to either go to the website of the brand, and try to do a chat with an agent, or do an 800 number phone call, and get them to visit the house to fix the TV. Like I said, with 5G, with the TV, and microwave, and refrigerator becoming intelligent devices, you know, I could totally see my daughter telling the microwave "Why aren't you working?" And, you know, that question might still get routed to a remote contact center. Now the whole concept of contact center, the word has center in it, which means, in the past, we used to have these physical, massive locations where people used to come in and put on their headsets to receive calls. Like John said, more than ever, we will see these centers become dispersed, and virtual. The channels through which these queries come in will no longer be just the phone; it will be the microwave, the car, the fridge.
And the receivers of these calls will be anywhere in the world, sitting at home, or sitting on a holiday in the Himalayas, handling these situations for us. You know, I was reading, just for everyone to realize how drastic this shift has been for the customer service industry: there are over 14 million workers who work in contact centers around the world. Like I said, the word center means something here. All of them, right now, are working remote. This industry was never designed to work remote. Enterprises fundamentally didn't plan for this. To your point, Jeff, those who thought digitization or automation was a project they could pick up next year, or who were sitting on the fence, now no longer have a choice about making this adjustment. There's a report by a top analyst firm that said by 2023, up to 30% of customer service representatives would be remote. Well guess what, we just blew way past that number right away. And most of the CEOs that I've talked to recently tell me that now that this shift has happened, about 40% of their workers will probably never return to the office. They will always remain a permanent virtual workforce. Now when the workforce is remote, you need all the tools and technology, and AI, so that, A, if on any given day 7-10% of your workforce calls in sick, you have bots, like Amazon's Alexa, taking over a full conversation. Uniphore has a product called Akira, which does that in call centers. Most often, when these call center workers are talking, we have the experience of being put on hold, because call center workers have to type something on their keyboard and take notes. Well guess what, today AI and automation can assist them in doing that, making the call shorter, allowing the call center workers to take a lot more calls in the same time frame. And I don't know your experience, but, you know, a couple of weekends ago, the modem in my house wasn't working.
I had a seven hour wait time with my service provider. Seven hours. I started calling at 8:30, and it was somewhere around 3-4:00, after call backs, wait, call back, wait, that it finally got resolved. It was just a small thing, I just couldn't get to a representative. So the enterprises are truly struggling, and technology can help. They weren't designed to go remote. Think about it, some of the unique challenges that I've heard now, from my customers, are: how do I know that my call center representative, who I've trained over years to be so nice and empathetic, when they take a pee break, or a bio break, doesn't get their 10 year old son to attend a call? How do I know that? Because now I can no longer physically check in on them. How do I know that if I'm a bank, there's compliance? That there's nothing being said that isn't, you know, supposed to be said? Because in a center, in an office, a supervisor can listen in. When everyone's remote, you can't do that. So AI, automation, monitoring, supporting, aiding human beings to take calls much better, and driving automation, as well as AI taking over parts of a complete call, by way of being a bot like Alexa, are sort of the things that Uniphore does, and I just feel that this is a permanent shift that we are seeing. While it's happening for a terrible reason, the virus that's affecting human beings, the shift in business and behavior is going to be permanent in this industry. >> Yeah, I think so. You know it's funny, I had Marten Mickos on as part of this series. And I asked him, he's been doing distributed companies since he was doing MySQL, before Sun bought them. And he was funny, he said it's actually easier to fake it in an office than when you're at home, because at home all you have to show is your deliverables. You can't look busy, you can't be going to meetings, you can't be doing things at your computer. All you have to show is your output.
He said it's actually much more efficient, and it drives people, you know, to manage to the output, manage to what you want. But I want to shift gears a little bit, before we let you go, and really talk a little bit about the role of government. And John, I know you've been very involved with the Indian government, and the French government, trying to help them in their kind of entrepreneurial pursuits, and Uniphore, I think, was founded in India, right, before you moved over here. You know we've got this huge stimulus package coming from the U.S. government, to try to help, as people, you know, can't pay their mortgage; a lot of people aren't so fortunate to be in digital businesses. It's two trillion dollars, so as kind of a thought experiment, I wondered, well, how much is two trillion dollars? And I added up the cash balances of the FAANG companies. Facebook, Apple, Amazon, Netflix, and Alphabet, just looking at Yahoo Finance, the latest numbers that were there. It's 333 billion, compared to two trillion. Even when you add Microsoft's 133 billion on top, it's still shy of 500 billion. You know, and really, the federal government is really the only one in a position to make these kinds of sweeping investments. But should we be scared? Should we be worried about, you know, kind of this big shift in control? And do you think, for these companies with these big balance sheets, as you said John, priorities change a little bit? Should it be, keep that money to pay the people, so that they can stay employed and pay their mortgage, and go buy groceries, and maybe get takeout from their favorite restaurant, versus, you know, kind of what we've seen in the past, where there's a lot more, you know, stock buybacks, and kind of other uses of this cash. As you said, if it's a crisis, and you've got to cut to survive, you've got to do that. But clearly some of these other companies are not in that position.
>> So let me break it into two pieces, Jeff, if I may. The first is, for the first time in my lifetime I have seen the federal government and federal agencies move very rapidly. And if you had told me government could move with the speed we've seen over the last three months, I would have said probably not. The Fed was ahead with both the initial interest rate cuts, and ahead in terms of the slowdown, i.e. your 2 trillion discussion, along with central banks here and around the world. But right behind it was the Treasury, which put 4 trillion on top of that. And only governments can move in this way, but the coordination between government and businesses, and the citizens, has been remarkable. And the citizens have been willing to shelter in place. To your question about India, Prime Minister Modi spent the last five years digitizing his country. And he put in place the most bandwidth of any country in the world, and literally did a transformation of the currency to a virtual currency, so that people could get paid online, et cetera. He then looked at start-ups and job creation, and he positioned the country, when an opportunity or problem came along, to be able to perhaps navigate through it in a way that other countries might struggle with. I would argue President Macron in France is doing a remarkable job with his innovation economy, but also asking how do you preserve jobs. So you suddenly see government doing something that no business can do, with the scale, and the speed, and an equal approach. But at the same time, many of these companies, and being very candid, some that people might have associated with tech for good, or with tech for challenges, have been unbelievably generous in giving, both from the CEO's own pocket, and from the number two and three founders, as well as the companies themselves giving to the CDC, and giving to people to help create jobs.
So I actually like this opportunity for tech to regain its image of being good for everybody in the world, and leadership within the world. And I think it's a unique opportunity. For my start-ups, I've been so proud, Jeff. I didn't have to tell them to go do the right thing with their employees, I didn't have to tell them that you've got to put human lives first, the economy second, but we can do both in parallel. And you saw companies like Sprinklr suddenly say how can I help the World Health Organization anticipate, through social media, where the next spread of the virus is going to be? A company like Bloom Energy, with what KR did there, rebuilding all of the ventilators that were broken here in California, which was about 40% of the stock that they got, because it had been in storage for so long, and doing it for all of California in their manufacturing plant, at cost. A company like Aspire Foods, a cricket company down in Texas, who has 3D printing capabilities, taking part of their production and saying how many thousand masks can I generate per week using 3D printers. You watch what Umesh has done, and how he literally is changing people's lives, and making that experience of working at home, instead of being a negative, perhaps a positive, and increasing customer loyalty in the process, as opposed to when you've got a seven hour wait time on a line. Not only are you probably not going to order anything else from that company, you're probably going to change providers. So what is fascinating to me is I believe companies owe an obligation to be successful, to their employees, and to their shareholders, but also to give back to society. And it's one of the things I'm most proud about with the portfolio companies that I'm a part of, and why I'm so proud of what Umesh is doing, in both an economically successful environment, but really giving back and making a difference.
>> Yeah, I mean, again, there's all the doctor stuff, and the medical stuff, which I'm not qualified to really talk about. Thankfully we have good professionals who have the data, and the knowledge, and know what to do, and got out ahead of the social distancing, et cetera. But on the backside, it really looks like a big data problem in so many ways, right. And now we have massive amounts of compute at places like Amazon, and Google, and we have all types of machine learning and AI to figure out, you know, kind of resource allocation, whether that be hospital beds, or ventilators, or doctors, or nurses, and trying to figure out how to sort that all out. But then all of the, you know, genome work, and, you know, kind of all that big heavy-lifting data crunching, you know, CPU-consuming work, that hopefully is accelerating the vaccine. Because I don't know how we get all the way out of this until, it just seems like a kind of race to the vaccine, or massive testing, so we know that it's not going to spike up. So it seems like there is a real opportunity, it's not necessarily Kaiser building ships, or Ford building planes, but there is a role for tech to play in trying to combat this thing, and bring it under control. Umesh, I wonder if you could just kind of contrast being from India, and now being in the States for a couple of years. Anything kind of jump out to you, in terms of the differences in what you're hearing back home, in the way this has been handled? >> You know, it's been very interesting, Jeff. I'm sure everyone is concerned, but India, for many reasons, so far hasn't become a big hot spot yet. And, you know, we can hope and pray that that remains the case. There are many things that the government back home has done; I think India took lessons from what they saw in Europe, and the U.S., and China.
They went into a countrywide lockdown pretty early, you know, pretty much when they had fewer than two hundred positive tested cases, the country went into lockdown. And remember, this is 1.5 billion people altogether going into lockdown. What I've seen in the U.S. is that, you know, California thankfully reacted fast. We've all been sheltered in place, there's cabin fever for all of us, but you know, I'm sure at the end of the day, we're going to be thankful for the steps that were taken. Both by the administration at the state level, at the federal level, and by the medical doctors, who are doing everything they can. But India, on the other hand, has taken the more aggressive stance, in terms of doing a country lockdown. We just last evening went live at a university in the city of Chennai, where Uniphore was born. The government came out with the request, much like in the U.S., where their government departments were getting a surge of traffic asking for information about COVID: which hospitals are serving, what beds are available, where is the testing? We stood up a voice bot with AI, in less than a week, in three languages. Even before the government started to advertise it, we started to get thousands of calls. And this is AI answering these questions for the citizens. So it goes back to your point that there's a real opportunity to take all the technology that the world has today and put it to good use. And at the same time, it's really about partnering meaningfully with government, in India, in Singapore, in Vietnam, and here in the U.S., to make sure that happens. On, you know, John's coaching and nudging, I became a part of the U.S.-India Strategic Partnership Forum, which is truly the premier trade and commerce body between the U.S. and India. And today I co-chair the start-up program, with, you know, the top start-ups between the U.S. and India being part of that program. And I think we got, again, tremendously fortunate, and lucky with the timeline.
We started working on this start-up program between the U.S. and India, getting the start-ups together, two quarters ago, and as the new regulation with the government support, and the news about the two trillion dollar package, came out, along with the support for small businesses, we could quickly get some of the questions answered for the start-ups. Had we not created this body, we wouldn't have had the ability to poll the Treasury Department and say here are questions: can start-ups do A, B, and C? What do you have by way of regulation? And I think in response to one of our letters, on Monday the Treasury put out an FAQ on their website, which makes it super clear for start-ups and small businesses to figure out whether they qualify or they don't qualify. So I think there's a ton that we can do, both from an individual company perspective, with the technology that each one of us has, but also as a community: how do we, all of us, meaningfully get together, and just drive benefit, both for our people, for the economy, and for our countries. Wherever we have our businesses, like I said, in the U.S., or in India, or parts of Asia. >> Yeah, it's interesting. So, this has been a great conversation, I could talk to you guys all night long, but I probably would hear about it later, so we'll wrap it. But I just want to kind of close on the following thought, which is really, as you've talked about before, John, and as, Umesh, you're now living, you know, when we go through these disruptions, things do get changed, and as you said a lot of people, and companies, don't get through it.
On the other hand, many companies are birthed from it, right? People that are kind of on the new trend and are in a good position to take advantage. And it's not that you're laughing at the people that didn't make it, but it does stir up the pot, and it sounds like, Umesh, you're in a really good position to take advantage of this new kind of virtual world, this new digital transformation, that's not waiting anymore. I love your stat: they were going to move X% out of the call center over some period of time, and then it's basically snap your fingers, everybody out, without much planning. So let me give you the final word, kind of advice for people as they're looking forward. And Umesh, we'll get you on another time, because I want to go deep diving into natural language; I think that's just a fascinating topic, the way that people are going to interact with machines and get rid of the stupid qwerty keyboard. But let me get kind of your last thoughts as we wrap this segment. Umesh, we'll let you go first. >> Umesh, you want to go first? >> I'll go first. My last thoughts are first for the entrepreneurs, everyone who's sort of going through this together. I think difficult times are when real heroes are born. I read a quote that when it's a sunny day, you can't overtake too many cars, but when it's raining you have a real opportunity. And the other one that I read was that when fishermen can't go out fishing because of the high tide, they come back and mend their nets, and are ready for the time that they can go out. So there's no easy way to say it: this is a difficult time for the economy, and health-wise I hope that we can contain the damage that's being done by the virus, but some of us have the opportunity to really take our products and technology out there, more than usual. Uniphore, particularly, has a unique opportunity; the contact center industry just cannot keep up with the traffic that it's seeing.
Around the world, across the U.S., across Asia, across India, the need for AI and automation has never been more pronounced than it is today. As much as it's a great business opportunity, it's more of a responsibility, as I see it, to scale up as fast as the demand is coming, and really come out of this with a much stronger business model. John has always told me that in final words you always paint the picture of what you want to be a year or two out. And I see Uniphore being a much stronger AI-plus-automation company in the customer service space, really transforming the face of call centers and customer service, which have been forced to rethink their core business value in the last few weeks. And every fence sitter who thought that digitalization and automation was an option they could consider in future years will be forced to make those decisions now. And I'm just making sure that my team, and my company, and I are ready and geared for the great responsibility and opportunity that's ahead of us. >> John, I'll give you the final word. >> Say Jeff, I don't know if you can still hear me, we went blank there, maybe for me to follow up. >> We gotcha. >> Shimon Peres taught me a lot about life, and dealing with life the way it is, not the way you wish it was. So did my parents. But he also taught me it always looks darkest just before the tide switches and you move on to victory. I think the challenges in front of us are huge, and I think our nation knows how to deal with them. I do believe the government has moved largely pretty effectively to give us the impetus to move, and if we continue to flatten the curve of the pandemic, if we get some therapeutic drugs that dramatically reduce the risk of death for the people that get hit the worst, and over time a vaccine, then I think you can look to the future: America will rebound, and it will be rebounding around start-ups, new job creation, and using technology in every business.
So not only is there a light at the end of the tunnel, I think we will emerge from this a stronger nation, a stronger start-up community. But it depends on how well we work together as a group, and I just want to say to Umesh, it's an honor to be your coach, and I learn from you as much as I give back. Jeff, as always, you do a great job. Thank you for your time today. >> Thank you both, and I look forward to our next catch up. Stay safe, wash your hands, and thanks for spending some time with us. >> And I just want to say I hope and pray that all of us can get together in Palo Alto real quick, and in person, doing fist bumps, not shaking hands, or probably a namaste. Thank you, it's an honor. >> Thank you very much. All right, that was John and Umesh. You're watching theCUBE from our Palo Alto Studios. Thanks for tuning in, stay safe, wash your hands, keep away from people that you're not that familiar with, and we'll see you next time. Thanks for watching. (calm music)

Published Date : Apr 14 2020

SUMMARY :

Jeff Frick connects with John Chambers of JC2 Ventures and Umesh Sachdev, co-founder and CEO of Uniphore, to talk about leading through the COVID-19 crisis: India's nationwide lockdown, standing up a multilingual AI voice bot for citizens' COVID questions in under a week, the U.S.-India Strategic Partnership Forum's work getting Treasury guidance clarified for start-ups, and why the forced, overnight shift to remote work and automation will permanently change contact centers and customer service.

SENTIMENT ANALYSIS :

ENTITIES

Entity                       Category        Confidence
Amazon                       ORGANIZATION    0.99+
Jeff                         PERSON          0.99+
John                         PERSON          0.99+
Facebook                     ORGANIZATION    0.99+
Apple                        ORGANIZATION    0.99+
Netflix                      ORGANIZATION    0.99+
Jack                         PERSON          0.99+
Alphabet                     ORGANIZATION    0.99+
Marten Mickos                PERSON          0.99+
Umesh Sachdev                PERSON          0.99+
Texas                        LOCATION        0.99+
2001                         DATE            0.99+
Jeff Frick                   PERSON          0.99+
two trillion                 QUANTITY        0.99+
Cisco                        ORGANIZATION    0.99+
Australia                    LOCATION        0.99+
five                         QUANTITY        0.99+
California                   LOCATION        0.99+
Asia                         LOCATION        0.99+
Umesh                        PERSON          0.99+
JC2 Ventures                 ORGANIZATION    0.99+
1986                         DATE            0.99+
Vietnam                      LOCATION        0.99+
Mexico                       LOCATION        0.99+
Aspire Foods                 ORGANIZATION    0.99+
India                        LOCATION        0.99+
seven hour                   QUANTITY        0.99+
April 2020                   DATE            0.99+
Singapore                    LOCATION        0.99+
October                      DATE            0.99+
two trillion dollars         QUANTITY        0.99+
50%                          QUANTITY        0.99+
New York                     LOCATION        0.99+
Uber                         ORGANIZATION    0.99+
90%                          QUANTITY        0.99+
Microsoft                    ORGANIZATION    0.99+
John Chambers                PERSON          0.99+
50                           QUANTITY        0.99+
2008                         DATE            0.99+
Uniphore                     ORGANIZATION    0.99+
15                           QUANTITY        0.99+
Boston                       LOCATION        0.99+
Brazil                       LOCATION        0.99+
U.S.                         LOCATION        0.99+
2 trillion                   QUANTITY        0.99+
Monday                       DATE            0.99+
4 trillion                   QUANTITY        0.99+
World Health Organization    ORGANIZATION    0.99+
Bloom Energy                 ORGANIZATION    0.99+
Palo Alto                    LOCATION        0.99+
March                        DATE            0.99+

Renaud Gaubert, NVIDIA & Diane Mueller, Red Hat | KubeCon + CloudNativeCon NA 2019


 

>> Announcer: Live from San Diego, California, it's theCUBE, covering KubeCon and CloudNativeCon, brought to you by Red Hat, the Cloud Native Computing Foundation, and its ecosystem partners. >>
I love that title on dhe. They're gonna be helping Thio preach the gospel of Cultural Dev ops and agile transformation from a red hat office From now going on, there was a wonderful conversation. I felt privileged to actually get to moderate it and then just amazing people coming forward and sharing their stories. It was a great session. Steve Dake, who's with IBM doing all the SDO stuff? Did you know I've never seen SDO done so well, Deployment explains so well and all of the contents gonna be recorded and up on Aaron. We streamed it live on Facebook. But I'm still, like reeling from the amount of information overload. And I think that's the nice thing about doing a day zero event is that it's a smaller group of people. So we had 600 people register, but I think was 560 something. People show up and we got that facial recognition so that now when they're traveling through the hallways here with 12,000 other people, that go Oh, you were in the room. I met you there. And that's really the whole purpose for comments. Events? >>Yeah, I tell you, this is definitely one of those shows that it doesn't take long where I say, Hey, my brain is full. Can I go home. Now. You know I love your first impressions of Q Khan. Did you get to go to the day zero event And, uh, what sort of things have you been seeing? So >>I've been mostly I went to the lightning talks, which were amazing. Anything? Definitely. There. A number of shout outs to the GPU one, of course. Uh, friend in video. But I definitely enjoyed, for example, of the amazing D. M s one, the one about operators. And generally all of them were very high quality. >>Is this your first Q? Khan, >>I've been there. I've been a year. This is my third con. I've been accused in Europe in the past. Send you an >>old hat old hand at this. Well, before we get into the operator framework and I wanna love to dig into this, I just wanted to ask one more thought. 
A thought about OpenShift and the Commons: the Commons in general, the relationship between OpenShift the offering and the Commons, and OKD, and then maybe the announcement about OKD.io. >> So, a couple of things happened yesterday. Yesterday we dropped OKD 4 as an alpha release, so anyone who wants to test that out and try it out can; it's an all-operators-based deployment of OpenShift, which is what OpenShift 4 is. It's a slightly new architectural deployment methodology based on the operator framework, and we've been working very diligently to populate OperatorHub.io, which is where all of the upstream projects that have operators, like the one that Renaud has created for NVIDIA GPUs, are being hosted, so that anyone can deploy them, whether on OpenShift or any Kubernetes. So that dropped. And yesterday we also announced the open-sourcing of Quay as Project Quay (projectquay.io). So there's a lot of .io going on here. It's a fulfillment, really, of a commitment by Red Hat that whenever we do an acquisition, the code becomes open source. The Quay folks were acquired by CoreOS, and CoreOS was acquired by Red Hat, and Red Hat in turn by IBM. And so in the interim they've been diligently working away to make the code available as open source, and that hit last week. We've got some really interesting end users that are coming up, and we're now looking forward to having them contribute to that project as well. But I think the operator framework really has been the big thing that we've been getting a lot of uptake on. It's been the new pattern for deploying applications or services, getting things beyond just a basic install of a service on OpenShift or any Kubernetes. And that's really where one of the exciting things happened yesterday; you and I were talking about this earlier: Exxon Mobil sent a data scientist to the OpenShift Commons, Audrey Resnick, who gave this amazing presentation about JupyterHub and Jupyter notebooks, deploying them, and how OpenShift and the advent of operators for things like GPUs are really helping them enable data scientists to do their work. Because a lot of the stuff that data scientists do is almost disposable: they'll run an experiment, maybe they don't get the result they want, and then it just goes away, which is perfect for a Kubernetes workload. But there are other things you need, like GPUs, and the work that NVIDIA has been doing to enable that on OpenShift has been really very helpful. It was a great talk, but we were talking about it from the first day: data scientists don't want to know anything about what's under the hood. They just want to run their experiments. So,
You want to be able to have the automation that is here and, more importantly, the update part. So being able to update your different components, face three is generally being able to have a life cycle. So as you manage multiple machines, these are going to get into different states. Some of them are gonna fail, being able to get from these bad states to good states. How do you recover from them? It's super helpful. And then last one is monitoring, which is being able to actually given sites dr users. So the upper here is decay has helped us a lot here, just laying out these different state slips. And in a way, it's done the same thing as what we're trying to do for our customers. The different data scientists, which is basically get out of our way and allow us to focus on core business value. So the operator, who basically takes care of things that are pretty cool as an engineer I lost due to your election. But it doesn't really help me to focus on like my core business value. How do I do with the updates, >>you know? Can I step back one second, maybe go up a level? The problem here is that each physical machine has only ah limited number of NVIDIA. GPU is there and you've got a bunch of containers that maybe spawning on different machines. And so they have to figure out, Do I have a GPU? Can I grab one? And if I'm using it, I assume I have to reserve it and other people can't use and then I have to give it up. Is that is that the problem we're solving here? So this is >>a problem that we've worked with communities community so that like the whole resource management, it's something that is integrated almost first class, citizen in communities, being able to advertise the number of deep, use their your cluster and used and then being able to actually run or schedule these containers. The interesting components that were also recently added are, for example, the monitoring being able to see that a specific Jupiter notebook is using this much of GP utilization. 
So these air supercool like features that have been coming in the past two years in communities and which red hat has been super helpful, at least in these discussions pushing these different features forward so that we see better enterprise support. Yeah, >>I think the thing with with operators and the operator lifecycle management part of it is really trying to get to Day two. So lots of different methodologies, whether it's danceable or python or job or or UH, that's helm or anything else that can get you an insult of a service or an application or something. And in Stan, she ate it. But and the operator and we support all of that with SD case to help people. But what we're trying to do is bridge the to this day to stuff So Thea, you know, to get people to auto pilot, you know, and there's a whole capacity maturity model that if you go to operator hab dot io, you can see different operators are a different stages of the game. So it's been it's been interesting to work with people to see Theo ah ha moment when they realize Oh, I could do this and then I can walk away. And then if that pod that cluster dies, it'll just you know, I love the word automatically, but they, you know, it's really the goal is to help alleviate the hands on part of Day two and get more automation into the service's and applications we deploy >>right and when they when they this is created. Of course it works well with open shift, but it also works for any kubernetes >>correct operator. HAB Daddio. Everything in there runs on any kubernetes, and that's really the goal is to be ableto take stuff in a hybrid cloud model. You want to be able to run it anywhere you want, so we want people to be unable to do it anywhere. >>So if this really should be an enabler for everything that it's Vinny has been doing to be fully cloud native, Yes, >>I think completely arable here is this is a new attack. 
Of course, this is a bit there's a lot of complexity, and this is where we're working towards is reducing the complexity and making true that people there. Dan did that a scientist air machine learning engineers are able to focus on their core business. >>You watch all of the different service is in the different things that the data scientists are using. They don't I really want to know what's under under the hood. They would like to just open up a Jupiter Hub notebook, have everything there. They need, train their models, have them run. And then after they're done, they're done and it goes away. And hopefully they remember to turn off the Jeep, use in the woods or wherever it is, and they don't keep getting billed for it. But that's the real beauty of it is that they don't have to worry so much anymore about that. And we've got a whole nice life cycle with source to image or us to I. And they could just quickly build on deploy its been, you know, it's near and dear to my heart, the machine learning the eyesight of stuff. It is one of the more interesting, you know, it's the catchy thing, but the work was, but people are really doing it today, and it's been we had 23 weeks ago in San Francisco, we had a whole open ship comments gathering just on a I and ML and you know, it was amazing to hear. I think that's the most redeeming thing or most rewarding thing rather for people who are working on Kubernetes is to have the folks who are doing workloads come and say, Wow, you know, this is what we're doing because we don't get to see that all the time. And it was pretty amazing. And it's been, you know, makes it all worthwhile. So >>Diane Renaud, thank you so much for the update. Congratulations on the launch of the operators and look forward to hearing more in the future. >>All right >>to >>be here >>for John Troy runs to minimum. More coverage here from Q. Khan Club native Khan, 2019. Thanks for watching. Thank you.

Published Date : Nov 20 2019

SUMMARY :

Stu Miniman and John Troyer talk with Diane Mueller, director of community development at Red Hat, and Renaud Gaubert, technical lead of cloud native technologies at NVIDIA, at KubeCon + CloudNativeCon 2019 in San Diego. Topics include the OpenShift Commons day zero gathering aboard the Inspiration Hornblower, the OKD 4 alpha release and the open-sourcing of Quay, and how the NVIDIA GPU operator, built on the Operator SDK, automates the install, update, lifecycle, and monitoring of GPU infrastructure on any Kubernetes so data scientists can focus on their experiments.

SENTIMENT ANALYSIS :

ENTITIES

Entity                       Category        Confidence
Audrey Resnick               PERSON          0.99+
Andrew Cliche                PERSON          0.99+
Diane Mueller                PERSON          0.99+
Steve Dake                   PERSON          0.99+
IBM                          ORGANIZATION    0.99+
Jon Cryer                    PERSON          0.99+
Exxon Mobil                  ORGANIZATION    0.99+
Diane Renaud                 PERSON          0.99+
Europe                       LOCATION        0.99+
John Troy                    PERSON          0.99+
San Francisco                LOCATION        0.99+
1/2 day                      QUANTITY        0.99+
Red Hat                      ORGANIZATION    0.99+
San Diego, California        LOCATION        0.99+
first                        QUANTITY        0.99+
J Bloom                      PERSON          0.99+
Diane                        PERSON          0.99+
2019                         DATE            0.99+
Open Innovation Labs         ORGANIZATION    0.99+
yesterday                    DATE            0.99+
Red Cloud                    ORGANIZATION    0.99+
560                          QUANTITY        0.99+
NVIDIA                       ORGANIZATION    0.99+
600 people                   QUANTITY        0.99+
three days                   QUANTITY        0.99+
John Willis                  PERSON          0.99+
8 a.m.                       DATE            0.99+
Crispin Ella                 PERSON          0.99+
Jeep                         ORGANIZATION    0.99+
San Diego, California        LOCATION        0.99+
Cora West                    ORGANIZATION    0.99+
Yesterday                    DATE            0.99+
last week                    DATE            0.99+
SDO                          TITLE           0.99+
Dan                          PERSON          0.99+
8 p.m.                       DATE            0.98+
23 weeks ago                 DATE            0.98+
first impressions            QUANTITY        0.98+
one second                   QUANTITY        0.98+
Q. Khan Club                 ORGANIZATION    0.98+
one                          QUANTITY        0.98+
Renau                        PERSON          0.98+
Red Bull                     ORGANIZATION    0.98+
Reynolds                     PERSON          0.97+
Aaron                        PERSON          0.97+
Day two                      QUANTITY        0.97+
March                        DATE            0.96+
third con.                   QUANTITY        0.96+
first space                  QUANTITY        0.96+
first day                    QUANTITY        0.95+
Vinny                        PERSON          0.95+
Cora Weston                  ORGANIZATION    0.94+
Thio                         PERSON          0.94+
Cloud                        ORGANIZATION    0.93+
Facebook                     ORGANIZATION    0.92+
first class                  QUANTITY        0.92+
today                        DATE            0.9+
about 560 people             QUANTITY        0.9+
Jupiter                      LOCATION        0.89+
each physical machine        QUANTITY        0.88+
12,000 other                 QUANTITY        0.88+
day zero                     QUANTITY        0.88+
D. M                         PERSON          0.87+
CloudNativeCon NA 2019       EVENT           0.87+
d Gaubert                    PERSON          0.87+
Thea                         PERSON          0.86+
python                       TITLE           0.84+
Native Computing Pounding    ORGANIZATION    0.83+
a day                        QUANTITY        0.79+
day zero                     EVENT           0.78+
day one                      QUANTITY        0.78+
Koopa                        ORGANIZATION    0.76+
one more thought             QUANTITY        0.74+
Khan                         PERSON          0.72+
Commons                      ORGANIZATION    0.72+
KubeCon +                    EVENT           0.72+
Jupiter Hub                  ORGANIZATION    0.71+

Shaun Connolly, Hortonworks - DataWorks Summit Europe 2017 - #DW17 - #theCUBE


 

>> Announcer: Coverage of DataWorks Summit Europe 2017, brought to you by Hortonworks. >> Welcome back everyone. Live here in Munich, Germany for theCUBE's special presentation of Hortonworks Hadoop Summit, now called DataWorks Summit 2017. I'm John Furrier, with my co-host Dave Vellante. Our next guest is Shaun Connolly, Vice President of Corporate Strategy, Chief Strategy Officer. Shaun, great to see you again. >> Thanks for having me guys. Always a pleasure. >> Super exciting. Obviously we're always pontificating on the status of Hadoop: Hadoop is dead, long live Hadoop. But rumors of its demise are greatly exaggerated, and the reality is that there have been no major shifts in the trends, other than the fact that the amplification from AI and machine learning has upleveled the narrative to mainstream around data. Big data was written on gen one of Hadoop: DevOps, culture, open-source. Starting with Hadoop, you guys certainly have been way out in front of all the trends in how you've been rolling out the products. But now, with IoT and AI as the sizzle, the future of self-driving cars and smart cities, you're starting to really see demand for comprehensive solutions that involve data-centric thinking. Okay, that's one. Two, open-source continues to dominate: MuleSoft went public, you guys went public years ago, Cloudera filed their S-1. A crop of public companies that are open-source; we haven't seen that since Red Hat. >> Exactly. '99 is when Red Hat went public. >> Data-centric, a big megatrend, with open-source powering it. You couldn't be happier for the stars lining up. >> Yeah, well we definitely placed our bets on that. We went public in 2014, and it's nice to see that graduating class of Talend and MuleSoft, and Cloudera coming out. That just, I think, helps socialize the movement of enterprise open-source, whether it's for on-prem or powering cloud solutions pushed out to the edge, and technologies that are relevant in IoT. That's the wave.
We had a panel earlier today where Dahl Jeppe from Centrica, British Gas, was talking about his ... The digitization of energy and virtual power plant notions. He can't achieve that without open-source powering and fueling it. >> And the thing about it is, for me personally, being my age in this generation of the computer industry since I was 19, seeing open-source go mainstream the way it has just gets better every time, and it really is the thousand flowers bloom strategy: throwing the seeds of innovation out there. I want to ask you a strategy question. You guys, from a performance standpoint, I would say kind of got hammered in the public market. Cloudera's valuation privately is 4.1 billion; you guys are close to 700 million. Certainly Cloudera's going to get a haircut, it looks like; the public market is based on the multiples, from Dave and I's intro. But there's so much value being created. Where's the value for you guys as you look at the horizon? You're talking about white spaces that are really developing, with use cases that are creating value, the practitioners in the field creating real value for customers. >> So you covered some of the trends, but I'll translate them into how the customers are deploying. Cloud computing and IoT are somewhat related: one is centralization, the other is decentralization, so it actually calls for a connected data architecture, as we refer to it. We're working with a variety of IoT-related use cases. Coca-Cola East Japan spoke at Tokyo Summit about beverage replenishment analytics: getting vending machine analytics from vending machines even on Mount Fuji, and optimizing their flow-through of inventory with just-in-time delivery. That's an IoT-related use case that runs on Azure. It's a cloud-related story, and it's a big data analytics story, that's actually driving better margins for the business and better revenues, because they're getting the inventory where it needs to be so people can buy it.
Those are really interesting use cases that we're seeing being deployed, and it's at this convergence of IoT, cloud, and big data. Ultimately that leads to AI, but I think that's what we're seeing the rise of. >> Can you help us understand that sort of value chain? You've got the edge, you've got the cloud, and you need something in between; you're calling it the connected data platform. How do you guys participate in that value chain? >> When we went public, our primary workhorse platform was Hortonworks Data Platform. We had first-class cloud services with Azure HDInsight and Hortonworks Data Cloud for AWS, curated pay-as-you-go cloud services, and Hortonworks DataFlow, which I call our connective tissue: it manages all of your data motion, it's a data logistics platform, it's like FedEx for data delivery. It goes all the way out to the edge. There's a little component called MiNiFi, mini-NiFi, which does secure intelligent analytics at the edge and in transmission. On these smart manufacturing lines, you're gathering the data, you're doing analytics on the manufacturing lines, and then you're bringing the historical stuff into the data center where you can do historical analytics across manufacturing lines. Those are the use cases for the connected data architecture-- >> Dave: A subset of that data comes back, right? >> A subset of the data, yep. The key events of that data; it may not be full fidelity-- >> 10%, half, 90%? >> It depends. If you have operational events that you want to store, sometimes you may want to bring the full fidelity of that data. As you manufacture stuff and it gets deployed, and you're seeing issues in the field, like Western Digital with hard drive failures in the field, they want that data at full fidelity, to connect the data architecture and analytics around it. One of the terms I use is: in the new world, you need to play it where it lies. If it's out at the edge, you need to play it there.
If it makes a stop in the cloud, you need to play it there. If it comes into the data center, you also need to play it there. >> So a couple years ago, you and I were doing a panel at our Big Data NYC event and I used the term "profitless prosperity." I got the hairy eyeball from you, but nonetheless, we talked about you guys as a steward of the industry: you have to invest in open-source projects, and it's expensive. I mean, HDFS itself, YARN, Tez; you guys lead a lot of those initiatives. >> Shaun: With the community, yeah, but we-- >> With the community, yeah, but you provided contributions and co-leadership, let's say. You're there at the front of the pack. How do we project it forward, without making forward-looking statements: how does this industry become a cashflow-positive industry? >> For public companies since the end of 2014, the markets turned beginning in 2016. Prior to that, high growth with some losses was palatable; after that, losses were not palatable. That hit us, Splunk, Tableau, most of the IT sector. That's just the nature of the public markets. As more public open-source, data-driven companies come in, I think it will better educate the market on the value. There's only so much I can do to control the stock price. What I can control, from a business perspective, is hitting key measures on a path to profitability. At the end of Q4 2016, we hit what we call the just-to-even, or breakeven, which is a stepping stone. On our earnings call at the end of 2016 we reported 185 million in revenue for the year, only five years into this journey, so that's a hard revenue growth pace, and we stated that in Q3 or Q4 of '17 we will hit operating cashflow neutrality. So we are operating the business-- >> John: But you guys also hit 100 million at record pace too, I believe. >> Yeah, in four years. So revenue is one thing, but operating margins: if you look at the margins on our subscription business, for instance, we've got an 84% margin on that.
It's a really nice margin business. We can make those margins better, but that's a software margin. >> You know what's ironic, we were talking about Red Hat off camera. Here's Red Hat kicking butt, really hitting on all cylinders, three billion dollars in bookings. One would think, okay, hey, I can maybe project forward some of these open-source companies. Maybe the flip side of this is, oh wow, we want it now. To your point, the market kind of flipped, but you would think that Red Hat is an indicator of how an open-source model can work. >> By the way, Red Hat went public in '99, so it was a different trajectory; I charted their trajectory out. Oracle's trajectory was different. Even in inflation-adjusted dollars, they didn't hit 100 million in four years; I think it was seven or eight years or what have you. Salesforce did it in five. So there are these SaaS models and these subscription models and the cloud services, which is an area that's near and dear to my heart. >> John: Goes faster. >> You get multiple revenue streams across different products. We're a multi-product cloud service company, not just a single platform. >> So we were actually teasing this out on our-- >> And that's how you grow the business, and that's how Red Hat did it. >> Well, I want to get your thoughts on this while we're just kind of ripping live here, because Dave and I were talking on our intro segment about the business model and how there's some camouflage out there, at least from my standpoint. One of the main areas that I was kind of pointing at, and trying to poke at and want to get your reaction to, is that in the classic enterprise go-to-market you have an expansive sales force, and you guys pay handsomely for that today. Incubating that market, getting to profitability on it, is a good thing, but there are also channels, VARs, ISVs, and so on. You guys have an open-source channel that's kind of not a VAR or an ISV; these are entrepreneurs and/or businesses themselves.
There's got to be a monetization shift there for you guys in the subscription business, certainly. When you look at these partners, they're co-developing, they're in open source; you can almost see the dots connecting. There's always been an ecosystem, but is this a new ecosystem now that you have monetization inherent in a pure open distribution model? >> It forces you to collaborate. IBM was on stage talking about our system certified on the Power Systems. Many may look at IBM as competitive; we view them as a partner. Amazon, some may view them as a competitor; they've been a great partner with our Hortonworks Data Cloud for AWS. So it forces you to think about how you collaborate around deeply engineered systems and value, and we get great revenue streams that are pulled through, that they can sell into the market to their ecosystems. >> How do you envision monetizing the partners? Let's just say Dave and I start this epic idea and we create some connective tissue with your orchestrator, the Data Platform you have, and we start making some serious bang. We make a billion dollars. Do you get paid on that if it's open source? Would we buy more subscriptions? I'm trying to see how the tide comes in, whose boats float on the rising tide of the innovation in these white spaces. >> Platform thinking is: you provide the platform, and you provide the platform for 10x value that rides atop that platform. That's how the model works. So if you're riding atop the platform, I expect you and that ecosystem to drive at least 10x above and beyond what I would make as a platform provider in that space. >> So you expect some contributions? >> That's how it works. You need a thousand flowers to be blooming on the platform. >> You saw that with VMware. They hit 10x and ultimately got to 15, 16, 17x. >> Shaun: Exactly. >> I think they don't talk about it anymore. I think it's probably trading the other way. >> You know, in my days at JBoss and Red Hat it was somewhere between 15 to 20x.
That was the value that was created on top of the platforms. >> I want to ask you about the forking of the Hadoop distros. There was a time when everybody was announcing Hadoop distros; John Furrier announced SiliconANGLE was announcing a Hadoop distro. So we saw consolidation, and then you guys announced ODP, then the ODPi initiative, but there seems to be a bit of a forking in Hadoop distros. Is that a fair statement? Unfair? >> I think if you look at how the Linux market played out, you clearly have Red Hat, you had Canonical's Ubuntu, you had SUSE. You're always going to have curated platforms for different purposes. We have a strong opinion and a strong focus in the area of IoT, fast analytic data from the edge, and a centralized platform with HDP in the cloud and on-prem. Others in the market, Cloudera for instance, are running sort of a different play, where they're curating different elements and investing in different elements. That doesn't make either one bad or good; we are just going after the markets slightly differently. The other point I'll make is that in 2014, if you looked at the Venn diagrams then, there was a lot of overlap. Now if you draw the areas of focus, there's a lot of white space that we're going after that they aren't, they're going after other places, and other new vendors are going after others. With the market dynamics of IoT, cloud and AI, you're going to see folks chase the market opportunities. >> Is that disparity not a problem for customers now, or is it challenging? >> There has to be a core level of interoperability, and that's one of the reasons why we're collaborating with folks in the ODPi, as an example. When it comes to some of the core components, there still has to be a level of predictability, because if you're an ISV riding atop, you're slowed down by death by infinite certification and choices. So ultimately it has to come down to a much more sane approach to what you can rely on.
>> When you guys announced ODP, then ODPi, the extension, Mike Olson wrote a blog saying it's not necessary, and people came out against it. Now we're three years in, looking back: was he right or not? >> I think the ODPi takeaway this year is that there's more we can do above and beyond the Hadoop platform. It's expanded to include SQL and other things recently, so there's been some movement on the spec, but frankly, you talk to John Mertic at ODPi, you talk to SAS and others, and I think we want to be a bit more aggressive in the areas that we go after and try to drive there from a standardization perspective. >> We had Wei Wang on earlier-- >> Shaun: There's more we can do and there's more we should do. >> We had Wei on with Microsoft at our Big Data SV event a couple weeks ago. Talk about the Microsoft relationship with you guys. It seems to be doing very well. Comments on that? >> Microsoft was one of the two companies we chose to partner with early on, in 2011, 2012; Microsoft and Teradata were the two. Microsoft was about how do I democratize and make this technology easy for people. That's manifested itself as the Azure cloud service, Azure HDInsight-- >> Which is growing like crazy. >> Which is globally deployed, and we just had another update. It's fundamentally changed our engineering and delivery model. This latest release was a cloud-first delivery model, so one of the things that we're proud of is the interactive SQL and the LLAP technology that's in HDP: it went out through Azure HDInsight and Hortonworks Data Cloud first. Then it certified in HDP 2.6 and went to Power at the same time. It's that cadence of delivery and cloud-first delivery model. We couldn't do it without a partnership with Microsoft. I think we've really learned what it takes-- >> If you look at Microsoft at that time... I remember interviewing you on theCUBE. Microsoft was trading at something like $26 a share at that time, around their low point. Now the stock is performing really well.
Satya Nadella, very cloud oriented-- >> Shaun: They're very open-source. >> They're very open-source friendly; they've been donating a lot to the OCP, to the data center piece. An extremely different Microsoft, so you slipped into that beautiful spot and reacted to that growth. >> I think as one of the stalwarts of enterprise software providers, they've done a really great job of bending the curve towards cloud while still having a mixed portfolio, but incenting a field, incenting a channel, selling cloud and growing that revenue stream, that's nontrivial, that's hard. >> They know the enterprise sales motions too. I want to ask you how that's going overall within Hortonworks. What are some of the conversations that you're involved in with customers today? Again, as we were saying in our opening segment, it's on YouTube if you're not watching, the customers are the forcing function right now. They're really putting the pressure on the suppliers, you're one of them: get tight, reduce friction, lower cost of ownership, get into the cloud, flywheel. And so you see a lot-- >> I'll throw in another aspect. Some of the more late-majority adopters are traditionally saying, over and over, that by 2025 they want to power down the data center and have more things running in the public cloud, if not most everything. That's another eight years or what have you, so it's still a journey, but the journey to making that an imperative, because of the operational benefits, because of the agility, because of better predictability and ease of use, that's fundamental. >> As you get into the connective tissue, I love that example, with Kubernetes containers, you've got developers, a big open-source participant, and you've got all the stuff you have; you just start to see some coalescing around cloud native. How do you guys look at that conversation?
I view container platforms, whether they're container services running on cloud or what have you, as the new lightweight rail that everything will ride atop. The cloud currently plays a key role in that, and I think that's going to be the de facto way, particularly if you go to cloud-first models, particularly for delivery. You need that packaging notion and you need the agility of updates that it's going to provide. I think Red Hat as a partner has been doing great things on hardening that, making it secure. There are others in the ecosystem, as well as the cloud providers; all three cloud providers actually are investing in it. >> John: So it's good for your business? >> It removes friction of deployment, and I ride atop that new rail. It can't get here soon enough from my perspective. >> So I want to ask about clouds. You were talking about the Microsoft shift; personally I think Microsoft realized, holy cow, we could actually make a lot of money if we're selling infrastructure services, and we can make more money if we're selling the full stack. It was sort of an epiphany, and so Amazon seems to be doing the same thing. You mentioned earlier that Amazon is a great partner, even though a lot of people look at them as a competitor. It seems like Amazon, Azure, etc. are building out their own big data stacks and offering them as a service. People say that's a threat to you guys. Is it a threat, is it a tailwind, or is it what it is? >> This is why I bring up that industry-wide we always have waves of centralization and decentralization, and they're playing out simultaneously right now with cloud and IoT. The fact of the matter is that you're going to have multiple clouds, on-prem data, and data at the edge. That's the problem I am looking to facilitate and solve.
I don't view them as competitors, I view them as partners, because we need to collaborate. There's a value chain of the flow of the data, and some of it's going to be running through and on those platforms. >> The cloud's not going to solve the edge problem. Too expensive. It's just physics. >> So I think that's where things need to go. I think that's why we talk about this notion of connected data. I don't talk about hybrid cloud computing, that's for compute. I talk about how you connect to your data, how you know where your data is, and whether you're getting the right value out of the data by playing it where it lies. >> I think IoT has been a great secular trend for the big data industry. It really accelerates the value proposition of the cloud too, because now you have a connected network; you can have your cake and eat it too. Central and distributed. >> There are different dynamics in the US versus Europe, as an example. In the US we're definitely seeing cloud adoption that's independent of IoT. Here in Europe, I would argue the smart mobility initiatives, the smart manufacturing initiatives, and the connected grid initiatives are bringing cloud in, so it's IoT and cloud, and that's opening up the cloud opportunity here. >> Interesting. So on the prospects for Hortonworks: cashflow positive in Q4, you guys have made a public statement. Any other thoughts you want to share? >> Just continue to grow the business, focus on these customer use cases, get them to talk about them at things like DataWorks Summit, and then the more the merrier; the more data-oriented, open-source-driven companies that can graduate in the public markets, I think, is awesome. I think it will just help the industry. >> Operating in the open, with full transparency-- >> Shaun: On the business and the code. (laughter) >> Welcome to the party baby. This is theCUBE here at DataWorks 2017 in Munich, Germany. Live coverage, I'm John Furrier with Dave Vellante. Stay with us.
More great coverage coming after this short break. (upbeat music)

Published Date : Apr 5 2017


Scott Gnau | DataWorks Summit Europe 2017


 

(soothing technological music) >> Announcer: Live from Munich, Germany, it's theCUBE. Covering DataWorks Summit Europe 2017. Brought to you by Hortonworks. (soft technological music) >> Okay, welcome back everyone, we're here in Munich, Germany for DataWorks Summit 2017, formerly Hadoop Summit, powered by Hortonworks. It's their event, but it's now called DataWorks because data is at the center of the value proposition: Hadoop plus all the data and storage. I'm John, with my cohost David. Our next guest is Scott Gnau, he's the CTO of Hortonworks, joining us again from the keynote stage. Good to see you again. >> Thanks for having me back, great to be here. >> Good having you back. Let's get down and dirty and get technical. I'm super excited about the conversations that are happening in the industry right now for a variety of reasons. One is you can't get more excited about what's happening in the data business. Machine learning and AI have really brought up the hype; to me it's very human, people can visualize AI and see the self-driving cars and understand how software's powering all this. But still it's data driven, and Hadoop is extending into data, seeing that natural extension, and Cloudera has filed their S-1 to go public. So it brings back the conversations of this open-source community that's been doing all this work in the big data industry, originally riding in on the horse of Hadoop. You guys have an update to your Hadoop data platform, which we'll get to in a second, but I want to ask you about all the stories around Hadoop. I say Hadoop was the first horse that everyone rode in on in the big data industry... when I say big data, I mean DevOps, cloud, the whole open-source ethos. But it's evolving, it's not being replaced. So I want you to clarify your position on this, because we were just talking about some of the false premises; a lot of stories are being written about the demise of Hadoop. Long live Hadoop. >> Yeah, well, how long do we have?
(laughing) I think you hit it first. We're at DataWorks Summit 2017, and we rebranded, it was previously Hadoop Summit. We rebranded it to really recognize that there's this bigger thing going on, and it's not just Hadoop. Hadoop is a big contributor, a big driver, a very important part of the ecosystem, but it's more than that. It's really about being able to manage and deliver analytic content on all data, across that data's lifecycle: from when it gets created at the edge, to when it's moving through networks, to when it's landed and stored in a cluster, to analytics being run and decisions going back out. It's that entire lifecycle, and you mentioned some of the megatrends, as I talked about this morning in the opening keynote. With AI and streaming and IoT, all of these things converging are creating a much larger problem set and, frankly, an opportunity for us as an industry to go solve. So that's the context that we're really looking-- >> And there's real demand there. This is not like... I mean, there's certainly a hype factor on AI, but IoT is real. You have data now, not just as a back-office concept; you have a front-facing, business-centric... I mean, there's real customer demand here. >> There's real customer demand, and it really creates the ability to dramatically change a business. A simple example that I used onstage this morning is to think about the electric utility business. I live in Southern California; by the way, I studied to be an electrical engineer. Twenty, thirty years ago, that business, not that it was entirely simple, was about building a big power plant and distributing electrons out to all the consumers of electrons. One direction, and optimization of that grid, network, and business was very hard, and there were billions of dollars at stake. Fast forward to today: you've still got those generating plants online, but you've also got folks like me generating their own power and putting it back into the grid. So now you've got bidirectional electrons.
The optimization is totally different. How do you figure out how most effectively to create capacity and distribute that capacity? Because created capacity that's not consumed is 100% spoiled. So it's a huge data problem, but it's a huge data problem meeting IoT, right? Smart-meter devices out at the edge creating data, doing it in real time. A cloud blew over, my generating capacity on my roof went down, so I've got to pull from the grid. Combining all of that data to make real-time decisions, we're talking hundreds of billions of dollars, and it's being done today in an industry that's not a high-tech Silicon Valley kind of industry; electric utilities are taking advantage of this technology today. >> So we were talking off-camera about some commentary that Hadoop has failed, and obviously you take exception to that. And you also made the point that it's not just about Hadoop, but in a way it is, because Hadoop was the catalyst of all this. Why has Hadoop not failed, in your view? >> Well, because we have customers, and the great thing about conferences like this is we're actually able to get a lot of folks to come in and talk about what they're doing with the technology, how they're driving business benefit, and to share that business benefit with their colleagues. So we see that business benefit coming along. In any hype cycle, people can go down a path where maybe they had false expectations early on. Six years ago we were talking about, hey, is open-source Hadoop going to come along and replace the EDW? Complete fallacy. What I talked about, that opportunity of being able to store all kinds of disparate data, being able to manage and maneuver analytics in real time, that value proposition is very different from some of the legacy tech.
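The bidirectional-grid problem Scott describes boils down to simple per-interval arithmetic over streams of meter data: net demand is consumption minus distributed generation, and a passing cloud shows up as a sudden swing. A minimal sketch, with all numbers and field layouts invented for illustration:

```python
# Illustrative sketch of the utility data problem: compute the net load the
# utility must supply per metering interval, given neighborhood consumption
# and rooftop solar generation. Values are made up; interval 3 simulates
# the "cloud blew over" drop in rooftop output Scott mentions.

def net_demand(consumption_kw, rooftop_kw):
    """Net load per interval; can go negative when rooftops over-generate."""
    return [c - g for c, g in zip(consumption_kw, rooftop_kw)]

consumption = [50.0, 52.0, 51.0, 53.0]  # neighborhood draw per 15-min interval
rooftop =     [30.0, 31.0,  5.0, 29.0]  # solar output; cloud cover at interval 3

print(net_demand(consumption, rooftop))  # -> [20.0, 21.0, 46.0, 24.0]
```

The hard part at utility scale is not the subtraction but doing it across millions of meters with low enough latency to dispatch capacity, which is why the decision logic has to move toward the edge rather than waiting on a round trip to a central cluster.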
So if you view it as, hey, this thing is going to replace that thing, okay, maybe not. But the point is, it's very successful for what it was designed for-- >> Just to clarify what you just said there: you guys never took that position. Cloudera kind of did with their Impala, that was their initial positioning. Or do you not agree with that? >> Publicly they would say, oh, it's not a replacement, but you're right, I mean, the actions were maybe designed to do that-- >> And set in the marketplace that that might be one of the outcomes. >> Yeah, but they pivoted quickly when they realized that was a failed strategy. But that became a premise that people locked in on. >> If that becomes your yardstick for measuring, then so-- >> But wouldn't you agree that Hadoop in many respects was designed to solve some of the problems that the EDW never could? >> Exactly. So again, when you think about the variety of data, when you think about the analytic content: doing time-series analysis is very hard to do in a relational model, so it's a new tool in the workbench to go solve analytic problems. And when you look at it from that perspective, and I use the utility example, the manufacturing example, financial, consumer finance, telco, all of these companies are using this technology, leveraging this technology, to solve problems they couldn't solve before, and frankly to build new businesses that they couldn't build before, because they didn't have access to that real-time-- >> And so money did shift from pouring money into the EDW with limited returns, because you were at the steep part or the flat part of the S-curve, to, hey, let's put it over here in this so-called big data thing. And that's why the market, I think, was conditioned to come to that simple conclusion. But the spending did shift, did it not?
Yeah, I mean, if you subscribe to that kind of herd mentality, the net new expenditure on the new technology is always going to outpace the growth of the existing, plateaued technologies. That's just math. >> The growth, yes, but not the size, not the absolute dollars, and so you have a lot of companies right now struggling in the traditional legacy space and you've got this rocket ship going-- >> And again, think about the converging forces that are out there. In addition to IoT and streaming, frankly, Hadoop is an enabler of AI. When you think about the success of AI and machine learning, it's about having massive, massive amounts of data, right? I think back 25 years ago: my first data mart was 30 gigabytes, and we thought that was all the data in the world. Now that fits on your phone. So when you think about just having the utter capacity, and the ability to actually process that capacity of data, these are technology breakthroughs that have been driven in the open-source Hadoop community. Combine that with the ability to execute in clouds and ephemeral kinds of workloads, you put all that stuff together, and now, instead of going to the capital committee for 20 million dollars for a bunch of hardware to do an exabyte kind of study, where you may not get an answer that means anything, you can spin that up in the cloud and, for a couple of thousand dollars, get the answer, take that answer, and go build a new system of insight that's going to drive your business. That's a whole new area of opportunity opened up by the convergence of all that. >> So I agree, I mean, it's absurd to say Hadoop and big data have failed. It's crazy.
Okay, but despite the growth, what I called profitless prosperity, can the industry fund itself? I mean, you've got to make big bets: YARN, Tez, different clouds. How does the industry turn into one that is profitable and growing? >> Well, obviously it creates new business models and new ways of monetizing and deploying software. One of the key things that is core to our belief system is that really leveraging, working with, and nurturing the community is going to be a key success factor for our business. Nurturing that innovation and collaboration across the community, to keep up with the rate and pace of change, is one of the aspects of being relevant as a business. And then, obviously, creating a great service experience for our customers, so that they know they can depend on enterprise-class support, enterprise-class security and governance and operational management, in the cloud and on-prem. Creating that value proposition, along with the advanced and accelerated delivery of innovation, is where I think we intersect uniquely in the industry. >> One of the things that I think people point out, and I have this conversation all the time with people who try to squint through the Wall Street implications of the value proposition of the industry, and I want to get your thoughts on it: open source, in this era that we're living in, is bringing so much value outside of just the important work inside your company. Dave made a comment on the intro package we were doing that the practitioners are getting a lot of value, people out in the field. So these white spaces are the value, and they're actually transformative. Can you give some examples of where things of real value are getting done, use cases that you guys can highlight? I think that's the unwritten story that no one thought about, that rising tide floating all boats. Is that happening?
>> Yes, I mean, there are a lot of those use cases in the white space. Some of those use cases, again, really involve integrating legacy, traditional transactional information, very valuable information about a company, its operations, its customers, its products, and being able to combine that with the ability to do real-time sensor management, and ultimately to have a technology stack that enables the connection of all of those sources of data for an analytic. And that's an important differentiation. For the first 25 years of my career, it was all about let's pull all this data into one place, then do something with it, and then we can push analytics back out. Not an entirely bad model, but a model that breaks in the world of IoT-connected devices: there frankly isn't enough money to spend on bandwidth to make that happen, and as fast as the speed of light is, it creates latency, so those decisions aren't going to be able to be made in time. So we're seeing, even in traditional industries, I mentioned the utility business, think about manufacturing, oil and gas, sensors everywhere, companies being able to take advantage not of collecting all the sensor data centrally, but of actually creating analytics based on sensor data and pushing those analytics out to the sensors to make real-time decisions that can affect hundreds of millions of dollars of production or equipment. Those are the use cases that we're seeing deployed today, and that's complete white space that was unavailable before.
>> Yeah it's about real time real time flexibility and choice you know motherhood and apple pie >> And the major highlights of that operate >> So the upgrades really inside of hive we now have operational analytic query capabilities where when you do tactical response times second sub second kind of response time. >> You know Hadoop and Hive wasn't previously known for that kind of a tactical response we've been able to now add inside of that technology the ability to view that workload we have customers who building these white space applications who have hundreds or thousands of users or applications that depend on consistency of very quick analytic response time we now deliver that inside the platform what's really cool about it in addition to the fact that it works is is that we did it inside a pipe so we didn't create yet another project or yet another thing that a customer has to integrate to or rewrite their application so any high based application cannot take advantage of this performance enhancement and that's part of our thinking of it as a platform the second thing inside of that that we've done that really it creaks to those kinds of workload is is we've really enhance the ability to incremental data acquisition right whether it be streaming whether it be patch up certs right on the sequel person doing up service being able to do that data maintenance in an active compliant fashion completely automatically and behind the scenes so that those applications again can just kind of run without any heavy lifting >> Just staying in motion kind of thing going on >> Right it's anywhere from data in motion even to batch to mini batch and anywhere kind of in between but we're doing those incremental data loads you know, it's easy to get the same file twice by mistake you don't want to double count you want to have sanctity of the transactions we now handle that inside of Hive with acid compliance. 
>> So a layperson question for the CTO, if I may. You mentioned Hadoop was not known for sort of real-time response. You just mentioned ACID; in the early days it was never known for ACID compliance either. Others would say, you know, Hadoop, the original big data platform, is not designed for the matrix math of AI, for example. Are these misconceptions? And like Tim Berners-Lee, when we met Tim Berners-Lee, with Web 2.0, this is what the web was designed for. Would you say the same thing about Hadoop? >> Yeah. Ultimately, from my perspective, and kind of netting it out, Hadoop was designed for the easy acquisition of data, the easy onboarding of data. And then once you've onboarded that data, it also was known for enabling new kinds of analytics that could be plugged in, certainly starting out with MapReduce on HDFS. But the whole idea is, I now have a flexible way to easily acquire data in its native form, without having to apply schema, without having to do any formatting to store it. I can get it exactly as it was, store it, and then apply whatever schema, whatever rules, whatever analytics on top of that that I want. So the center of gravity, to my mind, has really moved up to YARN, which enables a multi-tenancy approach to having pluggable, multiple different kinds of file formats, and pluggable different kinds of analytics and data access methods, whether it be SQL, whether it be machine learning, whether it be HBase lookups and indexing, and anywhere in between. It's that Swiss Army knife, as it were, for handling all of this new stuff, because every second we sit here, data has changed. >> And just a quick follow-up, if I can, just a clarification: so you said new types of analytics can be plugged in by design, because of its openness. Is that right?
>> By design, because of its openness and the flexibility that the platform was built for. In addition, on the performance side, we've also got a new update to Spark, and usability, consumability, and collaboration for data scientists using the latest versions of Spark inside the platform. We've got a whole lot of other features and functions that our customers have asked for. And then on the flexibility and choice side, it's available on public cloud infrastructure as a service, public cloud platform as a service, on-prem x86, and net new, on-prem with Power. >> Just a final question for you. As the industry evolves, what are some of the key areas that open source can pivot to that really take advantage of the machine learning and AI trends going on? Because you start to see that really increase the narrative around the importance of data, and a lot of people are scratching their heads going, okay, I need to do the back office, set up my IT with all that great stuff, all those open source projects, the Hadoop data platform. But then I've got to get down and dirty: I might do multiple clouds, I've got hybrid cloud going on, I might want to leverage the cool containers and Kubernetes and microservices, and almost DevOps. Where's that transition happening? As a CTO, how do you talk to customers about this transition, this evolution of how the data business is getting more and more mainstream?
>> Yeah, I mean, I think the big thing that people have had to get over is that we've reversed polarity from, again, 30 years of "I want a stack vendor to have an integrated stack of everything, plug and play, integrated end to end. It might not be a hundred percent what I want, but look at the cost leverage I get out of the stack versus going and building something that's perfect." This world is the opposite. It's about enabling the ecosystem. And by the way, it's a combination of open source and proprietary software. Some of our partners have proprietary software; that's okay. But it's really about enabling the ecosystem, and I think the biggest service that we as an open source community can do is to continue to keep that standard kernel for the platform, and make it very usable and very easy for many apps and software providers and other folks. >> A thousand flowers bloom, kind of concept. And that's what you've done with the white spaces, as these use cases are evolving very rapidly, and then the bigger apps are kind of settling into a workload with real time. >> Yeah, real time. You know, think about the next generation of IT professionals, the next generation of business professionals. They grew up with iPhones. They grew up in a mini-app world. I mean, I download an app, I'm going to try it, it's a widget, boom, and it's going to help me get something done. But it's not a big stack that I'm going to spend 30 years to implement. And if I like it, then I want to take those widgets and connect them together to do things that I haven't been able to do before, and that's how this ecosystem is really-- >> Great DevOps culture, very agile. That's their mindset. So Scott, congratulations on your 2.6 upgrade. >> Scott: We're thrilled about it.
>> Great stuff. ACID compliance, really a big deal, because little things are important in the enterprise. Great. All right, thanks for coming on theCUBE at Dataworks in Munich, Germany. I'm John, thanks for watching. More coverage live here in Germany after this short break.
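The push-analytics-to-the-sensor pattern Scott described earlier in the segment, evaluating a small analytic at the edge and transmitting only actionable events instead of every raw reading, can be sketched in a few lines. The threshold rule and field names below are illustrative assumptions, not part of any Hortonworks product:

```python
# Minimal sketch of edge analytics: run the analytic at the sensor and
# send back only the events worth acting on, saving bandwidth and latency.

def should_alert(reading: dict, threshold: float = 90.0) -> bool:
    """The analytic pushed out to the sensor: flag only actionable readings."""
    return reading["temperature_c"] >= threshold

def process_at_edge(readings: list, threshold: float = 90.0) -> list:
    """Filter a stream of local sensor readings down to transmittable alerts."""
    return [r for r in readings if should_alert(r, threshold)]

readings = [
    {"sensor_id": "pump-7", "temperature_c": 71.2},
    {"sensor_id": "pump-7", "temperature_c": 93.5},
    {"sensor_id": "pump-9", "temperature_c": 88.9},
]

alerts = process_at_edge(readings)
print(len(alerts))  # only one of three readings crosses the threshold
```

The design point is the ratio: three readings stay local, one alert travels over the network, which is the inversion of the "spool everything to the center first" model discussed above.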

Published Date : Apr 5 2017



Deepti Srivastava, ‎Google - PBWC 2017 - #InclusionNow - #theCUBE


 

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Francisco at the Professional BusinessWomen of California Conference. It's the 28th year, Jackie Speier started it a long time ago and now it's grown to 6,000 people. It's a pretty amazing conference, it crosses all industries and actually a lot more than California as well. And we're excited to actually have somebody come talk to us about the conference itself. It's Deepti Srivastava, she's a Project Manager of Google Cloud from Google. Great to see you again, last we saw you, I looked it up, was 2014 >> I know. >> at Topcoder Open. >> Indeed. >> And you were doing great work then, you were on a panel with a bunch of high school girls. I remember they'd bused in a couple of busloads of high school girls, and you and a couple other mainly young professional women talkin' to 'em about the life of an engineer. So you're still doin' good things. >> I hope so. (laughs) >> Absolutely. >> I hope so, yeah, it's a passion of mine and I'm really happy to bring it to something like PBWC where I'm on the board. And we do a bunch of work across industries and across all levels. PBWC's mission is to work for gender equity and equal pay for women across all industries and in all professional settings. >> Right. >> That includes young professionals, as well as the pipeline of professionals coming in. >> That's terrific. So we could talk about your day job all day long. (Deepti laughs) Google Cloud's kickin' tail, you guys had your big conference a couple weeks back-- >> Here in fact. (chuckles) >> Here in Moscone West, right? >> Yeah. >> But in terms of what you're doing here with PBWC, give us a little bit of the history. So we know it was started by Jackie Speier, I think you said 1988. >> Yeah. >> That's just amazing. >> I know. >> Obviously it's much more than California. >> Yeah. >> But what is the top-level mission and how has the conference evolved over the last several years?
>> So Professional BusinessWomen of California, as you said was started by Congresswoman Jackie Speier and Judy Bloom, who's a co-founder. And we still exist and we've been doing this for so long and we really care about our mission, which is to work for basically gender equity and equal pay as I said, for all professional settings for women. And in this particular case, this conference we are talking about inclusion. And we chose this theme because we really think it's pertinent to what's going on right now in the world and in our country. And we, PBWC, believe that the things that unite us, the potentials and aspirations that unite us are greater than our differences and things like that. So we want to make a statement and really address the inclusion work that we do, and the inclusion work that's required for all of us to really move forward as a country and as a people. And if you look at our lineup of speakers today, we really do walk the talk that we're talking about. We have amazing speakers today with Rosario Dawson to Taraji P. Henson and all the way to Secretary Clinton who's closing out our day today, we are so excited to have her. And there's nobody better to represent breaking the glass ceiling than she has so we're very excited to hear. >> And what a get, I think I heard that it's her first public speaking engagement post the election. >> Yeah, I know. And it's very exciting because again, I think we're all about coming together and rallying and being a force for good. The conferences, that's our aim ultimately as an organization. And having her here to give her speech, first public appearance after the election last year, very exciting I think. >> Right, right. >> And we're very excited to hear from her. I'm already inspired by the thought that she's going to be here. >> And really a big part of the theme was kind of the strategy work is done, everybody knows it's good. Now it's really time for the rubber to hit the road. 
It's about execution and about taking steps and measuring. And a lot of the real concrete, nuts-and-bolts activities that need to happen to really move this thing down the road. >> You mean like gender equity and-- >> Yeah, yeah. >> Yeah, absolutely. I think it's been a topic for awhile and I think, exactly, we need to have the rubber hit the road, we have to get together, we have to have actionable plans and that's what a bunch of our seminars today talk about. How to address those things in your, we really want to empower women and actually people of all backgrounds and ages and all sorts of people to take charge of their own lives. And especially, we are a professional women conference so that's kind of where we focus our messaging. But really we want women to take control of their own lives and we want to give them the tools, the networking opportunities, the inspirations to meet their aspirations in those fields. And so we want them to take charge and move forward by themselves, take away from here and go back to your job, to your work, to your home, to really bring your messaging forward. Take inspiration from here and bring it back to your life. >> Right, and I think Bev Crair, in the keynotes said, "Fill your well today." >> Yeah. >> 'Cause as soon as you leave here it's back to the grind and you're going to need that energy. So while you're here surrounded by this energy and your peers, take it all in and load up. >> Absolutely. And I also want to say that we started out as a conference, an annual conference, and that's definitely our marquee thing that we do every year. But we actually have a lot more offerings that people can continue to engage over the year. So we have webinars and seminars that people can attend, there's community events that happen here. And you can go to the PBWC website and see what all offerings we have. 
But we want people to engage and we want to be able to provide them with the means to engage throughout the year, not just here but take this, everything you get today and then take it along the rest of the year and recharge yourself. >> It's kind of this whole 365 concept which we talk about on theCUBE a lot too, 'cause we go to so many shows. And there's a huge investment of time and energy and money on those two or three days, but how do you extend that out beyond the show? How do you build the excitement leading into the show so it's not just a one time kind of a shot, then everything goes back to normal? >> Yeah exactly, I think that's exactly the point, that this is not just a one day, you go there, you get inspired and then what next, right? >> Right. >> There's something you can go back to with our various offerings and continue your learning journey if that's what you want, or networking journey if that's what you want to do. Wherever you are in your career, we actually have a Young Women's Professional Summit that I have the honor of chairing, that we have every year and it's meant to help young professional women navigate their way from being in college and high school and those entering a professional life so as I said, we want to cater to all levels and all ages and all sorts of challenges that people face as they're going through their professional careers. >> So that's a separate event? >> It is, it is an annual conference. >> And when is that? Give a plug. Or do you have a date? (Deepti chuckles) >> Yeah, we don't have a date yet but it's going to be in the summer. >> In the summer, okay great. Well I think when we met last, I thought that was such an important piece of that Topcoder Open because it wasn't the Sheryl Sandbergs or the Hillary Clintons or these super mega top-of-the-pyramid people. It was a bunch of young professionals, one of the gals was still in school, hadn't finished graduating, to make it so much real for those high schoolers. 
They didn't have to look so far to say, "I could see myself, I kind of look like that person, "I kind of see things touch." >> And I think that's very important, Jeff. Exactly. It's very important and that's what we try to do here at PBWC as well. We want to go from catering to the Millennials and how we interact with them and all the way up to C-suite, we had a Senior Leadership Summit yesterday leading up to the conference today where we have a bunch of C-suites and CDOs, Chief Diversity Officers, come together and talk about trending topics and how to solve them. So we really are trying to move the needle forward on many fronts here, but our aim is all of that to culminate into moving women and people of all backgrounds forward. >> Right. And then there's this whole entrepreneurial bit which you can't see behind the camera, but there's booths all over for Intel and LinkedIn and Microsoft and the names that you would expect, Google of course, but there's also all the little boutiques, clothing stores and jewelry stores and crafty things. There's even of course women-focused snacks with the Luna Bars and I forget the other one. (chuckles) So it's kind of a cool entrepreneurial spirit kind of on top of everything else. >> Absolutely. And you know Jackie Speier, Congresswoman, started this conference to help women who were in the SMB, sort of SME market, basically women who ran small businesses. And we want to continue to do that as well but now of course the world is changing and we have a much more of a corporate presence and we want to help there too. But yeah, we pay homage to that by having women who are women entrepreneurs running women-focused businesses, and we have them here in the expo area if you can get a shot of that later. >> Right. 
>> The energy is palpable, the excitement is there and it's so great to be here and harness that, and take it back, I mean the first time I was here many years ago when I was not even on the board, I was just like, oh my gosh, there's so many women here who are like me or who are, they're people I could look up to all the way up to the C-suite who are making their presence felt here. And also all the people around me and like-minded, like me. So it's a really inspiring event. And I've been here for many years but I'm still inspired by it. So I'm so excited that we do this and continue to do this. >> So, little harder to question. So, and you've been doing this for awhile, what surprises you on the negative that still you know, you're still fighting that battle that you wouldn't have expected to still be doing? And then conversely what has surprised you on the positive, in terms of what's moved maybe further than you might've thought or faster than you might've thought? >> That's a good question. I think you already nailed it, right. The fact that we are still here talking about this is interesting to me, and as I got more involved in this kind of work I realized that people have been doing this for a long time. Congresswoman herself has been doing this for so long and a fearless advocate for women's rights and equal pay and diversity and inclusion. And the fact that we are still here, it is indicative of the fact that we need to have a groundswell movement in order to change policy. We can talk about it all we want but unless there's actionable things you can take away and really have that grassroots-level work to push the envelope forward, it's not going to happen. I think the positive is, as I've seen this conference over the years, it's grown. 
And it's gotten a lot more young people involved and it's not just the senior leadership that is trying to pull people forward, it's the people starting out early in their careers or mid-level in their careers that are looking at taking charge of their own destiny and pushing their agenda forward in this sense. They want, they're asking for equal pay. They're really engaged and aware. And conferences like PBWC actually help with that, getting those minds together and making things move forward. So I think from a positive side I'm really excited to see so many more people engaged in this fight. And the more people we have, the more we can actually make real progress and real inroads. >> And if you look back, as someone who's never been here and then they see this interview and they say, "This looks awesome, I'm going to sign up," what do you think the biggest surprise when they come for the first-timer? >> I'll tell you what I was surprised by, is seeing so many women together across industries, across ages, across backgrounds. Everybody together, really wanting to move forward. They're really wanting to engage, to connect with each other and to actually make a difference. People are here to make a difference, right? >> Right, right. >> And that's, to say that 6,000 people come together and really all of them have that same sort of mentality of like yes, I'm empowered to make a difference, is electrifying. >> Deepti, I love the energy. >> (laughs) Thank you. >> I love the energy, absolutely. >> It's all these people. >> It is. >> Trust me, I'm sleep deprived (Jeff laughs) with my very young son. So yeah, this is all the energy that I need to feed off of. >> No, it's good. And there is something special here. >> Mm-hmm. >> And you can feel it. 'Cause we go to a lot of shows, you go to a lot of shows. And again, it's not an exclusive tech show which is kind of nice 'cause we cross a lot of industries. 
But there's definitely, there's an energy, there's a vibe that comes from the little entrepreneurial outlets, it just comes from the, that room was packed. The keynote room was... >> I know. >> Was not fire marshal friendly. (Deepti laughs) Hopefully the fire marshal was not close by-- >> Yes, we had some discussion on that too. But to your point, this is one of the conferences that I've seen where we really, perhaps the only conference I've seen where we really cut across all industries. Because there's tech-focused, there's business-focused, there's all sorts of focused conferences trying to do either their professional work on technology or whatnot, or they're trying to solve the problem on the gender and diversity and inclusion piece in their own silos. And we try to cut across so that we can actually have a coming together of all of these various industries and their leaders, thought leaders, sharing ideas and sharing best practices so that we can actually all move forward together, I think that's again our Senior Leadership Summit which happened last night and the VIP reception which happened last night is all about getting those thought leaders together and getting them to share their best practices and ideas so that again, they can take it back to their companies and really move forward with DNI initiatives. >> It's action right, it's all about the action. >> Absolutely. >> So I promise next time that we talk, we'll talk about Google Cloud. >> Oh, sure. >> 'Cause that's hoppin'. (Deepti laughs) But it was great to see you and congratulations on all your work with the board and with your event >> Thank you. >> in the summer. People should go to the website, keep an eye out. >> Absolutely. >> It'll be comin' out. >> Yeah. >> So thank you. >> Thank you so much, it was great to see you too, Jeff. >> Absolutely. Alright she's Deepti, I'm Jeff, you're watching theCUBE. We're at the Professional BusinessWomen of California Conference. 
The 28th year, pretty amazing, 6,000 people. Here at Moscone West, thanks for watchin'. (upbeat techno music)

Published Date : Mar 31 2017

