SiliconANGLE News | AWS Responds to OpenAI with Expanded Hugging Face Partnership
(upbeat music) >> Hello everyone. Welcome to SiliconANGLE News with a breaking story. Amazon Web Services is expanding its relationship with Hugging Face, breaking news here on SiliconANGLE. I'm John Furrier, SiliconANGLE reporter, founder, and also co-host of theCUBE. And I have with me Swami from Amazon Web Services, vice president of database, analytics, and machine learning at AWS. Swami, great to have you on for this breaking news segment on AWS's big news. Thanks for coming on, taking the time. >> Hey John, pleasure to be here. >> We've had many conversations on theCUBE over the years. We've watched Amazon really move fast into large data modeling. Your SageMaker became a smashing success. Obviously you've been on this for a while; now with ChatGPT and OpenAI, a lot of buzz is going mainstream, taking it from behind the curtain, inside the ropes, if you will, in the industry out to the mainstream. And so this is a big moment, I think, in the industry. I want to get your perspective, because your news with Hugging Face, I think, is another telltale sign that we're about to tip over into a new accelerated growth around making AI application-aware, application-centric, more programmable, with more API access. What's the big news with AWS and Hugging Face? What's going on with this announcement? >> Yeah, first of all, we're very excited to announce our expanded collaboration with Hugging Face. I consider Hugging Face like the GitHub for machine learning, and with this partnership, our goal is that Hugging Face and AWS will be able to democratize AI for a broad range of developers, not just specific deep AI startups. And now with this we can accelerate the training, fine-tuning, and deployment of these large language models and vision models from Hugging Face in the cloud.
So, in the broader context, when you step back and see what customer problem we are trying to solve with this announcement: essentially, these foundational models are now used to create a huge number of applications, such as text summarization, question answering, search, image generation, creative work, and other things. And these are all things we are seeing in the likes of these ChatGPT-style applications. But there is a broad range of enterprise use cases that we don't even talk about. And that's because these kinds of transformative generative AI capabilities and models are not available to millions of developers. Either training these models from scratch can be very expensive or time consuming and needs deep expertise, or, more importantly, they don't need these generic models; they need them to be fine-tuned for their specific use cases. And one of the biggest complaints we hear is that these models, when customers try to use them for real production use cases, are incredibly expensive to train and incredibly expensive to run inference on at a production scale. And unlike web search-style applications, where the margins can be really huge, in production use cases in enterprises, you want efficiency at scale. That's where Hugging Face and AWS share a mission. By integrating with Trainium and Inferentia, we're able to handle cost-efficient training and inference at scale. I'll deep dive on it, and by teaming up on the SageMaker front, the time it takes to build these models and fine-tune them is also coming down. So that's what makes this partnership very unique as well. So I'm very excited. >> I want to get into the time savings and the cost savings on the training and inference. It's a huge issue. But before we get into that, just how long have you guys been working with Hugging Face? I know this is a previous relationship.
This is an expansion of that relationship. Can you comment on what's different about what's happened before and now? >> Yeah, so with Hugging Face, we have had a great relationship in the past few years as well, where they have made their models available to run on AWS. In fact, their Bloom project was something many of our customers used. The Bloom project, for context, is their open source project, which builds a GPT-3-style model. And with this expanded collaboration, Hugging Face has now selected AWS for the next generation of their generative AI models, building on their highly successful Bloom project as well. And the nice thing is that through direct integration with Trainium and Inferentia, you get cost savings in a really significant way. For instance, Trainium can provide up to 50% cost-to-train savings, and Inferentia can deliver up to 60% better cost and 4x higher throughput. Now these models, especially as they train that next-generation generative AI model, are going to be not only more accessible to all the developers who use them in the open, they'll be a lot cheaper as well. And that's what makes this moment really exciting, because we can't democratize AI unless we make it broadly accessible, cost efficient, and easy to program and use as well. >> Okay, thanks Swami. We really appreciate it. Swami's a CUBE alumni, and also vice president of database, analytics, and machine learning at Amazon Web Services, breaking down the Hugging Face announcement. Obviously the relationship, which he called the GitHub of machine learning. This is the beginning of what we will see: a continuing competitive battle with Microsoft. Microsoft is partnering with OpenAI; Amazon's been doing this for years. They've got Alexa, they know what they're doing. It's going to be very interesting to see how this all plays out. You're watching SiliconANGLE News, breaking here. I'm John Furrier, host of theCUBE. Thanks for watching. (ethereal music)
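Editor's note: the savings Swami cites stack up differently over a model's lifecycle, since training is a one-time cost while inference recurs at production scale. A back-of-envelope sketch, using entirely hypothetical baseline dollar figures; only the "up to 50%" and "up to 60%" ceilings come from the announcement:

```python
# Back-of-envelope sketch of the announced savings ceilings.
# Baseline dollar figures are hypothetical, for illustration only.
baseline_training_cost = 1_000_000    # hypothetical one-time training cost, USD
baseline_inference_cost = 200_000     # hypothetical monthly inference bill, USD

# "Up to 50% cost-to-train savings" with Trainium.
trainium_training_cost = baseline_training_cost * (1 - 0.50)

# "Up to 60% better cost" with Inferentia.
inferentia_inference_cost = baseline_inference_cost * (1 - 0.60)

annual_inference_savings = (baseline_inference_cost - inferentia_inference_cost) * 12

print(f"Training:  ${baseline_training_cost:,.0f} -> ${trainium_training_cost:,.0f} (one-time)")
print(f"Inference: ${baseline_inference_cost:,.0f}/mo -> ${inferentia_inference_cost:,.0f}/mo")
print(f"Annual inference savings: ${annual_inference_savings:,.0f}")
```

At these hypothetical rates, the recurring inference savings outstrip the one-time training savings within the first year, which is why Swami stresses efficiency at scale for production use cases.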
Closing Panel | Generative AI: Riding the Wave | AWS Startup Showcase S3 E1
(mellow music) >> Hello everyone, welcome to theCUBE's coverage of the AWS Startup Showcase. This is the closing panel session on AI and machine learning, the top startups building generative AI on AWS. It's a great panel. This is going to be the experts talking about riding the wave in generative AI. We've got Ankur Mehrotra, who's the director and general manager of AI and machine learning at AWS, Clem Delangue, co-founder and CEO of Hugging Face, and Ori Goshen, who's the co-founder and CEO of AI21 Labs. Ori from Tel Aviv dialing in, and the rest coming in here on theCUBE. Appreciate you coming on for this closing session for the Startup Showcase. >> Thanks for having us. >> Thank you for having us. >> Thank you. >> I'm super excited to have you all on. Hugging Face was recently in the news with the AWS relationship, so congratulations. Open source, open science, really driving the machine learning. And we've got AI21 Labs' access to the LLMs, generating huge-scale live applications, commercial applications, coming to the market, all powered by AWS. So everyone, congratulations on all your success, and thank you for headlining this panel. Let's get right into it. AWS is powering this wave here. We're seeing a lot of push here from applications. Ankur, set the table for us on AI and machine learning. It's not new, it's been going on for a while. The past three years have seen significant advancements, but there's been a lot of work done in AI and machine learning. Now it's released to the public. Everybody's super excited and now says, "Oh, the future's here!" It's kind of been going on for a while and baking. Now it's kind of coming out. What's your view here? Let's get it started. >> Yes, thank you. So, yeah, as you may be aware, Amazon has been investing in machine learning research and development for quite some time now. And we've used machine learning to innovate and improve user experiences across different Amazon products, whether it's Alexa or Amazon.com.
But we've also brought in our expertise to extend what we are doing in the space and add more generative AI technology to our AWS products and services, starting with CodeWhisperer, an AWS service that we announced a few months ago. You can think of it as a coding companion as a service, which uses generative AI models underneath. And so this is a service that customers who have no machine learning expertise can just use. We also are talking to customers, and we see a lot of excitement about generative AI, and customers who want to build these models themselves, who have the talent and the expertise and resources. For them, AWS has a number of different options and capabilities they can leverage, such as our custom silicon, Trainium and Inferentia, as well as distributed machine learning capabilities that we offer as part of SageMaker, which is an end-to-end machine learning development service. At the same time, many of our customers tell us that they're interested in not training and building these generative AI models from scratch, given they can be expensive and can require specialized talent and skills to build. And so for those customers, we are also making it super easy to bring existing generative AI models into their machine learning development environment within SageMaker for them to use. So we recently announced our partnership with Hugging Face, where we are making it super easy for customers to bring those models into their SageMaker development environment for fine-tuning and deployment. And we are also partnering with other proprietary model providers such as AI21 and others, where we are making these generative AI models available within SageMaker for our customers to use. So our approach here is to really provide customers options and choices and help them accelerate their generative AI journey. >> Ankur, thank you for setting the table there.
Clem and Ori, I want to get your take, because riding the wave is the theme of this session, and to me, being in California, I imagine the big surf, the big waves, the big talent out there. This is like alpha geeks, alpha coders; developers are really leaning into this. You're seeing massive uptake from the smartest people. Whether they're young or have been around, they're coming in with their kind of surfboards, (chuckles) if you will. These early adopters, they've been on this for a while; now the waves are hitting. This is a big wave, everyone sees it. What are some of those early adopter devs doing? What are some of the use cases you're seeing right out of the gate? And what does this mean for the folks that are going to come in and get on this wave? Can you guys share your perspective on this? Because you're seeing the best talent now leaning into this. >> Yeah, absolutely. I mean, from Hugging Face's vantage point, it's not even a wave, it's a tidal wave, or maybe even the tide itself. Because actually what we are seeing is that AI and machine learning is not something that you add to your products; it's very much a new paradigm for all technology. For the past 15, 20 years, we had one way to build software and to build technology, which was writing a million lines of code, very rule-based, and then you get your product. Now what we are seeing is that every single product, every single feature, every single company is starting to adopt AI to build the next generation of technology. And that works both to make the existing use cases better, if you think of search, if you think of social networks, if you think of SaaS, but also it's creating completely new capabilities that weren't possible with the previous paradigm. Now AI can generate text, it can generate images, it can describe your images, it can do so many new things that weren't possible before. >> It's going to really make developers really productive, right?
I mean, you're seeing strong developer uptake, right? >> Yes, we have over 15,000 companies using Hugging Face now, and it keeps accelerating. I really think that maybe in three to five years, there's not going to be any company not using AI. It's going to be really kind of the default way to build all technology. >> Ori, weigh in on this. APIs, the cloud. Now I'm a developer, I want to have live applications, I want the commercial applications on this. What's your take? Weigh in here. >> Yeah, first, I absolutely agree. I mean, we're in the midst of a technology shift here. I think not a lot of people realize how big this is going to be. Just the number of possibilities is endless, and I think hard to imagine. And I don't think it's just the use cases. I think we can think of it as two separate categories. We'll see companies and products enhancing their offerings with these new AI capabilities, but we'll also see new companies that are AI-first, that kind of reimagine certain experiences. They'll build something that wasn't possible before. And that's why I think these are actually extremely exciting times. And maybe more philosophically, I think these large language models and large transformer-based models are now helping us express our thoughts, kind of making the bridge from our thinking to a creative digital asset at a speed we've never imagined before. I can write something down and get a piece of text, or an image, or code. So I'll start by saying it's hard to imagine all the possibilities right now, but it's certainly big. And if I had to bet, I would say it's probably at least as big as the mobile revolution we've seen in the last 20 years. >> Yeah, this is the biggest. I mean, it's been compared to the Enlightenment Age. I saw the Wall Street Journal had a recent story on this. We've been saying that this is probably going to be bigger than all inflection points combined in the tech industry, given what transformation is coming.
I guess I want to ask you guys about the early adopters. We've been hearing in these interviews and throughout the industry that there's already a set of big companies out there that have a lot of data, and they're already there, they're kind of tinkering. Kind of reminds me of the old hyperscaler days where they were building their own scale, and they're eatin' glass, spittin' nails out, you know, they're hardcore. Then you've got everybody else kind of saying at the board level, "Hey team, how do I leverage this?" How do you see those two things coming together? You've got the fast followers coming in behind the early adopters. What's it like for the second wave coming in? What are those conversations for those developers like? >> I mean, I think for me, the important switch for companies is to change their mindset from being kind of like a traditional software company to being an AI or machine learning company. And that means investing: hiring machine learning engineers and machine learning scientists, building infrastructure, adding team members who are working on how to put these models in production, and team members who are able to optimize models, with specialized, customized models for the company's specific use cases. So it's really changing this mindset of how you build technology and optimizing your company building around that. Things are moving so fast that I think it's now kind of too late for low-hanging fruit or small adjustments. I think it's important to realize that if you want to be good at this, and if you really want to surf this wave, you need massive investments. If there are some surfers listening, with this analogy of the wave: when there are waves, it's not enough just to stand and make a little bit of adjustments. You need to position yourself aggressively, paddle like crazy, and that's how you get into the waves. So that's what companies, in my opinion, need to do right now.
>> Ori, what's your take on the generative models out there? We hear a lot about foundation models. What's your experience running end-to-end applications on large foundation models? Any insights you can share with the app developers out there who are looking to get in? >> Yeah, I think first of all, it starts to create an economy where it probably doesn't make sense for every company to create their own foundation models. You can basically start by using an existing foundation model, either open source or a proprietary one, and start deploying it for your needs. And then comes the second round, when you start the optimization process. You bootstrap, whether it's a demo, or a small feature, or introducing a new capability within your product, and then start collecting data. That data, and particularly the human feedback data, helps you constantly improve the model, so you create this data flywheel. And I think we're now entering an era where customers have a lot of different choices for how they want to start their generative AI endeavor. And it's a good thing that there's a variety of choices. And the really amazing thing here is that in every industry, any company you speak with, it could be something very traditional like industrial, financial, or medical, really any company, people now start to imagine the possibilities and seriously think about their strategy for adopting this generative AI technology. And I think in that sense, the foundation model actually enabled this to become scalable. So the barrier to entry became lower; now adoption can actually accelerate. >> There are a lot of integration aspects in this new wave that are a little bit different. Before it was very monolithic, hardcore, very brittle. Now there's a lot more integration; you see a lot more data coming together.
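Editor's note: the data flywheel Ori describes, bootstrap with an existing model, collect human feedback, fine-tune, redeploy, can be sketched as a loop in a few lines. Everything below is a toy stand-in: the "model" is a plain function and "fine-tuning" just memorizes corrections, purely to show the shape of the loop:

```python
# Toy sketch of the data flywheel: serve a model, collect human
# feedback, fold the feedback back in, redeploy. All names here are
# illustrative; no real foundation model is involved.
def base_model(prompt: str) -> str:
    """Stand-in for an off-the-shelf foundation model."""
    return "generic answer"

def fine_tune(model, feedback):
    """Stand-in for fine-tuning: memorize human-corrected outputs."""
    corrections = dict(feedback)
    def improved(prompt: str) -> str:
        return corrections.get(prompt, model(prompt))
    return improved

# Round 1: bootstrap a feature with the generic model, then collect
# human feedback on its weakest answers.
feedback = [("summarize Q3 report", "Q3 revenue grew 12% year over year.")]

# Round 2: the flywheel turns; feedback becomes training data.
model_v2 = fine_tune(base_model, feedback)

print(model_v2("summarize Q3 report"))  # human-corrected answer
print(model_v2("anything else"))        # falls back to the base model
```

Each turn of the loop yields a model better matched to the product's own use cases, which is the compounding advantage Ori describes.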
I have to ask you guys, as developers come in and grow: I mean, when I went to college and you were a software engineer, I got a degree in computer science and software engineering, and all you did was code, (chuckles) you coded. Now, isn't it like everyone's a machine learning engineer at this point? Because that will ultimately be the science. So, (chuckles) you've got open source, you've got open software, you've got the communities. Swami called you guys the GitHub of machine learning, Hugging Face is the GitHub of machine learning, mainly because that's where people are going to code. So essentially, machine learning is computer science. What's your reaction to that? >> Yes, my co-founder Julien and I at Hugging Face have been saying this for quite a while now, for over three years: that actually software engineering as we know it today is a subset of machine learning, instead of the other way around. People would call us crazy a few years ago when we were saying that. But now we are realizing that you can actually code with machine learning. So machine learning is generating code. And we are starting to see that every software engineer can leverage machine learning through open models, through APIs, through different technology stacks. So yeah, it's not crazy anymore to think that maybe in a few years, there's going to be more people doing AI and machine learning. However you call it, right? Maybe you'll still call them software engineers, maybe you'll call them machine learning engineers. But there might be more of these people in a couple of years than there are software engineers today. >> I bring this up somewhat tongue in cheek as well, because Ankur, infrastructure as code is what made cloud great, right? That's kind of the DevOps movement. But here the shift is so massive, there will be a game-changing philosophy around coding.
Machine learning as code: you're starting to see CodeWhisperer, and you guys have had coding companions for a while on AWS. So this is a paradigm shift. How is the cloud playing into this for you guys? Because to me, I've been riffing in some interviews where it's like, okay, you've got the cloud going next level. This is an example of that, where there is a DevOps-like moment happening with machine learning, whether you call it coding or whatever. It's writing code on its own. Can you guys comment on what this means on top of the cloud? What comes out of the scale? What comes out of the benefit here? >> Absolutely, so- >> Well first- >> Oh, go ahead. >> Yeah, so I think as far as scale is concerned, customers are really relying on the cloud to make sure that the applications they build can scale along with the needs of their business. But there's another aspect to it, which is that until a few years ago, John, what we saw was that machine learning was a data scientist-heavy activity. There were data scientists who were taking the data and training models. Then, as machine learning found its way more and more into production and actual usage, we saw MLOps become a thing, and MLOps engineers become more involved in the process. And now we are seeing, as machine learning is being used to solve more business-critical problems, even legal and compliance teams get involved. We are seeing business stakeholders more engaged. So, more and more, machine learning is becoming an activity that's not just performed by data scientists, but by a team and a group of people with different skills. And for them, we as AWS are focused on providing the best tools and services for these different personas to be able to do their jobs and really complete that end-to-end machine learning story. So that's where, whether it's tools related to MLOps, or even for folks who cannot code or don't know any machine learning.
For example, we launched SageMaker Canvas last year, a UI-based tool which data analysts and business analysts can use to build machine learning models. So overall, the spectrum of personas who can get involved in the machine learning process is expanding, and the cloud is playing a big role in that process. >> Ori, Clem, can you guys weigh in too? 'Cause this is just another abstraction layer of scale. What's it mean for you guys as you look forward to your customers and the use cases that you're enabling? >> Yes, I think what's important is that the AI companies and providers and the cloud kind of work together. That's how you make a seamless experience and actually reduce the barrier to entry for this technology. So that's what we've been super happy to do with AWS for the past few years. We actually announced not too long ago that we are doubling down on our partnership with AWS. We're excited to have many, many customers on our shared product, the Hugging Face deep learning container on SageMaker. And we are working really closely with the Inferentia team and the Trainium team to release some more exciting stuff in the coming weeks and coming months. So I think when you have an ecosystem where AWS and the AI providers, the AI startups, can work hand in hand, it's to the benefit of the customers and the companies, because it makes it orders of magnitude easier for them to adopt this new paradigm of building technology with AI. >> Ori, this is about scale and reasoning too. The data's out there; making sense of it, making it reason, getting comprehension, having it make decisions is next, isn't it? And you need scale for that. >> Yes. Just a comment about the infrastructure side. So I think really the purpose is to streamline and make these technologies much more accessible. And I predict that we'll see in the next few years more and more tooling that makes this technology much simpler to consume.
And I think it plays a very important role. There are so many aspects, like monitoring the models and the outputs they produce, and containing and running them in a production environment. There's so much there to build on; the infrastructure side will play a very significant role. >> All right, that's awesome stuff. I'd love to change gears a little bit and get a little philosophy here around AI and how it's going to transform, if you guys don't mind. There have been a lot of conversations, on theCUBE here as well as in some industry areas, where it's like, okay, all the heavy lifting is automated away with machine learning and AI, the complexity, there's some efficiencies, it's horizontal and scalable across all industries. Ankur, good point there. Everyone's going to use it for something. And a lot of stuff gets brought to the table with large language models and other things. But the key ingredient will be proprietary data or human input, or some sort of AI-whisperer kind of role, or prompt engineering, people are saying. So with that being said, some are saying it's automating intelligence, and that creativity will be unleashed from this. If the heavy lifting goes away and AI can fill the void, that shifts the value to the intellect or the input. And so that means data's got to come together, interact, fuse, and understand each other. This is kind of new. I mean, old-school AI was, okay, you got a big model, you provisioned it for a long time, very expensive. Now it's all free-flowing. Can you guys comment on where you see this going, with this freeform, data flowing everywhere, heavy lifting, and then specialization? >> Yeah, I think- >> Go ahead. >> Yeah, I think what we are seeing with these large language models or generative models is that they're really good at creating stuff. But I think it's also important to recognize their limitations. They're not as good at reasoning and logic.
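Editor's note: one common way around the limitation Ori names here is to augment the model with components that do reason reliably, routing the steps a language model is weak at (arithmetic, say) to a deterministic tool. A toy sketch, with a mock function standing in for the model:

```python
# Toy sketch of augmenting a generative model with a reliable tool.
# `mock_llm` is a placeholder, not a real model; the router and the
# calculator are the point: offload what the model is weak at.
import re

def mock_llm(prompt: str) -> str:
    """Stand-in for an LLM: fluent, but not to be trusted with math."""
    return "The answer is probably 40-something."

def calculator(expression: str) -> str:
    """Deterministic tool for the step the model can't be trusted with."""
    a, op, b = re.match(r"(\d+)\s*([+*])\s*(\d+)", expression).groups()
    return str(int(a) + int(b) if op == "+" else int(a) * int(b))

def answer(prompt: str) -> str:
    """Route arithmetic to the tool; everything else goes to the model."""
    m = re.search(r"\d+\s*[+*]\s*\d+", prompt)
    return calculator(m.group()) if m else mock_llm(prompt)

print(answer("What is 19 + 23?"))         # reliably "42", from the tool
print(answer("Summarize this contract"))  # falls through to the model
```

Production systems do the same thing with retrieval systems, calculators, or search APIs in place of these stubs, and it is one route to the explainability Ori asks for, since the tool's steps can be inspected.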
And I think now we're seeing great enthusiasm, which I think is justified. The next phase will be how to make these systems more reliable. How do we inject more reasoning capabilities into these models, or augment them with other mechanisms that actually perform more reasoning, so we can achieve more reliable results? We want to be able to count on these models to perform critical tasks, whether they're medical tasks or legal tasks. We really want to offload a lot of the intelligence to these systems. And then we'll have to make sure these are reliable, and we'll have to make sure we get some sort of explainability, so that we can understand the process behind the generated results we receive. So I think this is kind of the next phase of systems that are based on these generative models. >> Clem, what's your view on this? Obviously you're an open community; open source has been around, it has a great track record, a proven model. I'm assuming creativity is going to come out of the woodwork, and if we can automate open source contribution, and relationships, and onboarding more developers, there's going to be an unleashing of creativity. >> Yes, it's been so exciting on the open source front. We all know BERT, Bloom, GPT-J, T5, Stable Diffusion, that whole group: the previous and the current generation of open source models that are on Hugging Face. It has been accelerating in the past few months. So I'm super excited about ControlNet right now, which is really having a lot of impact; it's kind of a way to control the generation of images. I'm super excited about Flan-UL2, which is a new model that has been recently released and is open source. So yeah, it's really fun to see the ecosystem coming together. Open source has been the basis for traditional software, with open source programming languages, of course, but also all the great open source that we've gotten over the years.
So we're happy to see that the same thing is happening for machine learning and AI, and hopefully it can help a lot of companies reduce the barrier to entry a little bit. So yeah, it's going to be exciting to see how it evolves in the next few years in that respect. >> I think the developer productivity angle that's been talked about a lot in the industry will be accelerated significantly. I think security will be enhanced by this. I think in general, applications are going to transform at a radical rate, an accelerated, incredible rate. So I think it's not a big wave, it's the water, right? I mean, (chuckles) it's the new thing. My final question for you guys, if you don't mind; I'd love to get each of you to answer the question I'm going to ask you, which is: a lot of conversations around data. Data infrastructure is obviously involved in this. And the common thread that I'm hearing is that every company that looks at this is asking themselves, if we don't rebuild our company, start thinking about rebuilding our business model around AI, we might be dinosaurs, we might be extinct. And it reminds me of that scene in Moneyball when, at the end, it's like, if we're not building the model around your model, every company will be out of business. What's your advice to companies out there that are having those kinds of moments where it's like, okay, this is real, this is next gen, this is happening? I better start thinking and putting into motion plans to refactor my business, 'cause it's happening; business transformation is happening on the cloud. This kind of puts an exclamation point on it, with AI as a next step function. A big increase in value. So it's an opportunity for leaders. Ankur, we'll start with you. What's your advice for folks out there thinking about this? Do they put their toe in the water? Do they jump right into the deep end? What's your advice?
>> Yeah, John, so we talk to a lot of customers, and customers are excited about what's happening in the space, but they often ask us, "Hey, where do we start?" So we always advise our customers to do a lot of proofs of concept, understand where they can drive the biggest ROI, and then also leverage existing tools and services to move fast and scale, and try not to reinvent the wheel where it doesn't need to be. That's basically our advice to customers. >> Got it. Ori, what's your advice to folks who are scratching their heads going, "I better jump in here. How do I get started?" What's your advice? >> So I actually think that you need to think about it really economically, both the opportunities and the challenges. There's a lot of opportunity for many companies to gain revenue upside by building these new generative features and capabilities. On the other hand, of course, incorporating these capabilities could affect the COGS, the cost of goods sold. So I think we really need to think carefully about both of these sides, and also understand clearly whether this is a project or an effort towards cost reduction, where the ROI is pretty clear, or a revenue amplifier, where there are, again, a lot of different opportunities. So once you think about this in a structured way and map the different initiatives, that's probably a good way to start, and a good way to start thinking about these endeavors. >> Awesome. Clem, what's your take on this? What's your advice for folks out there? >> Yes, all of this is very good advice already. Something that you said before, John, that I disagreed with a little bit: a lot of people are talking about the data moat and proprietary data. Actually, when you look at some of the organizations that have been building the best models, they don't have specialized or unique access to data.
I think what's important for companies, and it's been the same for the previous generation of technology, is their ability to build better technology faster than others. And in this new paradigm, that means being able to build machine learning faster than others, and better. So that's how, in my opinion, you should approach this, and kind of like how you can evolve your company, your teams, your products, so that you are able in the long run to build machine learning better and faster than your competitors. And if you manage to put yourself in that situation, then that's when you'll be able to differentiate yourself to really kind of be impactful and get results. That's really hard to do. It's something really different, because machine learning and AI is a different paradigm than traditional software. So this is going to be challenging, but I think if you manage to nail that, then the future is going to be very interesting for your company. >> That's a great point. Thanks for calling that out. I think this all reminds me of the cloud days early on. If you went to the cloud early, you took advantage of it when the pandemic hit. If you weren't native in the cloud, you got hamstrung by that, you were flatfooted. So just get in there. (laughs) Get in the cloud, get into AI, you're going to be good. Thanks for calling that out. Final parting comments, what's your most exciting thing going on right now for you guys? Ori, Clem, what's the most exciting thing on your plate right now that you'd like to share with folks? >> I mean, for me it's just the diversity of use cases and really creative ways of companies leveraging this technology. Every day I speak with about two, three customers, and I'm continuously being surprised by the creative ideas. And the future of what can be achieved here is really exciting. And also I'm amazed by the pace that things move in this industry. It's just, there's not a dull moment. So, definitely exciting times.
>> Clem, what are you most excited about right now? >> For me, it's all the new open source models that have been released in the past few weeks, and that will keep being released in the next few weeks. I'm also super excited about more and more companies getting into this capability of chaining different models and different APIs. I think that's a very, very interesting development, because it creates new capabilities, new possibilities, new functionalities that weren't possible before. You can plug an API with an open source embedding model, with, like, a transcription model. So that's also very exciting. This capability of having more interoperable machine learning will also, I think, open a lot of interesting things in the future. >> Clem, congratulations on your success at Hugging Face. Please pass that on to your team. Ori, congratulations on your success, and continue on, it's just day one. I mean, it's just the beginning. It's not even scratching the surface. Ankur, I'll give you the last word. What are you excited for at AWS? More cloud goodness coming here with AI. Give you the final word. >> Yeah, so as both Clem and Ori said, I think the research in the space is moving really, really fast, so we are excited about that. But we are also excited to see the speed at which enterprises and other AWS customers are applying machine learning to solve real business problems, and the kind of results they're seeing. So when they come back to us and tell us the kind of improvement in their business metrics and overall customer experience that they're driving and they're seeing real business results, that's what keeps us going and inspires us to continue inventing on their behalf. >> Gentlemen, thank you so much for this awesome high impact panel. Ankur, Clem, Ori, congratulations on all your success. We'll see you around. Thanks for coming on. Generative AI, riding the wave, it's a tidal wave, it's the water, it's all happening. All great stuff.
This is season three, episode one of AWS Startup Showcase closing panel. This is the AI ML episode, the top startups building generative AI on AWS. I'm John Furrier, your host. Thanks for watching. (mellow music)
Adam Wenchel & John Dickerson, Arthur | AWS Startup Showcase S3 E1
(upbeat music) >> Welcome everyone to theCUBE's presentation of the AWS Startup Showcase AI Machine Learning Top Startups Building Generative AI on AWS. This is season 3, episode 1 of the ongoing series covering the exciting startups from the AWS ecosystem to talk about AI and machine learning. I'm your host, John Furrier. I'm joined by two great guests here, Adam Wenchel, who's the CEO of Arthur, and Chief Scientist of Arthur, John Dickerson. We'll talk about how they help people build better LLM AI systems to get them into the market faster. Gentlemen, thank you for coming on. >> Yeah, thanks for having us, John. >> Well, I got to say I've got to temper my enthusiasm, because the last few months' explosion of interest in LLMs with ChatGPT has opened everybody's eyes to the reality that this is going next gen, this is it, this is the moment, this is the point we're going to look back at and say, this is the time where AI really hit the scene for real applications. So, a lot of Large Language Models, also known as LLMs, foundational models, and generative AI is all booming. This is where all the alpha developers are going. This is where everyone's focusing their business model transformations on. This is where developers are seeing action. So it's all happening, the wave is here. So I got to ask you guys, what are you guys seeing right now? You're in the middle of it, it's hitting you guys right on. You're in the front end of this massive wave. >> Yeah, John, I don't think you have to temper your enthusiasm at all. I mean, what we're seeing every single day is everything from existing enterprise customers coming in with new ways that they're rethinking, like, business things that they've been doing for many years that they can now do in an entirely different way, as well as all manner of new companies popping up, applying LLMs to everything from generating code and SQL statements to generating health transcripts and legal briefs. Everything you can imagine.
And when you actually sit down and look at these systems and the demos we get of them, the hype is definitely justified. It's pretty amazing what they can do. And even just internally, about a month ago in January, we built an Arthur chatbot so customers could ask technical questions; rather than read our product documentation, they could just ask this LLM a particular question and get an answer. And at the time it was like state of the art, but the tooling has changed so much that just last week we decided to rebuild it, and we've completely rebuilt it. It's now way better, built on an entirely different stack. And the tooling has undergone a full generation's worth of change in six weeks, which is crazy. So it just tells you how much energy is going into this and how fast it's evolving right now. >> John, weigh in as a chief scientist. I mean, you must be blown away. Talk about a kid in the candy store. I mean, you must be super busy to begin with, but the change, the acceleration, can you scope the kind of change you're seeing and be specific around the areas you're seeing movement and highly accelerated change? >> Yeah, definitely. And it is very, very exciting. Actually, thinking back to when ChatGPT was announced, that was the night our company was throwing an event at NeurIPS, which is maybe the biggest machine learning conference out there. And the hype when that happened was palpable and it was just shocking to see how well that performed. And then obviously over the last few months since then, as LLMs have continued to enter the market, we've seen use cases for them, like Adam mentioned, all over the place. And so, some things I'm excited about in this space are the use of LLMs and more generally, foundation models to redesign traditional operations research-style problems, logistics problems, like auctions, decisioning problems.
So moving beyond the already amazing use cases, like creating marketing content, into more core integration in a lot of the bread and butter companies and tasks that drive the American ecosystem. And I think we're just starting to see some of that. And in the next 12 months, I think we're going to see a lot more. If I had to make other predictions, I think we're going to continue seeing a lot of work being done on managing inference time costs via shrinking models or distillation. And I don't know how to make this prediction precisely, but at some point we're going to be seeing lots of these very large scale models operating on the edge as well. So the time scales are extremely compressed. Like Adam mentioned, 12 months from now, hard to say. >> We were talking on theCUBE prior to this session here. We had theCUBE conversation here and then the Wall Street Journal just picked up on the same theme, which is that the printing press moment created the enlightenment stage of history. Here we're in a whole nother moment of automating intellect, efficiency, doing heavy lifting, the creative class coming back, a whole nother level of reality around the corner that's being hyped up. The question is, is this justified? Is there really a breakthrough here or is this just another result of continued progress with AI? Can you guys weigh in, because there's two schools of thought. There's the, "Oh my God, we're entering a new enlightenment tech phase, the equivalent of the printing press in all areas." Then there's, "Ah, it's just AI (indistinct) inch by inch." What's your guys' opinion? >> Yeah, I think on the one hand, when you're down in the weeds of building AI systems all day, every day, like we are, it's easy to look at this as incremental progress. Like we have customers who've been building on foundation models since we started the company four years ago, particularly in computer vision for classification tasks, starting with pre-trained models, things like that.
So that part of it doesn't feel really new, but what does feel new is when you apply these things to language, with all the breakthroughs in computational efficiency, algorithmic improvements, things like that. When you actually sit down and interact with ChatGPT or one of the other systems out there that's building on top of LLMs, it really is breathtaking, like, the level of understanding that they have and how quickly you can accelerate your development efforts and get an actual working system in place that solves a really important real world problem and makes people way faster, way more efficient. So I do think there's definitely something there. It's more than just incremental improvement. This feels like a real trajectory inflection point for the adoption of AI. >> John, what's your take on this? As people come into the field, I'm seeing a lot of people move from, hey, I've been coding in Python, I've been doing some development, I've been a software engineer, I'm a computer science student. I'm coding in C++ old school, OG systems person. Where do they come in? Where's the focus, where's the action? Where are the breakthroughs? Where are people jumping in and rolling up their sleeves and getting dirty with this stuff? >> Yeah, all over the place. And it's funny you mentioned students. In a different life, I wore a university professor hat, and so I'm very, very familiar with the teaching aspects of this. And I will say, toward Adam's point, this really is a leap forward, in that tools like a copilot, for example, everybody's using them right now and they really do accelerate the way that we develop. When I think about the areas where people are really, really focusing right now, tooling is certainly one of them. Like you and I were chatting about LangChain right before this interview started, two or three people can sit down and create an amazing set of pipes that connect different aspects of the LLM ecosystem.
Two, I would say, is in engineering. So distributed training might be one, or just understanding better ways to even be able to train large models, understanding better ways to then distill them or run them. So there's this heavy interaction now between engineering and what I might call traditional machine learning from 10 years ago, where you had to know a lot of math, you had to know calculus very well, things like that. Now you also need to be, again, a very strong engineer, which is exciting. >> I interviewed Swami when he talked about the news. He's the head of Amazon's machine learning and AI, when they made the Hugging Face announcement. And I reminded him how Amazon was easy to get into if you were developing a startup back in 2007, 2008, and that the language models had that similar problem. It used to take a lot of setup and a lot of expense to get provisioned up; now it's easy. So this is the next wave of innovation. So how do you guys see that from where we are right now? Are we at that point, that moment where it's that cloud-like experience for LLMs and large language models? >> Yeah, go ahead John. >> I think the answer is yes. We see a number of large companies that are training these and serving these, some of which are being co-interviewed in this episode. I think we're at that point. Like, you can hit one of these with a simple, single line of Python, hitting an API, you can boot this up in seconds if you want. It's easy.
I joined Capital One, they acquired my last company, and shortly after I joined, they asked me to start their AI team. And so even though I've been doing AI for a long time, I started my career back at DARPA, it was the first time I was really working at scale in AI at an organization where there were hundreds of millions of dollars in revenue at stake with the operation of these models, and where they were impacting millions of people's financial livelihoods. And so it just got me hyper-focused on these issues around making sure that your AI worked well, and it worked well for your company, and it worked well for the people who were being affected by it. At the time when I was doing this, 2016, 2017, 2018, there just wasn't any tooling out there to support this production management, model monitoring phase of the life cycle. And so we basically left to start the company that I wanted to exist. And John has his own story. I'll let you share that one, John. >> Go ahead John, you're up. >> Yeah, so I'm coming at this from a different world. So I'm on leave now from a tenured role in academia, where I was leading a large lab focusing on the intersection of machine learning and economics. And so questions like fairness or the response to dynamism in the underlying environment have been around for quite a long time in that space. And so I've been thinking very deeply about some of those more R and D style questions, as well as having deployed some automation code across a couple of different industries, some in online advertising, some in the healthcare space and so on, where concerns of, again, fairness come to bear. And so Adam and I connected to understand the space of what that might look like in the 2018, 2019 realm, from a quantitative and from a human-centered point of view. And so we booted things up from there. >> Yeah, bringing that applied engineering and R and D into the Capital One DNA that he had at scale. I could see that fit.
I got to ask you now, next step, as you guys move out and think about LLMs and the recent AI news around the generative models and the foundational models like ChatGPT, how should we be looking at that news? And everyone watching might be thinking the same thing. I know at the board level companies are like, we should refactor our business, this is the future. It's that kind of moment, and the tech team's like, okay, boss, how do we do this again? Or are they prepared? How should we be thinking? How should people watching be thinking about LLMs? >> Yeah, I think they really are transformative. And so, I mean, we're seeing companies all over the place. Everything from large tech companies to a lot of our large enterprise customers are launching significant projects at core parts of their business. And so, yeah, I would be surprised, if you're serious about becoming an AI native company, which most leading companies are, then this is a trend that you need to be taking seriously. And we're seeing the adoption rate. It's funny, I would say the AI adoption in the broader business world really started, let's call it four or five years ago, and it was a relatively slow adoption rate, but I think all that kind of investment in and scaling the maturity curve has paid off, because the rate at which people are adopting and deploying systems based on this is tremendous. I mean, this has all just happened in the last few months and we're already seeing people get systems into production. So, now there's a lot of things you have to guarantee in order to put these in production in a way that basically is additive to your business and doesn't cause more headaches than it solves. And so that's where we help customers, is how do you put these out there in a way that they're going to represent your company well, they're going to perform well, they're going to do their job and do it properly. >> So in the use case, as a customer, as I think about this, there's workflows.
They might have had an MLOps team that's around IT. Their inference engines are out there. They probably don't have visibility on, say, how much it costs, they're kicking the tires. When you look at the deployment, there's a cost piece, there's a workflow piece, there's fairness, as you mentioned, John. What should I be thinking about if I'm going to be deploying stuff into production? I got to think about those things. What's your opinion? >> Yeah, I'm happy to dive in on that one. So monitoring in general is extremely important once you have one of these LLMs in production, and there have been some changes versus traditional monitoring, which we can dive deeper into, that LLMs have really accelerated. But a lot of that bread and butter style of things you should be looking out for remain just as important as they are for what you might call traditional machine learning models. So the underlying environment of data streams, the way users interact with these models, these are all changing over time. And so any performance metrics that you care about need to be tracked: traditional ones like accuracy, if you can define that for an LLM, ones around, for example, fairness or bias, if that is a concern for your particular use case, and so on. Now there are some interesting changes that LLMs are bringing along as well. So most ML models in production that we see are relatively static, in the sense that they're not getting flipped more than maybe once a day or once a week, or they're just set once and then not changed ever again. With LLMs, there's this ongoing value alignment or collection of preferences from users that is often constantly updating the model. And so that opens up all sorts of vectors for, I won't say attack, but for problems to arise in production. Like users might learn to use your system in a different way and thus change the way those preferences are getting collected, and thus change your system in ways that you never intended.
So maybe that went through governance already internally at the company and now it's totally, totally changed, and it's through no fault of your own, but you need to be watching over that for sure. >> Talk about the reinforcement learning from human feedback. How's that factoring into the LLMs? Is that part of it? Should people be thinking about that? Is that a component that's important? >> It certainly is, yeah. So this is one of the big tweaks that happened with InstructGPT, which is the basis model behind ChatGPT and has since gone on to be used all over the place. So value alignment through RLHF, like you mentioned, is a very interesting space to get into, and it's one that you need to watch over. Like, you're asking humans for feedback over outputs from a model and then you're updating the model with respect to that human feedback. And now you've thrown humans into the loop here in a way that is just going to complicate things. And it certainly helps in many ways. Let's say that you're deploying an internal chatbot at an enterprise, you could ask humans to align that LLM behind the chatbot to, say, company values. And so you're collecting feedback about these company values, and that's going to scoot that chatbot that you're running internally more toward the kind of language that you'd like to use internally on, like, a Slack channel or something like that. Watching over that model, I think in that specific case, that's a compliance and HR issue as well. So while it is part of the greater LLM stack, you can also view that as an independent bit to watch over.
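John's point about tracking rolling performance metrics as usage and preference collection shift over time can be made concrete with a small sketch. This is a generic, hypothetical monitor written for intuition only, not Arthur's product or API; the window size, threshold, and metric are all invented for the example.

```python
# Minimal sketch of rolling-metric monitoring for a deployed model.
# Each production inference gets a quality score (accuracy proxy, fairness
# metric, user-feedback rating, etc.); we alert when the rolling mean drops.
from collections import deque


class RollingMonitor:
    """Track a scored metric over a sliding window and flag degradation."""

    def __init__(self, window: int = 1000, alert_threshold: float = 0.85):
        self.scores = deque(maxlen=window)  # oldest scores fall off automatically
        self.alert_threshold = alert_threshold

    def record(self, score: float) -> bool:
        """Record one scored inference; return True if the rolling mean
        has dropped below the alert threshold (i.e. time to investigate)."""
        self.scores.append(score)
        rolling_mean = sum(self.scores) / len(self.scores)
        return rolling_mean < self.alert_threshold


# Toy usage: two healthy inferences, then a bad one drags the mean down.
monitor = RollingMonitor(window=3, alert_threshold=0.8)
print(monitor.record(0.9))   # healthy
print(monitor.record(0.95))  # healthy
print(monitor.record(0.5))   # rolling mean falls below 0.8, alert fires
```

In a real deployment you would track several such metrics in parallel (accuracy, bias measures, input-distribution statistics), precisely because, as John notes, the input streams and preference-collection behavior keep shifting underneath you.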
This is a tough question. >> Yeah, for sure. So some of the examples that you'll see online where these chatbots go off the rails are obviously humans trying to break the system, but some of them clearly aren't. And that's because these are large statistical models and we don't know what's going to pop out of them all the time. And even if you're doing as much in-house testing at the big companies like the Coheres and the OpenAIs of the world, to try to prevent things like toxicity or racism or other sorts of bad content that might lead to bad PR, you're never going to catch all of these possible holes in the model itself. And so, again, it's very, very important to keep watching over that while it's in production. >> On the business model side, how are you guys doing? What's the approach? How do you guys engage with customers? Take a minute to explain the customer engagement. What do they need? What do you need? How's that work? >> Yeah, I can talk a little bit about that. So it's really easy to get started. It's literally a matter of just handing out an API key and people can get started. We also offer versions that can be installed on-prem, because we find a lot of our customers have models that deal with very sensitive data. So you can run it in your cloud account or use our cloud version. And so yeah, it's pretty easy to get started with this stuff. We find people start using it a lot of times during the validation phase, 'cause that way they can start baselining performance, they can do champion/challenger, they can really kind of baseline the performance of, maybe they're considering different foundation models. And so it's a really helpful tool for understanding differences in the way these models perform.
And then from there they can just flow that into their production inferencing, so that as these systems are out there, you have really kind of real time monitoring for anomalies and for all sorts of weird behaviors, as well as that continuous feedback loop that helps you make your product better. And with observability, you can run all sorts of aggregated reports to really understand what's going on with these models when they're out there deciding. I should also add that we just today have another way to adopt Arthur, and that is we are in the AWS marketplace, and so we are available there just to make it that much easier to use your cloud credits, skip the procurement process, and get up and running really quickly. >> And that's great 'cause Amazon's got SageMaker, which handles a lot of privacy stuff, all kinds of cool things, or you can get down and dirty. So I got to ask on the next one, production is a big deal, getting stuff into production. What have you guys learned that you could share with folks watching? Is there a cost issue? I got to monitor, obviously you brought that up, we talked about even the reinforcement issues, all these things are happening. What are the big learnings that you could share for people that are going to put these into production to watch out for, to plan for, or be prepared for? Hope for the best, plan for the worst. What's your advice? >> I can give a couple opinions there, and I'm sure Adam has some too. Well, yeah, the big one from my side is, again, I had mentioned this earlier, it's just the input data streams, because humans are also exploring how they can use these systems to begin with. It's really, really hard to predict the type of inputs you're going to be seeing in production.
Especially, we always talk about chatbots, but for any generative text task like this, let's say you're taking in news articles and summarizing them or something like that, it's very hard to get a good sampling even of the set of news articles, in such a way that you can really predict what's going to pop out of that model. So to me, adversarial maybe isn't the word that I would use, but it's an unnatural, shifting input distribution of prompts that you might see for these models. That's certainly one. And then the second one that I would talk about is, it can be hard to understand the costs, the inference time costs, behind these LLMs. So the pricing on these is always changing as the models change size. It might go up, it might go down based on model size, based on energy cost and so on, but your pricing is per token or per thousand tokens, and that I think can be difficult for some clients to wrap their head around. Again, you don't know how these systems are going to be used after all, so it can be tough. And so again, that's another metric that really should be tracked. >> Yeah, and there's a lot of trade off choices in there with like, how many tokens do you want at each step and in the sequence, and based on, you have (indistinct) and you reject these tokens, and so based on how your system's operating, that can make the cost highly variable. And that's if you're using an API version where you're paying per token. A lot of people also choose to run these internally, and as John mentioned, the inference time on these is significantly higher than a traditional model, even an NLP classification model or tabular data model, like orders of magnitude higher. And so you really need to understand how that, as you're constantly iterating on these models and putting out new versions and new features in these models, how that's affecting the overall scale of that inference cost, because you can use a lot of computing power very quickly with these prompts.
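The per-token cost math John and Adam describe is easy to sketch. The rate, request volume, and token count below are placeholder assumptions, not any vendor's actual pricing; real token counts vary per request, which is exactly why they suggest tracking cost as a production metric rather than estimating it once.

```python
# Back-of-the-envelope inference cost estimate for a per-token-priced LLM API.
# All numbers here are hypothetical illustrations.
def monthly_token_cost(requests_per_day: int,
                       avg_tokens_per_request: int,
                       usd_per_1k_tokens: float,
                       days: int = 30) -> float:
    """Estimated monthly spend, assuming pricing per 1,000 tokens."""
    total_tokens = requests_per_day * avg_tokens_per_request * days
    return total_tokens / 1000 * usd_per_1k_tokens


# 50k requests/day at ~750 tokens each (prompt + completion),
# at a hypothetical $0.002 per 1k tokens:
cost = monthly_token_cost(50_000, 750, 0.002)
print(f"${cost:,.2f}/month")  # roughly $2,250/month at these assumptions
```

Note how sensitive the result is to average tokens per request: doubling prompt length or completion length doubles the bill, which is the "highly variable" cost behavior Adam flags.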
>> Yeah, scale, performance, price all come together. I got to ask while we're here on the secret sauce of the company, if you had to describe it to people out there watching, what's the secret sauce of the company? What's the key to your success? >> Yeah, so John leads our research team and they've done a number of really cool things. I think AI, as much as it's been hyped for a while, commercial AI at least is still really in its infancy. And so the way we're able to pioneer new ways to think about performance for computer vision, NLP, and LLMs is probably the thing that I'm proudest about. John and his team publish papers all the time at NeurIPS and other places. But I think it's really being able to define what performance means for basically any kind of model type, and give people really powerful tools to understand that on an ongoing basis. >> John, secret sauce, how would you describe it? You got all the action happening all around you. >> Yeah, well, I do appreciate Adam talking me up like that. No, I. (all laughing) >> Furrier: Props to you. >> I would also say a couple of other things here. So we have a very strong engineering team, and so I think some early hires there really set the standard at a very high bar that we've maintained as we've grown. And I think that's really paid dividends as scalability has become even more of a challenge in these spaces, right? And so that's not just scalability when it comes to LLMs, that's scalability when it comes to millions of inferences per day, that kind of thing, as well in traditional ML models. And I think, compared to potential competitors, that's really... Well, it's made us able to just operate more efficiently and pass that along to the client. >> Yeah, and I think the infancy comment is really important because it's the beginning. There really is a long journey ahead. A lot of change coming, like I said, it's a huge wave.
So I'm sure you guys got a lot of planning at the foundation even for your own company, so I appreciate the candid response there. Final question for you guys is, what should the top things be for a company in 2023? If I'm going to set the agenda and I'm a customer moving forward, putting the pedal to the metal, so to speak, what are the top things I should be prioritizing, or I need to do, to be successful with AI in 2023? >> Yeah, I think, so number one, as we've been talking about this entire episode, things are changing so quickly, and the opportunities for business transformation and really disrupting different applications, different use cases, is almost, I don't think we've even fully comprehended how big it is. And so really digging into your business and understanding where you can apply these new sets of foundation models, that's a top priority. The interesting thing is I think there's another force at play, which is the macroeconomic conditions, and a lot of places are having to work harder to justify budgets. So in the past, a couple years ago maybe, they had a blank check to spend on AI and AI development at a lot of large enterprises, limited primarily by the amount of talent they could scoop up. Nowadays these expenditures are getting scrutinized more. And so one of the things that we really help our customers with is really calculating the ROI on these things. And so if you have models out there performing and you have a new version that you can put out that lifts the performance by 3%, how many tens of millions of dollars does that mean in business benefit? Or if I want to get approval from the CFO to spend a few million dollars on this new project, how can I bake in from the beginning the tools to really show the ROI along the way? Because with these systems, when done well, for a software project the ROI can be pretty spectacular.
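The ROI framing Adam describes, translating a model performance lift into dollar benefit against project cost, can be sketched in a few lines. All figures here are hypothetical illustrations, not numbers from the conversation's actual customers.

```python
# Toy ROI calculation of the kind described: turn a model performance
# lift into dollar benefit and compare it to project cost. All the
# input figures below are hypothetical.

def roi(annual_value_of_model: float, lift: float, project_cost: float) -> float:
    """First-year ROI (as a fraction) of a model improvement project."""
    benefit = annual_value_of_model * lift
    return (benefit - project_cost) / project_cost

# A model driving $500M of business value, a 3% lift, a $2M project:
r = roi(500e6, 0.03, 2e6)
print(f"benefit: ${500e6 * 0.03:,.0f}, ROI: {r:.0%}")  # benefit: $15,000,000, ROI: 650%
```

Even with much more conservative inputs, this is how a team can "bake in from the beginning" the arithmetic it will show the CFO.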
Like we see over a hundred percent ROI in the first year on some of these projects. And so, I think in 2023, you just need to be able to show what you're getting for that spend. >> It's a needle moving moment. You see it all the time with some of these aha moments, or like, whoa, blown away. John, I want to get your thoughts on this because one of the things that comes up a lot for companies that I talk to, that are in, I would say, the second wave coming in, maybe not the front wave of adopters, is talent and team building. You mentioned some of the hires you got were game changing for you guys and set the bar high. As you move the needle, new developers are going to need to come in. What's your advice, given that you've been a professor, you've seen students, and I know a lot of computer science people want to shift, they might not yet be skilled in AI, but they're proficient in programming, and that's going to be another opportunity with open source as things are happening. How do you talk to that next level of talent that wants to come in to this market to supplement teams and be on teams, lead teams? Any advice you have for people who want to build their teams and people who are out there and want to be a coder in AI? >> Yeah, I have advice, and this actually works for what it would take to be a successful AI company in 2023 as well, which is, just don't be afraid to iterate really quickly with these tools. The space is still being explored on what they can be used for. A lot of the tasks they're used for now, right, like creating marketing content using machine learning, that's not a new thing to do. It just works really well now. And so I'm excited to see what the next year brings in terms of folks from outside of core computer science, other engineers or physicists or chemists or whatever, who are learning how to use these increasingly easy to use tools to leverage LLMs for tasks that I think none of us have really thought about before.
So that's really, really exciting. And so toward that I would say iterate quickly. Build things on your own, build demos, show them to friends, host them online, and you'll learn along the way and you'll have something to show for it. And also you'll help us explore that space. >> Guys, congratulations with Arthur. Great company, great picks and shovels opportunities out there for everybody. Iterate fast, get in quickly and don't be afraid to iterate. Great advice and thank you for coming on and being part of the AWS showcase, thanks. >> Yeah, thanks for having us on John. Always a pleasure. >> Yeah, great stuff. Adam Wenchel, John Dickerson with Arthur. Thanks for coming on theCUBE. I'm John Furrier, your host. Generative AI and AWS. Keep it right there for more action with theCUBE. Thanks for watching. (upbeat music)
Robert Nishihara, Anyscale | AWS Startup Showcase S3 E1
(upbeat music) >> Hello everyone. Welcome to theCUBE's presentation of the "AWS Startup Showcase." The topic this episode is AI and machine learning, top startups building foundational model infrastructure. This is season three, episode one of the ongoing series covering exciting startups from the AWS ecosystem. And this time we're talking about AI and machine learning. I'm your host, John Furrier. I'm excited to be joined today by Robert Nishihara, who's the co-founder and CEO of a hot startup called Anyscale. He's here to talk about Ray, the open source project, and Anyscale's infrastructure for foundation models as well. Robert, thank you for joining us today. >> Yeah, thanks so much as well. >> I've been following your company since the founding pre pandemic, and you guys really had a great vision, scaled up, and are in a perfect position for this big wave that we all see with ChatGPT and OpenAI that's gone mainstream. Finally, AI has broken out through the ropes and now gone mainstream, so I think you guys are really well positioned. I'm looking forward to talking with you today. But before we get into it, introduce the core mission for Anyscale. Why do you guys exist? What is the North Star for Anyscale? >> Yeah, like you mentioned, there's a tremendous amount of excitement about AI right now. You know, I think a lot of us believe that AI can transform just about every different industry. So one of the things that was clear to us when we started this company was that the amount of compute needed to do AI was just exploding. Like, to actually succeed with AI, companies like OpenAI or Google, or you know, these companies getting a lot of value from AI, were not just running these machine learning models on their laptops or on a single machine. They were scaling these applications across hundreds or thousands or more machines and GPUs and other resources in the Cloud.
And so to actually succeed with AI, and this has been one of the biggest trends in computing, maybe the biggest trend in computing in, you know, in recent history, the amount of compute has been exploding. And so to actually succeed with that AI, to actually build these scalable applications and scale the AI applications, there's a tremendous software engineering lift to build the infrastructure to actually run these scalable applications. And that's very hard to do. So one of the reasons many AI projects and initiatives fail, or don't make it to production, is the need for this scale, this infrastructure lift, to actually make it happen. So our goal here with Anyscale and Ray is to make that easy, is to make scalable computing easy. So that as a developer or as a business, if you want to do AI, if you want to get value out of AI, all you need to know is how to program on your laptop. Like, all you need to know is how to program in Python. And if you can do that, then you're good to go. Then you can do what companies like OpenAI or Google do and get value out of machine learning. >> That programming example of how easy it is with Python reminds me of the early days of Cloud, when infrastructure as code was talked about: it was just making the infrastructure programmable. That's super important. That's what AI people wanted, to first program AI. That's the new trend. And I want to understand, if you don't mind explaining, the relationship that Anyscale has to these foundational models and in particular the large language models, also called LLMs, as seen with OpenAI and ChatGPT. Before you get into the relationship that you have with them, can you explain why the hype around foundational models? Why are people going crazy over foundational models? What is it and why is it so important?
>> Yeah, so foundation models are incredibly important because they enable businesses and developers to get value out of machine learning, to use machine learning off the shelf with these large models that have been trained on tons of data and that are useful out of the box. And then, of course, you know, as a business or as a developer, you can take those foundational models and repurpose them or fine tune them or adapt them to your specific use case and what you want to achieve. But it's much easier to do that than to train them from scratch. And for people to actually use foundation models, there are three main types of workloads or problems that need to be solved. One is training these foundation models in the first place, like actually creating them. The second is fine tuning them and adapting them to your use case. And the third is serving them and actually deploying them. Okay, so Ray and Anyscale are used for all of these three different workloads. Companies like OpenAI or Cohere that train large language models, or open source versions like GPT-J, do that on top of Ray. There are many startups and other businesses that don't want to train the large underlying foundation models, but that do want to fine tune them, do want to adapt them to their purposes, and build products around them and serve them; those are also using Ray and Anyscale for that fine tuning and that serving. And so the reason that Ray and Anyscale are important here is that, you know, building and using foundation models requires a huge scale. It requires a lot of data. It requires a lot of compute, GPUs, TPUs, other resources. And to actually take advantage of that and actually build these scalable applications, there's a lot of infrastructure that needs to happen under the hood. And so you can either use Ray and Anyscale to take care of that and manage the infrastructure and solve those infrastructure problems.
Or you can build the infrastructure and manage the infrastructure yourself, which you can do, but it's going to slow your team down. It's going to, you know, many of the businesses we work with simply don't want to be in the business of managing infrastructure and building infrastructure. They want to focus on product development and move faster. >> I know you got a keynote presentation we're going to go to in a second, but I think you hit on something I think is the real tipping point, doing it yourself, hard to do. These are things where opportunities are and the Cloud did that with data centers. Turned a data center and made it an API. The heavy lifting went away and went to the Cloud so people could be more creative and build their product. In this case, build their creativity. Is that kind of what's the big deal? Is that kind of a big deal happening that you guys are taking the learnings and making that available so people don't have to do that? >> That's exactly right. So today, if you want to succeed with AI, if you want to use AI in your business, infrastructure work is on the critical path for doing that. To do AI, you have to build infrastructure. You have to figure out how to scale your applications. That's going to change. We're going to get to the point, and you know, with Ray and Anyscale, we're going to remove the infrastructure from the critical path so that as a developer or as a business, all you need to focus on is your application logic, what you want the the program to do, what you want your application to do, how you want the AI to actually interface with the rest of your product. Now the way that will happen is that Ray and Anyscale will still, the infrastructure work will still happen. It'll just be under the hood and taken care of by Ray in Anyscale. And so I think something like this is really necessary for AI to reach its potential, for AI to have the impact and the reach that we think it will, you have to make it easier to do. 
>> And just for clarification, to point out, if you don't mind explaining the relationship of Ray and Anyscale real quick, just before we get into the presentation. >> So Ray is an open source project. We created it. We were at Berkeley doing machine learning. We started Ray in order to provide a simple open source tool for building and running scalable applications. And Anyscale is the managed version of Ray; basically we will run Ray for you in the Cloud, provide a lot of tools around the developer experience and managing the infrastructure, and provide more performance and superior infrastructure. >> Awesome. I know you got a presentation on Ray and Anyscale and you guys are positioning as the infrastructure for foundational models. So I'll let you take it away and then when you're done presenting, we'll come back, I'll probably grill you with a few questions and then we'll close it out, so take it away. >> Robert: Sounds great. So I'll say a little bit about how companies are using Ray and Anyscale for foundation models. The first thing I want to mention is just why we're doing this in the first place. And the underlying observation, the underlying trend here, and this is a plot from OpenAI, is that the amount of compute needed to do machine learning has been exploding. It's been growing at something like 35 times every 18 months. This is absolutely enormous. And other people have written papers measuring this trend and you get different numbers. But the point is, no matter how you slice and dice it, it's an astronomical rate. Now if you compare that to something we're all familiar with, like Moore's Law, which says that, you know, processor performance doubles roughly every 18 months, you can see that there's just a tremendous gap between the compute needs of machine learning applications and what you can do with a single chip, right.
So even if Moore's Law were continuing strong and, you know, doing what it used to be doing, even if that were the case, there would still be a tremendous gap between what you can do with a chip and what you need in order to do machine learning. And so given this graph, what we've seen, and what has been clear to us since we started this company, is that doing AI requires scaling. There's no way around it. It's not a nice to have, it's really a requirement. And so that led us to start Ray, which is the open source project that we started to make it easy to build these scalable Python applications and scalable machine learning applications. And since we started the project, it's been adopted by a tremendous number of companies. Companies like OpenAI, which use Ray to train their large models like ChatGPT; companies like Uber, which run all of their deep learning and classical machine learning on top of Ray; companies like Shopify or Spotify or Instacart or Lyft or Netflix, ByteDance, which use Ray for their machine learning infrastructure. Companies like Ant Group, which makes Alipay, you know, they use Ray across the board for fraud detection, for online learning, for detecting money laundering, you know, for graph processing, stream processing. Companies like Amazon, you know, run Ray at a tremendous scale on just petabytes of data every single day. And so the project has seen just enormous adoption over the past few years. And one of the most exciting use cases is really providing the infrastructure for building, training, fine tuning, and serving foundation models. So I'll say a little bit about, you know, here are some examples of companies using Ray for foundation models. Cohere trains large language models. OpenAI also trains large language models. You can think about the workloads required there: things like supervised pre-training, also reinforcement learning from human feedback.
So this is not only the regular supervised learning, but actually more complex reinforcement learning workloads that take human input about which response to a particular question, you know, is better than a certain other response, and incorporate that into the learning. There are open source versions as well, like GPT-J, also built on top of Ray, as well as projects like Alpa coming out of UC Berkeley. So these are some of the examples of exciting projects and organizations training and creating these large language models and serving them using Ray. Okay, so what actually is Ray? Well, there are two layers to Ray. At the lowest level, there's the core Ray system. This is essentially low level primitives for building scalable Python applications. Things like taking a Python function or a Python class and executing them in the cluster setting. So Ray core is extremely flexible and you can build arbitrary scalable applications on top of Ray. So on top of Ray, on top of the core system, what really gives Ray a lot of its power is this ecosystem of scalable libraries. So on top of the core system you have libraries, scalable libraries for ingesting and pre-processing data, for training your models, for fine tuning those models, for hyperparameter tuning, for doing batch processing and batch inference, for doing model serving and deployment, right. And a lot of the Ray users, the reason they like Ray is that they want to run multiple workloads. They want to train and serve their models, right. They want to load their data and feed that into training. And Ray provides common infrastructure for all of these different workloads. So that's a little overview of the different components of Ray. So why do people choose to go with Ray? I think there are three main reasons. The first is the unified nature. The fact that it is common infrastructure for scaling arbitrary workloads, from data ingest to pre-processing to training to inference and serving, right.
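The "low level primitives" Robert describes, taking a Python function or class and executing it in a cluster setting, are Ray's remote tasks and actors, spelled `@ray.remote` / `f.remote(x)` / `ray.get(futures)`. As a rough, Ray-free sketch of the same fan-out-and-gather pattern using only the standard library (Ray schedules these tasks across a whole cluster rather than a local thread pool):

```python
# Stdlib sketch of the pattern Ray's core primitive provides: take an
# ordinary Python function, submit many invocations as parallel tasks,
# then gather the results. Ray itself spells this @ray.remote /
# f.remote(x) / ray.get(futures), and runs the tasks across a cluster
# instead of a local thread pool.
from concurrent.futures import ThreadPoolExecutor

def preprocess(record: int) -> int:
    # Stand-in for real work (feature extraction, scoring a shard, ...)
    return record * record

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(preprocess, r) for r in range(8)]  # fan out
    results = [f.result() for f in futures]                   # gather

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The appeal Robert points to is that the code stays ordinary Python; only the decorator and the scheduler change when you go from a laptop to a cluster.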
This also includes the fact that it's future proof. AI is incredibly fast moving. And so many people, many companies that have built their own machine learning infrastructure and standardized on particular workflows for doing machine learning have found that their workflows are too rigid to enable new capabilities. If they want to do reinforcement learning, if they want to use graph neural networks, they don't have a way of doing that with their standard tooling. And so Ray, being future proof and being flexible and general, gives them that ability. Another reason people choose Ray and Anyscale is the scalability. This is really our bread and butter. This is the whole point of Ray, you know, making it easy to go from your laptop to running on thousands of GPUs, making it easy to scale your development workloads and run them in production, making it easy to scale, you know, training, to scale data ingest, pre-processing and so on. So scalability and performance, you know, are critical for doing machine learning and that is something that Ray provides out of the box. And lastly, Ray is an open ecosystem. You can run it anywhere. You can run it on any Cloud provider: Google Cloud, AWS, Azure. You can run it on your Kubernetes cluster. You can run it on your laptop. It's extremely portable. And not only that, it's framework agnostic. You can use Ray to scale arbitrary Python workloads, and it integrates with libraries like TensorFlow or PyTorch or JAX or XGBoost or Hugging Face or PyTorch Lightning, right, or Scikit-learn or just your own arbitrary Python code. It's open source. And in addition to integrating with the rest of the machine learning ecosystem and these machine learning frameworks, you can use Ray along with all of the other tooling in the machine learning ecosystem. That's things like Weights & Biases or MLflow, right.
Or, you know, different data platforms like Databricks, you know, Delta Lake or Snowflake, or tools for model monitoring, for feature stores, all of these integrate with Ray. And that's, you know, Ray provides that kind of flexibility so that you can integrate it into the rest of your workflow. And then Anyscale is the scalable compute platform that's built on top, you know, that provides Ray. So Anyscale is a managed Ray service that runs in the Cloud. And what Anyscale does is it offers the best way to run Ray. And if you think about what you get with Anyscale, there are fundamentally two things. One is about moving faster, accelerating the time to market. And you get that by having the managed service, so that as a developer you don't have to worry about managing infrastructure, you don't have to worry about configuring infrastructure. It also provides, you know, optimized developer workflows. Things like easily moving from development to production, things like having the observability tooling, the debuggability to actually easily diagnose what's going wrong in a distributed application. So things like the dashboards and the other kinds of tooling for collaboration, for monitoring and so on. And then on top of that, so that's the first bucket, developer productivity, moving faster, faster experimentation and iteration. The second reason that people choose Anyscale is superior infrastructure. So this is things like, you know, cost efficiency, being able to easily take advantage of spot instances, being able to get higher GPU utilization, things like faster cluster startup times and auto scaling. Things like just overall better performance and faster scheduling. And so these are the kinds of things that Anyscale provides on top of Ray. It's the managed infrastructure. It's fast, it's, like, the developer productivity and velocity as well as performance. So this is what I wanted to share about Ray and Anyscale. >> John: Awesome. >> Provide that context.
But John, I'm curious what you think. >> I love it. I love the, so first of all, it's a platform because that's the platform architecture right there. So just to clarify, this is an Anyscale platform, not- >> That's right. >> Tools. So you got tools in the platform. Okay, that's key. Love that managed service. Just curious, you mentioned Python multiple times, is that because of PyTorch and TensorFlow or Python's the most friendly with machine learning or it's because it's very common amongst all developers? >> That's a great question. Python is the language that people are using to do machine learning. So it's the natural starting point. Now, of course, Ray is actually designed in a language agnostic way and there are companies out there that use Ray to build scalable Java applications. But for the most part right now we're focused on Python and being the best way to build these scalable Python and machine learning applications. But, of course, down the road there always is that potential. >> So if you're slinging Python code out there and you're watching that, you're watching this video, get on Anyscale bus quickly. Also, I just, while you were giving the presentation, I couldn't help, since you mentioned OpenAI, which by the way, congratulations 'cause they've had great scale, I've noticed in their rapid growth 'cause they were the fastest company to the number of users than anyone in the history of the computer industry, so major successor, OpenAI and ChatGPT, huge fan. I'm not a skeptic at all. I think it's just the beginning, so congratulations. But I actually typed into ChatGPT, what are the top three benefits of Anyscale and came up with scalability, flexibility, and ease of use. Obviously, scalability is what you guys are called. >> That's pretty good. >> So that's what they came up with. So they nailed it. Did you have an inside prompt training, buy it there? Only kidding. (Robert laughs) >> Yeah, we hard coded that one. 
>> But that's the kind of thing that came up really, really quickly if I asked it to write a sales document, it probably will, but this is the future interface. This is why people are getting excited about the foundational models and the large language models because it's allowing the interface with the user, the consumer, to be more human, more natural. And this is clearly will be in every application in the future. >> Absolutely. This is how people are going to interface with software, how they're going to interface with products in the future. It's not just something, you know, not just a chat bot that you talk to. This is going to be how you get things done, right. How you use your web browser or how you use, you know, how you use Photoshop or how you use other products. Like you're not going to spend hours learning all the APIs and how to use them. You're going to talk to it and tell it what you want it to do. And of course, you know, if it doesn't understand it, it's going to ask clarifying questions. You're going to have a conversation and then it'll figure it out. >> This is going to be one of those things, we're going to look back at this time Robert and saying, "Yeah, from that company, that was the beginning of that wave." And just like AWS and Cloud Computing, the folks who got in early really were in position when say the pandemic came. So getting in early is a good thing and that's what everyone's talking about is getting in early and playing around, maybe replatforming or even picking one or few apps to refactor with some staff and managed services. So people are definitely jumping in. So I have to ask you the ROI cost question. You mentioned some of those, Moore's Law versus what's going on in the industry. When you look at that kind of scale, the first thing that jumps out at people is, "Okay, I love it. Let's go play around." But what's it going to cost me? Am I going to be tied to certain GPUs? 
What's the landscape look like from an operational standpoint, from the customer? Are they locked in, or the benefit was flexibility, are you flexible to handle any Cloud? What is the customers, what are they looking at? Basically, that's my question. What's the customer looking at? >> Cost is super important here, and many of the companies, I mean, companies are spending a huge amount on their Cloud computing, on AWS, and on doing AI, right. And I think a lot of the advantage of Anyscale, what we can provide here, is not only better performance, but cost efficiency. Because if we can run something faster and more efficiently, it can also use less resources and you can lower your Cloud spending, right. We've seen companies go from, you know, 20% GPU utilization with their current setup and the current tools they're using, to running on Anyscale and getting more like 95, you know, 100% GPU utilization. That's something like a five x improvement right there. So depending on the kind of application you're running, you know, it's a significant cost savings. We've seen companies that are, you know, processing petabytes of data every single day with Ray get order of magnitude cost savings by switching from what they were previously doing to running their application on Ray. And when you have applications that are spending, you know, potentially $100 million a year, getting a 10 X cost savings is just absolutely enormous. So these are some of the kinds of- >> Data infrastructure is super important. Again, if the customer, if you're a prospect to this and thinking about going in here, just like the Cloud, you got infrastructure, you got the platform, you got SaaS. The same kind of thing's going to go on in AI. So I want to get into that, you know, ROI discussion and some of the impact with your customers that are leveraging the platform. But first I hear you got a demo.
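The utilization numbers Robert cites reduce to simple arithmetic: for a fixed workload, GPU-hours, and therefore spend, scale inversely with utilization. A small sketch using the conversation's 20% and 95% figures; the dollar amount is a hypothetical placeholder.

```python
# The arithmetic behind the utilization example: for the same workload,
# GPU-hours (and therefore spend) scale inversely with utilization.
# The baseline dollar figure is a hypothetical placeholder.

def cost_after_utilization_gain(baseline_cost: float,
                                old_util: float,
                                new_util: float) -> float:
    """Spend for the same work after raising GPU utilization."""
    return baseline_cost * old_util / new_util

# $1M/year of GPU spend at 20% utilization, moved to 95% utilization:
new_spend = cost_after_utilization_gain(1_000_000, 0.20, 0.95)
print(f"${new_spend:,.0f} per year, {1_000_000 / new_spend:.1f}x cheaper")
```

Going from 20% to 95% is a factor of 0.95/0.20, roughly the "five x improvement" he mentions.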
So what I have open here is the Anyscale UI. I've started a little Anyscale Workspace. So Workspaces are the Anyscale concept for interactive developments, right. So here, imagine I'm just, you want to have a familiar experience like you're developing on your laptop. And here I have a terminal. It's not on my laptop. It's actually in the cloud running on Anyscale. And I'm just going to kick this off. This is going to train a large language model, so OPT. And it's doing this on 32 GPUs. We've got a cluster here with a bunch of CPU cores, bunch of memory. And as that's running, and by the way, if I wanted to run this on instead of 32 GPUs, 64, 128, this is just a one line change when I launch the Workspace. And what I can do is I can pull up VS code, right. Remember this is the interactive development experience. I can look at the actual code. Here it's using Ray train to train the torch model. We've got the training loop and we're saying that each worker gets access to one GPU and four CPU cores. And, of course, as I make the model larger, this is using deep speed, as I make the model larger, I could increase the number of GPUs that each worker gets access to, right. And how that is distributed across the cluster. And if I wanted to run on CPUs instead of GPUs or a different, you know, accelerator type, again, this is just a one line change. And here we're using Ray train to train the models, just taking my vanilla PyTorch model using Hugging Face and then scaling that across a bunch of GPUs. And, of course, if I want to look at the dashboard, I can go to the Ray dashboard. There are a bunch of different visualizations I can look at. I can look at the GPU utilization. I can look at, you know, the CPU utilization here where I think we're currently loading the model and running that actual application to start the training. And some of the things that are really convenient here about Anyscale, both I can get that interactive development experience with VS code. 
You know, I can look at the dashboards. I can monitor what's going on. I have a terminal, it feels like my laptop, but it's actually running on a large cluster, with however many GPUs or other resources I want. And so it's really trying to combine the best of having the familiar experience of programming on your laptop, but with the benefits, you know, of being able to take advantage of all the resources in the Cloud to scale. And, you know, you're talking about cost efficiency. One of the biggest reasons that people waste money, one of the silly reasons for wasting money, is just forgetting to turn off your GPUs. And what you can do here is, of course, things will auto-terminate if they're idle. But imagine you go to sleep, I have this big cluster. You can turn it off, shut off the cluster, come back tomorrow, restart the Workspace, and you know, your big cluster is back up and all of your code changes are still there. All of your local file edits. It's like you just closed your laptop and came back and opened it up again. And so this is the kind of experience we want to provide for our users. So that's what I wanted to share with you. >> Well, I think that whole, couple of things, lines of code change, single line of code change, that's game changing. And then the cost thing, I mean human error is a big deal. People pass out at their computer. They've been coding all night or they just forget about it. I mean, and then it's just like leaving the lights on or your water running in your house. At the scale that it is, the numbers will add up. That's a huge deal. So I think, you know, back in the old days, idle compute was just compute sitting there. But now it's data cranking through the models, and that's a big point.
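The pattern Robert demos, a per-worker training loop where scaling is a one-line configuration change, can be sketched without Ray itself. This stdlib-only simulation stands in for the real setup (Ray Train's trainer with one GPU and four CPU cores per worker, as shown in the demo); the worker count and the toy per-worker "work" are purely illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

# The "one line" you change to scale out: 32, 64, 128 workers.
# In the demo, each worker maps to one GPU plus four CPU cores.
NUM_WORKERS = 4

def train_loop_per_worker(rank):
    # Stand-in for one worker's shard of the training loop.
    # A real loop would run forward/backward passes on its GPU;
    # here we just do deterministic arithmetic over the shard.
    return sum(range(rank * 100, (rank + 1) * 100))

def fit():
    # The framework (Ray Train, in the demo) places one copy of the
    # loop on each worker and collects the results; we simulate that
    # with a thread pool.
    with ThreadPoolExecutor(max_workers=NUM_WORKERS) as pool:
        return list(pool.map(train_loop_per_worker, range(NUM_WORKERS)))

results = fit()
print(len(results))  # one result per worker
```

The design point is that `train_loop_per_worker` never mentions the cluster size, which is exactly what makes "32 to 128 GPUs" a one-line change in the real system.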
>> Another thing I want to add there about cost efficiency is that we make it really easy, if you're running on Anyscale, to use spot instances, these preemptible instances that can just be significantly cheaper than the on-demand instances. And so we see our customers go from what they were doing before, not using these spot instances 'cause they didn't have the infrastructure around it, the fault tolerance to handle the preemption and things like that, to being able to just check a box, use spot instances, and save a bunch of money. >> You know, this was my whole, my feature article at re:Invent last year when I met with Adam Selipsky, this next-gen Cloud is here. I mean, it's not auto scale, it's infrastructure scale. It's agility. It's flexibility. I think this is where the world needs to go. Almost what DevOps did for Cloud. And what you were showing me in that demo had this whole SRE vibe. And remember Google had site reliability engineers to manage all those servers. This is kind of like an SRE vibe for data at scale. I mean, a similar kind of order of magnitude. I mean, I might be a little bit off base there, but how would you explain it? >> It's a nice analogy. I mean, what we are trying to do here is get to the point where developers don't think about infrastructure. Where developers only think about their application logic. And where businesses can do AI, can succeed with AI, and build these scalable applications, but they don't have to build, you know, an infrastructure team. They don't have to develop that expertise. They don't have to invest years in building their internal machine learning infrastructure. They can just focus on the Python code, on their application logic, and run the stuff out of the box. >> Awesome. Well, I appreciate the time. Before we wrap up here, give a plug for the company. I know you got a couple websites. Again, Ray's got its own website. You got Anyscale.
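The fault tolerance Robert mentions usually comes down to checkpointing: when a spot instance is reclaimed mid-run, training resumes from the last saved step instead of starting over. A minimal sketch of that idea, with a made-up file format and step counts (this is not Anyscale's actual mechanism, just the general pattern):

```python
import json
import os
import tempfile

def train(total_steps, ckpt_path, preempt_at=None):
    """Run (or resume) a toy training loop, checkpointing progress."""
    step = 0
    if os.path.exists(ckpt_path):  # resuming after a preemption
        with open(ckpt_path) as f:
            step = json.load(f)["step"]
    while step < total_steps:
        if preempt_at is not None and step == preempt_at:
            # Spot instance reclaimed: persist progress and stop.
            with open(ckpt_path, "w") as f:
                json.dump({"step": step}, f)
            return step
        step += 1  # stand-in for one optimizer step
    with open(ckpt_path, "w") as f:
        json.dump({"step": step}, f)
    return step

ckpt = os.path.join(tempfile.mkdtemp(), "ckpt.json")
first = train(1000, ckpt, preempt_at=400)  # interrupted at step 400
second = train(1000, ckpt)                 # resumes at 400, finishes
print(first, second)  # 400 1000
```

Without the checkpoint, the second run would redo all 1000 steps; with it, the preemption costs nothing but the restart, which is what makes the cheaper spot pricing usable.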
You got an event coming up. Give a plug for the company looking to hire. Put a plug in for the company. >> Yeah, absolutely. Thank you. So first of all, you know, we think AI is really going to transform every industry and the opportunity is there, right. We can be the infrastructure that enables all of that to happen, that makes it easy for companies to succeed with AI, and get value out of AI. Now we have, if you're interested in learning more about Ray, Ray has been emerging as the standard way to build scalable applications. Our adoption has been exploding. I mentioned companies like OpenAI using Ray to train their models. But really across the board companies like Netflix and Cruise and Instacart and Lyft and Uber, you know, just among tech companies. It's across every industry. You know, gaming companies, agriculture, you know, farming, robotics, drug discovery, you know, FinTech, we see it across the board. And all of these companies can get value out of AI, can really use AI to improve their businesses. So if you're interested in learning more about Ray and Anyscale, we have our Ray Summit coming up in September. This is going to highlight a lot of the most impressive use cases and stories across the industry. And if your business, if you want to use LLMs, you want to train these LLMs, these large language models, you want to fine tune them with your data, you want to deploy them, serve them, and build applications and products around them, give us a call, talk to us. You know, we can really take the infrastructure piece, you know, off the critical path and make that easy for you. So that's what I would say. And, you know, like you mentioned, we're hiring across the board, you know, engineering, product, go-to-market, and it's an exciting time. >> Robert Nishihara, co-founder and CEO of Anyscale, congratulations on a great company you've built and continuing to iterate on and you got growth ahead of you, you got a tailwind. I mean, the AI wave is here. 
I think OpenAI and ChatGPT, a customer of yours, have really opened up the mainstream visibility into this new generation of applications, user interface, role of data, large scale, how to make that programmable, so we're going to need that infrastructure. So thanks for coming on this season three, episode one of the ongoing series of the hot startups. In this case, this episode is the top startups building foundational model infrastructure for AI and ML. I'm John Furrier, your host. Thanks for watching. (upbeat music)
SiliconANGLE News | Beyond the Buzz: A deep dive into the impact of AI
(upbeat music) >> Hello, everyone, welcome to theCUBE. I'm John Furrier, the host of theCUBE in Palo Alto, California. Also it's SiliconANGLE News. Got two great guests here to talk about AI, the impact of the future of the internet, the applications, the people. Amr Awadallah, the founder and CEO, and Ed Albanese, the COO of Vectara, a new startup that emerged out of the original Cloudera, I would say, 'cause Amr's known, famous for the Cloudera founding, which was really the beginning of the big data movement. And now as AI goes mainstream, there's so much to talk about, so much to go on. And plus the new company is one of the, now what I call the wave, this next big wave, I call it the fifth wave in the industry. You know, you had PCs, you had the internet, you had mobile. This generative AI thing is real. And you're starting to see startups come out in droves. Amr obviously was founder of Cloudera, Big Data, and now Vectara. And Ed Albanese, you guys have a new company. Welcome to the show. >> Thank you. It's great to be here. >> So great to see you. Now the story is theCUBE started in the Cloudera office. Thanks to you, and your friendly entrepreneurship views that you have. We got to know each other over the years. But Cloudera had Hadoop, which was the beginning of what I call the big data wave, which then became what we now call data lakes, data oceans, and data infrastructure that's developed from that. It's almost interesting to look back 12 plus years, and see that what AI is doing now, right now, is opening up the eyes to the mainstream, and the applications are almost mind blowing. You know, Satya Nadella called it the Mosaic Moment, didn't say Netscape, he built Netscape (laughing) but called it the Mosaic Moment. You're seeing companies in startups, kind of the alpha geeks running here, because this is the new frontier, and there's real meat on the bone, in terms of like things to do. Why? Why is this happening now?
What is the confluence of forces happening, that's making this happen? >> Yeah, I mean if you go back to the Cloudera days, with big data, and so on, that was more about data processing. Like how can we process data, so we can extract numbers from it, and do reporting, and maybe take some actions, like this is a fraud transaction, or this is not. And meanwhile, many of the researchers working in the neural network, and deep neural network space, were trying to focus on data understanding, like how can I understand the data, and learn from it, so I can take actual actions, based on the data directly, just like a human does. And we were only good at doing that at the level of somebody who was five years old, or seven years old, all the way until about 2013. And starting in 2013, which is only 10 years ago, a number of key innovations started taking place, and each one added on. There was no single major innovation that just took place. It was a couple of really incremental ones, but they added on top of each other, in a very exponentially additive way, that led to, by the end of 2019, we now have models, deep neural network models, that can read and understand human text just like we do. Right? And they can reason about it, and argue with you, and explain it to you. And I think that's what is unlocking this whole new wave of innovation that we're seeing right now. So data understanding would be the essence of it. >> So it's not a Big Bang kind of theory, it's been evolving over time, and I think that the tipping point has been the advancements and other things. I mean look at cloud computing, and look how fast it just crept up on AWS. I mean AWS, if you go back three, five years ago, I was talking to Swami yesterday, and their big news about AI, expanding the Hugging Face relationship with AWS. And just three, five years ago, there wasn't model training like this out there.
But as compute comes out, and you got more horsepower, these large language models, these foundational models, they're flexible, they're not monolithic silos, they're interacting. There's a whole new, almost fusion of data happening. Do you see that? I mean is that part of this? >> Of course, of course. I mean this wave is building on all the previous waves. We wouldn't be at this point if we did not have hardware that can scale, in a very efficient way. We wouldn't be at this point, if we don't have data that we're collecting about everything we do, that we're able to process in this way. So this, this movement, this motion, this phase we're in, absolutely builds on the shoulders of all the previous phases. For some of the observers from the outside, when they see ChatGPT for the first time, for them it was like, "Oh my god, this just happened overnight." Like it didn't happen overnight. (laughing) GPT itself, like GPT-3, which is what ChatGPT is based on, was released a year ahead of ChatGPT, and many of us were seeing the power it can provide, and what it can do. I don't know if Ed agrees with that. >> Yeah, Ed? >> I do. Although I would acknowledge that the possibilities now, because of what we've hit from a maturity standpoint, have just opened up in an incredible way, that just wasn't tenable even three years ago. And that's what makes it, it's true that it developed incrementally, in the same way that, you know, the possibilities of a mobile handheld device, you know, in 2006 were there, but when the iPhone came out, the possibilities just exploded. And that's the moment we're in.
John Markoff told me the other day, that he calls it, "The five dollar toy," because it's not that big of a deal, in context of what AI's doing behind the scenes, and all the work that's done on ethics, that's happened over the years, but it has woken up the mainstream, so everyone immediately jumps to ethics. "Does it work? It's not factual." And everyone who's inside the industry is like, "This is amazing." 'Cause you have two schools of thought there. One's like, people that think this is now the beginning of next gen, this is now we're here, this ain't your grandfather's chatbot, okay? With NLP, it's got reasoning, it's got other things. >> I'm in that camp for sure. >> Yeah. Well I mean, everyone who knows what's going on is in that camp. And as the naysayers start to get through this, and they go, "Wow, it's not just plagiarizing homework, it's helping me be better. Like it could rewrite my memo, bring the lead to the top." So the format of the user interface is interesting, but it's still a data-driven app. >> Absolutely. >> So where does it go from here? 'Cause I'm not even calling this the first inning. This is like pregame, in my opinion. Where do you guys see this going, in terms of scratching the surface to what happens next? >> I mean, I'll start with, I just don't see how an application is going to look the same in the next three years. Who's going to want to input data manually, in a form field? Who is going to want, or expect, to have to put in some text in a search box, and then read through 15 different possibilities, and try to figure out which one of them actually most closely resembles the question they asked? You know, I don't see that happening. Who's going to start with an absolute blank sheet of paper, and expect no help?
That is not how an application will work in the next three years, and it's going to fundamentally change how people interact and spend time with opening any element on their mobile phone, or on their computer, to get something done. >> Yes. I agree with that. Like every single application, over the next five years, will be rewritten, to fit within this model. So imagine an HR application, I don't want to name companies, but imagine an HR application, and you go into the application and you're clicking on buttons, because you want to take two weeks of vacation, and menus, and clicking here and there, reasons and managers, versus just telling the system, "I'm taking two weeks of vacation, going to Las Vegas," book it, done. >> Yeah. >> And the system just does it for you. If you weren't complete in your input, in your description, for what you want, then the system asks you back, "Did you mean this? Did you mean that? Were you trying to also do this as well?" >> Yeah. >> "What was the reason?" And that will fix it for you, and just do it for you. So I think the user interface that we have with apps, is going to change to be very similar to the user interface that we have with each other. And that's why all these apps will need to evolve.
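The vacation example describes intent parsing with clarifying follow-ups: extract a structured request from free text, and ask back when fields are missing. A toy rule-based sketch (the field names, phrasing, and regex rules are invented for illustration; a real system would use a language model, not regexes):

```python
import re

REQUIRED = ("duration", "start")

def parse_request(utterance):
    """Extract a structured vacation request from free text (toy rules)."""
    fields = {}
    m = re.search(r"(\d+|one|two|three) weeks?", utterance, re.I)
    if m:
        fields["duration"] = m.group(0)
    m = re.search(r"(?:starting|from) (\w+ \d+)", utterance, re.I)
    if m:
        fields["start"] = m.group(1)
    missing = [f for f in REQUIRED if f not in fields]
    if missing:
        # Incomplete input: ask back, like the HR system in the example.
        return {"follow_up": f"Did you mean to specify: {', '.join(missing)}?"}
    return {"action": "book_vacation", **fields}

complete = parse_request("I'm taking two weeks of vacation starting June 5")
incomplete = parse_request("I'm taking vacation")
print(complete)
print(incomplete)
```

The shape, not the regexes, is the point: the interface accepts whatever the user says, fills what it can, and turns the gaps into a conversation instead of a form-validation error.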
But also people who are going to leverage the fact that I can get started building value. So I see a startup boom coming, and I see an application tsunami of refactoring things. >> Yes. >> So the replatforming is already kind of happening. >> Yes, >> OpenAI, chatGPT, whatever. So that's going to be a developer environment. I mean if Amazon turns this into an API, or a Microsoft, what you guys are doing. >> We're turning it into API as well. That's part of what we're doing as well, yes. >> This is why this is exciting. Amr, you've lived the big data dream, and and we used to talk, if you didn't have a big data problem, if you weren't full of data, you weren't really getting it. Now people have all the data, and they got to stand this up. >> Yeah. >> So the analogy is again, the mobile, I like the mobile movement, and using mobile as an analogy, most companies were not building for a mobile environment, right? They were just building for the web, and legacy way of doing apps. And as soon as the user expectations shifted, that my expectation now, I need to be able to do my job on this small screen, on the mobile device with a touchscreen. Everybody had to invest in re-architecting, and re-implementing every single app, to fit within that model, and that model of interaction. And we are seeing the exact same thing happen now. And one of the core things we're focused on at Vectara, is how to simplify that for organizations, because a lot of them are overwhelmed by large language models, and ML. >> They don't have the staff. >> Yeah, yeah, yeah. They're understaffed, they don't have the skills. >> But they got developers, they've got DevOps, right? >> Yes. >> So they have the DevSecOps going on. >> Exactly, yes. >> So our goal is to simplify it enough for them that they can start leveraging this technology effectively, within their applications. >> Ed, you're the COO of the company, obviously a startup. You guys are growing. You got great backup, and good team. 
You've also done a lot of business development, and technical business development in this area. If you look at the landscape right now, and I agree the apps are coming, every company I talk to has had that ChatGPT, you know, epiphany, "Oh my God, look how cool this is. Like magic." Like okay, it's code, settle down. >> Mm hmm. >> But everyone I talk to is using it in a very horizontal way. I talk to a very senior person, very tech alpha geek, very senior person in the industry, technically. They're using it for log data, they're using it for configuration of routers. And in other areas, they're using it for, every vertical has a use case. So this is horizontally scalable from a use case standpoint. When you hear horizontally scalable, the first thing in my mind is cloud, right? >> Mm hmm. >> So cloud, and scalability that way. And the data is very specialized. So now you have this vertical specialization, horizontally scalable, everyone will be refactoring. What do you see, and what are you seeing from customers, that you talk to, and prospects? >> Yeah, I mean put yourself in the shoes of an application developer, who is actually trying to make their application a bit more like magic. And to have that soon-to-be, honestly, expected experience. They've got to think about things like performance, and how efficiently they can actually execute a query, or a question. They've got to think about cost. Generative isn't cheap, like the inference of it. And so you've got to be thoughtful about how and when you take advantage of it, you can't use it as a, you know, everything looks like a nail, and I've got a hammer, and I'm going to hit everything with it, because that will be wasteful. Developers also need to think about how they're going to take advantage of, but not lose their own data. So there has to be some controls around what they feed into the large language model, if anything.
Like, should they fine tune a large language model with their own data? Can they keep it logically separated, but still take advantage of the powers of a large language model? And they've also got to take advantage, and be aware of the fact that when data is generated, that it is a different class of data. It might not fully be their own. >> Yeah. >> And it may not even be fully verified. And so when the logical cycle starts, of someone making a request, the relationship between that request, and the output, those things have to be stored safely, logically, and identified as such. >> Yeah. >> And taken advantage of in an ongoing fashion. So these are mega problems, each one of them independently, that, you know, you can think of it as middleware companies need to take advantage of, and think about, to help the next wave of application development be logical, sensible, and effective. It's not just calling some raw API on the cloud, like OpenAI, and then just, you know, you get your answer and you're done, because that is a very brute force approach. >> Well also I will point, first of all, I agree with your statement about the apps experience, that's going to be expected, form filling. Great point. The interesting thing about ChatGPT. >> Sorry, it's not just form filling, it's any action you would like to take. >> Yeah. >> Instead of clicking, and dragging, and dropping, and doing it on a menu, or on a touch screen, you just say it, and it happens perfectly. >> Yeah. It's a different interface. And that's why I love that UI/UX experience, that's the people falling out of their chair moment with ChatGPT, right? But a lot of the things with ChatGPT, if you feed it right, it works great. If you feed it wrong and it goes off the rails, it goes off the rails big. >> Yes, yes. >> So the Bing catastrophes. >> Yeah. >> And that's an example of garbage in, garbage out, classic old school kind of comp-sci phrase that we all use. >> Yep. >> Yes.
>> This is about data injection, right? It reminds me of the old SQL days, if you could sling some SQL, you were a magician, you know, to get the right answer, it's pretty much there. So you got to feed the AI. >> You do. Some people call this, the early word to describe it, prompt engineering. You know, old school, you know, search, or, you know, engagement with data would be, I'm going to, I have a question or I have a query. New school is, I have to issue it a prompt, because I'm trying to get, you know, an action or a reaction, from the system. And the act of engineering, there are a lot of different ways you could do it, all the way from, you know, raw, just I'm going to send you whatever I'm thinking. >> Yeah. >> And you get the unintended outcomes, to more constrained, where I'm going to just use my own data, and I'm going to constrain the initial inputs, the data I already know that's first party, and I trust, to, you know, hyper constrain, where the application is actually, it's looking for certain elements to respond to. >> It's interesting Amr, this is why I love this, because, one, we are in the media, we're recording this video now, we'll stream it. But we got all your linguistics, we're talking. >> Yes. >> This is data. >> Yep. >> So the data quality becomes now the new intellectual property, because, if you have that prompt source data, it makes data or content, in our case, the original content, intellectual property. >> Absolutely. >> Because that's the value. And that's where you see ChatGPT fall down, is because they're trying to crawl the web, and people think it's search. It's not necessarily search, it's giving you something that you wanted. It is a lot of that, I remember in Cloudera, you said, "Ask the right questions." Remember that phrase you guys had, that slogan? >> Mm hmm. And that's prompt engineering.
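The spectrum Ed describes, from raw prompts to "hyper constrained" ones grounded in first-party data, can be sketched as a simple prompt builder. This is an illustrative pattern, not Vectara's actual implementation; the instruction wording and the numbered-source format are assumptions:

```python
def build_grounded_prompt(question, passages):
    """Constrain a language model to answer only from supplied first-party data."""
    context = "\n".join(f"[{i}] {p}" for i, p in enumerate(passages, start=1))
    return (
        "Answer the question using ONLY the numbered sources below. "
        "If they do not contain the answer, say you don't know.\n\n"
        f"Sources:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_grounded_prompt(
    "What was our Q3 refund policy?",
    ["Q3 policy doc: refunds are honored within 30 days of purchase.",
     "Support FAQ: contact billing for refund status."],
)
print(prompt)
```

The constraint and the trusted context travel together in one prompt, which is what separates "feed it right" from "feed it wrong and it goes off the rails."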
So that's exactly it, that's the reinvention of "Ask the right question" as prompt engineering. If you don't give these models the question in the right way, and very few people know how to frame it in the right way with the right context, then you will get garbage out. Right? That is the garbage in, garbage out. But if you specify the question correctly, and you provide with it the metadata that constrain what that question is going to be acted upon or answered upon, then you'll get much better answers. And that's exactly what we solve at Vectara. >> Okay. So before we get into the last couple minutes we have left, I want to make sure we get a plug in for the opportunity, and the profile of Vectara, your new company. Can you guys both share with me what you think the current situation is? So for the folks who are now having those moments of, "Ah, AI's bullshit," or, "It's not real, it's a lot of stuff," from, "Oh my god, this is magic," to, "Okay, this is the future." >> Yes. >> What would you say to that person, if you're at a cocktail party, or in the elevator, say, "Calm down, this is the first inning." How do you explain the dynamics going on right now, to someone who's in the industry, but not inside the ropes? How would you explain like, what this wave's about? How would you describe it, and how would you prepare them for how to change their life around this? >> Yeah, so I'll go first and then I'll let Ed go. Efficiency, efficiency is the description. So we figured out a way to be a lot more efficient, a way where you can write a lot more emails, create way more content, create way more presentations. Developers can develop 10 times faster than they normally would. And that is very similar to what happened during the Industrial Revolution. I always like to look at examples from the past, to predict what will happen now, and what will happen in the future. So during the Industrial Revolution, it was about efficiency with our hands, right?
So I had to make a piece of cloth, like this piece of cloth for this shirt I'm wearing. Our ancestors, they had to spend months taking the cotton, making it into threads, taking the threads, making them into pieces of cloth, and then cutting it. And now a machine makes it just like that, right? And people then turned from being the ones that do the thing, to managing the machines that do the thing. And I think the same thing is going to happen now, our efficiency will be multiplied enormously, as human beings, and we'll be able to do a lot more. And many of us will be able to do things we couldn't do before. So another great example I always like to use is the example of Google Maps, and GPS. Very few of us knew how to drive a car from one location to another, and read a map, and get there correctly. But once we had that efficiency of an AI, and by the way, behind these things is very, very complex AI that figures out how to do that for us, all of us became amazing navigators that can go from any point to any point. So that's kind of how I look at the future. >> And that's a great real example of impact. Ed, your take on how you would talk to a friend, or colleague, or anyone who asks like, "How do I make sense of the current situation? Is it real? What's in it for me, and what do I do?" I mean every company's rethinking their business right now, around this. What would you say to them? >> You know, I usually like to show, rather than describe. And so, you know, the other day I just got access, I've been using an application for a long time, called Notion, and it's super popular. There's like 30 or 40 million users. And the new version of Notion came out, which has AI embedded within it. And it's AI that allows you primarily to create.
So if you could break down the world of AI into find and create, for a minute, just kind of logically separate those two things, find is certainly going to be massively impacted in our experiences as consumers on, you know, Google and Bing, and I can't believe I just said the word Bing in the same sentence as Google, but that's what's happening now (all laughing), because it's a good example of change. >> Yes. >> But also inside the business. But on the create side, you know, Notion is a wiki product, where you try to, you know, note down things that you are thinking about, or you want to share and memorialize. But sometimes you do need help to get it down fast. And just in the first day of using this new product, like my experience has really fundamentally changed. And I think that anybody, say for example, who is using an existing app, I would show them, open up the app. Now imagine the possibility of getting a starting point right off the bat, in five seconds. Instead of having to draft this thing whole cloth, imagine getting a starting point you can then modify and edit, or just dispose of and retry again. And that's the potential for me. I can't imagine a scenario where, in a few years from now, I'm going to be satisfied if I don't have a little bit of help, in the same way that I don't manually spell check every email that I send. I automatically spell check it. I love when I'm getting type-ahead support inside of Google, or anything. Doesn't mean I always take it, or when texting. >> That's efficiency too. I mean the cloud was about developers getting stuff up quick. >> Exactly. >> All that heavy lifting is there for you, so you don't have to do it. >> Right? >> And you get to the value faster. >> Exactly. I mean, if history taught us one thing, it's, you have to always embrace efficiency, and if you don't embrace it fast enough, you will fall behind.
Again, looking at the industrial revolution, the companies that embraced the industrial revolution, they became the leaders in the world, and the ones who did not, they all fell behind. >> Well the AI thing that we got to watch out for, is watching how it goes off the rails. If it doesn't have the right prompt engineering, or data architecture, infrastructure. >> Yes. >> It's a big part. So this comes back down to your startup, real quick, I know we got a couple minutes left. Talk about the company, the motivation, and we'll do a deeper dive on the company. But what's the motivation? What are you targeting for the market, business model? The tech, let's go. >> Actually, I would like Ed to go first. Go ahead. >> Sure, I mean, we're a developer-first, API-first platform. So the product is oriented around allowing developers who may not be superstars, in being able to either leverage, or choose, or select their own large language models for appropriate use cases. But they want to be able to instantly add the power of large language models into their application set. We started with search, because we think it's going to be one of the first places that people try to take advantage of large language models, to help find information within an application context. And we've built our own large language models, focused on making it very efficient, and elegant, to find information more quickly. So what a developer can do is, within minutes, go up, register for an account, and get access to a set of APIs, that allow them to send data, to be converted into a format that's easy to understand for large language models, vectors. And then secondarily, they can issue queries, ask questions. And the questions that can be asked are very natural language questions.
So we're talking about long form sentences, you know, drill down types of questions, and they can get answers that come back, depending upon the form factor of the user interface, in list form or summarized form, where summarized equals the opportunity to kind of see a condensed, singular answer. >> All right. I have a... >> Oh okay, go ahead, you go. >> I was just going to say, I'm going to be a customer for you, because my dream was to have a hologram of theCUBE host, me and Dave, and have questions be generated in the metaverse. So you know. (all laughing) >> There'll be no longer any guests here. They'll all be talking to you guys. >> Give a couple bullets, I'll spit out 10 good questions. Publish a story. This brings the automation, I'm sorry to interrupt you. >> No, no. No, no, I was just going to follow on on the same. So another way to look at exactly what Ed described is, we want to offer you ChatGPT for your own data, right? So imagine taking all of the recordings of all of the interviews you have done, and having all of the content of that being ingested by a system, where you can now have a conversation with your own data and say, "Oh, last time when I met Amr, "which video games did we talk about? "Which movie or book did we use as an analogy "for how we should be embracing data science, "and big data, which is Moneyball," I know you use Moneyball all the time. And you start having that conversation. So, now the data doesn't become a passive asset that you just have in your organization. No. It's an active participant that's sitting with you, at the table, helping you make decisions. >> One of my favorite things to do with customers, is to go to their site or application, and show them me using it. So for example, one of the customers I talked to was one of the biggest property management companies in the world, that lets people go and rent homes, and houses, and things like that. 
And you know, I went and I showed them me searching through reviews, looking for information, and trying different words, and trying to find out like, you know, is this place quiet? Is it comfortable? And then I put all the same data into our platform, and I showed them the world of difference you can have when you start asking that question wholeheartedly, and getting real information that doesn't have anything to do with the words you asked, but is really focused on the meaning. You know, when I asked like, "Is it quiet?" You know, answers would come back like, "The wind whispered through the trees peacefully," and you know, it's like nothing to do with quiet in the literal word sense, but in the meaning sense, everything to do with it. And that was magical, even for them, to see that. >> Well you guys are the front end of this big wave. Congratulations on the startup, Amr. I know you guys got great pedigree in big data, and you've got a great team, and congratulations. Vectara is the name of the company, check 'em out. Again, the startup boom is coming. This will be one of the major waves, generative AI is here. I think we'll look back, and it will be pointed out as a major inflection point in the industry. >> Absolutely. >> There's not a lot of hype behind that. People are seeing it, experts are. So it's going to be fun, thanks for watching. >> Thanks John. (soft music)
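As an aside for readers: the "meaning over words" matching Amr describes, where "Is it quiet?" retrieves "The wind whispered through the trees peacefully," is at its core a nearest-neighbor comparison of embedding vectors rather than keywords. The sketch below is a toy illustration, not Vectara's implementation; the three-dimensional "embeddings" are hand-made stand-ins for what a real language model would produce.

```python
import math

# Toy 3-dimensional "embeddings". In a real system these would come from a
# large language model; the values here are hand-made for illustration only.
EMBEDDINGS = {
    "is it quiet?":                         (0.9, 0.1, 0.1),
    "the wind whispered through the trees": (0.8, 0.2, 0.1),  # close in meaning
    "quiet street parking available":       (0.2, 0.9, 0.1),  # shares a word, not the meaning
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def search(query, docs):
    """Rank documents by semantic similarity to the query."""
    q = EMBEDDINGS[query]
    return sorted(docs, key=lambda d: cosine(q, EMBEDDINGS[d]), reverse=True)

docs = ["quiet street parking available", "the wind whispered through the trees"]
print(search("is it quiet?", docs)[0])  # the wind whispered through the trees
```

Note how the document that literally contains the word "quiet" loses to the one that only shares its meaning, which is exactly the behavior described in the interview.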
SiliconANGLE News | Swami Sivasubramanian Extended Version
(bright upbeat music) >> Hello, everyone. Welcome to SiliconANGLE News breaking story here. Amazon Web Services expanding their relationship with Hugging Face, breaking news here on SiliconANGLE. I'm John Furrier, SiliconANGLE reporter, founder, and also co-host of theCUBE. And I have with me, Swami, from Amazon Web Services, vice president of database, analytics, machine learning with AWS. Swami, great to have you on for this breaking news segment on AWS's big news. Thanks for coming on and taking the time. >> Hey, John, pleasure to be here. >> You know- >> Looking forward to it. >> We've had many conversations on theCUBE over the years, we've watched Amazon really move fast into the large data modeling, SageMaker became a very smashing success, obviously you've been on this for a while. Now with ChatGPT OpenAI, a lot of buzz going mainstream, takes it from behind the curtain inside the ropes, if you will, in the industry to a mainstream. And so this is a big moment, I think, in the industry, I want to get your perspective, because your news with Hugging Face, I think is another tell sign that we're about to tip over into a new accelerated growth around making AI now application aware, application centric, more programmable, more API access. What's the big news about, with AWS Hugging Face, you know, what's going on with this announcement? >> Yeah. First of all, they're very excited to announce our expanded collaboration with Hugging Face, because with this partnership, our goal, as you all know, I mean, Hugging Face, I consider them like the GitHub for machine learning. And with this partnership, Hugging Face and AWS, we'll be able to democratize AI for a broad range of developers, not just specific deep AI startups. And now with this, we can accelerate the training, fine tuning and deployment of these large language models, and vision models from Hugging Face in the cloud. 
And the broader context, when you step back and see what customer problem we are trying to solve with this announcement, essentially if you see, these foundational models are used to now create like a huge number of applications, such as text summarization, question answering, or search, image generation, creative, other things. And these are all stuff we are seeing in the likes of these ChatGPT style applications. But there is a broad range of enterprise use cases that we don't even talk about. And it's because these kind of transformative, generative AI capabilities and models are not available to, I mean, millions of developers. And because either training these models from scratch can be very expensive or time consuming and need deep expertise, or more importantly, they don't need these generic models, they need them to be fine tuned for the specific use cases. And one of the biggest complaints we hear is that these models, when they try to use it for real production use cases, they are incredibly expensive to train and incredibly expensive to run inference on, to use it at a production scale. And unlike web search style applications, where the margins can be really huge, here in production use cases and enterprises, you want efficiency at scale. That's where Hugging Face and AWS share our mission. And by integrating with Trainium and Inferentia, we're able to handle the cost efficient training and inference at scale, I'll deep dive on it. And by teaming up on the SageMaker front, now the time it takes to build these models and fine tune them is also coming down. So that's what makes this partnership very unique as well. So I'm very excited. >> I want to get into the time savings and the cost savings as well on the training and inference, it's a huge issue, but before we get into that, just how long have you guys been working with Hugging Face? 
I know there's a previous relationship, this is an expansion of that relationship, can you comment on what's different about what's happened before and then now? >> Yeah. So, Hugging Face, we have had a great relationship in the past few years as well, where they have actually made their models available to run on AWS, you know, in an easy fashion. Even in fact, their Bloom Project was something many of our customers even used. Bloom Project, for context, is their open source project which builds a GPT-3 style model. And now with this expanded collaboration, now Hugging Face selected AWS for their next generation of generative AI models, building on their highly successful Bloom Project as well. And the nice thing is, now, by direct integration with Trainium and Inferentia, where you get cost savings in a really significant way, now, for instance, Trn1 can provide up to 50% cost-to-train savings, and Inferentia can deliver up to 60% better cost, and 4x higher throughput than (indistinct). Now, these models, especially as they train that next generation generative AI models, it is going to be not only more accessible to all the developers, who use it in the open, it'll be a lot cheaper as well. And that's what makes this moment really exciting, because we can't democratize AI unless we make it broadly accessible and cost efficient and easy to program and use as well. >> Yeah. >> So very exciting. >> I'll get into the SageMaker and CodeWhisperer angle in a second, but you hit on some good points there. One, accessibility, which is, I call the democratization, which is getting this in the hands of developers, and/or AI to develop, we'll get into that in a second. So, access to coding and Git reasoning is a whole nother wave. 
But the three things I know you've been working on, I want to put in the buckets here and comment. One, I know you've, over the years, been working on saving time to train, that's a big point, you mentioned some of those stats. Also cost, 'cause now cost is an equation on, you know, bundling, whether you're uncoupling with hardware and software, that's a big issue. Where do I find the GPUs? Where's the horsepower cost? And then also sustainability. You've mentioned that in the past, is there a sustainability angle here? Can you talk about those three things, time, cost, and sustainability? >> Certainly. So if you look at it from the AWS perspective, we have been supporting customers doing machine learning for the past years. Just for broader context, Amazon has been doing ML for the past two decades, right from the early days of ML powered recommendation to actually also supporting all kinds of generative AI applications. If you look at even generative AI application within Amazon, Amazon search, when you go search for a product and so forth, we have a team called M5 within Amazon search that helps bring these large language models into creating highly accurate search results. And these are created with models, really large models with tens of billions of parameters, scales to thousands of training jobs every month and trained on a large amount of hardware. And this is an example of a really good large language foundation model application running at production scale, and also, of course, Alexa, which uses a large generative model as well. And they actually even had a research paper that showed that they do better in accuracy than other systems like GPT-3 and whatnot. So, and we also touched on things like CodeWhisperer, which uses generative AI to improve developer productivity, but in a responsible manner, because some of the studies show 40% of this generated code had serious security flaws in it. 
This is where we didn't just do generative AI, we combined it with automated reasoning capabilities, which is a very, very useful technique to identify these issues, and coupled them so that it produces highly secure code as well. Now, all these learnings taught us a few things, and which is what you put in these three buckets. And yeah, like more than 100,000 customers using ML and AI services, including leading startups in the generative AI space, like Stability AI, AI21 Labs, or Hugging Face, or even Alexa, for that matter. They care about, I put them in three dimensions. One is around cost, which we touched on with Trainium and Inferentia, where Trainium provides up to 50% better cost savings, but the other aspect is, Trainium is a lot more power efficient as well compared to traditional ones. And Inferentia is also better in terms of throughput, when it comes to what it is capable of. Like it is able to deliver up to 3x higher compute performance and 4x higher throughput, compared to its previous generation, and it is extremely cost efficient and power efficient as well. >> Well. >> Now, the second element that really is important is, at the end of the day, developers deeply value the time it takes to build these models, and they don't want to build models from scratch. And this is where SageMaker, which is, even according to Kaggle users, the number one enterprise ML platform, comes in. What it did to traditional machine learning, where tens of thousands of customers use SageMaker today, including the ones I mentioned, is that what used to take like months to build these models has dropped down to now a matter of days, if not less. Now, in generative AI, the cost of building these models, if you look at the landscape, the model parameter size had jumped by more than a thousand x in the past three years, a thousand x. And that means the training is like a really big distributed systems problem. How do you actually scale this model training? 
How do you actually ensure that you utilize these efficiently? Because these machines are very expensive, let alone they consume a lot of power. So, this is where SageMaker's capability to build, automatically train, tune, and deploy models really comes in, especially with this distributed training infrastructure, and those are some of the reasons why some of the leading generative AI startups are actually leveraging it, because they do not want a giant infrastructure team, which is constantly tuning and fine tuning, and keeping these clusters alive. >> It sounds like a lot like what startups are doing with the cloud early days, no data center, you move to the cloud. So, this is the trend we're seeing, right? You guys are making it easier for developers with Hugging Face, I get that. I love that, GitHub for machine learning. Large language models are complex and expensive to build, but not anymore, you got Trainium and Inferentia, developers can get faster time to value, but then you got the transformers, data sets, token libraries, all that optimized for generative AI. This is a perfect storm for startups. Jon Turow, a former AWS person, who used to work, I think for you, is now a VC at Madrona Venture, he and I were talking about the generative AI landscape, it's exploding with startups. Every alpha entrepreneur out there is seeing this as the next frontier, that's the 20 mile stare, next 10 years is going to be huge. What is the big thing that's happened? 'Cause some people were saying, the founder of Yquem said, "Oh, the startups won't be real, because they don't all have AI experience." John Markoff, former New York Times writer told me that, AI, there's so much work done, this is going to explode, accelerate really fast, because it's almost like it's been waiting for this moment. What's your reaction? 
>> I actually think there is going to be an explosion of startups, not because they need to be AI startups, but now finally AI is really accessible or going to be accessible, so that they can create remarkable applications, either for enterprises or for disrupting actually how customer service is being done or how creative tools are being built. And I mean, this is going to change in many ways. When we think about generative AI, we always like to think of how it generates like school homework or arts or music or whatnot, but when you look at it on the practical side, generative AI is being actually used across various industries. I'll give an example of like Autodesk. Autodesk is a customer who runs on AWS and SageMaker. They already have an offering that enables generative design, where designers can generate many structural designs for products, whereby you give a specific set of constraints and they actually can generate a structure accordingly. And we see similar kind of trend across various industries, where it can be around creative media editing or various others. I have the strong sense that literally, in the next few years, just like now, conventional machine learning is embedded in every application, every mobile app that we see, it is pervasive, and we don't even think twice about it, same way, like almost all apps are built on cloud, generative AI is going to be part of every startup, and they are going to create remarkable experiences without needing actually, these deep generative AI scientists. But you won't get that until you actually make these models accessible. And I also don't think one model is going to rule the world, then you want these developers to have access to broad range of models. Just like, go back to the early days of deep learning. Everybody thought it is going to be one framework that will rule the world, and it has been changing, from Caffe to TensorFlow to PyTorch to various other things. 
And I have a suspicion, we have to enable developers where they are, so. >> You know, Dave Vellante and I have been riffing on this concept called supercloud, and a lot of people have co-opted it to be multicloud, but we really were getting at this whole next layer on top of, say, AWS. You guys are the most comprehensive cloud, you guys are a supercloud, and even Adam and I are talking about ISVs evolving to ecosystem partners. I mean, your top customers have ecosystems building on top of it. This feels like a whole nother AWS. How are you guys leveraging the history of AWS, which by the way, had the same trajectory, startups came in, they didn't want to provision a data center, the heavy lifting, all the things that have made Amazon successful culturally. And day one thinking is, provide the heavy lifting, undifferentiated heavy lifting, and make it faster for developers to program code. AI's got the same thing. How are you guys taking this to the next level, because now, this is an opportunity for the competition to change the game and take it over? This is, I'm sure, a conversation, you guys have a lot of things going on in AWS that makes you unique. What's the internal and external positioning around how you take it to the next level? >> I mean, so I agree with you that generative AI has a very, very strong potential in terms of what it can enable in terms of next generation application. But this is where Amazon's experience and expertise in putting these foundation models to work internally really has helped us quite a bit. If you look at it, like amazon.com search is like a very, very important application in terms of what is the customer impact on number of customers who use that application openly, and the amount of dollar impact it does for an organization. And we have been doing it silently for a while now. 
And the same thing is true for Alexa too, which actually not only uses it for natural language understanding, but even leverages it for creating stories and various other examples. And now, our approach to it from AWS is, we actually look at it in terms of the same three tiers like we did in machine learning, because when you look at generative AI, we genuinely see three sets of customers. One is, like really deep technical expert practitioner startups. These are the startups that are creating the next generation models, like the likes of Stability AIs or Hugging Face with Bloom or AI21. And they generally want to build their own models, and they want the best price performance of their infrastructure for training and inference. That's where our investments in silicon and hardware and networking innovations, where Trainium and Inferentia, really play a big role. And we can really do that, and that is one. The second middle tier is where I do think developers don't want to spend time building their own models, let alone, they actually want the model to be useful to that data. They don't need their models to create like high school homeworks or various other things. What they generally want is, hey, I have this data from my enterprise that I want to fine tune and make it really work only for this, and make it work remarkably, it can be for text summarization, to generate a report, or it can be for better Q&A, and so forth. This is where we are. Our investments in the middle tier with SageMaker, and our partnership with Hugging Face and AI21 and Cohere, are all going to be very meaningful. And you'll see us investing, I mean, you already talked about CodeWhisperer, which is in open preview, but we are also partnering with a whole lot of top ISVs, and you'll see more on this front to enable the next wave of generative AI apps too, because this is an area where we do think a lot of innovation is yet to be done. 
It's like day one for us in this space, and we want to enable that huge ecosystem to flourish. >> You know, one of the things Dave Vellante and I were talking about in our first podcast we just did on Friday, we're going to do weekly, is we highlighted the AI ChatGPT example as a horizontal use case, because everyone loves it, people are using it in all their different verticals, and horizontal scalable cloud plays perfectly into it. So I have to ask you, as you look at what AWS is going to bring to the table, a lot's changed over the past 13 years with AWS, a lot more services are available, how should someone rebuild or re-platform and refactor their application or business with AI, with AWS? What are some of the tools that you see and recommend? Is it serverless, is it SageMaker, CodeWhisperer? What do you think's going to shine brightly within the AWS stack, if you will, or service list, that's going to be part of this? As you mentioned, CodeWhisperer and SageMaker, what else should people be looking at as they start tinkering and getting all these benefits, and scale up their apps? >> You know, if I were a startup, first, I would really work backwards from the customer problem I try to solve, and pick and choose where I don't need to deal with the undifferentiated heavy lifting. And that's where the answer is going to change. If you look at it then, the answer is not going to be like a one size fits all. I mean, granted, on the compute front, if you can actually go completely serverless, I will always recommend it, instead of running compute for running your apps, because it takes care of all the undifferentiated heavy lifting. But on the data front, that's where we provide a whole variety of databases, right from like relational data, or non-relational, or Dynamo, and so forth. And of course, we also have a deep analytical stack, where data directly flows from our relational databases into data lakes and data warehouses. 
And you can get value along with partnership with various analytical providers. The area where I do think fundamentally things are changing on what people can do is, with CodeWhisperer, I was literally trying to actually program a code on sending a message through Twilio, and I was about to pull up and read the documentation, and in my IDE, I was actually saying like, let's try sending a message through Twilio, or let's actually update a Route 53 record. All I had to do was type in just a comment, and it actually started generating the subroutine. And it is going to be a huge time saver, if I were a developer. And the goal is for us not to actually do it just for AWS developers, and not to just generate the code, but make sure the code is actually highly secure and follows the best practices. So, it's not always about machine learning, it's augmenting with automated reasoning as well. And generative AI is going to be changing, and not just in how people write code, but also how it actually gets built and used as well. You'll see a lot more stuff coming on this front. >> Swami, thank you for your time. I know you're super busy. Thank you for sharing on the news and giving commentary. Again, I think this is an AWS moment and industry moment, heavy lifting, accelerated value, agility. AIOps is going to be probably redefined here. Thanks for sharing your commentary. And we'll see you next time, I'm looking forward to doing more follow up on this. It's going to be a big wave. Thanks. >> Okay. Thanks again, John, always a pleasure. >> Okay. This is SiliconANGLE's breaking news commentary. I'm John Furrier with SiliconANGLE News, as well as host of theCUBE. Swami, who's a leader in AWS, has been on theCUBE multiple times. We've been tracking the growth of how Amazon's journey has just been exploding past five years, in particular, past three. You heard the numbers, great performance, great reviews. 
This is a watershed moment, I think, for the industry, and it's going to be a lot of fun for the next 10 years. Thanks for watching. (bright music)
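For readers who want to make Swami's headline claims concrete, "up to 50% cost-to-train savings" on Trn1 and "up to 60% better cost" on Inferentia, the best-case arithmetic looks like the sketch below. The dollar baselines are hypothetical; only the percentage figures come from the interview.

```python
def discounted_cost(baseline_cost, max_savings_pct):
    """Best-case cost after applying an 'up to X%' savings claim."""
    return baseline_cost * (1 - max_savings_pct / 100)

# Hypothetical baseline: a fine-tuning run that would cost $10,000 on
# general-purpose accelerators, plus $2,000/month of inference spend.
train_on_trn1 = discounted_cost(10_000, 50)  # "up to 50% cost-to-train savings"
infer_on_inf = discounted_cost(2_000, 60)    # "up to 60% better inference cost"

print(train_on_trn1)  # 5000.0
print(infer_on_inf)   # 800.0
```

Note these are ceilings ("up to"), so any real workload would land somewhere between the baseline and these best-case numbers.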
Breaking Analysis: We Have the Data…What Private Tech Companies Don’t Tell you About Their Business
>> From The Cube Studios in Palo Alto and Boston, bringing you data driven insights from The Cube at ETR. This is "Breaking Analysis" with Dave Vellante. >> The reverse momentum in tech stocks caused by rising interest rates, less attractive discounted cash flow models, and more tepid forward guidance, can be easily measured by public market valuations. And while there's lots of discussion about the impact on private companies and cash runway and 409A valuations, measuring the performance of non-public companies isn't as easy. IPOs have dried up and public statements by private companies, of course, they accentuate the good and they kind of hide the bad. Real data, unless you're an insider, is hard to find. Hello and welcome to this week's "Wikibon Cube Insights" powered by ETR. In this "Breaking Analysis", we unlock some of the secrets that non-public, emerging tech companies may or may not be sharing. And we do this by introducing you to a capability from ETR that we've not exposed you to over the past couple of years, it's called the Emerging Technologies Survey, and it is packed with sentiment data and performance data based on surveys of more than a thousand CIOs and IT buyers covering more than 400 companies. And we've invited back our colleague, Erik Bradley of ETR, to help explain the survey and the data that we're going to cover today. Erik, this survey is something that I've not personally spent much time on, but I'm blown away at the data. It's really unique and detailed. First of all, welcome. Good to see you again. >> Great to see you too, Dave, and I'm really happy to be talking about the ETS or the Emerging Technology Survey. Even our own clients and constituents probably don't spend as much time in here as they should. >> Yeah, because there's so much in the mainstream, but let's pull up a slide to bring out the survey composition. Tell us about the study. How often do you run it? What's the background and the methodology? 
>> Yeah, you were just spot on the way you were talking about the private tech companies out there. So what we did is we decided to take all the vendors that we track that are not yet public and move 'em over to the ETS. And there isn't a lot of information out there. If you're not in Silicon (indistinct), you're not going to get this stuff. So PitchBook and TechCrunch are two out there that give some data on these guys. But what we really wanted to do was go out to our community. We have 6,000 ITDMs in our community. We wanted to ask them, "Are you aware of these companies? And if so, are you allocating any resources to them? Are you planning to evaluate them?" and really just kind of figure out what we can do. So this particular survey, as you can see, has 1000 plus responses, over 450 vendors that we track. And essentially what we're trying to do here is talk about your evaluation and awareness of these companies and also your utilization. And also, if you're not utilizing 'em, then we can also figure out your sales conversion or churn. So this is interesting, not only for the ITDMs themselves, to figure out what their peers are evaluating and what they should put in POCs against the big guys when contracts come up, but it's also really interesting for the tech vendors themselves to see how they're performing. >> And you can see 2/3 of the respondents are director level or above. You've got 28% in the C-suite. There is of course a North America bias, 70 to 75% is North America. But these smaller companies, you know, that's where they start doing business. So, okay. We're going to do a couple of things here today. First, we're going to give you the big picture across the sectors that ETR covers within the ETS survey. And then we're going to look at the high and low sentiment for the larger private companies. And then we're going to do the same for the smaller private companies, the ones that don't have as much mindshare. 
And then I'm going to put those two groups together and we're going to look at two dimensions, actually three dimensions: first, which companies are being evaluated the most; second, which companies are getting the most usage and adoption of their offerings; and then third, which companies are seeing the highest churn rates, which of course is a silent killer of companies. And then finally, we're going to look at the sentiment and mindshare for two key areas that we like to cover often here on "Breaking Analysis", security and data. And data comprises database, including data warehousing; then big data analytics is the second part of data; and then machine learning and AI is the third section within data that we're going to look at. Now, one other thing before we get into it. ETR very often will include open source offerings in the mix, even though they're not companies, like TensorFlow or Kubernetes, for example. And we'll call that out during this discussion. The reason this is done is for context, because everyone is using open source. It is the heart of innovation, and many business models are super glued to an open source offering. Take MariaDB, for example. There's the foundation with the open source code, and then, of course, the company that sells services around the offering. Okay, so let's first look at the highest and lowest sentiment among these private firms, the ones that have the highest mindshare. So they're naturally going to be somewhat larger. And we do this on two dimensions, sentiment on the vertical axis and mindshare on the horizontal axis, and note the open source tools: Kubernetes, Postgres, Kafka, TensorFlow, Jenkins, Grafana, et cetera. So Erik, please explain what we're looking at here, how it's derived and what the data tells us. >> Certainly, so there is a lot here, so we're going to break it down, first of all, by explaining just what mindshare and net sentiment is. You explained the axes. 
We have so many evaluation metrics, but we need to aggregate them into one so that we can rank against each other. Net sentiment is really the aggregation of all the positive, subtracting out the negative. So the net sentiment is a very quick way of looking at where these companies stand versus their peers in their sectors and sub sectors. Mindshare is basically the awareness of them, which is good for very early stage companies. And you'll see some names on here that have obviously been around for a very long time, and they're clearly the bigger ones, further out on the axes. Kubernetes, for instance, as you mentioned, is open source. It's the de facto standard for all container orchestration, and it should be that far up into the right, because that's what everyone's using. In fact, the open source leaders are so prevalent in the Emerging Technology Survey that we break them out later in our analysis, 'cause it's really not fair to include them and compare them to the actual companies that are providing the support and the security around that open source technology. But no survey, no analysis, no research would be complete without including these open source technologies. So what we're looking at here, if I can just get away from the open source names, we see other things like Databricks and OneTrust. They're repeating as top net sentiment performers here. And then also the design vendors. People don't spend a lot of time on 'em, but Miro and Figma. This is their third survey in a row where they're just dominating that sentiment overall. And Adobe should probably take note of that, because they're really coming after them. But Databricks, we all know, probably would've been a public company by now if the market hadn't turned, but you can see just how dominant they are in a survey of nothing but private companies. And we'll see that again when we talk about the database later. 
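The metric Erik describes, aggregating the positive responses and subtracting out the negative, can be sketched in a few lines of Python. This is only an illustration: the response categories below are assumptions, since ETR's exact survey fields and weights aren't public.

```python
# Hypothetical sketch of ETR-style net sentiment and mindshare.
# The response categories are illustrative assumptions, not ETR's
# actual methodology.

POSITIVE = {"utilizing", "evaluating", "plan_to_evaluate"}
NEGATIVE = {"churning", "replaced"}

def net_sentiment(responses):
    """Share of positive responses minus share of negative ones."""
    total = len(responses)
    if total == 0:
        return 0.0
    pos = sum(r in POSITIVE for r in responses)
    neg = sum(r in NEGATIVE for r in responses)
    return (pos - neg) / total

def mindshare(vendor_responses, total_respondents):
    """Fraction of all survey respondents who cited the vendor at all."""
    return len(vendor_responses) / total_respondents
```

A vendor cited by 50 of 1,000 respondents would have 5% mindshare regardless of how positive those citations are, which is why the two axes tell different stories.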
>> And I'll just add, so you see Automation Anywhere on there, the big UiPath competitor, a company that was not able to get to the public markets. They've been trying. Snyk, Peter McKay's company, they've raised a bunch of money, big security player. They're doing some really interesting things in developer security, helping developers secure the data flow. H2O.ai, Dataiku, AI companies. We saw them at the Snowflake Summit. Redis Labs, Netskope in security. So a lot of names that we know that ultimately we think are probably going to be hitting the public market. Okay, here's the same view for private companies with less mindshare, Erik. Take us through this one. >> On the previous slide, too, real quickly, I wanted to pull up SecurityScorecard, and we'll get back into it. But this is a newcomer, and I couldn't believe how strong their data was, but we'll bring that up in a second. Now, when we go to the ones of lower mindshare, it's interesting to talk about open source, right? Kubernetes was all the way on the top right. Everyone uses containers. Here we see Istio up there. Not everyone is using service mesh as much, and that's why Istio is in the smaller breakout. But still, when you talk about net sentiment, it's the leader, it's the highest one there is. So really interesting to point out. Then we see other names like Collibra on the data side really performing well. And again, as always, security is very well represented here. We have Aqua, Wiz, Armis, which is a standout in this survey this time around. They do IoT security. I hadn't even heard of them until I started digging into the data here, and I couldn't believe how well they were doing. And then of course you have AnyScale, which is doing second best in this, and the best name in the survey, Hugging Face, which is a machine learning AI tool, also doing really well on net sentiment, but they're not as far along on that axis of mindshare just yet. 
So these are, again, emerging companies that might not be as well represented in the enterprise as they will be in a couple of years. >> Hugging Face sounds like something you do with your two year old. Like you said, you see high performers like AnyScale, who do machine learning, and you mentioned them. They came out of Berkeley. Collibra does governance, InfluxData is on there. InfluxDB's a time series database. And yeah, of course, Alex, if you bring that back up, you get a big group of red dots, right? That's the bad zone, I guess, which includes Sisense, which does visualization, and Yellowbrick Data, an MPP database. How should we interpret the red dots, Erik? I mean, is it necessarily a bad thing? Could it be misinterpreted? What's your take on that? >> Sure, well, let me just explain the definition of it first from a data science perspective, right? We're a data company first. So the gray dots that you're seeing that aren't named, that's the mean, that's the average. So in order for you to be on this chart, you have to be at least one standard deviation above or below that average. So that gray is where we're saying, "Hey, this is where the lump of average comes in. This is where everyone normally stands." So you either have to be an outperformer or an underperformer to even show up in this analysis. So by definition, yes, the red dots are bad. You're at least one standard deviation below the average of your peers. It's not where you want to be. And if you're on the lower left, not only are you not performing well from a utilization or an actual usage rate, but people don't even know who you are. So that's a problem, obviously. And the VCs and the PEs out there that are backing these companies, they're the ones who are mostly interested in this data. >> Yeah. Oh, that's a great explanation. Thank you for that. Nice benchmarking there, and yeah, you don't want to be in the red. All right, let's get into the next segment here. We're going to look at evaluation rates, adoption and the all important churn. 
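The one-standard-deviation rule Erik lays out, where only outperformers and underperformers get named on the chart, is straightforward to sketch. The vendor scores below are made-up numbers, just to show the mechanic.

```python
# Sketch of the outlier rule: a vendor is named on the chart only if
# its score is at least one standard deviation above or below the
# mean of its peers. Scores below are illustrative, not ETR data.
from statistics import mean, stdev

def flag_outliers(scores, k=1.0):
    """Return {vendor: 'above'|'below'} for vendors >= k std devs from the mean;
    everything else stays in the unnamed gray lump."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())
    flags = {}
    for vendor, s in scores.items():
        if s >= mu + k * sigma:
            flags[vendor] = "above"
        elif s <= mu - k * sigma:
            flags[vendor] = "below"
    return flags

# Illustrative net sentiment scores for five hypothetical vendors.
scores = {"A": 0.50, "B": 0.10, "C": 0.30, "D": 0.28, "E": 0.32}
```

Here the mean is 0.30, so only A (well above) and B (well below) would be named; C, D, and E stay gray.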
First, new evaluations. Let's bring up that slide. And Erik, take us through this. >> So essentially, I just want to explain what evaluation means: people will cite that they either plan to evaluate the company or they're currently evaluating. So that means we're aware of 'em and we are choosing to do a POC of them. And then we'll see later how that turns into utilization, which is what a company wants to see: awareness, evaluation, and then actually utilizing them. That's sort of the life cycle for these emerging companies. So what we're seeing here, again, are very high evaluation rates. H2O, we mentioned. SecurityScorecard jumped up again. Chargebee, Snyk, Salt Security, Armis. A lot of security names are up here. Aqua, Netskope, which, gosh, has been around forever. I still can't believe it's in an Emerging Technology Survey. But so many of these names fall in data and security again, which is why we decided to pick those out, Dave. And on the lower side, Vena and Acton unfortunately took the dubious award of the lowest evaluations in our survey, but I prefer to focus on the positive. So SecurityScorecard, again, a real standout in this one. They're in the security assessment space, basically. They'll come in and assess for you how your security hygiene is. And it's an area of real interest right now amongst our ITDM community. >> Yeah, I mean, I think those, and then Arctic Wolf is up there too. They're doing managed services. You had mentioned Netskope. Yeah, okay. All right, let's look at now adoption. These are the companies whose offerings are being used the most and are above that standard deviation in the green. Take us through this, Erik. >> Sure, yet again, what we're looking at is, okay, we went from awareness, we went to evaluation. Now it's about utilization, which means a survey respondent's going to state "Yes, we evaluated and we plan to utilize it" or "It's already in our enterprise and we're actually allocating further resources to it." 
Not surprising, again, a lot of open source, the reason why, it's free. So it's really easy to grow your utilization on something that's free. But as you and I both know, as Red Hat proved, there's a lot of money to be made once the open source is adopted, right? You need the governance, you need the security, you need the support wrapped around it. So here we're seeing Kubernetes, Postgres, Apache Kafka, Jenkins, Grafana. These are all open source based names. But if we're looking at names that are non open source, we're going to see Databricks, Automation Anywhere, Rubrik all have the highest mindshare. So these are the names, not surprisingly, all names that probably should have been public by now. Everyone's expecting an IPO imminently. These are the names that have the highest mindshare. If we talk about the highest utilization rates, again, Miro and Figma pop up, and I know they're not household names, but they are just dominant in this survey. These are applications that are meant for design software and, again, they're going after an Autodesk or a CAD or Adobe type of thing. It is just dominant how high the utilization rates are here, which again is something Adobe should be paying attention to. And then you'll see a little bit lower, but also interesting, we see Collibra again, we see Hugging Face again. And these are names that are obviously in the data governance, ML, AI side. So we're seeing a ton of data, a ton of security and Rubrik was interesting in this one, too, high utilization and high mindshare. We know how pervasive they are in the enterprise already. >> Erik, Alex, keep that up for a second, if you would. So yeah, you mentioned Rubrik. Cohesity's not on there. They're sort of the big one. We're going to talk about them in a moment. Puppet is interesting to me because you remember the early days of that sort of space, you had Puppet and Chef and then you had Ansible. Red Hat bought Ansible and then Ansible really took off. 
So it's interesting to see Puppet on there as well. Okay. So now let's look at the churn, because this one is where you don't want to be. It's, of course, all red, 'cause churn is bad. Take us through this, Erik. >> Yeah, you definitely don't want to be here, and I don't love to dwell on the negative. So we won't spend as much time. But to your point, there's one thing I want to point out that I think is important. So you see Rubrik in the same spot, but Rubrik has so many citations in our survey that it actually would make sense that they're high on both utilization and churn, just because they're so well represented. They have such a high overall representation in our survey. And the reason I call that out is Cohesity. Cohesity has an extremely high churn rate here, about 17%, and unlike Rubrik, they were not on the utilization side. So Rubrik is seeing both, Cohesity is not. It's not being utilized, but it's seeing a high churn. So that's the way you can look at this data and say, "Hm." Same thing with Puppet. You noticed that it was on the other slide. It's also on this one. So basically what it means is a lot of people are giving Puppet a shot, but it's starting to churn, which means it's not as sticky as we would like. One that was surprising on here for me was Tanium. It's kind of jumbled in there. It's hard to see in the middle, but Tanium, I was very surprised to see with as high of a churn, because what I do hear from our end user community is that people that use it, like it. It really kind of spreads into not only vulnerability management, but also that endpoint detection and response side. So I was surprised to see Tanium in here. Mural, again, was another one of those application design softwares that's seeing a very high churn as well. >> So you're saying if you're in both... Alex, bring that back up if you would. So if you're in both, like MariaDB is, for example, I think, yeah, they're in both. 
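The Rubrik-versus-Cohesity point, that raw churn counts mislead when one vendor simply has far more citations, comes down to comparing churn and utilization as rates of total citations rather than counts. A small sketch with illustrative numbers (not ETR's actual data):

```python
# Normalize churn and utilization by a vendor's total citations, so a
# heavily-cited vendor (Rubrik-like) isn't penalized for appearing on
# both leaderboards. All counts below are made up for illustration.

def rates(citations):
    """citations: {'utilizing': n, 'evaluating': n, 'churning': n}"""
    total = sum(citations.values())
    return {
        "utilization_rate": citations["utilizing"] / total,
        "churn_rate": citations["churning"] / total,
    }

# Hypothetical profiles: many citations with high utilization vs.
# few citations with ~17% churn and low utilization.
vendor_big   = {"utilizing": 60, "evaluating": 25, "churning": 15}
vendor_small = {"utilizing": 10, "evaluating": 14, "churning": 5}
```

The big vendor's 15 churn citations look worse in absolute terms than the small vendor's 5, but as rates the picture reverses: the small vendor churns a larger share of its installed base while utilizing far less.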
They're green in the previous one and red here, and that's not as bad. You mentioned Rubrik is going to be in both. Cohesity is a bit of a concern. Cohesity just brought on Sanjay Poonen, so this could be a go to market issue, right? I mean, 'cause Cohesity has got a great product and they've got really happy customers. So they're just maybe having to figure out, okay, what's the right ideal customer profile, and Sanjay Poonen, I guarantee, is going to have that company cranking. I mean, they had been doing very well in the surveys and had fallen off a bit. The other interesting thing: in the previous survey I saw Cvent, which is an event platform. The only reason I pay attention to that is 'cause we actually have an event platform. We don't sell it separately. We bundle it as part of our offerings. And you see Hopin on here. Hopin raised a billion dollars during the pandemic. And we were like, "Wow, that's going to blow up." And so you see Hopin on the churn, and you didn't see 'em in the previous chart, but that's sort of interesting. Like you said, let's not dwell on the negative, but you really don't want to be there. Churn is a real big concern. Okay, now we're going to drill down into two sectors, security and data, where data comprises three areas: database and data warehousing, machine learning and AI, and big data analytics. So first let's take a look at the security sector. Now this is interesting, because not only is it a sector drill down, but it also gives an indicator of how much money the firm has raised, which is the size of that bubble, and tells us if a company is punching above its weight and efficiently using its venture capital. Erik, take us through this slide. Explain the dots, the size of the dots. Set this up, please. >> Yeah. 
So again, the axes are still the same, net sentiment and mindshare, but what we've done this time is we've taken publicly available information on how much capital a company has raised, and that'll be the size of the circle you see around the name. And then whether it's green or red is basically saying, relative to the amount of money they've raised, how are they doing in our data? So when you see a Netskope, which has been around forever and raised a lot of money, that's why you're going to see them leaning more towards red, 'cause it's just been around forever and you'd kind of expect it. Versus a name like SecurityScorecard, which has only raised a little bit of money and is actually performing just as well, if not better, than a name like a Netskope. OneTrust is doing absolutely incredible right now. BeyondTrust. We've seen the issues with Okta, right? So those are two names that play in that space that obviously are probably getting some looks about what's going on right now. Wiz, we've all heard about, right? So they raised a ton of money. It's doing well on net sentiment, but the mindshare isn't as high as you'd want, which is why you're going to see a little bit of that red, versus a name like Aqua, which is doing container and application security and hasn't raised as much money, but is really neck and neck with a name like Wiz. So that is why, on a relative basis, you'll see that more green. As we all know, information security is never going away. But as we'll get to later in the program, Dave, I'm not sure, in this current market environment, if people are as willing to do POCs and switch away from their security provider, right? There's a little bit of tepidness out there, a little trepidation. So right now we're seeing overall a slight pause, a slight cooling in overall evaluations on the security side versus historical levels a year ago. >> Now let's stay on here for a second. So a couple things I want to point out. So it's interesting. 
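The green/red coloring Erik describes is a judgment of performance relative to money raised. ETR doesn't disclose the formula, so as a pure assumption, one simple way to sketch the idea is to compare each vendor's sentiment rank against its funding rank:

```python
# Hypothetical sketch of capital-efficiency coloring: green if a vendor
# ranks better on net sentiment than on capital raised, red if worse.
# This ranking rule and all numbers are assumptions for illustration;
# they are not ETR's actual methodology or data.

def efficiency_colors(vendors):
    """vendors: {name: (net_sentiment, capital_raised_millions)}"""
    by_sentiment = sorted(vendors, key=lambda v: vendors[v][0], reverse=True)
    by_capital = sorted(vendors, key=lambda v: vendors[v][1], reverse=True)
    colors = {}
    for v in vendors:
        sent_rank = by_sentiment.index(v)   # 0 = best sentiment
        cap_rank = by_capital.index(v)      # 0 = most money raised
        if sent_rank < cap_rank:
            colors[v] = "green"             # punching above its funding
        elif sent_rank > cap_rank:
            colors[v] = "red"               # underperforming its funding
        else:
            colors[v] = "neutral"
    return colors

# Illustrative: modest raise + high sentiment vs. huge raise + low sentiment.
vendors = {"VendorA": (0.40, 110.0), "VendorB": (0.20, 1900.0), "VendorC": (0.30, 500.0)}
```

Under this rule, the lightly funded high-sentiment vendor colors green, the heavily funded low-sentiment one colors red, and a vendor whose ranks match stays neutral, matching the Arctic Wolf "where they should be" case in the discussion.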
Now, Snyk has raised over, I think, $800 million, but you can see them, they're high on the vertical and the horizontal. But now compare that to Lacework. It's hard to see, but they're kind of buried in the middle there. That's the biggest dot in this whole thing, if I'm interpreting this correctly. They've raised over a billion dollars. It's a Mike Speiser company. He was the founding investor in Snowflake. So people watch that very closely, but that's an example of where they're not punching above their weight. They recently had a layoff and they've got to fine tune things, but I'm still confident they're going to do well, 'cause they're approaching security as a data problem, which is probably why people are having trouble getting their arms around it. And then again, I see Arctic Wolf. They're not red, they're not green, but they've raised a fair amount of money, and they're showing up to the right at a decent level there. And a couple of the other ones that you mentioned, Netskope. Yeah, they've raised a lot of money, but they're actually performing where you want. What you don't want is where Lacework is, right? They've got some work to do to really take advantage of the money that they raised last November and prior to that. >> Yeah, if you're seeing that more neutral color, like you're calling out with an Arctic Wolf, that means, relative to their peers, this is where they should be. It's when you're seeing that red on a Lacework where we all know, wow, you raised a ton of money and your mindshare isn't where it should be. Your net sentiment is not where it should be, comparatively. And then you see these great standouts, like Salt Security and SecurityScorecard and Abnormal. You know they haven't raised that much money yet, but their net sentiment's higher and their mindshare's doing well. So basically, in a nutshell, if you're a PE or a VC and you see a small green circle, then you're doing well, then it means you made a good investment. 
>> Some of these guys, I don't know, but you see these small green circles. Those are the ones you want to start digging into and maybe help them catch a wave. Okay, let's get into the data discussion. And again, three areas: database slash data warehousing, big data analytics, and ML AI. First, we're going to look at the database sector. So Alex, thank you for bringing that up. Alright, take us through this, Erik. Actually, let me just say PostgreSQL. I've got to ask you about this. It shows some funding, but that actually could be a mix of EDB, the company that commercializes Postgres, and Postgres the open source database, which is a transaction system and kind of an open source Oracle. You see MariaDB is a database, but an open source database. But the company, they've raised over $200 million and they filed an S-4. So Erik, it looks like this might be a little bit of a mashup of companies and open source products. Help us understand this. >> Yeah, it's tough when you start dealing with the open source side, and I'll be honest with you, there is a little bit of a mashup here. There are certain names here that are a hundred percent for profit companies. And then there are others that are obviously open source based, like Redis is open source, but Redis Labs is the one trying to monetize the support around it. So you're a hundred percent accurate on this slide. I think one of the things here that's important to note, though, is just how important open source is to data. If you're going to be going into any of these areas, it's going to be open source based to begin with. And Neo4j is one I want to call out here. It's not one everyone's familiar with, but it's basically a graph database, which is a name that we're seeing on the net sentiment side actually really, really high. When you think about it, it's the third overall net sentiment for a niche database play. 
It's not as big on the mindshare because its use cases aren't as common, but it's the third biggest play on net sentiment. I found that really interesting on this slide. >> And again, so MariaDB, as I said, they filed an S-4, I think $50 million in revenue, and that might even be ARR. So they're not huge, but they're getting there. And by the way, MariaDB, if you don't know, was the company that was formed the day that Oracle bought Sun, in which they got MySQL, and MariaDB has done a really good job of replacing a lot of MySQL instances. Oracle has responded with MySQL HeatWave, which was kind of the Oracle version of MySQL. So there's some interesting battles going on there. If you think about the LAMP stack, the M in the LAMP stack was MySQL. And so now it's all MariaDB replacing that MySQL for a large part. And then you see again the red, you know, you've got to have some concerns about that. Aerospike's been around for a long time. SingleStore changed their name a couple of years ago. Yellowbrick Data, Firebolt was kind of going after Snowflake for a while, but yeah, you want to get out of that red zone. So they've got some work to do. >> And Dave, real quick, for the people that aren't aware, I just want to let them know that we can cut this data with the public company data as well. So we can cross over this with that, because some of these names are competing with the larger public company names as well. So we can go ahead and cross reference, like a MariaDB with a Mongo, for instance, or something of that nature. So it's not in this slide, but at another point we can certainly explain on a relative basis how these private names are doing compared to the other ones as well. >> All right, let's take a quick look at analytics. Alex, bring that up if you would. Go ahead, Erik. >> Yeah, I mean, essentially here, I can't see it on my screen, my apologies. I just kind of went blank on that. So give me one second to catch up. >> So I can set it up while you're doing that. 
You've got Grafana up and to the right. I mean, this is huge, right? >> Got it, thank you. I lost my screen there for a second. Yep. Again, open source name Grafana, absolutely up and to the right. But as we know, Grafana Labs is actually picking up a lot of speed based on Grafana, of course. And I think we might actually hear some noise from them coming this year. The names that are actually a little bit more disappointing that I want to call out are names like ThoughtSpot. It's been around forever. Their mindshare, of course, is second best here, but based on the amount of time they've been around and the amount of money they've raised, it's not actually outperforming the way it should be. We're seeing Moogsoft obviously make some waves. That's a very high net sentiment for that company. It's, you know, what, third, fourth position overall in this entire area. Other names like Fivetran and Matillion are doing well. Fivetran, even though it's got a high net sentiment, again, has raised so much money that we would've expected a little bit more at this point. I know you know this space extremely well, but basically what we're looking at here, and to the bottom left, you're going to see some names with a lot of red, large circles that really just aren't performing that well. InfluxData, however, has the second highest net sentiment. And it's really pretty early on in this stage, and the feedback we're getting on this name is the use cases are great, the efficacy's great. And I think it's one to watch out for. >> InfluxData, time series database. The other interesting thing I just noticed here, you've got Tamr on here, which is that little small green. Those are the ones we were saying before, look for those guys. They might be some of the interesting companies out there. And then Observe, Jeremy Burton's company. They do observability on top of Snowflake, not green, but kind of in that gray. So that's kind of cool. Monte Carlo is another one, they're sort of slightly green. 
They are doing some really interesting things in data and data mesh. So yeah, okay. So I could spend all day on this stuff, Erik, phenomenal data. I've got to get back and really dig in. Let's end with machine learning and AI. Now this chart is similar in its dimensions, of course, except for the money raised. We're not showing that with the size of the bubble, but AI is so hot, we wanted to cover that here. Erik, explain this please. Why TensorFlow is highlighted, and walk us through this chart. >> Yeah, it's funny, yet again, right? Another open source name, TensorFlow, being up there. And I just want to explain, we do break out machine learning, AI as its own sector. A lot of this, of course, really is intertwined with the data side, but it is its own area. And one of the things I think that's most important here to break out is Databricks. We started to cover Databricks in machine learning, AI. That company has grown into much, much more than that. So I do want to state to you, Dave, and also the audience out there, that moving forward, we're going to be moving Databricks out of only the ML/AI into other sectors, so we can kind of value them against their peers a little bit better. But in this instance, you can just see how dominant they are in this area. And one thing that's not here, but I do want to point out, is that we have the ability to break this down by industry vertical and organization size. And when I break this down into the Fortune 500 and Fortune 1000, both Databricks and TensorFlow do even better than you see here. So it's quite interesting to see that the names that are succeeding are also succeeding with the largest organizations in the world. And as we know, large organizations mean large budgets. So this is one area that I just thought was really interesting to point out: as we break down the data by vertical, these two names are still the outstanding players. >> I just also want to call out H2O.ai. 
They're getting a lot of buzz in the marketplace and I'm seeing them a lot more. Anaconda, another one. Dataiku consistently popping up. DataRobot is also interesting, because of all the kerfuffle that's going on there. The Cube guy, Cube alum, Chris Lynch stepped down as executive chairman. All this stuff came out about how the executives were taking money off the table and didn't allow the employees to participate in that money raising deal. So that's pissed a lot of people off. And so they're now going through some kind of uncomfortable things, which is unfortunate, because DataRobot, I noticed, we haven't covered them that much in "Breaking Analysis", but I've noticed them oftentimes, Erik, in the surveys doing really well. So you would think that company has a lot of potential. But yeah, it's an important space that we're going to continue to watch. Let me ask you, Erik, can you contextualize this from a time series standpoint? I mean, how has this changed over time? >> Yeah, again, it's not shown here, but in the data. I'm sorry, go ahead. >> No, I'm sorry. What I meant, I should have interjected. In other words, you would think in a downturn that these emerging companies would be less interesting to buyers 'cause they're more risky. What have you seen? >> Yeah, and it was interesting. Before we went live, you and I were having this conversation about "Is the downturn stopping people from evaluating these private companies or not," right? In a larger sense, that's really what we're doing here. How are these private companies doing when it comes down to the actual practitioners? The people with the budget, the people with the decision making. And so what I did is, we have historical data, as you know. I went back to the Emerging Technology Survey we did in November of '21, right at the crest, right before the market started to really fall and everything kind of started to fall apart there. 
And what I noticed is, on the security side very much so, we're seeing fewer evaluations than we were in November '21. So I broke it down. In cloud security, net sentiment went from 21% to 16% from November '21. That's a pretty big drop. And again, that sentiment is our one aggregate metric for overall positivity, meaning utilization and actual evaluation of the name. Again, in database, we saw it drop a little bit from 19% to 13%. However, in analytics we actually saw it stay steady. So it's pretty interesting that, yes, cloud security and security in general is always going to be important, but right now we're seeing less overall net sentiment in that space. But within analytics, we're seeing steady sentiment with growing mindshare. And also, to your point earlier, in machine learning, AI, we're seeing steady net sentiment, and mindshare has grown a whopping 25% to 30%. So despite the downturn, we're seeing more awareness of these companies in analytics and machine learning and a steady, actual utilization of them. I can't say the same in security and database. They're actually shrinking a little bit since the end of last year. >> You know, it's interesting. We were on a round table, Erik does these round tables with CISOs and CIOs, and I remember one time you had asked the question, "How do you think about some of these emerging tech companies?" And one of the executives said, "I always include somebody in the bottom left of the Gartner Magic Quadrant in my RFPs." I think he said, "That's how I found," I don't know, it was Zscaler or something like that, years before anybody ever knew of them, "because they're going to help me get to the next level." So it's interesting to see, Erik, in these sectors, how they're holding up in many cases. >> Yeah. It's a very important part for the actual IT practitioners themselves. There's always contracts coming up, and you always have to worry about your next round of negotiations. And that's one of the roles these guys play. 
You have to do a POC when contracts come up, but it's also their job to stay on top of the new technology. You can't fall behind. Everyone's a software company now, everyone's a tech company, no matter what you're doing. So these guys have to stay on top of it, and that's what this ETS can do. You can go in here and look and say, "All right, I'm going to evaluate their technology," and it could be twofold. It might be that you're ready to upgrade your technology and they're actually pushing the envelope, or it simply might be, "I'm using them as a negotiation ploy, so when I go back to the big guy who I have full intentions of writing that contract to, at least I have some negotiation leverage." >>Erik, we got to leave it there. I could spend all day. I'm going to definitely dig into this on my own time. Thank you for introducing this; really appreciate your time today. >>I always enjoy it, Dave, and I hope everyone out there has a great holiday weekend. Enjoy the rest of the summer. And, you know, I love to talk data, so anytime you want, just point the camera on me and I'll start talking data. >>You got it. I also want to thank the team at ETR, not only Erik, but Darren Bramen, who's a data scientist and really helped prepare this data, and the entire team over at ETR. I cannot tell you how much additional data there is; we are just scratching the surface in this "Breaking Analysis." So great job, guys. I want to thank Alex Myerson, who's on production and manages the podcast, and Ken Shifman as well, who's just coming back from VMware Explore. Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at SiliconANGLE; he does some great editing for us. Thank you, all of you guys. Remember, these episodes are all available as podcasts wherever you listen; all you got to do is search "Breaking Analysis" podcast. I publish each week on wikibon.com and siliconangle.com.
Or you can email me to get in touch david.vellante@siliconangle.com. You can DM me at dvellante or comment on my LinkedIn posts and please do check out etr.ai for the best survey data in the enterprise tech business. This is Dave Vellante for Erik Bradley and The Cube Insights powered by ETR. Thanks for watching. Be well. And we'll see you next time on "Breaking Analysis". (upbeat music)
Laura Heisman, VMware | VMware Explore 2022
>>Welcome back everyone to theCUBE's live coverage of VMware Explore 2022. I'm John Furrier with Dave Vellante, host of theCUBE. We're here on the ground floor, Moscone West, two sets, wall-to-wall coverage, three days. We heard Laura Heisman, the senior vice president and CMO of VMware, put it all together. Great to see you. Thanks for spending time out of your very busy week. >>It is a busy week. It is a great week. >>So a lot of people were anticipating what the event was gonna look like, and then the name changed to VMware Explore. This is our 12th year covering VMware's annual conference, formerly known as VMworld. Now VMware Explore, a bold move, but Raghu teased it out in his keynote, some of the reasoning behind it. Expand on the thought process. The name change, obviously multi-cloud the big headline here, vSphere 8, partnerships with the cloud hyperscalers, a completely clear direction for VMware. Take us through why the name changed. He kind of hinted at it, you know, exploring the new things. But take us through that. You've architected it. >>Yeah. It is a change. We have a great past at VMware, and we're looking to our future at the same time. And so when you come back from a pandemic and things changing, and you're really looking at the expansion of the business, now is the time, because it wasn't just about coming back to what we were doing before. And every company should be thinking about that: what are we gonna do to actually go forward? And VMware itself is on our own journey, expanding more into the cloud, our multi-cloud leadership and everything that we're doing there. And we wanted to make sure that our audience was able to explore that with us. And so it was the perfect opportunity; we're back live. And VMware Explore is for everyone that's been coming to VMworld for so many years.
We love our community, and we're expanding it to new communities that maybe don't have that legacy and that history, and having them here with us at >>VMware. You did a great job. I love the event here, love how it turned out. And a lot of interesting things happened along the way. Prior to this event, we're coming out of the pandemic, so it's the first face-to-face of the VMware community coming together, which is an annual rite of passage for everyone in the customer base. You name-change it to VMware Explore, and then Broadcom buys VMware, announces the buyout, and all the uncertainty kind of hanging around it. You had to navigate those waters. Take us through, what was that like? How did you pull it off? It was a huge success, because everyone showed up. It's the same event, different name, >>It's >>Same vibe. >>The only thing constant is change, right? And so we've gotta focus on the business and our VMware customers and our partners and our community at large. So it's really keeping the eye on what we're trying to communicate to our community, and this is for our VMware community. The VMUG community is here in spades; it is wonderful to have the VMUG community here. We have tons of different customers, new customers, old customers, and it's just being able to share everything VMware. And I think people are just excited about that. It's great energy on the show floor and all >>Around. And it's not like you had years to plan it. I mean, you're basically six months in; you said you went on a six-month listening tour the other day. What was the number one question you got on that listening tour? >>Well, definitely the name change was one, but I would say also, it's not just the question, it was the ask: we're in what we call our chapter three here.
And it's really our move into multi-cloud and helping all of our customers with their complexities. >>So virtualization, private cloud, and now multi-cloud, correct? The third chapter. >>Yeah. And the question and the ask is, how do we let our customers and partners know what this is? Help us, Laura. That was the number one ask to me: help us explain it. And that was my challenge and opportunity coming into Explore, to really explain everything about it. If you watched the general session yesterday, it went through our multiple different chapters, where we are helping our customers with their multi-cloud strategies. And so it's been that evolution that gets us to today, and it doesn't end today. It starts today, and we keep going. >>Like a lot of companies, in this new role you inherited a hybrid world, and you've got two years of virtual under your belt, and now you're running a completely different event from that standpoint. How does the COVID-era online experience translate into new relationships, and how are you cultivating those? What's that dynamic like? >>Well, let's start with how happy everyone is to see each other in person. No doubt. It is amazing just to see people: the high fives in the hallways, the hugs, some people just the fist bump, whether people's masks are on or off, right? It is whatever everyone's comfort level is, but it is really just about getting everyone together and thinking about, how was it before the pandemic? You don't necessarily just wanna repeat coming back. And so how do you think about this as an in-person event? People have been sitting behind their screens. How do we engage, and how are we interactive, knowing that attention spans are probably a little bit shorter? People are used to getting up and going to get their coffee. We have coffee in the conference rooms, right?
Things like that, making the experience just a really great one for everyone, so they're comfortable back in person. But honestly, the energy and seeing the smiles on people's faces, it's wonderful to be back in person. >>It's interesting, you know, theCUBE, we've had some transformations ourselves with the pandemic, living through it and getting back to events, but hybrid cloud and hybrid events are now the steady state. And in a way it's kind of interesting how, with hybrid cloud and now multi-cloud, the digital aspect of integrating into physical events is now key, first-class-citizen thinking for CMOs. You guys did a great job of preserving the best part of it, which is face to face, people seeing each other, and now bringing in the digital and extending this so that it's an always-on kind of Explore. Is that the thinking behind it? What's your vision on where you go next? Because it's not one-and-done, see-you-next-year anymore. The pandemic showed us that hybrid, digital and physical together, if designed as first-class citizens with each other, one doesn't sub-optimize the other. Obviously face to face is better than digital, but if you can't make it, it shouldn't be a bad experience. >>No, not at all. And that's the vision. We're at a point where not everyone's gonna come back; everyone has things going on in their lives. And so you have to think about it as in person and online; it's not necessarily even hybrid. And so it's, what's the experience for people that are here, you know, over 10,000 people, you wanna be sure that it is a great experience for them. And then for our viewers online, we wanna be sure that they're able to know what's going on, stay in touch with everything VMware and enjoy that. So the general session was live, and we have a ton of on-demand content. And this is just the start.
So now we go on to essentially multiple other VMware Explore events around the world. >>It's interesting. The business model of events is so ticket-driven and sponsorship-driven, on-site, on-location, that you can get almost addicted to "no, we don't wanna do digital" and kind of foreclose it. You guys embraced the combo. So what's the attendance? I mean, it probably wasn't as big as when everyone was physical. What are some of the numbers? Can you give us some data on attendance, some of the stats around the show? Because obviously people showed up and drove. It wasn't a no-show; there's sure a lot of great stuff here. >>We have over 10,000 people registered, and we see them here. The general session was packed. They're walking the show floor. And then I don't have the numbers yet for our online viewership, but with everything that we're doing to promote it online, if anyone missed it, the general session is already up, and they'll see more sessions going live, as well as all the on-demand content, so that everyone can stay in the loop on what's happening and all of our announcements. >>You're obviously not disappointed. Were you surprised? A little nervous? >>So I will say one thing that we learned from others, thank goodness others have gone before us: as far as coming back in person, the big change is that registration happens closer to the event, a very big change from before. >>So it's at the end. >>Yes, the last three weeks. And we had been told that by peers at RSA and other conferences, that that's what happens. So we were prepared for that. But people wanna know what's going on in the world, right? You wanna have that faith before you buy that ticket and book your travel. And so that has definitely been one of the biggest changes, and one that I think we'll maybe continue to see here. So that was probably the biggest thing that changed as far as what to expect with registration. But we planned for this.
We knew it was not going to be as big as in the past, and that's gonna be, I think, the new norm. >>I think you're right. I think a lot of last-minute decisions, you know, sometimes people >>Wanna know, I mean, what's gonna happen, is there gonna be another outbreak? I mean, I think people have gotten trained to be disappointed >>Well, and be flexible >>With COVID, and weirded out by things. So people get anxiety around COVID; you've seen that. >>Yeah. I wanna ask you about the developer messaging, 'cause that's one of the real huge takeaways. It was so strong. And you said the other day in the analyst session, the developers are the kings and the queens now. You know, when we hear developers, we picture Steve Ballmer running around on stage, "developers, developers, developers," but it's different. It's a different vibe here. It's like you're serving the kings and the queens through partnerships and embracing open source. Can you talk a little bit about how you are approaching developer messaging? >>Yeah, so, you know, I came from GitHub, and so developers have been on my mind for many years now. And so joining VMware, I got to join this great world of enterprise software with my developer background. And we have such an opportunity to really help our developer community understand the benefits of VMware, to make them heroes just like we made virtualization professionals heroes in the past. We can do the same thing with developers. We wanna be sure that we're speaking with our developer community. That was very much on stage as well as in many of the sessions. And so we think about that with our products and what we're doing as far as product development, helping developers be able to test and learn with our products. And it's really thinking about the enterprise developer and how we can help them be successful.
>>And I think the beautiful thing about that message is that the enterprises, that great base you guys have, are all pretty much leaned into cloud native. They see it, and they're starting to see the hybrid of private cloud and public cloud. And now with edge coming, it's pretty much a mandate that cloud native drive the architecture, and that came through clear in the messaging. So I have to ask you, on the activations you guys have done, how much of a developer-to-ops mix are you seeing transfer over? Because the trend we're seeing is that IT operations, and I'll use that word generically, is everywhere; almost every company has VMware. So they're also enabling developers inside their companies. So what's the percentage of developers to ops, or are they blending in? It's almost a hundred percent. How would you see >>That? It's growing. So it's definitely growing. I wouldn't say it's a hundred percent, but it is growing. And it is one where every company is thinking about their developers. There are not enough developers in the world for the number of job openings out there. Everyone wants to innovate fast, and they need to be able to invest in their developers. And we wanna be able to give them the tools to do that, 'cause you want your developers to be happy and make it easier for them to do their jobs. And so that's what we're committed to really helping them do. And so we're seeing an uptick there, and you'll see that with our product announcements and what we're doing. And so it's growing. >>The other thing I want to ask you: we saw a lot of energy on the customer vibe. We're catching that here, 'cause the sessions are right behind us and upstairs on the floor. We've heard comments like "the ecosystem's back." There was definitely an ecosystem spring in the step, if you will, amongst the partners. Can you share what's happening here?
Observations, things that you've noticed that have been cool, that can highlight some trends on the partner side of it. What's going on with partners? >>Yeah. I mean, our partners are so important to us. We're thrilled that they're here with us. The expo floor is busy, and people are visiting and reuniting and learning from each other, everything that you want to happen on the expo floor. And we've done special things throughout the week. For example, we have a whole hyperscaler day essentially happening, where we wanna highlight some of the hyperscalers and let them share with all of our attendees what they're doing. So we've given them more time within the sessions as well. And so you'll see our partner ecosystem all over the place, not just on the expo >>Floor. A wide range of partners. Dave, you got the hyperscalers, the big cloud whales, and then you now have the second tier, we call 'em super-cloud-type customers and partners. And you got the multi-cloud architecture developing, a lot of moving parts that are changing and growing and evolving. How do you view that? Are you just gonna ride the wave? Are you watching it? Are you gonna explore it through more, you know, kind of joint marketing? How do you take this momentum that you have? And by the way, a lot of stuff's coming out of the oven. I was talking with Joan last night at the press and analyst event, and there's a lot of stuff coming out of the VMware oven product-wise that hasn't hit the market yet. You can't really put a number on those sales yet, but it's got value. So you got that happening, you got this momentum behind you. Do you just ride the wave, and what's the strategy? >>Well, it is all about how we work with the partner, right? So it is about the partner relationship, and our partner community is huge to us at VMware.
I'm sure you've been hearing that from everyone you've been speaking to. So it's not even just ride the wave; it's embrace. It's embrace our partners. We need their help. With our customer base, we do touch everybody, and we need them to be able to support us and share what it is that we're doing, from our product evolution to our product announcements. So it's continuous education, and them educating us; it's definitely a two-way relationship. And what we're able to get done here at Explore together is progress you can't always make on a Zoom or a Teams or a WebEx call. You can't do that in two weeks, or two years sometimes. And we're able to have really great conversations >>Here. And your go-to-market is transforming as well. You guys have talked about how you're reaching many different touchpoints. We've talked about developers. The other thing we've seen at events, we talked about the last-minute registrations; we've also seen a lot more senior members of audiences. And part of that is maybe some of the junior folks can't travel, but why is it that the senior people come that maybe wouldn't have come before? Maybe because they're going through digital transformations; they wanna lean in and understand it better. I know you had an executive summit on day zero, and Hock Tan was here, and so forth. So, okay, I get that. But it seems, in talking to the partners, they're like, wow, the quality of the conversations we're having has really been up-leveled compared to previous years and other conferences. >>So yeah, I think it's that they're all thinking about their transformation as well. We had the executive summit on day zero for us, Monday, right? And it was a hundred-plus executives invited in for a day, who have stayed because they wanna hear what's going on.
When I joined VMware, I said VMware has a gift that so many companies are jealous of, because we have relationships with the executives, and that's what every company, startup to large company, wants. They're really trusted customers of ours. And so we haven't been together, and they want to be here to know what's going on and join us in the meetings. And we have tons of meetings happening throughout >>The event. And they're loyal. They're active, active in a good way. They'll give you great feedback, candid feedback, sometimes feedback you might not wanna hear, but it's truthful. That rare, engaged feedback is a gift. And they stay with you, and they're loyal, and they show up, and they learn; they're in sessions. So all good stuff. And then we only have about a minute left, Laura. I want to get your thoughts and end the segment with your explanation to the world around Explore. What's next? What does it mean? What's gonna happen next? What does this brand turn into? How do you see this unfolding? How should people view the VMware Explore event brand and future activities? >>Yeah. VMware Explore, this is just the start. After this, we're going to Brazil, Barcelona, Singapore, China, and Japan. So there is definitely momentum that we're going on. The brand is unbelievable; it is so beautiful. We're exploring with it. We can have so much fun with this brand, and we plan to continue to have fun with this brand. And it is all about the momentum with our sales team and our customers and our partners, and just continuing what we're doing. This is just the beginning. It's a global >>Brand. Explore >>Global. Absolutely. Absolutely. >>All right, Dave, that's gonna be great for theCUBE's global activities. There you go, Laura. Great to see you. Thank you for coming on. I know you're super busy. Final question, kind of a trick question.
What's your favorite aspect of the event? Pick a favorite child. What's going on here? In your mind, what's the most exciting thing about this event that's near and dear to >>Your heart? So first, it's hearing the feedback from the customers, but I do have to say my team as well. Huge shout-out to my team. They are the hub and spoke of all parts of Explore; VMware Explore wouldn't be here without them. And so it's great to see it all coming >>Together. As they say in the scoring at the Olympics, given the degree of difficulty for this event, with all the things going on, you guys did an amazing job. >>We're witness >>To it. Congratulations. Thank you. Thank you for a great booth here. It looks beautiful. Thanks for coming. Wonderful. >>Thank you for >>Having me. Okay, theCUBE's live coverage here on the floor of Moscone West. I'm John Furrier with Dave Vellante, two sets, three days. Stay with us for more live coverage. We'll be right back.
John Wood, Telos & Shannon Kellogg, AWS
>>Welcome back to theCUBE's coverage of AWS Public Sector Summit, live in Washington, D.C., a face-to-face event. We're on the ground here with theCUBE coverage. I'm John Furrier, your host, and I've got two great guests, both Cube alumni: Shannon Kellogg, VP of public policy for the Americas at AWS, and John Wood, CEO of Telos. Congratulations on the announcement on stage, and John, congratulations on being a public company. Last time I saw you in person, you were private. Now you've IPO'd. Congratulations. >>Totally virtual. I didn't meet one investor, lawyer, accountant or banker in person. It was all done over Zoom. It's amazing. >>We'll go back to that. Great to see you; you had great props here earlier. You guys got some good stuff going on on the policy side, and you were on stage talking about this Virginia deal. Give us the update. >>Yeah, hey, thanks John. It's great to be back; I always like to be on theCUBE. So we made an announcement today regarding our economic impact study for the Commonwealth of Virginia. And this is around the Amazon Web Services business and our presence in Virginia, or AWS, as we all call Amazon Web Services. And basically the data that we released today shows, over the last decade, the magnitude of the investment that we're making, and I think reflects the overall investments that are going into Virginia in the data center industry, of which John and I have been very involved with over the years. But the numbers are quite, um... >>Just incredible. This is not part of the whole HQ2, or whatever they call HQ >>HQ2. So, Amazon is investing in Virginia as part of our HQ2 initiative, and Arlington, Virginia will be the second headquarters in the U.S.
In addition to that, AWS has been in Virginia for many years now, investing in both data center infrastructure and also other corporate facilities where we house AWS employees, in other parts of Virginia, particularly out in what's known as the Dulles technology corridor. But our data centers are actually spread throughout three counties: Fairfax County, Loudoun County, and Prince William County. >>So, separate from HQ2, this is the Virginia impact. What did the study announce? What did it say? >>Yeah. So there were a few things that we highlighted in this economic impact study. One is that over the last decade, if you can believe it, we've invested $35 billion. In 2020 alone, the AWS investment in construction of these data centers was actually $1.3 billion. And this has created over 13,500 jobs in the Commonwealth of Virginia. So it's a really great story of investment and job creation. And many people don't know, John, and this sort of came through in your question about HQ2 too, but AWS itself has over 8,000 employees in Virginia today. And so we've had this very significant presence for a number of years now. Virginia over the last 15 years has become really the cloud capital of the country, if not the world, and you see all this data center infrastructure that's going in there. >>John, what's your take on this? You've been very active in the county there. You've been a legend in the area and in tech; you've seen this over many years. I think you're the longest-running company doing cyber. >>My 31st year. >>Your 31st year, so you've been on the ground. What does this all mean to you? >>Well, you know, it goes way back to roughly 2005, when I served on the Economic Development Commission of Loudoun County as its chairman. And at the time, we were the fastest-growing county in America in Loudoun County.
But our residential real property taxes were going up stratospherically, because when you looked at it, for every dollar of residential real property tax that came in, we lost $2, because we had to fund schools, police and fire departments and so forth. And we realized that for every dollar of commercial real property tax that came in, we made 97 cents in profit, but only 13% of the money coming into the county was coming in commercially. So a small group got together from within the county to try to figure out what assets we had to offer to companies like Amazon, and we realized we had a lot of land, we had water, and then we had this enormous amount of dark fiber, unused fiber optic. So basically the county made it appealing for companies like Amazon to come out to Loudoun County and other places in northern Virginia, and the rest is history. If you look today, Loudoun County generates a couple hundred million dollars of surplus every year, its real property taxes have come down in real dollars, and the percentage of revenue that comes from commercial is like 33 to 34%. That's really largely driven by the data center ecosystem that my friend here Shannon was talking about. >>So the formula, basically, is to look at the assets and resources available that may align with the kind of commercial entities that could be domiciled there, >>that could benefit, right. >>So what about power? Because the data centers need power. Fiber is great, but power is the main thing. >>You can build power, but the main point is water for cooling. We had an abundance of water, which allowed us to build power sources and allowed companies like Amazon to build their own power sources. So it was really, what do they say, better lucky than good. We had a bunch of assets come together that made us pretty lucky as a region. >>It's a nice area, too.
>>It is nice. >>John, it's really interesting, because the vision that John Wood and several of his colleagues had on that economic development commission has truly come through, and it was reaffirmed in the numbers that we released this week. AWS paid $220 million in 2020 alone for our data centers in those three counties, including Loudoun. >>So Amazon's contribution to the county... >>$220 million in 2020 alone. And that actually makes up 20% of overall property tax revenues in those counties in 2020. So, you know, the vision they had 15, 16 years ago has really come true today, and that's just reaffirmed in these numbers. >>So I'll ask you the question. There's a lot of misinformation going around about corporate reputation. This is clearly an example of a corporation contributing to society. >>No doubt. And when you think about it, those are some good numbers. >>A $35 billion capital investment, and, what is it, 8,000, 9,000 jobs? >>Over 8,000 AWS jobs in the Commonwealth alone. >>And then you look at the economic impact on each of those counties financially. It really benefits everybody at the end of the day. >>It's good infrastructure across the board. How do you replicate that? Not everyone's an Amazon, though. So how do you take the formula? What's your take on best practice? How does this roll out? Amazon will continue to grow, but that's one company; is there a lesson here for the rest of us? >>I think all the data center companies and cloud companies out there see value in this region. That's why so much of the internet traffic comes through northern Virginia. I've heard 70%; I've heard much higher than that, too. So I think everybody realizes this is a strategic asset at a national level.
But I think the main point to bring out is that every state across America should be thinking about investments from companies like Amazon. There are really significant benefits that help the entire community: they help fund schools, police departments, fire departments, et cetera. >>Jobs, opportunities. What's the vision, though, beyond data centers? Solar? Sustainability? >>We do. We actually have a number of renewable energy projects, which I want to talk about. But just one other quick point on the data center industry: I also serve on the Data Center Coalition, which is a national organization of data center and cloud providers. We look at states all over this country; we're very active in multiple states, and we work with governors and state governments as they put together different frameworks and policies to incent investment in their states, and Virginia is doing it right. Virginia has historically been very forward-looking, very forward-thinking in how it tries to attract these data center investments. It has the right tax incentives in place. And then, back to your point about renewable energy, over the last several years Virginia has also made some statutory changes and other policy changes to drive forward renewable energy in Virginia. Six years ago this week, John, I was in Accomack County in Virginia, on the Eastern Shore. It's a very rural area, where we helped build our first Amazon solar farm in Virginia. 2015 is when we made that announcement with the governor, six years ago this week. It was 88 megawatts, which at the time basically quadrupled Virginia's solar output in one project.
So since that first project, we at Amazon have gone from building that one facility, quadrupling the solar output in Virginia at the time, to, by the end of 2023, 1,430 megawatts of solar power in Virginia across 15 projects, which is enough electricity to power 225,000 households, the equivalent of Prince William County, Virginia. So that gives you the scale of what we're doing here in Virginia on renewable energy. >>So to me, and I'll put my opinion out there because I never hold back on theCUBE, it's a posture... >>We count on that. >>It's a posture issue of how people approach business. There are two schools of thought at the extremes: the government pays for everything, or business-friendly. This is a modern story about a friendly-business, collaborative posture. >>Yeah, it's putting money to very specific use, which has a very specific return, and in this case it's for everybody: everyone that lives in the northern Virginia region benefits. >>And these policies have not just attracted companies like Amazon, data center builders and renewable energy investments. These policies are also leading to rapid growth in the cybersecurity industry in Virginia as well. John founded his company decades ago, and you have all of these cybersecurity companies now located in Virginia. Many of them are partners of ours. >>I know. John and I have both contributed heavily to a lot of the systems in place in America here, so congratulations on that. But I've got to ask you guys, while I have you for the last minute or two: cybersecurity has become the big issue. There are a lot of these policies all over the place, but cyber is super critical right now. Where's the red line, Shannon? You know things are happening. You guys bring security to the table, but businesses are out there fending for themselves. There's no militia.
Where's the support for the commercial businesses? People are nervous. >>So, do you want to take it? >>Well, I'm happy to take the first shot, and then we'll leave John with the last word, because he is the true cyber expert. I had the privilege of hosting a panel this morning with the director of the Cybersecurity and Infrastructure Security Agency at the Department of Homeland Security, Jen Easterly. The agency is relatively new, and she laid out a number of initiatives that the DHS organization she runs is working on with industry. So they're leaning in, they're partnering with industry in a number of areas, including making sure that we have the right information-sharing frameworks and tools in place, so that the government, and we in industry, can act on information we get in real time; making sure that we're investing for the future in workforce development and cyber skills; but also, as we enter National Cybersecurity Awareness Month, making sure that we're all doing our part in cybersecurity awareness and training. For example, one of the things our Amazon CEO, Andy Jassy, recently announced, as he was participating in a White House summit that President Biden hosted in late August, was that at Amazon we're going to make a tool we've developed for information security awareness for our employees freely available to the public. And in addition to that, we announced that we're going to provide free strong authentication tokens for AWS customers. So what I like about what this administration is doing is that they're reaching out, they're looking for ways to work with industry, bringing us together in these summits, but also looking for actionable things that we can do together to make a difference. >>So my perspective, echoing some of Shannon's points, is really the following.
The key, in general, is automation, and there are three components to automation that are important in today's environment. One is cyber hygiene, and education is a piece of that. The second is around misattribution, meaning if the bad guy can't see you, you can't be hacked. And the third one is around what's called attribution, meaning I can figure out who the bad guy actually is and then report that bad guy's actions to the appropriate law enforcement and military types, and they take it from there. >>Unless he can't be attributed either. >>Well, the basic point is that we can't, as industry, hack back; it's illegal. But what we can do is provide the tools and methods necessary to our government counterparts, to that point about information sharing, so they can take the actions necessary and try to find those bad guys. >>I just feel like we're not moving fast enough. Businesses should be able to hack back, in my opinion. I'm a hawk on this one item. I believe that because if people dropped on our shores with troops, the government would protect us. >>Your point is well taken. When Cyber Command was formed, before that, air, land, sea and space, the physical domains, each had about $150 billion spent on them per year. When Cyber Command was formed, it was spending less than JPMorgan Chase to defend the nation. So, you know, we do have a ways to go. I do agree with you that there needs to be more flexibility given to industry to help with the fight. In this case, Andy Jassy has offered a couple of tools which I think are really good: strong tokens, training... >>Those are all really good. >>We've been working with Amazon for a long time, really ever since the CIA embraced the cloud, which was sort of the shot heard round the world for cloud computing.
We do the security compliance automation for that air-gapped region for Amazon, as well as other aspects. >>We all need more. Telos, faster! Keep cranking out that software, because I'll tell you right now, people are getting hit, >>and people are getting scared. You know, the Colonial Pipeline hack affected everybody; people started going, "Wait a minute, I can't get gas." >>But again, to this red-line point, and Jen Easterly said this this morning here at the summit, this truly has to be about industry working with government, making sure that we're working together. Government has a role, but so does the private sector. I've been working cyber issues for a long time too, and seeing where we are this year, and in this recent cyber summit that the President held, I really see a tremendous commitment coming from the private sector to be an effective partner in securing the nation. >>This comes full circle to our original conversation around the Virginia data you guys are looking at, the Loudoun County Amazon contribution. The success formula is really commercial plus public sector. I mean, the government has to recognize that technology is now the lingua franca for everything in society. >>Well, one quick thing here that segues into the fact that Virginia is the cloud center of the nation: the President issued a cybersecurity executive order earlier this year that really emphasizes the migration of federal systems into the cloud and the modernization that John has worked on. John had a group called the Alliance for Digital Innovation, which is very active in the IT modernization world, and we're a member as well. The federal government is really emphasizing this migration to cloud, and that was reiterated in that cybersecurity executive order. >>Well, we'll definitely get you guys back on the show. You were going to say something?
>>All I'd say about the executive order is that I think one of the main reasons the President thought it was important is that the legacy systems out there are mainly written in COBOL, and there aren't a lot of kids graduating with degrees in COBOL. COBOL was designed back in the 1950s. So I think it's imperative that we move as many of these workloads as we can. >>They don't teach it anymore? >>They don't. So from a security point of view, the number of threats and vulnerabilities is through the roof. >>Awesome. Well, John, I want to get you on the show at our next cybersecurity event. Come in for a fireside chat and unpack all the awesome stuff that you're doing, but also the challenges. >>Yes. And there are many. >>You have to keep up the good work on the policy side. I still say we've got to remove that red line and identify new rules of engagement relative to what's on our sovereign virtual land. A whole 'nother ballgame. Thanks so much for coming on; I appreciate it. >>Thank you, appreciate it. >>Okay, theCUBE coverage here at the AWS Public Sector Summit in Washington. I'm John Furrier. Thanks for watching.
Josue Montero, EduTech, and Rafael Ramirez Pacheco, Costa Rica | AWS PS Partner Awards 2021
>>Hello, and welcome to today's session of the 2021 AWS Global Public Sector Partner Awards. I'm Natalie Ehrlich, your host for theCUBE, and I'm delighted to present our guests: Josue Montero, CEO of EduTech Central America, and Rafael Ramirez, product manager at the Costa Rica Ministry of Education. Welcome, gentlemen, to today's session. >>Thank you, Natalie. >>A pleasure to be here. >>Well, let's start with Rafael. Please tell us about some of the key challenges that are affecting the Ministry of Education in Costa Rica. >>One of the main challenges was to have a product that is always available to schools and easy for schools to use; at the same time, the product should be user-friendly, so that schools don't need much training to use it. One of the first things we considered was our client: schools have very limited connectivity, so we could not use very demanding technologies, because those require high bandwidth, and our clients, the schools, would be subject to a service that was not available to them. So one of the main things was to consider the client and how to reach them. Thanks to EduTech, the ministry made an alliance with a company that thought about the innovation, and they recommended different services we could provide through the cloud, so that we are able to deliver the service to our clients, and they can use the platform we are building in an easy way, while at the same time taking care of the quality they need. Something important about schools was that, while they were using the product, they were getting a benefit, which made schools willing to participate. >>Terrific. Well, Josue, I'd love it if you could give us some insight into some of the services that you are providing to the ministry. >>Sure.
So when the ministry approached us and we had the opportunity to work with them, of course, as an AWS partner, we thought, well, this couldn't be better. So we started to think about all of the different services that AWS offers in the cloud that we could provide to the ministry to close this gap that has existed for a long time, where you still see people using Excel, or using Microsoft Access as a database, instead of using all of the power that the cloud has. So when we approached them and were able to show all of these different services that AWS could provide to the Ministry of Education, it was a perfect marriage. We started to work with them, and I think it's been awesome. This is only the first part of a project of eight stages; we are currently working on stages two and three, which will come in August and in January of 2022. We're super happy to see, just in this first phase, everything that has come of it, and all of the data that has come in to help the Ministry of Education take action in students' lives. >>Yeah, that's really terrific to hear. I'd love to hear more from Rafael about why he thinks it was so important to have cloud data at the Ministry of Education level. >>Okay, I will give you an important example for our country. We used to gather and collect data on paper and take it to the central office, where it would be entered into an Excel file. It took around two months to process all this data and make decisions. When we started with the first service, which was to record student enrollment, we could pay teachers on time, and we could get the number of students and know where we had the biggest needs. So it was a very innovative solution. And when the pandemic started, we had the first service active.
This allowed us to react very quickly, and we realized that in the first quarter, 19,000 students were not in our schools, because we had gone from a face-to-face service to a virtual service. So we could react very quickly. We planned a strategy with the Ministry of Education: the idea was to locate where those students were, and in the next four months we were able to reduce the dropout from 19,000 students to 18,000 students. After that, we initiated another stage to retrieve those 18,000 students back to school. This was thanks to having the information online. In some countries that may not have this problem, this might seem very small, but for us it was very, very important, because we were able to reach the poorest households so as to bring those students back to school. >>Terrific. Well, that's really fantastic. In a non-COVID world, how do you think this technology will really help you to enhance education within Costa Rica? >>The important thing about this innovative product is the strategy of having a single file per student. This allows us to follow up on what the student has done across the different school years; we can identify their gaps and weaknesses, and we can see which programs are most appropriate, so as to replicate this in the rest of the country. Without a centralized file like the one we have now, we would not have this traceability of students, which lets us strengthen their weaknesses and replicate our strengths in the rest of the educational system. One of the most important things is that this technological implementation not only reached primary school students, but also preschool, kindergarten, secondary school, higher education and technical education. So we reached every single sector where the Ministry of Education was able to detect a need in the country. >>Yeah, terrific.
Well, I'd love to hear more from our other guest, Josue Montero, CEO of EduTech Central America. If you could give us more insight, more depth, on the services that you provide; you talked about an eight-stage plan. If you could just highlight those stages. >>Sure. As part of these eight stages that we're going to be developing, and that we hope to work on with the Ministry of Education in every single one of them, AWS brings a lot of technologies. For example, there's one we're planning on using, which is Amazon Rekognition. There are a lot of students that come to the country with no documentation: no passport, no ID, nothing. So it's really hard, within the same school system, to track these students, because they can come and go, and if they want, they can change their name; they can do a lot of things that maybe are not correct. And sometimes it's not even because they want to do something incorrect; it's just that the system, and the manual way of doing things, allows these types of changes. So with a service like Rekognition, being able to recognize a student's face, or to identify them by their fingerprints, and to give actual recognition, as the word says, to that student, is amazing. It's an amazing technology that allows the Ministry of Education and the students to have a voice, to have a presence, even though they don't have their actual documentation, for whatever reason. There is something behind this that helps them be visible and, at the same time, present in the system.
And not only that, but also the grading, the attendance, the behavior, a lot of things that we're creating within these stages. Let me give you a quick example. There's the system that we've created for dropouts. A student doesn't come one day, two days, three days, and automatically it becomes an alert and starts to shoot out emails and alerts to the different people involved, in order to say, "Hey, listen, this student has not come to class for the last week. We need you to go and see what's going on." Maybe that's something very small, but it can change people's lives; it can change students' lives. Knowing where they are, how they are, how their grades are, where we can help them, and activating the different types of alerts that the system allows, helps incredibly in the life and future of the student. And that is exactly what we're trying to do here. In the end, all of the technology and all of the different efforts we're making are fine, but at the end, what matters is the student: the fact that he can come and finish school, graduate, go to college, become an entrepreneur, and some day be here at an AWS conference and give a talk. That is exactly what the Ministry of Education is looking at, and what we are looking at with the project per se. >>Yeah, I mean, that's a really excellent point that you're making. This technology is helping real people on the ground and actually shaping their lives for the better.
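The dropout-alert flow Josue describes, consecutive absences automatically escalating into notifications, can be sketched in a few lines. This is a hedged illustration only: the three-day threshold, the record shapes, and every name in it are assumptions made for the example, not the ministry's actual implementation.

```python
# Hedged sketch of the absence-alert rule described above: a run of
# consecutive missed school days at the end of a student's attendance
# log triggers an alert to the people involved. All names and the
# threshold here are illustrative assumptions.

DAYS_BEFORE_ALERT = 3  # assumed threshold: three straight absences

def consecutive_absences(attendance):
    """Count the missed days at the end of an attendance log (True = present)."""
    run = 0
    for present in reversed(attendance):
        if present:
            break
        run += 1
    return run

def absence_alerts(roster):
    """Return (student_id, days_missed) pairs that should trigger an alert."""
    return [(student_id, consecutive_absences(log))
            for student_id, log in roster.items()
            if consecutive_absences(log) >= DAYS_BEFORE_ALERT]

roster = {
    "A-101": [True, True, False, False, False],  # three straight misses: alert
    "A-102": [True, False, True, True, True],    # back in class: no alert
}
alerts = absence_alerts(roster)  # feeds the "go and see what's going on" follow-up
```

In a production system the alert list would feed an email or notification service rather than sit in a Python list, but the thresholding is the core of the rule described in the interview.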
So, I mean, it's really incredible. I'd love to hear more now from Rafael: what insight can he provide to other ministries of education that would consider implementing this kind of technology, and what has his own experience with this project on AWS been? >>Well, connectivity is really important for us, not only within the institutions of the Ministry of Education; we also have connections with the Ministry of Health, and we have connections with the software called Sienna Julia, which allows the identification of people within the country and of the benefits provided by the state. So the country, little by little, is incorporating these pieces and these cloud services. We have found that, instead of developing everything ourselves, AWS has a set of services that allows us to focus on the problem rather than on the technology of the solution, because the services are already available. So at the country level, other ministries are incorporating these services. Nowadays, for COVID management, the Ministry of Health has a set of applications that allow it to establish links between people who test positive, and this has allowed us to associate that situation with a particular student in our classrooms. So, little by little, these services are turning education and other areas toward focusing on the problem instead of on technological solutions, because the services are already there for us to consume. >>Terrific. I'd love to now shift to our other guest. Josue, could you give us some insight into the next phase for your business as you look at 2021? We hope it's going to be a wonderful year, post-COVID. What's your vision?
It will not, well I hope it will not go back to the same before Covid. Um it's all of these technologies that are being created that are being organized, that are being it developed um for education specifically um an area where everything has been done the same for a long time. Um we need it, it's crazy to say this, but we needed a Covid time in order to accelerate this type of of organizations right in and now like ministry, the ministries of Education, like like the Minister of Education of Costa rica, they've had this for a long time and they've they've been thinking of the importance of making changes and everything, but until now it became a priority. Why? Because they realized that without these technologies with another pandemic, oh boy, we're going to see the effects of this and, and, and it's going to affect a lot of countries and a lot of students. Um, but it's gonna help to accelerate and understand that for example, internet, it has to be a worldwide access, just like water or electricity is in some, in our countries right now. You know, the fact of a student not having internet, um, we're taking away lot of development for this student. So I believe that after this post covid time education is going to continue to do a lot of changes and you and you'll see this and you'll see this in all of the areas in elementary, in preschool, in university, in high school. Um, you're going to see the changes that this is, um, is starting to do and we've seen it and we've seen it, but now it's going to be at a 23 or four X. So we're pretty excited. We're pretty excited what what the world it's gonna what the world's gonna bring to this table and to this specific area which is education. >>Yeah. That's really terrific to hear a silver lining in this pandemic. And just real quick uh final thoughts from rafael, are you looking to ramp up further? Uh you know, in light of what Jose has said, you know, to ramp up the digital transformation process? 
>>Yes, I believe this is an opportunity the country is facing. The resistance that we had in the education sector, together with the current emergency situation and the need to use virtual tools, has flattened those curves and narratives. Since 2020, Costa Rica has run a very strong train-the-trainer process; years ago it was very difficult to involve all teachers, but nowadays all teachers want to get trained. So we are getting there, with virtual trainings, with new tools, with the implementation and use of technology in the classroom. With these kinds of emergencies, we know the pain, but we also know the gain of this whole situation, so this opportunity for change is something we have to take advantage of. Thanks to these cloud services, I believe this is available nowadays, and the country has realized that these things are closer than we thought. Innovation is here to stay, and I believe we have to exploit it little by little. >>Terrific. Well gentlemen, thank you so much for your insights. I loved hearing about the innovations taking place in the classroom, especially overseas in Costa Rica. That of course was Rafael Ramirez, Product Manager at the Costa Rica Ministry of Education, as well as Josue Montero, the CEO of EduTech Central America. And of course, I'm Natalie Ehrlich, your host for theCUBE for today's session of the 2021 AWS Global Public Sector Partner Awards. Thanks very much for watching.
Wilfred Justin, AWS WWPS | AWS re:Invent 2020 Public Sector Day
>>From around the globe, it's theCUBE, with digital coverage of AWS re:Invent 2020. Special coverage sponsored by AWS Worldwide Public Sector. >>Hello and welcome to theCUBE Virtual and our coverage of AWS re:Invent 2020, with special coverage of the public sector experience. This is the day when we go through all the great conversations around public sector in the context of re:Invent. Great guest: Wilfred Justin, head of AWS AI and machine learning enablement and partnerships with AWS. Wilfred, thanks for joining us. >>Thanks, John. Thanks for having me on. I'm pretty excited to be part of this CUBE interview. >>Well, I wish we could be in person, but with the pandemic we've got to do it remote. I want to get into some of the things you're working on. The AI/ML Rapid Adoption Assistance initiative is a big story. What is it? Describe what it is. >>So we launched this artificial intelligence / machine learning rapid adoption assistance for all public sector partners who are part of the APN network in September 2020. We launched it in response to the president's executive order called the American AI Initiative. What the rapid adoption assistance provides is a direct, scalable, and automated mechanism for all public sector partners to reach out to AWS experts within our team for assistance in building and deploying machine learning workloads on behalf of the agencies. All the partners who are part of this rapid adoption assistance will go through a journey with AWS and my team, in three different phases. The first phase is the envisioning phase, the second phase is the enablement phase, and the third is the build phase. In the envisioning phase we dive deep into the use case, the problem they're trying to solve; this is where we talk about the algorithms and frameworks.
We solidify and validate the architecture. Following that is the enablement phase, where we engage with the partners and train their technical teams; it's a hands-on, hands-on-keyboard kind of approach where we train them on the machine learning stack. The third phase is the build phase, where the partners leverage the knowledge they have gained through the envisioning and enablement phases and start building and rolling out workloads on behalf of the agencies. We stay with them throughout the journey and remove any kind of blockers, be they technical or business. So that's a quick overview of the rapid adoption assistance program. >>It's funny talking to Swami over the years and watching the AI/ML portfolio every year at re:Invent; Dr. Matt Wood is always doing something new, and this year is no exception. Even more machine learning and AI in the news, and this rapid adoption assistance initiative sounds like an accelerant. So I get all that, but I want to ask you: what problem does it solve for the customer, or for Amazon? Because there's demand, there's too much demand, people want to go faster. What problem does this rapid adoption of AI and machine learning initiative solve? >>So as you know, John, artificial intelligence and related technologies like deep learning and machine learning can literally transform the way agencies operate. They can enable them to provide better, quicker, and more secure services to the citizens of this country. And that's the reason the president released the executive order called the American AI Initiative, which drives all government agencies, specifically federal agencies, to promote artificial intelligence to protect and improve the security and economy of the nation.
So if you think about it, the best way to achieve that goal is to enable the partners to build workloads on behalf of agencies, because when it comes to public sector, most of the workloads are delivered by partners. The problem we saw, based on our interactions with the partners, is that though the partners have been building a lot of applications with AWS for more than a decade, when it comes to artificial intelligence they have very limited resources for deep learning and machine learning: speech recognition, cognitive computing, natural language processing. So we wanted to address exactly that, and that's the problem we're trying to solve by launching this rapid adoption assistance, which is a direct mechanism for partners to reach AWS experts who help them build those kinds of solutions for the government. >>You know, it's interesting, because AI and machine learning is a secret sauce for workloads, especially modern workloads. You mentioned agencies and also public sector. We've certainly seen, in the pandemic, a ton of focus on moving faster, getting those apps out quickly, and AI drives a lot of that, so I totally get it. I think it's an accelerant and a great program; it just makes a lot of sense. And I know you guys have been going in by vertical, making SageMaker and all these other tools specialized within those verticals, so it makes a ton of sense. I get it, and it's a great initiative that solves the problem. The question I have is: who gets access to this? Is it just agencies, as you mentioned? Is it all public sector? Could you clarify who can apply to this program? >>Yes, it is a partner-focused program. So all the existing partners; though it is ultimately going to benefit the end agencies, we're trying to help the agencies through the partners.
So all the existing APN partners who are part of the PSP program, which we call the public sector partner program, can apply for this rapid adoption assistance. As you know from following AWS and AWS partners, John, a lot of partners have different kinds of expertise, and they show that by achieving competencies: technical competencies like big data, storage, and security, or domain-specific competencies like public safety, education, and government. But to apply for this program the partners don't need any particular competency; all they have to be is part of the Amazon Partner Network and part of the public sector partner program. That's number one. Second, it is open to all partners, meaning both technology partners and consulting partners. Number three, applying is pretty simple, John: you can quickly search for AI/ML rapid adoption assistance on the APN page, and in a little pop-up page the partners fill in pretty basic information about the workload, the problem they're trying to solve, the machine learning services they're planning to use, and a couple of other details, like contact information. Then our team reaches out to the partner and helps them with the journey. >>So no other requirements or prerequisites, just being part of the partner program? >>Absolutely, it is meant for partners. All you have to do is be part of the APN network and be a public sector partner. >>Public sector partner, makes sense. How are you going to handle the demand? I'm sure it's going to be a tsunami of interest, because why wouldn't someone take advantage of this? >>Yep. It is open to all kinds of partners, but there are some kinds of prerequisites, right? That's what I'm trying to explain.
It is open to all partners, but since it is open to existing partners, we do expect the partners to understand the best practices of deploying machine learning workloads, or for that matter any kind of workload, which should be scalable, secure, and resilient. >>Well, I want to ask you: what's the response been on this launch? Because to me it just makes common sense. Why wouldn't someone take advantage of it, whether the partner has domain expertise or is in a vertical? It makes a lot of sense: you get access to the experts. >>The response has been great. As I said, once you apply, the journey takes six weeks, and we just launched it about two months back, in the second week of September. It is almost two months in, and we have more than 15 partners as part of this program, and I can name a couple of them: for example, we worked with Deloitte, and we will be working on a number of workloads for the end agencies through Deloitte. A number of other partners are making significant progress using this rapid adoption assistance as well, including firms such as Attain and Infinitive. So to answer your question, the response has been great so far. >>So I gotta ask; one of the things I talk with Teresa Carlson and Sandy Carter about all the time is trying to get the accelerant going, whether it's FedRAMP and getting certifications, and you guys have done a great job of getting partners on board. Is there any kind of paperwork? What's the process? What should a partner expect? I'm sure there will be interest beyond just the launch. What's involved? Is it web-based? Is it a form? Is it a lot of hoops to jump through? Explain the process. >>It is a very interesting question.
And it probably is a very important question from a partner's perspective, right? Since it is offered for APN partners, they should have already gone through the APN terms and conditions; they should already have a customer agreement, or advanced partners might have an enterprise agreement. So for leveraging this rapid adoption assistance program, there's absolutely no paperwork involved. All they have to do is log into the web form and fill in the basic information; it comes to us, and we take it from there. There are no hard requirements as long as you're part of the APN network and part of the PSP program. >>Wilfred, great insight. Congratulations on a great program; I think it's going to be a smash hit. Who wouldn't want to take advantage of it? There's a lot of goodness there with Amazon Cloud's higher-level services and the AI and machine learning people can bring to the table. From cybersecurity to education, the range of workloads is going to be phenomenal, and obviously military as well. Totally cool; congratulations. My final question is about the partner. Say I'm a partner; I like this; I jump in, and it's easy to get in. Walk me through what happens. I sign some paperwork, you check the boxes, I get involved. Do I get a rep? Do I do things? Walk me down the path of execution: what's the expectation of what will happen? >>I'll explain that in two parts, John: one from the partner journey perspective, and then from the AWS perspective, what we expect out of partners. From the partner's perspective, as long as they fill out the web form with the basic information about the project they're trying to work on, it comes to us; the workflow is automated, all the information is captured, and the information comes to my team.
We get back to the partners within three days, but the journey itself can take from six to eight weeks because, as I mentioned, in the envisioning phase we try to map the problem to the solution. The enablement phase, the second phase, can take anywhere from two to three weeks because, as I mentioned, we focus on the three layers of the machine learning stack. Certain partners might be interested in SageMaker because they want to build a custom machine learning model, but some partners want to augment existing applications using speech recognition or natural language processing, so we can focus on the high-level services, or we can train them on SageMaker; so it can take anywhere between two and four weeks. And finally, the build phase varies from partner to partner with the complexity of the workload. At that point we're still involved with the partner, but the partner takes the lead, and we stay with them to remove any kind of blockers, be they technical or business. >>Well, I just want to say the word enablement in your title kind of speaks volumes. This isn't just about enabling customers. >>It is all about enabling the end customers through partners, so we focus on enabling partners. They could be big system integrators like Lockheed or Raytheon or Deloitte, or nimble small partners, or a technology partner building an entire PaaS or SaaS service on behalf of the government agencies, or partners that help the government agencies in different verticals. We enable the end agencies through the partners, and the focus of this program is all about partner enablement. >>Wilfred Justin, head of AWS AI and machine learning enablement and partnerships, part of public sector with AWS; this is our special coverage. Wilfred, thanks for coming on and being a CUBE Virtual guest. I wish we could be in person, but this year it's remote. This is theCUBE Virtual.
I'm John Furrier, host of theCUBE. Thanks for watching. >>Thanks a lot, John.
Networks of Optical Parametric Oscillators
>>Good morning. Good afternoon. Good evening, everyone. I should thank Entity Research and the Oshie for putting together this program and also the opportunity to speak here. My name is Al Gore ism or Andy and I'm from Caltech. And today I'm going to tell you about the work that we have been doing on networks off optical parametric oscillators and how we have been using them for icing machines and how we're pushing them toward Cornum. Photonics should acknowledge my team at Caltech, which is now eight graduate students and five researcher and postdocs as well as collaborators from all over the world, including entity research and also the funding from different places, including entity. So this talk is primarily about networks of resonate er's and these networks are everywhere from nature. For instance, the brain, which is a network of oscillators all the way to optics and photonics and some of the biggest examples or meta materials, which is an array of small resonate er's. And we're recently the field of technological photonics, which is trying thio implement a lot of the technological behaviors of models in the condensed matter, physics in photonics. And if you want to extend it even further. Some of the implementations off quantum computing are technically networks of quantum oscillators. So we started thinking about these things in the context of icing machines, which is based on the icing problem, which is based on the icing model, which is the simple summation over the spins and spins can be their upward down, and the couplings is given by the G I J. And the icing problem is, if you know J I J. What is the spin configuration that gives you the ground state? And this problem is shown to be an MP high problem. So it's computational e important because it's a representative of the MP problems on NPR. Problems are important because first, their heart in standard computers, if you use a brute force algorithm and they're everywhere on the application side. 
That's why there is this demand for making a machine that can target these problems and hopefully it can provide some meaningful computational benefit compared to the standard digital computers. So I've been building these icing machines based on this building block, which is a degenerate optical parametric oscillator on what it is is resonator with non linearity in it and we pump these resonate er's and we generate the signal at half the frequency of the pump. One vote on a pump splits into two identical photons of signal, and they have some very interesting phase of frequency locking behaviors. And if you look at the phase locking behavior, you realize that you can actually have two possible face states as the escalation result of these Opio, which are off by pie, and that's one of the important characteristics of them. So I want to emphasize >>a little more on that, and I have this mechanical analogy which are basically two simple pendulum. But there are parametric oscillators because I'm going to modulate the parameter of them in this video, which is the length of the strength on by that modulation, which is that will make a pump. I'm gonna make a muscular. That'll make a signal, which is half the frequency of the pump. >>And I have two of them to show you that they can acquire these face states so they're still face their frequency lock to the pump. But it can also lead in either the zero pie face state on. The idea is to use this binary phase to represent the binary icing spin. So each Opio is going to represent spin, which can be >>either is your pie or up or down, >>and to implement the network of these resonate er's. We use the time off blood scheme, and the idea is that we put impulses in the cavity, these pulses air separated by the repetition period that you put in or t R. 
And you can think about these pulses in one resonator, xaz and temporarily separated synthetic resonate Er's If you want a couple of these resonator is to each other, and now you can introduce these delays, each of which is a multiple of TR. If you look at the shortest delay it couples resonator wanted to 2 to 3 and so on. If you look at the second delay, which is two times a rotation period, the couple's 123 and so on. If you have any minus one delay lines, then you can have any potential couplings among these synthetic resonate er's. And if I can introduce these modulators in those delay lines so that I can strength, I can control the strength and the phase of these couplings at the right time. Then I can >>have a program. We'll all toe all connected network in this time off like scheme. >>And the whole physical size of the system scales linearly with the number of pulses. So the idea of opium based icing machine is didn't having these o pos. Each of them can be either zero pie, and I can arbitrarily connect them to each other. And then I start with programming this machine to a given icing problem by just setting the couplings and setting the controllers in each of those delight lines. So now I have a network which represents an icing problem thin the icing problem maps to finding the face state that satisfy maximum number of coupling constraints. And the way it happens is that the icing Hamiltonian maps to the linear loss of the network. And if I start adding gain by just putting pump into the network, then the OPI ohs are expected to oscillating the lowest, lowest lost state. And, uh and we have been doing these in the past, uh, six or seven years and I'm just going to quickly show you the transition, especially what happened in the first implementation which was using a free space optical system and then the guided wave implementation in 2016 and the measurement feedback idea which led to increasing the size and doing actual computation with these machines. 
So I just want to make this distinction here that, um the first implementation was on our optical interaction. We also had an unequal 16 implementation and then we transition to this measurement feedback idea, which I'll tell you quickly what it iss on. There's still a lot of ongoing work, especially on the entity side, to make larger machines using the measurement feedback. But I'm gonna mostly focused on the all optical networks and how we're using all optical networks to go beyond simulation of icing. Hamiltonian is both in the linear and >>nonlinear side and also how we're working on miniaturization of these Opio networks. So >>the first experiment, which was the four Opium machine it was a free space implementation and this is the actual picture of the machine and we implemented a small and it calls for Mexico problem on the machine. So one problem for one experiment and we ran the machine 1000 times, we looked at the state and we always saw it oscillate in one of these, um, ground states of the icing laboratoria. Yeah, so then the measurement feedback idea was to replace those couplings and the controller with the simulator. So we basically simulated all those coherent interactions on on FB g A. And we replicated the coherent pulse with respect to all those measurements. And then we injected it back into the cavity and on the near to you still remain. So it still is a non. They're dynamical system, but the linear side is all simulated. So there are lots of questions about if this system is preserving important information or not, or if it's gonna behave better Computational wars. And that's still ah, lot of ongoing studies. But nevertheless, the reason that this implementation was very interesting is that you don't need the end minus one delight lines so you can just use one, and you can implement a large machine, and then you can run several thousands of problems in the machine, and then you can compare the performance from the computational perspective. 
Looks so I'm gonna split this idea of opium based icing machine into two parts One is the linear part, which is if you take out the non linearity out of the resonator and just think about the connections. You can think about this as a simple matrix multiplication scheme, and that's basically >>what gives you the icing Hamiltonian model A. So the optical loss of this network corresponds to the icing Hamiltonian. >>And if I just want to show you the example of the n equals for experiment on all those face states and the history Graham that we saw, you can actually calculate the laws of each of those states because all those interferences in the beam splitters and the delay lines are going to give you a different losses. And then you will see that ground states corresponds to the lowest laws of the actual optical network. If you add the non linearity, the simple way of thinking about what the non linearity does is that it provides to gain, and then you start bringing up the gain so that it hits the loss. Then you go through the game saturation or the threshold which is going to give you this phase bifurcation. >>So you go either to zero the pie face state, and the expectation is that this the network oscillates in the lowest possible state, the lowest possible loss state. >>There are some challenges associated with this intensity Durban face transition, which I'm going to briefly talk about. I'm also going to tell you about other types of non their dynamics that we're looking at on the non air side of these networks. So if you just think about the linear network, we're actually interested in looking at some technological behaviors in these networks. And the difference between looking at the technological behaviors and the icing uh, machine is that now, First of all, we're looking at the type of Hamilton Ian's that are a little different than the icing Hamilton. 
And one of the biggest difference is is that most of these technological Hamilton Ian's that require breaking the time reversal symmetry, meaning that you go from one spin to on the one side to another side and you get one phase. And if you go back where you get a different phase, and the other thing is that we're not just interested in finding the ground state, we're actually now interesting and looking at all sorts of States and looking at the dynamics and the behaviors of all these states in the network. So we started with the simplest implementation, of course, which is a one d chain of thes resonate er's which corresponds to a so called ssh model. In the technological work, we get the similar energy to los mapping. And now we can actually look at the band structure on. This is an actual measurement >>that we get with this associate model and you see how it reasonably how how? Well, it actually follows the prediction and the theory. >>One of the interesting things about the time multiplexing implementation is that now you have the flexibility of changing the network as we were running the machine. And that's something unique about this time multiplex implementation so that we can actually look at the dynamics. And one example >>that we have looked at is we can actually go to the transition off going from top a logical to the to the standard nontrivial. I'm sorry to the trivial behavior of the network. >>You can then look at the edge states and you can also see the trivial and states and the technological at states actually showing up in this network. We have just recently implement on a two D, >>uh, network with Harper Hofstadter model when you don't have the results here. But we're one of the other important characteristic of time multiplexing is that you can go to higher and higher dimensions and keeping that flexibility and dynamics. 
And we can also think about adding nonlinearity in both the classical and quantum regimes, which is going to give us a lot of exotic classical and quantum nonlinear behaviors in these networks. So I told you mostly about the linear side; let me just switch gears and talk about the nonlinear side of the network. And the biggest thing that I talked about so far in the Ising machine is this phase transition, that threshold. So below threshold we have squeezed states in these OPOs; if you increase the pump, we go through this intensity-driven phase transition, and then we get the phase states above threshold. And this is basically the mechanism of the computation in these OPOs, which is through this phase transition from below to above threshold. So one of the characteristics of this phase transition is that below threshold you expect to see quantum states, and above threshold you expect to see more classical states, or coherent states, and that's basically corresponding to the intensity of the driving pump. So it's really hard to imagine that you can go above threshold, or have this phase transition happen, all in the quantum regime. And there are also some challenges associated with the intensity homogeneity of the network: for example, if one OPO starts oscillating and its intensity goes really high, then it's going to ruin this collective decision-making of the network, because of the intensity-driven nature of the phase transition. So the question is, can we look at other phase transitions? Can we utilize them for computing? And also, can we bring them to the quantum regime? I'm going to specifically talk about the phase transition in the spectral domain, which is the transition from the so-called degenerate regime, which is what I mostly talked about, to the non-degenerate regime, which happens by just tuning the phase of the cavity.
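A crude classical caricature of the intensity-driven transition and the collective decision-making can be written down in a few lines. This is a hand-built mean-field sketch with made-up parameters, not the speaker's model: each OPO amplitude sees linear gain, cubic gain-saturation loss, and linear coupling, and above threshold the signs of the steady-state amplitudes are read out as Ising spins.

```python
import numpy as np

def simulate_opo_network(J, pump, coupling, steps=3000, dt=0.01, seed=0):
    # Illustrative mean-field amplitude dynamics:
    #   da_i/dt = (pump - 1 - a_i^2) * a_i + coupling * sum_j J[i, j] * a_j
    # Below threshold (pump < 1) amplitudes decay; above it they bifurcate
    # into +/- states whose signs we read out as spins.
    rng = np.random.default_rng(seed)
    a = 1e-3 * rng.standard_normal(len(J))
    for _ in range(steps):
        a = a + dt * ((pump - 1.0 - a**2) * a + coupling * (J @ a))
    return np.sign(a)

# Hypothetical two-OPO "antiferromagnet": the coupling favours opposite phases.
J = np.array([[0.0, -1.0],
              [-1.0, 0.0]])
spins = simulate_opo_network(J, pump=1.5, coupling=0.1)
```

With this antiferromagnetic coupling the anti-aligned mode has the larger net gain, so the two OPOs settle into opposite phase states, a toy version of the network oscillating in its lowest-loss configuration.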
And what is interesting is that this phase transition corresponds to a distinct phase-noise behavior. So in the degenerate regime, which we call the ordered state, the phase is locked to the phase of the pump, as I talked about. In the non-degenerate regime, however, the phase is mostly dominated by the quantum diffusion of the phase, which is limited by the so-called Schawlow-Townes limit, and you can see that transition from the degenerate to the non-degenerate regime, which also has distinct symmetry differences. This transition corresponds to a symmetry breaking: in the non-degenerate case, the signal can acquire any of the phases on the circle, so it has a U(1) symmetry, and if you go to the degenerate case, then that symmetry is broken and you only have the zero and pi phase states. So now the question is: can we utilize this phase transition, which is a phase-driven phase transition, and can we use it for a similar computational scheme? That's one of the questions that we're also thinking about. And this phase transition is not just important for computing; it's also interesting for its sensing potential. You can easily bring this phase transition below threshold and just operate it in the quantum regime, either Gaussian or non-Gaussian. If you make a network of OPOs now, we can see all sorts of more complicated and more interesting phase transitions in the spectral domain. One of them is a first-order phase transition, which you get by just coupling two OPOs, and that's a very abrupt phase transition compared to the single-OPO phase transition. And if you do the couplings right, you can actually get a lot of non-Hermitian dynamics and exceptional points, which are actually very interesting to explore in both the classical and quantum regimes. And I should also mention that you can think about the couplings being nonlinear couplings as well.
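The contrast between the pump-locked (degenerate) phase and the freely diffusing (non-degenerate) phase can be illustrated with a toy random-walk model. This is purely schematic, with made-up numbers, not a quantitative Schawlow-Townes calculation: without a restoring term the phase variance grows linearly in time, while a locking term keeps the jitter bounded.

```python
import numpy as np

def phase_variance(n_steps, n_traj, diffusion, restoring=0.0, seed=1):
    # Discrete-time phase evolution over an ensemble of trajectories:
    #   phi -> (1 - restoring) * phi + Gaussian kick of variance `diffusion`.
    # restoring = 0 models the free-running non-degenerate phase (pure
    # diffusion); restoring > 0 models locking to the pump phase.
    rng = np.random.default_rng(seed)
    phi = np.zeros(n_traj)
    var = []
    for _ in range(n_steps):
        phi = (1.0 - restoring) * phi \
              + np.sqrt(diffusion) * rng.standard_normal(n_traj)
        var.append(phi.var())
    return np.array(var)

free = phase_variance(500, 2000, diffusion=0.01)                   # grows ~ D * t
locked = phase_variance(500, 2000, diffusion=0.01, restoring=0.1)  # saturates
```

The diffusing phase ends with a variance near diffusion times the number of steps, while the locked phase saturates at a small stationary value, which is the qualitative signature separating the two regimes.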
And that's another behavior that you can see, especially in the nonlinear, non-degenerate regime. So with that, I basically told you about these OPO networks, how we can think about the linear scheme and the linear behaviors, and how we can think about the rich nonlinear dynamics and nonlinear behaviors in both the classical and quantum regimes. I want to switch gears and tell you a little bit about the miniaturization of these OPO networks. And of course, the motivation is: if you look at electronics and what we had 60 or 70 years ago with vacuum tubes, and how we transitioned from relatively small-scale computers on the order of thousands of nonlinear elements to the billions of nonlinear elements where we are now, with optics we are probably very similar to 70 years ago, which is a tabletop implementation. And the question is, how can we utilize nanophotonics? I'm going to just briefly show you the two directions that we're working on. One is based on lithium niobate, and the other is based on even smaller resonators. So the work on nanophotonic lithium niobate was started in collaboration with Marko Loncar at Harvard, and also Marty Fejer at Stanford. And we could show that you can do the periodic poling in thin-film lithium niobate and get all sorts of very highly nonlinear processes happening in this nanophotonic, periodically poled lithium niobate. And now we're working on building OPOs based on that kind of nanophotonic lithium niobate; these are some examples of the devices that we have been building in the past few months, which I'm not going to tell you more about. But the OPOs and the OPO networks are in the works, and that's not the only way of making large networks.
But also I want to point out that the reason these nanophotonic OPOs are actually exciting is not just that you can make large networks and make them compact, in a small footprint; they also provide some opportunities in terms of the operation regime. One of them is about making cat states in OPOs: can we have the quantum superposition of the zero and pi phase states that I talked about? And the nanophotonic lithium niobate would provide some opportunities to actually get closer to that regime, because of the spatio-temporal confinement that you can get in these waveguides. So we're doing some theory on that, and we're confident that the type of nonlinearity-to-loss ratio that you can get with these platforms is actually much higher than what you can get with other existing platforms. And to go even smaller, we have been asking the question of what the smallest possible OPO is that you can make. Then you can think about really wavelength-scale resonators, adding the chi(2) nonlinearity, and seeing how and when you can get the OPO to operate. And recently, in collaboration with USC and CREOL, we have demonstrated that you can use nanolasers and get some spin-Hamiltonian implementations on those networks. So if you can build OPOs, we know that there is a path for implementing OPO networks on such a nanoscale. So we have looked at these calculations and tried to estimate the threshold of OPOs, let's say for a microresonator, and it turns out that it can actually be even lower than the type of bulk PPLN OPOs that we have been building in the past 50 years or so. So we're working on the experiments, and we're hoping that we can actually make even larger and larger-scale OPO networks.
So let me summarize the talk. I told you about the OPO networks and our work that has been going on with Ising machines and the measurement feedback. I told you about the ongoing work on the all-optical implementations, both on the linear side and also on the nonlinear behaviors. And I also told you a little bit about the efforts on miniaturization and going to the nanoscale. So with that, I would like to stop here, and thank you for your attention.
Haiyan Song & Oliver Friedrichs, Splunk | Splunk .conf2019
>>Live from Las Vegas, it's theCUBE, covering Splunk .conf19. Brought to you by Splunk.
It's how we're using a new product called Mission Control to bring everything all together. >>I want to get into the Mission control because I love that announcement. Just love The name was behind it, but staying on the sweet when they're talking about it's a portfolio. One of the things that's been consistent every year at dot com of our coverage and reporting has been wth e evolution of a platform on enabling platform. So has that evolves? What does the guiding principles remain? The same. How you guys sing because now you're shipping it. It's available. It's not just a point. Product is a portfolio and an ecosystem falling behind it. You know the APP, showcase, developer, Security and Compliance Foundation and platforms on Just I T ops and A I ops are having. So you have a variety of things coming out of for what's the guiding principle these days is continuing to push the security. You share the vision >>guiding principle and division. It's really way believe the world. As we digitize more as everything's happening, machines speed as people really need to go to analytics to bring insides into things and bring data into doing that's that's really turning that into doing so. It's the security nerve center vision that continue guide what we do, and we believe Security nerve center needs really data analytics and operations to come together and again, I'm gonna tell you, Mission Control is one of the first examples that we bring all of the entire stack together and you talk about ecosystem. It takes a village is a team sport. And I'm so excited to see everybody here. And we've done a lot of integrations as part of sweets to continue to mature more than 1900 AP I integrations more than 300 APS. Justice Phantom alone. That's a lot of automated actions. People can take >>the response from the people in the hallways and also the interviews have been very positive. I gotta get to Mission Control. Phantom was a huge success. 
You're a big part of building taking that into the world now. Part was flung. Mission Control. Love the name Mission Control. This is the headline, by the way, Splunk Mission Control takes off super sharp itching security operations. So I think Mission Control, I think NASA launching rockets Space X Really new innovation. Really big story behind his unification. You share where this came from, what it is what's in the announcement? >>Yeah. So this is all about optimizing how sock analysts actually work. So if you think about it, a sock typically is made up of literally a dozen different products and technologies that are all different consuls, different vendors, different tabs in your Web browser, so it for an analyst to do their job literally pivoting between all of these consoles. We call it swivel chair syndrome, like you're literally are frantically moving between different products. Mission Control ties those together, and we started by tying slugs products together. So we allow you to take our sin, which is enterprise security, or you be a product's monkey. Be a and phantom, which is our automation and orchestration platformer sore platform and manage them and integrate them into one single presentation layer to be able to provide that unified sock experience for the analyst So it it's an industry first, but it also boosts productivity. Leading analysts do their job more effectively to reduce the time it takes. So now you're able to both automate, investigate and detect in one unified presentation, layer or work surface. >>You know, the name evokes, you know, dashboards, NASA. But what that really was wasn't an accumulation, an extraction of data into service air, where people who were analysts do their job and managed launching rockets. But I want to ask you a question. 
Because of this, all is based on the underpinnings of massive amounts of volume of data and the old expression Rising tide floats all boats also is rising tide floats, Maur adversaries ransomware attacks is data attacks are everywhere. But also there's value in that data. So as the data volume grows, this is a big deal. How does mission Control help me manage to take advantage of that all you How do you guys see that playing out? >>Yes, Emission control really optimizes the time it takes to resolving incident. Ultimately, because you're able to now orient all of your investigation around a single notable event eso It provides a kn optimal work surface where an analyst can see the event interrogated, investigated triage, they can collaborate with others. So if I want to pull you into my investigation, we can use a chat ops that capability, whether it's directly in mission control or slack integration waken manage a case like you would with a normal case management toe be ableto drive your incident to closure, leveraging a case template. So if I want to pull in crisis communications team my legal team, my external forensics team, and help them work together as well. Case management lets me do that in triage that event. It also does something really powerful. High end mentioned. The operations layer the analytics in the data layer. Mission Control ties together the operational layer where you and I are doing work to the data layer underneath. So we're able to now run worries directly from our operational layer into the data layer like SPL quarries, which spunk is built on from the cloud where Mission Control is delivered from two on premise Face Plunk installations So you could have Michigan still running in the Cloud Splunk running on premise, and you could have multiple Splunk on premise installs. You could have won in one city, another one in another city or even another country. 
You could have a Splunk instance in the Cloud, and Mission Control will connect all of those tying them together for investigative purposes. So it's very powerful. >>That's a first huge, powerful when this comes back to the the new branding data to everywhere, and I see the themes everywhere, the new colors, new brake congratulations. But it's about things. What do ours doing stuff, thinking and making things happen. Connecting these layers not easy, okay? And diverse data is hard. Thio get access to, but diverse data creates great machine learning. Ay, ay, ay, ay, ay creates great business value. So way see a flywheel development and you guys got going on here. Can you elaborate on that? Dated everywhere And why this connective tissue that you're talking about is so important? Is it access to the war data? Is that flywheel happening? How do you see that playing out? >>I'll start with that because they were so excited where data to everything company or new tagline is turning data into doing. And this wouldn't be possible without technologies like Phantom coming in right way have traditionally been doing really great with enterprise was data platforms. And with an Alex now was phantom. We can turn that into doing now with some of the new solutions around data stream processing. Now we're able to do a lot of things in real time. On you mentioned about the scale, right scales changes everything. So for us, I think we're uniquely positioned in this new age of data, and it's exploding. But we have the technology to help your payment, and it's representing your business way. Have the analytics to help you understand the insights, and it's really the ones gonna impact day today enabling your business. And we have two engine to help you take actions. That's the exciting part. >>Is that what this flywheel, because diverse data is sounds great, makes sense more data way, see better? The machines can respond, and hopefully there's no blind spots that creates good eye. 
That kind of knows that if they're in data, but customers may not have the ability to do that. I think that's where the connecting these platforms together is important, because if you guys could bring on the data, it could be ugly data on his Chuck's data data, data, data. But it's not always in the form you need. Things has always been a challenge in the industry. How do you see that Flywheel? Yeah, developing. >>Yeah, I think one of the challenges is the normalization of the data. How do you normalize it across vendors or devices, you know. So if I have firewalls from Cisco, Palo Alto Checkpoint Jennifer alive, that day is not the same. But a lot of it is firewall blocked data, for example, that I want to feed into my SIM or my data platform and analyze similarly across endpoint vendors. You know you have semantic McAfee crowdstrike in all of these >>vendors, so normalization >>is really key and normalizing that data effectively so that you can look me in at the entire environment as a single from a single pane of glass. Essentially, that's response does really well is both our scheme on reed ability to be able to quarry that data without having a scheme in place. But then also, the normalization of that data eyes really key. And then it comes down to writing the correlation searches our analytics stories to find the attacks in that data. Next, right. And that's where we provide E s content updates, for example, that provide out of the box examples on how to look for threats in that data. >>So I'm gonna get you guys reaction to some observations that we've made on the Q. In the spirit of our cube observe ability we talked to people are CEOs is si sos about how they cloud security from collecting laws and workloads, tracking cloud APS and on premise infrastructure. And we ask them who's protecting this? Who is your go to security vendors? 
It was interesting because Cloud was in their cloud is number one if it's cloud are not number one, but they used to clear rely on tools in the cloud. But then, when asked on premise, Who's the number one? Splunk clearly comes up and pretty much every conversation. Xanatos. Not a scientific survey, it's more of it handpicks. But that means it's funk is essentially the number one provider with customers in terms of managing those workloads logs across ABS. But the cloud is now a new equation because now you've got Amazon, Azur and Google all upping their game on cloud security. You guys partner with it? So how do you guys see that? How do you talk cutters? Because with an enabling platform and you guys are offering you're enabling applications. Clouds have Apple case. So how do you guys tell that story with customers? Is your number one right now? How do you thread that needle into this explosive data in the cloud data on premise. What's the story? >>So I wish you were part of our security super session. We actually spent a lot of energy talking about how the cloud is shifting the paradigm paradigm of how software gets billed, deployed and consumed. How security needs to really sort of rethink where we start, right? We need to shift left. We need to make sure that I think you use the word observe ability, right? T you got to start from there. That's why as a company we bought, you know, signal effects and all the others. So the story for us is start from our ability to work with all the partners. You know, they're all like great partners of ours AWS and G, C, P and Microsoft. In many ways, because ecosystem for cloud it's important. We're taking cloud data. We're building cloud security models. Actually, a research team just released that today. Check that out and we'll be working with customers and building more and more use cases. Way also spend a lot of time with her. 
See, So customer advisory council just happened yesterday talking about how they would like us to help them, and part of that they were super super excited. The other part is what we didn't understand how complicated this is. So I think the story have to start in the cloudy world. You've gotto do security by design. You gotta think about automation because automation is everywhere. How deployment happens. I think we're really sit in a very interesting intersection off that we bring the cloud and on prime together >>the mission, See says, I want to get cameras in that room. I'm sure they don't want any cameras in the sea. So room Oliver taking that to the next level. It's a complexity is not necessarily a bad thing, because software contract away complexity is from the history of the computer industry that that's where innovation could happen, taking away complexity. How do you see that? Because Cloud is a benefit, it shouldn't be a hindrance. So you guys were right in the middle of this big wave. What? You're taking all this? >>Yeah. Look, I think Cloud is inevitable. I would say all of our customers in some form or another, are moving to the cloud, so our goal is to be not only deliver solutions from the cloud, but to protect them when they're in the cloud. So being able to work with cloud data source types, whether it's a jury, w s, G, C P and so on, is essential across our entire portfolio, whether it's enterprise security but also phantom. You know, one exciting announcement that we made today is we're open sourcing 300 phantom maps and making making him available with the Apache to get a license on get hubs so you'll be able to take integrations for Cloud Service is, like many eight of US service is, for example, extend them, share them in the community, and it allows our customers to leverage that ecosystem to be able to benefit from each other. 
So cloud is something that we work with not only from detection getting data in, but then also taking action on the cloud to be. Will it protect yourself? Whether it's you, I want to suspend an Amazon on your instance right to be able to stop it when it's when it's infected. For example, right those air it's finishing that whole Oodle Ooh and the investigate monitor, analyze act cycle for the cloud as we do with on from it. >>I think you guys in a really good position again citizen 2013. But I think my adjustment today would be talking to Andy Jackson, CEO of AWS. He and I always talk all the time around question he gets every year. Is Amazon going to kill the ecosystem? Runs afraid Amazon, he says. John. No, we rely on third party. Our ecosystem is super important. And I think as on premises and hybrid cloud becomes so critical. And certainly the Io ti equations with industrial really makes you guys really in a good position. So I think Amazon would agree. Having third party if you wanna call it that. I mean, a supplier is a critical linchpin today that needs to be scalable, >>and we need equal system for security way. You know, you one of the things I shared is really an asymmetric warfare. Where's the anniversary? You talk about a I and machine learning data at the end of the day is the oxygen for really powering that arm race. And for us, if we don't collaborate as ecosystem, we're not gonna have a apprehend because the other site has always say there's no regulations. There's no lawyers they can share. They can do whatever. So I think as a call to action for our industry way, gotta work together. Way got to really sort of share and events or industry together. >>Congratulations on all the new shipping General availability of E s six point. Oh, Phantoms continue to be a great success. You guys on the open source got an APB out there? You got Mission Control. Guys, keep on evolving Splunk platform. You got ABS showcase here. Good stuff. 
>>Beginning of the new date. Excited. >>We're riding the waves together with Splunk. Been there from day one, actually 30 year in but their 10th year dot com our seventh year covering Splunk. I'm John Ferrier. Thanks for watching. We'll be back with more live coverage. Three days of cube coverage here in Las Vegas. We'll be right back.
Betsy Sutter, VMware | Women Transforming Technology 2019
>> From Palo Alto, California, it's theCUBE. Covering VMware Women Transforming Technology 2019. Brought to you by VMware. >> Hi, Lisa Martin, on the ground with theCUBE at VMware in Palo Alto, California at the fourth annual Women Transforming Technology event, WT-squared. Love this event. So excited to welcome back to theCUBE Betsy Sutter, VMware's Chief People Officer. Betsy, this event is incredible, year after year. >> Yeah. >> How do you do it? >> I don't do it. A team of people does it. But I love it, and I love that you're here. You're as passionate about this as I am. Our fourth! And this one is bigger and better than ever. I love it. And, you know, it's really all about just connecting women so we can continue to innovate and shape the future. So, super fun! >> It is super fun. One of the things that I love is that as soon as you walk onto the campus in the morning, ahead of the event, even walking up to registration, you can feel positivity, sharing, collaboration, experiences being shared. This community movement-- you literally can feel it. And then we walked in to your opening keynote this morning. >> Yeah, wasn't she amazing? Joy Buolamwini. >> Wow. Amazing. What she was sharing. Breakthrough data on all the biases that are being built into just facial recognition software alone. >> Yeah. >> Her passion for highlighting the bias and then identifying it and then mitigating it, that passion was not only coming from her, but the entire audience. In person, and I imagine the livestream just got it, too. >> Yeah. You know, she is amazing. I mean, she's an innovator. I mean, she's a brainiac. She's funny, she's artsy. But she's an innovator. But what's interesting about her is she's an inclusive innovator. Right? It's all about inclusion and I love her approach to this.
I just spent an hour with her in a Fireside Chat where a number of us got to have a conversation with her and she's about as interesting as anybody I've ever met in terms of where she's taking this research so that she can create, just a better world. >> And she's doing that. One of the things that was, the word inclusivity kind of popped up, and intersectionality, a number of times, where she was showing data, AI data, from Microsoft, IBM, Face++, and just showing the massive differences in those data sets alone, so the whole inclusivity theme was very paralleled, in my opinion, but she's actually getting these companies to start evaluating their data sets to change that so that Oprah Winfrey, for example, face recognition doesn't come up as a male. >> That's right. Yeah, she has done some interesting, interesting work, and she's not approaching it as if it's a race issue in particular, right. She's taking a completely different, very positive approach, to highlighting a real problem. I mean, we knew that inclusion is a challenge in technology, but inclusion in artificial intelligence is by far worse, and I love it that she's unpacking that. >> I also love that, as a marketer, I loved how she formed the Algorithmic Justice League. >> Right. >> I couldn't think of a better name, myself. But that she's seeing three tenets of that. One is highlight the bias. >> That's right. >> And I thought, that's awareness. There needs to be more awareness of that because my mind was blown seeing these models today, and then she brings in Amazon and shows them, look at your data sets. >> Right. >> And so there needs to be more awareness, consistent awareness, it's kind of classic marketing of, there are a lot of challenges, but AI is so pervasive, I can imagine a lot of baby boomers probably have iPhones with facial recognition and don't understand, wow, even that, unlocking my phone, is a problem. How deep does this go across emerging technologies that are being developed today? 
>> That's right. And then she just talks about, in such broad terms, I mean she has a global mind around the social impact that this is having, whether it's in artwork, whether it's in self-driving car technologies, whatever it is. I mean, it's huge. And she's able to kind of look out and think about it in that light. And given the work that we're doing at VMware around inclusion and diversity, it's kind of a fresh new angle to really unpacking the layers of complexity that face these issues. >> Yeah, you're right. That was a thing that also caught my attention was there were so many layers of bias. >> Yeah, yeah. >> We can think of, you know, the numbers of women, or lack thereof, in technology. One of the things that Joy said, kind of along the parallels of layers was, the under-represented majority, as she says, it's women and people of color. >> That's right. >> It's layer upon layer upon layer. >> It is. >> Wow. Just cracking the surface. >> She's just scratching things, but the way she's doing her approach, I think, just brings a whole new light to this. I'm very grateful that she was able to speak to all of us, right. It's really about bringing women together to have these kinds of conversations so we can start to think about how we want to innovate and shape the future. She also touches on just this aspect of communities, which I love. And, you know, I've long said that people join communities, not companies, per se, and one of the things that we've done at VMware is tried to think about how do you create an inclusive culture, if you will, that embraces all sorts of communities. And Joy just started talking about a whole new dimension to how we think about that, which was fun. >> So you have been at the helm of people at VMware for a long time. >> I have. >> Lots of transformation. >> Yeah. 
>> I'm curious to get your, if you look back at the last four years now of WT-squared, how have you learned from even just speakers like Joy and helped to transform not just WT-squared but VMware, its diversity and inclusion efforts in and of themself? >> Yeah, you know, one of the things that I love about VMware and I love about WT-squared is that it's really a consortium or a collective of companies coming together, so this is not a VMware branded event, or a VMware event just by itself. It's just a collective. And then we try and broaden that circle so we can have more and more conversation. And I think that's what I'm most pleased with, I mean, we work hard at making sure that this collective is involved from the get-go in terms of, what do we want to talk about, so we can have the real and relevant conversations about inclusion and diversity, especially as women in tech, which, in some regards, is getting better, but in many, it's just not, and so how do you double down on that in an authentic way and really get business results. >> Exactly. It's all about getting business results. >> It is. >> One of the things that surprises me, in some cases, is when you see, whether it's from McKenzie or whatnot, different studies that show how much more profitable businesses are with women at the executive levels, and it just, that seems like a no-brainer, yet there's so many, the lack of women in technology, but also the attrition rates. >> Yeah. >> Really staggering, if you look at it, compared to any other industries. >> That's right. And, you know, we have a longstanding relationship with Stanford. >> Yes. >> The Clayman Institute. VMware helped found the VMware Stanford Women's Leadership Innovation Lab, which I'm exceedingly proud of. But, yeah, research shows this over and over. 
But one of the things that I love about my work is bridging that into how corporations operate and how people just work at work, and so that keeps me intellectually engaged, I'll say that, for sure. But, yeah, that is the big challenge. >> I'm also, what I love, just observing the attendees at the event, is you see all age levels. >> Yeah, I love that, too. >> And you have the tracks, the Emerging Leaders track for those who are younger, earlier in their career, The Executive track, the Technical track, and you've got a track about of sharing best practices, which I also love, or just hearing stories of, "How did you face this obstacle, maybe it wasn't, that didn't cause you to turn, or to leave the industry?" I think those are so important to help share. "Oh my God, I'm going through the same thing," for example. But might just help the next, or not just the next generation, but even those of us who might be middle-career from not leaving and going, "Okay, maybe it's the situation, I need to get into a different department, a different company, but I love technology and I'm going to stay no matter what." >> Yeah. Keeping those conversations elevated is one aspect of this, but then to your point, the cross-pollination of all these different kinds of women and what they've experienced in tech, the panel today was amazing, right. We had Ray, we had Lisa, and we had Susan. All different perspectives, different generations, but talking about sort of their challenges as they've navigated this, and where they all want to see it go. So I do think there's a bit of a common vision for where we want this to go, which is wonderful, but bringing all these different perspectives is the differential. And that's what we do here. We try and replicate that. 
And what will happen all through the day as I go to those different tracks, I'll hear from these different women and the questions are always just a blast to hear, right, because I learn so much from what's top-of-mind, what's keeping people up at night as they venture into tech and continue into tech. >> Anything in particular that surprises you? >> You know, one young woman asked me about my concern around communication and interaction because of how technology's affected how people do that-- rarely face-to-face like you and I are right now. And there're so many other visual and sensory cues that go into having a conversation with another human being, so we had a great conversation about what's good about it from a technology standpoint, and what's bad about it, and I think that's actually what Joy was talking about in her talk today, as well. But I was pleased that a very young person asked me that question. I know people of my generation, we talk about it, but it was fun to hear, kind of inspiring to hear a younger person say, "Is this all good?" >> Well and you're right, it probably was a nice, pleasant, refreshing surprise because we think of younger generations as, kind of, you say, cloud-native or born of the cloud, born on the phone, who are so used to communicating through different social media platforms. To hear that generation saying, you know, or even bringing it to our attention, like, "Shouldn't we be actually talking in person or by using technology like video conferencing and zoom things for engaging?" Think of how many people wouldn't fall asleep in meetings if video conferencing was required? >> That's right. That's exactly right. And another woman, a little further along in her career, what was weighing on her was how she stayed being a responsible and ethical person when she doesn't really know all the ingredients of what she's helping to create. And that's just a mindset that I haven't heard before. I thought that was wonderful. >> That is. 
Because we often talk about responsibility and accountability with respect to data science or AI, for example. It's interesting to hear an individual contributor talking about, "Where do I fall in that accountability/responsibility spectrum?" Is not a common question. >> No, and you know, we think we're creating a world of more transparency but, really, when you're coding you're not really sure what might happen with that code. And I thought Susan Fowler did a lovely job talking about that today on the panel, as well. That there's a huge responsibility in terms of what you're doing. So connecting those dots, understanding all the ingredients, I think corporations like VMware, and VMware does this in large part today, it gets harder, it's more complex, but we're going to have to answer those questions about what kind of pie or cake are we really baking with this, right? >> Exactly. Exactly. Could you have, if you looked back to when you first joined VMware, envisioned all of the transformation and the strength in community and numbers that you're helping to achieve with women transforming technology? >> I really couldn't. I mean, the industry is amazing, you know, I was at the right place at the right time and got to ride this tech wave. It's been great. No, I couldn't have imagined it, and now things are moving at an unprecedented place, things are much more complex. I have to call my adult children to get input onto this, that, and the other. >> (laughs) >> But no, it is a dream come true. It's been an absolute honor and privilege for me to be a part of this. I love it. >> When you talk with VMware partners or customers, are they looking to-- Betsy, how have you been able to build this groundswell and maintain it? >> Yeah, you know, my focus is primarily on the culture and the environment of the company, and I'm a really good listener. So that's the key. >> It is key. 
You just listen and pay attention to what people are saying, what matters to them, what's bothering them, and you continue to hold on to, sort of, those, you know, those North Stars of what you're trying to build and I always knew that I wanted to build the sustainable cultures, something that would last the test of time. So we're at 21 years. I've done 19 of them, so it's been great. You know, but you want to make sure you keep that rebar in the ground as you continue to build up. This community is solid. They're doin' it. Yeah, it's great. >> And it must be receptive. We talked about companies or leaders or businesses being receptive to change. I think I talked about that with Caroline and Shannon, who were part of that panel, and said, you know, oftentimes, we're talking with leaders, again, business units, companies, who aren't receptive to that change. Cultural change is really difficult, but it's essential. I was talking with Michael Dell a few months ago at Boomi World and said, "How have you managed as Dell has grown so massively to change the culture in a way that, you know, enables that growth?" It's a really hard thing to do. But for companies to do digital transformation and IT transformation, the culture, the people have to be receptive. I think, to one of your strengths, they have to be willing to listen. >> Yeah. And you never really arrive, right. So you constantly are in beta mode in the world, and so if you never assume that you've arrived, then you can pause, or that you just constantly want to beta things, then you have an edge, and I think Michael Dell's clearly got vision around that, right. I know Pat Gelsinger does, too. And so I like just partnering with those great minds, those great business and strategic minds, and then just building on the people component or the cultural component. But I, too, I'm constantly trying to produce new products and pay attention to what the customer wants. 
>> When you see things in the news like some of the harassment issues, say, for example, that Uber has experienced, I imagine you're watching the news or reading it and you're thinking, if I could just say three things to those people. When you see things like that, what are the top three things you would recommend that, not in reaction, though, but how can that culture change to deliver the customer experience, ultimately, that they need to, but what are some of the things that you think, these are easy fixes? >> Yeah, I think in watching a lot of my companies in the industry and how they've responded, for me, my advice would be, you should elevate that conversation. That conversation's not going to go away. And so you need to elevate it, give it a lot of sunlight and oxygen, really understand it, don't try and move away from it, don't push it down. And that's something we do at VMware, we're constantly elevating the conversation. One of the things I love about this culture, it's made me a lot better at what I do, is I can always answer the question, "Why are we doing that?" And so that's, why are we doing that? And if I can't answer why, we have a problem. And a why just sort of symbolizes intellectual curiosity, right, so that's what we're trying to keep alive and that's what I tell my other colleagues in the industry is just keep that conversation going: there's no quick fix to this, people are complex, don't pretend you really know. So elevate it and let's get to really know each other a lot better. >> And there's so much good that can come from any sort of blight or negativity, there really is, but you're right. Especially in this day and age, with everything being on camera, you can't hide. >> And, you know, it's okay to admit that you made a mistake. >> I agree. >> It's really okay. And so there's something about that that we've got to get back. 
>> I think it's one of the most admirable things of any human trait or corporation is just admitting, ah, this was the wrong turn, >> Right. >> I said the wrong thing. >> You know what, we made a mistake. We've course-corrected. >> I'm human. >> Yes. >> Exactly. >> Exactly. >> So we talked about Joy opening things off today and Ashley Judd-- >> I know, I can't wait. >> I bet you can't wait. She is the closing keynote. What are the things that inspire you about Ashley's work? >> I just think that she's wicked-smart. And I think she's using her platform in a really powerful way. And for her to want to come here and speak to us just reflects her passion, and the juxtaposition of Joy with Ashley is fabulous, right. Really gives you a lot to think about, so I can't wait to see Ashley. >> And just even juxtaposing those two, like you said, you can just see massive diversity there, in thought, in background, and experience, in life experiences, but both coming from different perspectives and different angles that can be so inspirational >> Yeah. To all of us in the audience. >> Yeah, and positive. You know, they're taking this positive approach to this movement and, yeah, very different women, but both really, really smart, very passionate. Resilient, clearly. And persistent. They're going to keep movin' it forward. >> Persistence is the key. So, great event so far. It's not even over, but what are your dreams for next year's event? >> Oh, we just have to keep going. I'd love to see more companies join the consortium. We've learned a couple things about, we just are going to start the conversation earlier about what we want the event to be. 
We love hosting people on the campus, obviously, and luckily we have terrific weather today, but I would just like to see companies come together and have the conversation, and that was really the impetus for this, is that we wanted to make sure we got a lot of diverse perspectives that were dealing with these real issues, and let's talk about what women in technology at all levels, as you pointed out, what's top-of-mind for them? And what do they need to have the conversation about? Let's bring 'em together, let's let 'em connect and start to innovate and create the future. >> Well I'm already looking forward to next year, Betsy. >> Yeah, me too. >> It's been such a pleasure to talk to you again. >> Thank you, Lisa. >> Thank you so much for spending time with me on theCUBE today. >> Thank you. >> Appreciate your time. >> Super fun. >> Good. You're watching theCUBE. I'm Lisa Martin on the ground at Women Transforming Technology, the fourth annual. Thanks for watching. (peppy electronic music)
SUMMARY :
Brought to you by VMware. Lisa Martin, on the ground with theCUBE at VMware in Palo Alto, sits down with Chief People Officer Betsy Sutter at the fourth annual Women Transforming Technology event. They discuss Joy Buolamwini's opening keynote on the biases built into facial recognition AI and her positive, intersectional approach to highlighting and mitigating them; the power of showing organizations exactly where bias shows up in their data so they can take action; the event as a consortium of companies rather than a VMware-branded gathering; its tracks for emerging leaders, executives, technical careers, and shared best practices; VMware's longstanding relationship with Stanford and the VMware Stanford Women's Leadership Innovation Lab; Sutter's advice to elevate, rather than suppress, difficult conversations such as harassment, and the value of admitting mistakes; and the anticipation for Ashley Judd's closing keynote and hopes to bring more companies into the consortium next year.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Ashley | PERSON | 0.99+ |
Susan Fowler | PERSON | 0.99+ |
Lisa | PERSON | 0.99+ |
VMware | ORGANIZATION | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
IBM | ORGANIZATION | 0.99+ |
Joy Buolamwini | PERSON | 0.99+ |
Pat Gelsinger | PERSON | 0.99+ |
Susan | PERSON | 0.99+ |
Oprah Winfrey | PERSON | 0.99+ |
Ray | PERSON | 0.99+ |
Betsy | PERSON | 0.99+ |
Michael Dell | PERSON | 0.99+ |
Uber | ORGANIZATION | 0.99+ |
19 | QUANTITY | 0.99+ |
Stanford | ORGANIZATION | 0.99+ |
Joy | PERSON | 0.99+ |
Betsy Sutter | PERSON | 0.99+ |
Caroline | PERSON | 0.99+ |
Ashley Judd | PERSON | 0.99+ |
iPhones | COMMERCIAL_ITEM | 0.99+ |
Palo Alto, California | LOCATION | 0.99+ |
21 years | QUANTITY | 0.99+ |
Amazon | ORGANIZATION | 0.99+ |
next year | DATE | 0.99+ |
today | DATE | 0.99+ |
one | QUANTITY | 0.99+ |
fourth | QUANTITY | 0.99+ |
both | QUANTITY | 0.98+ |
Shannon | PERSON | 0.98+ |
two | QUANTITY | 0.98+ |
three things | QUANTITY | 0.97+ |
One | QUANTITY | 0.96+ |
an hour | QUANTITY | 0.96+ |
Women Transforming Technology | EVENT | 0.95+ |
theCUBE | ORGANIZATION | 0.95+ |
first | QUANTITY | 0.95+ |
VMware Stanford Women's Leadership Innovation Lab | ORGANIZATION | 0.93+ |
McKenzie | ORGANIZATION | 0.92+ |
one aspect | QUANTITY | 0.92+ |
one young | QUANTITY | 0.9+ |
three tenets | QUANTITY | 0.9+ |
Algorithmic Justice League | TITLE | 0.89+ |
few months ago | DATE | 0.89+ |
Dell | ORGANIZATION | 0.89+ |
Boomi World | ORGANIZATION | 0.88+ |
2019 | DATE | 0.87+ |
this morning | DATE | 0.87+ |
Women Transforming Technology | ORGANIZATION | 0.86+ |
Women Transforming Technology 2019 | EVENT | 0.81+ |
Clayman Institute | ORGANIZATION | 0.8+ |
Caroline Simard, Ph.D & Shannon Gilmartin, Ph.D | Women Transforming Technology 2019
>> From Palo Alto, California, it's theCUBE. Covering VMware Women Transforming Technology 2019. Brought to you by VMware. >> Hi, Lisa Martin, on the ground with theCUBE at the fourth annual Women Transforming Technology event at VMware, WT-squared, one of my favorite events, and I'm joined by two PhDs, both from, I'm going to say this one time, the Stanford VMware Women's Leadership Innovation Lab. We've got Shannon Gilmartin, senior research scholar. Hi, Shannon. >> Hi, great to be here. >> And we've got, great to have you, we've got Caroline Simard, managing director of the lab. Ladies, thank you so much for joining. >> Thank you, it's a pleasure to be here. >> So this event, we were talking about before we started, you walk into the opening keynote, which in and of itself was electric, but the energy that comes into the room with, VMware was telling me a little while ago, about 1500 live attendees. >> Incredible. >> Not even including those that were watching the livestream. The energy comes into the room and then, of course, this morning with Joy, I'm going to try to say her name, Buolamwini. The poet of code, the MIT researcher who started really sharing with us the significant biases in AI. The energy, if it could even be turned up more, I can't even imagine it. I can imagine the panel that you guys were on this morning was quite charged. The panel title was, I found interesting, Inclusive Innovators Designing For Change. So Caroline, talk to us about designing for change. You look through a design lens, what does that mean? >> Yeah, so to frame the morning: Shannon was the moderator, and she picked the topic of design. But I think what Joy really showed is the power that is possible to realize when women and women of color and people from different dimensions of identity are included in creating technology, and how much better technology will be for society, right?
If all voices are included, and I would also say that some of her comments also make it clear that it is fundamentally irresponsible not to have diversity at the table in designing the technology of tomorrow. The consequences on different kinds of people and different populations are significant. And so this is why Shannon really picked this idea of, as engineers and designers and creators of this technology, how do you keep in mind the responsibility that you have? >> So yeah, talk to us more about the design and why that is so critical. >> And the way we positioned it for our panelists, it was titled Inclusive Innovators Designing For Change, and we were going to explore how meaningful change towards greater diversity and equity is realized in engineering cultures. And in the very technology that's being created. More specifically though, how do individuals and communities of people design for change in their technical environments? Even when this environment may not be initially very receptive to new ways of interacting. To new ways of thinking, to new ways of achieving. And so the whole panel was premised on this idea of people are designers of change in their environments. How does that happen? How do people interface with barriers to those design processes? And what is advice for the younger generation as they look ahead to their pathways as designers for change? >> Yeah, 'cause change in any context of life is hard. >> Yep. >> Yes. >> Right, it's an uphill battle. But designing for that change, I'm curious what some of the commentary was from the panelists about, when you're encountering, whether it's a company or a leadership group within a company that, to your point, isn't receptive, what were some of the comments or stories of how that was changed over time to become receptive and understand, the massive potential that that change can have? 
I mean we look at numbers like, companies with women on the leadership communities are far more profitable, so what were some of those, from, I don't get it, to, oh my gosh, why aren't we doing sooner? >> And we have this amazing range of perspectives represented on the panel, so we had a VMware CTO, chief technology officer Ray O'Farrell. And he was really talking about from a leader perspective, a key idea here when there are barriers and blocks and inertia, is to open things up and really start listening. And this is a skill and a talent and a group practice that is so little done, so infrequently done. So poorly done, sometimes. But really key in the face of those barriers is to actually say, instead of shutting down, open up and start listening to what's happening. Another one of our panelists, Susan Fowler who is the Time Magazine Person of the Year as one of the silence breakers in 2017, she was really talking about how, expect the steps, you're going to need to go through a lot of steps to make your voice heard. And ultimately, for Susan, she made the decision to go public with what she had encountered and was facing and grappling with and struggling, as were many of her colleagues. But she was really talking about the step by step process that's involved in a large organization, when you're hitting blocks, you just got to keep on fighting that good fight, and you also need to be doing your very best work at the same time, it's a high pressure situation. >> Yeah, absolutely. >> So. >> Absolutely, we also heard from Lisa Gelobter who is the CEO of tEQuitable, an organization that's creating a safe place for change agents to share their stories when they're encountering these blocks and this kind of unfair treatment. And she talked about, also, the need to do your best work but also the critical importance of community in being more resilient as you're trying drive change in your environment, right? 
And this is the kind of community that is being built today with this event, right? It's really paying attention especially for her, as a black woman engineer, being the only one constantly at the table fighting for change has been something that she has realized she needs to pay a lot of attention to so that she can be much more resilient as a leader for longterm change. Another topic that I think, in terms of generating change, that really came through both in the panel and during this morning's keynote, and that we pay a lot of attention to at the lab, is to really highlight bias. Is to really diagnose what is really happening in organizations? Or in AI, as we heard from Joy this morning. So a lot of people genuinely aspire to treat others fairly, right? But they don't realize that their workplaces are so far from being a meritocracy, that there's these structural inequalities that are really embedded in all of the ways that people are working. And so when you're able to show people exactly how it shows up in their company, right? The promotion rates for women of color for example, being lower than for other people, the exact points of data that they need to see, that they're not treating people the same way and creating the same kind of pathways for impact for different kinds of people, then that has a lot of power to drive change because a lot of people, then, will be very motivated to say, okay, I see this is happening in my org every day. Now I can design a different approach, right? How do I redesign the way I'm working today? In my units. >> And take action. >> And take action. >> 'Cause you actually have the data, it's such a dichotomy at times, that we have, we're surrounded by data especially in Silicon Valley. But one of the things that shocked me, what Joy showed this morning is, when she put on blast, IBM, Microsoft, and what was it, Face++, about looking at all of the built in biases to facial recognition. 
But, one of the things that really also, I thought, was interesting, was that she went and showed this to these companies, who responded, and those numbers are actually improving. And then when she said, hey Amazon, so, the fact that even that one person is able to show, look at some of the massive problems that you're training these models to have, they need to be able to see that. So I think 'highlight the bias,' and 'communicate, communicate, communicate and listen,' are three critical elements to any place being successful. >> Exactly. >> Exactly. One additional part of both Joy's presentation and Lisa's comments too, really spoke to action needing to take an intersectional approach. So Joy's data breaks it down by race and gender and all of a sudden, you see completely different trends. Lisa spoke to that as well in her comments. Key to this designing for change process is really wearing the hat of someone who is looking through the world with an intersectional lens. And understanding how different axes operate together uniquely for different groups. And that's when you see these biases being highlighted really in full force, in full relief. So both of these points and these presentations really brought that up. >> Yeah and the intersectionality that Joy talked about was even evident and you could parallel it to why it was important to look at all these different sources of facial recognition data, how disparate some of them were. >> Right, right. >> I know. >> Without that lens you couldn't see all of that variation even across the different providers. >> Exactly. >> Yeah, and she talked, too, about how everything is classified in a binary way, right? In terms of gender identity, and then where data doesn't even see people who are Non-Binary. >> Exactly. >> So it's like, >> That's still a huge omission >> again, exactly. That we have a lot more work to do to have data that truly captures all the dimensions we're interested in.
>> It does, it does. Long way to go, but the fact that it's being highlighted and opportunities like, not just what VMware does but the lab as well. So let's talk a little bit about the lab. It kind of got its start in 2013 when then-Stanford president Dr. John Hennessy provided some funding. I had the opportunity to interview him last week, lovely man. Last year VMware did a big endowment of about $15 million. What's going on, Caroline, we'll start with you, what's going on at the lab? What are you guys studying now? What are some of the breakthroughs that have been uncovered in the last 12 months? >> Yeah, so a big part of our lab's work, since we began this work, has been to really bridge the gap between research and practice, right? And so a lot of why there's little progress being made is because you have a lot of research happening in the academy, in the ivory tower, if you will. And then you have a lot of innovative practices being tested but without necessarily the research foundation and the research frameworks to truly evaluate it. And so, our work has been to really bridge those two things together. And explore those boundaries so we can have more innovative research but also more evidence-based practices come in, right? And since the VMware endowment we've been able to really grow in our aspirations in the kind of data, in the kind of research questions that we can really ask. One of them is this focus on the more intersectional, long-term study of really documenting the experience of women of color. And really understanding the nature of their career pathways across racial dimensions, right? And really highlighting a lot more of, qualitative deep insight, generate their stories, right? And really centering their experience. The other one is, investing in large-scale datasets that capture gender, race, age, and other identity dimensions and look at their long-term career trajectories. This is actually work that Shannon is leading.
So we have an exciting dataset where we have people through five years and we see what happens to them: who gets promoted? Who doesn't? Who gets top talent designation, who gets a salary increase? And then, excitingly, we're looking at social network data, so who's meeting with who? And then what kind of connections do you need to be able to advance in your career? And are there some systematic inequalities there, right? And a big part of our work then is to design these interventions where we work with companies to test what we call a small wins approach. It always starts with diagnosis: here's what's going on in your very specific workplace and your culture. And then we co-design with leaders and managers. It doesn't work for us or HR or anybody to say, go do this, or you should do this. It's really about engaging managers who want to do better in coming up with the design fix, if you will, that they can come up with. Informed by our research, so it's a co-design process. And then we roll it out and we test the outcomes pre and post, so. We're doing a lot more work now to disseminate what we're learning through these interventions so that other organizations can implement this very similar approach. >> First I love that it's called an intervention. 'Cause I think that's incredibly appropriate. (Shannon and Caroline laughing) Second, are you seeing an uptick in the last year of companies, obviously VMware and Dell being two great companies that are very focused on, not just women in technology, but I loved how Joy said today, it's women and people of color who are the underrepresented majority. Are you seeing an uptick in companies willing to accept the intervention and collaborate with you to really design from within for that change? >> Yes absolutely. And I would say that in this industry people are comfortable with piloting things and doing a little R and D experiment, right?
So it's also a culturally appropriate way of thinking, okay, what if we try this, and see what happens? And so I see a lot of energy from organizations, and based on what you were talking about, it's also, I think, that companies are increasingly aware that the overlapping dimensions of identity are within their own walls, but then also in their consumer base, right? So how is their product affecting different kinds of people? Are their customers experiencing bias from the very platforms that they build? And so I think that's also a very powerful entryway into this intersectional conversation because the product is so foundational to the business of the company. >> It is, and especially event after event that we cover on theCUBE, customer experience in any industry is critical because as consumers of whatever it is, we have so much choice. Shannon, last question for you. One of the things that always interests me is the attrition rate being so high in technology. I'm curious what you guys are finding in the lab with, as you mentioned, following women through maybe their first five years. Are you seeing any glaringly obvious challenges that are driving that attrition? Is it, it's got to be more than the motherhood penalty.
We're trying to understand how attrition really varies by sub-groups of women. And how that varies over time, with what interactions precede it and then follow. One of the themes that we've really been looking at in, for instance, attrition stories, is the assignment. Which projects, what kinds of assignments are people getting in their first few years on the job? How are some of those make-or-break? With what net consequence for women, men, from different racial and ethnic backgrounds, different ages, different countries? And understanding, really, the role of those assignments in someone's longer-term career pathway, just how important they are. And what kinds of interventions we can hand-design to really elevate access to the best assignments for everyone, basically. >> Gosh, you guys, this is so fascinating and so inspiring what you're doing at the lab. I wish we had more time, but you'll have to come back next year! >> Exactly. >> Absolutely we will, thank you so much for having us. >> Thank you so much, Lisa. >> Thank you. >> Thank you. >> Thank you so much. For theCUBE I'm Lisa Martin, on the ground at WT squared, thanks for watching. (electronic music)
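The subgroup analysis Shannon describes, computing attrition per intersectional subgroup rather than one aggregate rate for "all women," can be sketched in a few lines. The records below are invented, purely for illustration of why the aggregate hides the gap:

```python
# Toy sketch of an intersectional attrition breakdown. All numbers are
# invented; the point is that the aggregate "women's" rate masks
# subgroup differences.
from collections import defaultdict

# each record: (gender, race, left_company)
records = [
    ("woman", "Black", True), ("woman", "Black", True), ("woman", "Black", False),
    ("woman", "white", False), ("woman", "white", True), ("woman", "white", False),
    ("man", "Black", False), ("man", "white", False), ("man", "white", True),
]

counts = defaultdict(lambda: [0, 0])  # (gender, race) -> [left, total]
for gender, race, left in records:
    counts[(gender, race)][0] += int(left)
    counts[(gender, race)][1] += 1

# attrition rate per subgroup
rates = {k: left / total for k, (left, total) in counts.items()}

# aggregate rate for all women, which flattens the subgroup gap
women = [(l, t) for (g, _r), (l, t) in counts.items() if g == "woman"]
aggregate_women = sum(l for l, _ in women) / sum(t for _, t in women)
```

In this toy data the aggregate rate for women is 0.5, while the subgroup rates differ substantially (2/3 vs. 1/3), which is exactly the kind of difference a single aggregate number would hide.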
Wolfgang Hopfes, Fujitsu | SAP SAPPHIRE NOW 2018
>> From Orlando, Florida, it's theCUBE. Covering SAP SAPPHIRE NOW 2018. Brought to you by NetApp. >> Welcome to theCUBE. I'm Lisa Martin with Keith Townsend. We're in Orlando at SAP SAPPHIRE NOW 2018. We're in the NetApp booth, and we are now talking to Wolfgang Hopfes, the Head of the SAP Business EMEIA for Fujitsu. Wolfgang, welcome to theCUBE. >> Thank you very much. It's a pleasure to be here. >> Great to meet you. So Fujitsu and SAP have been partners, global partners in technology, in services and hosting, for over 40 years. Fujitsu runs SAP, SAP runs Fujitsu. You guys have about 8,000 joint customers worldwide. We are at an enormous event. This is not just 20-plus thousand people, but this event location is about 16 American football fields. >> Really? >> It's huge. Tell us about what's new with Fujitsu and SAP. What excites you about this longstanding partnership? >> Number one, we are building, or we are trying to build, additional business on our strong foundation, which has been growing over 40 years. So we are coming from very early days, where we were named Fujitsu and transformed several times into that. Nevertheless, the customer requirements for a company like us kind of stay the same and stable. Also, everybody's evolving. So what we are trying to do is, we are trying to accompany our customers in a way where their customer requirements transform quicker than they are able to react, where all the technology is filling in quicker than we can expect, software technologies, artificial intelligence, and we try to be a company that helps the customer manage all these complexities in a really powerful IT world. >> So, let's talk about that from a practical sense. Fujitsu, if the average person would think, "Oh, Fujitsu, servers; NetApp, storage; "SAP, software; we understand your relationship." But the relationship is much more complex than that. Fujitsu not only provides the physical infrastructure, but you guys offer services as well.
>> What are some of the services that you offer? How does that feed back to the infrastructure? >> So in general, and this is really something that at the moment we are trying to fundamentally change, because we are coming, based on our history, from a strong technology foundation, yeah? Over the time, we added some system integration and consulting capabilities and skills across Europe, and this is what we are trying to change at the moment. So out of at least two to three distinct business areas, we try to glue them together and start thinking from a customer perspective. Because the customer no longer buys technology; the customer buys the functionality. And look maybe 20 years back. Maybe it's a little bit longer, but when I was young, when I bought my car, I bought a car, and then I started to integrate different things: a stereo, speaker systems, whatsoever kind of fancy things. And you did it on your own. Today, you order a car in a completely different way. You have a configuration tool at your manufacturer of choice, and you say, "I wanna have leather seats, seat heater, "whatsoever kind of things," and then you click, and you get the car which is perfectly designed for you in a different way of standards. And this is exactly my vision of what I wanna achieve in the IT world. So I wanna make the complexity and the technology consumable for the business units and not for IT guys. So that means that we glue together our services capabilities, our technology capabilities, to provide the customer an SAP system for his future needs. That will include all this fancy stuff, like artificial intelligence, blockchain, analytics, big data, all these kinds of things are coming together. And we heard an announcement today from SAP, the SAP HANA Data Management Suite, which is, my understanding so far, kind of an umbrella kind of thing gluing different functional building blocks together.
And you need more integration capabilities, technology integration, application integration, in your company, to make your customer's landscape run, and this is what we are trying to achieve. >> So there's two similar, I think, adjacencies to your example. The first: you know, when I got my first car a little bit ago (I just got my license five years ago, you know, I'm so young), I'd have those challenges. I'd buy a stereo or I'll buy an after-market something to improve or customize my car. However, when it was time to upgrade or do maintenance, I'd take it into the shop, and they'll look at this thing and say, "Oh, it's not standard. "We can't fix it, "because you've modified it in a way that breaks it." One of the challenges with SAP is that customers in the past modified the solution to fit their needs. One of the challenges with SAP and infrastructure in general is that it's very bespoke, and I've designed a server, storage, and compute model that was very bespoke to my business. Talk about how Fujitsu is helping customers, through the relationship with SAP, deal with this modernization of their datasets. >> So there are a couple of different aspects in the whole thing. The first one is, so when we're talking about NetApp and Fujitsu, so, the two companies sat together maybe a year ago, maybe a little bit longer, and came up with the concept that is called NFLEX, which is an integrated system that reduces already this complexity, because it glues the compute and the storage power together. Also, some networking kind of things. And this gives the customer already a ready-to-run platform just from a technical point of view. So if you use this building block and add different building blocks, and we are working on that on the application side, we can really deliver the whole stack; the foundation is built on Fujitsu and NetApp compute and storage power.
So we are combining the different technology worlds with the special needs of our customers. This is what we are doing. >> So along those lines, I just read that Fujitsu was named the Competitive IT Strategy Company for 2018. So I'm curious, what is it that Fujitsu is driving towards in 2018 to deliver this competitive IT strategy, like what you just talked about. How does that give you a competitive edge? >> Yeah. So first of all, and this is based in our headquarters in Japan, we have really a lot of things to talk about when it comes to artificial intelligence, deep learning, blockchain and big data. So the company is investing heavily in these things. And this is what we are trying to tie together, because this gives us a uniqueness in the market. These are elements that everybody needs for the digital transformation. And today, you often hear some sentence like, "It's running on a platform." "It's running on an SAP platform." The reality is that about 90% of today's S/4HANA customers are still running on premise. So we see a move into cloud environments. We see a move into hybrid or multi-cloud engagements in customers, and this is exactly When we just look onto the application or this digital side of the business, we forget that the customer has a business and a technology foundation, too. And this is what we are taking care of. And this gives us this advantage where we think this is needed from the customer. >> So, talking about customer experiences, customer relationships, what are some of the key considerations as customers look at Fujitsu? I will call it: infrastructure is Fujitsu's wheelhouse. >> Yeah. >> What are some of the key differentiators customers need to look at as they examine potential infrastructure solutions? >> You need to differentiate, and this brings me back to my car comparison.
If you wanna have just building blocks, it's the customer's responsibility to, number one, get them to run, and number two, operate them over a certain period of time with a service level. So within Fujitsu, we are prepackaging and we are taking care of the customer. So, first of all, we are not delivering components. We are delivering an up-and-running platform. And secondly, we are taking the risk away from the customer. So that means we give service levels, we give maintenance, we offer managed services so that the customer can really focus on their business instead of wasting energy on his IT systems, because this is what we are good at, and this is what we are offering to the customer. So this is a really big difference. We are providing a ready-to-run system, and we are taking care of the maintenance, regardless of what components are in the system. So we are also taking care, if we put NetApp storage and the Fujitsu server together, Fujitsu is taking care of the maintenance issues. Whenever something may go wrong with the system, it's one face to the customer. And this gives us a very strong position. >> So for the managed services, how deep does that stack go? I mean, one of the appeals to customers when it comes to cloud is that, you know what, all the way to, in some cases, BASIS is handled by someone else. I'm just laying in my application. I'm installing my application. I'm making the modifications that SAP kind of says, "These are the guardrails we'll make." And from every other system, you can count on consistency from SAP platform to SAP platform. How far does Fujitsu go in managed services for SAP? >> So we are offering many services, starting with the technology foundation, going into SAP BASIS, going into the complete application. So we are offering the complete stack also on the managed services side. The customer can start easy, with just managing his hardware, managing his platform, managing his whole system.
So the whole landscape can be under contract with Fujitsu. And it's just a consumption model for the customer. Risk-free; that's all he needs to take care of. So, based on customer needs, requirements, and desires, we are taking the risk on the Fujitsu side that the customer has an up-and-running SAP landscape. >> So one of the big challenges that enterprises face when it comes to SAP in general, and it's not just SAP, it's all big enterprise apps. On the stage floor, Bill McDermott said this morning, and I was taken aback, I don't know if this is, in my experience, it hasn't been quite the experience, that he had a customer go from discussion to implementation to all business processes, six weeks to implement S/4. That was a bit of a dream. >> Absolutely. >> Not typical of the experience, but even, let's say, a lot less complex than just raising a development environment, where customers just want to experiment, they wanna fail fast, they wanna take a copy of production, put it into development, create an application, see if it works. How does Fujitsu help speed agility of customers who just simply wanna get up a faster-running development environment? >> So, in this case, we'd definitely recommend, so these are use cases where we would recommend going into a cloud-like environment. So, easy. In an Amazon or other world, you get a one-terabyte HANA system within 24 hours. So you just need a credit card; that's all you need. The interesting part starts when you go exactly through this exercise, and have gained your experience, and then you wanna take whatever you experienced back into your production system, because then, the complexity for the customer starts. Because what you get in these hyperscaler clouds is a platform. But you're not getting service to get your results back onto your production system. And this is what we are taking care of.
So we are going beyond this "just a platform" or "just a device" or "just a server," because the agility to get a platform is nothing unique. You can have this everywhere. The luxury to get your results, your data, back and forth from your production system, make a copy, move them, transform them into an Amazon and back again, after you've made your four-weeks development cycle, that's where the value for our customer is. So sometimes, it's not only about the speed and the time and the agility. Sometimes it's about the completeness of getting the whole thing back again so that you can use your results, use the experience that you made over this short period of time, and bring it into your production system. That's a key message. >> Yeah, well, I'm glad you answered that; I think that's legitimately how customers look at it. The cloud is for a short burst, I need to get it up and ready and quick. Steady state, SAP HANA, SAP in the cloud, and especially hyperscaler specifically, probably doesn't make any sense because those are known steady workloads that are probably best suited for the private data center. >> Not only that, so it's about the stability. So my experience in talking to customers, and I know at least two, and both are in the Middle East, two customers who decided to go out of the cloud again because it does not make sense for them. So cloud is especially for this use case: try something, start something, four weeks, collapse it and do something else again. The important part is, normally customers wanna be sure of where their data is. This is a big issue at these times, especially GDPR, especially in Europe. So I've seen customers asking me somewhere in Russia or the Middle East, "Can you ensure that my data "is stored in Western Europe? "Or, even better, in Germany?" So, yes we can, with our concept. And most of these customers are likely to wanna have control over their production systems.
So the core, where the customers' data are located, they wanna have this somewhere where they can go and feel and touch it. So this is important for them. Everything else can be in the cloud. So that means two-thirds of today's SAP landscapes have the ability to be moved into a cloud. But the stable core, which is S/4HANA core business, should be somewhere where the customer can go and feel it again. >> Get their hands on it. Wolfgang, thanks so much for stopping by and sharing with us what's new with Fujitsu and SAP, and we appreciate your time. >> Thank you very much. >> We wanna thank you for watching theCUBE. Lisa Martin with Keith Townsend from Orlando at SAP SAPPHIRE NOW 2018. Thanks for watching.
David Levine, Red Hat | Red Hat Summit 2018
>> Announcer: Live from San Francisco, it's theCUBE, covering Red Hat Summit 2018, brought to you by Red Hat. >> Hello everyone, welcome back to theCUBE's exclusive coverage of Red Hat Summit 2018 in San Francisco, Moscone West. I'm John Furrier, my co-host John Troyer, and we are here with David Levine, Assistant General Counsel of Red Hat, we've got the lawyer in the house. Who's billing for this hour? >> Exactly. >> Welcome to theCUBE. >> Thank you, John, it's good to be here. >> So, obviously the legal challenges, putting GDPR aside, which I don't want to get on that rant, we're not going to talk about, is licenses. In open source, this has been an enabler but also an inhibitor for many in not knowing what license to use or what code is, licenses mean for them, their role in the community, all of this stuff could be a morass of gray area, or just no one's educated in some cases, right? So it's tough. >> And that's what I do. I mean my job is to help bring some order to what you describe as some morass, right. How do we help reassure especially enterprises that it's safe to go in the water, it's safe to use open source. Red Hat is an open source company, our entire business is built on open source, and that sort of has a couple aspects to it. One is on the development side, you know we collaborate in the development of software, but what really enables that are licenses, open source licenses. And much of Red Hat's software is built on top of a particular license, which is called the copyleft license. It's known as the GPL or the General Public License. And it's a great tool to foster collaboration, right? What copyleft means is if I create a piece of software under a copyleft license and I give this software to you, I give it to you with all my copyrights. 
So you have the right to copy it, to distribute it to John, to improve upon it, but the only requirement is if you give it to John, you have to give it to him with the same rights and you have to give him the source code, and if you improve upon it, you have to license the improvements to him under those same rights. So it's this whole virtuous circle, right? I create something, I give it to you, you're able to continue to improve on it, you redistribute it, and we all get to share... >> Furrier: So if I create value, do you get that back? >> If you decide to distribute it to me, you don't have to, >> OK. >> David: But if you distribute it to someone else, then you have to give it back with all those same rights. >> Furrier: So you're paying it forward, basically all the rights forward, >> Exactly. >> Furrier: A dose of good ethos. But then if I improve upon, I create a derivative work, whatever the legal jargon is, >> Right, right. >> Furrier: And I have, this is a magic secret sauce, ten percent of it is magic secret sauce, now I distribute that product, I pass along the license. >> David: Correct. >> Including my secret sauce. >> David: If you decide to, there's nothing that requires you to do it, so a lot of our customers sort of build their secret sauce internally, they keep it within their companies and it doesn't go out any further than that, and that's perfectly fine, but if you decide to distribute it, you have to continue to... >> What does that mean, >> Furrier: Distribute, distribute the software to a partner or the product itself? >> David: It could be both. >> So the product is sold publicly as a service, say a cloud service, and I've got some secret sauce. >> David: So if it's a service, it's a great question and it goes into legal issues, but generally speaking, if you're providing a service that's not a distribution, so I don't really have access to the software. >> Furrier: That's actually a really good thing for developers. >> Yeah. 
>> Well, it's an issue, we are now in a service-oriented world, so maybe that's one of the next things that we as a technology community and an IT industry have to deal with. Certainly, it seems though, David, before we get into the new news here and the specifics of the new development, but open source was scary... A generation or two ago. It seems like, at this point especially in cloud, it's the new normal. As you inside Red Hat look at your landscape, do you have big Fortune 100 lawyers coming in and yelling at you now versus ten years ago? >> It's a great point. So I've been at Red Hat for 13 years now, so I've seen sort of tremendous change over the years, and when I started in 2005, we were having a lot of discussions with customers about the copyleft aspects of the GPL, you know, this requirement to give back, and there were companies that were concerned about this, but over time, they've become more sophisticated and they're realizing that, notwithstanding what their lawyers were telling them, it really wasn't that dangerous, and I have very few of those conversations today. Most people get it. >> Furrier: And also a lot's changed since that time, I mean right now I think people are seeing the benefits of projects being out in the open, where it's fostering great collaboration. And the productization piece can still exist >> Yeah. >> With that, so that dynamic between productization, AKA commercialization, and open source projects is interesting. So you could almost make the argument, it's easier to be compliant if you just make everything open source because, rather than just re-engineering any fixes, the community can do it for you. >> David: Absolutely. >> So this efficiency's already been proven. >> David: Absolutely.
And you know, customers are concerned about compliance with all of the obligations under the open source licenses, and one of the things that I try to tell customers is if you take open source, you build it into a product, rather than spend a lot of time focusing on pulling out the obligations into a separate file, just make the source code available, republish it and you get to participate, you get to push your contributions upstream, and so you have a whole community that's supporting the contributions that you described. >> Furrier: Okay, so what's the big news here with GPL, version 2? Okay, so first of all, what's the current situation? You guys made a quick tweak in this GPL 2-3 situation, what was the current situation, what was the motivation? Why the change? What's the impact? >> David: So I talked earlier about the GPL and the GPL has very exacting requirements. I mentioned that if you're going to distribute the software to John, you have to give him the source code, and you have to include a copy of the license. Understanding what is source code, what has to accompany it, depending on how you're distributing the software, that's not always an easy question, and so companies don't always get it right. And one of the challenges with GPL, version 2 is that there is no grace period, and so if you miss something, if you make a mistake in the way that you've tried to meet your license obligations, your license is terminated and you're a copyright infringer, sort of, right at that point in time, and that scares a lot of our customers, it scares enterprises. They need more predictability, they want some level of fairness. >> This is the grace period you're talking about. >> David: Yeah, this is the grace period. So, there's no grace period in version 2 of the General Public License. That problem was fixed when they came out ten years ago with version 3 of the GPL.
So version 3 included this grace period in it, but the challenge is that a lot of code today remains GPL, version 2, so what do we do with that large existing code base? And so, the solution was to adopt the cure provision, or the grace provision, from GPL, version 3, for GPL, version 2 code. Stop me if I'm speaking too quickly or if I'm getting too technical. So the idea is >> Let's rewind just back 30 seconds. So, do a little playback. So, if we can apply GPL, v.3 to the v.2 code, >> So, the cure period. >> Oh, just the cure period. >> So I'm adopting >> David: the cure period. >> Got it. >> David: So, the license stays the same, the only difference is, I've said that if you fail to meet your obligation to John when you redistribute, I'm going to give you 30 days to fix the problem. >> Furrier: So essentially you grandfather in the v.2 with the grace period. >> We're giving this grace period. >> Troyer: And this is a corporate promise. This doesn't change the license, this is a corporate promise. >> So it's a promise >> David: by any copyright holder, so in my example to you, I'm the sole copyright holder here, but in the Linux kernel, there are thousands of copyright holders. So the Linux kernel developers back in October adopted this same approach, adopted the GPL v.3 cure period for the Linux kernel, which continues to be licensed under GPL, version 2. And then in December, Red Hat led a group of companies that included IBM, Google, Facebook, we all adopted it for our own copyrights. So, we together, those four companies own a lot of copyrights to open source code. And then again in March, six more companies joined us: SAP, Microsoft, Cisco, HPE, SUSE, CA Technologies, and at the Red Hat Summit today, we're asking developers to do the same thing.
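The enforcement difference David walks through, immediate termination under plain GPLv2 versus a grace window under the GPLv3-style cure commitment, can be sketched as a toy decision function. The 30-day figure matches the window he describes; this is an illustration of the mechanics, not legal advice, and real reinstatement rules in GPLv3 section 8 have more nuance.

```python
# Toy model of the difference described above. Under plain GPLv2 there is
# no grace period: a compliance slip terminates the license at the moment
# of violation. Under the GPLv3-style cure commitment, a first-time
# violator who fixes the problem within 30 days of notice keeps the
# license. Illustrative only, not legal advice.

CURE_WINDOW_DAYS = 30  # grace window after notice from the copyright holder

def license_survives(days_to_cure, cure_commitment_adopted):
    """Return True if the licensee keeps the license after a violation."""
    if not cure_commitment_adopted:
        return False  # GPLv2 alone: terminated immediately, no grace period
    return days_to_cure <= CURE_WINDOW_DAYS

# The same 10-day packaging mistake, with and without the commitment:
print(license_survives(10, cure_commitment_adopted=False))  # False
print(license_survives(10, cure_commitment_adopted=True))   # True
```

This is exactly the predictability point: the mistake is identical in both cases; only the promised grace window changes the outcome.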
We want to show that it's a movement, that we want to cooperate in enforcement, because we think ultimately if we want more people to join the open source ecosystem, we can do that by making enforcement more predictable. >> Furrier: And so what specifically are you asking startups? What's the ask for developers? >> For developers, if you go to, we have a site on GitHub, so it's the GPL Cooperation Commitment, so gplcc.github.io/gplcc. >> And what do they do, just take a guess? >> And you go there, and there's the statement, the same commitment that the companies made, and you go in and add your name to the bottom of the file and submit a pull request, like developers know how to do on GitHub, and your name will be added as a supporter. >> Into the record. >> David: It would apply to every new copy. >> That gives them the primary source (mumbles), or write... >> David: It gives anyone who takes that code, has that peace of mind. >> Furrier: Well, great stuff, a great 101 on the GPL v.2, v.3 grace period, it's super cool you guys are doing that. It's just such a hassle, I'm sure the complaints have been crazy. The bigger question for me as I look at it, 'cause I love that the innovation comes from open source, we're seeing that both on the collaborative side in the projects, but also people are really productizing open source and it's running everything. The question is, where do I have code that I, you know, people are programming like crazy, they're slinging code like it's nobody's business right now. So, I might be afraid I'd be liable if I'm an enterprise or a startup that, through venture capital or an M&A process where something's going on, wait a minute, we can't actually sell this because that's his code over there. You didn't comply with the license, so there's always these tripwires in the mind, and sometimes that's just fear, this is a general kind of license hygiene practice. What's your take on that?
What's your advice to entrepreneurs, to enterprise developers, to be safe? What should they do as their approach? >> David: That's a great question. I mean, what you want to know is where's my code coming from? It's a license issue, but it's also a product security issue. If you're taking something from someone, they took it from two places down the food chain, what's the provenance of that code? So, just like from a security perspective... >> Furrier: I've seen M&As go south because of this. >> Yes, so you want to know the source of your code, get it from a trusted source. Make sure that you understand what the license terms are. One of the things that we're trying to encourage developers to do is make sure you attach a license to it, because if you don't, a user or startup's not going to know what rights they have. And that can become problematic if they have a liquidity event. >> Furrier: Okay, so here's my next question. So, the next question is obviously open source is growing and people are joining projects and/or creating projects. So this is a hypothetical: I have a project and I want to donate CUBE code to the open source CUBE community. Do I just ship the code, do I have to pick the license, what's the best license? And then I want to also have in mind that I might use Linux and other things, so I have code I've written, proprietary code I want to open up, I've got to pick a license, like, do I just pick the license out of a hat, or... >> Lots of times, it's sort of dictated for you. So it depends on the ecosystem that you're working with. I mean, if you're working in the Linux kernel ecosystem, generally it's going to be GPL, version 2. So you have to look at what other projects you're working with, is this part of a particular project that already has an existing license? And then it's a philosophical point. I mentioned before, the GPL is a copyleft license.
It forces sharing, right, so it protects John's rights downstream from you, but there are other licenses that are permissive and give you lots of rights, and you can decide what you want to do with it downstream. So if you're okay with people taking your code downstream from you and making it proprietary, then using a permissive license is fine. But if you want to ensure this virtuous circle, then you want to pick a copyleft license. >> Troyer: David, do you think we have reached the end stage of open source licenses here? You know, GPL v.3 is ten years old, and after we started from MIT and Apache, and I could probably list a couple of others and I haven't even been paying attention, so, are we settled down, are we about done? Are you looking for things? >> David: That's a great question. So I was at a conference two weeks ago in Barcelona put on by the Free Software Foundation Europe, and one of the conference sessions was The Future of Copyleft. You know, is there going to be another copyleft license? Do we need GPL, version 4? You look at what the GPL has done and how many projects are governed by it, and how it's forced this collaboration, it's done amazing things, but it's pretty complicated. So is there a simpler way of accomplishing the same objectives? But I don't know that people have the stomach... >> Furrier: And the answer is? >> Uh, (laughing). I'll come back next year and let you know what I learn... >> Now I have to ask this, so I'll ask you: how can you and I support open source licensing? >> David: So, take the GPL v.3 Cure Commitment, commit your name to supporting greater stability and predictability and fairness in the way enforcement takes place. So, I mean it's an exciting project. It's kind of fun to pull the whole community together.
>> It's quite an accomplishment, too, if you think about where open source principles are now. Again, we don't want to skew other events, but okay, this is the beginning of another generation of open source greatness certainly. Remember the days when open source was a Tier 2 citizen in the enterprise? You guys made it Tier 1, but now it's going to a whole other level with Cloud-Native, and you're seeing open source ethos being applied to other markets, not just software development. So, you're starting to see the success create this circle of innovation. Have you guys had the "pinch me" moments inside Red Hat, saying, "hey, this is actually working, and really well"? >> David: I think just a couple touchpoints, I mean, I think, look at where Microsoft has come, right? When I joined Red Hat, that wasn't a friendly relationship, but now they've embraced it. Who would have thought 15 years ago that we'd see Microsoft on board, and we have. And your point about where else is open source going; one of my colleagues spoke about a year ago to seed developers who were interested in open sourcing seeds, because there was concern about seeds becoming patented and not being able to grow food. And so, thinking about ways to open up the market in seeds.
Does it mean if you add, if you take my open source software, add a proprietary component, package it in a container and give that container to John, what does it mean for your proprietary layer? Is that, does that have to be licensed under the GPL? And so we spent a lot of time thinking about that a number of years ago and luckily concluded that it may improve the situation as opposed to adding any concerns, so we're thinking about the impact of open source licensing and containers, ensuring, again to your point earlier, what's the provenance of the code? There's so much code now available, making sure that there is a license associated with it. >> It's almost, you just declare all code free. (all laughing) >> Absolutely. >> Well certainly a lot of new things you're seeing, societal change is impacted, you've got self-driving cars and all kinds of new things that are just mind-blowing on a legal framework standpoint. First-time challenges, so you're busy, you're always going to have an interesting job. >> I really think that I have the best job in Red Hat, because I get to think about these things. What does it mean from a licensing perspective? What are the new issues that we're going to face as the technology evolves, the market evolves? And... >> Furrier: Super important, I mean there's tripwires in there, and again, if you don't think about it probably, I know or I've seen from experience, great companies lose big-time acquisition opportunities because of some faulty code on a license, and it's just killed things, and I've seen enterprises get (laughing). I mean, little weird things could happen, you've just got to be on top of it. >> David: I mean, look at what Tesla did in open sourcing their patents, making their patented technology available so that, to help the whole autonomous car industry. We've been doing a lot of work in the patent area as well to ensure that patents don't become an inhibitor to the change that you've described. 
>> Furrier: It's a great conversation, provocative, legal and open source software. These are competitive advantages and opportunities, not challenges and compliance, old-school guarded secrets. Open it up and good things happen. David, thanks for coming on theCUBE. Thanks for sharing the insights on the legal perspectives of licenses as open source software continues to power the global economy and the technology innovation to come. It's theCUBE, bringing you all the live action here in San Francisco. We'll be right back with more after this short break. (upbeat music) (inspirational music)
Halsey Minor, VideoCoin | Polycon 2018
>> Announcer: Live from Nassau in the Bahamas, it's theCUBE, covering Polycon '18, brought to you by Polymath. >> Welcome back everyone, we're here live with theCUBE's exclusive coverage of Polycon '18. We're in the Bahamas, I'm John Furrier with Dave Vellante, co-founders and co-hosts of theCUBE. We're here with special guest Halsey Minor, entrepreneur, serious serial entrepreneur here on theCUBE. Halsey, great to have you. You're the founder and CEO of VideoCoin, a successful ICO. You had an event last night, kind of an investor thank-you event out at the Bahamas Country Club, and now you're here. Man, you're a pro, you're back in the game with this crypto. This is the wave, I mean, I want to get your perspective 'cause you see waves. You've seen CNET, you started that from scratch before online news was anything, you were the pioneer in that. First investor, first operator in salesforce.com, a variety of other successful entrepreneurial adventures. You've got a nose for the waves. So just put it in perspective, what is this wave? >> Yeah, so I actually have an interesting story because I actually started around 2012, and I launched my first business in 2013. So, the first problem that I saw was, how do you get your money from your bank account and buy Bitcoin? Still a problem, hasn't been fixed, right? So I tried to fix that. Oh well, I did to a certain extent, I did fix the problem. So what I did was create, effectively, a coin-based converter, and I started out and was going to make it very easy for you to take your bank account, connect it up, seemed logical, and then buy, you know, the currency. The company was called Bit Reserve at the time. So, no bank would touch anybody with Bit in their name. And it was even worse than that, all of us who put our company name into our bank account, we had our bank accounts basically shut down, right?
So, I started getting an idea how difficult this was going to be, you know, Coinbase getting a Silicon Valley bank account early on to become a conduit was very fortuitous. It ultimately took two and a half years and buying a big chunk of a New Jersey bank before we were able to allow you to connect your US bank and your European bank into Uphold to buy currency. So it's really Uphold, Coinbase, maybe like Gitbit, very, very few who've been able to crack that problem. We literally had to buy part of a bank to do it. So that's where I started. So I really looked at it very much as money, as a new monetary system. And I still see unlimited opportunities in that area. It wasn't until really a couple years later that I saw the blockchain as the new architecture for the computer, and what I mean by that is, what Bitcoin proved was that if you gave people software and they ran it on their computer and they got paid in some funny kind of digital money, they would convert that money back into fiat, you know, dollars, and they'd go buy more computers. And nobody asked anybody to be a Bitcoin miner, they just kept showing up; the bigger it got, the bigger the opportunity. And what's most interesting is whether you make money or lose money depends on your cost of power. So for most of these Bitcoin miners, they're near hydroelectric dams. So what I realized, and VideoCoin is in the area of video. It's a direct competitor with Amazon web services, everything they do in video. So there's, it's called encoding, which is compressing it, there's storage and there's streaming, three basic pieces. So what I realized was two things: first of all, 20% of servers in data centers are not used at all. They're called zombies, right? So all of these people, the Airbnb, Uber model, they can all of a sudden start earning on assets that are doing nothing.
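The economics Halsey describes, that a miner's profit or loss turns on its electricity price, which is why miners cluster near hydroelectric dams, reduces to one formula: margin equals coin revenue minus the power bill. All figures below are made up for illustration.

```python
# Back-of-the-envelope sketch of the mining economics described above:
# the same rig wins or loses depending on its electricity contract.
# Every number here is hypothetical.

def daily_mining_margin(coins_per_day, coin_price_usd,
                        power_draw_kw, electricity_usd_per_kwh):
    """Daily profit in USD: coin revenue minus the 24-hour power bill."""
    revenue = coins_per_day * coin_price_usd
    power_cost = power_draw_kw * 24 * electricity_usd_per_kwh
    return revenue - power_cost

# Same hypothetical rig, different power contracts:
# hydro power at $0.03/kWh versus retail at $0.15/kWh.
hydro  = daily_mining_margin(0.001, 10_000, 1.5, 0.03)   # ~$8.92/day
retail = daily_mining_margin(0.001, 10_000, 1.5, 0.15)   # ~$4.60/day
```

By analogy, a "video miner" transcoding or streaming video earns the same revenue per job regardless of location, so the operator with the cheapest power, or with already-paid-for idle hardware, keeps the widest margin.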
But even if you look out into the future, if video mining, which is what we call it, ends up being like Bitcoin mining, then what happens is that the whole thing works on the cost of power. It's not good for Amazon if they have to be competitive solely based on the cost of power. >> Dave, so he's got an ICO going on, we look at Filecoin, right? So Filecoin was storage and that's infrastructure. You go to VideoCoin, we're streaming right now, we've got video. This is kind of like an interesting digital media infrastructure ... >> Well ... >> What's your take compared to Filecoin? >> What's interesting to me, and I'd love to get Halsey's input on this, is that you've got the full spectrum here. You started in publishing and now-- >> With five TV shows. >> Dave: Okay. >> Yeah, CNET had five TV shows. >> So right, and so very digital from the beginning and relatively ripe for disruption, and then now into banking, which really hasn't been disrupted, but we all think it's coming. So that's an interesting spectrum. It's not Negroponte, I don't think, bits versus atoms, because you've seen, you know, taxis get disrupted. That's atoms. So what are the factors that make an industry ripe for disruption? >> Well, I mean the obvious thing is really disruptive technologies, right? And so for the Internet, for me, it was, I started the company in '93 to be on commercial online services like AOL, and I saw, I guess, the first browser in '93, actually at Sun, and it made me believe the Internet was going to be this incredible thing. And it was really seeing information coming in, and, you know, the Internet wasn't that big back then, but I watched a gif of a storm, you know, from one of the weather centers, and so I realized that this information thing was incredibly interesting. And so what all of us did, the way I thought about it and saw it, is we're cracking open databases and we're just letting people have the information.
And it was silly things like the ability for me to live in San Francisco but know what the weather was in New York and pack appropriately. This was the magic, I mean, we take all of this for granted. This was magic, right, at the time. You had to go out and buy a USA Today-- >> Check the stock price. >> Yeah, exactly. >> Call your friends in New York. >> Yeah, that was magic. So at a very high level, it was just access to information. At a very high level, what this is is combining information and money into a packet. Right? So now what we can do is, I can gather information from servers about what they're really doing and I can also be paying them at the same time. So you know, it would have actually solved a lot of problems around the Internet, because on the Internet getting paid was hard. And there were so many times we'd go into a meeting and we'd agree on the partnership but we didn't know who was paying who. You know? (laughing) Am I paying you for traffic or are you paying me for content, or, you know, how is that going? So this kind of comes with a built-in payment system, which I think is what makes it so incredible as a system. >> So we're-- >> And more stable, I am inferring, long-term anyway. Because that whole system that you just described on the Internet all blew up when the funding dried up. >> It blew up and I think, you know, I think there are certainly a lot of risks. The number one thing I would tell everybody in this area is, you know, be very cautious about what you invest in. There were a lot of companies that, uh-- so my whole description of the Internet bubble was that people say that, well, you know, nine trillion dollars was lost in investing. >> With everything that happened though. >> And when I-- >> Pets.com happened, everything happened. >> And what I said to the people is that it would be great if people had just invested in the survivors, but who knew what they were?
The only reason the United States emerged, with, you know, with Salesforce and eBay and Amazon, etc., the only reason that we emerged dominating the world was 'cause we invested in them all. Right? And so-- >> Even all those things that were called silly ideas actually happened. >> And they ended up happening. It was all a matter of timing, yeah. So you know, what's happening now is very much the same thing. You know, a lot of people are going to invest in a lot of bad ideas, right? But this is all necessary for the good ideas to get funding and for something big to come out of this. >> So I want to get your take on VideoCoin in comparison, you mentioned Amazon, right? So our observation, obviously we're recording all these shows, Amazon web services, among others, the big guys are sucking all the oxygen out of the room. Look at the big whales, Google, Facebook, Amazon, I mean, we can't even run any ads on our site. We actually prefer to just push the content all over the world because it's hard to build a destination site. I mean, people going out of business in the media business. Video, your choices are Ustream, now owned by IBM, or Twitch TV, now owned by Amazon, which was Justin.tv before that. Build your own custom player, set up a CDN, which is actually hard and expensive. Okay, so do I do Facebook Live, again controlled by Facebook? So there's an opportunity that you're pursuing. Did you have that in mind? I mean, we see it every day and we know this, but luckily we have a good deal with Ustream, but the point is that is going to be up too. What's the alternative for producers, content producers who have streaming, whether it's a pro set like this or someone who's going to have unlimited access to video streaming? >> So the real issues are cost and innovation, okay? And so Hanno Basse, who's the CTO of 20th Century Fox and one of our advisors, right? And all these media companies have the same problem.
Nobody is watching broadcast anymore, which costs them nothing, and everybody's now streaming, which is one-to-one and has a cost associated with it. So that's why, and even worse, video's going to 4K, 8K, VR, data that's going up like this-- >> Data isn't growing as fast either. >> So all these companies are confronted with all these costs and they can't monetize them. Google can monetize it, Amazon can monetize it. >> Telcos ... >> Netflix, yeah. >> Ouch. >> But they can't monetize it, so it's all cost, effectively, and no revenue. So the one thing that we offer with VideoCoin, by using all these resources, is we cut the cost 60 to 80%, so that's huge. The other thing is, in the early days, everybody bought Salesforce because it was cheaper. It was 1/10th of the cost. And I used to say to people, in the long run, it's going to be way more innovation, right? Because they're constantly, every quarter, rolling out a new version, right? And they're going to have the ability to connect, an API effectively, and the whole ecosystem can arise around that. And that's why their conference, Dreamforce, has 140,000 people, because there's a whole ecosystem. >> It's sticky as hell too. >> That's right. >> Hard to get out. >> That's right. So while we are 60 to 80% lower cost, we're also effectively open source at the same time. So the ability to have a community arise and develop software. And so right now, you've seen this huge consolidation because it's actually kind of hard to build new kinds of apps on top of Amazon web services, right? But if you have this open system, and you have all these people contributing code to it, all of a sudden there are apps, video apps, there'll be literally a whole new-- >> So you're going to have an open source contribution piece to your ... ? >> Yeah, I mean basically, everything we build is open source, right, so you know, all the way through to the network.
So it creates a palette for people to start innovating in video. Because really what's happening is a lot of innovation is getting hurt by the fact these big guys totally dominate it, right? They don't want to see any innovation outside of the funds they bring you, right? >> Right, so you've heard my rap on this. I'd love to get Halsey's thoughts. So the big guys, you're right, have won. It's like centralization and victory. People here are saying, "No, we want to take it back." The premise that I hear a lot is there's been no innovation in protocols in, you know ... Google built Gmail on SMTP, HTTP, DNS, it's all government-funded or academia. >> Yeah. >> And it's just a lack of innovation. >> That's right. >> And now, this is why I counter Warren Buffett and Charlie Munger, is no, we're building out a new set of infrastructure. >> That's right. >> Okay, so where do you guys fit into that? What are your thoughts, first of all, on that premise? And where do you guys fit? >> Yeah, I mean, look, you've got these huge companies that are totally dominant and even though they are, in fact, you know, innovative Silicon Valley companies by label, okay, they have all the same issues-- like I say to people, nobody today believes that anybody can put Amazon web services at risk. If I went to somebody and said, "You know Amazon web services, which are worth 3/4 of the value of the company, or 5/6, depending on who you talk to, there's going to be something after that," it would literally be a new concept because everybody's convinced this is Amazon's-- >> John: The winner. >> Yeah, this is their big, this is the way they make all their money-- >> Alright, it's over-- >> Right, and if you say to somebody there is going to be a next thing, they would look at you like, you know, like you're foolish.
But the reality is when you start changing some basic, underlying infrastructure in the Internet and you start doing things, decentralization, this is the word we're going to be using, you know, we're going to see it in solar power. And solar power is, you know, on a cost-to-benefit curve like this, so, you know, it isn't going to be long before we're going to have power in our house legitimately, not like, you know, some science-fiction thing, we'll be legitimately powering most of our needs with solar that we connect because the cost is coming down so much. So we're going to see all of this decentralization happening. And in the world of computing, decentralization means that this is going to be the most efficient that computing can ever be. Because just compare using the Uber and Airbnb model of saying anything that's excess, let's turn into value. And I've heard that for every Uber driver, 15 cars go away, right? So the decentralization is going to have a profound effect on the economy and it's going to have a profound effect on these big guys. >> Oh, even those guys are going to get disrupted. >> They're going to get disrupted. And they're 20 years old, it's time for them to get disrupted, I mean, you know ... >> E-commerce is a 20-, 30-year-old stack, some say 20 years old, and all these things are ready, even what we would consider modern, you know, the miracle of saying, oh, the weather in New York. I mean that magic is here now in a new way. So I got to ask you the question-- >> Taken for granted. >> I got to ask you a question because you brought up that point. In your history of your career as an entrepreneur, because you're doing stuff that's always new and cool, and probably before anyone else sees it, can you talk about some of the ideas that you've seen, not necessarily your ideas, as well as others', where the investor said, "That's the dumbest idea I ever heard"? 
What billion dollar opportunities have you seen emerge that investors have said, "That's the dumbest idea I've ever heard"? >> Well, actually, the one that is Salesforce. No VC would put money in. It was really kind of backed by Larry Ellison and me early on. And what's so-- >> John: Google was a dumb idea. We want portals, not search. >> Yeah, so the bet that nobody would take in 2000 was that companies would take their sales information and they would put it in the cloud. Nobody would believe that. Not anyone. And so I used to joke, I used to say the only way it's going to happen is if the sales guy who's been waiting two years to get his sales management system in place actually runs over the head of security in the parking lot. That's what it's going to take because it's outsourcing and, you know, the security guys say, "Oh, no, no, no, we're going to lose all of our data," right? It didn't matter that Salesforce had way more security guys, you know, than these guys had, and better, you know, working internally. Nobody believed in it. Literally nobody believed in it. >> This is your point about the decentralization, no one's going to believe, "Wait a minute, that could never happen." So, in a way, the investor thesis should be, "I want to invest in the dumbest ideas," because that might be the best idea. >> It is. I mean the big, obvious ones that attract billions and billions of dollars, I mean, how many of those end up actually not turning into anything? Right? A lot of them, right? So CNET was profitable on nine million dollars. I believe that Yahoo was profitable on three million dollars. I think Google was somewhere around 12 to 15 million dollars, right? So there are a lot of these businesses-- Amazon's obviously the outlier. >> John: It's still not profitable. >> Yeah, it's the outlier. But you know, a lot of these businesses were started by people who used a relatively small amount of money and were very creative. 
You know, you're going to hear this over and over again. Microsoft never needed any money. They accepted five million dollars from-- >> John: (mumbles) >> Yeah, so this happens a lot. And in fact, I think it's very dangerous when in year five, you're losing three hundred million dollars, right? I mean, five hundred, or whatever it is. There are a lot of things that can go wrong. >> What's the role of community? Because we heard the guy from BlockTower Capital say something I thought was really profound, "I don't need VC because, if you're a startup, you don't have to waste your energy on board meetings and other things, you can build your business and use the community as your benchmark." So this plays to your whole picking-up-the-slack kind of thing in efficiency. So entrepreneurs can be more efficient in these communities. This is where the cryptocurrency blockchain is thriving. What's your thoughts on that and how do you see that community interaction progressing? >> In my career, there's been a sea change in sort of the culture of technology and really everything, right? You know, when I started out, everything was very hierarchical. You know, it's like how far up the chain you got that measured how successful you were. Now it's how big is your network, right? And you know, I was talking to somebody the other day who said VCs are going in and they're measuring these companies' success by how many Instagram and Twitter accounts they have, and there's massive fraud going on because people are buying these accounts to pump up their numbers, right? So people are starting to value by the breadth of your network. >> John: Reputable network. >> Reputable, yeah. >> John: Not fake network. >> Yeah, but what I heard is there's actually a Twitter application, which I haven't seen, that'll go in and tell how many of 'em are real and how many of 'em are not now. So really the community becomes almost the measuring stick for your value. 
You know, before, you had users. Today, everybody has community members. And so, it becomes sort of, kind of like everything, I guess. >> And our media model is all community-based, which is, we just naturally go there because that's where the data is. >> That's right. >> That's where the feedback is. >> That's right. >> I mean, I can't get feedback from Facebook and Google, they own the data, right? There's no letters to the editor on Facebook. There's only hate comments. >> But you know, before Microsoft and all these came, you know, IBM dominated the world. Nobody ever thought they would go away. AT&T dominated the world and nobody ever thought that they would go away, you know. >> Alright, personal question for you, I got to wrap because I know you got to go. Appreciate your time, by the way. Great story, we could go on for another hour. Personal note, what is the most compelling thing that's moved you, as an entrepreneur, in the crypto market? Like, something that, it could be an anecdote, it could be a situation. When you look at this opportunity, as the world's going to eventually be re-instrumented with data, with new open source and community, what's something that's surprised you or moves you as an entrepreneur saying, "This is freakin' awesome"? >> So this hasn't been done yet but it will be done. So this is what actually motivated me to start Uphold: the ability to turn your phone into your bank and to be able to exchange money, and primarily really solving the ability for the poor to be able to move money around without having 10 to 20 to 30% of it taken away. Everybody's talked about this, remittance, and so far, nobody has actually solved that problem. That problem is going to get solved. I mean it's inevitable that the phone becomes the bank. There are so many regulations that are designed to stop that and it's extraordinary. Once you get into it and you see all the ways that have been set up-- >> Byzantine system. 
>> this problem should have been solved long ago, right? And every phone should be a bank. I mean, it can be connected to a bank, but every phone should have my money in it. I should be able to send it to you instantaneously. >> It shouldn't be like getting into Fort Knox. >> Yeah. I mean, computers, banks have computers, they could make this happen today. They just don't want to. So I think the most profound thing for me is that the problem I set out to solve is still not solved, which is really creating a more equitable financial system. And we live in a country where the banks make about 37 billion dollars a year in bounced check fees. Think about that. Thirty-seven billion dollars in bounced check fees. So if you just take that out, you just take out, 'cause it all affects people in the lower socioeconomic scale, you create a revolution. Just getting rid of the bank fees that you'll pay for bouncing checks. >> Well, I mean the narratives, like the narrative of taking down gatekeepers or central authorities, is the premise of this ecosystem and you could take that example and apply it to thousands of use cases. >> And banks are rapacious, flat out. American banks are the most rapacious 'cause no other country would allow 37 billion dollars to be taken away in bounced check fees. >> Halsey, congratulations on your success again and great to see you on theCUBE. You're now a CUBE alumni, so ... >> Congratulations. >> We hope you'll come back again. >> Yeah, thank you guys. >> We're going to get you in our Telegram group, now we'll be 42 members, we just turned it on last night. (everyone laughs) We appreciate it and congratulations. >> Thank you very much. >> Thanks for your insight and experience and commentary. Halsey Minor, experienced entrepreneur, pro, here in the trenches, establishing a great new venture. We'll be back with more live coverage after this short break. (electronic music)