Breaking Analysis: ChatGPT Won't Give OpenAI First Mover Advantage
>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> OpenAI, the company, and ChatGPT have taken the world by storm. Microsoft reportedly is investing an additional 10 billion dollars into the company. But in our view, while the hype around ChatGPT is justified, we don't believe OpenAI will lock up the market with its first mover advantage. Rather, we believe that success in this market will be directly proportional to the quality and quantity of data that a technology company has at its disposal, and the compute power that it can deploy to run its systems. Hello and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis, we unpack the excitement around ChatGPT, and debate the premise that the company's early entry into the space may not confer winner-take-all advantage to OpenAI. And to do so, we welcome CUBE collaborator and alum, Sarbjeet Johal, (chuckles) and John Furrier, co-host of theCUBE. Great to see you Sarbjeet, John. Really appreciate you guys coming to the program. >> Great to be on. >> Okay, so what is ChatGPT? Well, actually we asked ChatGPT, what is ChatGPT? So here's what it said. ChatGPT is a state-of-the-art language model developed by OpenAI that can generate human-like text. It can be fine-tuned for a variety of language tasks, such as conversation, summarization, and language translation. So I asked it, give it to me in 50 words or less. How did it do? Anything to add? >> Yeah, I think it did well. It's a large language model, like previous models, but it applies the transformer mechanism to focus on the prompt you've given it, and also on the answer it gave you in the first sentence or two, and then it introspects on what it has already said and works from that. So it's self-focused, if you will.
It does, the transformers help the large language models do that. >> So to your point, it's a large language model, and GPT stands for generative pre-trained transformer. >> And if you put the definition back up there again, if you put it back up on the screen, let's see it back up. Okay, it actually missed the word large. So one of the problems with ChatGPT is it's not always accurate. It's actually a large language model, and it says state-of-the-art language model. And if you look at Google, Google has dominated AI for a long time and they're well known as being the best at this. And apparently Google has their own large language model, LLM, in play and has been holding back its release because of backlash on the accuracy. Just the example you showed is a great point. They got it almost right, but they missed the key word. >> You know what's funny about that John, is I had previously asked it in my prompt to give it to me in less than a hundred words, and it was too long, I said it was too long for Breaking Analysis, and there it went into the fact that it's a large language model. So it gave me a really different answer each time. But it's still pretty amazing for those of you who haven't played with it yet. And one of the best examples that I saw was Sam Charrington from the This Week in ML and AI podcast. And I stumbled on this thanks to Brian Gracely, who was listening to one of his Cloudcast episodes. Basically, he prompted ChatGPT to interview ChatGPT, and he simply gave the system the prompts, and then he ran the questions and answers into an avatar builder and sped it up 2X so it didn't sound like a machine. And voila, it was amazing. So John, is ChatGPT going to take over as a CUBE host? >> Well, I was thinking, we get the questions in advance sometimes from PR people. We should actually just plug them into ChatGPT, add them to our notes, and say, "Is this good enough for you? Let's ask the real question."
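The transformer "self-focus" mechanism Sarbjeet describes above is self-attention: each token's representation is rebuilt as a weighted blend of every token so far, with the weights set by pairwise similarity. A minimal, illustrative sketch in plain Python, not OpenAI's actual implementation, with a single head, no learned projections, and made-up embedding values:

```python
import math
import random

def softmax(row):
    """Turn a list of scores into weights that sum to 1."""
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [v / total for v in exps]

def self_attention(X):
    """Single-head scaled dot-product self-attention over token vectors X.

    Each output row is a mix of all token vectors, weighted by how
    strongly that token "attends" to every other -- the introspect-on-
    what-was-already-said behavior described in the discussion.
    """
    d = len(X[0])
    out = []
    for q in X:
        # similarity of this token to every token, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        weights = softmax(scores)  # weights sum to 1
        out.append([sum(w * X[j][i] for j, w in enumerate(weights))
                    for i in range(d)])
    return out

# three "tokens" with four-dimensional embeddings (random, for illustration)
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(4)] for _ in range(3)]
out = self_attention(X)
print(len(out), len(out[0]))  # 3 4
```

Real models stack many such attention layers with learned query/key/value projections; this just shows the core mixing step.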
So I think there's a lot of heavy lifting that gets done. I think ChatGPT is a phenomenal revolution. I think it highlights the use case. Like that example we showed earlier, it gets most of it right. So it's directionally correct and it feels like an answer, but it's not a hundred percent accurate. And I think that's where people are seeing value in it. Writing marketing copy, brainstorming a guest list, a gift list for somebody. Write me some lyrics to a song. Give me a thesis about healthcare policy in the United States. It'll do a bang-up job, and then you've got to go in and massage it. So it's going to do three-quarters of the work. That's why schools are kind of freaking out about plagiarism. And that's why Microsoft put 10 billion in, because why wouldn't this be a feature of Word, or the OS, to help it do stuff on behalf of the user. So linguistically it's a beautiful thing. You can input a string and get a good answer. It's not a search result. >> And we're going to get your take on Microsoft, but it kind of levels the playing field. ChatGPT writes better than I do, Sarbjeet, and I know you have some good examples too. You mentioned the Reid Hoffman example. >> Yeah, I was listening to Reid Hoffman's fireside chat with ChatGPT, and the answers were coming in voice format. And it was amazing; he was having a very philosophical kind of talk with ChatGPT, with longer sentences, just like we are talking. He was talking for almost two minutes and then ChatGPT was answering. It was not a one-sentence question and then a lot of answers from ChatGPT, and yeah, you're right. This is our ability. I've been thinking deeply about this since yesterday, when we talked about wanting to do this segment. The data is fed into the model. It can be current data as well, but I think models like ChatGPT, other companies will have those too.
They're democratizing intelligence, but they're not creating intelligence yet, I can definitely say that. They will give you all the finite answers. Like, okay, how do you do this for loop in Java versus, you know, C#, and as a programmer you can do that. But they can't tell you how to write a new algorithm, or write a new search algorithm for you. They cannot create secret code for you to- >> Not yet. >> Have competitive advantage. >> Not yet, not yet. >> But you- >> Can Google do that today? >> No one really can. The reasoning side of the data is something we talked about at our Supercloud event with Zhamak Dehghani, who's now CEO of Nextdata. This next wave of data intelligence is going to come from entrepreneurs that are probably cross-discipline, computer science and some other discipline. But there are going to be new things, for example, around data and metadata. It's hard to do reasoning like a human being, so that needs more data to train itself. So I think the first gen of training for the large language models is a corpus of text. A lot of that is blog posts, but the facts are wrong and sometimes out of context, because that contextual reasoning takes time, it takes intelligence. So machines need to become intelligent, and therefore they need to be trained. So you're going to start to see, I think, a lot of acceleration on training the data sets. And again, it's only as good as the data you can get. And again, proprietary data sets will be a huge winner. Anyone who's got a large corpus of proprietary content, like theCUBE or SiliconANGLE as a publisher, will benefit from this. Large FinTech companies, anyone with large proprietary data will probably be a big winner on this generative AI wave, because it will eat that up and turn it back into something better. So I think there's going to be a lot of interesting things to look at here.
And certainly productivity's going to be off the charts, and the internet is going to get swarmed with vanilla content. So if you're in the content business, and you're an original content producer of any kind, you're going to be not vanilla, so you're going to be better. So I think there's so much at play, Dave (indistinct). >> I think the playing field has risen, so we- >> Risen and leveled? >> Yeah, and leveled to a certain extent. So it's not like only a few people, as consumers of AI, will have an advantage that others can't have. It will be democratized, I'm sure about that. But take the example of the calculator. When the calculator came in, a lot of people said, "Oh, people can't do math anymore because the calculator is there," right? So it's a similar sort of moment, just like a calculator for the next level. But, again- >> I see it more like open source, Sarbjeet, because if you think about what ChatGPT's doing, you do a query and it comes from somewhere; the value of a post from ChatGPT is just a reuse of AI. The original content will come from a human. So if I lay out a paragraph from ChatGPT that did some heavy lifting on some facts, I check the facts, it saves me about maybe- >> Yeah, it's productive. >> An hour of writing, and then I write a killer two, three sentences of sharp original thinking or critical analysis. I've then taken that body of work, open source content, and laid something on top of it. >> And Sarbjeet's example is a good one, because like the calculator, kids don't do math as well anymore; remember we had slide rules as kids. And remember when we first started using Waze, we were this minority and you had an advantage over other drivers. Now Waze is like, you know, social traffic navigation, and everybody has it- >> All the back roads are crowded. >> They're crowded. (group laughs) Exactly. All right, let's move on.
What about this notion that futurist Roy Amara put forth, really Amara's Law, that we're showing here: "We tend to overestimate the effect of technology in the short run and underestimate it in the long run." Is that the case, do you think, with ChatGPT? What do you think Sarbjeet? >> I think that's true actually. There's a lot of- >> We don't debate this. >> There's a lot of awe, like when people see the results from ChatGPT, they say, what, it can do this? But then if you use it more and more, and ask a set of similar questions, not the same question, it gives you the same answer. It's like it's reading from the same bucket of text (indistinct); you will see that in a couple of segments. It sounds so boring that ChatGPT is coming out with the same two sentences every time. So it is kind of good, but it's not as good as people think it is right now. But we will go through this hype cycle and get realistic with it. And in the long term I think it's a great thing; in the short term, it's not something which will (indistinct) >> What's your counterpoint? You're saying it's not. >> No, I think the premise was it's overhyped in the short term and underestimated in the long term. That's what I think the quote said. >> Yes, yeah. That's what he said. >> Okay, I think that's wrong in this case, because ChatGPT is a unique kind of impact and it's very generational. People have been comparing it, I have been comparing it, to the internet, like the web, the web browsers Mosaic and Netscape Navigator, right. I mean, I clearly still remember the days seeing Navigator for the first time, wow. And there weren't many sites you could go to; everyone typed in, you know, cars.com, you know. >> But (indistinct) wasn't that overestimated, overhyped at the beginning, and underestimated?
>> No, it was underestimated in the long run, people thought. >> But that's Amara's law. >> That's what it is. >> No, they said overestimated? >> Overhyped near term, underestimated long term. I got it right, I mean? >> Well, yeah okay, so I would then agree, okay then- >> We were off the charts about the internet in the early days, and it actually exceeded our expectations. >> Well, there were people who were poo-pooing it early on. So when the browser came out, people were like, "Oh, the web's a toy for kids." I mean, in 1995 the web was a joke, right? By '96, you had online populations growing, so you had structural changes going on around the browser and the internet population. And then that replaced other things, direct mail, other business activities that were once analog then went to the web, kind of read-only, as we always talk about. So I think that's a moment where, long term, the smart money and the smart industry experts all get the long game. And in this case, there's more poo-pooing in the short term. "Ah, it's not a big deal, it's just AI." I've heard many people poo-pooing ChatGPT, and a lot of smart people saying, "No, this is next gen, this is different, and it's only going to get better." So I think the smart people are seeing a big long game on this one. >> So you're saying it's bifurcated. There's those who say- >> Yes. >> Okay, all right, let's get to the heart of the premise, and possibly the debate, for today's episode. Will OpenAI's early entry into the market confer sustainable competitive advantage for the company? If you look at the history of the technology industry, it's kind of littered with first mover failures. Altair, IBM, Tandy, Commodore, and even Apple were really early in the PC game. They took a backseat to Dell, who came on the scene years later with a better business model.
Netscape, you were just talking about, was all the rage in Silicon Valley, with the first browser, drove up all the housing prices out here. AltaVista was the first search engine to really, you know, index full text. >> Owned by Dell, I mean DEC. >> Owned by Digital. >> Yeah, Digital Equipment. >> Compaq bought it. And of course as an aside, Digital wanted to showcase their hardware, right? Their supercomputer stuff. And then Friendster and MySpace came before Facebook. The iPhone certainly wasn't the first mobile device. So lots of failed examples, but there are some recent successes, like AWS and cloud. >> You could say smartphone. So I mean. >> Well I know, and we can parse this, so we'll debate it. Now Twitter, you could argue, had first mover advantage. You kind of gave me that one, John. Bitcoin and crypto clearly had first mover advantage, and sustained it. Guys, will OpenAI make it to the list on the right with ChatGPT, what do you think? >> I think categorically as a company it probably won't, but as a category, I think what they're doing will. So OpenAI as a company, they get funding, there's power dynamics involved. Microsoft put a billion dollars in early on, and now they've ponied up more; they're reportedly putting in 10 billion more. With the browsers, Microsoft had competitive advantage over Netscape and used monopoly power, and was found by the Department of Justice to have killed Netscape with their monopoly. Netscape should have won that battle, but Microsoft killed it. In this case, Microsoft's not killing it, they're buying into it. So I think the embrace-and-extend Microsoft power here makes OpenAI vulnerable as a one-vendor solution. So OpenAI as a company might not make the list, but the category of what this is, large language model AI, probably will be on the right hand side.
>> Okay, we're going to come back to the government intervention and maybe do some comparisons, but what are your thoughts on the premise here, that ChatGPT's early entry into the market will not confer competitive advantage- >> For OpenAI. >> For OpenAI, yeah. Do you agree with that? >> I agree with that actually, because Google has been at it, and they have been holding back, as John said, because of the scrutiny from the feds, right, so- >> And privacy too. >> And the privacy and the accuracy as well. But I think Sam Altman and company, those guys, right? They have put this out there in a hasty way, you know, because it makes mistakes, and there are a lot of questions around where the content is coming from. You saw that in your example; it just stole the content, without your permission, you know? >> Yeah. So as a quick aside- >> And it writes code on people's behalf, and that code is wrong. So there's a lot of, sort of, false information it's putting out there. So it's a very vulnerable thing to do what Sam Altman- >> So even though it'll get better, others will compete. >> So look, just a side note, a term which Reid Hoffman used a little bit. He said it's an experimental launch, like, you know, it's- >> It's pretty damn good. >> It is clever, because according to Sam- >> It's more than clever. It's good. >> It's awesome, if you haven't used it. I mean, you read what it writes and you go, "This thing writes so well, it writes so much better than you." >> The human emotion drives that too. I think that's a big thing. But- >> I want to add one more- >> Make your last point. >> Last one. Okay. So, he's still holding back. He's conducting quite a few interviews. If you want to get the gist of it, there's a StrictlyVC interview from yesterday with Sam Altman. Listen to that one; it's eye-opening where they want to take it.
But my last point is that Satya Nadella yesterday did an interview with the Wall Street Journal. I think he was doing- >> You were not impressed. >> I was not impressed, because he was pushing it too much. So Sam Altman's holding back so there's less backlash. >> Got 10 billion reasons to push. >> I think he's almost- >> Microsoft just laid off 10,000 people. Hey ChatGPT, find me a job. You know, like. (group laughs) >> He's overselling it to an extent that I think it will backfire on Microsoft. And he's overpromising a lot of stuff right now, I think. I don't know why he's so jittery about all these things. And he did the same thing during Ignite as well. He said, "Oh, this AI will write code for you and this and that." Like you called him out- >> The hyperbole- >> During your- >> From Satya Nadella, he's got a lot of hyperbole. (group talks over each other) >> All right, let's, go ahead. >> Well, can I weigh in on the whole- >> Yeah, sure. >> Microsoft thing, on whether OpenAI, here's the take on this. I think it's more like the browser moment to me, because I could relate to that experience with ChatGPT, personally, emotionally, when I saw that, and I remember vividly- >> You mean that aha moment (indistinct). >> Like this is obviously the future. Anything else in the old world is dead, websites are going to be everywhere. It was just instant dot connection for me. And a lot of other smart people saw this. A lot of people, by the way, didn't see it. Someone said the web's a toy. At the company I worked for at the time, Hewlett-Packard, they could have been in; they had invented HTML, and they just passed; the web was just being passed over. But at that time, the browser got better, more websites came on board. So the structural advantage there was online web usage was growing, the online user population. That was growing exponentially with the rise of the Netscape browser.
So OpenAI could stay on the right side of your list as durable, if they leverage the category that they're creating and can get the scale. And if they can get the scale, just like Twitter, which failed so many times and still hung around, because it was a product people kept coming back to, right? So I mean, it should have- >> You're right, it was terrible, but we kept coming back. >> The fail whale, but it still grew. So OpenAI has that moment. They could do it if Microsoft doesn't meddle too much with too much power as a vendor. They could be the Netscape Navigator without the anti-competitive behavior of somebody else. So to me, they have pole position. They have an opportunity. And if they don't execute, then there's opportunity for others. There's not a lot of barriers to entry, vis-a-vis, say, the CapEx of a cloud company like AWS. You can't replicate that, many have tried, but I think you can replicate OpenAI. >> And we're going to talk about that. Okay, so real quick, I want to bring in some ETR data. This isn't an ETR-heavy segment, only because this is so new that they don't have coverage yet, but they do cover AI. So basically what we're seeing here is a slide where the vertical axis is Net Score, which is a measure of spending momentum, and the horizontal axis is presence in the dataset. Think of it as, like, market presence. And in the insert right there, you can see how the dots are plotted, the two columns. But the key point here that we want to make is there's a bunch of companies on the left, like, you know, DataRobot and C3 AI and some others, but the big whales, Google, AWS, Microsoft, are really dominant in this market. So that's really the key takeaway that, can we- >> I notice IBM is way low. >> Yeah, IBM's low. And actually, bring that back up, but then you see Oracle, who actually is injecting it.
So I guess that's the other point: you're not necessarily going to go buy AI and build your own AI. It's going to be there, and Salesforce is going to embed it into its platform, the SaaS companies, and you're going to purchase AI. You're not necessarily going to build it. But some companies obviously are. >> I mean, to quote IBM's general manager Rob Thomas, "You can't have AI without IA," information architecture. >> You can't have AI without IA. >> Right, if you have an information architecture, you then can power AI. Yesterday David Flynn, with Hammerspace, was on our Supercloud. He was pointing out that the relationship of storage, where you store things, also impacts the data and its addressability, and Zhamak from Nextdata was pointing out that same thing. So the data problem factors into all this too, Dave. >> So you've got the big cloud and internet giants all poised to go after this opportunity. Microsoft is investing up to 10 billion. Google's code red, which was, you know, the headline in the New York Times. Of course Apple is there, and there are several alternatives in the market today, like Chinchilla, Bloom, and there's a company Jasper and several others. And then Lina Khan looms large, and governments around the world, EU, US, China, are all taking notice before the market really has coalesced around a single player. You know, John, you mentioned Netscape; the US government was way late to that game. It was kind of game over. And Netscape, I remember Barksdale was like, "Eh, we're going to be selling software in the enterprise anyway," and then, pshew, the company just dissipated. But it looks like the US government, especially with Lina Khan, is changing the definition of antitrust and what the cause is to go after companies, and they're really much more aggressive. It's only what, two years ago that (indistinct).
>> Yeah, the problem I have with the federal oversight is this: they're always late to the game, and they're slow to catch up. In other words, they're working on stuff that should have been solved a year and a half, two years ago, around some of the social networks hiding behind some of the rules around the open web back in the day, and I think- >> But they're like 15 years late to that. >> Yeah, and now they've got this new thing on top of it. So I just worry about them getting their fingers in it. >> But OpenAI's only two years in, you know. >> No, but the thing (indistinct). >> No, they're still fighting other battles. But the problem with government is that they're going to label Big Tech as an evil thing, like Pharma, like smoking- >> You know Lina Khan wants to kill Big Tech, there's no question. >> So I think Big Tech is getting a very seriously bad rap. And I think anything the government does that shades darkness on tech is politically motivated in most cases. You can almost look at everything, and my 80/20 rule is in play here: 80% of the government activity around tech is bullshit, it's politically motivated, and the 20% is probably relevant, but off the mark and not organized. >> Well, market forces have always been the determining factor of success. The governments, you know, have pretty much failed. I mean, you look at IBM's antitrust case, what did that do? The market ultimately beat them. You look at Microsoft back in the day, right? Windows 95 was peaking, the government came in. But you know, like you said, they missed the web, right, and- >> So they were hanging on- >> There's nobody in government- >> To Windows. >> That actually knows- >> And so I think you're right. It's market forces that are going to determine this. But Sarbjeet, what do you make of Microsoft's big bet here? You weren't impressed with Nadella. How do you think, where are they going to apply it?
Is this going to be a Hail Mary for Bing, or is it going to be applied elsewhere? What do you think? >> They are saying that they will weave this into their products: Office products, productivity, and also writing code, developer productivity as well. That's a big play for them. But coming back to your antitrust comments, right? I believe your comment was, oh, the feds were 10 or 15 years late before, but now it's two years. But things are moving very fast now compared to how they used to move. >> So two years is like 10 years. >> Yeah, two years is like 10 years. Just want to make that point. (Dave laughs) This thing is going like wildfire. With any new tech which comes in, I think they're going after distribution channels. Lina Khan has commented time and again that the marketplace model is something she wants to have some grip on. Cloud marketplaces are kind of monopolistic. >> I don't see this, I don't see a Chat AI. >> You told me it's not Bing, you had an interesting comment. >> No, no. First of all, this is great for Microsoft. If you're Microsoft- >> Why? >> Because Microsoft doesn't have the AI chops that Google has, right? Google has got so much core competency in how they run their search, how they run their backends, their cloud. Even though they don't get a lot of cloud market share in the enterprise, they've got a kick-ass cloud 'cause they needed one. >> Totally. >> They invented SRE. I mean, Google's development and engineering chops are off the scales, right? Amazon's got some good chops, but Google's got like 10 times more chops than AWS, in my opinion. Cloud's a whole different story. Microsoft gets AI, they get a playbook, they get a product they can render it into: not only Bing, but productivity software, helping people write papers, PowerPoint, and also, don't forget, AI can super help the cloud.
We had this conversation at our Supercloud event, where AI's going to do a lot of the heavy lifting, from understanding observability, to managing service meshes and microservices, to turning applications on and off, and maybe writing code in real time. So there's a plethora of use cases for Microsoft to deploy this. Combined with their R&D budgets, they can then turbocharge more research and build on it. So I think this gives them a car in the race. Google may have pole position with AI, but this puts Microsoft right in the game, and they already have a lot of stuff going on. Everything gets lifted up: security, cloud, productivity suite, everything. >> What's under the hood at Google, and why aren't they talking about it? I mean, they've got to be freaked out about this. No? Or do they have kind of a magic bullet? >> I think they have the chops, definitely. Magic bullet, I don't know where they are compared to the GPT-3 or 4 models. But if you look at the online activity and the videos put out there by Google technology folks, that's the account you should look at if you're looking; the techniques GPT-3 has used, they have been talking about for a while as well. So it's not like it's a secret thing that you cannot replicate. As you said earlier, in the beginning of this segment, anybody who has more data and the capacity to process that data will win this, and Google has both. >> Obviously living in Palo Alto, where the Google founders are and Google's headquarters is the next town over, we have- >> We're so close to them. >> We have inside information on some of the thinking, and that hasn't been reported by any outlet yet. And that is, from what I'm hearing from my sources, Google has it; they don't want to release it for many reasons.
One, it might screw up their search monopoly; two, they're worried about the accuracy, 'cause Google will get sued. 'Cause a lot of people are jamming on this ChatGPT as, "Oh, it does everything for me," when it's clearly not a hundred percent accurate all the time. >> So Lina Khan is looming, and so Google's like, be careful. >> Yeah, so Google's just like, this could be a third rail. >> But the first thing you said is a concern. >> Well, no. >> The disruptive (indistinct) >> What they will do is a Waymo kind of thing, where they spin out a separate company. >> They're doing that. >> The discussions are happening; they're going to spin out a separate company and put it over there, and say, "This is AI, we've got search over there, don't touch that search, 'cause that's where all the revenue is." (chuckles) >> So, okay, so that's how they deal with the Clay Christensen dilemma. What's the business model here? I mean, it's not advertising, right? Is it to charge you for a query? How do you make money at this? >> It's a good question. I mean, my thinking is, first of all, it's cool to type stuff in and see a paper get written, or write a blog post, or give me a marketing slogan for this or that, or write some code. I think the API side of the business will be critical. And I think Howie Xu, I know you're going to reference some of his comments yesterday from Supercloud, I think this brings a whole 'nother user interface into technology consumption. I think the business model is not yet clear, but it will probably be some sort of API and developer environment, or just a straight-up free consumer product with some sort of freemium backend thing for business. >> And he was saying too, natural language is the way in which you're going to interact with these systems. >> I think it's APIs, APIs, APIs, because for the people who are cooking up these models, it takes a lot of compute power to train them, and for inference as well.
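The API-first consumption model being described here is already concrete: you POST a prompt to a hosted model and get text back. A minimal sketch against OpenAI's HTTP completions endpoint as it existed at the time (the endpoint, model name, and response shape follow OpenAI's public API docs of that era; the `OPENAI_API_KEY` environment variable and the prompt are placeholders, and no request is made unless a key is set):

```python
import json
import os
import urllib.request

# OpenAI's text completions endpoint (per its public API documentation)
API_URL = "https://api.openai.com/v1/completions"

def build_request(prompt, model="text-davinci-003", max_tokens=100):
    """Assemble the JSON payload and auth headers for a completion call."""
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
    }
    return payload, headers

def complete(prompt):
    """POST the prompt and return the first completion's text."""
    payload, headers = build_request(prompt)
    req = urllib.request.Request(
        API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["text"]

if __name__ == "__main__":
    if os.environ.get("OPENAI_API_KEY"):
        print(complete("Give me a thesis about healthcare policy in the United States."))
    else:
        print("Set OPENAI_API_KEY to send a real request.")
```

This per-token, pay-per-call shape is exactly why the compute cost of inference, discussed next, matters so much to the business model.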
Somebody did the analysis on how many cents a Google search costs Google, and how many cents a ChatGPT query costs. It's, you know, 100x or something like that. You can take a look at that. >> 100x on which side? >> You're saying two orders of magnitude more expensive for ChatGPT. >> Much more, yeah. >> Than for Google. >> It's very expensive. >> So Google's got the data, they've got the infrastructure, and you're saying they've got the cost (indistinct) >> No, actually it's a simple query as well, but they are trying to put together the answers, and they're going through a lot more data, versus data that's already indexed, you know. >> Let me clarify, you're saying that Google's version of ChatGPT is more efficient? >> No, I'm saying Google search results. >> Ah, search results. >> What we're used to today, but cheaper. >> But is that going to confer advantage to Google's large language (indistinct)? >> It will, because there's deep science (indistinct). >> I don't think Google search is doing a large language model on their search, it's keyword search. You know, what's the weather in Santa Cruz? Or what's the weather going to be? Or how do I find this? Now they have done a smart job of doing some things with those queries: auto-complete, redirect navigation. But it's not entity-based. It's not like, "Hey, what's Dave Vellante thinking this week in Breaking Analysis?" ChatGPT might get that, because it'll get your Breaking Analysis, it'll synthesize it. There'll be maybe some clips. It'll be like, you know, I mean. >> Well, I've got to tell you, I asked ChatGPT, I said, I'm going to enter a transcript of a discussion I had with Nir Zuk, the CTO of Palo Alto Networks, and I want you to write a 750-word blog. I never input the transcript. It wrote a 750-word blog. It attributed quotes to him, and it just pulled a bunch of stuff and said, okay, here it is.
It talked about Supercloud, it defined Supercloud. >> It's made, it makes you- >> Wow, but it was a big lie. It was fraudulent, but still, blew me away. >> Again, vanilla content and non-accurate content. So we are going to see a surge of misinformation on steroids, but I call it the vanilla content. Wow, that's just so boring, (indistinct). >> There's so many dangers. >> Make your point, 'cause we're almost out of time. >> Okay, so the consumption, like how do you consume this thing. As humans, we are consuming it and we are, like, getting nicely, like, surprisingly shocked, you know, wow, that's cool. It's going to increase productivity and all that stuff, right? And on the danger side as well, the bad actors can take hold of it and create fake content, and we have the fake sort of intelligence, if you go out there. So that's one thing. The second thing is, we as humans are consuming this as language. Like we read that, we listen to it, whatever format we consume it in, but the ultimate usage of that will be when the machines can take that output from the likes of ChatGPT, and do actions based on that. The robots can work, the robot can paint your house, we were talking about, right? Right now we can't do that. >> Data apps. >> So the data has to be ingested by the machines. It has to be digestible by the machines. And the machines cannot digest unorganized data right now, so we will get better on the ingestion side as well. So we are getting better. >> Data, reasoning, insights, and action. >> I like that model, paint my house. >> So, okay- >> By the way, that means drones that'll come in, spray painting your house. >> Hey, it wasn't too long ago that robots couldn't climb stairs, as I like to point out. Okay, and of course it's no surprise the venture capitalists are lining up to eat at the trough, as I like to say.
Let's hear, you'd referenced this earlier, John, let's hear what AI expert Howie Xu said at the Supercloud event, about what it takes to clone ChatGPT. Please, play the clip. >> So one of the VCs actually asked me the other day, right? "Hey, how much money do I need to spend, invest to get a, you know, another shot at the OpenAI sort of level." You know, I did a (indistinct) >> Line up. >> A hundred million dollars is the order of magnitude that I came up with, right? You know, not a billion, not 10 million, right? So a hundred- >> Guys, a hundred million dollars, that's an astoundingly low figure. What do you make of it? >> I was in an interview with, I was interviewing, I think he said hundred million or so, but in the hundreds of millions, not a billion, right? >> You were trying to get him up, you were like "Hundreds of millions." >> Well I think, I- >> He's like, eh, not 10, not a billion. >> Well first of all, Howie Xu's an expert in machine learning. He's at Zscaler, he's a machine learning AI guy. But he comes from VMware, his technology pedigree's really off the chart. Great friend of theCUBE and kind of like a CUBE analyst for us. And he's smart. He's right. I think the barriers to entry from a dollar standpoint are lower than say the CapEx required to compete with AWS. Clearly, the CapEx spending to build all the tech to run a cloud. >> And you don't need a huge sales force. >> And in some case apps too, it's the same thing. But I think it's not that hard. >> But am I right about that? You don't need a huge sales force either. It's, what, you know >> If the product's good, it will sell, this is a new era. The better mouse trap will win. This is the new economics in software, right? So- >> Because you look at the amount of money Lacework, Snyk, Snowflake, Databricks have raised. I mean it's like a billion dollars before they get to IPO or more. 'Cause they need promotion, they need go to market.
You don't need (indistinct) >> OpenAI's been working on this for five-plus years, it wasn't born yesterday. Took a lot of years to get going. And Sam is depositioning all the success, because he's trying to manage expectations, to your point earlier, Sarbjeet. It's like, yeah, he's trying to "Whoa, whoa, settle down everybody, (Dave laughs) it's not that great," because he doesn't want to fall into that, you know, hero-and-then-get-taken-down cycle, so. >> It may take a 100 million or 150 or 200 million to train the model. But for the inference machine, it will take a lot more, I believe. >> Give it, so imagine, >> Because- >> Go ahead, sorry. >> Go ahead. But because it consumes a lot more compute cycles and a certain level of storage and everything, right, which they already have. So I think the compute is different. To train the model is a different cost. But to run the business is different, because I think 100 million can go into just fighting the Fed. >> Well there's a flywheel too. >> Oh that's (indistinct) >> (indistinct) >> We are running the business, right? >> It's an interesting number, but there's also kind of, like, context to it. So here, a hundred million, spend it, you get there, but you got to factor in the fact that the way companies win these days is critical mass scale, hitting a flywheel. If they can keep that flywheel of the value that they got going on and get better, you can almost imagine a marketplace where, hey, we have proprietary data, we're SiliconANGLE and theCUBE. We have proprietary content, CUBE videos, transcripts. Well wouldn't it be great if someone in a marketplace could sell a module for us, right? We buy that, Amazon's thing and things like that. So if they can get a marketplace going where you can apply to data sets that may be proprietary, you can start to see this become bigger. And so I think the key barrier to entry is going to be success. I'll give you an example, Reddit.
Reddit is successful and it's hard to copy, not because of the software. >> They built the moat. >> Because you can buy Reddit open source software and try to compete. >> They built the moat with their community. >> Their community, their scale, their user expectation. Twitter, we referenced earlier, that thing should have gone under in the first two years, but there was such a great emotional product. People would tolerate the fail whale. And then, you know, well that was a whole 'nother thing. >> Then a plane landed in (John laughs) the Hudson and it was over. >> I think verticals, a lot of verticals will build applications using these models, like for lawyers, for doctors, for scientists, for content creators, for- >> So you'll have many hundreds of millions of dollars in investments that are going to be seeping out. If, all right, we got to wrap, if you had to put odds on it that OpenAI is going to be the leader, maybe not a winner-take-all leader, but like you look at Amazon and cloud, they're not winner take all, these aren't necessarily winner-take-all markets. It's not necessarily a zero sum game, but let's call it winner take most. What odds would you give that OpenAI 10 years from now will be in that position? >> If I'm 0 to 10 kind of thing? >> Yeah, it's like a horse race, 3 to 1, 2 to 1, even money, 10 to 1, 50 to 1. >> Maybe 2 to 1, >> 2 to 1, that's pretty low odds. That's basically saying they're the favorite, they're the front runner. Would you agree with that? >> I'd say 4 to 1. >> Yeah, I was going to say I'm like a 5 to 1, 7 to 1 type of person, 'cause I'm a skeptic with, you know, there's so much competition, but- >> I think they're definitely the leader. I mean you got to say, I mean. >> Oh there's no question. There's no question about it. >> The question is can they execute? >> They're not Friendster, is what you're saying. >> They're not Friendster and they're more like Twitter and Reddit where they have momentum.
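For readers keeping score at home, fractional odds like the ones floated here convert to implied probabilities in the standard way: odds of a-to-1 against an outcome imply a probability of 1/(a+1) for that outcome. A quick sketch (the function name is ours, purely illustrative):

```python
def implied_probability(odds_against: float) -> float:
    """Convert fractional odds of `odds_against`-to-1 against an outcome
    into the implied probability of that outcome."""
    return 1.0 / (odds_against + 1.0)

# The odds given in the conversation:
for who, odds in [("Sarbjeet (2 to 1)", 2), ("Dave (4 to 1)", 4),
                  ("John, low end (5 to 1)", 5), ("John, high end (7 to 1)", 7)]:
    print(f"{who}: implied probability {implied_probability(odds):.3f}")
```

So "2 to 1" is roughly a one-in-three chance that OpenAI holds the winner-take-most position, 4 to 1 is 20%, and 7 to 1 is 12.5%, which is why Dave reads 2 to 1 as front-runner odds.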
If they can execute on the product side, and if they don't stumble on that, they will continue to have the lead. >> If they stay neutral, as Sam has been saying, that, hey, Microsoft is one of our partners, if you look at their company model, how they have structured the company, then they're going to pay back the investors, like Microsoft is the biggest one, up to a certain point, like by a certain number of years, they're going to pay back from all the money they make, and after that, they're going to give the money back to the public, to the, I don't know who they give it to, like a non-profit or something. (indistinct) >> Okay, the odds are dropping. (group talks over each other) That's a good point though. >> Actually they might have done that to fend off the criticism of this. But it's really interesting to see the model they have adopted. >> The wildcard in all this, my last word on this, is that if there's a developer shift in how developers and data can come together again, we have conferences around the future of data, Supercloud and meshes versus, you know, how the data world, coding with data, how that evolves will also dictate, 'cause a wild card could be a shift in the landscape around how developers are using either machine learning or AI-like techniques to code into their apps, so. >> That's fantastic insight. I can't thank you enough for your time, on the heels of Supercloud 2, really appreciate it. All right, thanks to John and Sarbjeet for the outstanding conversation today. Special thanks to the Palo Alto studio team. My goodness, Anderson, this is a great backdrop. You guys got it all out here, I'm jealous. And Noah, really appreciate it, Chuck, Andrew Frick and Cameron, Andrew Frick switching, Cameron on the video lake, great job. And Alex Myerson, he's on production, manages the podcast for us, Ken Schiffman as well. Kristen Martin and Cheryl Knight help get the word out on social media and our newsletters.
Rob Hof is our editor-in-chief over at SiliconANGLE, does some great editing, thanks to all. Remember, all these episodes are available as podcasts. All you got to do is search Breaking Analysis podcast, wherever you listen. We publish each week on wikibon.com and siliconangle.com. Want to get in touch, email me directly, david.vellante@siliconangle.com or DM me at dvellante, or comment on our LinkedIn post. And by all means, check out etr.ai. They've got really great survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, we'll see you next time on Breaking Analysis. (electronic music)
Joseph Nelson, Roboflow | AWS Startup Showcase
(chill electronic music) >> Hello everyone, welcome to theCUBE's presentation of the AWS Startup Showcase, AI and machine learning, the top startups building generative AI on AWS. This is season three, episode one of the ongoing series covering the exciting startups from the AWS ecosystem, talking about AI and machine learning. Can't believe it's been three years since season one. I'm your host, John Furrier. Got a great guest today, we're joined by Joseph Nelson, the co-founder and CEO of Roboflow, doing some cutting edge stuff around computer vision and really at the front end of this massive wave coming around large language models and computer vision. The next gen AI is here, and it's just getting started. We haven't even scratched the surface. Thanks for joining us today. >> Thanks for having me. >> So you got to love the large language models, foundation models, really educating the mainstream world. ChatGPT has got everyone in a frenzy. This is educating the world around these next gen AI capabilities, enterprise, image and video data, all a big part of it. I mean the edge of the network, Mobile World Conference is happening right now, this month, and it's just continuing to explode. Video is huge. So take us through the company, do a quick explanation of what you guys are doing, when you were founded. Talk about what the company's mission is, and what's your North Star, why do you exist? >> Yeah, Roboflow exists to really kind of make the world programmable. I like to say make the world read and write access. And our North Star is enabling developers, predominantly, to build that future. If you look around, anything that you see will have software related to it, and can kind of be turned into software. The limiting reactant, though, is how to enable computers and machines to understand things as well as people can. And in a lot of ways, computer vision is that missing element that enables anything that you see to become software.
So if software is eating the world, computer vision kind of makes the aperture infinitely wide. That's the way I kind of like to frame it. And the capabilities are there, the open source models are there, the amount of data is there, the compute capabilities are only improving annually, but there's a pretty big dearth of tooling, and an early but promising sign of the explosion of use cases, models, and data sets that companies, developers, hobbyists alike will need to bring these capabilities to bear. So Roboflow is in the game of building the community around that capability, building the use cases that allow developers and enterprises to use computer vision, and providing the tooling for companies and developers to be able to add computer vision, create better data sets, and deploy to production, quickly, easily, safely, and valuably. >> You know, Joseph, the phrase "in production" is actually real now. You're seeing a lot more people doing in-production activities. That's a real hot one, and usually it's slower, but it's gone faster, and I think that's going to be more the same. And I think the parallel between what we're seeing on the large language models coming into computer vision, and as you mentioned, video's data, right? I mean we're doing video right now, we're transcribing it into a transcript, linking up to your linguistics, times and the timestamp, I mean everything's data and that really kind of feeds. So this connection between what we're seeing, large language models and computer vision are coming together, kind of cousins, brothers. I mean, how would you compare, how would you explain to someone, because everyone's like on this wave of watching people bang out their homework assignments, and you know, write some hacks on code with some of the OpenAI technologies, there is a corollary directly related to the vision side. Can you explain?
>> Yeah, the rise of large language models are showing what's possible, especially with text, and I think increasingly will get multimodal as the images and video become ingested. Though there's kind of this still core missing element of basically like understanding. So the rise of large language models kind of create this new area of generative AI, and generative AI in the context of computer vision is a lot of, you know, creating video and image assets and content. There's also this whole surface area to understanding what's already created. Basically digitizing physical, real world things. I mean the Metaverse can't be built if we don't know how to mirror or create or identify the objects that we want to interact with in our everyday lives. And where computer vision comes to play in, especially what we've seen at Roboflow is, you know, a little over a hundred thousand developers now have built with our tools. That's to the tune of a hundred million labeled open source images, over 10,000 pre-trained models. And they've kind of showcased to us all of the ways that computer vision is impacting and bringing the world to life. And these are things that, you know, even before large language models and generative AI, you had pretty impressive capabilities, and when you add the two together, it actually unlocks these kind of new capabilities. So for example, you know, one of our users actually powers the broadcast feeds at Wimbledon. So here we're talking about video, we're streaming, we're doing things live, we've got folks that are cropping and making sure we look good, and audio/visual all plugged in correctly. When you broadcast Wimbledon, you'll notice that the camera controllers need to do things like track the ball, which is moving at extremely high speeds and zoom crop, pan tilt, as well as determine if the ball bounced in or out. The very controversial but critical key to a lot of tennis matches. 
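That in/out call is, at its core, simple geometry once a detector has localized the bounce point. A toy sketch, assuming an upstream model supplies bounce coordinates in court-centered metres; the function and constants below are illustrative, not anything from the actual Wimbledon system:

```python
# Standard singles court: 23.77 m long, 8.23 m wide, measured from the centre.
SINGLES_COURT = {"half_length": 11.885, "half_width": 4.115}

def ball_is_in(x: float, y: float, court=SINGLES_COURT) -> bool:
    """True if the detected bounce point (x, y), in metres from the court's
    centre, lands on or inside the singles lines. The line itself counts as in."""
    return abs(x) <= court["half_width"] and abs(y) <= court["half_length"]

print(ball_is_in(4.0, 11.0))   # -> True, inside the lines
print(ball_is_in(4.2, 11.0))   # -> False, wide of the singles sideline
```

The hard part, of course, is everything upstream of this function: detecting and tracking a fast-moving ball at 30-plus frames per second and mapping pixel coordinates onto the court plane.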
And a lot of that has been historically done with the trained, but fallible, human eye, and computer vision is, you know, well suited for this task to say, how do we track, pan, tilt, zoom, and see, track the tennis ball in real time, run at 30 plus frames per second, and do it all on the edge. And those are capabilities that, you know, were kind of like science fiction, maybe even a decade ago, and certainly five years ago. Now the interesting thing is that with the advent of generative AI, you can start to do things like create your own training data sets, or kind of create logic around once you have this visual input. And teams at Tesla have actually been speaking about, of course the autopilot team's focused on doing vision tasks, but they've combined large language models to add reasoning and logic. So given that you see, let's say the tennis ball, what do you want to do? And being able to combine the capabilities of what LLMs represent, which is really a lot of basically, core human reasoning and logic, with computer vision for the inputs of what's possible, creates these new capabilities, let alone multimodality, which I'm sure we'll talk more about. >> Yeah, and it's really, I mean it's almost intoxicating. It's amazing that this is so capable because the cloud scales here, you got the edge developing, you can decouple compute power, and let Moore's law and all the new silicon and the processors and the GPUs do their thing, and you got open source booming. You're kind of getting at this next segment I wanted to get into, which is how people should be thinking about these advances in computer vision. So this is now a next wave, it's here. I mean I'd love to have that for baseball because I'm always like, "Oh, it should have been a strike." I'm sure that's going to be coming soon, but what is computer vision capable of doing today? I guess that's my first question. You hit some of it, unpack that a little bit.
What does generative AI mean in computer vision? What's the new thing? Because there are old technologies that have been around, proprietary, bolted onto hardware, but hardware advances at a different pace, but now you got new capabilities, generative AI for vision, what does that mean? >> Yeah, so computer vision, you know, at its core is basically enabling machines, computers, to understand, process, and act on visual data as effectively or more effectively than people can. Traditionally this has been, you know, task types like classification, which is, you know, identifying if a given image belongs in a certain category of goods on maybe a retail site, is it shoes or is it clothing? Or object detection, which is, you know, creating bounding boxes, which allows you to do things like count how many things are present, or maybe measure the speed of something, or trigger an alert when something becomes visible in frame that wasn't previously visible in frame. Or instance segmentation, where you're creating pixel-wise segmentations for both instance and semantic segmentation, where you often see these kind of beautiful visuals of the polygon surrounding objects that you see. Then you have keypoint detection, which is where you see, you know, athletes with each of their joints outlined, another more traditional problem type in signal processing and computer vision. With generative AI, you kind of get a whole new class of problem types that are opened up. So in a lot of ways I think about generative AI in computer vision this way: some of the problems that you aim to tackle might still be better suited for one of the previous task types we were discussing, some may be better suited for using a generative technique, and some are problem types that just previously wouldn't have been possible absent generative AI.
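The traditional task types Joseph lists differ mostly in the shape of their outputs. A schematic sketch of those output structures (all field names here are illustrative, not any particular library's API):

```python
from dataclasses import dataclass

@dataclass
class Classification:      # one label for the whole image
    label: str
    confidence: float

@dataclass
class Detection:           # a bounding box per object, so you can count or measure
    label: str
    confidence: float
    box: tuple             # (x_min, y_min, x_max, y_max) in pixels

@dataclass
class SegmentationMask:    # pixel-wise polygon per object instance
    label: str
    polygon: list          # [(x, y), ...] vertices

@dataclass
class Keypoints:           # named joints for pose estimation
    label: str
    points: dict           # {"left_knee": (x, y), ...}

# Two detections are enough to count people, trigger alerts, or measure motion:
preds = [Detection("person", 0.91, (10, 20, 110, 220)),
         Detection("person", 0.87, (150, 30, 240, 210))]
print(len(preds))  # -> 2
```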
And so if you make that kind of Venn diagram in your head, you can think about, okay, you know, visual question answering is a task type where if I give you an image and I say, you know, "How many people are in this image?" We could either build an object detection model that might count all those people, or maybe a visual question answering system would sufficiently answer this type of problem. Let alone generative AI being able to create new training data for old systems. And that's something that we've seen be an increasingly prominent use case for our users, as much as things that we advise our customers and the community writ large to take advantage of. So ultimately those are kind of the traditional task types. I can give you some insight, maybe, into how I think about what's possible today, or five years or ten years as you sort go back. >> Yes, definitely. Let's get into that vision. >> So I kind of think about the types of use cases in terms of what's possible. If you just imagine a very simple bell curve, your normal distribution, for the longest time, the types of things that are in the center of that bell curve are identifying objects that are very common or common objects in context. Microsoft published the COCO Dataset in 2014 of common objects and contexts, of hundreds of thousands of images of chairs, forks, food, person, these sorts of things. And you know, the challenge of the day had always been, how do you identify just those 80 objects? So if we think about the bell curve, that'd be maybe the like dead center of the curve, where there's a lot of those objects present, and it's a very common thing that needs to be identified. But it's a very, very, very small sliver of the distribution. 
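The "how many people are in this image?" example reduces, in the detection approach, to filtering a detector's predictions by class and confidence. A minimal sketch with stand-in detections — a real pipeline would get these from a COCO-pretrained model, and the 0.5 threshold is an arbitrary choice:

```python
def count_objects(detections, target_class, min_confidence=0.5):
    """Count detections of `target_class` above a confidence threshold.
    Each detection is a (class_name, confidence, box) tuple."""
    return sum(1 for cls, conf, _ in detections
               if cls == target_class and conf >= min_confidence)

# Stand-in output from a hypothetical COCO-trained detector:
detections = [
    ("person", 0.94, (12, 40, 96, 210)),
    ("person", 0.81, (130, 35, 200, 205)),
    ("person", 0.31, (300, 60, 340, 150)),  # low confidence, filtered out
    ("chair",  0.88, (220, 120, 300, 240)),
]
print(count_objects(detections, "person"))  # -> 2
```

A visual question answering system would arrive at the same count from the question text directly, without this explicit filtering step, which is the trade-off Joseph describes.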
Now if you go out to the way long tail, let's go like deep into the tail of this imagined visual normal distribution, you're going to have a problem like one of our customers, Rivian, in tandem with AWS, is tackling, to do visual quality assurance in manufacturing and production processes. Now only Rivian knows what a Rivian is supposed to look like. Only they know the imagery of the goods they're going to produce. And then between those long tails of proprietary data of highly specific things that need to be understood, and the center of the curve, you have a whole kind of messy middle of problem types, as I like to say. The way I think about computer vision advancing is, basically you have larger and larger and more capable models that eat from the center out, right? So if you have a model that, you know, understands the 80 classes in COCO, well, pretty soon you have advances like CLIP, which was trained on 400 million image-text pairs, and has a greater understanding of a wider array of objects than just 80 classes in context. And over time you'll get more and more of these larger models that kind of eat outwards from that center of the distribution. And so the question becomes for companies, when can you rely on maybe a model that just already exists? How do you use your data to get what may be capable off the shelf, so to speak, into something that is usable for you? Or, if you're in those long tails and you have proprietary data, how do you take advantage of the greatest asset you have, which is observed visual information that you want to put to work for your customers, when you're kind of living in the long tails, and you need to adapt state of the art for your capabilities? So my mental model for how computer vision advances is you have that bell curve, and you have increasingly powerful models that eat outward.
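CLIP-style models enable that wider understanding through zero-shot classification: embed the image and a set of text prompts into a shared space, then pick the nearest label. A toy illustration of the mechanism with made-up three-dimensional embeddings (real CLIP embeddings have hundreds of dimensions and come from trained encoders):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def zero_shot_classify(image_embedding, label_embeddings):
    """Pick the label whose text embedding is closest to the image embedding."""
    return max(label_embeddings,
               key=lambda label: cosine(image_embedding, label_embeddings[label]))

# Made-up embeddings, purely to show the mechanics:
labels = {
    "a photo of a tennis ball": [0.9, 0.1, 0.2],
    "a photo of a racket":      [0.1, 0.8, 0.3],
}
image = [0.85, 0.15, 0.25]
print(zero_shot_classify(image, labels))  # -> "a photo of a tennis ball"
```

The practical upshot is that adding a new class means adding a new text prompt, not retraining a classifier, which is what lets such models "eat outward" from the center of the distribution.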
And multimodality has a role to play in that, larger models have a role to play in that, more compute and more data generally have a role to play in that. But it will be a messy and, I think, long transition. >> Well, the thing I want to get, first of all, it's a great mental model, I appreciate that, 'cause I think that makes a lot of sense. The question is, it seems now more than ever, with the scale and compute that's available, that not only can you eat out to the middle in your example, but there's other models you can integrate with. In the past it was siloed, static, almost bespoke. Now you're looking at larger models eating into the bell curve, as you said, but also integrating in with other stuff. So this seems to be part of that interaction. How does, first of all, is that really happening? Is that true? And then two, what does that mean for companies who want to take advantage of this? Because the old model was operational, you know? I have my cameras, they're watching stuff, whatever, and like now you're in this more of a distributed computing, computer science mindset, not, you know, put the camera on the wall kind of- I'm oversimplifying, but you know what I'm saying. What's your take on that? >> Well, to the first point of how these advances are happening, what I was kind of describing was, you know, almost uni-dimensional in that you're only thinking about vision, but the rise of generative techniques and multi-modality, like CLIP is a multi-modal model, trained on 400 million image-text pairs. That will advance the generalizability at a faster rate than just treating everything as only vision. And that's kind of where LLMs and vision will intersect in a really nice and powerful way. Now in terms of companies, how should they be thinking about taking advantage of these trends? The biggest thing, and I think it's different, obviously, depending on the size of business, if you're an enterprise versus a startup.
The biggest thing, I think, if you're an enterprise, and you have an established scaled business model that is working for your customers, the question becomes, how do you take advantage of that established data moat, potentially resource moats, and certainly, of course, an established way of providing value to an end user. So for example, one of our customers, Walmart, has the advantage of one of the largest inventories of any company in the world. And they also of course have substantial visual data, both from their online catalogs, or understanding what's in stock or out of stock, or understanding, you know, the quality of things as they go from the start of the supply chain to making it inside stores, for delivery and fulfillment. All these are visual challenges. Now they already have a substantial trove of useful imagery to teach and train large models to understand each of the individual SKUs and products that are in their stores. And so if I'm a Walmart, what I'm thinking is, how do I make sure that my petabytes of visual information are utilized in a way where I capture the proprietary benefit of the models that I can train to do tasks like, what item was this? Or maybe I'm going to create Amazon Go-like technology, or maybe I'm going to build delivery robots, or I want to automatically know what's in and out of stock from visual input feeds that I have across my in-store traffic. And that becomes the question and flavor of the day for enterprises. I've got this large amount of data, I've got an established way that I can provide more value to my own customers. How do I ensure I take advantage of the data advantage I'm already sitting on? If you're a startup, I think it's a pretty different question, and I'm happy to talk about it.
It's like cloud startups, cloud native startups: they were born in the cloud, they never had an IT department. So if you're a startup, is there a similar role here? And if I'm a computer vision startup, what does that mean? So can you share your take on that, because there'll be a lot of people starting up from this. >> So the startup has the opposite advantage and disadvantage, right? A startup doesn't have a proven way of delivering repeatable value in the same way that a scaled enterprise does. But it does have the nimbleness to identify and take advantage of techniques, because you can start from a blank slate. And I think the thing that startups need to be wary of, in this generative AI, large language model, multimodal world, is building what I like to call sandcastles. A sandcastle is maybe a business model or a capability that's built on top of an assumption that is going to be pretty quickly wiped away by improving underlying model technology. So almost like, if you imagine the ocean, the waves are coming in, and they're going to wipe away your progress. You don't want to be in the position of building a sandcastle business; you don't want to bet on the fact that models aren't going to get good enough to solve the task type that you might be solving. In other words, don't take a screenshot of what's capable today. Assume that what's capable today is only going to keep expanding. And so for a startup, what you can do, that enterprises are comparatively less good at, is embedding these capabilities deeply within your products and delivering maybe a vertical-based experience, where AI kind of exists in the background. >> Yeah. >> And we might not even think of those companies as AI companies, it's just so embedded in the experience they provide, but that's the vertical application example of taking AI and making it immediately usable.
Or, of course, there's tons of picks-and-shovels businesses to be built, like Roboflow, where you're enabling these enterprises to take advantage of something that they have, whether that's their data sets, their compute, or their intellect. >> Okay, so if I hear that right, by the way, I love that: that's horizontally scalable, that's the large language models; go up the stack and build the apps, hence your developer focus. I'm sure that's probably the reason for the tsunami of developer action. So you're saying picks-and-shovels tools; don't try to replicate the platform of what could be the platform. Oh, go to a VC, I'm going to build a platform. No, no, no, no, those are going to get wiped away by the large language models. Is there one large language model that will rule the world, or do you see many coming? >> Yeah, so to be clear, I think there will be useful platforms. I just think a lot of people think they're building, let's say, if we put this in the cloud context, a specific type of EC2 instance. Well, it turns out that Amazon can offer that type of EC2 instance and immediately distribute it to all of their customers. So you don't want to be in the position of providing something that actually ends up looking like a feature, which in the context of AI might be a small incremental improvement on a model. If that's all you're doing, you're a sandcastle business. Now there's a lot of platform businesses that need to be built that enable businesses to get to value and do things like: how do I monitor my models? How do I create better models with my given data sets? How do I ensure that my models are doing what I want them to do? How do I find the right models to use? All these sorts of platform-wide problems certainly exist for businesses. I just think a lot of startups that I'm seeing right now are making the mistake of assuming the advances we're seeing are not going to accelerate or even get better.
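Joseph's list of platform-wide problems ("how do I monitor my models?", "how do I ensure my models are doing what I want?") can be made concrete with a basic drift check: compare the distribution of a model's live predictions against a validation-time baseline. Below is a minimal sketch using the population stability index; the label buckets, counts, and the ~0.2 rule of thumb are illustrative assumptions, not any vendor's actual method:

```python
import math
from collections import Counter

def population_stability_index(expected_counts, observed_counts):
    """PSI between a baseline and a live distribution of model outputs.

    Rule of thumb (an assumption, tune for your use case): values above
    roughly 0.2 suggest the live traffic has drifted and merits review.
    """
    total_e = sum(expected_counts.values())
    total_o = sum(observed_counts.values())
    psi = 0.0
    for bucket in set(expected_counts) | set(observed_counts):
        # Floor each share at a tiny epsilon so empty buckets don't blow up.
        e = max(expected_counts.get(bucket, 0) / total_e, 1e-6)
        o = max(observed_counts.get(bucket, 0) / total_o, 1e-6)
        psi += (o - e) * math.log(o / e)
    return psi

baseline = Counter({"positive": 700, "negative": 300})  # validation-time outputs
live = Counter({"positive": 400, "negative": 600})      # this week's production outputs
print(round(population_stability_index(baseline, live), 3))  # well above 0.2
```

The same shape of check works for any bucketed signal (predicted classes, prompt lengths, confidence bands), which is why monitoring shows up as a platform problem rather than a one-off script.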
>> So if I'm a customer, if I'm a company, say I'm a startup or an enterprise, either one, same question. And I want to stand up, and I have developers working on stuff, I want to start standing up an environment to start doing stuff. Is that a service provider? Is that a managed service? Is that you guys? So how do you guys fit in when your customers lean in? Is it just for developers? Are you targeting them with a specific managed service? What's the product consumption? How do you talk to customers when they come to you? >> The thing that we do is enable, we give developers superpowers to build automated inventory tracking, self-checkout systems, identify if this image is malignant cancer or benign cancer, ensure that these products that I've produced are correct. Make sure that the defect that might exist on this electric vehicle makes its way back for review. All these sorts of problems are immediately able to be solved and tackled. In terms of the managed services element, we have solutions integrators that will often build on top of our tools, or we'll have companies that look to us for guidance, but ultimately the company is in control of developing and building and creating these capabilities in house. I really think the distinction is maybe less around managed service versus tool, and more around ownership in the era of AI. So for example, if I'm using a managed service where part of the benefit is that they're learning across their customer set, that's a very different relationship than using a managed service where I'm developing some amount of proprietary advantage from my data sets. And I think that's a really important thing that companies are becoming attuned to: the value of the data that they have. And so that's what we do.
We tell companies: you have this proprietary, immense treasure trove of data, use that to your advantage, and think about us more like a set of tools that enable you to get value from that capability. You know, the HashiCorps and GitLabs of the world have proven what these businesses look like at scale. >> And you're targeting developers. When you go into a company, do you target developers with freemium? Is there a paid service? Talk about the business model real quick. >> Sure, yeah. The tools are free to use and get started with. When someone signs up for Roboflow, they may elect to make their work open source, in which case we're able to provide even more generous usage limits, to basically move the computer vision community forward. If you elect to make your data private, you can use our hosted dataset management, dataset training, model deployment, and annotation tooling up to some limits. And then usually, when someone validates that what they're doing gets them value, they purchase a subscription license to be able to scale up those capabilities. So like most developer-centric products, it's free to get started, free to prove, free to poke around and develop what you think is possible. And then once you're getting to value, we're able to capture the commercial upside in the value that's being provided. >> Love the business model. It's right in line with where the market is. There's kind of no standards bodies these days. The developers are the ones who are deciding what the standards are by their adoption. I think by making it easy for developers to get value, as open source models continue to grow, you can see more of that. Great perspective, Joseph, thanks for sharing that. Put a plug in for the company. What are you guys doing right now? Where are you in your growth? What are you looking for? How should people engage? Give the quick commercial for the company.
>> So as I mentioned, Roboflow is, I think, one of the largest, if not the largest, collections of computer vision models and data sets that are open source, available on the web today, and we have a private set of tools that over half the Fortune 100 now rely on. So we're at the stage now where we know people want what we're working on, and we're continuing to drive that type of adoption. So companies that are looking to make better models, improve their data sets, and train and deploy will often get a lot of value from our tools, and should certainly reach out to talk. I'm sure there's a lot of talented engineers tuning in too; we're aggressively hiring. So if you are interested in being a part of making the world programmable, and being at the ground floor of the company that's creating these capabilities writ large, we'd love to hear from you. >> Amazing, Joseph, thanks so much for coming on and being part of the AWS Startup Showcase. Man, if I was in my twenties, I'd be knocking on your door, because it's the hottest trend right now, it's super exciting. Generative AI is just the beginning of a massive sea change. Congratulations on all your success, and we'll be following you guys. Thanks for spending the time, really appreciate it. >> Thanks for having me. >> Okay, this is season three, episode one of the ongoing series covering the exciting startups from the AWS ecosystem, talking about the hottest things in tech. I'm John Furrier, your host. Thanks for watching. (chill electronic music)
Adam Wenchel & John Dickerson, Arthur | AWS Startup Showcase S3 E1
(upbeat music) >> Welcome everyone to theCUBE's presentation of the AWS Startup Showcase, AI Machine Learning Top Startups Building Generative AI on AWS. This is season 3, episode 1 of the ongoing series covering the exciting startups from the AWS ecosystem, here to talk about AI and machine learning. I'm your host, John Furrier. I'm joined by two great guests: Adam Wenchel, who's the CEO of Arthur, and the Chief Scientist of Arthur, John Dickerson. We'll talk about how they help people build better LLM AI systems and get them into the market faster. Gentlemen, thank you for coming on. >> Yeah, thanks for having us, John. >> Well, I got to say, I have to temper my enthusiasm, because the last few months' explosion of interest in LLMs, with ChatGPT, has opened everybody's eyes to the reality that this is going next gen. This is it, this is the moment, this is the point we're going to look back on and say, this is the time where AI really hit the scene for real applications. So, a lot of large language models, also known as LLMs, foundational models, and generative AI, it's all booming. This is where all the alpha developers are going. This is where everyone's focusing their business model transformations. This is where developers are seeing action. So it's all happening, the wave is here. So I got to ask you guys, what are you seeing right now? You're in the middle of it, it's hitting you guys right on. You're on the front end of this massive wave. >> Yeah, John, I don't think you have to temper your enthusiasm at all. I mean, what we're seeing every single day is everything from existing enterprise customers coming in with new ways that they're rethinking, like, business things that they've been doing for many years that they can now do an entirely different way, as well as all manner of new companies popping up, applying LLMs to everything from generating code and SQL statements to generating health transcripts and legal briefs. Everything you can imagine.
And when you actually sit down and look at these systems and the demos we get of them, the hype is definitely justified. It's pretty amazing what they're going to do. And even just internally, about a month ago in January, we built an Arthur chatbot so customers could ask questions, technical questions, so rather than read our product documentation, they could just ask this LLM a particular question and get an answer. And at the time it was state of the art, but then just last week we decided to rebuild it, because the tooling has changed so much. It's now way better, built on an entirely different stack. The tooling has undergone a full generation's worth of change in six weeks, which is crazy. So it just tells you how much energy is going into this and how fast it's evolving right now. >> John, weigh in as a chief scientist. I mean, you must be blown away. Talk about a kid in a candy store. You must be super busy to begin with, but the change, the acceleration, can you scope the kind of change you're seeing and be specific around the areas where you're seeing movement and highly accelerated change? >> Yeah, definitely. And it is very, very exciting actually. Thinking back to when ChatGPT was announced, that was the night our company was throwing an event at NeurIPS, which is maybe the biggest machine learning conference out there. And the hype when that happened was palpable, and it was just shocking to see how well it performed. And then obviously over the last few months since then, as LLMs have continued to enter the market, we've seen use cases for them, like Adam mentioned, all over the place. And so, some things I'm excited about in this space are the use of LLMs and, more generally, foundation models to redesign traditional operations research style problems, logistics problems, like auctions, decisioning problems.
So moving beyond the already amazing use cases, like creating marketing content, into more core integration with a lot of the bread-and-butter companies and tasks that drive the American ecosystem. And I think we're just starting to see some of that. And in the next 12 months, I think we're going to see a lot more. If I had to make other predictions, I think we're going to continue seeing a lot of work being done on managing inference-time costs via shrinking models or distillation. And I don't know how to time this prediction, but at some point we're going to be seeing lots of these very large scale models operating on the edge as well. So the time scales are extremely compressed. Like Adam mentioned, 12 months from now? Hard to say. >> We were talking on theCUBE prior to this session here. We had theCUBE conversation here, and then the Wall Street Journal just picked up on the same theme, which is that the printing press moment created the enlightenment stage of history. Here we're in a whole nother phase: automating intellect, efficiency, doing heavy lifting, the creative class coming back, a whole nother level of reality around the corner that's being hyped up. The question is, is this justified? Is there really a breakthrough here, or is this just another result of continued progress with AI? Can you guys weigh in, because there's two schools of thought. There's the, "Oh my God, we're entering a new enlightenment tech phase, the equivalent of the printing press, in all areas." Then there's, "Ah, it's just AI (indistinct) inch by inch." What's your guys' opinion? >> Yeah, I think on the one hand, when you're down in the weeds of building AI systems all day, every day, like we are, it's easy to look at this as incremental progress. Like, we have customers who've been building on foundation models since we started the company four years ago, particularly in computer vision for classification tasks, starting with pre-trained models, things like that.
So that part of it doesn't feel new, but what does feel new is when you apply these things to language, with all the breakthroughs in computational efficiency, algorithmic improvements, things like that. When you actually sit down and interact with ChatGPT, or one of the other systems out there that's building on top of LLMs, it really is breathtaking: the level of understanding that they have, and how quickly you can accelerate your development efforts and get an actual working system in place that solves a really important real-world problem and makes people way faster, way more efficient. So I do think there's definitely something there. It's more than just incremental improvement. This feels like a real inflection point in the trajectory of AI adoption. >> John, what's your take on this? As people come into the field, I'm seeing a lot of people move from, hey, I've been coding in Python, I've been doing some development, I've been a software engineer, I'm a computer science student, I'm coding in C++ old school, an OG systems person. Where do they come in? Where's the focus, where's the action? Where are the breakthroughs? Where are people jumping in and rolling up their sleeves and getting dirty with this stuff? >> Yeah, all over the place. And it's funny you mentioned students. In a different life, I wore a university professor hat, and so I'm very, very familiar with the teaching aspects of this. And I will say, toward Adam's point, this really is a leap forward, in that techniques like Copilot, for example, which everybody's using right now, really do accelerate the way that we develop. When I think about the areas where people are really focusing right now, tooling is certainly one of them. Like you and I were chatting about LangChain right before this interview started: two or three people can sit down and create an amazing set of pipes that connect different aspects of the LLM ecosystem.
Two, I would say, is in engineering. So distributed training might be one, or just understanding better ways to train large models, understanding better ways to then distill them or run them. So there's this heavy interaction now between engineering and what I might call traditional machine learning from 10 years ago, where you had to know a lot of math, you had to know calculus very well, things like that. Now you also need to be, again, a very strong engineer, which is exciting. >> I interviewed Swami, who heads Amazon's machine learning and AI, when they made the Hugging Face announcement. And I reminded him how Amazon was easy to get into if you were developing a startup back in 2007, 2008, and that the language models had a similar problem: it took a lot of setup and a lot of expense to get provisioned up, and now it's easy. So this is the next wave of innovation. So how do you guys see that from where we are right now? Are we at that point, that moment where it's a cloud-like experience for LLMs and large language models? >> Yeah, go ahead John. >> I think the answer is yes. We see a number of large companies that are training these and serving these, some of which are being co-interviewed in this episode. I think we're at that point. Like, you can hit one of these with a simple, single line of Python, hitting an API. You can boot this up in seconds if you want. It's easy. >> Got it. >> So I (audio cuts out). >> Well, let's take a step back and talk about the company. You guys are being featured here on the Showcase. Arthur, what drove you to start the company? How did this all come together? What's the origination story? Obviously you've got big customers. How did you get started? What are you guys doing? How do you make money? Give a quick overview. >> Yeah, I think John and I come at it from slightly different angles, but for myself, I have been a part of a number of technology companies.
I joined Capital One when they acquired my last company, and shortly after I joined, they asked me to start their AI team. And so even though I had been doing AI for a long time, I started my career back at DARPA, it was the first time I was really working at scale in AI at an organization where there were hundreds of millions of dollars in revenue at stake with the operation of these models, and where they were impacting millions of people's financial livelihoods. And so it got me hyper-focused on these issues around making sure that your AI worked well, and that it worked well for your company and for the people who were being affected by it. At the time, when I was doing this in 2016, 2017, 2018, there just wasn't any tooling out there to support this production management, model monitoring phase of the life cycle. And so we basically left to start the company that I wanted. And John has his own story. I'll let you share that one, John. >> Go ahead John, you're up. >> Yeah, so I'm coming at this from a different world. I'm on leave now from a tenured role in academia, where I was leading a large lab focusing on the intersection of machine learning and economics. And so questions like fairness, or the response to the dynamism of the underlying environment, have been around for quite a long time in that space. And so I've been thinking very deeply about some of those more R&D-style questions, as well as having deployed some automation code across a couple of different industries, some in online advertising, some in the healthcare space, and so on, where concerns of, again, fairness come to bear. And so Adam and I connected to understand the space of what that might look like in the 2018-2019 realm, from a quantitative and from a human-centered point of view. And so we booted things up from there. >> Yeah, bring that applied engineering R&D into the Capital One DNA that he had at scale. I can see that fit.
I got to ask you now, next step, as you guys move out and think about LLMs and the recent AI news around the generative models and the foundational models like ChatGPT: how should we be looking at that news? Everyone watching might be thinking the same thing. I know at the board level companies are like, we should refactor our business, this is the future. It's that kind of moment, and the tech team's like, okay boss, how do we do this again? Or are they prepared? How should we be thinking? How should people watching be thinking about LLMs? >> Yeah, I think they really are transformative. And so, I mean, we're seeing companies all over the place, everything from large tech companies to a lot of our large enterprise customers, launching significant projects at core parts of their business. And so, yeah, if you're serious about becoming an AI-native company, which most leading companies are, then this is a trend you need to be taking seriously. And we're seeing the adoption rate. It's funny, I would say AI adoption in the broader business world really started, let's call it four or five years ago, and it was a relatively slow adoption rate, but I think all that investment in scaling the maturity curve has paid off, because the rate at which people are adopting and deploying systems based on this is tremendous. I mean, this has all just happened in a few months, and we're already seeing people get systems into production. Now, there's a lot of things you have to guarantee in order to put these into production in a way that adds value to your business and doesn't cause more headaches than it solves. And that's where we help customers: how do you put these out there in a way that they're going to represent your company well, they're going to perform well, they're going to do their job and do it properly? >> So in the use case, as a customer, as I think about this, there's workflows.
They might have had an ML/AI ops team that's around IT. Their inference engines are out there. They probably don't have visibility on, say, how much it costs; they're kicking the tires. When you look at the deployment, there's a cost piece, there's a workflow piece, there's the fairness you mentioned, John. What should I be thinking about if I'm going to be deploying stuff into production? I've got to think about those things. What's your opinion? >> Yeah, I'm happy to dive in on that one. So monitoring in general is extremely important once you have one of these LLMs in production, and there have been some changes versus traditional monitoring, which LLMs have really accelerated, that we can dive deeper into. But a lot of the bread-and-butter things you should be looking out for remain just as important as they are for what you might call traditional machine learning models. So the underlying environment of data streams, the way users interact with these models, these are all changing over time. And so any performance metrics that you care about need to be tracked: traditional ones like accuracy, if you can define that for an LLM; ones around, for example, fairness or bias, if that is a concern for your particular use case; and so on. Now, there are some interesting changes that LLMs are bringing along as well. Most ML models in production that we see are relatively static, in the sense that they're not getting swapped out more than maybe once a day or once a week, or they're just set once and then never changed again. With LLMs, there's this ongoing value alignment, or collection of preferences from users, that is often constantly updating the model. And so that opens up all sorts of vectors, I won't say for attack, but for problems to arise in production. Users might learn to use your system in a different way, and thus change the way those preferences are getting collected, and thus change your system in ways that you never intended.
So maybe that went through governance already, internally at the company, and now it's totally changed, and it's through no fault of your own, but you need to be watching over that for sure. >> Talk about reinforcement learning from human feedback. How's that factoring in to the LLMs? Is that part of it? Should people be thinking about that? Is that a component that's important? >> It certainly is, yeah. So this is one of the big tweaks that happened with InstructGPT, which is the basis model behind ChatGPT, and it has since gone on to be used all over the place. So value alignment through RLHF, like you mentioned, I think is a very interesting space to get into, and it's one that you need to watch over. You're asking humans for feedback on outputs from a model, and then you're updating the model with respect to that human feedback. And now you've thrown humans into the loop in a way that is just going to complicate things. And it certainly helps in many ways. Let's say that you're deploying an internal chatbot at an enterprise: you could ask humans to align the LLM behind the chatbot to, say, company values. So you're collecting feedback about these company values, and that's going to scoot that chatbot you're running internally more toward the kind of language that you'd like to use internally, on a Slack channel or something like that. Watching over that model, I think in that specific case, is a compliance and HR issue as well. So while it is part of the greater LLM stack, you can also view it as an independent bit to watch over. >> Got it, and these are important factors. When people see the Bing news, they freak out: it starts out doing great, then it goes off the rails. It goes big, fails big. (laughing) So people see that with these models. Is that human interaction, is that feedback, is that the model not accepting it? How do people understand how to take that input in, and how to build the right apps around LLMs?
This is a tough question. >> Yeah, for sure. So some of the examples you'll see online, where these chatbots go off the rails, are obviously humans trying to break the system, but some of them clearly aren't. And that's because these are large statistical models, and we don't know what's going to pop out of them all the time. And even if you're doing as much in-house testing as the big companies, like the Coheres and the OpenAIs of the world, to try to prevent things like toxicity or racism or other sorts of bad content that might lead to bad PR, you're never going to catch all of the possible holes in the model itself. And so, again, it's very, very important to keep watching over it while it's in production. >> On the business model side, how are you guys doing? What's the approach? How do you guys engage with customers? Take a minute to explain the customer engagement. What do they need? What do you need? How does that work? >> Yeah, I can talk a little bit about that. So it's really easy to get started. It's literally a matter of just handing out an API key, and people can get started. We also offer versions that can be installed on-prem, because we find a lot of our customers have models that deal with very sensitive data. So you can run it in your cloud account or use our cloud version. And so, yeah, it's pretty easy to get started with this stuff. We find people start using it a lot of times during the validation phase, because that way they can start baselining the performance of models, they can do champion/challenger, maybe they're comparing different foundation models they're considering. And so it's a really helpful tool for understanding differences in the way these models perform.
And then from there they can just flow that into their production inferencing, so that as these systems are out there, you have real-time monitoring for anomalies and for all sorts of weird behaviors, as well as that continuous feedback loop that helps you make your product better, and observability, and you can run all sorts of aggregated reports to really understand what's going on with these models when they're out there deciding. I should also add that, just today, we have another way to adopt Arthur: we are in the AWS Marketplace, so we are available there, just to make it that much easier to use your cloud credits, skip the procurement process, and get up and running really quickly. >> And that's great, 'cause Amazon's got SageMaker, which handles a lot of privacy stuff, all kinds of cool things, or you can get down and dirty. So I got to ask on the next one: production is a big deal, getting stuff into production. What have you guys learned that you could share with folks watching? Is there a cost issue? I've got to monitor, obviously, you brought that up; we talked about the reinforcement issues; all these things are happening. What are the big learnings you could share for people that are going to put these into production, to watch out for, to plan for, or be prepared for? Hope for the best, plan for the worst. What's your advice? >> I can give a couple opinions there, and I'm sure Adam has some too. Well, yeah, the big one from my side, again, I mentioned this earlier, is just the input data streams, because humans are also exploring how they can use these systems to begin with. It's really, really hard to predict the type of inputs you're going to be seeing in production.
Especially, we always talk about chatbots, but then any generative text task like this, let's say you're taking in news articles and summarizing them or something like that, it's very hard to get a good sampling even of the set of news articles in such a way that you can really predict what's going to pop out of that model. So to me, adversarial maybe isn't the word that I would use, but it's an unnatural, shifting input distribution of prompts that you might see for these models. That's certainly one. And then the second one that I would talk about is, it can be hard to understand the costs, the inference-time costs, behind these LLMs. So the pricing on these is always changing as the models change size, it might go up, it might go down based on model size, based on energy cost and so on, but you're pricing per token, or per thousand tokens, and that I think can be difficult for some clients to wrap their head around. Again, you don't know how these systems are going to be used, after all, so it can be tough. And so again, that's another metric that really should be tracked. >> Yeah, and there's a lot of trade-off choices in there, like how many tokens do you want at each step in the sequence, and based on, you have (indistinct) and you reject these tokens, and so based on how your system's operating, that can make the cost highly variable. And that's if you're using an API version where you're paying per token. A lot of people also choose to run these internally, and as John mentioned, the inference time on these is significantly higher than a traditional classifier, even an NLP classification model or tabular data model, like orders of magnitude higher. And so you really need to understand how that, as you're constantly iterating on these models and putting out new versions and new features, how that's affecting the overall scale of that inference cost, because you can use a lot of computing power very quickly with these models.
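The per-token pricing point can be made concrete with a back-of-envelope sketch. The rates and volumes below are hypothetical placeholders; as noted above, real per-token prices change with model size and energy costs.

```python
def llm_cost(prompt_tokens, completion_tokens,
             price_per_1k_prompt, price_per_1k_completion):
    """Estimate the cost of one LLM call priced per 1,000 tokens."""
    return (prompt_tokens / 1000) * price_per_1k_prompt \
         + (completion_tokens / 1000) * price_per_1k_completion

# Hypothetical rates -- real per-token prices change as models change size.
calls_per_day = 50_000
per_call = llm_cost(prompt_tokens=400, completion_tokens=300,
                    price_per_1k_prompt=0.003, price_per_1k_completion=0.006)
print(f"${per_call:.4f} per call, ${per_call * calls_per_day:,.2f} per day")
# → $0.0030 per call, $150.00 per day
```

Tracking this alongside prompt and completion lengths is exactly the kind of metric the speakers suggest watching, since variable-length outputs make the daily total swing widely.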
>> Yeah, scale, performance, price all come together. I got to ask while we're here on the secret sauce of the company, if you had to describe to people out there watching, what's the secret sauce of the company? What's the key to your success? >> Yeah, so John leads our research team, and they've had a number of really cool, I think AI, as much as it's been hyped for a while, commercial AI at least is really still in its infancy. And so the way we're able to pioneer new ways to think about performance for computer vision, NLP, and LLMs is probably the thing that I'm proudest about. John and his team publish papers all the time at NeurIPS and other places. But I think it's really being able to define what performance means for basically any kind of model type, and give people really powerful tools to understand that on an ongoing basis. >> John, secret sauce, how would you describe it? You got all the action happening all around you. >> Yeah, well, I do appreciate Adam talking me up like that. No, I. (all laughing) >> Furrier: Props to you. >> I would also say a couple of other things here. So we have a very strong engineering team, and I think some early hires there really set the standard at a very high bar that we've maintained as we've grown. And I think that's really paid dividends as scalability has become even more of a challenge in these spaces, right? And so that's not just scalability when it comes to LLMs, that's scalability when it comes to millions of inferences per day, that kind of thing, in traditional ML models as well. And I think, compared to potential competitors, that's really... Well, it's made us able to just operate more efficiently and pass that along to the client. >> Yeah, and I think the infancy comment is really important because it's the beginning. There really is a long journey ahead. A lot of change coming, like I said, it's a huge wave.
So I'm sure you guys got a lot of planning at the foundation, even for your own company, so I appreciate the candid response there. Final question for you guys is, what should the top things be for a company in 2023? If I'm going to set the agenda and I'm a customer moving forward, putting the pedal to the metal, so to speak, what are the top things I should be prioritizing, or that I need to do, to be successful with AI in 2023? >> Yeah, I think, so number one, as we've been talking about this entire episode, things are changing so quickly, and the opportunities for business transformation and really disrupting different applications, different use cases, is almost, I don't think we've even fully comprehended how big it is. And so really digging into your business and understanding where you can apply these new sets of foundation models, that's a top priority. The interesting thing is I think there's another force at play, which is the macroeconomic conditions, and a lot of places are having to work harder to justify budgets. So in the past, a couple years ago maybe, they had a blank check to spend on AI and AI development at a lot of large enterprises, limited primarily by the amount of talent they could scoop up. Nowadays these expenditures are getting scrutinized more. And so one of the things that we really help our customers with is calculating the ROI on these things. And so if you have models out there performing and you have a new version that you can put out that lifts the performance by 3%, how many tens of millions of dollars does that mean in business benefit? Or if I want to go get approval from the CFO to spend a few million dollars on this new project, how can I bake in from the beginning the tools to really show the ROI along the way? Because I think in these systems, when done well, for a software project the ROI can be pretty spectacular.
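A rough sketch of that ROI math, with hypothetical numbers: the 3% lift comes from the example above, while the revenue and cost figures are made up purely for illustration.

```python
def roi(annual_benefit, annual_cost):
    """Return ROI as a fraction: (benefit - cost) / cost."""
    return (annual_benefit - annual_cost) / annual_cost

# Hypothetical figures: a 3% performance lift applied to the revenue
# stream the model influences, against the project's annual spend.
influenced_revenue = 500_000_000        # $500M flows through the model
benefit = influenced_revenue * 0.03     # the 3% lift -> $15M
cost = 3_000_000                        # $3M project spend
print(f"ROI: {roi(benefit, cost):.0%}")  # → ROI: 400%
```

The hard part in practice is the first input, estimating how much business value the model actually influences, which is exactly what the monitoring and reporting tools are meant to surface.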
Like we see over a hundred percent ROI in the first year on some of these projects. And so, I think in 2023, you just need to be able to show what you're getting for that spend. >> It's a needle-moving moment. You see it all the time with some of these aha moments, or like, whoa, blown away. John, I want to get your thoughts on this, because one of the things that comes up a lot for companies that I talk to, that are, I would say, in the second wave coming in, maybe not the front wave of adopters, is talent and team building. You mentioned some of the hires you got were game-changing for you guys and set the bar high. As you move the needle, new developers are going to need to come in. What's your advice, given that you've been a professor, you've seen students? I know a lot of computer science people want to shift; they might not be yet skilled in AI, but they're proficient in programming, and that's going to be another opportunity with open source and everything that's happening. How do you talk to that next level of talent that wants to come into this market to supplement teams, be on teams, lead teams? Any advice you have for people who want to build their teams, and people who are out there and want to be a coder in AI? >> Yeah, I have advice, and this actually works for what it would take to be a successful AI company in 2023 as well, which is, just don't be afraid to iterate really quickly with these tools. The space is still being explored in terms of what they can be used for. A lot of the tasks that they're used for now, right, like creating marketing content using machine learning, that's not a new thing to do. It just works really well now. And so I'm excited to see what the next year brings in terms of folks from outside of core computer science, other engineers or physicists or chemists or whatever, who are learning how to use these increasingly easy-to-use tools to leverage LLMs for tasks that I think none of us have really thought about before.
So that's really, really exciting. And so toward that, I would say iterate quickly. Build things on your own, build demos, show them to friends, host them online, and you'll learn along the way, you'll have something to show for it, and you'll also help us explore that space. >> Guys, congratulations with Arthur. Great company, great picks-and-shovels opportunities out there for everybody. Iterate fast, get in quickly, and don't be afraid to iterate. Great advice, and thank you for coming on and being part of the AWS showcase. Thanks. >> Yeah, thanks for having us on, John. Always a pleasure. >> Yeah, great stuff. Adam Wenchel, John Dickerson with Arthur. Thanks for coming on theCUBE. I'm John Furrier, your host. Generative AI and AWS. Keep it right there for more action with theCUBE. Thanks for watching. (upbeat music)
Jay Marshall, Neural Magic | AWS Startup Showcase S3E1
(upbeat music) >> Hello, everyone, and welcome to theCUBE's presentation of the "AWS Startup Showcase." This is season three, episode one. The focus of this episode is AI/ML: Top Startups Building Foundational Models, Infrastructure, and AI. These are great topics, super-relevant, and it's part of our ongoing coverage of startups in the AWS ecosystem. I'm your host, John Furrier, with theCUBE. Today, we're excited to be joined by Jay Marshall, VP of Business Development at Neural Magic. Jay, thanks for coming on theCUBE. >> Hey, John, thanks so much. Thanks for having us. >> We had a great CUBE conversation with you guys. This is very much about the company focus. It's a feature presentation for the "Startup Showcase," and machine learning at scale is the topic, but in general, it's more, (laughs) and we should call it "Machine Learning and AI: How to Get Started," because everybody is retooling their business. Companies that aren't retooling their business right now with AI first will be out of business, in my opinion. You're seeing a massive shift. This is truly the beginning of the next-gen machine learning and AI trend. You're really seeing it with ChatGPT. Everyone sees that. That went mainstream. But this is just the beginning. This is scratching the surface of this next-generation AI, with machine learning powering it, and with all the goodness of cloud, cloud scale, and how horizontally scalable it is. The resources are there. You got the edge. Everything's perfect for AI, 'cause data infrastructure's exploding in value. AI is just the applications. This is a super topic, so what do you guys see in this general area of opportunities right now in the headlines? And I'm sure you guys' phone must be ringing off the hook, metaphorically speaking, or emails and meetings and Zooms. What's going on over there at Neural Magic? >> No, absolutely, and you pretty much nailed most of it. I think that, you know, with my background, we've seen this for the last 20-plus years.
Even just getting enterprise applications built and delivered at scale, obviously, amazing things with AWS and the cloud to help accelerate that. And we just kind of figured out in the last five or so years how to do that productively and efficiently, kind of from an operations perspective. Got development and operations teams. We even came up with DevOps, right? But now, we kind of have this new persona and new workload that developers have to talk to, and then it has to be deployed on those ITOps solutions. And so you pretty much nailed it. Folks are saying, "Well, how do I do this?" These big, generational models, or foundational models, as we're calling them, they're great, but enterprises want to do that with their data, on their infrastructure, at scale, at the edge. So for us, yeah, we're helping enterprises accelerate that through optimizing models and then delivering them at scale in a more cost-effective fashion. >> Yeah, and I think one of the benefits we saw with OpenAI is that, not only is it open source, and you've also got other models that are more proprietary, it shows the world that this is really happening, right? It's a whole nother level, and there are also new landscape kind of maps coming out. You got the generative AI, and you got the foundational models, large LLMs. Where do you guys fit into the landscape? Because you guys are in the middle of this. How do you talk to customers when they say, "I'm going down this road. I need help. I'm going to stand this up." This new AI infrastructure and applications, where do you guys fit in the landscape? >> Right, and really, the answer is both. I think today, when it comes to a lot of what for some folks would still be considered kind of cutting edge around computer vision and natural language processing, a lot of our optimization tools and our runtime are based around most of the common computer vision and natural language processing models.
So your YOLOs, your BERTs, you know, your DistilBERTs and what have you, so we work to help optimize those for folks, again, who've gotten great performance and great value trying to get those into production. But when you get into the LLMs, and you mentioned some of the open source components there, our research teams have kind of been right in the trenches with those. So with kind of the GPT open source equivalent being OPT, we're able to actually take, you know, a multi-hundred-billion-parameter model and sparsify that, or optimize that down, shaving away a ton of parameters, and being able to run it on smaller infrastructure. So I think the evolution here, you know, all this stuff came out in the last six months in terms of being turned loose into the wild, but we're staying in the trenches with folks so that we can help optimize those as well, and not require, again, the heavy compute, the heavy cost, the heavy power consumption as those models evolve as well. So we're staying right in with everybody while they're being built, but trying to get folks into production today with things that help with business value today. >> Jay, I really appreciate you coming on theCUBE, and before we came on camera, you said you just were on a customer call. I know you got a lot of activity. What specific things are you helping enterprises solve? What kind of problems? Take us through the spectrum from the beginning: people jumping in the deep end of the pool, some people kind of coming in, starting out slow. What are the scales? Can you scope the kind of use cases and problems that are emerging that people are calling you for? >> Absolutely, so I think if I break it down to kind of, like, your startups, or maybe I'll call 'em AI native to kind of steal from cloud native years ago, that group, it's pretty much, you know, part and parcel for how that group already runs.
So if you have a data science team and an ML engineering team, you're building models, you're training models, you're deploying models. You're seeing firsthand the expense of starting to try to do that at scale. So it's really just a pure operational efficiency play. They kind of speak natively to our tools, which we're doing in the open source. So it's really helping, again, with the optimization of the models they've built, and then, again, giving them an alternative to the expensive proprietary hardware accelerators they'd otherwise have to run them on. Now, on the enterprise side, it varies, right? You have some kind of AI native folks there that already have these teams, but you also have kind of, like, the AI curious, right? Like, they want to do it, but they don't really know where to start, and so for them, we actually have an open source toolkit that can help you get into this optimization, and then again, that inferencing runtime, purpose-built for CPUs. It allows you to not have to worry, again, about whether you have a hardware accelerator available, or how to integrate that into your application stack. If I don't already know how to build this into my infrastructure, do my ITOps teams know how to do this, and what does that runway look like? How do I cost for this? How do I plan for this? When it's just x86 compute, we've been doing that for a while, right? So it obviously still requires more, but at least it's a little bit more predictable. >> It's funny you mentioned AI native. You know, born in the cloud was a phrase that was out there. Now, you have startups that are born-in-AI companies. So I think you have this kind of cloud kind of vibe going on. You had lift and shift, which was a big discussion. Then you had cloud native, kind of in the cloud, kind of making it all work. Is there an existing set of things people will throw on this hat, and then what's the difference between AI native and kind of providing it to existing stuff?
'Cause a lot of people take some of these tools and apply them to existing stuff, and it's not really a lift and shift, but it's kind of like bolting AI onto something else, versus starting with AI first, or native AI. >> Absolutely. It's a- >> How would you- >> It's a great question. I think where I'd probably pull back to is kind of retail-type scenarios where, you know, for five, seven, nine years or more even, a lot of these folks already have data science teams, you know? I mean, they've been doing this for quite some time. The difference is the introduction of these neural networks and deep learning, right? Those kinds of models are just a little bit of a paradigm shift. So, you know, I obviously was trying to be fun with the term AI native, but I think it's more folks that kind of came up in that neural network world, so it's a little bit more second nature, whereas I think for maybe some traditional data scientists starting to get into neural networks, you have the complexity there and the training overhead, and a lot of the aspects of getting a model finely tuned, and hyperparameterization, and all of these aspects of it. It just adds a layer of complexity that they're just not as used to dealing with. And so our goal is to help make that easy, and then of course, make it easier to run anywhere that you have just kind of standard infrastructure. >> Well, the other point I'd bring out, and I'd love to get your reaction to, is not only is that a neural network team, people who have been focused on that, but also, if you look at some of the DataOps lately, the AIOps markets, a lot of data engineering, a lot of scale, folks who have been kind of, like, in that data tsunami cloud world are seeing, they've kind of been in this, right? They've been experiencing that. >> No doubt. I think it's funny, the data lake concept, right? And you got data oceans now.
Like, the metaphors just keep growing on us, but where it is valuable in terms of trying to shift the mindset, I've always kind of been a fan of some of the naming shift. I know with AWS, they always talk about purpose-built databases. And I always liked that because, you know, you don't have one database that can do everything. Even ones that say they can, you still have implementation detail differences. So sitting back and saying, "What is my use case, and then which database will I use for it?" I think it's kind of similar here. And when you're building those data teams, if you don't have folks that are doing data engineering, kind of that data harvesting and pre-processing, you got to do all that before a model's even going to care about it. So yeah, it's definitely a central piece of this as well, and again, whether or not you're going to be AI native, as you're making your way on that journey, you know, data's definitely a huge component of it. >> Yeah, you would have loved our Supercloud event we had. Talk about naming, you know, data meshes were talked about a lot. You're starting to see the control plane layers of data. I think that was the beginning of what I saw as that data infrastructure shift, to be horizontally scalable. So I have to ask you, with Neural Magic, when your customers and the people that are prospects for you guys, they're probably asking a lot of questions, because I think the general thing that we see is, "How do I get started? Which GPU do I use?" I mean, there's a lot of things that are kind of, I won't say technical or targeted towards people who are living in that world, but, like, as the mainstream enterprises come in, they're going to need a playbook. What do you guys see, what do you guys offer your clients when they come in, and what do you recommend? >> Absolutely, and I think where we hook in specifically tends to be on the training side. So again, I've built a model.
Now, I want to really optimize that model. And then on the runtime side, when you want to deploy it, you know, we run that optimized model. And so that's where we're able to provide value. We even have a labs offering in terms of being able to pair up our engineering teams with a customer's engineering teams, and we can actually help with most of that pipeline. So even if it is something where you have a dataset and you want some help in picking a model, you want some help training it, you want some help deploying it, we can actually help there as well. You know, there's also a great partner ecosystem out there, like a lot of folks even in the "Startup Showcase" here, that extend beyond into kind of your earlier comment around data engineering or downstream ITOps or the all-up MLOps umbrella. So we can absolutely engage with our labs, and then, of course, you know, again, partners, which are always kind of key to this. So you are spot on. I think what's happened with this, they talk about a hockey stick. This is almost like a flat wall now, with the rate of innovation right now in this space. And so we do have a lot of folks wanting to go straight from curious to native. And so that's definitely where the partner ecosystem comes in so hard, 'cause there just isn't anybody or any teams out there that literally go from "Here's my blank database, and I want an API that does all the stuff," right? Like, that's a big chunk, but we can definitely help with the model-to-delivery piece. >> Well, you guys are obviously a featured company in this space. Talk about the expertise. A lot of companies are, I won't say faking it till they make it. You can't really fake security. You can't really fake AI, right? So there's going to be a learning curve. There'll be a few startups who'll come out of the gate early. You guys are one of 'em. Talk about what you guys have as expertise as a company, why you're successful, and what problems do you solve for customers?
>> No, appreciate that. Yeah, we actually, we love to tell the story of our founder, Nir Shavit. So he's a 20-year professor at MIT. Actually, he was doing a lot of work on kind of multicore processing before there were even physical multicores, and actually even did a stint in computational neurobiology in the 2010s. And the impetus for this whole technology, he has a great talk on YouTube about it, where he talks about the fact that in his work there, he kind of realized that the way neural networks encode, and how they're executed by kind of ramming data layer by layer through these kind of HPC-style platforms, actually was not analogous to how the human brain actually works. So on one side, we're building neural networks, and we're trying to emulate neurons, but we're not really executing them that way. So our team, which includes one of the co-founders, also ex-MIT, that was kind of the birth of: why can't we leverage this super-performant CPU platform, which has those really fat, fast caches attached to each core, and actually start to find a way to break that model down in a way that I can execute things in parallel, not having to do them sequentially? So there are a lot of amazing talks and stuff that show kind of the magic, if you will, part of the pun of Neural Magic, but that's kind of the foundational layer of all the engineering that we do here. And in terms of how we're able to bring it to reality for customers, I'll give one customer quote, where it's a large retailer, and it's a people-counting application. So a very common application. And that customer's actually been able to show literally double the amount of cameras being run with the same amount of compute. So from a one-to-one perspective, now two-to-one, and business leaders usually like that math, right?
So we're able to show pure cost savings, but even performance-wise, you know, we have some of the common models like your ResNets and your YOLOs, where we can actually even perform better than hardware-accelerated solutions. So what we're trying to do, if I need to just dumb it down: better, faster, cheaper, but from a commodity perspective, that's where we're accelerating. >> That's not a bad business model. Make things easier to use, faster, and reduce the steps it takes to do stuff. So, you know, that's always going to be a good market. Now, you guys have DeepSparse, which we've talked about on our CUBE conversation prior to this interview. It delivers ML models through software, so there's a decoupling from the hardware, right? >> Yep. >> Which is going to drive probably a cost advantage. Also, from a deployment standpoint, it must be easier. Can you share the benefits? Is it the cost side? Is it more the deployment? What are the benefits of DeepSparse when you guys decouple the software from the hardware on the ML models? >> No, you actually hit 'em both, 'cause that really is primarily the value. Because ultimately, again, we're so early. And I came from this world in a prior life where I'm doing Java development, WebSphere, WebLogic, Tomcat, open source, right? When we were trying to do innovation, we had innovation buckets, 'cause everybody wanted to be on the web and have their app in a browser, right? We got all the money we needed to build something and show, hey, look at the thing on the web, right? But when you had to get into production, that was the challenge. So to what you're speaking to here, in this situation, we're able to show we're just a Python package. So whether you just install it on the operating system itself, or we also have a containerized version you can drop on any container orchestration platform, so ECS or EKS on AWS. And so you get all the auto-scaling features.
So when you think about that kind of a world where you have everything from real-time inferencing to kind of after-hours batch-processing inferencing, the fact that you can auto-scale that hardware up and down, and it's CPU-based, so you're paying by the minute instead of maybe paying by the hour at a lower cost shelf, it does everything from pure cost to, again, I can have my standard IT team say, "Hey, here's the Kubernetes and the container," and it just runs on the infrastructure we're already managing. So yeah, operational, cost, and again, many times even performance. (audio warbles) CPUs if I want to. >> Yeah, so that's easier on the deployment too. And you don't have this kind of, you know, blank check kind of situation where you don't know what's on the backend on the cost side. >> Exactly. >> And you control the actual hardware and you can manage that supply chain. >> And keep in mind, exactly. Because the other thing that sometimes gets lost in the conversation, depending on where a customer is: some of these workloads, like, you know, you and I remember a world where even the roundtrip to the cloud and back was a problem for folks, right? We're used to extremely low latency. And some of these workloads absolutely also adhere to that. But there are some workloads where the latency isn't as important. And we actually even provide the tuning. Now, if we're giving you five milliseconds of latency and you don't need that, you can tune that back. So less CPU, lower cost. Now, throughput and other things come into play. But that's the kind of configurability and flexibility we give for operations. >> All right, so why should I call you? If I'm a customer or prospect of Neural Magic, what problem do I have, or when do I know I need you guys? When do I call you in, and what does my environment look like? When do I know? What are some of the signals that would tell me that I need Neural Magic? >> No, absolutely.
So I think in general, any neural network, you know, the process I mentioned before called sparsification, it's, you know, an optimization process that we specialize in. Any neural network, you know, can be sparsified. So I think if it's a deep-learning neural network type model, if you're trying to get AI into production, and you have cost concerns, even performance-wise, I certainly hate to be too generic and say, "Hey, we'll talk to everybody." But really in this world right now, if it's a neural network, and it's something where you're trying to get into production, you know, we are definitely offering, you know, kind of an at-scale, performant, deployable solution for deep learning models. >> So a neural network, you would define as what? Just devices that are connected that need to know about each other? What's the state-of-the-art current definition of neural network for customers that may think they have a neural network, or might not know they have a neural network architecture? What is that definition for neural network? >> That's a great question. So basically, machine learning models that fall under this kind of category: you hear about transformers a lot, or I mentioned the YOLO family of computer vision models, or natural language processing models like BERT. If you have a data science team or even developers, some even regular, I used to call myself a nine-to-five developer 'cause I worked in the enterprise, right? So like, hey, we found a new open source framework, you know, I used to use Spring back in the day and I had to go figure it out. There are developers that are pulling these models down and they're figuring out how to get 'em into production, okay? So I think all of those kinds of situations, you know, if it's a machine learning model of the deep learning variety, that's, you know, really specifically where we shine. >> Okay, so let me pretend I'm a customer for a minute.
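As a toy illustration of what sparsification means, here's unstructured magnitude pruning in a few lines. This is not Neural Magic's actual algorithm (their pipelines prune gradually during training and recover accuracy along the way), but it shows the end state: most weights become zero, which a sparsity-aware runtime can then skip.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a weight list."""
    k = int(len(weights) * sparsity)                    # how many weights to drop
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    drop = set(order[:k])                               # indices of the k smallest
    return [0.0 if i in drop else w for i, w in enumerate(weights)]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.03, 0.2, -0.02]
print(magnitude_prune(weights, 0.5))
# → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0, 0.2, 0.0]
```

At 50% sparsity, half the multiplications in a layer simply disappear; the engineering challenge Jay describes is building a runtime that actually converts those zeros into wall-clock speedup on CPUs.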
I have all these videos, like all these transcripts, I have all these people that we've interviewed, CUBE alumni, and I say to my team, "Let's AI-ify, sparsify theCUBE." >> Yep. >> What do I do? I mean, do I just like, my developers got to get involved and they're going to be like, "Well, how do I upload it to the cloud? Do I use a GPU?" So there's a thought process. And I think a lot of companies are going through that example of let's get on this AI, how can it help our business? >> Absolutely. >> What does that progression look like? Take me through that example. I mean, I made theCUBE example up, but we do have a lot of data. We have large data models and we have people, and we're connected to the internet, and so we kind of seem like there's a neural network. I think every company might have a neural network in place. >> Well, and I was going to say, I think in general, you all probably do represent even the standard enterprise more than most. 'Cause even the enterprise is going to have a ton of video content, a ton of text content. So I think it's a great example. So I think that that kind of sea, or I'll even go ahead and use that term data lake again, of data that you have, you're probably going to want to be setting up kind of machine learning pipelines that are going to be doing all of the pre-processing from kind of the raw data to kind of prepare it into the format that say a YOLO would actually use, or let's say BERT for natural language processing. So you have all these transcripts, right? So we would do a pre-processing pass where we would create that into the file format that BERT, the machine learning model, would know how to train off of. So that's kind of all the pre-processing steps. And then for training itself, we actually enable what's called sparse transfer learning. So transfer learning is a very popular method of doing training with existing models.
So we would be able to retrain that BERT model with your transcript data that we have now done the pre-processing with to get it into the proper format. And now we have a BERT natural language processing model that's been trained on your data. And now we can deploy that onto the DeepSparse runtime so that you can query that model, or I should say pass text through it. You're not going to ask it ChatGPT-style questions, although we can do that too. But you're going to pass text through the BERT model and it's going to give you answers back. It could be things like sentiment analysis or text classification. You just call the model, and now when you pass text through it, you get the answers better, faster or cheaper. I'll use that reference again. >> Okay, we can create a CUBE bot to give us questions on the fly from the AI bot, you know, from our previous guests. >> Well, and I will tell you, using that as an example. So I had mentioned OPT before, kind of the open source version of ChatGPT. So, you know, typically that requires multiple GPUs to run. So our research team, I may have mentioned earlier, we've been able to sparsify that over 50% already and run it on only a single GPU. And so in that situation, you could train OPT with that corpus of data and do exactly what you say. Actually we could use Alexa, we could use Alexa to actually respond back with voice. How about that? We'll do an API call and we'll actually have an interactive Alexa-enabled bot. >> Okay, we're going to be a customer, let's put it on the list. But this is a great example of what you guys call software-delivered AI, a topic we chatted about in theCUBE conversation. This really means this is a developer opportunity. This really is the convergence of the data growth, the restructuring, how data is going to be horizontally scalable, meets developers. So this is an AI developer model going on right now, which is kind of unique. >> It is, John, I will tell you what's interesting.
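From the application side, "pass text through the model and get answers back" is typically just an HTTP call to the serving runtime. The endpoint URL and payload shape below are hypothetical placeholders, not DeepSparse's documented contract, and a stand-in classifier is included so the sketch runs without a live server.

```python
# Hypothetical client sketch for a text-classification inference server.
import json
from urllib import request

SERVER = "http://localhost:5543/predict"   # hypothetical local endpoint

def classify(texts, url=SERVER):
    """POST a batch of texts to the (hypothetical) inference endpoint."""
    payload = json.dumps({"sequences": texts}).encode()
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Offline stand-in so the sketch runs without a server:
def fake_classify(texts):
    return [{"text": t,
             "label": "positive" if "great" in t.lower() else "neutral"}
            for t in texts]

print(fake_classify(["theCUBE interviews are great", "the meeting is at noon"]))
```

In production you would swap `fake_classify` for `classify` pointed at whatever endpoint your runtime actually exposes.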
And again, folks don't always think of it this way, you know, the AI magical goodness is now getting pushed in the middle where the developers and IT are operating. And so again, that paradigm, although for some folks it seems obvious, again, if you've been around for 20 years, all that plumbing is a thing, right? And so what we basically help with is when you deploy the DeepSparse runtime, we have a very rich API footprint. And so the developers can call the API, ITOps can run it, or to your point, it's developer friendly enough that you could actually deploy our off-the-shelf models. We have something called the SparseZoo where we actually publish pre-optimized or pre-sparsified models. And so developers could literally grab those right off the shelf with the training they've already had and just put 'em right into their applications and deploy them as containers. So yeah, we enable that for sure as well. >> It's interesting, DevOps was infrastructure as code, and we had, last season, a series on data as code, which we kind of coined. This is data as code. This is a whole other level of opportunity where developers just want to have programmable data and apps with AI. This is a whole new- >> Absolutely. >> Well, absolutely great, great stuff. Our news team at SiliconANGLE and theCUBE said you guys had a little bit of a launch announcement you wanted to make here on the "AWS Startup Showcase." So Jay, you have something that you want to launch here? >> Yes, and thank you John for teeing me up. So I'm going to try to put this in, like, you know, the vein of an AWS main stage keynote launch, okay? So we're going to try this out. So, you know, a lot of our product has obviously been built on top of x86. I've been sharing that the past 15 minutes or so. And with that, you know, we're seeing a lot of acceleration for folks wanting to run on commodity infrastructure.
But we've had customers and prospects and partners tell us that, you know, ARM and all of its kind of variants are very compelling, both cost performance-wise and also obviously with Edge. And they wanted to know if there was anything we could do from a runtime perspective with ARM. And so we got to work, and, you know, it's a hard problem to solve 'cause the instruction set for ARM is very different from the instruction set for x86, and our deep tensor column technology has to be able to work with that lower level instruction spec. But the engineering team's been working really hard at it, and we are happy to announce here at the "AWS Startup Showcase" that the DeepSparse inference runtime now has support for AWS Graviton instances. So it's no longer just x86, it is also ARM, and that obviously also opens up the door to Edge and further out the stack, so that optimize-once, run-anywhere story really opens up. So it is an early access. So if you go to neuralmagic.com/graviton, you can sign up for early access, but we're excited to now get into the ARM side of the fence as well on top of Graviton. >> That's awesome. Our news team is going to jump on that news. We'll get it right up. We get a little scoop here on the "Startup Showcase." Jay Marshall, great job. That really highlights the flexibility that you guys have when you decouple the software from the hardware. And again, we're seeing open source driving a lot more in AIOps now with machine learning and AI. So to me, that makes a lot of sense. And congratulations on that announcement. Final minute or so we have left, give a summary of what you guys are all about. Put a plug in for the company, what you guys are looking to do. I'm sure you're probably hiring like crazy. Take the last few minutes to give a plug for the company and give a summary. >> No, I appreciate that so much.
So yeah, join us at neuralmagic.com. You know, part of what we didn't spend a lot of time on here, our optimization tools, we are doing all of that in the open source. It's called SparseML, and I mentioned SparseZoo briefly. So we really want the data science community and ML engineering community to join us out there. And again, the DeepSparse runtime, it's actually free to use for trial purposes and for personal use. So you can actually run all this on your own laptop or on an AWS instance of your choice. We are now live in the AWS Marketplace. So push button, deploy, come try us out and reach out to us on neuralmagic.com. And again, sign up for the Graviton early access. >> All right, Jay Marshall, Vice President of Business Development at Neural Magic here, talking about performant, cost-effective machine learning at scale. This is season three, episode one, focusing on foundational models as far as building data infrastructure and AI, AI native. I'm John Furrier with theCUBE. Thanks for watching. (bright upbeat music)
Luis Ceze & Anna Connolly, OctoML | AWS Startup Showcase S3 E1
(soft music) >> Hello, everyone. Welcome to theCUBE's presentation of the AWS Startup Showcase. AI and Machine Learning: Top Startups Building Foundational Model Infrastructure. This is season 3, episode 1 of the ongoing series covering the exciting stuff from the AWS ecosystem, talking about machine learning and AI. I'm your host, John Furrier and today we are excited to be joined by Luis Ceze who's the CEO of OctoML and Anna Connolly, VP of customer success and experience OctoML. Great to have you on again, Luis. Anna, thanks for coming on. Appreciate it. >> Thank you, John. It's great to be here. >> Thanks for having us. >> I love the company. We had a CUBE conversation about this. You guys are really addressing how to run foundational models faster for less. And this is like the key theme. But before we get into it, this is a hot trend, but let's explain what you guys do. Can you set the narrative of what the company's about, why it was founded, what's your North Star and your mission? >> Yeah, so John, our mission is to make AI sustainable and accessible for everyone. And what we offer customers is, you know, a way of taking their models into production in the most efficient way possible by automating the process of getting a model and optimizing it for a variety of hardware and making cost-effective. So better, faster, cheaper model deployment. >> You know, the big trend here is AI. Everyone's seeing the ChatGPT, kind of the shot heard around the world. The BingAI and this fiasco and the ongoing experimentation. People are into it, and I think the business impact is clear. I haven't seen this in all of my career in the technology industry of this kind of inflection point. And every senior leader I talk to is rethinking about how to rebuild their business with AI because now the large language models have come in, these foundational models are here, they can see value in their data. This is a 10 year journey in the big data world. 
Now it's impacting that, and everyone's rebuilding their company around this idea of being AI first 'cause they see ways to eliminate things and make things more efficient. And so now they're telling 'em to go do it. And they're like, what do we do? So what do you guys think? Can you explain what is this wave of AI and why is it happening, why now, and what should people pay attention to? What does it mean to them? >> Yeah, I mean, it's pretty clear by now that AI can do amazing things that capture people's imaginations. And it can also now show things that are really impactful in businesses, right? So what people have the opportunity to do today is to either train their own model that adds value to their business or find open models out there that can do very valuable things for them. So the next step really is how do you take that model and put it into production in a cost-effective way so that the business can actually get value out of it, right? >> Anna, what's your take? Because customers are there, you're there to make 'em successful, you got the new secret weapon for their business. >> Yeah, I think we just see a lot of companies struggle to get from a trained model to a model that is deployed in a cost-effective way that actually makes sense for the application they're building. I think that's a huge challenge we see today, kind of across the board across all of our customers. >> Well, I see this, everyone asking the same question. I have data, I want to get value out of it. I got to get these big models, I got to train it. What's it going to cost? So I think there's a reality of, okay, I got to do it. Then no one has any visibility on what it costs. When they get into it, this is going to break the bank. So I have to ask you guys, the cost of training these models is on everyone's mind. OctoML, your company focuses on the cost side of it as well as the efficiency side of running these models in production.
Why are the production costs such a concern and where specifically are people looking at it and why did it get here? >> Yeah, so training costs get a lot of attention because it's normally a large number, but we shouldn't forget that it's a large, typically one-time upfront cost that customers pay. But, you know, when the model is put into production, the cost grows directly with model usage, and you actually want your model to be used because it's adding value, right? So, you know, the question that a customer faces is, you know, they have a trained model and now what? So how much would it cost to run in production, right? And now with the big wave in generative AI, which rightfully is getting a lot of attention because of the amazing things that it can do, it's important for us to keep in mind that generative AI models like ChatGPT are huge, expensive energy hogs. They cost a lot to run, right? And given that model cost grows directly with usage, what you want to do is make sure that once you put a model into production, you have the best cost structure possible so that you're not surprised when it gets popular, right? So let me give you an example. So if you have a model that costs, say 1 to $2 million to train, but then it costs about one to two cents per session to use it, right? So if you have a million active users, even if they use it just once a day, it's 10 to $20,000 a day to operate that model in production. And that very, very quickly, you know, gets beyond what you paid to train it. >> Anna, these aren't small numbers, and it's cost to train and cost to operate, it kind of reminds me of when the cloud came around and the data center versus cloud options. Like, wait a minute, one, it costs a ton of cash to deploy, and then running it. This is kind of a similar dynamic. What are you seeing? >> Yeah, absolutely.
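Luis's back-of-the-envelope math can be written out as a quick sanity check. The figures are the hypothetical ones from the conversation (about 1.5 cents per session, a million users at one session a day, a $1.5M training bill):

```python
# Illustrative figures from the conversation, not real pricing.
def daily_inference_cost(daily_active_users, sessions_per_user, cost_per_session):
    """Dollars spent on serving per day."""
    return daily_active_users * sessions_per_user * cost_per_session

def days_until_inference_exceeds_training(training_cost, daily_cost):
    """How soon cumulative serving spend passes the one-time training spend."""
    return training_cost / daily_cost

daily = daily_inference_cost(1_000_000, 1, 0.015)   # ~1.5 cents per session
print(f"Daily serving cost: ${daily:,.0f}")
print(f"Inference passes a $1.5M training bill in "
      f"{days_until_inference_exceeds_training(1_500_000, daily):.0f} days")
```

At these numbers serving costs roughly $15,000 a day, so the one-time training bill is overtaken in about three months, which is the point Luis is making.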
I think we are going to see the cost in production increasingly outpacing the cost of training by a lot. I mean, people talk about training costs now because that's what they're confronting now, because people are so focused on getting models performant enough to even use in an application. And now that we have them and they're that capable, we're really going to start to see production costs go up a lot. >> Yeah, Luis, if you don't mind, I know this might be a little bit of a tangent, but, you know, training's super important. I get that. That's what people are doing now, but then there's the deployment side of production. Where do people get caught up and miss the boat or misconfigure? What's the gotcha? Where's the trip wire, so to speak? Where do people mess up on the cost side? What do they do? Is it they don't think about it, they tie it to proprietary hardware? What's the issue? >> Yeah, several things, right? So without getting really technical, which, you know, I might get into, you know, you have to understand the relationship between performance, you know, both in terms of latency and throughput, and cost, right? So reducing latency is important because you improve responsiveness of the model. But it's really important to keep in mind that it often leads to diminishing returns. Below a certain latency, making it faster won't make a measurable difference in experience, but it's going to cost a lot more. So understanding that is important. Now, if you care more about throughput, which is, you know, the number of units processed per period of time, if you care about time to solution, you should think about throughput per dollar. And understand what you want is the highest throughput per dollar, which may come at the cost of higher latency, which you're not going to care about, right? So, and the reality here, John, is that, you know, humans and especially folks in this space want to have the latest and greatest hardware.
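Luis's "throughput per dollar" metric is easy to compute once you have benchmark numbers for each candidate. The instance names, prices, throughputs, and latencies below are invented for illustration:

```python
# Throughput per dollar: how many inferences you get per dollar spent.
def throughput_per_dollar(items_per_second, hourly_price):
    items_per_hour = items_per_second * 3600
    return items_per_hour / hourly_price

# Made-up benchmark results for three hypothetical targets.
candidates = {
    "big-gpu":   {"items_per_second": 900, "hourly_price": 4.00, "latency_ms": 4},
    "small-gpu": {"items_per_second": 350, "hourly_price": 1.00, "latency_ms": 9},
    "cpu":       {"items_per_second": 120, "hourly_price": 0.30, "latency_ms": 25},
}

for name, c in candidates.items():
    c["tpd"] = throughput_per_dollar(c["items_per_second"], c["hourly_price"])

best = max(candidates, key=lambda n: candidates[n]["tpd"])
print(best)   # highest inferences per dollar, despite the highest latency
```

With these (made-up) numbers the cheapest CPU wins on throughput per dollar while losing on latency, which is exactly the trade-off Luis describes: if your latency budget allows it, the slower hardware can be the smarter buy.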
And often they commit a lot of money to get access to them and have to commit upfront before they understand the needs that their models have, right? So a common mistake here, one, is not spending time to understand what you really need, and then two, over-committing and using more hardware than you actually need, and not giving yourself enough freedom to get your workload to move around to the more cost-effective choice, right? So this is just about making the right choice. And then another thing that's important here too is making a model run faster on the hardware directly translates to lower cost, right? So, but it takes a lot of engineering, you need to think of ways of producing very efficient versions of your model for the target hardware that you're going to use. >> Anna, what's the customer angle here? Because price performance has been around for a long time, people get that, but now latency and throughput, that's key because we're starting to see this in apps. I mean, there's an end user piece. I'm even seeing it on the infrastructure side where they're taking heavy lifting away from operational costs. So you got, you know, application specific to the user and/or top of the stack, and then you got it actually being used in operations where they want both. >> Yeah, absolutely. Maybe I can illustrate this with a quick story with a customer that we had recently been working with. So this customer is planning to run kind of a transformer-based model for text generation at super high scale on Nvidia T4 GPUs, so kind of a commodity GPU. And the scale was so high that they would've been paying hundreds of thousands of dollars in cloud costs per year just to serve this model alone. You know, one of many models in their application stack. So we worked with this team to optimize their model and then benchmark across several possible targets. So that matching the hardware that Luis was just talking about, including the newer kind of Nvidia A10 GPUs.
And what they found during this process was pretty interesting. First, the team was able to shave a quarter of their spend just by using better optimization techniques on the T4, the older hardware. But actually moving to a newer GPU would allow them to serve this model in a sub-two-millisecond latency, so super fast, which was able to unlock an entirely new kind of user experience. So they were able to kind of change the value they're delivering in their application just because they were able to move to this new hardware easily. So they ultimately decided to plan their deployment on the more expensive A10 because of this, but because of the hardware-specific optimizations that we helped them with, they managed to even, you know, bring costs down from what they had originally planned. And so if you extend this kind of example to everything that's happening with generative AI, I think the story we just talked about was super relevant, but the scale can be even higher, you know, it can be tenfold that. We were recently conducting kind of this internal study using GPT-J as a proxy to illustrate the experience of just a company trying to use one of these large language models, with an example scenario of creating a chatbot to help job seekers prepare for interviews. So if you imagine kind of a conservative usage scenario where the model generates just 3000 words per user per day, which is, you know, pretty conservative for how people are interacting with these models, it costs 5 cents a session. And if you're a company and your app goes viral, so from, you know, the beginning of the year there's nobody, at the end of the year there's a million daily active users, in that year alone, going from zero to a million, you'll be spending about $6 million a year, which is pretty unmanageable. That's crazy, right? >> Yeah. >> For a company or a product that's just launching.
So I think, you know, for us we see the real way to make these kind of advancements accessible and sustainable, as we said is to bring down cost to serve using these techniques. >> That's a great story and I think that illustrates this idea that deployment cost can vary from situation to situation, from model to model and that the efficiency is so strong with this new wave, it eliminates heavy lifting, creates more efficiency, automates intellect. I mean, this is the trend, this is radical, this is going to increase. So the cost could go from nominal to millions, literally, potentially. So, this is what customers are doing. Yeah, that's a great story. What makes sense on a financial, is there a cost of ownership? Is there a pattern for best practice for training? What do you guys advise cuz this is a lot of time and money involved in all potential, you know, good scenarios of upside. But you can get over your skis as they say, and be successful and be out of business if you don't manage it. I mean, that's what people are talking about, right? >> Yeah, absolutely. I think, you know, we see kind of three main vectors to reduce cost. I think one is make your deployment process easier overall, so that your engineering effort to even get your app running goes down. Two, would be get more from the compute you're already paying for, you're already paying, you know, for your instances in the cloud, but can you do more with that? And then three would be shop around for lower cost hardware to match your use case. So on the first one, I think making the deployment easier overall, there's a lot of manual work that goes into benchmarking, optimizing and packaging models for deployment. And because the performance of machine learning models can be really hardware dependent, you have to go through this process for each target you want to consider running your model on. And this is hard, you know, we see that every day. 
But for teams who want to incorporate some of these large language models into their applications, it might be desirable because licensing a model from a large vendor like OpenAI can leave you, you know, over provision, kind of paying for capabilities you don't need in your application or can lock you into them and you lose flexibility. So we have a customer whose team actually prepares models for deployment in a SaaS application that many of us use every day. And they told us recently that without kind of an automated benchmarking and experimentation platform, they were spending several days each to benchmark a single model on a single hardware type. So this is really, you know, manually intensive and then getting more from the compute you're already paying for. We do see customers who leave money on the table by running models that haven't been optimized specifically for the hardware target they're using, like Luis was mentioning. And for some teams they just don't have the time to go through an optimization process and for others they might lack kind of specialized expertise and this is something we can bring. And then on shopping around for different hardware types, we really see a huge variation in model performance across hardware, not just CPU vs. GPU, which is, you know, what people normally think of. But across CPU vendors themselves, high memory instances and across cloud providers even. So the best strategy here is for teams to really be able to, we say, look before you leap by running real world benchmarking and not just simulations or predictions to find the best software, hardware combination for their workload. >> Yeah. You guys sound like you have a very impressive customer base deploying large language models. Where would you categorize your current customer base? And as you look out, as you guys are growing, you have new customers coming in, take me through the progression. 
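The "look before you leap" benchmarking Anna describes ultimately reduces to measuring real latency distributions on each candidate target, not relying on predictions. A minimal harness, with a stand-in workload in place of a real model forward pass, might look like this:

```python
# Tiny benchmarking harness: warm up, then record per-call latency.
import time
import statistics

def benchmark(fn, warmup=3, runs=20):
    """Return median and approximate p95 latency in milliseconds."""
    for _ in range(warmup):          # warmup calls excluded from timing
        fn()
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return {"median_ms": statistics.median(samples),
            "p95_ms": samples[int(0.95 * len(samples)) - 1]}

workload = lambda: sum(i * i for i in range(50_000))   # stand-in "model"
result = benchmark(workload)
print(result)
```

Run the same harness with the real inference call on each hardware target, then combine the numbers with the instance price to compare throughput per dollar across options.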
Take me through the profile of some of your customers you have now, size, are they hyperscalers, are they big app folks, are they kicking the tires? And then as people are out there scratching heads, I got to get in this game, what's their psychology like? Are they coming in with specific problems or do they have specific orientation point of view about what they want to do? Can you share some data around what you're seeing? >> Yeah, I think, you know, we have customers that kind of range across the spectrum of sophistication from teams that basically don't have MLOps expertise in their company at all. And so they're really looking for us to kind of give a full service, how should I do everything from, you know, optimization, find the hardware, prepare for deployment. And then we have teams that, you know, maybe already have their serving and hosting infrastructure up and ready and they already have models in production and they're really just looking to, you know, take the extra juice out of the hardware and just do really specific on that optimization piece. I think one place where we're doing a lot more work now is kind of in the developer tooling, you know, model selection space. And that's kind of an area that we're creating more tools for, particularly within the PyTorch ecosystem to bring kind of this power earlier in the development cycle so that as people are grabbing a model off the shelf, they can, you know, see how it might perform and use that to inform their development process. >> Luis, what's the big, I like this idea of picking the models because isn't that like going to the market and picking the best model for your data? It's like, you know, it's like, isn't there a certain approaches? What's your view on this? 'Cause this is where everyone, I think it's going to be a land rush for this and I want to get your thoughts. >> For sure, yeah. 
So, you know, I guess I'll start with saying the one main takeaway that we got from the GPT-J study is that, you know, having an understanding of what your model's compute and memory requirements are, very early on, helps with much smarter AI model deployments, right? So, and in fact, you know, Anna just touched on this, but I want to, you know, make sure that it's clear that OctoML is putting that power into users' hands right now. So in partnership with AWS, we are launching this new PyTorch-native profiler that allows you, with a single, you know, one-line code decorator, to see how your code runs on a variety of different hardware after accelerations. So it gives you very clear, you know, data on how you should think about your model deployments. And this ties back to choices of models. So like, if you have a set of choices that are equally good models in terms of functionality, and you want to understand, after acceleration, how you are going to deploy and how much they're going to cost, using an automated process of making a decision is really, really useful. And in fact, folks at these events can get early access to this by signing up for the Octopod, you know, this is an exclusive group for insiders here, so you can go to OctoML.ai/pods to sign up. >> So that Octopod, is that a program? What is that, is that access to code? Is that a beta, what is that? Explain, take a minute and explain Octopod. >> I think the Octopod would be a group of people who are interested in experiencing this functionality. So it is the friends and users of OctoML that would be the Octopod. And then yes, after you sign up, we would provide you essentially the tool in code form for you to try out on your own.
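Since the actual OctoML profiler API isn't shown in the conversation, the following is only a hypothetical illustration of the "one-line decorator" idea: a decorator that records per-call timings when a single line is added above a function. It is not the real product interface.

```python
# Hypothetical sketch of a one-line profiling decorator (NOT OctoML's API).
import time
import functools

def profile(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        out = fn(*args, **kwargs)
        wrapper.timings.append((time.perf_counter() - start) * 1000)  # ms
        return out
    wrapper.timings = []
    return wrapper

@profile                            # the single added line
def forward(batch):
    return [x * 2 for x in batch]   # stand-in for a model forward pass

forward([1, 2, 3])
forward([4, 5, 6])
print(f"{len(forward.timings)} calls, last took {forward.timings[-1]:.3f} ms")
```

A real profiler would additionally ship those timings off to be compared across hardware targets and accelerated variants; the decorator pattern is just what makes the integration a one-line change.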
I mean, part of the benefit of this is that it happens in your own local environment and you're in control of everything kind of within the workflow that developers are already using to create and begin putting these models into their applications. So it would all be within your control. >> Got it. I think the big question I have for you is, when does one of your customers know they need to call you? What's their environment look like? What are they struggling with? What are the conversations they might be having on their side of the fence? If anyone's watching this, they're like, "Hey, you know what, I've got my team, we have a lot of data. Do we have our own language model or do I use someone else's?" There's a lot of this, I will say, discovery going on around what to do, what path to take, what does that customer look like. If someone's listening, when do they know to call you guys, OctoML? >> Well, I mean the most obvious one is that you have a significant spend on AI/ML, come and talk to us, you know, putting AI/ML into production. So that's the clear one. In fact, just this morning I was talking to someone who is in the life sciences space and has, you know, $15 to $20 million a year in cloud costs related to AI/ML deployment; it's a pretty clear match right there, right? So that's on the cost side. But I also want to emphasize something that Anna said earlier, that, you know, the hardware and software complexity involved in putting a model into production is really high. So we've been able to abstract that away, offering a clean automation flow that enables one, to experiment early on, you know, with how models would run and get them to production. And then two, once they are in production, it gives you an automated flow to continuously update your model and take advantage of all this acceleration and the ability to run the model on the right hardware.
So anyways, let's say one then is cost, you know, you have significant cost, and then two, you have automation needs. And Anna, please complement that. >> Yeah, Anna you can please- >> Yeah, I think that's exactly right. Maybe the other time is when you are expecting a big scale-up in serving your application, right? You're launching a new feature, you expect to get a lot of usage, and you want to kind of anticipate that maybe your CTO, your CIO, whoever pays your cloud bills, is going to come after you, right? And so they want to know, you know, what's the return on putting this model into my application stack? Is the usage going to match what I'm paying for it? And then you can understand that. >> So you guys have a lot of the early adopters. They've got big data teams, they're pushing to production, they want to get a little QA, test the waters, understand, use your technology to figure it out. Are there any cases where people have gone into production and had to pull it out? It's like the old lemon laws with your car, you buy a car and oh my god, it's not the way I wanted it. I mean, I can imagine the early people through the wall, so to speak, in the wave here are going to be bloody in the sense that they've gone in and tried stuff and gotten stuck with huge bills. Are you seeing that? Are people pulling stuff out of production and redeploying? I can imagine that if I had a bad deployment, I'd want to refactor that or actually replatform that. Do you see that too? >> Definitely after a sticker shock, yes, customers will come and make sure that, you know, the sticker shock won't happen again. >> Yeah. >> But then there's another more thorough aspect here that I think we lightly touched on that would be worth elaborating a bit more, which is just how are you going to scale in a way that's feasible depending on the allocation that you get, right?
So as we mentioned several times here, you know, model deployment is so hardware dependent and so complex that you tend to get a model working for one hardware choice, and then you want to scale that specific type of instance. But what if, when you want to scale because it suddenly got popular, you know, you don't have that instance available anymore? How do you live with whatever you have at that moment is something that we see customers needing as well. You know, so in fact, ideally what we want is for customers to not think about what kind of specific instances they want. What they want is to know what their models need. Say they know the SLA, and then they find a set of hardware targets and instances that hit the SLA. Then when they're scaling, they're going to scale with more freedom, right? Instead of having to wait for AWS to give them more allocation for a specific instance, what if you could live with other types of hardware and scale up in a more free way, right? So that's another thing that we see customers needing, you know, like more freedom to be able to scale with whatever is available. >> Anna, you touched on this with the business model impact. With that 6 million cost, if that goes out of control, there's a business model aspect and there's a technical operations aspect to the cost side too. You want to be mindful of riding the wave in a good way, but not getting over your skis. So that brings up the point around, you know, confidence, right? And teamwork. Because if you're in production, there's probably a team behind it. Talk about the team aspect of your customers. I mean, they're dedicated, they put stuff into production, they're developers, there's data. What's in it for them? Are they getting better? Are they at the beach, you know, reading a book? Are they, you know, on easy street? What's the customer benefit to the teams? >> Yeah, absolutely.
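As an aside, the SLA-driven selection Luis describes can be sketched in a few lines: given benchmark results per instance type, keep every target that meets the latency SLA, cheapest first, so scaling isn't pinned to one instance type. The numbers and function are made up for illustration; this is not OctoML's actual selection logic.

```python
benchmarks = {
    # instance type: (p95 latency in ms, $ per hour) -- made-up numbers
    "gpu-a":  (42.0, 3.06),
    "gpu-b":  (95.0, 0.526),
    "cpu-lg": (310.0, 0.68),
}

def viable_targets(benchmarks, sla_ms):
    """Return every instance type meeting the latency SLA, cheapest first."""
    ok = [(name, cost) for name, (lat, cost) in benchmarks.items() if lat <= sla_ms]
    return [name for name, _ in sorted(ok, key=lambda t: t[1])]

print(viable_targets(benchmarks, sla_ms=100.0))
print(viable_targets(benchmarks, sla_ms=400.0))
```

A looser SLA admits more hardware choices, which is exactly the scaling freedom being described.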
With just a few clicks of a button, you're in production, right? That's the dream. So yeah, I mean I think that, you know, we illustrated it before a little bit. Think about the automated kind of benchmarking and optimization process. When you think about the effort it takes to get that data by hand, which is what people are doing today, they just don't do it. So they're making decisions without the best information, because, you know, there just isn't the bandwidth to get the information that they need to make the best decision and then know exactly how to deploy it. So I think it's actually bringing kind of a new insight and capability to these teams that they didn't have before. And then maybe another aspect on the team side is that it's making the hand-off of the models from the data science teams to the model deployment teams more seamless. So we have, you know, we have seen in the past that this kind of transition point is the place where there are a lot of hiccups, right? The data science team will give a model to the production team, and it'll be too slow for the application, or it'll be too expensive to run, and it has to go back and be changed, and kind of this loop. And so, you know, with the PyTorch profiler that Luis was talking about, and then also, you know, the other ways we do optimization, that kind of prevents that hand-off problem from happening. >> Luis and Anna, you guys have a great company. Final couple minutes left. Talk about the company, the people there. What's the culture like? You know, if Intel has Moore's law, which is, you know, doubling the performance every couple of years, what's the culture like there? Is it, you know, more throughput, better pricing? Explain what's going on with the company and put a plug in. Luis, we'll start with you. >> Yeah, absolutely. I'm extremely proud of the team that we built here.
You know, we have a people-first culture, you know, very, very collaborative, and we all have a shared mission here of making AI more accessible and sustainable. We have a very diverse team in terms of backgrounds and life stories. You know, to do what we do here, we need a team that has expertise in software engineering, in machine learning, in computer architecture. Even though we don't build chips, we need to understand how they work, right? And then, you know, the fact that we have this really, really varied set of backgrounds makes the environment, you know, very exciting for learning more about, you know, systems end-to-end. But it also makes for a very interesting, you know, work environment, right? People have different backgrounds, different stories. Some of them went to grad school; others, you know, were in intelligence agencies and now are working here, you know. So we have a really interesting set of people, and, you know, life is too short not to work with interesting humans. You know, that's something that I like to think about. >> I'm sure your off-site meetings are a lot of fun, people talking about computer architectures, silicon advances, the next GPU, the big data models coming in. Anna, what's your take? What's the culture like? What's the company vibe and what are you guys looking to do? What's the customer success pattern? What's up? >> Yeah, absolutely. I mean, I, you know, second all of the great things that Luis just said about the team. An additional one that I'd really like to underscore is kind of this customer obsession, to use a term you all know well, and a focus on the end users, really making the experiences that we're bringing to our users, who are developers, you know, useful and valuable for them.
And so I think, you know, with all of these tools that we're trying to put in the hands of users, the industry and the market is changing so rapidly that our products across the board, you know, all of the companies that, you know, are part of the showcase today, we're all evolving them so quickly, and we can only do that kind of really hand in glove with our users. So that would be another thing I'd emphasize. >> I think the change dynamic, the power dynamics of this industry, is just the beginning. I'm very bullish that this is going to be probably one of the biggest inflection points in the history of the computer industry because of the confluence of all the forces, which you mentioned some of. I mean, the PC, you know, interoperability with internetworking, and you got, you know, the web and then mobile. Now we have this. I mean, I wouldn't even put social media close to this. Like, this changes user experience, changes infrastructure. There's going to be massive accelerations in performance on the hardware side from the AWS's of the world and cloud, and you've got the edge and more data. This is really what big data was going to look like. This is the beginning. Final question, what do you guys see going forward in the future? >> Well, it's undeniable that machine learning and AI models are becoming an integral part of any interesting application today, right? So the clear trends here are, you know, one, more and more computational needs for these models, because they're only getting more and more powerful. And then two, you know, the complexity of the infrastructure where they run. You know, just considering the cloud, there's a wide variety of choices there, right? So being able to live with that and making the most out of it in a way that does not require, you know, an impossible-to-find team is something that's pretty clear. So the need for automation, abstracting away the complexity, is definitely here.
And we are seeing, you know, trends where models are starting to move to the edge as well. So it's clear that we are going to live in a world where there are not only large models living in the cloud, but also, you know, edge models that talk to those models in the cloud to form, you know, an end-to-end, truly intelligent application. >> Anna? >> Yeah, I think, you know, Luis said it at the beginning. Our vision is to make AI sustainable and accessible. And I think as this technology just expands into every company and every team, that's going to happen kind of on its own, and we're here to help support that. And I think you can't do that without tools like those from OctoML. >> I think it's going to be an era of massive invention and creativity. Automating a lot of the heavy lifting is going to allow the talented people to apply their intellect. I mean, this is really kind of what we see going on. And Luis, thank you so much. Anna, thanks for coming on this segment. Thanks for coming on theCUBE and being part of the AWS Startup Showcase. I'm John Furrier, your host. Thanks for watching. (upbeat music)
Robert Nishihara, Anyscale | AWS Startup Showcase S3 E1
(upbeat music) >> Hello everyone. Welcome to theCUBE's presentation of the "AWS Startup Showcase." The topic this episode is AI and machine learning: top startups building foundational model infrastructure. This is season three, episode one of the ongoing series covering exciting startups from the AWS ecosystem. And this time we're talking about AI and machine learning. I'm your host, John Furrier. I'm excited I'm joined today by Robert Nishihara, who's the co-founder and CEO of a hot startup called Anyscale. He's here to talk about Ray, the open source project, and Anyscale's infrastructure for foundation models as well. Robert, thank you for joining us today. >> Yeah, thanks so much as well. >> I've been following your company since the founding pre-pandemic, and you guys really had a great vision, scaled up, and are in a perfect position for this big wave that we all see with ChatGPT and OpenAI that's gone mainstream. Finally, AI has broken out through the ropes and now gone mainstream, so I think you guys are really well positioned. I'm looking forward to talking with you today. But before we get into it, introduce the core mission for Anyscale. Why do you guys exist? What is the North Star for Anyscale? >> Yeah, like you mentioned, there's a tremendous amount of excitement about AI right now. You know, I think a lot of us believe that AI can transform just about every industry. So one of the things that was clear to us when we started this company was that the amount of compute needed to do AI was just exploding. Like, to actually succeed with AI, companies like OpenAI or Google, or, you know, these companies getting a lot of value from AI, were not just running these machine learning models on their laptops or on a single machine. They were scaling these applications across hundreds or thousands or more machines and GPUs and other resources in the Cloud.
And so to actually succeed with AI, and this has been one of the biggest trends in computing, maybe the biggest trend in computing in, you know, recent history, the amount of compute has been exploding. And so to actually succeed with AI, to actually build and scale these AI applications, there's a tremendous software engineering lift to build the infrastructure to actually run them. And that's very hard to do. So one of the reasons many AI projects and initiatives fail, or don't make it to production, is the need for this scale, the infrastructure lift, to actually make it happen. So our goal here with Anyscale and Ray is to make that easy, is to make scalable computing easy. So that as a developer or as a business, if you want to do AI, if you want to get value out of AI, all you need to know is how to program on your laptop. Like, all you need to know is how to program in Python. And if you can do that, then you're good to go. Then you can do what companies like OpenAI or Google do and get value out of machine learning. >> That programming example of how easy it is with Python reminds me of the early days of Cloud, when infrastructure as code was first talked about: it was just making the infrastructure programmable. That's super important. That's what AI people want: to program AI. That's the new trend. And I want to understand, if you don't mind explaining, the relationship that Anyscale has to these foundational models, and in particular the large language models, also called LLMs, as seen with OpenAI and ChatGPT. Before you get into the relationship that you have with them, can you explain why the hype around foundational models? Why are people going crazy over foundational models? What is it and why is it so important?
>> Yeah, so foundation models are incredibly important because they enable businesses and developers to get value out of machine learning, to use machine learning off the shelf with these large models that have been trained on tons of data and that are useful out of the box. And then, of course, you know, as a business or as a developer, you can take those foundation models and repurpose them or fine-tune them or adapt them to your specific use case and what you want to achieve. But it's much easier to do that than to train them from scratch. And I think, for people to actually use foundation models, there are three main types of workloads or problems that need to be solved. One is training these foundation models in the first place, like actually creating them. The second is fine-tuning them and adapting them to your use case. And the third is serving them and actually deploying them. Okay, so Ray and Anyscale are used for all three of these workloads. Companies like OpenAI or Cohere that train large language models, or open source versions like GPT-J, do that on top of Ray. There are many startups and other businesses that don't want to train the large underlying foundation models, but that do want to fine-tune them, do want to adapt them to their purposes, and build products around them and serve them; those are also using Ray and Anyscale for that fine-tuning and that serving. And so the reason that Ray and Anyscale are important here is that, you know, building and using foundation models requires huge scale. It requires a lot of data. It requires a lot of compute, GPUs, TPUs, other resources. And to actually take advantage of that and actually build these scalable applications, there's a lot of infrastructure work that needs to happen under the hood. And so you can either use Ray and Anyscale to take care of that and manage the infrastructure and solve those infrastructure problems.
Or you can build the infrastructure and manage the infrastructure yourself, which you can do, but it's going to slow your team down. It's going to, you know, many of the businesses we work with simply don't want to be in the business of managing infrastructure and building infrastructure. They want to focus on product development and move faster. >> I know you got a keynote presentation we're going to go to in a second, but I think you hit on something I think is the real tipping point, doing it yourself, hard to do. These are things where opportunities are and the Cloud did that with data centers. Turned a data center and made it an API. The heavy lifting went away and went to the Cloud so people could be more creative and build their product. In this case, build their creativity. Is that kind of what's the big deal? Is that kind of a big deal happening that you guys are taking the learnings and making that available so people don't have to do that? >> That's exactly right. So today, if you want to succeed with AI, if you want to use AI in your business, infrastructure work is on the critical path for doing that. To do AI, you have to build infrastructure. You have to figure out how to scale your applications. That's going to change. We're going to get to the point, and you know, with Ray and Anyscale, we're going to remove the infrastructure from the critical path so that as a developer or as a business, all you need to focus on is your application logic, what you want the the program to do, what you want your application to do, how you want the AI to actually interface with the rest of your product. Now the way that will happen is that Ray and Anyscale will still, the infrastructure work will still happen. It'll just be under the hood and taken care of by Ray in Anyscale. And so I think something like this is really necessary for AI to reach its potential, for AI to have the impact and the reach that we think it will, you have to make it easier to do. 
>> And just for clarification, to point out, if you don't mind explaining the relationship of Ray and Anyscale real quick just before we get into the presentation. >> So Ray is an open source project. We created it. We were at Berkeley doing machine learning. We started Ray in order to provide an easy, simple open source tool for building and running scalable applications. And Anyscale is the managed version of Ray; basically we will run Ray for you in the Cloud, provide a lot of tools around the developer experience and managing the infrastructure, and provide more performance and superior infrastructure. >> Awesome. I know you got a presentation on Ray and Anyscale, and you guys are positioning as the infrastructure for foundational models. So I'll let you take it away, and then when you're done presenting, we'll come back, I'll probably grill you with a few questions, and then we'll close it out, so take it away. >> Robert: Sounds great. So I'll say a little bit about how companies are using Ray and Anyscale for foundation models. The first thing I want to mention is just why we're doing this in the first place. And the underlying observation, the underlying trend here, and this is a plot from OpenAI, is that the amount of compute needed to do machine learning has been exploding. It's been growing at something like 35 times every 18 months. This is absolutely enormous. And other people have written papers measuring this trend, and you get different numbers. But the point is, no matter how you slice and dice it, it's an astronomical rate. Now if you compare that to something we're all familiar with, like Moore's Law, which says that, you know, processor performance doubles every roughly 18 months, you can see that there's just a tremendous gap between the compute needs of machine learning applications and what you can do with a single chip, right.
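The size of the gap Robert describes can be checked with quick arithmetic: compare ~35x demand growth per 18 months against a doubling of chip performance per 18 months (a generous reading of Moore's law) over, say, six years.

```python
periods = (6 * 12) // 18        # four 18-month periods in six years
demand_growth = 35 ** periods   # ML compute demand: 1,500,625x
chip_growth = 2 ** periods      # optimistic Moore's law: 16x
gap = demand_growth / chip_growth
print(f"demand {demand_growth:,}x vs chips {chip_growth}x -> {gap:,.0f}x gap")
```

Even with generous chip-performance assumptions, the demand curve outruns single-chip performance by a factor of tens of thousands in just a few years, which is the argument for scaling out.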
So even if Moore's Law were continuing strong and, you know, doing what it used to do, even if that were the case, there would still be a tremendous gap between what you can do with a chip and what you need in order to do machine learning. And so given this graph, what we've seen, and what has been clear to us since we started this company, is that doing AI requires scaling. There's no way around it. It's not a nice-to-have, it's really a requirement. And so that led us to start Ray, the open source project we created to make it easy to build these scalable Python applications and scalable machine learning applications. And since we started the project, it's been adopted by a tremendous number of companies. Companies like OpenAI, which use Ray to train their large models like ChatGPT; companies like Uber, which run all of their deep learning and classical machine learning on top of Ray; companies like Shopify or Spotify or Instacart or Lyft or Netflix, ByteDance, which use Ray for their machine learning infrastructure. Companies like Ant Group, which makes Alipay, you know, they use Ray across the board for fraud detection, for online learning, for detecting money laundering, you know, for graph processing, stream processing. Companies like Amazon, you know, run Ray at a tremendous scale, processing petabytes of data every single day. And so the project has seen just enormous adoption over the past few years. And one of the most exciting use cases is really providing the infrastructure for training, fine-tuning, and serving foundation models. So I'll say a little bit about, you know, here are some examples of companies using Ray for foundation models. Cohere trains large language models. OpenAI also trains large language models. You can think about the workloads required there: things like supervised pre-training, and also reinforcement learning from human feedback.
So this is not only regular supervised learning, but actually more complex reinforcement learning workloads that take human input about which response to a particular question is better than another response, and incorporate that into the learning. There are open source versions as well, like GPT-J, also built on top of Ray, as well as projects like Alpa coming out of UC Berkeley. So these are some examples of exciting projects and organizations training and creating these large language models and serving them using Ray. Okay, so what actually is Ray? Well, there are two layers to Ray. At the lowest level, there's the core Ray system. This is essentially low-level primitives for building scalable Python applications: things like taking a Python function or a Python class and executing them in a cluster setting. So Ray core is extremely flexible, and you can build arbitrary scalable applications on top of Ray. On top of the core system, what really gives Ray a lot of its power is this ecosystem of scalable libraries. So on top of the core system you have scalable libraries for ingesting and pre-processing data, for training your models, for fine-tuning those models, for hyperparameter tuning, for doing batch processing and batch inference, for doing model serving and deployment, right. And a lot of Ray users, the reason they like Ray is that they want to run multiple workloads. They want to train and serve their models, right. They want to load their data and feed that into training. And Ray provides common infrastructure for all of these different workloads. So this is a little overview of the different components of Ray. So why do people choose to go with Ray? I think there are three main reasons. The first is the unified nature, the fact that it is common infrastructure for scaling arbitrary workloads, from data ingest to pre-processing to training to inference and serving, right.
This also includes the fact that it's future-proof. AI is incredibly fast moving. And so many people, many companies that have built their own machine learning infrastructure and standardized on particular workflows for doing machine learning have found that their workflows are too rigid to enable new capabilities. If they want to do reinforcement learning, if they want to use graph neural networks, they don't have a way of doing that with their standard tooling. And so Ray, being future-proof and being flexible and general, gives them that ability. Another reason people choose Ray and Anyscale is the scalability. This is really our bread and butter. This is the whole point of Ray, you know, making it easy to go from your laptop to running on thousands of GPUs, making it easy to scale your development workloads and run them in production, making it easy to scale, you know, training, to scale data ingest, pre-processing and so on. So scalability and performance, you know, are critical for doing machine learning, and that is something that Ray provides out of the box. And lastly, Ray is an open ecosystem. You can run it anywhere. You can run it on any Cloud provider: Google, you know, Google Cloud, AWS, Azure. You can run it on your Kubernetes cluster. You can run it on your laptop. It's extremely portable. And not only that, it's framework agnostic. You can use Ray to scale arbitrary Python workloads, and it integrates with libraries like TensorFlow or PyTorch or JAX or XGBoost or Hugging Face or PyTorch Lightning, right, or Scikit-learn, or just your own arbitrary Python code. It's open source. And in addition to integrating with the rest of the machine learning ecosystem and these machine learning frameworks, you can use Ray along with all of the other tooling in the machine learning ecosystem. That's things like Weights & Biases or MLflow, right.
Or, you know, different data platforms like Databricks, you know, Delta Lake or Snowflake, or tools for model monitoring, for feature stores; all of these integrate with Ray. And that's, you know, Ray provides that kind of flexibility so that you can integrate it into the rest of your workflow. And then Anyscale is the scalable compute platform that's built on top, you know, that provides Ray. So Anyscale is a managed Ray service that runs in the Cloud. And what Anyscale does is it offers the best way to run Ray. And if you think about what you get with Anyscale, there are fundamentally two things. One is about moving faster, accelerating the time to market. And you get that by having the managed service, so that as a developer you don't have to worry about managing infrastructure, you don't have to worry about configuring infrastructure. It also provides, you know, optimized developer workflows: things like easily moving from development to production, things like having the observability tooling, the debuggability to actually easily diagnose what's going wrong in a distributed application. So things like the dashboards and the other kinds of tooling for collaboration, for monitoring and so on. And then on top of that, so that's the first bucket, developer productivity: moving faster, faster experimentation and iteration. The second reason that people choose Anyscale is superior infrastructure. So this is things like, you know, cost efficiency, being able to easily take advantage of spot instances, being able to get higher GPU utilization, things like faster cluster startup times and auto-scaling, things like just overall better performance and faster scheduling. And so these are the kinds of things that Anyscale provides on top of Ray. It's the managed infrastructure, it's the developer productivity and velocity, as well as performance. So this is what I wanted to share about Ray and Anyscale. >> John: Awesome. >> Provide that context.
But John, I'm curious what you think. >> I love it. I love the, so first of all, it's a platform, because that's the platform architecture right there. So just to clarify, this is an Anyscale platform, not- >> That's right. >> Tools. So you got tools in the platform. Okay, that's key. Love that managed service. Just curious, you mentioned Python multiple times. Is that because of PyTorch and TensorFlow, or Python's the most friendly with machine learning, or is it because it's very common amongst all developers? >> That's a great question. Python is the language that people are using to do machine learning, so it's the natural starting point. Now, of course, Ray is actually designed in a language-agnostic way, and there are companies out there that use Ray to build scalable Java applications. But for the most part right now we're focused on Python and being the best way to build these scalable Python and machine learning applications. But, of course, down the road there always is that potential. >> So if you're slinging Python code out there and you're watching this video, get on the Anyscale bus quickly. Also, while you were giving the presentation, I couldn't help myself, since you mentioned OpenAI, which by the way, congratulations 'cause they've had great scale. I've noticed their rapid growth 'cause they were the fastest company to reach that number of users of anyone in the history of the computer industry, so a major success for OpenAI and ChatGPT, huge fan. I'm not a skeptic at all. I think it's just the beginning, so congratulations. But I actually typed into ChatGPT, what are the top three benefits of Anyscale, and it came up with scalability, flexibility, and ease of use. Obviously, scalability is what you guys are called. >> That's pretty good. >> So that's what they came up with. So they nailed it. Did you have an inside prompt, training it there? Only kidding. (Robert laughs) >> Yeah, we hard coded that one.
>> But that's the kind of thing that came up really, really quickly. If I asked it to write a sales document, it probably will, but this is the future interface. This is why people are getting excited about the foundational models and the large language models, because it's allowing the interface with the user, the consumer, to be more human, more natural. And this clearly will be in every application in the future. >> Absolutely. This is how people are going to interface with software, how they're going to interface with products in the future. It's not just something, you know, not just a chat bot that you talk to. This is going to be how you get things done, right. How you use your web browser or how you use, you know, how you use Photoshop or how you use other products. Like you're not going to spend hours learning all the APIs and how to use them. You're going to talk to it and tell it what you want it to do. And of course, you know, if it doesn't understand it, it's going to ask clarifying questions. You're going to have a conversation and then it'll figure it out. >> This is going to be one of those things, we're going to look back at this time, Robert, and say, "Yeah, from that company, that was the beginning of that wave." And just like AWS and Cloud Computing, the folks who got in early really were in position when, say, the pandemic came. So getting in early is a good thing, and that's what everyone's talking about, getting in early and playing around, maybe replatforming or even picking one or a few apps to refactor with some SaaS and managed services. So people are definitely jumping in. So I have to ask you the ROI cost question. You mentioned some of those, Moore's Law versus what's going on in the industry. When you look at that kind of scale, the first thing that jumps out at people is, "Okay, I love it. Let's go play around." But what's it going to cost me? Am I going to be tied to certain GPUs?
What's the landscape look like from an operational standpoint, from the customer? Are they locked in, or the benefit was flexibility, are you flexible to handle any Cloud? What are the customers looking at? Basically, that's my question. What's the customer looking at? >> Cost is super important here and many of the companies, I mean, companies are spending a huge amount on their Cloud computing, on AWS, and on doing AI, right. And I think a lot of the advantage of Anyscale, what we can provide here, is not only better performance, but cost efficiency. Because if we can run something faster and more efficiently, it can also use less resources and you can lower your Cloud spending, right. We've seen companies go from, you know, 20% GPU utilization with their current setup and the current tools they're using, to running on Anyscale and getting more like 95, you know, 100% GPU utilization. That's something like a five x improvement right there. So depending on the kind of application you're running, you know, it's a significant cost savings. We've seen companies that are, you know, processing petabytes of data every single day with Ray getting order of magnitude cost savings by switching from what they were previously doing to running their application on Ray. And when you have applications that are spending, you know, potentially $100 million a year, a 10x cost savings is just absolutely enormous. So these are some of the kinds of- >> Data infrastructure is super important. Again, if you're a prospect for this and thinking about going in here, just like the Cloud, you got infrastructure, you got the platform, you got SaaS. The same kind of thing's going to go on in AI. So I want to get into that, you know, ROI discussion and some of the impact with your customers that are leveraging the platform. But first I hear you got a demo. >> Robert: Yeah, so let me show you, let me give you a quick run through here.
So what I have open here is the Anyscale UI. I've started a little Anyscale Workspace. So Workspaces are the Anyscale concept for interactive development, right. So here, imagine you want to have a familiar experience like you're developing on your laptop. And here I have a terminal. It's not on my laptop. It's actually in the cloud running on Anyscale. And I'm just going to kick this off. This is going to train a large language model, OPT. And it's doing this on 32 GPUs. We've got a cluster here with a bunch of CPU cores, a bunch of memory. And as that's running, and by the way, if I wanted to run this on, instead of 32 GPUs, 64 or 128, this is just a one line change when I launch the Workspace. And what I can do is I can pull up VS Code, right. Remember, this is the interactive development experience. I can look at the actual code. Here it's using Ray Train to train the torch model. We've got the training loop and we're saying that each worker gets access to one GPU and four CPU cores. And, of course, as I make the model larger, this is using DeepSpeed, as I make the model larger, I could increase the number of GPUs that each worker gets access to, right. And how that is distributed across the cluster. And if I wanted to run on CPUs instead of GPUs, or a different, you know, accelerator type, again, this is just a one line change. And here we're using Ray Train to train the models, just taking my vanilla PyTorch model using Hugging Face and then scaling that across a bunch of GPUs. And, of course, if I want to look at the dashboard, I can go to the Ray dashboard. There are a bunch of different visualizations I can look at. I can look at the GPU utilization. I can look at, you know, the CPU utilization here, where I think we're currently loading the model and running the actual application to start the training. And some of the things that are really convenient here about Anyscale: I can get that interactive development experience with VS Code.
You know, I can look at the dashboards. I can monitor what's going on. I have a terminal, it feels like my laptop, but it's actually running on a large cluster, with however many GPUs or other resources I want. And so it's really trying to combine the best of having the familiar experience of programming on your laptop with the benefits, you know, of being able to take advantage of all the resources in the Cloud to scale. And you know, you're talking about cost efficiency. One of the biggest reasons that people waste money, one of the silly reasons for wasting money, is just forgetting to turn off your GPUs. And what you can do here is, of course, things will auto terminate if they're idle. But imagine you go to sleep, I have this big cluster. You can turn it off, shut off the cluster, come back tomorrow, restart the Workspace, and you know, your big cluster is back up and all of your code changes are still there. All of your local file edits. It's like you just closed your laptop and came back and opened it up again. And so this is the kind of experience we want to provide for our users. So that's what I wanted to share with you. >> Well, I think that whole, a couple of things, lines of code change, single line of code change, that's game changing. And then the cost thing, I mean, human error is a big deal. People pass out at their computer. They've been coding all night or they just forget about it. I mean, it's just like leaving the lights on or your water running in your house. At the scale that it is, the numbers will add up. That's a huge deal. So I think, you know, compute back in the old days, okay, it was just compute sitting there idle. But now, you know, data cranking through the models is doing work, and that's a big point.
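The "one line change" in the demo is worth pinning down. As an illustration only, here is a plain-Python stand-in for that kind of scaling declaration; the field names are loosely modeled on Ray Train's ScalingConfig, but this is not runnable Ray code and none of it is Anyscale's actual API.

```python
# Illustrative only: a plain-Python stand-in for the kind of scaling
# declaration Robert edits in the demo. Field names are modeled on Ray
# Train's ScalingConfig, but this is not Ray or Anyscale code.
def scaling_config(num_workers: int, use_gpu: bool = True,
                   gpus_per_worker: int = 1, cpus_per_worker: int = 4):
    # All cluster-shape decisions live in one value, so going from the
    # demo's 32 workers to 64 or 128 is a single-argument change.
    return {
        "num_workers": num_workers,
        "use_gpu": use_gpu,
        "resources_per_worker": {
            "GPU": gpus_per_worker if use_gpu else 0,
            "CPU": cpus_per_worker,
        },
    }

demo_run = scaling_config(num_workers=32)    # the run shown in the demo
bigger_run = scaling_config(num_workers=64)  # the "one line change"
```

The design point is that the training loop itself never mentions cluster size; only this one declaration does, which is what makes scaling up a one-line edit rather than a rewrite.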
>> Another thing I want to add there about cost efficiency is that we make it really easy, if you're running on Anyscale, to use spot instances, these preemptible instances that can just be significantly cheaper than the on-demand instances. And so we see our customers go from what they were doing before, not using these spot instances 'cause they don't have the infrastructure around it, the fault tolerance to handle the preemption and things like that, to being able to just check a box, use spot instances, and save a bunch of money. >> You know, this was my whole, my feature article at re:Invent last year when I met with Adam Selipsky, this next gen Cloud is here. I mean, it's not auto scale, it's infrastructure scale. It's agility. It's flexibility. I think this is where the world needs to go. Almost what DevOps did for Cloud, and what you were showing me in that demo had this whole SRE vibe. And remember, Google had site reliability engineers to manage all those servers. This is kind of like an SRE vibe for data at scale. I mean, a similar kind of order of magnitude. I mean, I might be a little bit off base there, but how would you explain it? >> It's a nice analogy. I mean, what we are trying to do here is get to the point where developers don't think about infrastructure. Where developers only think about their application logic. And where businesses can do AI, can succeed with AI, and build these scalable applications, but they don't have to build, you know, an infrastructure team. They don't have to develop that expertise. They don't have to invest years in building their internal machine learning infrastructure. They can just focus on the Python code, on their application logic, and run the stuff out of the box. >> Awesome. Well, I appreciate the time. Before we wrap up here, give a plug for the company. I know you got a couple websites. Again, Ray's got its own website. You got Anyscale.
You got an event coming up. Give a plug for the company, I know you're looking to hire. >> Yeah, absolutely. Thank you. So first of all, you know, we think AI is really going to transform every industry and the opportunity is there, right. We can be the infrastructure that enables all of that to happen, that makes it easy for companies to succeed with AI and get value out of AI. Now, if you're interested in learning more about Ray, Ray has been emerging as the standard way to build scalable applications. Our adoption has been exploding. I mentioned companies like OpenAI using Ray to train their models. But really, across the board, companies like Netflix and Cruise and Instacart and Lyft and Uber, you know, just among tech companies. It's across every industry. You know, gaming companies, agriculture, you know, farming, robotics, drug discovery, you know, FinTech, we see it across the board. And all of these companies can get value out of AI, can really use AI to improve their businesses. So if you're interested in learning more about Ray and Anyscale, we have our Ray Summit coming up in September. This is going to highlight a lot of the most impressive use cases and stories across the industry. And if your business wants to use LLMs, you want to train these LLMs, these large language models, you want to fine tune them with your data, you want to deploy them, serve them, and build applications and products around them, give us a call, talk to us. You know, we can really take the infrastructure piece, you know, off the critical path and make that easy for you. So that's what I would say. And, you know, like you mentioned, we're hiring across the board, you know, engineering, product, go-to-market, and it's an exciting time. >> Robert Nishihara, co-founder and CEO of Anyscale, congratulations on a great company you've built and continue to iterate on, and you got growth ahead of you, you got a tailwind. I mean, the AI wave is here.
I think OpenAI and ChatGPT, a customer of yours, have really opened up the mainstream visibility into this new generation of applications, user interface, role of data, large scale, how to make that programmable, so we're going to need that infrastructure. So thanks for coming on this season three, episode one of the ongoing series of the hot startups. In this case, this episode is the top startups building foundational model infrastructure for AI and ML. I'm John Furrier, your host. Thanks for watching. (upbeat music)
Opening Panel | Generative AI: Hype or Reality | AWS Startup Showcase S3 E1
(light airy music) >> Hello, everyone, welcome to theCUBE's presentation of the AWS Startup Showcase, AI and machine learning: "Top Startups Building Generative AI on AWS." This is season three, episode one of the ongoing series covering the exciting startups from the AWS ecosystem, talking about AI and machine learning. We have three great guests: Bratin Saha, Vice President of Machine Learning and AI Services at Amazon Web Services; Tom Mason, the CTO of Stability AI; and Aidan Gomez, CEO and co-founder of Cohere. Two practitioners doing startups, and AWS. Gentlemen, thank you for opening up this session, this episode. Thanks for coming on. >> Thank you. >> Thank you. >> Thank you. >> So the topic is hype versus reality. So I think we're all in on the reality; the hype is great, but the reality's here. I want to get into it. Generative AI's got all the momentum, it's going mainstream, it's kind of come out from behind the ropes, it's now mainstream. We saw the success of ChatGPT, it opens up everyone's eyes, but there's so much more going on. Let's jump in and get your early perspectives on what people should be talking about right now. What are you guys working on? We'll start with AWS. What's the big focus right now for you guys as you come into this market that's highly active, highly hyped up, but people see value right out of the gate? >> You know, we have been working on generative AI for some time. In fact, last year we released CodeWhisperer, which is about using generative AI for software development, and a number of customers are using it and getting real value out of it. So generative AI is now something that's mainstream that can be used by enterprise users. And we have also been partnering with a number of other companies. So, you know, stability.ai, we've been partnering with them a lot. We want to be partnering with other companies as well.
We see it as three things, you know, first is providing the most efficient infrastructure for generative AI. And that is where, you know, things like Trainium, things like Inferentia, things like SageMaker come in. And then next is the set of models, and then the third is the kind of applications like CodeWhisperer and so on. So, you know, it's early days yet, but clearly there's a lot of amazing capabilities that will come out and something that, you know, our customers are starting to pay a lot of attention to. >> Tom, talk about your company and what your focus is and why the Amazon Web Services relationship's important for you? >> So yeah, we're primarily committed to making incredible open source foundation models, and obviously Stable Diffusion's been our kind of first big model there, which we trained all on AWS. We've been working with them over the last year and a half to develop, obviously, a big cluster, and bring all that compute to training these models at scale, which has been a really successful partnership. And we're excited to take it further this year as we develop the commercial strategy of the business and build out, you know, the ability for enterprise customers to come and get all the value from these models that we think they can get. So we're really excited about the future. We got a hugely exciting pipeline for this year with new modalities and video models and wonderful things, trying to solve images once and for all and get the kind of general value proposition correct for customers. So it's a really exciting time and very honored to be part of it. >> It's great to see some of your customers doing so well out there. Congratulations to your team. Appreciate that. Aidan, let's get into what you guys do. What does Cohere do? What are you excited about right now? >> Yeah, so Cohere builds large language models, which are the backbone of applications like ChatGPT and GPT-3.
We're extremely focused on solving the issues with adoption for enterprise. So it's great that you can make a super flashy demo for consumers, but it takes a lot to actually get it into billion-user products and large global enterprises. So about six months ago, we released our command models, which are some of the best that exist for large language models. And in December, we released our multilingual text understanding models, and that's over a hundred different languages, and it's trained on, you know, authentic data directly from native speakers. And so we're super excited to continue pushing this into enterprise and solving those barriers for adoption, making this transformation a reality. >> Just real quick, while I got you there, on the new products coming out. Where are we in the progress? People see some of the new stuff out there right now. There's so much more headroom. Can you just scope out in your mind what that looks like? Like from a headroom standpoint? Okay, we see ChatGPT. "Oh yeah, it writes my papers for me, does some homework for me." I mean okay, yawn, maybe people say that, (Aidan chuckles) people excited or people are blown away. I mean, it's helped theCUBE out, it helps me, you know, speed up a little bit on my write-ups, but it's not always perfect. >> Yeah, at the moment it's like a writing assistant, right? And it's still super early in the technology's trajectory. I think it's fascinating and it's interesting, but its impact is still really limited. I think in the next year, like within the next eight months, we're going to see some major changes. You've already seen the very first hints of that with stuff like Bing Chat, where you augment these dialogue models with an external knowledge base. So now the models can be kept up to date to the millisecond, right? Because they can search the web and they can see events that happened a millisecond ago.
But that's still limited in the sense that when you ask the question, what can these models actually do? Well, they can just write text back at you. That's the extent of what they can do. And so the real project, the real effort that I think we're all working towards, is actually taking action. So what happens when you give these models the ability to use tools, to use APIs? What can they do when they can actually effect change out in the real world, beyond just streaming text back at the user? I think that's the really exciting piece. >> Okay, so I wanted to tee that up early in the segment 'cause I want to get into the customer applications. We're seeing early adopters come in, using the technology because they have a lot of data, they have a lot of large language model opportunities, and then there's a big fast-follower wave coming behind it. I call that the people who are going to jump in the pool early and get into it. They might not be advanced. Can you guys share what customer applications are being used with large language and vision models today, how they're using them to transform on the early adopter side, and how is that a tell sign of what's to come? >> You know, one of the things we have been seeing, both with the text models that Aidan talked about as well as the vision models that stability.ai does, Tom, is customers are really using it to change the way you interact with information. You know, one example of a customer that we have is someone who's kind of using that to query customer conversations and ask questions like, you know, "What was the customer issue? How did we solve it?" And trying to get those kinds of insights that were previously much harder to get. And then of course software is a big area. You know, generating software, making that, you know, just deploying it in production. Those have been really big areas that we have seen customers start to do.
You know, looking at documentation, like instead of, you know, searching for stuff and so on, you just have an interactive way in which you can just look at the documentation for a product. You know, all of this goes to where we need to take the technology. One of which is, you know, the models have to be there, but they have to work reliably in a production setting at scale, with privacy, with security, and you know, making sure all of this is happening is going to be really key. That is what, you know, we at AWS are looking to do, which is work with partners like Stability and others in the open source and really take all of these and make them available at scale to customers, where they work reliably. >> Tom, Aidan, what's your thoughts on this? Where are customers landing on the first use cases or set of low-hanging-fruit use cases or applications? >> Yeah, so I think the first group of adopters that really found product-market fit were the copywriting companies. So one great example of that is HyperWrite. Another one is Jasper. And so for Cohere, that's the tip of the iceberg, like there's a very long tail of usage from a bunch of different applications. HyperWrite is one of our customers, they help beat writer's block by drafting blog posts, emails, and marketing copy. We also have a global audio streaming platform, which is using us to power a search engine that can comb through podcast transcripts in a bunch of different languages. Then a global apparel brand, which is using us to transform how they interact with their customers through a virtual assistant, and two dozen global news outlets who are using us for news summarization. So really, these large language models can be deployed all over the place, into every single industry sector; language is everywhere. It's hard to think of any company on Earth that doesn't use language. So it's, very, very- >> We're doing it right now. We got the language coming in. >> Exactly.
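Looping back to Aidan's earlier point about models that take action through tools and APIs: the control flow he describes has a simple shape, a loop in which the model either returns an answer or requests a tool call whose result is fed back in. The sketch below uses a canned fake model so it is self-contained; no real LLM, vendor SDK, or API is involved, and every name here is illustrative.

```python
# Stub sketch of a tool-use loop. The "model" is a canned fake that first
# requests a tool call, then answers; real systems put an LLM here.
def fake_model(transcript):
    if not any(turn.startswith("TOOL_RESULT") for turn in transcript):
        return {"action": "tool", "name": "search", "arg": "weather Palo Alto"}
    return {"action": "answer", "text": "It's sunny in Palo Alto."}

def fake_search(query):
    # Stand-in for an external API the model is allowed to call.
    return f"results for: {query}"

TOOLS = {"search": fake_search}

def run_agent(user_msg, max_steps=5):
    transcript = [f"USER: {user_msg}"]
    for _ in range(max_steps):
        step = fake_model(transcript)
        if step["action"] == "answer":
            return step["text"]
        # Execute the requested tool and feed the result back to the model.
        result = TOOLS[step["name"]](step["arg"])
        transcript.append(f"TOOL_RESULT: {result}")
    raise RuntimeError("agent did not converge")

print(run_agent("What's the weather?"))
```

The loop is bounded by `max_steps` so a confused model cannot spin forever, which is the kind of guardrail production tool-use systems need on top of this basic shape.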
>> We'll transcribe this puppy. All right. Tom, on your side, what do you see the- >> Yeah, we're seeing some amazing applications of it, and you know, I guess that's partly been because of the growth in the open source community, and some of these applications have come from there that are then triggering this secondary wave of innovation, which is coming a lot from, you know, controllability and explainability of the model. But we've got companies like, you know, Jasper, which Aidan mentioned, who are using Stable Diffusion for image generation in blog creation, content creation. We've got Lensa, you know, which exploded, and is built on top of Stable Diffusion, for fine tuning so people can bring themselves and their pets and, you know, everything into the models. So we've now got fine-tuned Stable Diffusion at scale, which has democratized, you know, that process, which is really fun to see. Lensa, you know, exploded. You know, I think it was the largest growing app in the App Store at one point. And lots of other examples like NightCafe and Lexica and Playground. So seeing lots of cool applications. >> So many applications, we'll probably be a customer for all you guys. We'll definitely talk after. But the challenges are there for people adopting; they want to get into what you guys see as the challenges that turn into opportunities. How do you see the customers adopting generative AI applications? For example, we have massive amounts of transcripts, timed up to all the videos. I don't even know what to do. Do I just code to my API there? So, everyone has this problem, every vertical has these use cases. What are the challenges for people getting into this and adopting these applications? Is it figuring out what to do first? Or is it a technical setup? Do they stand up stuff, do they just go to Amazon? What do you guys see as the challenges?
>> I think, you know, the first thing is coming up with where you think you're going to reimagine your customer experience by using generative AI. You know, we talked about Ada, and Tom talked about a number of these ones, and you know, you pick up one or two of these to get that robust. And then once you have them, you know, we have models, and we'll have more models on AWS, these large language models that Aidan was talking about. Then you go in and start using these models and testing them out and seeing whether they fit the use case or not. In many situations, like you said, John, our customers want to say, "You know, I know you've trained these models on a lot of publicly available data, but I want to be able to customize it for my use cases. Because, you know, there's some knowledge that I have created and I want to be able to use that." And then in many cases, and I think Aidan mentioned this, you know, you need these models to be up to date. Like you can't have them staying stale. And in those cases, you augment them with a knowledge base, and you know, you have to make sure that these models are not hallucinating. And so you need to be able to do the right kind of responsible AI checks. So, you know, you start with a particular use case, and there are a lot of them. Then, you know, you can come to AWS, and then look at one of the many models we have, and you know, we are going to have more models for other modalities as well. And then, you know, play around with the models. We have a playground kind of thing where you can test these models on some data, and then you will probably want to bring your own data, customize it to your own needs, do some of the testing to make sure that the model is giving the right output, and then just deploy it. And you know, we have a lot of tools. >> Yeah. >> To make this easy for our customers. >> How should people think about large language models?
Because do they think about it as something that they tap into with their IP or their data? Or is it a large language model that they apply into their system? Is the interface that way? What's the interaction look like? >> In many situations, you can use these models out of the box. But typically, in most other situations, you will want to customize them with your own data or with your own expectations. These models are exposed through APIs. So the typical use case would be, you know, you're using these APIs a little bit for testing and getting familiar, and then there will be an API that will allow you to train this model further on your data. So you use that API, you know, make sure you augment it with the knowledge base. So then you use those APIs to customize the model and then just deploy it in an application. You know, like Tom was mentioning, a number of companies are using these models. So once you have it, then you know, you again use an endpoint API and use it in an application. >> All right, I love the example. I want to ask Tom and Aidan, because like most of my experience with Amazon Web Services in 2007, I would stand up an EC2 instance, put my code on there, play around, and if it didn't work out, I'd shut it down. Is that a similar dynamic we're going to see with the machine learning, where developers just kind of log in and stand up infrastructure and play around and then have a cloud-like experience? >> So I can go first. So I mean, we obviously, with AWS, working really closely with the SageMaker team, who have a fantastic platform there for ML training and inference. And you know, going back to your point earlier, you know, where the data is, is hugely important for companies. Many companies bringing their models to their data in AWS, on-premise for them, is hugely important. Having the models be, you know, open source makes them explainable and transparent to the adopters of those models.
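The lifecycle Bratin walks through above, try the hosted model, customize it on your own data, then call the resulting endpoint from an application, is the same three calls whatever the vendor. The sketch below uses an entirely hypothetical client class; none of these names correspond to a real AWS, Cohere, or Stability SDK.

```python
# Hypothetical sketch of the try -> fine-tune -> deploy lifecycle Bratin
# describes. FakeModelService is a stand-in; no real SDK is shown here.
class FakeModelService:
    def __init__(self):
        self.models = {"base-llm": "base"}

    def generate(self, model_id, prompt):
        # Stand-in for a hosted inference API call.
        return f"[{self.models[model_id]}] completion for: {prompt}"

    def fine_tune(self, base_id, training_data):
        # Stand-in for a customization API: returns a new model id that
        # your application calls from then on.
        new_id = f"{base_id}-ft-{len(training_data)}"
        self.models[new_id] = "custom"
        return new_id

svc = FakeModelService()
# 1. Test the out-of-the-box model on your use case.
draft = svc.generate("base-llm", "Summarize this support ticket...")
# 2. Customize with your own data.
my_model = svc.fine_tune("base-llm", training_data=["example 1", "example 2"])
# 3. Call the customized model from your application.
answer = svc.generate(my_model, "Summarize this support ticket...")
```

The key property, which real platforms share, is that steps 1 and 3 use the same inference call; customization just swaps in a new model id, so the application code barely changes.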
So, you know, we are really excited to work with the SageMaker team over the coming year to bring companies to that platform and make the most of our models. >> Aidan, what's your take on developers? Do they just need to have a team in place, if they want to interface with you guys? Let's say, can they start learning? What do they got to do to set up? >> Yeah, so I think for Cohere, our product makes it much, much easier for people to get started and start building; it solves a lot of the productionization problems. But of course with SageMaker, like Tom was saying, I think that lowers the barrier even further, because it solves problems like data privacy. So I want to underline what Bratin was saying earlier: when you're fine tuning or when you're using these models, you don't want your data being incorporated into someone else's model. You don't want it being used for training elsewhere. And so the ability to solve, for enterprises, that data privacy and that security guarantee has been hugely important for Cohere, and that's very easy to do through SageMaker. >> Yeah. >> But the barriers for using this technology are coming down super quickly. And so for developers, it's just becoming completely intuitive. I love this, there's this quote from Andrej Karpathy. He was saying like, "It really wasn't on my 2022 list of things to happen that English would become, you know, the most popular programming language." And so the barrier is coming down- >> Yeah. >> Super quickly and it's exciting to see. >> It's going to be awesome for all the companies here, and then we'll do more; we're probably going to see an explosion of startups, already seeing that, the maps, ecosystem maps, the landscape maps are happening. So this is happening and I'm convinced it's not yesterday's chat bot, it's not yesterday's AIOps. It's a whole other ballgame. So I have to ask you guys the final question before we kick off the companies showcasing here.
How do you guys gauge success of generative AI applications? Is there a lens to look through and say, okay, how do I see success? It could be just getting a win, or is it a bigger picture? Bratin, we'll start with you. How do you gauge success for generative AI? >> You know, ultimately it's about bringing business value to our customers, and making sure that those customers are able to reimagine their experiences by using generative AI. Now the way to get there is, of course, to deploy those models in a safe, effective manner, and ensuring that all of the robustness and the security guarantees and the privacy guarantees are all there. And we want to make sure that this transitions from something that's great demos to actual at-scale products, which means making them work reliably all of the time, not just some of the time. >> Tom, what's your gauge for success? >> Look, I think we're seeing a completely new form of ways to interact with data, to make data intelligent, and to directly bring new revenue streams into businesses. So if businesses can use our models to leverage that and generate completely new revenue streams and ultimately bring incredible new value to their customers, then that's fantastic. And we hope we can power that revolution. >> Aidan, what's your take? >> Yeah, reiterating Bratin and Tom's point, I think that value in the enterprise and value in the market is, like, a huge, you know, it's the goal that we're striving towards. I also think about, you know, the value to consumers and actual users, and the transformation of the surface area of technology to create experiences like ChatGPT that are magical. It's the first time in human history we've been able to talk to something compelling that's not a human. I think that in itself is just extraordinary and so exciting to see. >> It really brings up a whole other category of markets. B2B, B2C, it's B2D, business to developer.
Because I think this is kind of the big trend: the consumers have to win. The developers coding the apps, it's a whole other sea change. Reminds me of how everyone used the "Moneyball" movie as an example during the big data wave, you know, the value of data. There's a scene in "Moneyball" at the end, where Billy Beane's getting the offer from the Red Sox, and the Red Sox owner says, "If every team's not rebuilding their teams based upon your model, they'll be dinosaurs." I think that's the same with AI here. Every company will need to think about their business model and how they operate with AI. So it'll be a great run. >> Completely agree. >> It'll be a great run. >> Yeah. >> Aidan, Tom, thank you so much for sharing about your experiences at your companies, and congratulations on your success; it's just the beginning. And Bratin, thanks for coming on representing AWS. And thank you, we appreciate what you do. Thank you. >> Thank you, John. Thank you, Aidan. >> Thank you John. >> Thanks so much. >> Okay, let's kick off season three, episode one. I'm John Furrier, your host. Thanks for watching. (light airy music)
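As a technical aside, the workflow Bratin describes earlier in the conversation (use the hosted model API out of the box, train it further on your own data through a fine-tuning API, then deploy the customized model behind an endpoint in your application) can be sketched in miniature. The `ModelAPI` class below is a hypothetical stand-in for illustration only, not a real AWS, Cohere, or OpenAI SDK:

```python
# A minimal sketch of the three-phase pattern described in the panel:
# 1) out-of-the-box inference against a hosted model,
# 2) fine-tuning the model on your own data,
# 3) deploying the customized model as an application endpoint.
# ModelAPI is a toy stand-in, not a real vendor client.

class ModelAPI:
    """Stand-in for a hosted large-language-model API."""

    def __init__(self):
        self.custom_data = []  # examples added via fine-tuning

    def generate(self, prompt):
        # Phase 1: out-of-the-box inference; behavior changes once
        # the model has been customized with the user's own data.
        if self.custom_data:
            return f"[custom model] answer to: {prompt}"
        return f"[base model] answer to: {prompt}"

    def fine_tune(self, examples):
        # Phase 2: augment the knowledge base with your own data.
        self.custom_data.extend(examples)

    def deploy(self):
        # Phase 3: expose the customized model behind an endpoint.
        return lambda prompt: self.generate(prompt)


api = ModelAPI()
print(api.generate("What is our refund policy?"))  # base-model behavior
api.fine_tune([("What is our refund policy?", "30 days")])
endpoint = api.deploy()
print(endpoint("What is our refund policy?"))      # customized behavior
```

The point of the sketch is only the shape of the integration: test against the API, customize with your data, then call the resulting endpoint from your application.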
Gabriela de Queiroz, Microsoft | WiDS 2023
(upbeat music) >> Welcome back to theCUBE's coverage of Women in Data Science 2023, live from Stanford University. This is Lisa Martin. My co-host is Tracy Yuan. We're excited to be having great conversations all day, but you know that, 'cause you've been watching. We've been interviewing some very inspiring women, and some men as well, talking about all of the amazing applications of data science. You're not going to want to miss this next conversation. Our guest is Gabriela de Queiroz, Principal Cloud Advocate Manager at Microsoft. Welcome, Gabriela. We're excited to have you. >> Thank you very much. I'm so excited to be talking to you. >> Yeah, you're on theCUBE. >> Yeah, finally. (Lisa laughing) Like a dream come true. (laughs) >> I know and we love that. We're so thrilled to have you. So you have a ton of experience in the data space. I was doing some research on you. You've worked in software, financial, advertisement, health. Talk to us a little bit about you. What's your background in? >> So I was trained in statistics. So I'm a statistician, and then I worked in epidemiology. I worked with air pollution and public health. So I was a researcher before moving into the industry. So as I was talking today, the weekly paths, it's exactly who I am. I went back and forth and back and forth and stopped and tried something else until I figured out that I want to do data science and that I want to do different things, because with data science we can... The beauty of data science is that you can move across domains. So I worked in healthcare, financial, and then different technology companies. >> Well the nice thing, one of the exciting things about data science that I geek out about, and Tracy knows 'cause we've been talking about this all day, is just all the different, to your point, diverse, pun intended, applications of data science. You know, this morning we had the VP of data science from Meta as a keynote.
She came to theCUBE talking and really kind of explaining, from a content perspective, from a monetization perspective, and of course so many people in the world are users of Facebook. It makes it tangible. But we also heard today conversations about the applications of data science in police violence, in climate change. We're in California, we're expecting a massive rainstorm, and we don't know what to do when it rains or snows. But climate change is real. Everyone's talking about it, and there's data science at its foundation. That's one of the things that I love. But you also have a lot of experience building diverse teams. Talk a little bit about that. You've created some very sophisticated data science solutions. Talk about your recommendation to others to build diverse teams. What's in it for them? And maybe share a data science project or two that you really found inspirational. >> Yeah, absolutely. So I do love building teams. Every time I'm given the task of building teams, I feel like the luckiest person in the world, because you have the option to pick, like, different backgrounds and all the diverse set of, like, people that you can find. I don't think it's easy; like people say, yeah, it's very hard. You have to be intentional. You have to go from the very first part, when you are writing the job description, through the interview process. So you have to be very intentional in every step. And you have to think it through when you are doing that. And I love, like, my last team, we had 10 people and we were so diverse. Like just talking about languages: we had 15 languages inside a team. So how beautiful it is. Like all different backgrounds, like myself as a statistician, but we had people from engineering backgrounds, biology, languages, and so on. So yeah, every time you're thinking about building a team, if you want your team to be diverse, you need to be intentional.
>> I'm so glad you brought up that intention point, because the fundamental requirement really is to build it with intention. >> Exactly, and I love to hear, like, how there are different languages. So, like, with different backgrounds, I'm assuming everybody just zigzags their way into the team, and now you're all women in data science, and I think that's so precious. >> Exactly. And not only women, right. >> Tracy: Not only women, you're right. >> The team was diverse not only in terms of, like, gender, but in background, ethnicity, spoken languages, and the languages that they use to program. Like, as I mentioned, not everybody did statistics in school or computer science. And one of my best teams was when we had this combination, also, like, things that I'm good at, the other person is not as good at, and we have this knowledge sharing all the time. Every day I would feel like I'm learning something. In small talk, or if I was reviewing something, there was always something new, because of, like, the richness of the diverse set of people that were in the team. >> Well what you've done is so impressive, because not only have you been intentional with it, but you sound like the hallmark of a great leader: someone who hires and builds teams to fill gaps. They don't have to know less than I do for me to be the leader. They have to have different skills, different areas of expertise. That is really, honestly Gabriela, that's the hallmark of a great leader. And that's not easy to come by. So tell me, who were some of your mentors and sponsors along the way that maybe influenced you in that direction? Or is that just who you are? >> That's a great question. And I joke that I want to be the role model that I never had, right. So growing up, I didn't have anyone that I could see, other than my mom probably or my sister. But there was no one that I could see and think, I want to become that person one day.
And once I was tracing my path, I started to see people looking at me and saying, like, you inspire me so much, and I'm like, oh wow, this is amazing, and I want to do this over and over and over again. So I want to be that person to inspire others. And no matter what I'll be, like a VP, CEO, whoever, you know, I want to keep inspiring people, because that's so valuable. >> Lisa: Oh, that's huge. >> And I feel like when we grow professionally and then go to the next level, sometimes we lose that, you know, thing that's essential. And I think also, like, it's part of who I am; as I was building through all my experiences as I was going, I became, as I mentioned, a unique person, and I think we all are unique somehow. >> You're a rockstar. Isn't she a rockstar? >> You're dropping quotes out. >> I'm loving this. I'm like, I've inspired Gabriela. (Gabriela laughing) >> Oh my God. But yeah, 'cause we were asking our other guests the same question, like, who are your role models? And then we're talking about how, like, it's very important for women to see that there is representation, that there is someone they look up to and want to be. And so that, like, motivates them to stay in this field, and to start in this field to begin with. So yeah, I think, like, you are definitely filling a void for all these women who dream to be in data science. And I think that's just amazing. >> And you're a founder too. In 2012, you founded R Ladies. Talk a little bit about that. It's present in more than 200 cities in 55 plus countries. Talk about R Ladies and maybe the catalyst to launch it. >> Yes, so you always start, so I'm from Brazil, I always talk about this because it's such, again, I grew up over there. So I was there my whole life, and then I moved here, to Silicon Valley. And when I moved to San Francisco, like, the doors opened. So many things happening in the city. That was back in 2012. Data science was exploding.
And I found out about Meetup.com, a website that you can join and go to all these events. And I was going to these events, and I joke that it was kind of like going to Disneyland, where you don't know if you should go this direction or the other direction. >> Yeah, yeah. >> And I was like, should I go and learn about data visualization? Should I go and learn about SQL, or should I go and learn about Hadoop, right? So I would go every day to those meetups. And I was a student back then, so you know, the budget was very restricted as a student. So we don't have much to spend. And then they would serve dinner and you would learn for free. And then I got to a point where I was like, hey, they are doing all of this as volunteers. Like, they are running these meetups and events for free. And I felt like it's a cycle. I need to do something, right. I'm taking all this in. I'm having this huge opportunity to be here. I want to give back. So that's how everything started. I was like, no, I have to think about something. I need to think about something where I can give back. And I was using R back then, and I'm like, how about I do something with R. I love R, I'm so passionate about R, what about if I create a community around R? But not a regular community, because by going to these events, I felt that as a Latina and as a woman, I was always in the corner, and I was not being able to participate and to, you know, be myself and to network and ask questions. I would be in the corner. So I said to myself, what about if I do something where everybody feels included, where everybody can participate, can share, can ask questions without judgment? So that's how R Ladies all came together. >> That's awesome. >> Talk about intention, like, you had that goal in mind, but yeah, I wanted to dive a little bit into R.
So could you please talk more about where the passion for R came from, and how the special connection between you and the R language was born? How did that come about? >> It was not love at first sight. >> No. >> Not at all. Not at all. Because that was back in Brazil. So all the documentation was in English, all the tutorials. We had very few tutorials; there were like two tutorials, and the other documentation was in English. It was not like nowadays, when we have so many tutorials and courses. So it was hard for me, as someone that didn't know much English, to go through the language, and then learning to program was not an easy task. But then as I was going through the language and learning and reading books and finding the people behind the language, I don't know how, I fell in love. And then when I came to San Francisco, I saw some of, like, the main contributors speaking in person, and I'm like, wow, they are like humans. I don't know, it was like, I have no idea why I had this love. But I think the people and the community were the thing that kept me with the R language. >> Yeah, the community factor is so important. And at WiDS it's so palpable. I mean, I literally walk in the door, every WiDS I've done, I think I've been doing them for theCUBE since 2017. theCUBE has been here since the beginning in 2015 with our co-founders. But you walk in, you get this sense of belonging. And this sense of, I can do anything, why not? Why not me? Look at her up there, and now look at you, speaking in the technical talk today on theCUBE. So inspiring. One of the things that I always think is, you can't be what you can't see. We need to be able to see more people that look like you and sound like you, and like me and like you as well. And WiDS gives us that opportunity, which is fantastic, but it's also helping to move the needle, really. And I was looking at some of the Anitab.org stats just yesterday about 2022.
And they're showing, you know, the percentage of females in technical roles has been hovering around 25% for a while. It's a little higher now; I think it's 27.6% according to Anitab. We're seeing more women hired in roles. But one of the challenges, and I would love to get your advice on this for those that might be in this situation, is attrition: women who are leaving roles. What would your advice be to a woman who might be trying to navigate family and work and the career ladder, to stay in that role and keep pushing forward? >> I'll go back to the community. If you don't have a community around you, it's so hard to navigate. >> That's a great point. >> You are lonely. There is no one that you can bounce ideas off, that you can share what you are feeling with, or, like, that you can learn from as well. So sometimes you feel like you are the only person that is going through that problem, or, like, you maybe have a family or you are planning to have a family and you have to make a decision. But you've never seen anyone going through this. So when you have a community, you see people like you, right. So that's what we were saying about having different people, and people like you, so they can share as well. And you feel like, oh yeah, so they went through this, they succeeded. I can also go through this and succeed. So I think the attrition problem is still a big problem. And I'm sure it will be worse now with everything that is happening in tech with layoffs. >> Yes, and the great resignation. >> Yeah. >> We are going back, you know, a few steps; like, a lot of the advancements that we made, I feel like we are going back, unfortunately. But I always tell this: make sure that you have a community. Make sure that you have a mentor. Make sure that you have someone, or some people, not only one mentor, different mentors, that can support you through this trajectory. Because it's not easy. But there are a lot of us out there. >> There really are. And that's a great point.
I love everything about the community. It's all about that network effect and feeling like you belong- >> That's all WiDS is about. >> Yeah. >> Yes. Absolutely. >> Like coming over here, it's like seeing old friends again. It's like, I'm so glad that I'm coming, because I see all my old friends that I only see, like, maybe once a year. >> Tracy: Reunion. >> Yeah, exactly. And I feel like that, our tank gets, you know- >> Lisa: Replenished. >> Exactly. For the rest of the year. >> Yes. >> Oh, that's precious. >> I love that. >> I agree with that. I think one of the things, when I say you can't be what you can't see: I think, well, how many females in technology would I be able to recognize? And of course you can be a female in technology working in the healthcare sector or working in finance or manufacturing, but, you know, we need to have more that we can see and identify. And one of the things that I recently found out, I was telling Tracy this earlier, that I geeked out about, was finding out that the CTO of OpenAI, ChatGPT, is a female. I'm like, (gasps) why aren't we talking about this more? She was profiled in Fast Company. I've seen a few pieces on her, Mira Murati. But we're hearing so much about ChatJTP... ChatGPT, I always get that wrong, being likened to the launch of the iPhone, which revolutionized mobile and connectivity. And here we have a female in the technical role. Let's put her on a pedestal, because that is hugely inspiring. >> Exactly, like, let's bring everybody to the front. >> Yes. >> Right. >> And let's have them talk to us, because, like, you didn't know. I probably didn't know about this, right. You didn't know. Like, we don't know about this. It's kind of like we are hidden. We need to give them the spotlight. Give every woman the spotlight, so they can keep inspiring the new generation. >> Or Susan Wojcicki, who ran, how long did she run YouTube?
All the YouTube influencers, who are influential for whatever they're doing on YouTube and different social platforms, probably don't realize: do you realize there was a female at the helm for a long time, who turned it into what it is today? That's outstanding. Why aren't we talking about this more? >> How about Megan Smith, who was the first female CTO in the Obama administration. >> That's right. I knew it had to do with Obama. Couldn't remember. Yes. Let's find more pedestals. But organizations like WiDS, your involvement as a speaker, showing more people you can be this because you can see it- >> Yeah, exactly. >> is the right direction that will help hopefully bring us back to some of the pre-pandemic levels, and keep moving forward, because there's so much potential with data science that can impact everyone's lives. I always think, you know, we have this expectation that we have our mobile phone and we can get whatever we want, wherever we are in the world, whatever time of day it is. And that's all data driven. The regular average person that's not in tech thinks about data as, well, I'm paying for it, what's with all these data charges? But it's powering the world. It's powering those experiences that we all want as consumers or in our business lives; we expect to be able to do a transaction, whether it's something in a CRM system or an Uber transaction, and have the app respond, maybe even know me a little bit better than I know myself. And that's all data. So I think we're just at the precipice of the massive impact that data science will make in our lives. And luckily we have leaders like you who can help navigate us along this path. >> Thank you. >> Last question for you: what advice would you give to those in the audience who might be nervous, or maybe lack a little bit of confidence, to go, I really like data science, or I really like engineering, but I don't see a lot of me out there. What would you say to them?
>> Especially for people who are coming from, like, a non-linear track and getting onto that track. >> Yeah, I would say keep going. Keep going. I don't think it's easy. It's not easy. But keep going, because the more you go, the more, again, you advance, and there are opportunities out there. Sometimes it takes a little bit, but just keep going. Keep going and following your dreams, and you get there, right. So again, data science is such a broad field that it doesn't require you to come from a specific background. And I think the beauty of data science is exactly this: the combination. The most successful data science teams are the teams that have all these different backgrounds. So if you think that we as data scientists all started programming when we were nine, that's not true, right. You can be 30, 40, shifting careers, starting to program right now. It doesn't matter. Like, you get there no matter how old you are. And no matter what your background is. >> There's no limit. >> There are no limits. >> I love that, Gabriela, thank you so much for inspiring. I know you inspired me. I'm pretty sure you probably inspired Tracy with your story. And sometimes, like what you just said, you have to be your own mentor, and that's okay. Because eventually you're going to turn into a mentor for many, many others, and it sounds like you're already paving that path, and we so appreciate it. You are now officially a CUBE alumni. >> Yes. Thank you. >> Yay. We've loved having you. Thank you so much for your time. >> Thank you. Thank you. >> For our guest and for Tracy Yuan, this is Lisa Martin. We are live at WiDS 23, the eighth annual Women in Data Science Conference at Stanford. Stick around. Our next guest joins us in just a few minutes. (upbeat music)
Shir Meir Lador, Intuit | WiDS 2023
(gentle upbeat music) >> Hey, friends of theCUBE. It's Lisa Martin, live at Stanford University, covering the Eighth Annual Women In Data Science. But you've been a CUBE fan for a long time. So you know that we've been here since the beginning of WiDS, which is 2015. We always love to come and cover this event. We learn great things about data science, about women leaders, underrepresented minorities. And this year we have a special component. We've got two grad students from Stanford's Master's program in Data Journalism joining. One of them is here with me, Hannah Freitag, my co-host. Great to have you. And we are pleased to welcome, from Intuit for the first time, Shir Meir Lador, Data Science Group Manager. Shir, it's great to have you. Thank you for joining us. >> Thank you for having me. >> And I was just talking with my boss at theCUBE, who informed me that you're in great company. Intuit's Chief Technology Officer, Marianna Tessel, is an alumna of theCUBE. She was on at our Supercloud event in January. So welcome back, Intuit. >> Thank you very much. We're happy to be with you. >> Tell us a little bit about what you're doing. You're a data science group manager, as I mentioned, but you've also done some cool things I want to share with the audience. You're the co-founder of the PyData Tel Aviv Meetups, the co-host of the Unsupervised podcast about data science in Israel. You give talks about machine learning, about data science. Tell us a little bit about your background. Were you always interested in STEM studies from the time you were small? >> So I was always interested in mathematics when I was small. I went to this special program for youth going to university. So I did my tests in mathematics earlier and studied some university courses. And that's when I understood I want to do something in that field.
And then when I got to university, I went to electrical engineering, where I found out about algorithms and how interesting it is to be able to find solutions to problems, to difficult problems, with math. And this is how I found my way into machine learning. >> Very cool. There's so much; we love talking about machine learning and AI on theCUBE. There's so much potential. Of course, we have to have data. One of the things that I love about WiDS, and Hannah and I and our co-host Tracy have been talking about this all day, is the impact of data in everyone's life. If you break it down, I was at Mobile World Congress last week, all about connectivity and telecom, and of course we have this expectation that we're going to be connected 24/7, from wherever we are in the world, and we can do whatever we want. I can do an Uber transaction, I can watch Netflix, I can do a bank transaction. It's all powered by data. And some of the great applications of data science are what it's being applied to: things like climate change or police violence or health inequities. Talk about some of the data science projects that you're working on at Intuit. I'm an Intuit user myself, but talk to me about some of those things. Give the audience really a feel for what you're doing. >> So if you are an Intuit product user, you probably use TurboTax. >> I do. >> In the past. So for those who are not familiar, TurboTax helps customers submit their taxes. Basically my group is in charge of getting all the information automatically from your documents, the documents that you upload to TurboTax. We extract that information to accelerate your tax submission, to make it less work for our customers. So- >> Thank you. >> Yeah, and this is why I'm so proud to be working on this team, because our focus is really to help our customers, to simplify all the, you know, financial heavy lifting with taxes and also with small businesses.
We also do a lot of work in extracting information from small business documents like bills, receipts, different bank statements. Yeah, so this is really exciting for me, the opportunity to apply data science and machine learning to solutions that actually help people. Yeah >> Yeah, in the past years there have been more and more digital products emerging that need some sort of data security. How has your team developed in the past years, with more and more products or companies offering digital services? >> Yeah, so can you clarify the question again? Sorry. >> Yeah, have you seen that you have more customers? Like, has your team expanded in the past years with more digital companies starting that need kind of data security? >> Well, definitely. I think, you know, since I joined Intuit, I joined like five and a half years ago, back when I was in Tel Aviv. I recently moved to the Bay Area. So when I joined, there were like dozens of data scientists and machine learning engineers at Intuit. And now there are a few hundred. So we've definitely grown over the years, and there are so many new places we can apply machine learning to help our customers. So this is amazing, so much we can do with machine learning to get more money in the pockets of our customers and make them do less work. >> I like both of those. More money in my pocket and less work. That's awesome. >> Exactly. >> So keep going, Intuit. But one of the things that is so cool is just the abstraction of the complexity that Intuit's doing. I upload documents or it scans my receipts. I was just in Barcelona last week, all these receipts and converting euros to dollars, and it takes that complexity away from the end user, who doesn't know all that's going on in the background, but you're making people's lives simpler. Unfortunately, we all have to pay taxes, most of us should. And of course we're in tax season right now.
And so it's really cool what you're doing with ML and data science to make fundamental processes in people's lives easier and just a little bit less complicated. >> Definitely. And I think that's what's also really amazing about Intuit, is how it combines the human in the loop as well as AI. Because some tax situations are maybe very complicated to do yourself. And then there's an option to work with an expert online who goes on a video with you and helps you do your taxes. And the expert's work is also accelerated by AI, because we build tools for those experts to do the work more efficiently. >> And that's what it's all about: you know, using data to be more efficient, to be faster, to be smarter, but also to make complicated processes in our daily lives, in our business lives, just a little bit easier. One of the things I've been geeking out about recently is ChatGPT. I was using it yesterday. I was telling everyone, I was asking it what's hot in data science, and I didn't know, would it know what hot is? And it did, it gave me trends. But one of the things that I was so, and Hannah knows I've been telling this all day, I was so excited to learn over the weekend that the CTO of OpenAI is a female. I didn't know that. And I thought, why are we not putting her on a pedestal? Because people are likening ChatGPT to the launch of the iPhone. I mean, revolutionary. And here we have what I think is exciting for all of us females, whether you're in tech or not, is another role model. Because really, ultimately, what WiDS is great at doing is showcasing women in technical roles. Because I always say you can't be what you can't see. We need to be able to see more role models, female role models, underrepresented minorities, of course men, because a lot of my sponsors and mentors are men, but we need more women that we can look up to and see, ah, she's doing this, why can't I? Talk to me about how you stay the course in data science.
What excites you about the potential, the opportunities, based on what you've already accomplished? What inspires you to continue and be one of those females that we say, oh my God, I could be like Shir? >> I think that what inspires me the most is the endless opportunities that we have. I think we haven't even started tapping into everything that we can do with generative AI, for example. There's so much that can be done to further help, you know, people make more money and do less work, because there's still so much work that we do that we don't need to. You know, this is with Intuit, but there are also so many other use cases, like I heard today, you know, with the talk about the police. So that was really exciting, how you can apply machine learning and data to actually help people, to help people that have been through wrongful things. So I was really moved by that. And I'm also really excited about all the medical applications that we can have with data. >> Yeah, yeah. It's true that data science is so diverse in terms of what fields it can cover, but it's equally important to have diverse teams and have, like, equity and inclusion in your teams. Where is Intuit at in promoting women and non-binary minorities in your teams to progress data science? >> Yeah, so I have so much to say on this. >> Good. >> But in my work in Tel Aviv, I had the opportunity to start the Intuit Women in Data Science branch in Tel Aviv. So that's why I'm super excited to be here today for that, because basically this is the original conference, but as you know, there are branches all over the world, and I got the opportunity to lead the Tel Aviv branch in Israel since 2018. And this year, it's going to be next week, it's going to be the sixth conference. And every year our number of submissions to give a talk at the conference has doubled. >> Nice. >> We started with 20 submissions, then 50, then 100.
This year we have over 200 submissions from females to give talks at the conference. >> Ah, that's fantastic. >> And beyond the fact that there's so much traction, I also feel the great impact it has on the community in Israel, because one of the reasons we started WiDS was that when I was going to conferences, I was seeing so few women on stage in all the technical conferences. You know, kind of the reason why, I guess, you know, Margaret and team started the WiDS conference. So I saw the same thing in Israel, and I was always frustrated. I was organizing PyData Meetups, as you mentioned, and I was always having such a hard time getting female speakers to talk. I was trying to role model, but that's not enough, you know. We need more. So once we started WiDS and people saw, you know, so many examples on the stage, and also, you know, females got the opportunity to talk in a place for that, then it also started spreading, and you can see more and more female speakers across other conferences which are not women in data science. So I think just the fact that Intuit started this conference back in Israel and also in Bangalore, and also the support Intuit gives to WiDS at Stanford here, it shows how much WiDS values are aligned with our values. Yeah, and I think that, to show for that, we have over 35% females in the data science and machine learning engineering roles, which is pretty amazing, I think, compared to the industry. >> Way above average. Yeah, absolutely. I was just, we've been talking about some of the AnitaB.org stats from 2022 showing that, 'cause usually if we look at the industry, to your point, over the last, I don't know, probably five, 10 years, we're seeing the number of female technologists around like a quarter, 25% or so. 2022 data from AnitaB.org showed that that number is now 27.6%. So it's very slowly- >> It's very slowly increasing. >> Going in the right direction. >> Too slow.
>> And that representation of women technologists increased at every level, except intern, which I thought was really interesting. And I wonder, is there a covid relation there? >> I don't know. >> What do we need to do to start opening up the top of the pipeline, the funnel, to go downstream to find kids like you, when you were younger and always interested in engineering and things like that? But the good news is that in hiring we've seen improvements, and it sounds like Intuit is way ahead of the curve there with 35% women in data science or technical roles. And what's always nice and refreshing, and we've talked, Hannah, about this too, is seeing companies actually put action into initiatives. It's one thing for a company to say we're going to have, you know, 50% females in our organization by 2030. It's a whole other ball game to actually create a strategy, execute on it, and share progress. So kudos to Intuit for what it's doing, because more companies need to adopt that same sort of philosophy. And that's really cultural. >> Yeah. >> At an organization, and culture can be hard to change, but it sounds like you guys kind of have it dialed in. >> I think we definitely do. That's why I really like working at Intuit. And I think that a lot of it is with the role modeling, diversity and inclusion, and by having women leaders. When you see a woman in a leadership position, as a woman it makes you want to come work at this place. And as evidence, when I built the team I started in Israel at Intuit, I had over 50% women on my team. >> Nice. >> Yeah, because when you have a woman on the interviewers panel, it's much easier, it's more inclusive. That's why we always try to have at least, you know, one woman and also other minorities represented on our interview panels.
Yeah, and I think that in general it's very important as a leader to kind of know your own biases and try to have defined standards and rubrics in how you evaluate people, to avoid those biases. So all of that inclusiveness and leadership really helps to get more diversity in your teams. >> It's critical. That thought diversity is so critical, especially if we talk about AI. And we're almost out of time, I just wanted to bring up, you brought up a great point about diversity and equity. With respect to data science and AI, we know in AI there are biases in data. We need to have more inclusivity, more representation, to help start shifting that so the biases start to be dialed down. And I think a conference like WiDS, and it sounds like someone like you, and what you've already done so far in the work that you're doing, having so many females raise their hands to want to do talks at events, is a good situation. It's a good scenario, and hopefully it will continue to move the needle on the percentage of females in technical roles. So we thank you, Shir, for your time, sharing with us your story, what you're doing, and how Intuit and WiDS are working together. It sounds like there's great alignment there, and I think we're at the tip of the iceberg with what we can do with data science and inclusion and equity. So we appreciate all of your insights and your time. >> Thank you very much. >> All right. >> I enjoyed it very, very much. >> Good. We hope, we aim to please. Thank you to our guest and to Hannah Freitag. This is Lisa Martin coming to you live from Stanford University. This is our coverage of the eighth Annual Women in Data Science Conference. Stick around, our next guest will be here in just a minute.
Rhonda Crate, Boeing | WiDS 2023
(gentle music) >> Hey! Welcome back to theCUBE's coverage of WiDS 2023, the eighth Annual Women In Data Science Conference. I'm your host, Lisa Martin. We are at Stanford University, as we are every year, having some wonderful conversations with some very inspiring women and men in data science and technical roles. I'm very pleased to introduce Tracy Zhang, my co-host, who is in the Data Journalism program at Stanford. And Tracy and I are pleased to welcome our next guest, Rhonda Crate, Principal Data Scientist at Boeing. Great to have you on the program, Rhonda. >> Tracy: Welcome. >> Hey, thanks for having me. >> Were you always interested in data science or STEM from the time you were young? >> No, actually. I was always interested in archeology and anthropology. >> That's right, we were talking about that, anthropology. Interesting. >> We saw the anthropology background, not just a bachelor's degree, but also a master's degree in anthropology. >> So you were committed for a while. >> I was, I was. I actually started college as a fine arts major, but I always wanted to be an archeologist. So at the last minute, 11 credits in, I switched to anthropology. And then when I did my master's, I focused a little bit more on quantitative research methods, and then I got my stats degree. >> Interesting. Talk about some of the data science projects that you're working on. When I think of Boeing, I always think of aircraft. But you are doing a lot of really cool things in IT, data analytics. Talk about some of those intriguing data science projects that you're working on. >> Yeah. So when I first started at Boeing, I worked in information technology and data analytics. And Boeing, at the time, had cored up data science in there. And so we worked as a function across the enterprise, working on anything from shared services to user experience in IT products, to airplane programs. So, it has a wide range.
I worked on environment, health and safety projects for a long time as well. So looking at ergonomics and how people actually put parts onto airplanes, along with things like scheduling and production lines, part failures, software testing. Yeah, there's a wide spectrum of things. >> But I think that's so fantastic. We've been talking, Tracy, today about just what we often see at WiDS, which is this breadth of diversity in people's backgrounds. You talked about anthropology, archeology, you're doing data science. But also all of the different opportunities that you've had at Boeing. To see so many facets of that organization. I always think that breadth of thought diversity can be hugely impactful. >> Yeah. So I will say my anthropology degree has actually worked to my benefit. I'm a huge proponent of integrating liberal arts and sciences together. And it actually helps me. I'm in the Technical Fellowship program at Boeing, so we have different career paths. So you can go into management, you can be a regular employee, or you can go into the Fellowship program. So right now I'm an Associate Technical Fellow. And part of how I got into the Fellowship program was that diversity in my background, what made me different, what made me stand out on projects. Even applying a human aspect to things like ergonomics, as silly as that sounds, but how does a person actually interact in the space, along with, here are the actual measurements coming off of whatever system it is that you're working on. So, I think there's a lot of opportunities, especially in safety as well, which is a big initiative for Boeing right now, as you can imagine. >> Tracy: Yeah, definitely. >> I can't go into too many specifics. >> No, 'cause we were like, I think a theme for today that we kind of brought up in all of our talks is how data is about people, how data is about how people understand the world, and how data can make an impact on people's lives.
So yeah, I think it's great that you brought this up, and I'm very happy that your anthropology background can tap into that and help in your day-to-day data work too. >> Yeah. And currently, right now, I actually switched over to Strategic Workforce Planning. So it's more how we understand our workforce, how we work towards retaining the talent, how we get the right talent in our space, and making sure overall that we offer a culture and work environment that is great for our employees to come to. >> That culture is so important. You know, I was looking at some anitab.org stats from 2022, and you know, we always talk about the number of women in technical roles. For a long time it's been hovering around that 25% range. The data from anitab.org showed from '22, it's now 27.6%. So, a little increase. But one of the biggest challenges still, and Tracy and I and our other co-host, Hannah, have been talking about this, is attrition. Attrition more than doubled last year. What are some of the things that Boeing is doing on the retention side? Because that is so important, especially as, you know, there's this pipeline leakage of women leaving technical roles. Tell us about what Boeing's doing, how they're invested. >> Yeah, sure. We actually have a publicly available Global Diversity Report that anybody can go and look at and see our statistics for our organization. Right now, off the top of my head, I think we're hovering at about 24% in the US for women in our company. It has been a male majority company for many years. We've invested heavily in increasing the number of women in roles. One interesting thing that came out this year is that even with the great resignation and those types of things, the attrition levels between men and women were actually pretty close to equal, which is like the first time in our history. Usually it tends toward more women leaving. >> Lisa: That's a good sign. >> Right. >> Yes, that's a good sign.
>> And we've actually focused on hiring and bringing in more women and diversity in our company. >> Yeah, some of the stats too from anitab.org talked about the increase, and I have to scroll back and find my notes, the increase of 51% more women being hired in 2022 than 2021 for technical roles. So the data, pun intended, is showing us. I mean, the data is there to show the impact that having females in executive leadership positions makes from a revenue perspective. >> Tracy: Definitely. >> Companies are more profitable when there are women at the head, or at least in senior leadership roles. But we're seeing some positive trends, especially in terms of representation of women technologists. One of the things though that I found interesting, and I'm curious to get your thoughts on this, Rhonda, is that the representation of women technologists is growing in all areas, except interns. >> Rhonda: Hmm. >> So I think, we've got to go downstream. You teach, I have to go back to my notes on you, did my due diligence, R programming classes through Boeing's Ed Wells program, this is for WSU College of Arts and Sciences, talk about what you teach and how do you think that intern kind of glut could be solved? >> Yeah. So, they're actually two separate programs. So I teach a data analytics course at Washington State University as an Adjunct Professor. And then the Ed Wells program is a SPEEA, which is an Aerospace Union, focused on bringing more technology and skills to the actual workforce itself. So it's kind of a couple different audiences. One is more seasoned employees, right? The other one is our undergraduates. I teach a Capstone class, so it's a great way to introduce students to what it's actually like to work on an industry project. We partner with Google and Microsoft and Boeing on those. The idea is also that maybe those companies have openings for the students when they're done. Since it's Senior Capstone, there's not a lot of opportunities for internships.
But the opportunities to actually get hired increase a little bit. In regards to Boeing, we've actually invested a lot in hiring more women interns. I think the number was 40%, but you'd have to double check. >> Lisa: That's great, that's fantastic. >> Tracy: That's way above average, I think. >> That's a good point. Yeah, it is above average. >> Double check on that. That's all from my memory. >> Is this your first WiDS, or have you been before? >> I did it virtually last year. >> Okay. One of the things that I love, I love covering this event every year. theCUBE's been covering it since its inception in 2015. But it's just the inspiration, the vibe here at Stanford is so positive. WiDS is a movement. It's not an initiative, an organization. There are going to be, I think, 200 different events this year annually. Obviously today we're live on International Women's Day. 60 plus countries, 100,000 plus people involved. So, this is such a positive environment for women and men, because we need everybody, underrepresented minorities, to be able to understand the implication that data has across our lives. If we think about stripping away titles in industries, everybody is a consumer, well, not everybody, but most of us have mobile devices. And we have this expectation, I was in Barcelona last week at Mobile World Congress, we have this expectation that we're going to be connected 24/7. I can get whatever I want wherever I am in the world, and that's all data driven. And the average person that isn't involved in data science wouldn't understand that. At the same time, they have expectations that depend on organizations like Boeing being data driven so that they can get the experience they expect in their consumer lives, in any aspect of their lives. And that's one of the things I find so interesting and inspiring about data science. What are some of the things that keep you motivated to continue pursuing this?
>> Yeah, I will say along those lines, I think it's great to invest in K-12 programs for data literacy. I know one of my mentors, the director of the Data Analytics program, Dr. Nairanjana Dasgupta, we're really familiar with each other. So, she runs a WSU program for K-12 data literacy. It's also something that we strive for at Boeing, and we have an internal data literacy program because, believe it or not, most people are in business. And there's a lot of disconnect between interpreting and understanding data. For me, what kind of drives me to continue data science is that connection between people and data and how we use it to improve our world, which is partly why I work at Boeing too, 'cause I feel that they produce products that people need, like satellites and airplanes, >> Absolutely. >> and everything. >> Well, it's tangible, it's relatable. We can understand it. Can you do me a quick favor and define data literacy for anyone that might not understand what that means? >> Yeah, so it's just being able to understand elements of data, whether that's a bar chart or even in a sentence, like how to read a statistic and interpret a statistic in a sentence, for example. >> Very cool. >> Yeah. And it sounds like Boeing's doing a great job in these programs, and also trying to hire more women. So yeah, I wanted to ask, do you think there's something that Boeing needs to work on? Or where do you see yourself working on, say, in the next five years? >> Yeah, I think as a company, we always think that there's always room for improvement. >> It never, never stops. >> Tracy: Definitely. (laughs) >> I know workforce strategy is an area that they're currently really heavily investing in, along with safety. How do we build safer products for people? How do we help inform the public about things like Covid transmission in airports?
For example, we had the Confident Traveler Initiative, which was a big push that we had, and we had to be able to inform people about data models around Covid, right? So yeah, I would say our future is more about an investment in our people and in our culture, from my perspective. >> That's so important. One of the hardest things to change, especially for a legacy organization like Boeing, is culture. You know, when I talk with CEOs or CIOs or COOs about what's your company's vision, what's your strategy? Especially those companies that are on that digital journey, that have no choice these days. Everybody expects to have a digital experience, whether you're transacting an Uber ride, you're buying groceries, or you're traveling by air. That culture, it sounds like Boeing is really focused on that. And that's impressive, because that's one of the hardest things to morph and mold, but it's so essential. You know, as we look around the room here at WiDS, it's obviously mostly females, but we're talking about women, underrepresented minorities. We're talking about men as well, who are mentors and sponsors to us. I'd love to get your advice to your younger self. What would you tell yourself in terms of where you are now to become a leader in the technology field? >> Yeah, I mean, it's kind of an interesting question, because I always try to think, live with no regrets to an extent. >> Lisa: I like that. >> But, there's lots of failures along the way. (Tracy laughing) I don't know if I would tell myself anything different, because honestly, if I did, I wouldn't be where I am. >> Lisa: Good for you. >> I started out in fine arts, and I didn't end up there. >> That's good. >> Such a good point, yeah. >> We've been talking about that, and something I find a lot at events like WiDS is women have these zigzaggy patterns. I studied biology, I have a master's in molecular biology, I'm in media and marketing. We talked about transportable skills.
There's a case I made many years ago when I got into tech about, well, in science you learn the art of interpreting esoteric data and creating a story from it. And that's a transportable skill. But I always say, you mentioned failure, I always say failure is not a bad F word. It allows us to kind of zig and zag and learn along the way. And I think that really fosters thought diversity. And in data science, that is one of the things we absolutely need to have, is that diversity in thought. You know, we talk about AI models being biased, we need the data and we need the diverse brains to help ensure that the biases are identified, extracted, and removed. Speaking of AI, I've been geeking out with ChatGPT. So, I'm on it yesterday and I ask it, "What's hot in data science?" And I was like, is it going to get that? What's hot? And it did it, it came back with trends. I thought if I ask anything "What's hot?", I should be asking Paris Hilton, but I didn't. And so I was geeking out. One of the things I learned recently that I thought was so super cool is the CTO of OpenAI is a woman, Mira Murati, which I didn't know until over the weekend. Because I always think, if I had to name top females in tech, who would they be? And I always default to Sheryl Sandberg, Carly Fiorina, Susan Wojcicki running YouTube. Who are some of the people in your history, in your current life, that are really inspiring to you? Men, women, indifferent. >> Sure. I think Boeing is one of the companies where you actually do see a lot of women in leadership roles. I think we're one of the top companies with a number of women executives, actually. Susan Doniz, who's our Chief Information Officer, I believe she's actually slotted to speak at a WiDS event come fall. >> Lisa: Cool. >> So that will be exciting. Susan's actually relatively newer to Boeing in some ways. On a Boeing timescale, like, three years is still kind of new.
(laughs) But she's been around for a while and she's done a lot of inspiring things, I think, for women in the organization. She does a lot with Latino communities and things like that as well. For me personally, you know, when I started at Boeing, Ahmad Yaghoobi was one of my mentors and my Technical Lead. He came from Iran during a lot of hard times in the 1980s. His brother actually wrote a memoir, (laughs) which is just a fun, interesting fact. >> Tracy: Oh my God! >> Lisa: Wow! >> And so, I kind of gravitate to people that I can learn from who are not in my sphere, who might make me uncomfortable. >> And you probably don't even think about how many people you're influencing along the way. >> No. >> We just keep going and learning from our mentors and probably lose sight of, "I wonder how many people actually admire me?" And I'm sure there are many that admire you, Rhonda, for what you've done, going from anthropology to archeology. You mentioned before we went live that you were really interested in photography. Keep going and really gathering all that breadth, 'cause it's only making you more inspiring to people like us. >> Exactly. >> We thank you so much for joining us on the program and sharing a little bit about you and what brought you to WiDS. Thank you so much, Rhonda. >> Yeah, thank you. >> Tracy: Thank you so much for being here. >> Lisa: Yeah. >> Alright. >> For our guest, and for Tracy Zhang, this is Lisa Martin live at Stanford University covering the eighth Annual Women In Data Science Conference. Stick around. Our next guest will be here in just a second. (gentle music)
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Tracy | PERSON | 0.99+ |
Nairanjana Dasgupta | PERSON | 0.99+ |
Boeing | ORGANIZATION | 0.99+ |
Tracy Zhang | PERSON | 0.99+ |
Rhonda | PERSON | 0.99+ |
Lisa | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
Mira Murati | PERSON | 0.99+ |
Microsoft | ORGANIZATION | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Susan Wojcicki | PERSON | 0.99+ |
Rhonda Crate | PERSON | 0.99+ |
Susan Doniz | PERSON | 0.99+ |
Susan | PERSON | 0.99+ |
Sheryl Sandberg | PERSON | 0.99+ |
Hannah | PERSON | 0.99+ |
27.6% | QUANTITY | 0.99+ |
2015 | DATE | 0.99+ |
Barcelona | LOCATION | 0.99+ |
WSU College of Arts and Sciences | ORGANIZATION | 0.99+ |
40% | QUANTITY | 0.99+ |
2022 | DATE | 0.99+ |
yesterday | DATE | 0.99+ |
Iran | LOCATION | 0.99+ |
last week | DATE | 0.99+ |
International Women's Day | EVENT | 0.99+ |
11 credits | QUANTITY | 0.99+ |
one | QUANTITY | 0.99+ |
2021 | DATE | 0.99+ |
last year | DATE | 0.99+ |
51% | QUANTITY | 0.99+ |
Washington State University | ORGANIZATION | 0.99+ |
first | QUANTITY | 0.99+ |
three years | QUANTITY | 0.99+ |
Ahmad Yaghoobi | PERSON | 0.99+ |
200 different events | QUANTITY | 0.99+ |
Carly Fiorina | PERSON | 0.99+ |
60 plus countries | QUANTITY | 0.99+ |
1980s | DATE | 0.99+ |
US | LOCATION | 0.99+ |
YouTube | ORGANIZATION | 0.99+ |
100,000 plus people | QUANTITY | 0.99+ |
first time | QUANTITY | 0.99+ |
'22 | DATE | 0.98+ |
eighth Annual Women In Data Science Conference | EVENT | 0.98+ |
One | QUANTITY | 0.98+ |
today | DATE | 0.98+ |
two separate programs | QUANTITY | 0.98+ |
Stanford University | ORGANIZATION | 0.98+ |
Global Diversity Report | TITLE | 0.98+ |
this year | DATE | 0.98+ |
Gayatree Ganu, Meta | WiDS 2023
(upbeat music) >> Hey everyone. Welcome back to "The Cube"'s live coverage of "Women in Data Science 2023". As every year, we are here live at Stanford University, profiling some amazing women and men in the fields of data science. My co-host for this segment is Hannah Freitag. Hannah is from Stanford's Data Journalism program, really interesting, check it out. We're very pleased to welcome our first guest of the day fresh from the keynote stage, Gayatree Ganu, the VP of Data Science at Meta. Gayatree, it's great to have you on the program. >> Likewise, thank you for having me. >> So you have a PhD in Computer Science. You shared some really cool stuff. Everyone knows Facebook, everyone uses it. I think my mom might be one of the biggest users (Gayatree laughs) and she's probably watching right now. People don't realize there's so much data behind that, and data that drives decisions that we engage with. But talk to me a little bit about you first. PhD in Computer Science, were you always, were you like a STEM kid? Little Gayatree, little STEM? >> Yeah, I was a STEM kid. I grew up in Mumbai, India. My parents are actually pharmacists, so they were not like math or stats people or anything like that, but I was always a STEM kid. I don't know, I think I was in sixth grade when we got our first personal computer, and I obviously used it as a Pac-Man playing machine. >> Oh, that's okay. (all laugh) >> But I was so good at it, and I honestly believe being good at games kind of got me more familiar and comfortable with computers. Yeah, I think I always liked computers. >> And so now you lead, I'm looking at my notes here, the Engagement Ecosystem and Monetization Data Science teams at Facebook, Meta. Talk about those, what are the missions of those teams, and how do they impact the everyday user?
>> Yeah, so engagement is basically users coming back to our platform more; there's no better way for users to tell us that they are finding value in the things that we are doing on Facebook, Instagram, WhatsApp, and all the other products than coming back to our platform more. So the Engagement Ecosystem team is looking at trends, looking at where there are needs, looking at how users are changing their behaviors, and, you know, helping build strategy for the long term using that data knowledge. Monetization is very different. You know, obviously the top, apex goal is to have a sustainable business so that we can continue building products for our users. But you know, I said this in my keynote today, it's not about making money; our mission statement is not, you know, maximize as much money as you can make. It's about building a meaningful connection between businesses, customers, and users, and, you know, especially in these last two or three funky, post-pandemic years, it's been such a big, important thing to do for small businesses all around the world, for users to find goods and services and products that they care about and that they can connect to. So, you know, there is truly a connection between my engagement world and the monetization world. And you know, it's not always very clear till you go in and, like, peel the layers. Everything we do in the ads world is also always first with users as our, you know, guiding principle. >> Yeah, you mentioned how you supported especially small businesses during the pandemic. You touched a bit upon it in the keynote speech. Can you tell our audience what special or specific programs you implemented to support small businesses during these times? >> Yeah, so there are 200 million businesses on our platform. A lot of them are small businesses, and 10 million of them run ads.
So there is a large number of businesses on our platform who, you know, use the power of social media to connect to the customers that matter to them, and, you know, use the free products that we built. In the post-pandemic years, we built a lot of stuff very quickly when Covid first hit for businesses to get the word out, right? Like, they had to announce when special shopping hours existed for at-risk populations, or when certain goods and services were available versus not. We had grants; there's a $100 million grant that we gave out to small businesses. Users could show their support with a bunch of campaigns that we ran, and of course we continue running ads. Our ads are very effective, I guess, at, you know, getting a very reliable connection from the customer to the business. And so, you know, we've run all these studies. I talked about two examples today. One of them is the largest Black woman-owned wine company, and how they needed to move to an online program and, you know, we gave them a grant, and supported them through their ads campaign, and, you know, they saw a 60% lift in purchases, or something like that. So, a lot of good stories, small stories, you know, on a scale of 200 million, that really made me feel proud about the work we do. And you know, now more than ever before, I think people can connect so directly with businesses. I come from India, every business is on WhatsApp. And you can, you know, WhatsApp them, you can send them Facebook messages, and you can build this direct connection with things that matter to you. >> We have this expectation that we can be connected anywhere. I was just at Mobile World Congress, MWC, last week, obviously talking about connectivity. We want to be able to do any transaction, whether it's posting on Facebook or calling an Uber, or watching Netflix on the road; we expect that we're going to be connected.
>> Yeah. >> And I think a lot of us don't realize, I mean, those of us in tech do, but how much data science is a facilitator of all of those interactions. >> Yeah! >> As we, Gayatree, as we talk about, like, any business, whether it is the Black woman-owned wine business, >> Yeah. >> great business, or a grocer or a car dealer, everybody has to become data-driven. >> Yes. >> Because the consumer has the expectation. >> Yes. >> Talk about data science as a facilitator of just pretty much everything we are doing and conducting in our daily lives. >> Yeah, I think that's a great question. I think data science as a field wasn't really defined like maybe 15 years ago, right? So this is all in our lifetimes that we are seeing this. Even in data science today, people come from so many different backgrounds and bring their own expertise here. And I think we, you know, this conference, all of us get to define what that means and how we can bring data to do good in the world. Everything you do, as you said, there is a lot of data. Facebook has a lot of data, Meta has a lot of data, and how do we responsibly use this data? How do we use this data to make sure that we're, you know, representing all diversity? You know, minorities? Like, machine learning algorithms don't do well with small data, they do well with big data, but the small data matters. And how do you, like, you know, bring that into algorithms? Yeah, so everything we do at Meta is very, very data-driven. I feel proud about that, to be honest, because while data gets a bad rap sometimes, having no data and making decisions in the blind is just the absolute worst thing you can do. And so, you know, the job as a data scientist at Facebook is to make sure that we use this data, use it responsibly, make sure that we are representing every aspect of the, you know, 3 billion users who come to our platform. Yeah, data serves all the products that we build here. >> The responsibility factor is huge.
You know, we can't talk about AI without talking about ethics. One of the things that I was talking with Hannah and our other co-host, Tracy, about during our opening is something I just learned over the weekend. And that is that the CTO of ChatGPT is a woman. (Gayatree laughs) I didn't know that. And I thought, why isn't she getting more awareness? There's a lot of conversations with their CEO. >> Yeah. >> Everyone's using it, playing around with it. I actually asked it yesterday, "What's hot in Data Science?" (all laugh) I was like, should I have asked that to let itself in, what's hot? (Gayatree laughs) But it, I thought that was phenomenal, and we need to be talking about this more. >> Yeah. >> This is something that they're likening to the launch of the iPhone, which has transformed our lives. >> I know, it is. >> ChatGPT, and its chief technologist is a female, how great is that? >> And I don't know whether you, I don't know the stats around this, but I think CTO is even less, it's even more rare to have a woman there, like you have women CEOs because I mean, we are building upon years and years of women not choosing technical fields and not choosing STEM, and it's going to take some time, but yeah, yeah, she's a woman. Isn't it amazing? It's wonderful. >> Yes, there was a great, there's a great "Fast Company" article on her that I was looking at yesterday and I just thought, we need to do what we can to help spread, Mira Murati is her name, because what she's doing is, one of the biggest technological breakthroughs we may ever see in our lifetime. It gives me goosebumps just thinking about it. (Gayatree laughs) I also wanted to share some stats, oh, sorry, go ahead, Hannah. >> Yeah, I was going to follow up on the thing that you mentioned that we had many years with like not enough women choosing a career path in STEM and that we have to overcome this trend. What are some, like what is some advice you have like as the Vice-President Data Science? 
Like what can we do to make this feel more, you know, approachable and >> Yeah. >> accessible for women? >> Yeah, there's so much that we have done already and, you know, want to keep doing. Of course, conferences like these, you know, and I think there are high school students here, there are students from my alma mater's undergrad program. It's amazing to get all these women together, to get them to see what success could look like, >> Yeah. >> what being a woman leader in this space could look like. So that's one. I lead recruiting at Meta, and we've done a bunch to sort of open up the thinking around data science and technical jobs for women. Simple things, like what you write in your job description. I don't know whether you know this, or this is a story you've heard before: when you have a job description and there are like 10 things that you need to, you know, be good at to apply to this job, a woman sees those 10 and says, okay, I don't meet the qualifications for one of them, and she doesn't apply. And a man sees one that he meets the qualifications for, and he applies. And so, you know, there are small things you can do, in just how you write your job description, what goals you set for diversity and inclusion for your own organization. We have goals; Facebook's always been pretty up there in, like, you know, speaking out for diversity, and Sheryl Sandberg has been our Chief Business Officer for a very long time and she's been, like, amazing at pushing for more women. So yeah, every step of the way, I think we've made a lot of progress, to be honest. I do think women choose STEM fields a lot more than they did. When I did my Computer Science degree I was often one of one or two women in the Computer Science class. It takes some time for it to percolate all the way to having more CTOs and CEOs, >> Yeah.
>> but it's going to happen in our lifetime, and you know, three of us know this, women are going to rule the world, and it (laughs) >> Drop the mic, girl! >> And it's going to happen in our lifetime, so I'm excited about it. >> And we have responsibility in helping make that happen. You know, I'm curious, you were in STEM, you talked about Computer Science, being one of the only females. One of the things that the nadb.org data from 2022 showed, some good numbers, the number of women in technical roles is now 27.6%, I believe, so up from 25, it's up in '22, which is good, more hiring of women. >> Yeah. >> One of the biggest challenges is attrition. What keeps you motivated? >> Yeah. >> To stay what, where you are doing what you're doing, managing a family and helping to drive these experiences at Facebook that we all expect are just going to happen? >> Yeah, two things come to mind. It does take a village. You do need people around you. You know, I'm grateful for my husband. You talked about managing a family, I did the very Indian thing and my parents live with us, and they help take care of the kids. >> Right! (laughs) >> (laughs) My kids are young, six and four, and I definitely needed help over the last few years. It takes mentors, it takes other people that you look up to, who've gone through all of those same challenges and can, you know, advise you to sort of continue working in the field. I remember when my kid was born when he was six months old, I was considering quitting. And my husband's like, to be a good role model for your children, you need to continue working. Like, just being a mother is not enough. And so, you know, so that's one. You know, the village that you build around you your supporters, your mentors who keep encouraging you. Sheryl Sandberg said this to me in my second month at Facebook. 
She said that women drop out of technical fields; they become managers, they become sort of more administrative in the nature of their work, and her advice was, "Don't do that, don't stop the technical". And I think that's the other thing I'd say to a lot of women. Technical stuff is hard, but, you know, keeping up with that and staying on top of it actually does help you in the long run. And it's definitely helped me in my career at Facebook. >> I think one of the things, and Hannah and I and Tracy talked about this in the open, and I think you'll agree with us, is the whole saying of you can't be what you can't see, and I like to say, "Well, you can be what you can see". That visibility, the great thing that WiDS did, of having you on the stage as a speaker this morning so people can understand, everyone, like I said, everyone knows Meta, >> Yeah. >> everyone uses Facebook. And so it's important to bring that connection, >> Yeah. >> of how data is driving the experiences, the fact that it's user first, but we need to be able to see women in positions, >> Yes. >> like you, especially with Sheryl stepping down, moving on to something else, or people that are like YouTube influencers, that have no idea that the head of YouTube for a very long time, Susan Wojcicki, is a woman. >> (laughs) Yes. >> Who pioneered streaming, and I mean, how often are you on YouTube every day? >> Yep, every day. >> But we have to be able to see and raise the profile of these women and learn from them and be inspired, >> Absolutely. >> to keep going and going. I like what I do, I'm making a difference here. >> Yeah, yeah, absolutely. >> And I can be the sponsor or the mentor for somebody down the road. >> Absolutely. >> Yeah, and then referring back to what we talked about in the beginning, show that data science is so diverse, and it doesn't mean if you're in IT, you're sitting in your dark room, >> Right. (laughs) >> coding all day, but you know, >> (laughs) Right!
>> to show the different facets of this job and >> Right! >> make this appealing to women, >> Yeah, for sure. >> And I said this in my keynote too, you know, one of the things that helped me most is complementing the data and the techniques and the algorithms with how you work with people, and, you know, empathy and alignment building and leadership, strategic thinking. And honestly, I think women do a lot of this stuff really well. We know how to work with people, and so, you know, I've seen this at Meta for sure, like, you know, all of these skills, soft skills, as we call them, go a long way, and, like, you know, doing the right things and having a lasting impact. And like I said, women are going to rule the world, you know, in our lifetimes. (laughs) >> Oh, I can't, I can't wait to see that happen. There are some interesting female candidates that are already throwing their hats in the ring for the next presidential election. >> Yes. >> So we'll have to see where that goes. But some of the things that are so interesting to me, here we are in California and Palo Alto, technically Stanford is its own zip code, I believe. And we're in California, we're freaking out because we've gotten so much rain; it's absolutely unprecedented. We need it, we had a massive drought, an extreme drought, technically, for many years. I've got friends that live up in Tahoe, I've been getting pictures this morning of windows that are >> (laughs) that are covered? >> Yes, actually, yes. (Gayatree laughs) That, where windows, like second-story windows, are covered in snow. >> Yeah. >> Climate change. >> Climate change. >> There's so much that data science is doing to power our understanding of climate change, whether it's that, or police violence. >> Yeah. (all talk together) >> We had a talk today on that, it was amazing. >> Yes. So I want more people to know what data science is really facilitating, that impacts all of us, whether you're in a technical role or not.
>> And data wins arguments. >> Yes, I love that! >> I said this on my slide today, like, you know, there are always going to be doubters and naysayers, but there's hard evidence, there's hard data. In all of these fields, I mean, the data science that we have done in the environmental and climate change areas, and medical, and, you know, the medicine professions, there's just so much, so much more opportunity, and, like, how much more we can learn about the world. >> Yeah. >> Yeah, it's a pretty exciting time to be a data scientist. >> I feel like we're just scratching the surface. >> Yeah. >> With the potential and the global impact that we can make with data science. Gayatree, it's been so great having you on theCUBE, thank you. >> Right, >> Thank you so much, Gayatree. >> So much, I love, >> Thank you. >> I'm going to take "data wins arguments" into my personal life. (Gayatree laughs) Actually, just a quick anecdote, funny story: I was listening to the radio this morning and there was a commercial from an insurance company, and I guess the joke is, it's an argument between two spouses, and the voiceover comes in and says, "Let's watch a replay". I'm like, if only they then got the data that helped the woman win the argument. (laughs)
(upbeat music) I have been in the software and technology industry for over 12 years now, so I've had the opportunity as a marketer to really understand and interact with customers across the entire buyer's journey. Hi, I'm Lisa Martin and I'm a host of theCUBE. (upbeat music) Being a host on theCUBE has been a dream of mine for the last few years. I had the opportunity to meet Jeff and Dave and John at EMC World a few years ago and got the courage up to say, "Hey, I'm really interested in this. I love talking with customers, gimme a shot, let me come into the studio and do an interview and see if we can work together". I think where I really impact theCUBE is being a female in technology. We interview a lot of females in tech, we do a lot of women in technology events and one of the things I.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Susan Wojcicki | PERSON | 0.99+ |
Lisa Martin | PERSON | 0.99+ |
Hannah | PERSON | 0.99+ |
Mira Murati | PERSON | 0.99+ |
California | LOCATION | 0.99+ |
Tracy | PERSON | 0.99+ |
ORGANIZATION | 0.99+ | |
Hannah Freitag | PERSON | 0.99+ |
Sheryl Sandberg | PERSON | 0.99+ |
10 | QUANTITY | 0.99+ |
Gayatree | PERSON | 0.99+ |
$100 million | QUANTITY | 0.99+ |
Jeff | PERSON | 0.99+ |
27.6% | QUANTITY | 0.99+ |
60% | QUANTITY | 0.99+ |
Tahoe | LOCATION | 0.99+ |
three | QUANTITY | 0.99+ |
Sheryl | PERSON | 0.99+ |
one | QUANTITY | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
2022 | DATE | 0.99+ |
One | QUANTITY | 0.99+ |
India | LOCATION | 0.99+ |
200 million | QUANTITY | 0.99+ |
six months | QUANTITY | 0.99+ |
six | QUANTITY | 0.99+ |
Meta | ORGANIZATION | 0.99+ |
10 things | QUANTITY | 0.99+ |
iPhone | COMMERCIAL_ITEM | 0.99+ |
two spouses | QUANTITY | 0.99+ |
Engagement Ecosystem | ORGANIZATION | 0.99+ |
10 million | QUANTITY | 0.99+ |
yesterday | DATE | 0.99+ |
today | DATE | 0.99+ |
last week | DATE | 0.99+ |
25 | QUANTITY | 0.99+ |
Mumbai, India | LOCATION | 0.99+ |
YouTube | ORGANIZATION | 0.99+ |
John | PERSON | 0.99+ |
four | QUANTITY | 0.99+ |
two examples | QUANTITY | 0.99+ |
Uber | ORGANIZATION | 0.99+ |
Dave | PERSON | 0.99+ |
over 12 years | QUANTITY | 0.98+ |
first | QUANTITY | 0.98+ |
two things | QUANTITY | 0.98+ |
200 million businesses | QUANTITY | 0.98+ |
Stanford | ORGANIZATION | 0.98+ |
both | QUANTITY | 0.98+ |
ORGANIZATION | 0.98+ | |
Women in Data Science 2023 | TITLE | 0.98+ |
ORGANIZATION | 0.98+ | |
Gayatree Ganu | PERSON | 0.98+ |
ChatGPT | ORGANIZATION | 0.98+ |
second month | QUANTITY | 0.97+ |
nadb.org | ORGANIZATION | 0.97+ |
sixth grade | QUANTITY | 0.97+ |
first guest | QUANTITY | 0.97+ |
'22 | DATE | 0.97+ |
Jacqueline Kuo, Dataiku | WiDS 2023
(upbeat music) >> Morning guys and girls, welcome back to theCUBE's live coverage of Women in Data Science, WiDS 2023, live at Stanford University. Lisa Martin here with my co-host for this segment, Tracy Zhang. We're really excited to be talking with a great female rockstar. You're going to learn a lot from her next: Jacqueline Kuo, solutions engineer at Dataiku. Welcome, Jacqueline. Great to have you. >> Thank you so much. >> Thanks for being here. >> I'm so excited to be here. >> So one of the things I have to start out with, 'cause my mom Kathy Dahlia is watching, she's a New Yorker. You are a born and raised New Yorker, and I learned from my mom and others: if you're born in New York, no matter how long ago you moved away, you are a New Yorker. You guys have like a secret club. (group laughs) >> I am definitely very proud of being born and raised in New York. My family immigrated to New York, New Jersey from Taiwan. So very proud Taiwanese American as well. But I absolutely love New York and I can't imagine living anywhere else. >> Yeah, yeah. >> I love it. >> So you studied, I was doing some research on you, you studied mechanical engineering at MIT. >> Yes. >> That's huge. And you discovered your passion for all things data-related. You worked at IBM as an analytics consultant. Talk to us a little bit about your career path. Were you always interested in engineering and STEM-related subjects from the time you were a child? >> I feel like my interests ranged across many different things, and I ended up landing in engineering, 'cause I felt like I wanted to gain a toolkit, like a toolset, to make some sort of change with, or use my career to make some sort of change in this world.
And I landed on engineering, and mechanical engineering specifically, because I felt like I got to, in my undergrad, do a lot of hands-on projects, learn every part of the engineering and design process to build products, which is super-transferable; transferable skills are sort of the trend in my career so far. After undergrad I wanted to move back to New York, and mechanical engineering jobs are kind of few and far between in the city. And I ended up landing at IBM doing analytics consulting, because I wanted to understand how to use data. I knew that data was really powerful, and I knew that working with it could allow me to tell better stories to influence people across different industries. And that's also how I kind of landed at Dataiku in my current role, because it really does allow me to work across different industries and work on different problems that are just interesting. >> Yeah, I like how you mentioned building a toolkit when doing your studies at school. Do you think a lot of those skills are still very relevant to your job at Dataiku right now? >> I think that at the core of it is just problem solving and asking questions and continuing to be curious, or trying to challenge what is currently given to you. And I think in an engineering degree you get a lot of that. >> Yeah, I'm sure. >> But I think that we've actually seen that a lot in the panels today already, that you get that through all different types of work and research, and that kind of thoughtfulness comes across in all different industries too.
But you have to be able to understand that data, treated properly and responsibly. Talk about some of the interesting projects that you're doing at Dataiku, or maybe some that you've done in the past, that are really kind of transformative across things like climate change or police violence, some of the things that data science really is impacting these days. >> Yeah, absolutely. I think what I love about coming to these conferences is that you hear about those really impactful social impact projects that I think everybody who's in data science wants to be working on. And I think at Dataiku what's great is that we do have this program called Ikig.AI, where we work with nonprofits and support them in their data and analytics projects. And so, a project I worked on was with the Clean Water, oh my goodness, the Ocean Cleanup project, the Ocean Cleanup organization, which was amazing, because it was sort of outside of my day-to-day and it allowed me to work with them and help them better understand where plastic is being aggregated across the world and where it appears, whether that's on beaches or in lakes and rivers. So, using data to help them better understand that. From a day-to-day though, in terms of our customers, they're really looking at very basic problems with data. And I say basic not to diminish it, but really just to say that it's high impact: basic problems around, how do they forecast sales better? That's a really kind of basic problem, but it's actually super-complex and really impactful for people and companies, when it comes to forecasting how much headcount they need to have in the next year, or how much inventory to have if they're retail. And all of those are going to, especially for smaller companies, make a huge impact on whether they make a profit or not.
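The sales-forecasting problem Jacqueline describes can start from something as simple as a moving-average baseline. The following Python sketch is purely illustrative (it is not Dataiku code, and the function name and sales figures are made up for the example):

```python
# Illustrative sketch of the "forecast sales better" problem described above.
# Not Dataiku code; the data and function are hypothetical examples.

def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` periods."""
    if len(history) < window:
        raise ValueError("need at least `window` observations")
    return sum(history[-window:]) / window

# Hypothetical monthly unit sales for a small retailer.
monthly_sales = [120, 135, 150, 140, 160, 175]

# A retailer might use this number to plan next month's inventory or headcount.
next_month = moving_average_forecast(monthly_sales)
print(round(next_month, 1))  # mean of the last 3 months, (140+160+175)/3 -> 158.3
```

Real deployments would replace this baseline with models that account for trend and seasonality, but even a baseline like this makes the inventory decision data-driven rather than a guess.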
And so, what's great about working at Dataiku is you get to work on these high-impact projects, and oftentimes, from my perspective, I work as a solutions engineer on the commercial team, so we work generally with smaller customers, and sometimes me talking to them is their first introduction to what data science is and what they can do with that data. And sort of using our platform to show them what the possibilities are and help them build a strategy around how they can implement data in their day-to-day. >> What's the difference? You were a data scientist by title and function, now you're a solutions engineer. Talk about the ascendancy into that, and also some of the things that you and Tracy will talk about, as those transferable, those transportable skills that maybe you learned in engineering, brought to data science, and are now bringing to solutions engineering. >> Yeah, absolutely. So data science, I love working with data. I love getting in the weeds of things, and oftentimes that means debugging things or looking line by line at your code and trying to make it better. I found that in the data science role, while I really loved those things, sometimes it also meant that I couldn't see, or didn't have visibility into, the broader picture of, well, why are we doing this project? And who is it impacting? Because oftentimes your day-to-day is very much in the weeds. And so, I moved into sales, or solutions engineering, at Dataiku to get that perspective, because what a sales engineer does is support the sale from a technical perspective. And so, you really truly understand, well, what is the customer looking for, and what is going to influence them to make a purchase? And how do you tell the story of the impact of data? Because oftentimes they need to quantify, well, if I purchase a software like Dataiku, then I'm able to build this project and make this X impact on the business.
And that is really powerful. That's where the storytelling comes in, and that, I feel, is a lot of what we've been hearing today about connecting data with people who can actually do something with that data. That's really the bridge that we as sales engineers are trying to build in that sales process. >> It's all about connectivity, isn't it? >> Yeah, definitely. We were talking about this earlier, that it's about making an impact, and it's about the people that the data we're analyzing is influencing. And I saw that one of the keywords, one of the biggest things at Dataiku, is everyday AI, so I wanted to ask, could you please talk more about how that weaves into the problem solving and the day-to-day, making-an-impact process? >> Yes, so I started working at Dataiku around three years ago and I fell in love with the product itself. The product that we have allows people with different backgrounds, whether you're coming from a data analyst background, data science, data engineering, or maybe you're more of a business subject matter expert, to all work in one unified central platform, one user interface. And why that's powerful is that when you're working with data, it's not just that data scientist working on their own, coding on their own computer. We've heard today that it's all about connecting the data scientists with those business people, with maybe the data engineers and IT people who are actually going to put that model into production, or other folks. And so, they all use different languages. Data scientists might use Python and R, your business people are using PowerPoint and Excel, everyone's using different tools. How do we bring them all into one place so that you can have conversations faster? So the business people can understand exactly what you're building with the data and can get their hands on that data and that model prediction faster. So that's what Dataiku does. That's the product that we have.
And I completely forgot your question, 'cause I got so invested in talking about this. Oh, everyday AI. Yeah, so the goal of Dataiku is really to allow for those maybe less technical people with less traditional data science backgrounds. Maybe they're data experts and they understand the data really well and they've been working in SQL all their career. Maybe they're just subject matter experts and want to get more into working with data. We allow those people to do that through the no- and low-code tools within our platform. The platform is very visual as well. And so, I've seen a lot of people learn data science, learn machine learning, by working in the tool itself. And that's where everyday AI comes in, 'cause we truly believe that there's a lot of untapped expertise out there that we can bring in. And if we give them access to data, imagine the kind of work they could do and how empowered they could become. >> Yeah, we're just scratching the surface. I find data science so fascinating, especially when you talk about some of the real-world applications: police violence, health inequities, climate change. Here we are in California, and I don't know if you know, we're experiencing an atmospheric river again tomorrow. Californians and the rain- >> Storm is coming. >> We are not good... And I'm a native Californian, but we all know about climate change. People probably don't associate all of the data that is helping us understand it, make decisions based on what's coming and what's happened in the past. I just find that so fascinating. But I really think we're truly at the beginning of understanding the impact that being data-driven can actually mean, whether you are investigating climate change or police violence or health inequities, or you're a grocery store that needs to become data-driven because your consumer is expecting a personalized, relevant experience.
I want you to offer me up things that I know I want. I was doing online grocery shopping yesterday, I just got back from Europe, and I was so thankful that my grocer is data-driven, because they made the process so easy for me. But we have that expectation as consumers that it's going to be that easy, it's going to be that personalized. And what a lot of folks don't understand is the data, the democratization of data, and the AI that are helping make that a possibility and make our lives easier. >> Yeah, I love that point around data being everywhere, and the more we have, the more access we're actually providing. 'Cause now compute is cheaper, data is literally everywhere, you can get access to it very easily. And so, I feel like more people are just getting themselves involved, and, I mean, this whole conference is around bringing more women into this industry, and more people with different backgrounds from minority groups, so that we get their thoughts, their opinions into the work. That is so important, and it's becoming a lot easier with all of the technology and tools being open source, easier to access, and cheaper. And that I feel really hopeful about in this field. >> That's good. Hope is good, isn't it? >> Yes, that's all we need. But yeah, I'm glad to see that we're working towards that direction. I'm excited to see what lies in the future. >> We've been talking about numbers of women, percentages of women in technical roles for years, and we've seen it hover around 25%. I was looking at some AnitaB.org stats from 2022 just yesterday, and the numbers are going up. I think the number was 26, maybe 27.6%, of women in technical roles. So we're seeing growth there, especially over pre-pandemic levels. Definitely the biggest challenge that still remains is attrition.
I would love to get your advice on what you would tell your younger self, or the prior generation, in terms of having the confidence and the courage to pursue engineering, pursue data science, pursue a technical role, and also stay in that role so you can be one of those females on stage that we saw today. >> Yeah, that's the goal right there one day. I think it's really about finding other people to lift and mentor and support you. And I talked to a bunch of people today who just found this conference through Googling it, and the fact that organizations like this exist really does help, because those are the people who are going to understand the struggles you're going through as a woman in this industry, which can get tough, but it gets easier when you have a community to share that with and to support you. And I do want to definitely give a plug to the WIDS@Dataiku team. >> Talk to us about that. >> Yeah, I was so fortunate to be a WIDS ambassador last year and again this year with Dataiku, and we have grown the WIDS effort so much over the last few years. The first year we had two events, in New York and also in London. Dataiku's global, so this year we additionally have one on the west coast out here in SF and another one in Singapore, which is incredible. But what I love is that everyone is really passionate about just getting more women involved in this industry. And what I also find fortunate at Dataiku is that we have a strong female presence, just a lot of women. >> Good. >> Yeah. >> A lot of women working as data scientists, in solutions engineering and sales, and all across the company, who even if they aren't doing data work day-to-day, are super-involved and excited to get more women into the technical field. And so.
that's like our Empower group internally that hosts events, and I feel like it's a really nice safe space for all of us to speak about challenges that we encounter and feel like we're not alone, in that we have a support system to make it better. So I think from an attrition standpoint, every organization should have a female ERG to just support one another. >> Absolutely. There's so much value in a network, in the community. I was talking to somebody, I'm blanking on who, it may have been in Barcelona last week, about a stat that showed that a really high percentage, 78%, of people couldn't identify a female role model in technology. Of course, Sheryl Sandberg's been one of our role models, and I thought a lot of people know Sheryl, who's leaving or has left. And then there's a whole generation of YouTube influencers that have no idea that the CEO of YouTube for years has been a woman, who has- >> And she came last year to speak at WIDS. >> Did she? >> Yeah. >> Oh, I missed that. It must have been, we were probably filming. But we need more, and it sounds like Dataiku is doing a great job of this. Tracy, we've talked about this earlier today. We need to see what we can be. And it sounds like Dataiku is pioneering that with that ERG program that you talked about. And I completely agree with you. That should be a standard program everywhere, and women should feel empowered to raise their hand, ask a question, or really embrace, "I'm interested in engineering, I'm interested in data science." Maybe there's not a lot of women in your classes. That's okay. Be the pioneer, be that next Sheryl Sandberg, or Mira Murati, the CTO of OpenAI, the company behind ChatGPT. We need more people that we can see, and lean into that and embrace it. I think you're going to be one of them. >> I think so too. Just so that young girls like me, and others who are still in school, can see, can look up to you and be like, "She's my role model and I want to be like her.
And I know that there's someone to listen to me and to support me if I have any questions in this field." So yeah. >> Yeah, I mean, that's how I feel about literally everyone that I'm surrounded by here. I find that you find role models and people to look up to in every conversation whenever I'm speaking with another woman in tech, because there's a journey that has had to happen for you to get to that place. So it's incredible, this community. >> It is incredible. WIDS is a movement we're so proud to have been a part of at theCUBE since the very beginning, since 2015; I've been covering it since 2017. It's always one of my favorite events. It's so inspiring, and it just goes to show the power that data can have, the influence, but also that we're at the beginning of uncovering so much. Jacqueline, it's been such a pleasure having you on theCUBE. Thank you. >> Thank you. >> For sharing your story, sharing with us what Dataiku is doing, and keep going. More power to you, girl. We're going to see you up on that stage one of these years. >> Thank you so much. Thank you guys. >> Our pleasure. >> Our pleasure. >> For our guests and Tracy Zhang, this is Lisa Martin, you're watching theCUBE live at WiDS '23. #EmbraceEquity is this year's International Women's Day theme. Stick around, our next guest joins us in just a minute. (upbeat music)
Keynote Analysis | WiDS 2023
(ambient music) >> Good morning, everyone. Lisa Martin with theCUBE, live at the eighth annual Women in Data Science Conference. This is one of my absolute favorite events of the year. We engage with tons of great inspirational speakers, men and women, and what's happening with WiDS is a global movement. I've got two fabulous co-hosts with me today that you're going to be hearing and meeting. Please welcome Tracy Zhang and Hannah Freitag, who are both from the data journalism master's program at Stanford. So great to have you guys. >> So excited to be here. >> So data journalism's so interesting. Tracy, tell us a little bit about you, what you're interested in, and then Hannah, we'll have you do the same thing. >> Yeah. >> Yeah, definitely. I definitely think data journalism is very interesting, and in fact, "What is data journalism?" is definitely one of the big questions that we ask during the span of one year, which is the length of our program. And yeah, like you said, I'm in this data journalism master's program, and I think coming in I just wanted to pivot from my undergrad studies, which were more like traditional journalism, into data. We're finding stories through data, so that's why I'm also very excited about meeting these speakers today, because they all have different backgrounds, but they all ended up in data science. So I think they'll be very inspirational and I can't wait to talk to them. >> Data in stories, I love that. Hannah, tell us a little bit about you. >> Yeah, so before coming to Stanford, I was a research assistant at Humboldt University in Berlin, so I was in political science research. And I love to work with data sets and data, but I figured that, for me, I don't want this story to end up in a research paper, which is only very limited in terms of the audience.
And I figured, okay, data journalism is the perfect way to tell stories and use data to illustrate anecdotes, but to make it comprehensive and accessible for a broader audience. So then I found this program at Stanford, and I was like, okay, that's the perfect transition from political science to journalism, and to use data to tell data-driven stories. So I'm excited to be in this program, I'm excited for the conference today and to hear from these amazing women who work in data science. >> You both brought up great points, and we were chatting earlier that there's a lot of diversity in backgrounds. >> Tracy: Definitely. >> Not everyone was in STEM as a young kid or studied computer science. Maybe some were in engineering, maybe some were in philosophy or economics, it's so interesting. And what I find year after year at WiDS is it brings in so much thought diversity. And that's what being data-driven really demands. It demands that unbiased approach, that spectrum of diverse perspectives, and we definitely get that at WiDS. There's about 350 people in person here, but as I mentioned in the opening, hundreds of thousands will engage throughout the year, tens of thousands probably today at local events going on across the globe. And it just underscores the importance of every organization, whether it's a bank or a grocer, having to be data-driven. We have that expectation as consumers in our consumer lives, and even in our business lives, that I'm going to engage with a business, whatever it is, and they're going to know about me, they're going to deliver me a personalized experience that's relevant to me and my history. And all that is powered by data science, which I think is fascinating. >> Yeah, and the great thing is if you combine data with people. Because after all, large data sets oftentimes consist of stories or data that affect people.
And to find these stories or advance research in whatever field, maybe in the financial business or in health, as you mentioned, the variety of fields, it's a very powerful tool to use. >> It's very powerful... oh, go ahead, Tracy. >> No, definitely. I just wanted to build off of that. It's important to put a face on data. So a dataset without a name is just some numbers, but if there's a story, then I think it means something too. And I think Margot was talking about how data science is about knowing or understanding the past, I think that's very interesting. That's a method for us to know who we are. >> Definitely. There's so many opportunities. I wanted to share some of the statistics from AnitaB.org that I was just looking at from 2022. We always talk at events like WiDS and some of the other women-in-tech events, theCUBE is very much pro-women in tech, and has been for a very long time, since the beginning of theCUBE. But we've seen the numbers of women technologists historically well below 25%, and we see attrition rates are high. And so we often talk about, well, what can we do? And part of that is raising the awareness. And that's one of the great things about WiDS, especially WiDS happening on International Women's Day, today, March 8th, and around event- >> Tracy: A big holiday. >> Exactly. But one of the nice things I was looking at in the AnitaB.org research is that representation of women in tech is on the rise, still below pre-pandemic levels, but it's actually nearly 27% of women in technical roles. And that's an increase, a slow increase, but the needle is moving. We're seeing much more gender diversity across a lot of career levels, which is exciting. But some of the challenges remain. I mean, the representation of women technologists is growing, except at the intern level. And I thought that was really poignant. We need to be opening up that pipeline and going younger.
And you'll hear a lot of those conversations today about, what are we doing to reach girls in grade school, 10-year-olds, 12-year-olds, those in high school? How do we help foster them through their undergrad studies- >> And excite them about science and all these fields, for sure. >> What do you think, Hannah, on that note, and I'll ask you the same question, what do you think can be done? The theme of this year's International Women's Day is Embrace Equity. What do you think can be done on that intern problem to help really dial up the volume on getting those younger kids interested, one, earlier, and two, helping them stay interested? >> Yeah, that's a great question. I think it's important to start early, as you said, in school. Back in the day when I went to high school, we had this one day per year where we could, as girls, explore a STEM job and go into the job for one day and see what it's like to work in, I dunno, IT or data science, so that's a great first step. But as you mentioned, it's important to keep girls and women excited about this field and make them actually pursue this path. So I think conferences or networking are very powerful. Also these days, with social media and technology, we have more ability and greater ways to connect. And I think we should empower ourselves even more to pursue this path if we're interested in data science, and not be like, okay, maybe it's not for me, or maybe as a woman I have fewer chances. So I think it's very important to connect with other women, and this is what WiDS is great about. >> WiDS is so fantastic for that network effect, as you talked about. As I was telling you before we went live, I've covered five or six WiDS for theCUBE, and it's always such a day of positivity, a day of inclusivity, which is exactly what Embrace Equity is really about.
Tracy, talk a little bit about some of the things that you see that will help with that hashtag Embrace Equity, pulling it beyond just tech. Because we saw Meta was a keynote, who's going to come talk with Hannah and me in a little bit, we see TotalEnergies on the program today, we see Microsoft, Intuit, Boeing. What are some of the things you think can be done to help inspire, say, little Tracy back in the day to become interested in STEM or in technology or in data? What do you think companies can and should be doing to dial up the volume for those youngsters? >> Yeah, 'cause I think one of the keynote speakers was talking about how there is a notion that girls just can't be data scientists, girls just can't do science. And I think representation definitely matters. If three-year-old me saw on TV that all the scientists are women, I think I would definitely have the notion that, oh, this might be a career choice for me, and I can definitely also be a scientist if I want. So yeah, I think representation definitely matters, and that's why conferences like this just show us how great these women are in their fields. They're great data scientists bringing great insight to the company and even to the social good as well. So yeah, I think that's very important, just to make women feel seen in this data science field and to listen to the great women who are doing amazing work. >> Absolutely. There's a saying, you can't be what you can't see. >> Exactly. >> And I like to flip it on its head, 'cause we can talk about some of the negatives, but there's a lot of positives and I want to share some of those in a minute. That visibility that you talked about, the awareness that you talked about, it needs to be there, but it needs to be sustained and maintained.
And organizations like WiDS and some of the other women-in-tech events that happen around the valley here and globally are all aimed at raising the profile of these women so that the younger, really, all generations can see what they can be. The funny thing is, we all have this expectation, whether we're taking an Uber ride or we're on Netflix or we're buying something on Amazon, we can get it like that. They're going to know who I am, they're going to know what I want, they're going to know what I just bought or what I just watched. Don't serve me up something that I've already done. >> Hannah: Yeah. >> Tracy: Yeah. >> So that expectation that everyone has is all about data, though we don't necessarily think about it like that. >> Hannah: Exactly. >> Tracy: Exactly. >> But it's all about the data: the past data, the data science, as well as the realtime data, because we want to have these experiences that are fresh, in the moment, and super relevant. So whether women recognize it or not, they're data-driven too. Whether or not you're in data science, we're all driven by data, and we have these expectations that every business is going to meet them. >> Exactly. >> Yeah. And circling back to young women, I think it's crucial and important to have role models. As you said, if you see someone when you're younger and you're like, oh, I want to be like her, I want to follow this path, you have inspiration and a role model, someone you look up to, and you're like, okay, this is possible if I study the math part or do the physics. You kind of have a goal and a vision in mind, and I think that's really important to drive you. >> Having those mentors and sponsors, something that's interesting is, everyone knows what a mentor is, somebody that you look up to, that can guide you, that you admire. I didn't learn what a sponsor was until a Women in Tech event a few years ago that we did on theCUBE.
And I was kind of, my eyes were open, but I didn't understand the difference between a mentor and a sponsor. And then it got me thinking, who are my sponsors? And I started going through LinkedIn, oh, he's a sponsor, she's a sponsor, people that help really propel you forward, your recommenders, your champions, and it's so important at every level to build that network. And we have, to your point, Hannah, there's so much potential here for data-drivenness across the globe, and there's so much potential for women. One of the things I also learned recently, and I wanted to share this with you 'cause I'm not sure if you know this: ChatGPT, exploding, I was on it yesterday looking at- >> Everyone talking about it. >> What's hot in data science? And it was kind of like, and I actually asked it, what was hot in data science in 2023? And it told me that it didn't know anything after 2021.
Overall hiring has rebounded in '22 compared to pre-pandemic levels. And we also see 51% more women being hired in '22 than in '21. So the data, it's all about data, is showing us things are progressing, quite slowly. But one of the biggest challenges that's still persistent is attrition. So we were talking about, Hannah, what would your advice be? How would you help a woman stay in tech? We saw that attrition last year, in '22, according to AnitaB.org, more than doubled. So we're seeing women getting into the field and dropping out for various reasons. And so that's still an ongoing concern that we have. What do you think would motivate you to stick around if you were in a technical role? Same question for you in a minute. >> Right, you were talking about how we see an increase especially at the intern level for women. And I think this is a great starting point for pushing the momentum, pushing the needle rightwards. But if we can see more of an increase at the upper levels, in women's representation at the upper levels too, that's definitely a big goal and something we should work towards. >> Lisa: Absolutely. >> And if there's more representation up in the CTO position, at the managing level, I think that will definitely be a great factor in keeping women in data science. >> I was looking at some trends, sorry, Hannah, forgetting what this source was, so forgive me, that were showing a trend in the last few years, I think it was Fast Company, of more women in executive positions, specifically chief operating officer positions. What that hasn't translated to, what they thought it might translate to, is more women going from COO to CEO, and we're not seeing that. We think of, if you ask, name a female executive that you'd recognize, everyone would probably say Sheryl Sandberg.
But I was shocked to learn the other day, at a Women in Tech event I was doing, that there was a survey done by this organization that showed that 78% of people couldn't identify a female role model in technology. So to your point, we need more of them in that visible role, in the executive suite. >> Tracy: Exactly. >> And there's data that shows that companies across industries that have women in leadership positions, executive positions I should say, are actually more profitable. So it's kind of like, duh, the data is there, it's telling you this. >> Hannah: Exactly. >> Right? >> And I think also a very important point is work culture and the work environment. And as a woman, maybe if you enter and you work two or three years, then you oftentimes have to choose, okay, do I want family or do I want my job? And I think that's one of the major tasks that companies face, to make it possible for women to combine being a mother and being a great data scientist or an executive or CEO. And I think there's still a lot to be done in this regard to make it possible for women to not have to choose one thing or the other. And I think that's also a reason why we might see more women at the entry level, but not long-term, because they are penalized if they take a couple of years off to have kids. >> I think that's a question we need to ask men too. >> Absolutely. >> How to balance work and life. 'Cause we never ask that. We just ask the woman. >> No, they just get it done, probably because there's a woman on the other end who's making it happen. >> Exactly. So yeah, another thing to think about, another thing to work towards. >> Yeah, it's a good point you're raising that we have this conversation together, and not exclusively only women, but that we all come together and talk about how we can design companies in a way that works for everyone. >> Yeah, and no slight to men at all. A lot of my mentors and sponsors are men.
They're just people that I greatly admire who saw raw potential in me 15, 18 years ago, and just added a little water to this little weed and it started to grow. In fact, theCUBE- >> Tracy: And look at you now. >> Look at me now. And theCUBE, the guys Dave Vellante and John Furrier, are two of those people that are sponsors of mine. But it needs to be diverse. It needs to be diverse in gender, it needs to include non-binary people, anybody, it shouldn't matter. We should be able to collectively work together to solve big problems. Like the propaganda problem that was being discussed in the keynote this morning with respect to China, or climate change. Climate change is a huge challenge. Here we are in California, we're getting an atmospheric river tomorrow. And Californians and rain, we're not so friendly. But we know that there's massive change going on in the climate. Data science can help really unlock a lot of the challenges and solve some of the problems and help us understand better. So there's so much real-world implication potential that being data-driven can really lead to. And I love the fact that you guys are studying data journalism. You'll have to help me understand that even more. But we're going to have great conversations today, I'm so excited to be co-hosting with both of you. You're going to be inspired, you're going to learn, and they're going to learn from us as well. So let's just kind of think of this as a community of men, women, and everything in between to really help inspire the current generations, the future generations. And to your point, let's help women feel confident to be able to stay and raise their hand for fast-tracking their careers. >> Exactly. >> What are you guys, last minute, looking forward to most for today? >> Just meeting these great women, I can't wait. >> Yeah, learning from each other.
Having this conversation about how we can make data science even more equitable and hear the great ideas that all these women have. >> Excellent, girls, we're going to have a great day. We're so glad that you're here with us on theCUBE, live at Stanford University, Women in Data Science, the eighth annual conference. I'm Lisa Martin, my two co-hosts for the day, Tracy Zhang, Hannah Freitag, you're going to be seeing a lot of us, we appreciate it. Stick around, our first guest joins Hannah and me in just a minute. (ambient music)
Lena Smart & Tara Hernandez, MongoDB | International Women's Day
(upbeat music) >> Hello and welcome to theCube's coverage of International Women's Day. I'm John Furrier, your host of "theCUBE." We've got two great remote guests coming into our Palo Alto Studios, some tech athletes, as we say, people that've been in the trenches, years of experience, Lena Smart, CISO at MongoDB, Cube alumni, and Tara Hernandez, VP of Developer Productivity at MongoDB as well. Thanks for coming in to this program and supporting our efforts today. Thanks so much. >> Thanks for having us. >> Yeah, everyone talk about the journey in tech, where it all started. Before we get there, talk about what you guys are doing at MongoDB specifically. MongoDB has kind of gone to the next level as a platform. You have your own ecosystem, a lot of developers, very technical crowd, but it's changing the business transformation. What do you guys do at Mongo? We'll start with you, Lena. >> So I'm the CISO, so all security goes through me. I like to say, well, I don't like to say, I'm described as the one throat to choke. So anything to do with security basically starts and ends with me. We do have a fantastic Cloud engineering security team and a product security team, and they don't report directly to me, but obviously we have very close relationships. I like to keep that kind of church and state separate and I know I've spoken about that before. And we just recently set up a physical security team with an amazing gentleman who left the FBI and he came to join us after 26 years at the agency. So, really starting to look at the physical aspects of what we offer as well.
Developer productivity is kind of the latest description for things that we've described over the years as, you know, DevOps oriented engineering or platform engineering or build and release engineering, development infrastructure. It's all part and parcel, which is how do we actually get our code from developer to customer, you know, and all the mechanics that go into that. It's been something I discovered from my first job way back in the early '90s at Borland. And the art has just evolved enormously ever since, so. >> Yeah, this is a great conversation, both of you guys, right in the middle of all the action, with data infrastructure changing, exploding, and evolving big time, AI and data tsunami, and security never stops. Well, let's get into, we'll talk about that later, but let's get into what motivated you guys to pursue a career in tech and what were some of the challenges that you faced along the way? >> I'll go first. The fact of the matter was I intended to be a double major in history and literature when I went off to university, but I was informed that I had to do a math or a science degree or else the university would not be paid for. At the time, UC Santa Cruz had a policy called Open Access Computing. This is, you know, the late '80s, early '90s. And anybody at the university could get an email account and that was unusual at the time if you were, those of us who remember, you used to have to pay for that CompuServe or AOL or, there's another one, I forget what it was called, but any student at Santa Cruz could have an email account. And because of that email account, I met people who were computer science majors and I'm like, "Okay, I'll try that." That seems good. And it was a little bit of a struggle for me, a lot, I won't lie, but I can't complain with how it ended up. And certainly once I found my niche, which was development infrastructure, I found my true love and I've been doing it for almost 30 years now. >> Awesome. Great story.
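The developer-to-customer mechanics Tara describes, getting code built, tested, packaged, and shipped, can be sketched as a staged pipeline. This is an illustrative toy only, not MongoDB's actual tooling; the `Pipeline` class, stage names, and lambdas are all hypothetical:

```python
# Illustrative only: a toy model of the developer-to-customer flow
# (build -> test -> package -> ship). Names here are hypothetical,
# not any real build system's API.

class Pipeline:
    def __init__(self):
        self.stages = []  # ordered (name, fn) pairs

    def stage(self, name, fn):
        """Append a stage; return self so stages can be chained."""
        self.stages.append((name, fn))
        return self

    def run(self, artifact):
        """Pass the artifact through every stage, recording the order."""
        log = []
        for name, fn in self.stages:
            artifact = fn(artifact)  # each stage transforms the artifact
            log.append(name)
        return artifact, log

pipeline = (Pipeline()
            .stage("build",   lambda src: {"binary": f"compiled({src})"})
            .stage("test",    lambda a: {**a, "tested": True})
            .stage("package", lambda a: {**a, "image": "app:1.0"})
            .stage("ship",    lambda a: {**a, "deployed": True}))

artifact, log = pipeline.run("main.py")
print(log)                   # ['build', 'test', 'package', 'ship']
print(artifact["deployed"])  # True
```

The point of the sketch is the shape, not the implementation: each stage is a small, replaceable transformation, which is what lets build-and-release tooling evolve from floppy disks to container images without the overall flow changing.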
Can't wait to ask a few questions on that. We'll go back to that late '80s, early '90s. Lena, your journey, how you got into it. >> So slightly different start. I did not go to university. I had to leave school when I was 16, got a job, had to help support my family. Worked a bunch of various jobs till I was about 21 and then computers became more, I think, I wouldn't say they were ubiquitous, but they were certainly out there. And I'd also been saving up every penny I could earn to buy my own computer and bought an Amstrad 1640, 20 meg hard drive. It rocked. And kind of took that apart, put it back together again, and thought there could be money in this. And so basically just teaching myself about computers in any job that I got. 'Cause most of my jobs were like clerical work and secretary at that point. But any job that had a computer in front of it, I would make it my business to go find the guy who did computing 'cause it was always a guy. And I would say, you know, I want to learn how these work. Like, you know, show me. And, you know, I would take my lunch hour and after work and anytime I could with these people and they were very kind with their time and I just kept learning, so yep. >> Yeah, those early days remind me of the inflection point we're going through now. This major sea change coming. Back then, if you had a computer, you had to kind of be your own internal engineer to fix things. Remember back on the systems revolution, late '80s, Tara, when, you know, your career started, those were major inflection points. Now we're seeing a similar wave right now, security, infrastructure. It feels like it's going to a whole nother level. At Mongo, you guys certainly see this as well, with this AI surge coming in. A lot more action is coming in. And so there's a lot of parallels between these inflection points. How do you guys see this next wave of change? Obviously, the AI stuff's blowing everyone away. Oh, new user interface.
It's been called the browser moment, the mobile iPhone moment, kind of for this generation. There's a lot of people out there who are watching that are young in their careers, what's your take on this? How would you talk to those folks around how important this wave is? >> It, you know, it's funny, I've been having this conversation quite a bit recently in part because, you know, to me AI in a lot of ways is very similar to, you know, back in the '90s when we were talking about bringing in the worldwide web to the forefront of the world, right. And we tended to think in terms of all the optimistic benefits that would come of it. You know, free passing of information, availability to anyone, anywhere. You just needed an internet connection, which back then of course meant a modem. >> John: Not everyone had though. >> Exactly. But what we found in the subsequent years is that human beings are what they are and we bring ourselves to whatever platforms that are there, right. And so, you know, as much as it was amazing to have this freely available HTML based internet experience, it also meant that the negatives came to the forefront quite quickly. And there were ramifications of that. And so to me, when I look at AI, we're already seeing the ramifications to that. Yes, are there these amazing, optimistic, wonderful things that can be done? Yes. >> Yeah. >> But we're also human and the bad stuff's going to come out too. And how do we- >> Yeah. >> How do we as an industry, as a community, you know, understand and mitigate those ramifications so that we can benefit more from the positive than the negative. So it is interesting that it comes kind of full circle in really interesting ways. >> Yeah. The underbelly takes place first, gets it in the early adopter mode. Normally industries with, you know, money involved arbitrage, no standards. But we've seen this movie before. Is there hope, Lena, that we can have a more secure environment? >> I would hope so. 
(Lena laughs) Although depressingly, we've been in this for 30 years now and we're, at the end of the day, still telling people not to click links on emails. So yeah, that kind of still keeps me awake at night a wee bit. The whole thing about AI, I mean, it's, obviously I am not an expert by any stretch of the imagination in AI. I did read (indistinct) book recently about AI and that was kind of interesting. And I'm just trying to teach myself as much as I can about it to the extent of even buying the "Dummies Guide to AI." Just because, it's actually not a dummies guide. It's actually fairly interesting, but I'm always thinking about it from a security standpoint. So it's kind of my worst nightmare and the best thing that could ever happen in the same dream. You know, you've got this technology where I can ask it a question and you know, it spits out generally a reasonable answer. And my team are working with Mark Porter, our CTO, and his team on almost like an incubation of AI. Like, what would it look like from MongoDB? What's the legal ramifications? 'Cause there will be legal ramifications even though it's the wild, wild west just now, I think. Regulation's going to catch up to us pretty quickly, I would think. >> John: Yeah, yeah. >> And so I think, you know, as long as companies have a seat at the table and governments perhaps don't become too dictatorial over this, then hopefully we'll be in a good place. But we'll see. I think it's really interesting, there's that curse: we're living in interesting times. I think that's where we are. >> It's interesting just to stay on this tech trend for a minute. The standards bodies are different now. Back in the old days there were, you know, IEEE standards, IETF standards. >> Tara: TPC. >>
I mean, now you're seeing open source completely different where it was in the '90s to here beginning, that was gen one, some say gen two, but I say gen one, now we're exploding with open source. You have kind of developers setting the standards. If developers like it in droves, it becomes defacto, which then kind of rolls into implementation. >> Yeah, I mean I think if you don't have developer input, and this is why I love working with Tara and her team so much is 'cause they get it. If we don't have input from developers, it's not going to get used. There's going to be ways of of working around it, especially when it comes to security. If they don't, you know, if you're a developer and you're sat at your screen and you don't want to do that particular thing, you're going to find a way around it. You're a smart person. >> Yeah. >> So. >> Developers on the front lines now versus, even back in the '90s, they're like, "Okay, consider the dev's, got a QA team." Everything was Waterfall, now it's Cloud, and developers are on the front lines of everything. Tara, I mean, this is where the standards are being met. What's your reaction to that? >> Well, I think it's outstanding. I mean, you know, like I was at Netscape and part of the crowd that released the browser as open source and we founded mozilla.org, right. And that was, you know, in many ways kind of the birth of the modern open source movement beyond what we used to have, what was basically free software foundation was sort of the only game in town. And I think it is so incredibly valuable. I want to emphasize, you know, and pile onto what Lena was saying, it's not just that the developers are having input on a sort of company by company basis. Open source to me is like a checks and balance, where it allows us as a broader community to be able to agree on and enforce certain standards in order to try and keep the technology platforms as accessible as possible. I think Kubernetes is a great example of that, right. 
If we didn't have Kubernetes, that would've really changed the nature of how we think about container orchestration. But even before that, Linux, right. Linux allowed us as an industry to end the Unix Wars and as someone who was on the front lines of that as well and having to support 42 different operating systems with our product, you know, that was a huge win. And it allowed us to stop arguing about operating systems and start arguing about software or not arguing, but developing it in positive ways. So with, you know, with Kubernetes, with container orchestration, we all agree, okay, that's just how we're going to orchestrate. Now we can build up this huge ecosystem, everybody gets taken along, right. And now it changes the game for what we're defining as business differentials, right. And so when we talk about crypto, that's a little bit harder, but certainly with AI, right, you know, what are the checks and balances that as an industry and as the developers around this, that we can, you know, enforce to make sure that no one company or no one body is able to overly control how these things are managed, how it's defined. And I think that is only for the benefit of the industry as a whole, particularly when we think about the only other option is it gets regulated in ways that do not involve the people who actually know the details of what they're talking about. >> Regulated and/or thrown away or bankrupt or- >> Driven underground. >> Yeah. >> Which would be even worse actually.
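The agreement Tara describes, "okay, that's just how we're going to orchestrate," rests on one core idea: desired-state reconciliation. You declare what you want running, and the orchestrator computes the actions needed to get there. Here is a toy sketch of that idea in plain Python; it is not Kubernetes' real API or data model, and the function and action names are made up for illustration:

```python
# Illustrative only: the desired-state reconciliation loop at the heart
# of container orchestrators like Kubernetes, reduced to a toy.
# `desired` and `actual` map a workload name to a replica count.

def reconcile(desired, actual):
    """Return the actions needed to move `actual` toward `desired`."""
    actions = []
    for name, want in desired.items():
        have = actual.get(name, 0)
        if have < want:
            actions.append(("start", name, want - have))
        elif have > want:
            actions.append(("stop", name, have - want))
    for name in actual:
        if name not in desired:  # workloads no longer wanted at all
            actions.append(("stop", name, actual[name]))
    return actions

actions = reconcile({"web": 3, "db": 1}, {"web": 1, "cache": 2})
print(actions)  # [('start', 'web', 2), ('start', 'db', 1), ('stop', 'cache', 2)]
```

The design point is that the operator never scripts imperative steps; they only edit the desired state, and the loop converges the cluster toward it, which is what made a shared orchestration standard possible across vendors.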
You're starting to see these videos really have impact because there are a lot more leaders now at the table in companies developing software systems and with AI, the aperture increases for applications. And this is the new dynamic. What's your guys view on this dynamic? How does this go forward in a positive way? Is there a certain trajectory you see? For women in the industry? >> I mean, I think some of the states are trying to, again, from the government angle, some of the states are trying to force women into the boardroom, for example, California, which can be no bad thing, but I don't know, sometimes I feel a bit iffy about all this kind of forced- >> John: Yeah. >> You know, making, I don't even know how to say it properly so you can cut this part of the interview. (John laughs) >> Tara: Well, and I think that they're >> I'll say it's not organic. >> No, and I think they're already pulling it out, right. It's already been challenged so they're in the process- >> Well, this is the open source angle, Tara, you are getting at it. The change agent is open, right? So to me, the history of the proven model is openness drives transparency drives progress. >> No, it's- >> If you believe that to be true, this could have another impact. >> Yeah, it's so interesting, right. Because if you look at McKinsey Consulting or Boston Consulting or some of the other, I'm blocking on all of the names. There has been a decade or more of research that shows that a non homogeneous employee base, be it gender or ethnicity or whatever, generates more revenue, right? There's dollar signs that can be attached to this, but it's not enough for all companies to want to invest in that way. And it's not enough for all, you know, venture firms or investment firms to grant that seed money or do those seed rounds. I think it's getting better very slowly, but socialization is a much harder thing to overcome over time. 
Particularly, when you're not just talking about one country like the United States in our case, but around the world. You know, tech centers now exist all over the world, including places that even 10 years ago we might not have expected like Nairobi, right. Which I think is amazing, but you have to factor in the cultural implications of that as well, right. So yes, the openness is important and we have, it's important that we have those voices, but I don't think it's a panacea solution, right. It's just one more piece. I think honestly that one of the most important opportunities has been with Cloud computing and Cloud's been around for a while. So why would I say that? It's because if you think about like everybody holds up the Steve Jobs, Steve Wozniak, back in the '70s, or Sergey and Larry for Google, you know, you had to have access to enough credit card limit to go to Fry's and buy your servers and then access to somebody like Susan Wojcicki to borrow the garage or whatever. But there was still a certain amount of upfrontness that you had to be able to commit to, whereas now, and we've, I think, seen really good evidence of this, being able to lease server resources by the second and have development platforms that you can do on your phone. I mean, for a while in Africa, I think, the majority of development happened on mobile devices because there wasn't a sufficient supply chain of laptops yet. And that's no longer true now as far as I know. But like the power that that enables for people who would otherwise be underrepresented in our industry instantly opens it up, right? And so to me that's I think probably the biggest opportunity that we've seen from an industry on how to make more availability in underrepresented representation for entrepreneurship.
A lot of people commenting on the biases inherently built into the large language models are also a problem. Lena, I want you to weigh in on this too, because I think the skills question comes up here and I've been advocating that you don't need the pedigree, college pedigree, to get into certain jobs, you mentioned Cloud computing. I mean, it's been around for what you'd think is a long time, but not really, if you really think about it. The ability to level up, okay, if you're going to join something new and half the jobs in cybersecurity are created in the past year, right? So, you have this what used to be a barrier, your degree, your pedigree, your certification would take years, would be a blocker. Now that's gone. >> Lena: Yeah, it's the opposite. >> That's, in fact, psychology. >> I think so, but the people who I, by and large, who I interview for jobs, they have, I think security people and also I work with our compliance folks and I can't forget them, but let's talk about security just now. I've always found a particular kind of mindset with security folks. We're very curious, not very good at following rules a lot of the time, and we'd love to teach others. I mean, that's been one of the big things from the start of my career. People were always interested in teaching and I was interested in learning. So it was perfect. And I think also having, you know, strong women leaders at MongoDB allows other underrepresented groups to actually apply to the company 'cause they see that we're kind of talking the talk. And that's been important. I think it's really important. You know, you've got Tara and I on here today. There's obviously other senior women at MongoDB that you can talk to as well. There's a bunch of us. There's not a whole ton of us, but there's a bunch of us. And it's good. It's definitely growing. I've been there for four years now and I've seen a growth in women in senior leadership positions.
And I think having that kind of track record of getting really good quality underrepresented candidates to not just interview, but come and join us, it's seen. And it's seen in the industry and people take notice and they're like, "Oh, okay, well if that person's working, you know, if Tara Hernandez is working there, I'm going to apply for that." And that in itself I think can really, you know, reap the rewards. But it's getting started. It's like how do you get your first strong female into that position or your first strong underrepresented person into that position? It's hard. I get it. If it was easy, we would've solved it already. >> It's like anything. I want to see people like me, my friends in there. Am I going to be alone? Am I going to be part of a group? It's a group psychology. Why wouldn't? So getting it out there is key. Are there skills that you think people should pay attention to? Ones that come up are curiosity, learning. What are some of the best practices for folks trying to get into the tech field or that are in the tech field and advancing through? What advice are you guys- >> I mean, yeah, definitely, what I say to my team is within my budget, we try and give everyone at least one training course a year. And there's so much free stuff out there as well. But, you know, keep learning. And even if it's not right in your wheelhouse, don't be picky about it. Don't, you know, take a look at what else could be out there that could interest you and then go for it. You know, what does it take, a few minutes each night, to read a book on something that might change your entire career? You know, be enthusiastic about the opportunities out there. And there's so many opportunities in security. Just so many. >> Tara, what's your advice for folks out there? Tons of stuff to taste, taste test, try things. >> Absolutely. I mean, I always say, you know, my primary qualifications for people, I'm looking for them to be smart and motivated, right.
Because the industry changes so quickly. What we're doing now versus what we did even last year versus five years ago, you know, is completely different though the themes are certainly the same. You know, we still have to code and we still have to compile that code or package the code and ship the code so, you know, how well can we adapt to these new things instead of creating floppy disks, which was my first job. Five-and-a-quarters, even. The big ones. >> That's old school, OG. There it is. Well done. >> And now it's, you know, containers, you know, (indistinct) image containers. And so, you know, I've gotten a lot of really great success hiring boot campers, you know, career transitioners. Because they bring a lot of experience in addition to the technical skills. I think the most important thing is to experiment and figure out what you like, because, you know, maybe you are really into security or maybe you're really into like deep level coding and you want to go back, you know, try to go to school to get a degree where you would actually want that level of learning. Or maybe you're a front end engineer, you want to be full stack. Like there's so many different things, data science, right. Maybe you want to go learn R, right. You know, I think it's like figure out what you like because once you find that, that in turn is going to energize you 'cause you're going to feel motivated. I think the worst thing you could do is try to force yourself to learn something that you really could not care less about. That's just the worst. You're going in handicapped. >> Yeah and there's choices now versus when we were breaking into the business. It was like, okay, you're a software engineer. They call it software engineering, that's all it was. You were that or you were in sales. Like, you know, some sort of systems engineer or sales and now it's,- >> I had never heard of my job when I was in school, right. I didn't even know it was a possibility.
But there's so many different types of technical roles, you know, absolutely. >> It's so exciting. I wish I was young again. >> One of the- >> Me too. (Lena laughs) >> I don't. I like the age I am. So one of the things that I did to kind of harness that curiosity is we've set up a security champions program. About 120, I guess, volunteers globally. And these are people from all different backgrounds and all genders, diversity groups, underrepresented groups, we feel are now represented within this champions program. And people basically give up about an hour or two of their time each week, with their supervisor's permission, and we basically teach them different things about security. And we've now had seven full-time people move from different areas within MongoDB into my team as a result of that program. So, you know, monetarily and time, yeah, saved us both. But also we're showing people that there is a path, you know, if you start off in Tara's team, for example, doing X, you join the champions program, you're like, "You know, I'd really like to get into red teaming. That would be so cool." If it fits, then we make that happen. And that has been really important for me, especially to give, you know, the women and the underrepresented groups within MongoDB just that window into something they might never have seen otherwise. >> That's a great point, fit matters. Also, getting access to where you fit is also access to either mentoring or sponsorship or some sort of, at least some navigation. Like what's out there and not being afraid to like, you know, just ask. >> Yeah, we just actually kicked off our big mentor program last week, so I'm the executive sponsor of that. I know Tara is part of it, which is fantastic. >> We'll put a plug in for it. Go ahead. >> Yeah, no, it's amazing.
There's, gosh, I don't even know the numbers anymore, but there's a lot of people involved in this and so much so that we've had to set up mentoring groups rather than one-on-one. And I think it was 45% of the mentors are actually male, which is quite incredible for a program called Mentor Her. And then what we want to do in the future is actually create a program called Mentor Them so that it's not, you know, just focused on women, and so that we can have other groups represented and, you know, kind of break down those groups a wee bit more and have some more granularity in the offering. >> Tara, talk about mentoring and sponsorship. Open source has been there for a long time. People help each other. It's community-oriented. What's your view of how to work with mentors and sponsors if someone's moving through the ranks? >> You know, one of the things that was really interesting, unfortunately, in some of the earliest open source communities is there was a lot of pervasive misogyny to be perfectly honest. >> Yeah. >> And one of the important adaptations that we made as an open source community was the introduction of codes of conduct. And so when I'm talking to women who are thinking about expanding their skills, I encourage them to join open source communities to have opportunity, even if they're not getting paid for it, you know, to develop their skills to work with people to get those code reviews, right. I'm like, "Whatever you join, make sure they have a code of conduct and a good leadership team. It's very important." And there are plenty, right. And then that idea has come into, you know, conferences now. So now conferences have codes of conduct, if they're any good, and maybe not all of them, but most of them, right. And the ideas of expanding that idea of intentional healthy culture. >> John: Yeah.
I mean, I won't lie, when I was recruited to come to MongoDB, the culture that I was able to discern through talking to people, in addition to seeing that there was actually women in senior leadership roles like Lena, like Kayla Nelson, that was a huge win. And so it just builds on momentum. And so now, you know, those of us who are in that are now representing. And so that kind of reinforces, but it's all ties together, right. As the open source world goes, particularly for a company like MongoDB, which has an open source product, you know, and our community builds. You know, it's a good thing to be mindful of for us, how we interact with the community and you know, because that could also become an opportunity for recruiting. >> John: Yeah. >> Right. So we, in addition to people who might become advocates on Mongo's behalf in their own company as a solution for themselves, so. >> You guys had great successful company and great leadership there. I mean, I can't tell you how many times someone's told me "MongoDB doesn't scale. It's going to be dead next year." I mean, I was going back 10 years. It's like, just keeps getting better and better. You guys do a great job. So it's so fun to see the success of developers. Really appreciate you guys coming on the program. Final question, what are you guys excited about to end the segment? We'll give you guys the last word. Lena will start with you and Tara, you can wrap us up. What are you excited about? >> I'm excited to see what this year brings. I think with ChatGPT and its copycats, I think it'll be a very interesting year when it comes to AI and always in the lookout for the authentic deep fakes that we see coming out. So just trying to make people aware that this is a real thing. It's not just pretend. And then of course, our old friend ransomware, let's see where that's going to go. >> John: Yeah. >> And let's see where we get to and just genuine hygiene and housekeeping when it comes to security. >> Excellent. 
Tara. >> Ah, well for us, you know, we're constantly trying to up our game from a security perspective in the software development life cycle. But also, you know, what can we do? You know, one interesting application of AI that maybe Google doesn't like to talk about is that it's really cool as an addendum to search, and, you know, how we might incorporate that as far as our learning environment and developer productivity, and how can we enable our developers to be more efficient and productive in their day-to-day work. So, I don't know, there's all kinds of opportunities that we're looking at for how we might improve that process here at MongoDB and then maybe be able to share it with the world. One of the things I love about working at MongoDB is we get to use our own products, right. And so being able to have this interesting document database in order to put information in, and then maybe apply some sort of AI to get it out again, is something that we may well be looking at, if not this year, then certainly in the coming year. >> Awesome. Lena Smart, the chief information security officer, and Tara Hernandez, vice president of developer productivity, from MongoDB. Thank you so much for sharing here on International Women's Day. We're going to do this every year, and then we're going to do quarterly updates. Thank you so much for being part of this program. >> Thank you. >> Thanks for having us. >> Okay, this is theCUBE's coverage of International Women's Day. I'm John Furrier, your host. Thanks for watching. (upbeat music)
Adam Wenchel, Arthur.ai | CUBE Conversation
(bright upbeat music) >> Hello and welcome to this CUBE Conversation. I'm John Furrier, host of theCUBE. We've got a great conversation featuring Arthur AI, and I'm excited to have Adam Wenchel, who's the Co-Founder and CEO. Thanks for joining us today, appreciate it. >> Yeah, thanks for having me on, John, looking forward to the conversation. >> I got to say, it's been an exciting world in AI, or artificial intelligence. Just an explosion of interest, kind of in the mainstream, with the language models, which people don't really get, but they're seeing the benefits of some of the hype around OpenAI. Which kind of wakes everyone up to, "Oh, I get it now." And then of course the pessimism comes in, all the skeptics are out there. But this breakthrough in the generative AI field is just awesome, it's really a shift, it's a wave. We've been calling it probably the biggest inflection point, bigger than the others combined, in what this can do from a surge standpoint, applications. I mean, all aspects of what we used to know as the computing industry, software industry, hardware, are completely going to get turbocharged. So we're totally, obviously, bullish on this thing. So, this is really interesting. So my first question is, I got to ask you, what's your take? 'Cause you've been doing this, you're in it, and now all of a sudden you're at the beach where the big waves are. Where's the explosion of interest coming from? What are you seeing right now? >> Yeah, I mean, it's amazing, so for starters, I've been in AI for over 20 years, and just seeing this amount of excitement and growth, and like you said, the inflection point we've hit in the last six months, has just been amazing. And, you know, what we're seeing is people are getting applications into production using LLMs.
I mean, really all this excitement just started a few months ago with ChatGPT and other breakthroughs, and the amount of activity and the amount of new systems that we're seeing hitting production already, so soon after that, is just unlike anything we've ever seen. So it's pretty awesome. And, you know, these language models can be applied in so many different business contexts, and the amount of value that's being created is, again, unprecedented compared to anything. >> Adam, you know, you've been in this for a while, so it's an interesting point you're bringing up, and this is a good point. I was talking with my friend John Markoff, former New York Times journalist, and he was talking about how there's been a lot of work done on ethics. So it's not like it's new. There's a lot of stuff that's been baking over many, many years and, you know, decades. So now everyone wakes up in the season, so I think that is a key point I want to get into with some of your observations. But before we get into it, I want you to explain for the folks watching, just so we can kind of get a definition on the record: What's an LLM, what's a foundation model, and what's generative AI? Can you just quickly explain the three things there? >> Yeah, absolutely. So an LLM, or a large language model, is just, as the name would imply, a large language model that's been trained on a huge amount of data, typically pulled from the internet. And it's a general purpose language model that can be built on top of for all sorts of different things, and that includes traditional NLP tasks like document classification and sentiment understanding. But the thing that's gotten people really excited is its use for generative tasks. So, you know, asking it to summarize documents or asking it to answer questions.
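The "build on top of" pattern Adam is describing often amounts to little more than framing a task as a prompt to a general-purpose model. Here's a minimal sketch; the `complete` function is a hypothetical stand-in for a hosted model endpoint, not any specific vendor's API:

```python
# Minimal sketch of a generative task (summarization) built on top of a
# general-purpose LLM. `complete` is a hypothetical stand-in: a real
# implementation would call a hosted model's API over HTTPS instead of
# returning canned text.

def complete(prompt: str) -> str:
    """Stand-in for an LLM completion endpoint."""
    return "Arthur helps teams monitor AI models running in production."

def summarize(document: str, max_words: int = 25) -> str:
    """Frame summarization as a prompt against the general-purpose model."""
    prompt = (
        f"Summarize the following document in at most {max_words} words:\n\n"
        f"{document}"
    )
    return complete(prompt)

print(summarize("Arthur is a platform for monitoring deployed AI systems..."))
```

The other tasks mentioned here, question answering, classification, translation, follow the same shape; only the prompt framing changes, which is why one pre-trained model can serve so many uses.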
And these aren't new techniques, they've been around for a while, but what's changed is just this new class of models that's based on new architectures. They're just so much more capable that they've gone from sort of science projects to something that's actually incredibly useful in the real world. And there's a number of companies that are making them accessible to everyone so that you can build on top of them. So that's the other big thing: this kind of access to models that can power generative tasks has been democratized in the last few months, and it's just opening up all these new possibilities. And then the third one you mentioned, foundation models, is sort of a broader term for the category that includes LLMs, but it's not just language models that are included. We've actually seen this for a while in the computer vision world. People have been building on top of pre-trained computer vision models for a while, for image classification, object detection; that's something we've had customers doing for three or four years already. And so, you know, like you said, there are antecedents to everything that's happened; it's not entirely new, but it does feel like a step change. >> Yeah, I did ask ChatGPT to give me a riveting introduction to you and it gave me an interesting read. If we have time, I'll read it. It's kind of fun, you'll get a kick out of it. "Ladies and gentlemen, today we're privileged to have Adam Wenchel, Founder of Arthur, who's going to talk about the exciting world of artificial intelligence." And then it goes on with some really riveting sentences. So if we have time, I'll share that, it's kind of funny. It was good. >> Okay.
>> So anyway, this is what people see, and this is why I think it's exciting, 'cause I think people are going to start refactoring what they do. And I've been saying this on theCUBE now for about a couple of months: you know, there's a scene in "Moneyball" where Billy Beane sits down with the Red Sox owner, and the Red Sox owner says, "If people aren't rebuilding their teams on your model, they're going to be dinosaurs." And it reminds me of what's happening right now. And I think everyone that I talk to in the business sphere is looking at this, and they're connecting the dots and saying, if they don't rebuild their business with this new wave, they're going to be out of business, because there's so much efficiency, there's so much automation; not like DevOps automation, but like the generative tasks that will free up the intellect of people. Just simple things like, do an intro, or do this for me, write some code, write a countermeasure to a hack. I mean, this is kind of what people are doing. And you mentioned computer vision, again, another huge field where 5G things are coming on; it's going to accelerate. What do you say to people when they're kind of leaning towards that "I need to rethink my business" realization? >> Yeah, it's 100% accurate, and what's been amazing to watch over the last few months is the speed at which, and the urgency with which, companies like Microsoft and Google and others are actually racing to do that rethinking of their business. And you know, those teams, those companies, which are large and haven't always been the fastest moving, are working around the clock. And the pace at which they're rolling out LLMs across their suite of products is just phenomenal to watch.
And it's not just the large tech companies as well; I mean, we're seeing a number of startups, like every week a couple of new startups get in touch with us for help with their LLMs, and you know, there's just a huge amount of venture capital flowing into it right now, because everyone realizes the opportunities for transforming legal and healthcare and content creation in all these different areas are just wide open. And so there's a massive gold rush going on right now, which is amazing. >> And the cloud scale, obviously, the horizontal scalability of the cloud, brings us to another level. We've been seeing data infrastructure since the Hadoop days when big data was coined. Now you're seeing that kind of bear fruit; now you have vertical specialization where data shines, and large language models are set up perfectly for this piece. And you know, as you mentioned, you've been doing it for a long time. Let's take a step back, and I want to get into how you started the company, what drove you to start it. Because you know, as an entrepreneur, you probably saw this opportunity before other people, like, "Hey, this is finally it, it's here." Can you share the origination story, what you guys came up with, how you started it, what was the motivation? Take us through it. >> Yeah, absolutely. So as I mentioned, I've been doing AI for many years. I started my career at DARPA, but it wasn't really until 2015, 2016, when my previous company was acquired by Capital One. Then I started working there, and shortly after I joined, I was asked to start their AI team and scale it up. And for the first time, I was actually doing it with production models that we were working with, and that was at scale, right? And so there were hundreds of millions of dollars of business revenue, and certainly a big group of customers, that were impacted by the way these models acted.
And so it got me hyper-aware of the issues that come up when you get models into production. I think people who are earlier in the AI maturity curve look at that as a finish line, but it's really just the beginning, and there's this constant drive to make them better, make sure they're not degrading, make sure you can explain what they're doing, and if they're impacting people, making sure they're not biased. And at that time, there really weren't any tools that existed to do this; there wasn't open source, there wasn't anything. And so after a few years there, I really started talking to other people in the industry, and there was a really clear theme that this needed to be addressed. And so I joined with my Co-Founder John Dickerson, who was on the faculty at the University of Maryland and had been doing a lot of research in these areas. And so we ended up joining up together and starting Arthur. >> Awesome. Well, let's get into what you guys do. Can you explain the value proposition? What are people using you for now? Where's the action? What do the customers look like? What do prospects look like? Obviously you mentioned production; this has been the theme. It's not like people woke up one day and said, "Hey, I'm going to put stuff into production." This has kind of been happening. There have been companies doing this at scale, and yet there's a whole follower wave coming in mainstream enterprises and businesses. So the early adopters are there now, in production. What do you guys do? I mean, 'cause I think about it like this: just driving the car off the lot is not the end of it, you've got to manage operations. I mean, that's a big thing. So what do you guys do? Talk about the value proposition and how you guys make money. >> Yeah, so what we do is, listen, it starts when you go to validate ahead of deploying these models in production, right?
So you want to make sure that if you're going to be upgrading a model, if you're going to be replacing one that's currently in production, you've proven that it's going to perform well, that it's going to perform ethically, and that you can explain what it's doing. And then when you launch it into production, traditionally data scientists would spend 25, 30% of their time just manually checking in on their model day-to-day, babysitting as we call it, just to make sure that the data hasn't drifted, the model performance hasn't degraded, that a programmer didn't make a change in an upstream data system. You know, there's all sorts of reasons why the world changes, and that can have a real adverse effect on these models. And so what we do is bring the same kind of automation that you have for other kinds of, let's say, infrastructure monitoring or application monitoring, and we bring that to your AI systems. And that way, if there ever is an issue, it's not weeks or months till you find it; you find it before it has an effect on your P&L and your balance sheet, which, too often before tools like Arthur, was the way issues were detected. >> You know, I was talking to Swami at Amazon, who I've known for a long time, 13 years, and who's been on theCUBE multiple times, and you know, I watched Amazon try to pick up that thing with SageMaker about six years ago, and so much has happened since then. And he and I were talking about this wave, and I kind of brought up this analogy to how, when cloud started, it was, hey, I don't need a data center. 'Cause when I did one of my startups at that time, my choice was to put a box in the colo and get all the configuration done before I could write one line of code. So the cloud became the benefit for that; you could stand up stuff quickly, and then it grew from there. Here it's kind of the same dynamic: you don't want to have to provision a large language model or do all this heavy lifting.
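The manual "babysitting" Adam described a moment ago, checking that the data hasn't drifted, is exactly the kind of thing the automation replaces. As a hedged illustration of one common drift check (not Arthur's actual implementation), here is the Population Stability Index over a single feature; the 0.2 alert threshold is a conventional rule of thumb:

```python
# Sketch of automated data-drift detection: compare the live distribution of
# one feature against a reference (training-time) window and alert when they
# diverge. Uses the Population Stability Index (PSI); bins are derived from
# the reference sample.
import math

def psi(reference, live, bins=10):
    """Population Stability Index between two samples of one feature."""
    lo, hi = min(reference), max(reference)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[-1] = float("inf")  # catch live values above the reference max

    def frac(sample, i):
        n = sum(1 for x in sample if edges[i] <= x < edges[i + 1])
        return max(n / len(sample), 1e-6)  # floor to avoid log(0)

    return sum(
        (frac(live, i) - frac(reference, i))
        * math.log(frac(live, i) / frac(reference, i))
        for i in range(bins)
    )

reference = [i / 100 for i in range(100)]      # training-time feature values
drifted = [0.5 + i / 200 for i in range(100)]  # production values, shifted up

score = psi(reference, drifted)
if score > 0.2:  # common industry threshold for "significant shift"
    print(f"ALERT: feature drift detected (PSI={score:.2f})")
```

In a monitoring platform this check would run on a schedule per feature, firing an alert instead of printing, so nobody has to eyeball distributions day-to-day.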
So now you're seeing companies come out there saying, you can get started faster, there's like a new way to get it going. So it's kind of the same vibe of limiting that heavy lifting. >> Absolutely. >> How do you look at that? Because this seems to be a wave that's going to be coming in; how do you guys help companies who are going to move quickly and start developing? >> Yeah, so I think in this kind of gold rush mentality, this race to get these models into production, we're starting to see more examples and evidence that there are a lot of risks that go along with it. Either your model says things, your system says things, that are just wrong, whether it's hallucination or just making things up; there's lots of examples, and if you go on Twitter and the news, you can read about those, as well as times when there could be toxic content coming out of things like that. And so there are a lot of risks there that you need to think about and be thoughtful about when you're deploying these systems. But you know, you need to balance that with the business imperative of getting these things into production and really transforming your business. And so that's where we help people. We say, go ahead, put them in production, but just make sure you have the right guardrails in place so that you can do it in a smart way that's going to reflect well on you and your company. >> Let's frame the challenge for the companies now. Obviously there are the people who are doing large-scale production, and then you have companies maybe as small as us, who have large linguistic databases or transcripts, for example, right? So what are customers doing, and why are they deploying AI right now? And is it a speed game, is it a cost game? Why have some companies been able to deploy AI at such faster rates than others? And what's a best practice to onboard new customers? >> Yeah, absolutely.
So I mean, across a bunch of different verticals, we're seeing leaders who have really started to solve this puzzle of getting AI models into production quickly and being able to iterate on them quickly. And I think those are the ones that realize that imperative you mentioned earlier, about how transformational this technology is. And you know, a lot of times, even the CEOs or the boards are very personally driving this sense of urgency around it. And so, you know, that creates a lot of movement, right? And so those companies have put in place really smart infrastructure and rails so that data scientists aren't encumbered by having to hunt down data and get access to it. They're not encumbered by having to stand up new platforms every time they want to deploy an AI system; that stuff is already in place. There's a really nice ecosystem of products out there, including Arthur, that you can tap into. Compared to five or six years ago, when I was building at a top 10 US bank, at that point you really had to build almost everything yourself, and that's not the case now. And so it's really nice to have things like, you know, you mentioned AWS SageMaker, and a whole host of other tools that can really accelerate things. >> What's your customer profile? Is it someone who already has a team, or can people who are learning just dial into the service? What's the persona? What's the pitch, if you will? How do you align with that customer value proposition? Do people have to be built out with a team and in play, or is it pre-production, or can you start with people who are just getting going?
>> Yeah, people do start using it pre-production for validation, but I think a lot of our customers do have a team going, and they're either close to putting something into production or about to. It's everything from large enterprises that have really complicated environments, with dozens of models running all over doing all sorts of use cases, to tech startups that are very focused on a single problem, where that's the lifeblood of the company, so they need to guarantee that it works well. And you know, we make it really easy to get started, especially if you're using one of the common model development platforms; you can just turnkey get going and make sure that you have a nice feedback loop. So then when your models are out there, it's pointing out areas where they're performing well and areas where they're performing less well, giving you that feedback so that you can make improvements, whether it's in training data or featurization work or algorithm selection. Depending on the symptoms, there's a number of things you can do to increase performance over time, and we help guide people on that journey. >> So Adam, I have to ask, since you have such a great customer base, and they're smart, and they've got teams, and you're on the front end, I mean, early adopters is kind of an overused word, but they're killing it. They're putting stuff into production; it's not like it's a test, it's not like it's early. So as the next wave of fast followers comes, how do you see that coming online? What's your vision for that? How do you see companies that are just waking up out of the, you know, freeze of old IT, to, okay, they've got cloud, but they're not yet there? What do you see in the market? I see you're on the front end now with the top people really nailing AI and working hard. What's the- >> Yeah, I think a lot of these tools are becoming, every year they get easier, more accessible, easier to use.
And so, you know, as the market broadens, it takes less and less of a lift to put these systems in place. And the thing is, every business is unique; they have their own kind of data, and so you can use these foundation models, which have just been trained on generic data. They're a great starting point, a great accelerant, but then, in most cases, you're either going to want to create a model or fine-tune a model using data that really comes from your particular customers, the people you serve, so that it really reflects that and takes that into account. And so I do think that the size of that market is expanding and broadening as these tools just become easier to use, and also as the knowledge about how to build these systems becomes more widespread. >> Talk about the customer base you have now. What's the makeup, what size are they? Give us a little taste of the customer base you've got there; what do they look like? I'll say Capital One, which we know very well from when you were there; they were large scale, lots of data, from fraud detection to all kinds of cool stuff. What do your customers look like now? >> Yeah, so we have a variety, but I would say one area we're really strong is that we have several of the top 10 US banks. That's not surprising, that's a strength for us, but we also have Fortune 100 customers in healthcare, in manufacturing, in retail, in semiconductors and electronics. So what we find is, in any of these major verticals, there's typically, you know, one, two, three companies that are really leading the charge, and in our opinion, those are the ones that for the next multiple decades are going to be the leaders, the ones that really lead the charge on this AI transformation. And so we're very fortunate to be working with some of those.
And then we have a number of startups as well, who we love working with just because they're really pushing the boundaries technologically, and so they provide great feedback and make sure that we're continuing to innovate and staying abreast of everything that's going on. >> You know, these early adopters, even when the hyperscalers were coming online, they had to build everything themselves. They're like the alphas out there building it. This is going to be a big wave again as that fast follower comes in. And so when you look at the scale, what advice would you give folks out there right now who want to tee it up, and what's your secret sauce that will help them get there? >> Yeah, I think the secret to teeing it up is to just dive in and start; there's not really a secret. I think it's amazing how accessible these are. I mean, there's all sorts of ways to access LLMs, either via API access or as downloadable models in some cases. And so, you know, go ahead and get started. And then our secret sauce really is the way that we provide that performance analysis of what's going on, right? So we can tell you in a very actionable way, like, hey, here's where your model is doing good things, here's where it's doing bad things. Here's something you want to take a look at, here are some potential remedies for it. We can help guide you through that. And that way, when you're putting it out there, A, you're avoiding a lot of the common pitfalls that people see, and B, you're able to really make it better in a much faster way with that tight feedback loop. >> It's interesting, we've been kind of riffing on this supercloud idea, because it was just a different name than multicloud, and you see apps like Snowflake built on top of AWS without even spending any CapEx; you just ride that cloud wave. This next AI, super AI wave is coming. I don't want to call it AIOps, because I think there's a different distinction.
MLOps and AIOps seem a little bit old, almost from a few years back. How do you view that? Because everyone's like, "Is this AIOps?" And it's like, "No, kind of, but not really." How would you, you know, when someone just shoots from the hip, "Hey Adam, aren't you doing AIOps?", do you say, yes we are, or do you say, yes, but we do it differently? Because it doesn't seem like it's the same old AIOps. What's your- >> Yeah, it's a good question. AIOps has been a term that was co-opted for other things, and MLOps is also a term people have used with different meanings. So I like the term AI infrastructure; I think it describes it really well and succinctly. >> But you guys are doing the ops. I mean, that's the kind of ironic thing; it's like the next level, it's like NextGen ops, but it's not; you don't want to be put in that bucket. >> Yeah, no, it's a very operationally focused platform that we have. I mean, it fires alerts, people can action off them. If you're familiar with the way people run security operations centers or network operations centers, we do that for data science, right? So think of it as a DSOC, a Data Science Operations Center, where all your models, you might have hundreds of models running across your organization, or you may have five, but as problems are detected, alerts can be fired and you can actually work the case, make sure they're resolved, escalate them as necessary. And so there is a very strong operational aspect to it, you're right. >> You know, one of the things I think is interesting, if you don't mind commenting on it, is that the aspect of scale is huge, and now you have scale and production. What's your reaction when people say, how does scale impact this? >> Yeah, scale is huge. I think, look, the highest-leverage business areas to apply these to are generally going to be the ones at the biggest scale, right?
And I think that's one of the advantages we have. Several of us come from enterprise backgrounds, and we're used to doing things enterprise-grade at scale. And so, you know, we're seeing more and more companies, I think they started out deploying AI in, sort of, important but not necessarily crown-jewel areas of their business, but now they're deploying AI right in the heart of things, and yeah, the scale that some of our companies are operating at is pretty impressive. >> John: Well, super exciting, great to have you on, and congratulations. I've got a final question for you, just a random one. What are you most excited about right now? Because I mean, you've got to be pretty pumped right now with the way the world is going, and again, I think this is just the beginning. What's your personal view? How do you feel right now? >> Yeah, the thing I'm really excited about for the next couple of years, and you touched on it a little bit earlier, is this sort of convergence of AI systems with businesses turning into AI-native businesses. As you get further along this transformation curve with AI, it turns out that the better the performance of your AI systems, the better the performance of your business, because these models are really starting to underpin all these key areas that cumulatively drive your P&L. And so one of the things we work on a lot with our customers is to just, you know, take these really esoteric data science notions of performance and tie them to their business KPIs, so that it really is kind of like the operating system for running your AI-native business. And we're starting to see more and more companies get farther along that maturity curve and start to think that way, which is really exciting. >> I love the AI native.
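Adam's point about tying "esoteric data science notions" to business KPIs can be pictured with a toy calculation; every segment name and dollar figure below is invented purely for illustration, not drawn from Arthur's product:

```python
# Toy sketch of translating model performance into a business KPI: estimate
# the revenue at risk from prediction errors, broken out per segment, so an
# abstract error rate becomes a dollar figure an executive can prioritize.
# Segments, error rates, and values are all hypothetical.

def revenue_at_risk(segments):
    """segments: dicts with an error rate, request volume, and value/request."""
    return {
        s["name"]: s["error_rate"] * s["volume"] * s["value_per_request"]
        for s in segments
    }

segments = [
    {"name": "card_fraud",  "error_rate": 0.02, "volume": 100_000, "value_per_request": 5.0},
    {"name": "loan_offers", "error_rate": 0.10, "volume": 20_000,  "value_per_request": 40.0},
]

report = revenue_at_risk(segments)
worst = max(report, key=report.get)
print(f"Highest-impact segment: {worst} (${report[worst]:,.0f} at risk)")
```

The interesting design choice is the direction of the mapping: rather than reporting accuracy per model, the report ranks segments by dollar impact, which is what lets model improvements be prioritized like any other line item on the P&L.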
I haven't heard any startup yet say "AI first," although we kind of use the term, but I guarantee that's going to come in all the pitch decks: we're an AI-first company. It's going to be a great run. Adam, congratulations on your success, to you and the team. Hey, if we do a few more interviews, we'll get the linguistics down. We can have bots just interact with you directly and do the interview. >> That sounds good, and I'm going to go hang out on the beach, right? So, sounds good. >> Thanks for coming on, really appreciate the conversation. Super exciting, really important area, and you guys are doing great work. Thanks for coming on. >> Adam: Yeah, thanks John. >> Again, this is a CUBE Conversation. I'm John Furrier here in Palo Alto. AI is going next gen. This is legit; this is going to a whole nother level that's going to open up huge opportunities for startups, that's going to open up opportunities for investors, and the value to the users and the experience will come in, in ways I think no one will ever see. So keep an eye out for more coverage on siliconangle.com and theCUBE.net. Thanks for watching. (bright upbeat music)
Wayne Duso, AWS & Iyad Tarazi, Federated Wireless | MWC Barcelona 2023
(light music) >> Announcer: TheCUBE's live coverage is made possible by funding from Dell Technologies. Creating technologies that drive human progress. (upbeat music) >> Welcome back to the Fira in Barcelona. Dave Vellante with Dave Nicholson. Lisa Martin's been here all week. John Furrier is in our Palo Alto studio, banging out all the news. Don't forget to check out siliconangle.com, thecube.net. This is day four, our last segment, winding down. MWC23, super excited to be here. Wayne Duso, friend of theCUBE, VP of engineering from products at AWS is here with Iyad Tarazi, who's the CEO of Federated Wireless. Gents, welcome. >> Good to be here. >> Nice to see you. >> I'm so stoked, Wayne, that we connected before the show. We texted, I'm like, "You're going to be there. I'm going to be there. You got to come on theCUBE." So thank you so much for making time, and thank you for bringing a customer partner, Federated Wireless. Everybody knows AWS. Iyad, tell us about Federated Wireless. >> We're a software and services company out of Arlington, Virginia, right outside of Washington, DC, and we're really focused on this new technology called Shared Spectrum and private wireless for 5G. Think of it as enterprises consuming 5G, the way they used to consume WiFi. >> Is that unrestricted spectrum, or? >> It is managed, organized, interference free, all through cloud platforms. That's how we got to know AWS. We went and got maybe about 300 products from AWS to make it work. Quite sophisticated, highly available, and pristine spectrum worth billions of dollars, but available for people like you and I, that want to build enterprises, that want to make things work. Also carriers, cable companies everybody else that needs it. It's really a new revolution for everyone. >> And that's how you, it got introduced to AWS. Was that through public sector, or just the coincidence that you're in DC >> No, I, well, yes. 
The center of gravity in the world for spectrum is literally Arlington. You have the DOD spectrum people, you have spectrum people from National Science Foundation, DARPA, and then you have commercial sector, and you have the FCC just an Uber ride away. So we went and found the scientists that are doing all this work, four or five of them, Virginia Tech has an office there too, for spectrum research for the Navy. Come together, let's have a party and make a new model. >> So I asked this, I'm super excited to have you on theCUBE. I sat through the keynotes on Monday. I saw Satya Nadella was in there, Thomas Kurian there was no AWS. I'm like, where's AWS? AWS is everywhere. I mean, you guys are all over the show. I'm like, "Hey, where's the number one cloud?" So you guys have made a bunch of announcements at the show. Everybody's talking about the cloud. What's going on for you guys? >> So we are everywhere, and you know, we've been coming to this show for years. But this is really a year that we can demonstrate that what we've been doing for the IT enterprise, IT people for 17 years, we're now bringing for telcos, you know? For years, we've been, 17 years to be exact, we've been bringing the cloud value proposition, whether it's, you know, cost efficiencies or innovation or scale, reliability, security and so on, to these enterprise IT folks. Now we're doing the same thing for telcos. And so whether they want to build in region, in a local zone, metro area, on-prem with an outpost, at the edge with Snow Family, or with our IoT devices. And no matter where they want to start, if they start in the cloud and they want to move to the edge, or they start in the edge and they want to bring the cloud value proposition, like, we're demonstrating all of that is happening this week. And, and very much so, we're also demonstrating that we're bringing the same type of ecosystem that we've built for enterprise IT. 
We're bringing that type of ecosystem to the telco companies, with CSPs, with the ISV vendors. We've seen plenty of announcements this week. You know, so on and so forth. >> So what's different, is it, the names are different? Is it really that simple, that you're just basically taking the cloud model into telco, and saying, "Hey, why do all this undifferentiated heavy lifting when we can do it for you? Don't worry about all the plumbing." Is it really that simple? I mean, that straightforward. >> Well, simple is probably not what I'd say, but we can make it straightforward. >> Conceptually. >> Conceptually, yes. Conceptually it is the same. Because if you think about, firstly, we'll just take 5G for a moment, right? The 5G folks, if you look at the architecture for 5G, it was designed to run on a cloud architecture. It was designed to be a set of services that you could partition, and run in different places, whether it's in the region or at the edge. So in many ways it is sort of that simple. And let me give you an example. Two things, the first one is we announced integrated private wireless on AWS, which allows enterprise customers to come to a portal and look at the industry solutions. They're not worried about their network, they're worried about solving a problem, right? And they can come to that portal, they can find a solution, they can find a service provider that will help them with that solution. And what they end up with is a fully validated offering that AWS telco SAs have actually put through its paces to make sure this is a real thing. And whether they get it from a telco, and quite frankly in that space, it's SIs such as Federated that actually help our customers deploy those in private environments. So that's an example.
And then added to that, we had a second announcement, which was AWS telco network builder, which allows telcos to plan, deploy, and operate at scale telco network capabilities on the cloud, think about it this way- >> As a managed service? >> As a managed service. So think about it this way. And the same way that enterprise IT has been deploying, you know, infrastructure as code for years. Telco network builder allows the telco folks to deploy telco networks and their capabilities as code. So it's not simple, but it is pretty straightforward. We're making it more straightforward as we go. >> Jump in Dave, by the way. He can geek out if you want. >> Yeah, no, no, no, that's good, that's good, that's good. But actually, I'm going to ask an AWS question, but I'm going to ask Iyad the AWS question. So when we, when I hear the word cloud from Wayne, cloud, AWS, typically in people's minds that denotes off-premises. Out there, AWS data center. In the telecom space, yes, of course, in the private 5G space, we're talking about a little bit of a different dynamic than in the public 5G space, in terms of the physical infrastructure. But regardless at the edge, there are things that need to be physically at the edge. Do you feel that AWS is sufficiently, have they removed the H word, hybrid, from the list of bad words you're not allowed to say? 'Cause there was a point in time- >> Yeah, of course. >> Where AWS felt that their growth- >> They'll even say multicloud today, (indistinct). >> No, no, no, no, no. But there was a period of time where, rightfully so, AWS felt that the growth trajectory would be supported solely by net new things off premises. Now though, in this space, it seems like that hybrid model is critical. Do you see AWS being open to the hybrid nature of things? >> Yeah, they're, absolutely. I mean, just to explain from- we're a services company and a solutions company. So we put together solutions at the edge, a smart campus, smart agriculture, a deployment. 
One of our biggest deployment is a million square feet warehouse automation project with the Marine Corps. >> That's bigger than the Fira. >> Oh yeah, it's bigger, definitely bigger than, you know, a small section of here. It's actually three massive warehouses. So yes, that is the edge. What the cloud is about is that massive amount of efficiency has happened by concentrating applications in data centers. And that is programmability, that is APIs that is solutions, that is applications that can run on it, where people know how to do it. And so all that efficiency now is being ported in a box called the edge. What AWS is doing for us is bringing all the business and technical solutions they had into the edge. Some of the data may send back and forth, but that's actually a smaller piece of the value for us. By being able to bring an AWS package at the edge, we're bringing IoT applications, we're bringing high speed cameras, we're able to integrate with the 5G public network. We're able to bring in identity and devices, we're able to bring in solutions for students, embedded laptops. All of these things that you can do much much faster and cheaper if you are able to tap in the 4,000, 5,000 partners and all the applications and all the development and all the models that AWS team did. By being able to bring that efficiency to the edge why reinvent that? And then along with that, there are partners that you, that help do integration. There are development done to make it hardened, to make the data more secure, more isolated. All of these things will contribute to an edge that truly is a carbon copy of the data center. >> So Wayne, it's AWS, Regardless of where the compute, networking and storage physically live, it's AWS. Do you think that the term cloud will sort of drift away from usage? Because if, look, it's all IT, in this case it's AWS and federated IT working together. How, what's your, it's sort of a obscure question about cloud, because cloud is so integrated. 
>> You Got this thing about cloud, it's just IT. >> I got thing about cloud too, because- >> You and Larry Ellison. >> Because it's no, no, no, I'm, yeah, well actually there's- >> There's a lot of IT that's not cloud, just say that okay. >> Now, a lot of IT that isn't cloud, but I would say- >> But I'll (indistinct) cloud is an IT tool, and you see AWS obviously with the Snow fill in the blank line of products and outpost type stuff. Fair to say that you're, doesn't matter where it is, it could be AWS if it's on the edge, right? >> Well, you know, everybody wants to define the cloud as what it may have been when it started. But if you look at what it was when it started and what it is today, it is different. But the ability to bring the experience, the AWS experience, the services, the operational experience and all the things that Iyad had been talking about from the region all to all the way to, you know, the IoT device, if you would, that entire continuum. And it doesn't matter where you start. Like if you start in region and you need to bring your value to other places because your customers are asking you to do so, we're enabling that experience where you need to bring it. If you started at the edge, and- but you want to build cloud value, you know, whether it's again, cost efficiency, scalability, AI, ML or analytics into those capabilities, you can start at the edge with the same APIs, with the same service, the same capabilities, and you can build that value in right from the get go. You don't build this bifurcation or many separations and try to figure out how do I glue them together? There is no gluing together. So if you think of cloud as being elastic, scalable flexible, where you can drive innovation, it's the same exact model on the continuum. And you can start at either end, it's up to you as a customer. >> And I think if, the key to me is the ecosystem. 
I mean, if you can do for this industry what you've done for the technology- enterprise technology business from an ecosystem standpoint, you know everybody talks about flywheel, but that gives you like the massive flywheel. I don't know what the ratio is, but it used to be for every dollar spent on a VMware license, $15 is spent in the ecosystem. I've never heard similar ratios in the AWS ecosystem, but it's, I go to reinvent and I'm like, there's some dollars being- >> That's a massive ecosystem. >> (indistinct). >> And then, and another thing I'll add is Jose Maria Alvarez, who's the chairman of Telefonica, said there's three pillars of the future-ready telco, low latency, programmable networks, and he said cloud and edge. So they recognizing cloud and edge, you know, low latency means you got to put the compute and the data, the programmable infrastructure was invented by Amazon. So what's the strategy around the telco edge? >> So, you know, at the end, so those are all great points. And in fact, the programmability of the network was a big theme in the show. It was a huge theme. And if you think about the cloud, what is the cloud? It's a set of APIs against a set of resources that you use in whatever way is appropriate for what you're trying to accomplish. The network, the telco network becomes a resource. And it could be described as a resource. We, I talked about, you know, network as in code, right? It's same infrastructure in code, it's telco infrastructure as code. And that code, that infrastructure, is programmable. So this is really, really important. And in how you build the ecosystem around that is no different than how we built the ecosystem around traditional IT abstractions. In fact, we feel that really the ecosystem is the killer app for 5G. You know, the killer app for 4G, data of sorts, right? We started using data beyond simple SMS messages. So what's the killer app for 5G? 
It's building this ecosystem, which includes the CSPs, the ISVs, all of the partners that we bring to the table that can drive greater value. It's not just about cost efficiency. You know, you can't save your way to success, right? At some point you need to generate greater value for your customers, which gives you better business outcomes, 'cause you can monetize them, right? The ecosystem is going to allow everybody to monetize 5G. >> 5G is like the dot connector of all that. And then developers come in on top and create new capabilities >> And how different is that than, you know, the original smartphones? >> Yeah, you're right. So what do you guys think of ChatGPT? (indistinct) to Amazon? Amazon turned the data center into an API. It's like we're visioning this world, and I want to ask that technologist, like, where it's turning resources into human language interfaces. You know, when you see that, you play with ChatGPT at all, or I know you guys got your own. >> So I won't speak directly to ChatGPT. >> No, don't speak from- >> But if you think about- >> Generative AI. >> Yeah generative AI is important. And, and we are, and we have been for years, in this space. Now you've been talking to AWS for a long time, and we often don't talk about things we don't have yet. We don't talk about things that we haven't brought to market yet. And so, you know, you'll often hear us talk about something, you know, a year from now where others may have been talking about it three years earlier, right? We will be talking about this space when we feel it's appropriate for our customers and our partners. >> You have talked about it a little bit, Adam Selipsky went on an interview with myself and John Furrier in October said you watch, you know, large language models are going to be enormous and I know you guys have some stuff that you're working on there. >> It's, I'll say it's exciting. >> Yeah, I mean- >> Well proof point is, Siri is an idiot compared to Alexa. 
(group laughs) So I trust one entity to come up with something smart. >> I have conversations with Alexa and Siri, and I won't judge either one. >> You don't need, you could be objective on that one. I definitely have a preference. >> Are the problems you guys solving in this space, you know, what's unique about 'em? What are they, can we, sort of, take some examples here (indistinct). >> Sure, the main theme is that the enterprise is taking control. They want to have their own networks. They want to focus on specific applications, and they want to build them with a skeleton crew. The one IT person in a warehouse want to be able to do it all. So what's unique about them is that they're now are a lot of automation on robotics, especially in warehousing environment agriculture. There simply aren't enough people in these industries, and that required precision. And so you need all that integration to make it work. People also want to build these networks as they want to control it. They want to figure out how do we actually pick this team and migrate it. Maybe just do the front of the house first. Maybe it's a security team that monitor the building, maybe later on upgrade things that use to open doors and close doors and collect maintenance data. So that ability to pick what you want to do from a new processors is really important. And then you're also seeing a lot of public-private network interconnection. That's probably the undercurrent of this show that haven't been talked about. When people say private networks, they're also talking about something called neutral host, which means I'm going to build my own network, but I want it to work, my Verizon (indistinct) need to work. There's been so much progress, it's not done yet. So much progress about this bring my own network concept, and then make sure that I'm now interoperating with the public network, but it's my domain. I can create air gaps, I can create whatever security and policy around it. 
That is probably the power of 5G. Now take all of these tiny networks, big networks, put them all in one ecosystem. Call it the Amazon marketplace, call it the Amazon ecosystem, that's 5G. It's going to be tremendous future. >> What does the future look like? We're going to, we just determined we're going to be orchestrating the network through human language, okay? (group laughs) But seriously, what's your vision for the future here? You know, both connectivity and cloud are on on a continuum. It's, they've been on a continuum forever. They're going to continue to be on a continuum. That being said, those continuums are coming together, right? They're coming together to bring greater value to a greater set of customers, and frankly all of us. So, you know, the future is now like, you know, this conference is the future, and if you look at what's going on, it's about the acceleration of the future, right? What we announced this week is really the acceleration of listening to customers for the last handful of years. And, we're going to continue to do that. We're going to continue to bring greater value in the form of solutions. And that's what I want to pick up on from the prior question. It's not about the network, it's not about the cloud, it's about the solutions that we can provide the customers where they are, right? And if they're on their mobile phone or they're in their factory floor, you know, they're looking to accelerate their business. They're looking to accelerate their value. They're looking to create greater safety for their employees. That's what we can do with these technologies. So in fact, when we came out with, you know, our announcement for integrated private wireless, right? It really was about industry solutions. It really isn't about, you know, the cloud or the network. It's about how you can leverage those technologies, that continuum, to deliver you value. 
>> You know, it's interesting you say that, 'cause again, when we were interviewing Adam Selipsky, everybody, you know, all journalists analysts want to know, how's Adam Selipsky going to be different from Andy Jassy, what's the, what's he going to do to Amazon to change? And he said, listen, the real answer is Amazon has changed. If Andy Jassy were here, we'd be doing all, you know, pretty much the same things. Your point about 17 years ago, the cloud was S3, right, and EC2. Now it's got to evolve to be solutions. 'Cause if that's all you're selling, is the bespoke services, then you know, the future is not as bright as the past has been. And so I think it's key to look for what are those outcomes or solutions that customers require and how you're going to meet 'em. And there's a lot of challenges. >> You continue to build value on the value that you've brought, and you don't lose sight of why that value is important. You carry that value proposition up the stack, but the- what you're delivering, as you said, becomes maybe a bigger or or different. >> And you are getting more solution oriented. I mean, you're not hardcore solutions yet, but we're seeing more and more of that. And that seems to be a trend. We've even seen in the database world, making things easier, connecting things. Not really an abstraction layer, which is sort of antithetical to your philosophy, but it creates a similar outcome in terms of simplicity. Yeah, you're smiling 'cause you guys always have a different angle, you know? >> Yeah, we've had this conversation. >> It's right, it's, Jassy used to say it's okay to be misunderstood. >> That's Right. For a long time. >> Yeah, right, guys, thanks so much for coming to theCUBE. I'm so glad we could make this happen. >> It's always good. Thank you. >> Thank you so much. >> All right, Dave Nicholson, for Lisa Martin, Dave Vellante, John Furrier in the Palo Alto studio. We're here at the Fira, wrapping out MWC23. 
Keep it right there, thanks for watching. (upbeat music)
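Wayne's point in the conversation above, that AWS Telco Network Builder lets operators declare telco networks "as code" the way enterprise IT treats infrastructure as code, is easier to see with a small sketch. The snippet below is a hedged illustration of that declare-validate-deploy pattern only; it is not the actual AWS Telco Network Builder API, and every function and field name in it is hypothetical.

```python
# Hypothetical sketch of "telco network as code": the network's
# functions are declared as plain data, then validated and turned
# into a deployment plan by a driver, the same pattern enterprise
# IT uses for infrastructure as code. None of these names are real
# AWS APIs.

network_spec = {
    "name": "private-5g-campus",
    "functions": [
        {"type": "amf", "replicas": 2},  # access and mobility management
        {"type": "smf", "replicas": 2},  # session management
        {"type": "upf", "replicas": 3},  # user plane, placed at the edge
    ],
}

def validate(spec):
    """Reject obviously broken declarations before anything is deployed."""
    if not spec.get("name"):
        raise ValueError("network needs a name")
    for fn in spec["functions"]:
        if fn["replicas"] < 1:
            raise ValueError(f"{fn['type']} needs at least one replica")
    return spec

def plan(spec):
    """Stand-in for a managed deploy call: returns a deployment plan."""
    validate(spec)
    return {fn["type"]: fn["replicas"] for fn in spec["functions"]}

print(plan(network_spec))  # {'amf': 2, 'smf': 2, 'upf': 3}
```

The design point mirrors what infrastructure as code gives enterprise IT: because the declaration is data, it can be versioned, reviewed, and validated before anything touches the network.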
Shahid Ahmed, NTT | MWC Barcelona 2023
(inspirational music) >> theCUBE's live coverage is made possible by funding from Dell Technologies. Creating technologies that drive human progress. (uplifting electronic music) (crowd chattering in background) >> Hi everybody. We're back at the Fira in Barcelona. Winding up our four day wall-to-wall coverage of MWC23 theCUBE has been thrilled to cover the telco transformation. Dave Vellante with Dave Nicholson. Really excited to have NTT on. Shahid Ahmed is the Group EVP of New Ventures and Innovation at NTT in from Chicago. Welcome to Barcelona. Welcome to theCUBE. >> Thank you for having me over. >> So, really interesting title. You have, you know, people might not know NTT you know, huge Japan telco but a lot of other businesses, explain your business. >> So we do a lot of things. Most of us are known for our Docomo business in Japan. We have one of the largest wireless cellular carriers in the world. We serve most of Japan. Outside of Japan, we are B2B systems, integration, professional services company. So we offer managed services. We have data centers, we have undersea cables. We offer all kinds of outsourcing services. So we're a big company. >> So there's a narrative out there that says, you know, 5G, it's a lot of hype, not a lot of adoption. Nobody's ever going to make money at 5G. You have a different point of view, I understand. You're like leaning into 5G and you've actually got some traction there. Explain that. >> So 5G can be viewed from two lenses. One is just you and I using our cell phones and we get 5G coverage over it. And the other one is for businesses to use 5G, and we call that private 5G or enterprise grade 5G. Two very separate distinct things, but it is 5G in the end. Now the big debate here in Europe and US is how to monetize 5G. As a consumer, you and I are not going to pay extra for 5G. I mean, I haven't. I just expect the carrier to offer faster, cheaper services. And so would I pay extra? Not really. 
I just want a reliable network from my carrier. >> Paid up for the good camera though, didn't you? >> I did. (Dave and Dave laughing) >> I'm waiting for four cameras now. >> So the carriers are in a little bit of a pickle at the moment because they've just spent billions of dollars, not only on spectrum but on the infrastructure needed to upgrade to 5G, yet nobody's willing to pay extra for that 5G service. >> Oh, right. >> So what do they do? And one idea is to look at enterprises, companies, industrial companies, manufacturing companies who want to build their own 5G networks to support their own use cases. And these use cases could be anything from automating the conveyor belt to cameras with 5G in them to AGVs. These are little carts running around warehouses picking up products and goods, but they have to be connected all the time. WiFi doesn't work all the time there. And so those businesses are willing to pay for 5G. So your question is, is there a business case for 5G? Yes. I don't think it's on the consumer side. I think it's on the business side. And that's where NTT is finding success. >> So you said, you know, how are they going to make money, right? You very well described the telco dilemma. We heard earlier this week, you know, well, we could tax the OTT vendors. Netflix of course shot back and said, "Well, we spent a lot of money on content. We're driving a lot of value. Why don't you help us pay for the content development?" Which is incredibly expensive. I think I heard we're going to tax the developers for API calls on the network. I'm not sure how well that's going to work out. Look at Twitter, you know, we'll see. And then yeah, there's the B2B piece. What's your take on, we heard the Orange CEO say, "We need help." You know, maybe implying we're going to tax the OTT vendors, but we're for net neutrality, which seems like it's completely counterposed. What's your take on, you know, fair share in the network?
>> Look, we've seen this debate unfold in the US for the last 10 years. >> Yeah. >> Tom Wheeler, the FCC chairman, started that debate, and he made great progress on open internet and net neutrality. The thing is that if you create a lane, a tollway, where some companies have to pay a toll and others don't have to, you create an environment where innovation could be stifled. Content providers may not appear on the scene anymore. And with everything happening around AI, we may see that backfire. So creating a toll for rich companies to be able to pay that toll and get on a faster-speed internet, that may work in some places and backfire in others. >> It's, you know, you're bringing up a great point. It's one of those sort of unintended consequences. You got to be careful because the little guy gets crushed in that environment, and then what? Right? Then you stifle innovation. So, okay, so you're a fan of net neutrality. You think the US model struck the right balance, for a change, maybe the US got it right, instead of, like, GDPR, which sort of informed the US on privacy, maybe the opposite on net neutrality. >> I think so. I mean, look at the way the US, particularly the FCC and the FTC, has mandated these rules and regulations. I think it's a nice balance. The FTC is all looking at big tech at the moment, but- >> Lina Khan wants to break up big tech. I mean for, you know, you big tech, boom, break 'em up, right? So, but that's, you know- >> That's a whole different story. >> Yeah. Right. We could talk about that too, if you want. >> Right. But I think that we have a balanced approach, a measured approach. Asking the content providers or the developers to pay for your innovative creative application that's on your phone, you know, that's asking for too much in my opinion. >> You know, I think you're right though. Government did do a good job with net neutrality in the US and, I mean, I'm just going to get on my high horse for a second, so forgive me. >> Go for it.
>> Market forces have always done a better job at adjudicating, you know, competition. Now, if a company's a monopoly, in my view they should be, you know, regulated, or at least penalized. Yeah, but generally speaking, you know the attack on big tech, I think is perhaps misplaced. I sat through, and the reason it's relevant to Mobile World Congress or MWC, is I sat through a Nokia presentation this week and they were talking about Bell Labs when the United States broke up, you know, the US telcos, >> Yeah. >> Bell Labs was a gem in the US and now it's owned by Nokia. >> Yeah. >> Right? And so you got to be careful about, you know what you wish for with breaking up big tech. You got AI, you've got, you know, competition with China- >> Yeah, but the upside to breaking up Ma Bell was not just the baby Bells and maybe the stranded orphan asset of Bell Labs, but I would argue it led to innovation. I'm old enough to remember- >> I would say it made the US less competitive. >> I know. >> You were in junior high school, but I remember as an adult, having a rotary dial phone and having to pay for that access, and there was no such- >> Yeah, but they all came back together. The baby Bells are all, they got all acquired. And the cable company, it was no different. So I don't know, do you have a perspective on this? Because you know this better than I do. >> Well, I think, look at Nokia, they just announced a whole new branding strategy and new brand. >> I like the brand. >> Yeah. And- >> It looks cool. >> But guess what? It's B2B oriented. >> (laughs) Yeah. >> It's no longer consumer, >> Right, yeah. >> because they felt that Nokia brand phone was sort of misleading towards a lot of business to business work that they do. And so they've oriented themselves to B2B. Look, my point is, the carriers and the service providers, network operators, and look, I'm a network operator, too, in Japan. We need to innovate ourselves. Nobody's stopping us from coming up with a content strategy.
Nobody's stopping a carrier from building an interesting, new, over-the-top app. In fact, we have better control over that because we are closer to the customer. We need to innovate, we need to be more creative. I don't think taxing the little developer that's building a very innovative application is going to help in the long run. >> NTT Japan, do they have a content play? I, sorry, I'm not familiar with it. Are they strong in content, or competitive like Netflix-like, or? >> We have relationships with them, and you remember i-mode? >> Yeah. Oh yeah, sure. >> Remember in the old days. I mean, that was a big hit. >> Yeah, yeah, you're right. >> Right? I mean, that was actually the original app marketplace. >> Right. >> And the application store. So, of course we've evolved from that and we should, and this is an evolution and we should look at it more positively instead of looking at ways to regulate it. We should let it prosper and let it see where- >> But why do you think that telcos generally have failed at content? I mean, AT&T is sort of the exception that proves the rule. I mean, they got some great properties, obviously, CNN and HBO, but generally it's viewed as a challenging asset and others have had to diversify or, you know, sell the assets. Why do you think that telcos have had such trouble there? >> Well, Comcast owns also a lot of content. >> Yeah. Yeah, absolutely. >> And I think, I think that is definitely a strategy that should be explored here in Europe. And I think that has been underexplored. I, in my opinion, I believe that every large carrier must have some sort of content strategy at some point, or else you are a pipe. >> Yeah. You lose touch with a customer. >> Yeah. And by the way, being a dumb pipe is okay. >> No, it's a lucrative business. >> It's a good business. You just have to focus. And if you start to do a lot of ancillary things around it then you start to see the margins erode.
But if you just focus on being a pipe, I think that's a very good business and it's very lucrative. Everybody wants bandwidth. There's insatiable demand for bandwidth all the time. >> Enjoy the monopoly, I say. >> Yeah, well, capital is like an organism in and of itself. It's going to seek a place where it can insert itself and grow. Do you think that the questions around fair share right now are having people wait in the wings to see what's going to happen? Because especially if I'm on the small end of creating content, creating services, and there's possibly a death blow to my fixed costs that could be coming down the line, I'm going to hold back and wait. Do you think that the answer is let's solve this sooner rather than later? What are your thoughts? >> I think in Europe the opinion has been always to go after the big tech. I mean, we've seen a lot of moves either through antitrust, or other means. >> Or the guillotine! >> That's right. (all chuckle) A guillotine. Yes. And I've heard those directly. I think, look, in the end, EU has to decide what's right for their constituents, the countries they operate in, and the economy. Frankly, with where the economy is, you got recession, inflation pressures, a war, and who knows what else might come down the pipe. I would be very careful in messing with this equilibrium in this economy. Until at least we have gone through this inflation and recessionary pressure and see what happens. >> I, again, I think I come back to markets, ultimately, will adjudicate. I think what we're seeing with ChatGPT is like a Netscape moment in some ways. And I can't predict what's going to happen, but I can predict that it's going to change the world. And there's going to be new disruptors that come about. That just, I don't think Amazon, Google, Facebook, Apple are going to rule the world forever. They're just, I guarantee they're not, you know. They'll make it through. But there's going to be some new companies.
I think it might be OpenAI, might not be. Give us a plug for NTT at the show. What do you guys got going here? Really appreciate you coming on. >> Thank you. So, you know, we're showing off our private 5G network for enterprises, for businesses. We see this as a huge opportunity. If you look around here you've got Rohde & Schwarz, that's the industrial company. You got Airbus here. All the big industrial companies are here. Automotive companies and private 5G. 5G inside a factory, inside a hospital, a warehouse, a mining operation. That's where the dollars are. >> Is it a meaningful business for you today? >> It is. We just started this business only a couple of years ago. We're seeing amazing growth and I think there's a lot of good opportunities there. >> Shahid Ahmed, thanks so much for coming to theCUBE. It was great to have you. Really a pleasure. >> Thanks for having me over. Great questions. >> Oh, you're welcome. All right. For David Nicholson, Dave Vellante. We'll be back, right after this short break, from the Fira in Barcelona, MWC23. You're watching theCUBE. (uplifting electronic music)
Phil Kippen, Snowflake, Dave Whittington, AT&T & Roddy Tranum, AT&T | MWC Barcelona 2023
(gentle music) >> Narrator: "TheCUBE's" live coverage is made possible by funding from Dell Technologies, creating technologies that drive human progress. (upbeat music) >> Hello everybody, welcome back to day four of "theCUBE's" coverage of MWC '23. We're here live at the Fira in Barcelona. Wall-to-wall coverage, John Furrier is in our Palo Alto studio, banging out all the news. Really, the whole week we've been talking about the disaggregation of the telco network, the new opportunities in telco. We're really excited to have AT&T and Snowflake here. Dave Whittington is the AVP at the Chief Data Office at AT&T. Roddy Tranum is the Assistant Vice President for Channel Performance Data and Tools at AT&T. And Phil Kippen, the Global Head of Industry-Telecom at Snowflake, Snowflake's new telecom business. Snowflake just announced earnings last night. Typical Scarpelli, they beat earnings, very conservative guidance, stock's down today, but we like Snowflake long term, they're on that path to 10 billion. Guys, welcome to "theCUBE." Thanks so much >> Phil: Thank you. >> for coming on. >> Dave and Roddy: Thanks Dave. >> Dave, let's start with you. The data culture inside of telco. We've had this, we've been talking all week about this monolithic system. Super reliable. You guys did a great job during the pandemic. Everything shifting to landlines. We didn't even notice, you guys didn't miss a beat. Saved us. But the data culture's changing inside telco. Explain that. >> Well, absolutely. So, first of all IoT and edge processing is bringing forth new and exciting opportunities all the time. So, we're bridging the world between a lot of the OSS stuff that we can do with edge processing. But bringing that back, and now we're talking about working, and I would say traditionally, we talk data warehouse. Data warehouse and big data are now becoming a single mesh, all right?
And the use cases and the way you can use those, especially I'm taking that edge data and bringing it back over, now I'm running AI and ML models on it, and I'm pushing back to the edge, and I'm combining that with my relational data. So that mesh there is making all the difference. We're getting new use cases that we can do with that. And it's just, and the volume of data is immense. >> Now, I love ChatGPT, but I'm hoping your data models are more accurate than ChatGPT. I never know. Sometimes it's really good, sometimes it's really bad. But enterprise, you got to be clean with your AI, don't you? >> Not only do you have to be clean, you have to monitor it for bias and be ethical about it. We're really good about that. First of all with AT&T, our brand is Platinum. We take care of that. So, we may not be as cutting-edge risk takers as others, but when we go to market with an AI or an ML or a product, it's solid. >> Well hey, as telcos go, you guys are leaning into the Cloud. So I mean, that's a good starting point. Roddy, explain your role. You got an interesting title, Channel Performance Data and Tools, what's that all about? >> So literally anything with our consumer, retail, contact centers' channels, all of our channels, from a data perspective and metrics perspective, what it takes to run reps, agents, all the way to leadership levels, scorecards, how you rank in the business, how you're driving the business, from sales, service, customer experience, all that data infrastructure with our great partners on the CDO side, as well as Snowflake, that comes from my team. >> And that's traditionally been done in a, I don't mean the pejorative, but we're talking about legacy, monolithic, sort of data warehouse technologies. >> Absolutely. >> We have a love-hate relationship with them. It's what we had. It's what we used, right? And now that's evolving. And you guys are leaning into the Cloud. >> Dramatic evolution. And what Snowflake's enabled for us is impeccable.
We've talked about having, people have dreamed of one data warehouse for the longest time and everything in one system. Really, this is the only way that becomes a reality. The more you get in Snowflake, we can have golden source data, and instead of duplicating that 50 times across AT&T, it's in one place, we just share it, everybody leverages it, and now it's not duplicated, and the process efficiency is just incredible. >> But it really hinges on that separation of storage and compute. And we talk about the monolithic warehouse, and one of the nightmares I've lived with, is having a monolithic warehouse. And let's just go with some of my primary, traditional customers, sales, marketing and finance. They are leveraging BSS OSS data all the time. For me to coordinate a deployment, I have to make sure that each one of these units can take an outage, if it's going to be a long deployment. With the separation of storage, compute, they own their own compute cluster. So I can move faster for these people. 'Cause if finance, I can implement his code without impacting finance or marketing. This brings in CI/CD to more reality. It brings us faster to market with more features. So if he wants to implement a new comp plan for the field reps, or we're reacting to the marketplace, where one of our competitors has done something, we can do that in days, versus waiting weeks or months. >> And we've reported on this a lot. This is the brilliance of Snowflake's founders, that whole separation >> Yep. >> from compute and data. I like Dave, that you're starting with sort of the business flexibility, 'cause there's a cost element of this too. You can dial down, you can turn off compute, and then of course the whole world said, "Hey, that's a good idea." And a VC started throwing money at Amazon, but Redshift said, "Oh, we can do that too, sort of, can't turn off the compute." But I want to ask you Phil, so, >> Sure. 
>> it looks from my vantage point, like you're taking your Data Cloud message which was originally separate compute from storage simplification, now data sharing, automated governance, security, ultimately the marketplace. >> Phil: Right. >> Taking that same model, break down the silos into telecom, right? It's that same, >> Mm-hmm. >> sorry to use the term playbook, Frank Slootman tells me he doesn't use playbooks, but he's not a pattern matcher, but he's a situational CEO, he says. But the situation in telco calls for that type of strategy. So explain what you guys are doing in telco. >> I think there's, so, what we're launching, we launched last week, and it really was three components, right? So we had our platform as you mentioned, >> Dave: Mm-hmm. >> and that platform is being utilized by a number of different companies today. We also are adding, for telecom very specifically, we're adding capabilities in marketplace, so that service providers can not only use some of the data and apps that are in marketplace, but as well service providers can go and sell applications or sell data that they had built. And then as well, we're adding our ecosystem, it's telecom-specific. So, we're bringing partners in, technology partners, and consulting and services partners, that are very much focused on telecoms and what they do internally, but also helping them monetize new services. >> Okay, so it's not just sort of generic Snowflake into telco? You have specific value there. >> We're purposing the platform specifically for- >> Are you a telco guy? >> I am. You are, okay. >> Total telco guy absolutely. >> So there you go. You see that Snowflake is actually an interesting organizational structure, 'cause you're going after verticals, which is kind of rare for a company of your sort of inventory, I'll say, >> Absolutely. >> I don't mean that as a negative. (Dave laughs) So Dave, take us through the data journey at AT&T. It's a long history. 
You don't have to go back to the 1800s, but- (Dave laughs) >> Thank you for pointing out, we're a 149-year-old company. So, Jesse James was one of the original customers, (Dave laughs) and we have no longer got his data. So, I'll go back. I've been 17 years, Cingular and AT&T, and I've watched it through the whole journey of, where the monolithics were growing, when the consolidation of small, wireless carriers, and we went through that boom. And then we've gone through mergers and acquisitions. But, Hadoop came out, and it was going to solve all world hunger. And we had all the aspects of, we're going to monetize and do AI and ML, and some of the things we learned with Hadoop was, we had this monolithic warehouse, we had this file-based-structured Hadoop, but we really didn't know how to bring this all together. And we were bringing items over to the relational, and we were taking the relational and bringing it over to the warehouse, and trying to, and it was a struggle. Let's just go there. And I don't think we were the only company to struggle with that, but we learned a lot. And so now as tech is finally emerging, with the cloud, companies like Snowflake, and others that can handle that, where we can create, we were discussing earlier, but it becomes more of a conducive mesh that's interoperable. So now we're able to simplify that environment. And the cloud is a big thing on that. 'Cause you could not do this on-prem with on-prem technologies. It would be just too cost prohibitive, and too heavy of lifting, going back and forth, and managing the data. The simplicity the cloud brings with a smaller set of tools, and I'll say in the data space specifically, really allows us, maybe not a single instance of data for all use cases, but a greatly reduced ecosystem. And when you simplify your ecosystem, you simplify speed to market and data management. >> So I'm going to ask you, I know it's kind of internal organizational plumbing, but it'll inform my next question.
So, Dave, you're with the Chief Data Office, and Roddy, you're kind of, you all serve in the business, but you're really serving the, you're closer to those guys, they're banging on your door for- >> Absolutely. I try to keep the 130,000 users who may or may not have issues sometimes with our data and metrics, away from Dave. And he just gets a call from me. >> And he only calls when he has a problem. He's never wished me happy birthday. (Dave and Phil laugh) >> So the reason I asked that is because, you describe Dave, some of the Hadoop days, and again love-hate with that, but we had hyper-specialized roles. We still do. You've got data engineers, data scientists, data analysts, and you've got this sort of this pipeline, and it had to be this sequential pipeline. I know Snowflake and others have come to simplify that. My question to you is, how is that those roles, how are those roles changing? How is data getting closer to the business? Everybody talks about democratizing business. Are you doing that? What's a real use example? >> From our perspective, those roles, a lot of those roles on my team for years, because we're all about efficiency, >> Dave: Mm-hmm. >> we cut across those areas, and always have cut across those areas. So now we're into a space where things have been simplified, data processes and copying, we've gone from 40 data processes down to five steps now. We've gone from five steps to one step. We've gone from days, now take hours, hours to minutes, minutes to seconds. Literally we're seeing that time in and time out with Snowflake. So these resources that have spent all their time on data engineering and moving data around, are now freed up more on what they have skills for and always have, the data analytics area of the business, and driving the business forward, and new metrics and new analysis. That's some of the great operational value that we've seen here. As this simplification happens, it frees up brain power. 
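The simplification Roddy describes rests on the architecture discussed earlier in the conversation: one golden-source copy of the data, with each business unit attaching its own independent compute to it. The toy Python sketch below illustrates the concept only; it is not Snowflake's API, and the table, fields, and unit names are invented for the example:

```python
# Illustrative model of "separate storage from compute": one golden-source
# dataset stored once, with each business unit attaching its own compute.
# All names here are invented; this is a concept sketch, not Snowflake's API.

GOLDEN_SOURCE = [  # stored once; every consumer reads this same copy
    {"account": "A1", "segment": "consumer", "revenue": 120.0},
    {"account": "A2", "segment": "business", "revenue": 340.0},
    {"account": "A3", "segment": "consumer", "revenue": 75.0},
]

class ComputeCluster:
    """Independent compute over shared storage: pausing, resizing, or
    redeploying one unit's cluster never touches the data or any other
    unit's cluster."""

    def __init__(self, name, storage):
        self.name = name
        self.storage = storage  # a reference to shared storage, not a copy

    def query(self, transform):
        return transform(self.storage)

finance = ComputeCluster("finance", GOLDEN_SOURCE)
marketing = ComputeCluster("marketing", GOLDEN_SOURCE)

# Each unit runs its own workload against the single shared copy.
total_revenue = finance.query(lambda rows: sum(r["revenue"] for r in rows))
consumer_accounts = marketing.query(
    lambda rows: [r["account"] for r in rows if r["segment"] == "consumer"])

print(total_revenue)       # 535.0
print(consumer_accounts)   # ['A1', 'A3']
```

The detail doing the work is that `finance.storage` and `marketing.storage` are the same object, so nothing is duplicated 50 times across the company, while each unit's query path can change independently, which is the property Dave leans on when he deploys code for one unit without coordinating an outage with the others.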
>> So, you're pumping data from the OSS, the BSS, the OKRs everywhere >> Everywhere. >> into Snowflake? >> Scheduling systems, you name it. If you can think of what drives our retail and centers and online, all that data, scheduling system, chat data, call center data, call detail data, all of that enters into this common infrastructure to manage the business on a day in and day out basis. >> How are the roles and the skill sets changing? 'Cause you're doing a lot less ETL, you're doing a lot less moving of data around. There were guys that were probably really good at that. I used to joke in the, when I was in the storage world, like if your job is managing LUNs, you need to look for a new job, right? So, and they did and people move on. So, are you able to sort of redeploy those assets, and those people, those human resources? >> These folks are highly skilled. And we were talking about earlier, SQL hasn't gone away. Relational databases are not going away. And that's one thing that's made this migration excellent, they're just transitioning their skills. Experts in legacy systems are now rapidly becoming experts on the Snowflake side. And it has not been that hard a transition. There are certainly nuances, things that don't operate as well in the cloud environment that we have to learn and optimize. But we're making that transition. >> Dave: So just, >> Please. >> within the Chief Data Office we have a couple of missions, and Roddy is a great partner and an example of how it works. We try to bring the data for democratization, so that we have one interface, now hopefully, you know, we just have a logical connection back to these Snowflake instances that we connect. But we're providing that governance and cleansing, and if there's a business rule at the enterprise level, we provide it. But the goal at CDO is to make sure that business units like Roddy or marketing or finance, that they can come to a platform that's reliable, robust, and self-service.
I don't want to be in his way. So I feel like I'm providing a sub-level of platform, that he can come to and anybody can come to, and utilize, that they're not having to go back and undo what's in Salesforce, or ServiceNow, or in our billers. So, I'm sort of that layer. And then making sure that that ecosystem is robust enough for him to use. >> And that self-service infrastructure is predominantly through the Azure Cloud, correct? >> Dave: Absolutely. >> And you work on other clouds, but it's predominantly through Azure? >> We're predominantly in Azure, yeah. >> Dave: That's the first-party citizen? >> Yeah. >> Okay, I like to think in terms sometimes of data products, and I know you've mentioned upfront, you're Gold standard or Platinum standard, you're very careful about personal information. >> Dave: Yeah. >> So you're not trying to sell, I'm an AT&T customer, you're not trying to sell my data, and make money off of my data. So the value prop and the business case for Snowflake is it's simpler. You do things faster, you're in the cloud, lower cost, et cetera. But I presume you're also in the business, AT&T, of making offers and creating packages for customers. I look at those as data products, 'cause it's not a, I mean, yeah, there's a physical phone, but there's data products behind it. So- >> It ultimately is, but not everybody always sees it that way. Data reporting often can be an afterthought. And we're making it more on the forefront now. >> Yeah, so I like to think in terms of data products, I mean even if the financial services business, it's a data business. So, if we can think about that sort of metaphor, do you see yourselves as data product builders? Do you have that, do you think about building products in that regard? >> Within the Chief Data Office, we have a data product team, >> Mm-hmm. 
>> and by the way, I would be disingenuous if I said, oh, we're very mature in this, but no, it's where we're going, and it's somewhat of a journey, but I've got a peer, and their whole job is to go from, especially as we migrate to cloud, if Roddy or some other group was using tables three, four and five and joining them together, it's like, "Well look, this is an offer for data product, so let's combine these and put it up in the cloud, and here's the offer data set product, or here's the opportunity data product," and it's a journey. We're on the way, but we have dedicated staff and time to do this. >> I think one of the hardest parts about that is the organizational aspects of it. Like who owns the data now, right? It used to be owned by the techies, and increasingly the business lines want to have access, you're providing self-service. So there's a discussion about, "Okay, what is a data product? Who's responsible for that data product? Is it in my P&L or your P&L? Somebody's got to sign up for that number." So, it sounds like those discussions are taking place. >> They are. And, we feel like we're more the, and CDO at least, we feel more, we're like the guardians, and the shepherds, but not the owners. I mean, we have a role in it all, but he owns his metrics. >> Yeah, and even from our perspective, we see ourselves as an enabler of making whatever AT&T wants to make happen in terms of the key products and offers, trade-in offers, trade-in programs, all that requires this data infrastructure, and managing reps and agents, and what they do from a channel performance perspective. We still see ourselves as key enablers of that. And we've got to be flexible, and respond quickly to the business.
>> Like the business knows good data from bad data, and then they just pound that poor individual, and they're like, "Okay, I'm doing my best. It's just ones and zeros to me." So, it sounds like that's, you're on that path. >> Yeah absolutely, and I think, we do have refined, getting more and more refined owners of, since Snowflake enables these golden source data, everybody sees me and my organization, channel performance data, go to Roddy's team, we have a great team, and we go to Dave in terms of making it all happen from a data infrastructure perspective. So we do have a lot more refined, "This is where you go for the golden source, this is where it is, this is who owns it. If you want to launch this product and services, and you want to manage reps with it, that's the place you-" >> It's a strong story. So Chief Data Office doesn't own the data per se, but it's your responsibility to provide the self-service infrastructure, and make sure it's governed properly, and in as automated a way as possible. >> Well, yeah, absolutely. And let me tell you more, everybody talks about single version of the truth, one instance of the data, but there's context to that, that we are taking, trying to take advantage of that as we do data products is, what's the use case here? So we may have an entity of Roddy as a prospective customer, and we may have an entity of Roddy as a customer, high-value customer over here, which may have a different set of mix of data and all, but as a data product, we can then create those for those specific use cases. Still point to the same data, but build it in different constructs. One for marketing, one for sales, one for finance. By the way, that's where your data engineers are struggling. >> Yeah, yeah, of course. So how do I serve all these folks, and really have the context? Common story in telco, >> Absolutely. >> or are these guys ahead of the curve a little bit? Or where would you put them?
>> I think they're definitely moving a lot faster than the industry is generally. I think the enabling technologies, like for instance, having that single copy of data that everybody sees, a single pane of glass, right, that's definitely something that everybody wants to get to. Not many people are there. I think, what AT&T's doing, is most definitely a little bit further ahead than the industry generally. And I think the successes that are coming out of that, and the learning experiences are starting to generate momentum within AT&T. So I think, it's not just about the product, and having a product now that gives you a single copy of data. It's about the experiences, right? And now, how the teams are getting trained, domains like network engineering for instance. They typically haven't been a part of data discussions, because they've got a lot of data, but they're focused on the infrastructure. >> Mm. >> So, by going ahead and deploying this platform, for platform's purpose, right, and the business value, that's one thing, but also to start bringing, getting that experience, and bringing new experience in to help other groups that traditionally hadn't been data-centric, that's also a huge step ahead, right? So you need to enable those groups. >> A big complaint of course we hear at MWC from carriers is, "The over-the-top guys are killing us. They're riding on our networks, et cetera, et cetera. They have all the data, they have all the client relationships." Do you see your client relationships changing as a result of sort of your data culture evolving? >> Yes, I'm not sure I can- >> It's a loaded question, I know. >> Yeah, and then I, so, we want to start embedding as much into our network on the proprietary value that we have, so we can start getting into that OTT play, us as any other carrier, we have distinct advantages of what we can do at the edge, and we just need to start exploiting those. 
But you know, 'cause whether it's location or whatnot, so we got to eat into that. Historically, the network is where we make our money in, and we stack the services on top of it. It used to be *69. >> Dave: Yeah. >> If anybody remembers that. >> Dave: Yeah, of course. (Dave laughs) >> But you know, it was stacked on top of our network. Then we stack another product on top of it. It'll be in the edge where we start providing distinct values to other partners as we- >> I mean, it's a great business that you're in. I mean, if they're really good at connectivity. >> Dave: Yeah. >> And so, it sounds like it's still to be determined >> Dave: Yeah. >> where you can go with this. You have to be super careful with private and personal information. >> Dave: Yep. >> Yeah, but the opportunities are enormous. >> There's a lot. >> Yeah, particularly at the edge, looking at, private networks are just an amazing opportunity. Factories, you name it, hospital, remote hospitals, remote locations. I mean- >> Dave: Connected cars. >> Connected cars are really interesting, right? I mean, if you start communicating car to car, and actually drive that, (Dave laughs) I mean that's, now we're getting to Byzantine Fault Tolerance, people. This is it. >> Dave: That's not, let's hold the traffic.
So when the tech and the business can work together toward a common goal, and it's a partnership, you get things done. So, I don't know how many CDOs or CIOs or CEOs are out there, but this connection is what accelerates and makes it work. >> And that is our audience, Dave. I mean, it's all about that alignment. So guys, I really appreciate you coming in and sharing your story in "theCUBE." Great stuff. >> Thank you. >> Thanks a lot. >> All right, thanks everybody. Thank you for watching. I'll be right back with Dave Nicholson. Day four of SiliconANGLE's coverage of MWC '23. You're watching "theCUBE." (gentle music)
Deania Davidson, Dell Technologies & Dave Lincoln, Dell Technologies | MWC Barcelona 2023
>> Narrator: theCUBE's live coverage is made possible by funding from Dell Technologies. Creating technologies that drive human progress. (upbeat music) >> Hey everyone and welcome back to Barcelona, Spain, it's theCUBE. We are live at MWC 23. This is day two of our coverage, we're giving you four days of coverage, but you already know that because you were here yesterday. Lisa Martin with Dave Nicholson. Dave, this show is massive. I was walking in this morning and almost getting claustrophobic with the 80,000 people that are joining us. There seems to be at MWC 23 more interest in enterprise-class technology than we've ever seen before. What are some of the things that you've observed in that regard? >> Well I've observed a lot of people racing to the highest level messaging about how wonderful it is to have the kiss of a breeze on your cheek, and to feel the flowing wheat. (laughing) I want to hear about the actual things that make this stuff possible. >> Right. >> So I think we have a couple of guests here who can help us start to go down that path of actually understanding the real cool stuff that's behind the scenes. >> And absolutely we got some cool stuff. We've got two guests from Dell. Dave Lincoln is here, the VP of Networking and Emerging Server Solutions, and Deania Davidson, Director of Edge Server Product Planning and Management at Dell. So great to have you. >> Thank you. >> Two Daves, and a Davidson. >> (indistinct) >> Just me who stands alone here. (laughing) So guys, talk about, Dave, we'll start with you: the newest generation of PowerEdge servers. What's new? Why is it so exciting? What challenges for telecom operators is it solving? >> Yeah, well so this is actually Dell's largest server launch ever. It's the most expansive, which is notable because we have a pretty significant portfolio. We're very proud of our core mainstream portfolio.
But really, since Supercompute in Dallas in November, we started a rolling thunder of launches, MWC being part of that, leading up to DTW here in May, where we're actually going to be announcing big investments in those parts of the market that are the growth segments of server. Specifically AI/ML, where we're investing to address that. We're investing heavily in our XE series which, as I said, we announced at Supercompute in November. And then we have to address the CSP segment, a big investment around the HS series which we just announced, and then lastly, the edge telecom segment, where we had the biggest investment, the biggest announcement in the portfolio launch, with the XR series. >> Deania, let's dig into that. >> Yeah. >> Where do we see the growth coming from? You mentioned telecom CSPs with the edge. What are some of the growth opportunities there that organizations need Dell's help with to manage, so that they can deliver what end users are demanding and wanting? >> The biggest area obviously has been telecom, but the other areas we're seeing are retail and manufacturing as well. And so internally, I mean, we're going to be focused on hardware, but we also have a solutions team who are working with us to build the solutions focused on retail, and edge and telecom as well, on top of the servers that we'll talk about shortly. >> What are some of the biggest challenges that retailers and manufacturers are facing? And during the pandemic retailers, those that were successful pivoted very quickly to curbside delivery. >> Deania: Yeah. >> Those that didn't survive weren't able to do that digitally. >> Deania: Yeah. >> But we're seeing such demand. >> Yeah. >> At the retail edge. On the consumer side we want to get whatever we want right now. >> Yes. >> It has to be delivered, it has to be personalized.
Talk a little bit more about some of the challenges there, within those two verticals and how Dell is helping to address those with the new server technologies. >> For retail, I think there's couple of things, the one is like in the fast food area. So obviously through COVID a lot of people got familiar and comfortable with driving through. >> Lisa: Yeah. >> And so there's probably a certain fast food restaurant everyone's pretty familiar with, they're pretty efficient in that, and so there are other customers who are trying to replicate that, and so how do we help them do that all, from a technology perspective. From a retail, it's one of the pickup and the online experience, but when you go into a store, I don't know about you but I go to Target, and I'm looking for something and I have kids who are kind of distracting you. Its like where is this one thing, and so I pull up the Target App for example, and it tells me where its at, right. And then obviously, stores want to make more money, so like hey, since you picked this thing, there are these things around you. So things like that is what we're having conversations with customers about. >> It's so interesting because the demand is there. >> Yeah, it is. >> And its not going to go anywhere. >> No. >> And it's certainly not going to be dialed down. We're not going to want less stuff, less often. >> Yeah (giggles) >> And as typical consumers, we don't necessarily make the association between what we're seeing in the palm of our hand on a mobile device. >> Deania: Right. >> And the infrastructure that's actually supporting all of it. >> Deania: Right. >> People hear the term Cloud and they think cloud-phone mystery. >> Yeah, magic just happens. >> Yeah. >> Yeah. >> But in fact, in order to support the things that we want to be able to do. >> Yeah. >> On the move, you have to optimize the server hardware. >> Deania: Yes. >> In certain ways. What does that mean exactly? 
When you say that its optimized, what are the sorts of decisions that you make when you're building? I think of this in the terms of Lego bricks. >> Yes, yeah >> Put together. What are some of the decisions that you make? >> So there were few key things that we really had to think about in terms of what was different from the Data center, which obviously supports the cloud environment, but it was all about how do we get closer to the customer right? How do we get things really fast and how do we compute that information really quickly. So for us, it's things like size. All right, so our server is going to weigh one of them is the size of a shoe box and (giggles), we have a picture with Dave. >> Dave: It's true. >> Took off his shoe. >> Its actually, its actually as big as a shoe. (crowd chuckles) >> It is. >> It is. >> To be fair, its a pretty big shoe. >> True, true. >> It is, but its small in relative to the old big servers that you see. >> I see what you're doing, you find a guy with a size 12, (crowd giggles) >> Yeah. >> Its the size of your shoe. >> Yeah. >> Okay. >> Its literally the size of a shoe, and that's our smallest server and its the smallest one in the portfolio, its the XR 4000, and so we've actually crammed a lot of technology in there going with the Intel ZRT processors for example to get into that compute power. The XR 8000 which you'll be hearing a lot more about shortly with our next guest is one I think from a telco perspective is our flagship product, and its size was a big thing there too. Ruggedization so its like (indistinct) certification, so it can actually operate continuously in negative 5 to 55 C, which for customers, or they need that range of temperature operation, flexibility was a big thing too. In meaning that, there are some customers who wanted to have one system in different areas of deployment. So can I take this one system and configure it one way, take that same system, configure another way and have it here. 
So flexibility was really key for us as well, and so we'll actually be seeing that in the next segment coming. >> I think one of, some of the common things you're hearing from this is our focus on innovation, purpose-built servers, so yes, our times, you know, the economic situation in itself is tough, yeah. But far from receding, we've doubled down on investment, and you've seen that with the products that we are launching here, and we will be launching in the years to come. >> I imagine there's a pretty sizeable impact to the total addressable market for PowerEdge based on the launch, what you're doing, it's going to be a TAM, a good-sized TAM expansion. >> Yeah, absolutely. Depending on how you look at it, roughly we add about $30 billion of addressable TAM between the three purposeful series that we've launched, XE, HS and XR. >> Can you comment on, I know Dell and customers are like this. Talk about, I'd love to get both of your perspectives, I'm sure you have favorite customer stories. But talk about the involvement of the customer in the generation, and the evolution of PowerEdge. Where are they in that process? What kind of feedback do they deliver? >> Well, I mean, just to start, one thing that is a central tenet of Dell, period, is it all is about the customer. All of it, everything that we do is about the customer, and so there is a big focus at our level, from on high, to get out there and talk with customers, and actually we have a pretty good story around the XR8000 which is, call it, our flagship of the XR line that we've just announced, and because of this deep customer intimacy, there was a last minute kind of architectural design change. >> Hm-mm. >> Which actually would have been, come to find out, it would have been sort of a fatal flaw for deployment. So we corrected that because of this tight intimacy with our customers.
This was about two Thanksgivings ago, and so anyways it's super cool, and the fact that we were able to make a change so late in the development cycle, that's a testament to a lot of the speed, the speed of innovation that we're driving, so anyway, that's just one example. >> Hm-mm. >> Let's talk about AI, we can't go to any trade show without talking about AI, the big thing right now is ChatGPT. >> Yeah. >> I was using it the other day, it's so interesting. But, the growing demand for AI, talk about how it's driving the evolution of the server so that more AI use cases can become more (indistinct). >> In the edge space primarily, we actually have another product, so I guess what you'll notice in the XR line itself, because there are so many different use cases and technologies that support the different use cases, we actually have a range of form factors. So we have really small, I guess I would say 350 ml, the size of a shoe box, you know, Dave's shoe box. (crowd chuckles) And then we also have, at the other end, a 472, so still small, but a little bit bigger. But we did recognize obviously AI was coming up, and so that is our XR 7620 platform, and that does support 2 GPUs, right, so, like for edge inferencing, making sure that we have the capability to support customers in that too. But also in the small one, we do also have a GPU capability there, that also helps in those other use cases as well. So we've built the platforms, even though they're small, to be able to handle the GPU power for customers. >> So nice tight package, a lot of power there. >> Yes. >> Besides, as we've all clearly demonstrated, the size of Dave's shoe. (crowd chuckles) Dave, talk about Dell's long-standing commitment to really helping to rapidly evolve the server market. >> Dave: Yeah. >> It's a pivotal player there. >> Well, like I was saying, we see innovation, I mean, this is, to us it's a race to the top.
You talked about racing and messaging, that sort of thing, when you opened up the show here, but we see this as a race to the top, having worked at other server companies where maybe it's a little bit different, maybe more of a race to the bottom sort of approach. That's what I love about being at Dell. This is very much, we understand that innovation is what's going to deliver the most value for our customers. So whether it's some of the first to market, first of its kind sort of innovation that you find in the XR4000, or XR8000, or any of our XE line, we know that at the end of the day, that is what's going to propel Dell, do the best for our customers and thereby do the best for us. To be honest, it's a little bit surprising walking by some of our competitors' booths, there's been like a dearth, zero, like no, it's almost like you wouldn't even know that there was a big launch here, right? >> Yeah. >> Or is it just me? >> No. >> We've been walking around for a while, and, it's sort of, maybe I should take this as flattery, but a lot of our competitors have been coming by to our booth every day actually. >> Deania: Yeah, every day. >> They came by multiple times yesterday, they came by multiple times today, they're taking pictures of our stuff. I kind of want to just send 'em a sample. >> Lisa: Or your shoe. >> Right? Or just maybe my shoe, right? But anyway, so I suppose I should take it as an honor. >> Deania: Yeah. >> And conversely, when we've walked over there we actually get in back (indistinct), maybe I need a high Dell (indistinct). (crowd chuckles) >> We just had that experience, yeah. >> It's kind of funny but. >> It's a good position to be in. >> Yeah. >> Yes. >> You talked about the involvement of the customers, talk a bit more about Dell's ecosystem, which is also massive, it's part of what makes Dell, Dell. >> Wait, did you say ego-system? (laughing) After David just. >> You caught that? Darn it!
Talk about the influence of, or the part of, the ecosystem, and also some of the feedback from the partners as you've been rapidly evolving the server market, and clearly your competitors are taking notice. >> Yeah, sorry. >> Deania: That's okay. >> Dave: You want to take that? >> I mean, I would say generally, one of the things that Dell prides itself on is being able to deliver the world's best innovation into the hands of our customers, faster and better than any other, the optimal solution. So whether it's, you know, working with our great partners like Intel, AMD, Broadcom, these sorts of folks. That is, at the end of the day, that is our core mantra, again, it's about service, doing the best, you know, what's best for the customers. And we want to bring the world's best innovation from our technology partners, get it into the hands of our partners, you know, faster and better than any other option out there. >> It's a satisfying business for all of us to be in, because to your point, I made a joke about the high level messaging. But really, that's what it comes down to. >> Lisa: Yeah. >> We do these things, we feel like sometimes we're toiling in obscurity, working with the hardware. But what it delivers. >> Deania: Hm-mm. >> The experiences. >> Dave: Absolutely. >> Deania: Yes. >> Are truly meaningful. So it's fun. >> Absolutely. >> It's a really fun thing to be a part of. >> It is. >> Absolutely. >> Yeah. Is there a favorite customer story that you have that really articulates the value of what Dell is doing, full PowerEdge, at the Edge? >> It's probably one I can't particularly name, obviously, but they have different environments, so, in one case it's like on flights or on sea vessels, and just being able to use the same box in those different environments is really cool. And they really appreciate having the small, compact form factor, where they can just take the server with them and go somewhere.
That was really cool to me in terms of how they were using the products that we built for them. >> I have one that's kind of funny. It's around the XR8000. Again, a customer I won't name, but they're so proud of it, they almost kind of feel like they co-defined it with us, they want to be on the patent with us, so, anyways, that's. >> Deania: (indistinct). >> That's what they went in for, yeah. >> So it shows the strength of the partnership that. >> Yeah, exactly. >> Of course, the ecosystem of partners, customers, CSPs, telecom Edge. Guys, thank you so much for joining us today. >> Thank you. >> Thank you. >> Sharing what's new with the PowerEdge. We can't wait to, we're just, we're cracking open the box, we saw the shoe. (laughing) And we're going to be delving a little bit more later. So thank you. >> We're going to be able to touch something soon? >> Yes, yes. >> Yeah. >> In a couple of minutes? >> Next segment, I think. >> All right! >> Thanks for setting the table for that, guys. We really appreciate your time. >> Thank you for having us. >> Thank you. >> Alright, our pleasure. >> For our guests and for Dave Nicholson, I'm Lisa Martin. You're watching theCUBE, the leader in live tech coverage, live in Barcelona, Spain, at MWC 23. Don't go anywhere, we will be right back with our next guests. (gentle music)
Greg Manganello, Fujitsu & Ryan McMeniman, Dell Technologies | MWC Barcelona 2023
>> Announcer: theCUBE's live coverage is made possible by funding from Dell Technologies, creating technologies that drive human progress. (pleasant music) >> We're back. This is Dave Vellante for our live coverage of MWC '23, SiliconANGLE's wall-to-wall, four-day coverage. We're here with Greg Manganello, who's from Fujitsu. He's the global head of the network services business unit at the company. And Ryan McMeniman is the director of product management for the open telecom ecosystem. We've been talking about that all week, how this ecosystem has opened up. Ryan's with Dell Technologies. Gents, welcome to theCUBE. >> Thank you, Dave. >> Thank you. >> Good to be here. >> Greg, thanks for coming on. Let's hear Fujitsu's story. We haven't heard much at this event from Fujitsu. I'm sure you've got a big presence, but welcome to theCUBE. Tell us your angle. >> Thanks very much. So Fujitsu, we're big O-RAN advocates, open radio access network advocates. We're one of the leading founders of that open standard. We're also members of the Open RAN Policy Coalition. I'm a board member there. We're kind of all in on OpenRAN. The reason is it gives operators choices and much more vendor diversity, and therefore a lot of innovation when they build out their 5G networks. >> And so as an entry point for Dell as well, I mean, obviously you guys make a lot of hay with servers and storage and other sorts of hardware, but O-RAN is just this disruptive change to this industry, but it's also compute intensive. So from Dell's perspective, what are the challenges of getting customers, the carriers, to adopt O-RAN? How do you de-risk it for them? >> Right, I mean, O-RAN really needs to be seen as a choice, right? And that choice comes with building out an ecosystem of partners, right? Working with people like Fujitsu and others helps us build systems that the carriers can rely upon. Otherwise, it looks like another science experiment, a sandbox, and it's really anything but that.
>> So what specifically are you guys doing together? Are you doing integrations, reference architectures, engineered systems, all of the above? >> Yeah, so I think it's a little bit of all of the above. So we've announced our cooperation, so the engineering teams are linked, and we're combining both our sweet spots together: Fujitsu's virtual CU/DU and our OpenRAN radios, and Dell's platforms and integration capabilities. And together we're offering a pre-integrated bundle to operators to reduce that risk and kind of help overcome some of the startup obstacles by shrinking the integration cost. >> So you've got Greenfield customers, that's pretty straightforward, white sheet of paper, go, go disrupt. And then there's traditional carriers, got 4G and 5G networks, and sort of hybrid if you will, and this integration there. Where do you see the action now? I presume it's Greenfield today, but isn't it inevitable that the traditional carriers have to go open? >> It is. A couple of different ways that they need to go, and want to go: it might be power consumption, it might be the cloudification of their network. They're going to have different reasons for doing it. And I think we have to make sure that when we work on collaborations like we do with Fujitsu, we have to look at all of those vectors. What is it: somebody maybe here in Europe is dealing with high gas prices, high energy prices; in the U.S. or wherever, it's expansion. There are going to be different justifications for it. >> Yeah, so power must be an increasing component of the operating expense, with energy costs up, and it's a power hungry environment. So how does OpenRAN solve that problem? >> So that's a great question. So by working together we can really optimize the configurations. So on the Fujitsu side, our radios are multi-band and highly compact and super energy efficient, so that the TCO for the carrier is much, much lower.
And then we've also announced, on the rApp side, power savings, energy savings applications, which are really sophisticated AI-enabled apps that can switch off the radio based upon traffic prediction models, and we can save the operator 30% on their energy bill. That's a big number. >> And that intelligence that lives in the, does it live in the RIC, is it in the brain? >> In the app right above the RIC, absolutely. >> Okay, so it's a purpose-built app to deal with that. >> It's a multi-vendor app, it can sit on anybody's O-RAN system. And one of the beauties of O-RAN is there is that open architecture, so that even if Dell and Fujitsu only sell part of the, or none of the system, an app can be selected from any vendor, including Fujitsu. So that's one of the benefits: whoever's got the best idea, the best cost performance, the best energy performance, customers can really be enabled to make the choice and continue to make choices, not just way back at RFP time, but throughout their life cycle they can keep making choices. And so that really means that, hey, if we miss the buying cycle, then we're closed out for 5 or 10 years. No, it's constantly being reevaluated, and that's really exciting, the whole ecosystem. But what we really want to do is make sure we partner together with key partners, Dell and Fujitsu, such that the customer, when they do select us, they see a bundle, not just every person for themselves. It de-risks it. And we get a lot of that integration headache out of the way before we launch it. >> I think that's what's different. We've been talking about how we've kind of seen this move before: in the nineties we saw the move from the mainframe vertical stack to the horizontal stack. We talked about that, but there are real differences, because back then you had, I don't know, five components of the stack and there was no integration, and even converged infrastructure kind of just bolted that together.
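The energy-saving rApp described above — an app sitting above the RIC that switches radios off based on traffic-prediction models — can be sketched in miniature. This is an illustrative toy only, not Fujitsu's actual algorithm: the naive moving-average "predictor", the utilization threshold, and the wattage figures are all assumptions invented for the example.

```python
# Toy sketch of a traffic-prediction power-saving policy, in the spirit of
# the rApp described above. Predictor, threshold, and wattages are assumed.

from dataclasses import dataclass


@dataclass
class RadioSite:
    name: str
    active_watts: float  # assumed draw when the radio is fully on
    sleep_watts: float   # assumed draw in a low-power sleep state


def predict_load(history: list[float]) -> float:
    """Naive forecast: mean of the last three utilization samples.

    A real rApp would use a trained ML model here; this stands in for it.
    """
    recent = history[-3:]
    return sum(recent) / len(recent)


def plan_power_state(history: list[float], threshold: float = 0.2) -> str:
    """Return 'sleep' when predicted utilization is below the threshold."""
    return "sleep" if predict_load(history) < threshold else "active"


def hourly_savings(site: RadioSite, state: str) -> float:
    """Watts saved over one hour in the chosen state, versus always-on."""
    return site.active_watts - site.sleep_watts if state == "sleep" else 0.0


site = RadioSite("cell-42", active_watts=300.0, sleep_watts=60.0)
overnight = [0.15, 0.10, 0.08]  # low overnight utilization samples
daytime = [0.60, 0.70, 0.65]

print(plan_power_state(overnight))                        # sleep
print(plan_power_state(daytime))                          # active
print(hourly_savings(site, plan_power_state(overnight)))  # 240.0
```

The point of the open architecture Greg describes is that a policy like this can come from any vendor and sit above any RIC; the interface here is invented purely for illustration.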
And then over time it's become engineered systems. When you talk to customers, Ryan, is the conversation today mostly TCO? Is it how to get the reliability and quality of service of traditional stacks? Where's the conversation today? >> Yeah, it's the flip side of choice, which is how do you make sure you have that reliability and that security to ensure that the full stack isn't just integrated, but it lives through that whole life cycle management. What are, if you're bringing in another piece, an rApp or an xApp, how do you actually make sure that it works together as a group? Because if you don't have that kind of assurance how can you actually guarantee that O-RAN in and of itself is going to perform better than a traditional RAN system? So overcoming that barrier requires partnerships and integration activity. That is an investment on the parts of our companies, but also the operators need to look back at us and say, yeah, that work has been done, and I trust as trusted advisors for the operators that that's been done. And then we can go validate it. >> Help our audience understand it. At what point in time do you feel that from a TCO perspective there'll be parity, or in my opinion it doesn't even have to be equal. It has to be close enough. And I don't know what that close enough is because the other benefits of openness, the innovation, so there's that piece of it as the cost piece and then there is the reliability. And I would say the same thing. It's got to be, well, maybe good enough is not good enough in this world, but maybe it is for some use cases. So really my question is around adoption and what are those factors that are going to affect adoption and when can we expect them to be? >> It's a good question, Dave, and what I would say is that the closed RAN vendors are making incremental improvements. 
And if you think in a snapshot there might be one answer, but if you think in kind of a flow model, a river over time, our O-RAN like-minded people are on a monster innovation curve. I mean the slope of the curve is huge. So in the OpenRAN policy coalition, 60 like-minded companies working together going north, and we're saying that let's bring all the innovation together, so you can say TCO, reliability, but we're bringing the innovation curve of software and integration curve from silicon and integration from system vendors all together to really out-innovate everybody else by working together. So that's the-- >> I like that curve analogy, Greg 'cause okay, you got the ogive or S curve, and you're saying that O-RAN is entering or maybe even before the steep part of the S curve, so you're going to go hyperbolic, whereas the traditional vendors are maybe trying to squeeze a little bit more out of the lemon. >> 1, 2%, and we're making 30% or more quantum leaps at a time every innovation. So what we tell customers is you can measure right now, but if you just do the time-based competition model, as an organization, as a group of us, we're going to be ahead. >> Is it a Moore's law innovation curve or is it actually faster because you've got the combinatorial factors of silicon, certain telco technologies, other integration software. Is it actually steeper than maybe historical Moore's law? >> I think it's steeper. I don't know Ryan's opinion, but I think it's steeper because Moore's law, well-known in silicon, and it's reaching five nanometers and more and more innovations. But now we're talking about AI software and machine learning as well as the system and device vendors. So when all that's combined, what is that? So that's why I think we're at an O-RAN conference today. I'm not sure we're at MWC. >> Well, it's true. 
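Greg's time-based competition argument — roughly 1-2% incremental gains per cycle for closed RAN versus ~30% leaps for the open ecosystem — can be made concrete with a quick compounding calculation. The percentages are his figures from the conversation; the five-cycle horizon is an assumption chosen for illustration.

```python
# Back-of-the-envelope compounding of per-cycle improvement rates.
# 2% vs 30% per cycle come from the conversation; the cycle count is
# an arbitrary illustrative horizon.

def compound(gain_per_cycle: float, cycles: int) -> float:
    """Cumulative improvement factor after n cycles at a fixed per-cycle gain."""
    return (1 + gain_per_cycle) ** cycles


closed_ran = compound(0.02, 5)      # ~1.10x after five 2% cycles
open_ecosystem = compound(0.30, 5)  # ~3.71x after five 30% cycles

print(f"closed RAN: {closed_ran:.2f}x, open ecosystem: {open_ecosystem:.2f}x")
```

The exact numbers matter less than the shape: a steeper per-cycle rate compounds into a widening gap, which is the S-curve point made in the exchange above.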
It's funny they changed the name from Mobile World Congress and that was never really meant to be a consumer show, but these things change that, right? And so I think it's appropriate MWC because we're seeing really deep enterprise technology now enter, so that's your sweet spot, isn't it? >> It really is. But I think in some ways it's the path to that price-performance parity, which we saw in IT a long time ago, making its way into telecom is there, but it doesn't work unless everybody is on board. And that involves players like this and even smaller companies and innovative startups, which we really haven't seen in this space for some time. And we've been having them at the Dell booth all week long. And there's really interesting stuff like Greg said, AI, ML, optimization and efficiency, which is exciting. And that's where O-RAN can also benefit the industry. >> And as I say, there are other differences to your advantage. You've got engineered systems or you've been through that in enterprise IT, kind of learned how to do that. But you've also got the cloud, public cloud for experimentation, so you can fail cheaply, and you got AI, right, which is, really didn't have AI in the nineties. You had it, but nobody used it. And now you're like, everybody's using ChatGPT. >> Right, but now what's exciting, and the other thing that Ryan and we are working on together is linking our labs together because it's not about the first-time system integration and connecting the hoses together, and okay, there it worked, but it's about the ongoing life cycle management of all the updates and upgrades. And by using Dell's OTEL Lab and Fujitsu's MITC lab and linking them together, now we really have a way of giving operators confidence that as we bring out the new innovations it's battle tested by two organizations. And so two logos coming together and saying, we've looked at it from our different angles and then this is battle tested. There's a lot of value there.
>> I think the labs are key. >> But it's interesting, the point there is by tying labs together, there's an acknowledged skills gap as we move into this O-RAN world that operators are looking to us and probably Fujitsu saying, help our team understand how to thrive in this new environment because we're going from closed systems to open systems where they actually again, have more choice and more ability to be flexible. >> Yeah, if you could take away that plumbing, even though they're good plumbers. All right guys, we got to go. Thanks so much for coming on theCUBE. >> Thank you so much. >> It's great to have you. >> Appreciate it, Dave. >> Okay, keep it right there. Dave Vellante, Lisa Martin, and Dave Nicholson will be back from the Fira in Barcelona on theCUBE. Keep it right there. (pleasant music)
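Greg's "time-based competition model" (1, 2% incremental gains versus 30% leaps per innovation cycle) compounds quickly. A rough sketch of that arithmetic; the per-cycle figures come from the conversation above, while the cycle count is our assumption, purely illustrative:

```python
# Illustrative only: compare compounding ~2% incremental improvements
# against 30% "quantum leap" improvements per innovation cycle.
def compound_gain(gain_per_cycle: float, cycles: int) -> float:
    """Total multiplier after repeatedly applying a per-cycle gain."""
    return (1 + gain_per_cycle) ** cycles

incumbent = compound_gain(0.02, 10)  # ~1.22x after 10 cycles
open_ran = compound_gain(0.30, 10)   # ~13.8x after 10 cycles
print(f"incumbent: {incumbent:.2f}x, open ecosystem: {open_ran:.2f}x")
```

The point of the sketch is only that, under compounding, the size of the per-cycle step dominates any head start the incumbent curve has.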
Vanesa Diaz, LuxQuanta & Dr Antonio Acin, ICFO | MWC Barcelona 2023
(upbeat music) >> Narrator: theCUBE's live coverage is made possible by funding from Dell Technologies: creating technologies that drive human progress. (upbeat music) >> Welcome back to the Fira in Barcelona. You're watching theCUBE's coverage of day two of MWC 23. Check out SiliconANGLE.com for all the news, John Furrier in our Palo Alto studio, breaking that down. But we're here live, Dave Vellante, Dave Nicholson and Lisa Martin. We're really excited. We're going to talk qubits. Vanessa Diaz is here. She's CEO of LuxQuanta and Antonio Acin is a professor at ICFO. Folks, welcome to theCUBE. We're going to talk quantum. Really excited about that. >> Vanessa: Thank you guys. >> What does quantum have to do with the network? Tell us. >> Right, so we are actually living the second quantum revolution. So the first one actually happened quite a few years ago. It enabled very much the communications that we have today. So in this second quantum revolution, if in the first one we learned about some very basic properties of quantum physics, now our scientific community is able to actually work with the systems and ask them to do things. So quantum technologies mean right now three main pillars, or areas of exploration. The first one is quantum computing. Everybody knows about that. Antonio knows a lot about that too so he can explain further. And it's about computers that now can do wonders. So the ability of these computers to compute is amazing. So they'll be able to do amazing things. The other pillar is quantum communications, but in fact it's slightly older than quantum computing, nobody knows that. And we are the ones that are coming to actually counteract the superpowers of quantum computers. And last but not least, quantum sensing, that's the application of, again, quantum physics to measure things that were impossible to measure before with such a level of quality, of precision. So that's very much where we are right now.
>> Okay, so I think I missed the first wave of quantum computing Because, okay, but my, our understanding is ones and zeros, they can be both and the qubits aren't that stable, et cetera. But where are we today, Antonio in terms of actually being able to apply quantum computing? I'm inferring from what Vanessa said that we've actually already applied it but has it been more educational or is there actual work going on with quantum? >> Well, at the moment, I mean, typical question is like whether we have a quantum computer or not. I think we do have some quantum computers, some machines that are able to deal with these quantum bits. But of course, this first generation of quantum computers, they have noise, they're imperfect, they don't have many qubits. So we have to understand what we can do with these quantum computers today. Okay, this is science, but also technology working together to solve relevant problems. So at this moment is not clear what we can do with present quantum computers but we also know what we can do with a perfect quantum computer without noise with many quantum bits, with many qubits. And for instance, then we can solve problems that are out of reach for our classical computers. So the typical example is the problem of factorization that is very connected to what Vanessa does in her company. So we have identified problems that can be solved more efficiently with a quantum computer, with a very good quantum computer. People are working to have this very good quantum computer. At the moment, we have some imperfect quantum computers, we have to understand what we can do with these imperfect machines. >> Okay. So for the first wave was, okay, we have it working for a little while so we see the potential. Okay, and we have enough evidence almost like a little experiment. And now it's apply it to actually do some real work. >> Yeah, so now there is interest by companies so because they see a potential there. 
So they are investing and they're working together with scientists. We have to identify use cases, problems of relevance for all of us. And then once you identify a problem where a quantum computer can help you, try to solve it with existing machines and see if you can get an advantage. So now the community is really obsessed with getting a quantum advantage. So we really hope that we will get a quantum advantage. This, we know we will get it, once we eventually have a very good quantum computer. But we want to have it now. And we're working on that. We have some results; there was, I would say, a bit of an academic situation in which a quantum advantage was proven. But to be honest with you, on a really practical problem, this has not happened yet. But I believe the day that this happens, I mean, it will really be game changing. >> So you mentioned the word efficiency and you talked about the quantum advantage. Is the quantum advantage a qualitative advantage in that it is fundamentally different? Or is it simply a question of greater efficiency, so therefore a quantitative advantage? The example in the world we're used to, think about a card system where you're writing information on a card and putting it into a filing cabinet and then you want to retrieve it. Well, the information's all there, you can retrieve it. A computer system accelerates that process. It's not doing something that is fundamentally different unless you accept that the speed with which these things can be done gives it a separate quality. So how would you characterize that quantum versus non quantum? Is it just so much horsepower changes the game or is it fundamentally different? >> Okay, so from a fundamental perspective, quantum physics is qualitatively different from classical physics. I mean, this year the Nobel Prize was given to three experimentalists who made experiments that proved that quantum physics is qualitatively different from classical physics.
This is established, I mean, there have been experiments proving that. Now when we discuss quantum computation, it's more a quantitative difference. So we have problems that you can solve, in principle you can solve with the classical computers, but the amount of time you need to solve them, we are talking about centuries, and not with your laptop, even with a classic supercomputer, these machines that are huge, where you have a building full of computers, there are some problems for which computers take centuries to solve them. So you can say that it's quantitative, but in practice you may even say that it's impossible in practice and it will remain impossible. And now these problems become feasible with a quantum computer. So it's quantitative but almost qualitative, I would say. >> Before we get into the problems, 'cause I want to understand some of those examples, but Vanessa, so your role at LuxQuanta is you're applying quantum in the communication sector for security purposes, correct? >> Vanessa: Correct. >> Because everybody talks about how quantum's going to ruin our lives in terms of taking all our passwords and figuring everything out. But can quantum help us defend against quantum and is that what you do? >> That's what we do. So one of the things that Antonio's explaining is that a quantum computer will be able to solve in a reasonable amount of time something that today is impossible to solve unless you leave a laptop or supercomputer working for years. So one of those things is cryptography. So at the end, when you send a message and you want to preserve its confidentiality, what you do is you destroy it but following certain rules, which means using some kind of key, and therefore you can send it through a public network, which is the case for every communication that we have, we go through the internet, and then the receiver is going to be able to reassemble it because they have that private key and nobody else does.
So that private key is actually made of computational problems or mathematical problems that are very, very hard. We're talking about 40 years' time for a supercomputer today to be able to hack it. However, we do not have the guarantee that there aren't already very smart minds that already have, potentially, the capacity also of a quantum computer, even with not millions, but maybe just a few qubits; it's enough to actually hack this cryptography. And there is also the fear that somebody could actually be waiting for quantum computing to finally reach this amazing capacity, harvesting now, which means capturing all this confidential information and storing it, so that when they have the power, they can unlock it and hack it and see what's behind. So we are talking about information as delicate as governmental information, citizens' information related to health, for example, you name it. So what we do is we build a key to encrypt the information, but it's not relying on a mathematical problem, it's relying on the laws of quantum physics. So I'm going to have a channel into which I'm going to pump photons, particles of light. And that quantum channel, because of the laws of physics, is going to allow me to detect somebody trying to sneak in and see the key that I'm establishing. If that happens, I will not create a key. If it's clean and nobody was there, I'll give you a super key that nobody today or in the future, regardless of their computational power, will be able to hack. >> So it's like super zero trust. >> Super zero trust. >> Okay, so but quantum can solve really challenging mathematical problems. If you had a quantum computer could you be a Bitcoin billionaire? >> Not that I know. I think people are, okay, now you move me a bit out of my comfort zone. Because I know people have been working on that. I don't think there is a lot of progress, at least not that I am aware of.
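The "40 years for a supercomputer" figure Vanessa cites rests on factoring-style hardness: classical trial division has to test divisors up to the square root of the number, so every extra digit in the factors multiplies the work, while Shor's algorithm on an ideal quantum computer would run in polynomial time. A toy sketch of that scaling (our illustration, not anyone's actual cryptanalysis; real RSA moduli are hundreds of digits long):

```python
# Toy illustration of classical factoring cost. Trial division tries
# divisors up to sqrt(n); for an RSA-style semiprime p*q with p and q
# of similar size, that means roughly sqrt(n) divisions before the
# smallest factor shows up.
def trial_division_steps(n: int) -> int:
    """Count the divisions needed to find the smallest factor of n."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            break
        d += 1
    return steps

# Each extra digit in the factors multiplies the work:
print(trial_division_steps(101 * 103))    # 100 divisions
print(trial_division_steps(1009 * 1013))  # 1008 divisions
```

Smarter classical algorithms beat trial division, but the cost still grows super-polynomially in the key size, which is the asymmetry the whole discussion leans on.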
Okay, but I mean, in principle you have to understand that our society is based on information and computation. Computers are a key element in our society. And if you have a machine that computes better, much better than our existing machines, this gives you an advantage for many things. I mean, progress is locked by many computational problems we cannot solve. We want to have better materials, better medicines, better drugs. I mean, for this, you have to solve hard computational problems. If you have a machine that gives you machine learning, big data, I mean, if you have a machine that gives you an advantage there, this may be a really real change. I'm not saying that we know how to do these things with a quantum computer. But if we understand how this machine that has been proven more powerful in some context can be adapted to some other context... I mean, having a much better computing machine is an advantage. >> When? When are we going to have, you said we don't really have it today, we want it today. Are we five years away, 10 years away? Who's working on this? >> There are already quantum computers out there. It's just that the capacity that they have right now is on the order of a few hundred qubits. So there are already companies harvesting this, actually the companies that make these computers; they're already putting them out there. People can access them through the cloud and they can actually run certain algorithms that have been tailor-made or translated to the language of a quantum computer to see how that performs there. So some people are already working with them. There are billions of investment across the world being put on different flavors of technologies that can reach that quantum supremacy that we are talking about. The question, though, that you're asking is about Q day. It sounds like doomsday, you know, Q day. So depending on who you talk to, they will give you a different estimation.
So some people say, well, 2030 for example, but perhaps we could even think that it could be a more aggressive date, maybe 2027. So it is yet to be final, let's say it's not that hard a deadline, but I think that the risk that it can actually bring is big enough for us to pay attention to this and start preparing for it. So the end of times for cryptography, that's what quantum is bringing, and what we are doing is, we have a system here that can actually prevent all your communications from being hacked. So if you think also about Q day and you go all the way back, whatever tools you need to protect yourself from it, you need to deploy them, you need to see how they fit in your organization, evaluate the benefits, learn about it. So, how close in time does that bring us? Because I believe that the time to start thinking about this is now. >> And it's likely it'll be some type of hybrid that will get us there, hybrid between existing applications. 'Cause you have to rewrite or write new applications and that's going to take some time. But it sounds like you feel like this decade we will see Q day. What probability would you give that? Is it better than 50/50? By 2030 we'll see Q day. >> But I'm optimistic by nature. So yes, I think it's much higher than 50. >> Like how much higher? >> 80, I would say, yes. I'm pretty confident. I mean, but what I want to say also, usually when I think, there is a message here: so you have your laptop, okay; in the past I had a Spectrum. That was a very small computer, it was more or less the same size, but this machine is much more powerful. Why? Because we put information on smaller scales. So we always put information on smaller and smaller scales. This is why here, for the same size, you have much more information, because you put it on smaller scales. So if you go small and small and small, you'll find the quantum world. So this is unavoidable. So our information devices are going to meet the quantum world and they're going to exploit it.
I'm fully convinced about this, maybe not for the quantum computer we're imagining now, but they will find it and they will use quantum effects. And also for cryptography, for me, this is unavoidable. >> And you brought up the point, there are several companies working on that. I mean, I can get quantum computers in the cloud, and Amazon and other suppliers, IBM of course is. >> The underlying technology, there are competing versions of how you actually create these qubits, spins of electrons and all sorts of different things. Does it need to be super cooled or not? >> Vanessa: There we go. >> At a fundamental stage we'd be getting ground. But what is, what does ChatGPT look like when it can leverage the quantum realm? >> Well, okay. >> I mean, are we all out of jobs at that point? Should we all just be planning for? >> No. >> Not you. >> I think all of us, real estate in Portugal, should we all be looking? >> No, actually, I mean, in machine learning there are some hopes about quantum computation, because usually you have to deal with lots of data. And we know that in quantum physics you have a concept that is called superposition. So there are some hopes, nothing concrete yet, but we have some hopes that these superpositions may allow you to explore this big data in a more efficient way. One has to see if this can be confirmed. But one of the hopes, creating these lots of qubits in these superpositions, is that you will have better artificial intelligence machines, but, okay, this is quite science fiction, what I'm saying now.
>> I mean our technologies, again it's impossible to hack because it is the laws of physics what are allowing me to detect an intruder. So that's the beauty of it. It's not something that you're going to have to replace in the future because there will be a triple quantum computer, it is not going to affect us in any way but definitely the more capacity, computational capacity that we see out there in quantum computers in particular but in any other technologies in general, I mean, when we were coming to talk to you guys, Antonio and I, he was the one saying we do not know whether somebody has reached some relevant computational power already with the technologies that we have. And they've been able to hack already current cryptography and then they're not telling us. So it's a bit of, the message is a little bit like a paranoid message, but if you think about security that the amount of millions that means for a private institution know when there is a data breach, we see it every day. And also the amount of information that is relevant for the wellbeing of a country. Can you really put a reasonable amount of paranoid to that? Because I believe that it's worth exploring whatever tool is going to prevent you from putting any of those piece of information at risk. >> Super interesting topic guys. I know you're got to run. Thanks for stopping by theCUBE, it was great to have you on. >> Thank you guys. >> All right, so this is the SiliconANGLE theCUBE's coverage of Mobile World Congress, MWC now 23. We're live at the Fira Check out silicon SiliconANGLE.com and theCUBE.net for all the videos. Be right back, right after this short break. (relaxing music)
Srinivas Mukkamala & David Shepherd | Ivanti
(gentle music) >> Announcer: theCUBE's live coverage is made possible by funding from Dell Technologies, creating technologies that drive human progress. (upbeat music) (logo whooshing) >> Hey, everyone, welcome back to theCUBE's coverage of day one, MWC23, live from Barcelona, Lisa Martin here with Dave Vellante. Dave, we've got some great conversations so far. This is the biggest, most packed show I've been to in years. About 80,000 people here so far. >> Yeah, down from its peak of 108, but still pretty good. You know, a lot of folks from China come to this show, but with the COVID situation in China, that's impacted the attendance, but still quite amazing. >> Amazing for sure. We're going to be talking about trends in mobility, and all sorts of great things. We have a couple of guests joining us for the first time on theCUBE. Please welcome Dr. Srinivas Mukkamala, or Sri, chief product officer at Ivanti, and Dave Shepherd, VP at Ivanti. Guys, welcome to theCUBE. Great to have you here. >> Thank you. >> So, day one of the conference, Sri, we'll go to you first. Talk about some of the trends that you're seeing in mobility. Obviously, the conference renamed from Mobile World Congress to MWC, mobility being part of it, but what are some of the big trends? >> It's interesting, right? I mean, I was catching up with Dave. The first thing is, from the keynotes, it took 45 minutes to talk about security. I mean, it's quite interesting when you look at the show floor. We're talking about Edge, we're talking about 5G, the whole evolution. And there's also the concept of, are we going into the Cloud? Are we coming back from the Cloud, back to the Edge? They're really two different things. Edge is all decentralized while you recompute. And one thing I observed here is they're talking about near real-time reality. When you look at automobiles, when you look at medical, when you look at robotics, you can't have things processed in the Cloud. It'll be too late.
Because you've got to make millisecond-based decisions. That's a big trend for me. When I look at stuff... Okay, the compute it takes to process in the Cloud versus what needs to happen on-prem, on device, is going to revolutionize the way we think about mobility. >> Revolutionize. David, what are some of the things that you're seeing? Do you concur? >> Yeah, 100%. I mean, look, just reading some of the press recently, they're predicting 22 billion IoT devices by 2024. Everything Sri just talked about there. It's growing exponentially. You know, problems we have today are a snapshot. We're probably in the slowest place we are today. Everything's just going to get faster and faster and faster. So it's a, yeah, 100% concur with that. >> You know, Sri, on your point, so Jose Maria Alvarez, the CEO of Telefonica, said there are three pillars of the future of telco: low latency, programmable networks, and Cloud and Edge. So, as to your point, Cloud and low latency haven't gone hand in hand. But the Cloud guys are saying, "All right, we're going to bring the Cloud to the Edge." That's sort of an interesting dynamic. We're going to bypass them. We heard somebody, another speaker say, "You know, Cloud can't do it alone." You know? (chuckles) And so, it's like these worlds need each other in a way, don't they? >> Definitely right. So that's a fantastic way to look at it. The Cloud guys can say, "We're going to come closer to where the computer is." And if you really take a look at it with data localization, where are we going to put the Cloud in, right? I mean, so the data sovereignty becomes a very interesting thing. The localization becomes a very interesting thing. And when it comes to security, it gets completely different. I mean, we talked about moving everything to a centralized compute, really have massive processing, and give you the addition back wherever you are. Whereas when you're localized, I have to process everything within the local environment.
So there's already a conflict right there. How are we going to address that? >> Yeah. So another statement, I think it was the CEO of Ericsson, he was kind of talking about how the OTT guys have heard, "We can't let that happen again. And we're going to find new ways to charge for the network." Basically, he's talking about monetizing the API access. But I'm interested in what you're hearing from customers, right? 'Cause our mindset is, what value you're going to give to customers that they're going to pay for, versus, "I got this data I'm going to charge developers for." But what are you hearing from customers? >> It's amazing, Dave, the way you're looking at it, right? So if we take a look at what we were used to, perpetual, and we said we're going to move to a subscription, right? I mean, everybody talks about the subscription economy. Telcos, on the other hand, had a subscription economy for a long time, right? They were always based on usage, right? It's a usage economy. But today, we are basically realizing on compute. We haven't even started charging for compute. If you go to AWS, go to Azure, go to GCP, they still don't quite charge you for actual compute, right? It's kind of, they're still leaning on it. So think about API-based; we're going to break the bank. What people don't realize is, we do millions of API calls for any high-transaction environment. A consumer can't afford that. What people don't realize is... I don't know how you're going to monetize. Even if you charge a cent a call, that is still going to be hundreds and thousands of dollars a day. And that's where, if you look at what you call the low-code no-code motion, you see a plethora of companies being built on that. They're saying, "Hey, you don't have to write code. I'll give you authentication as a service. What that means is, every single time you call my API to authenticate a user, I'm going to charge you." So just imagine how many times we authenticate on a single day.
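Sri's cent-a-call arithmetic is easy to make concrete. A back-of-envelope sketch; the user count, call rate, and price below are our round-number assumptions, not figures from the conversation:

```python
# Illustrative only: what flat per-call pricing does to a high-volume
# authentication API. All inputs are assumed round numbers.
def daily_api_bill(users: int, calls_per_user_per_day: int,
                   price_per_call: float) -> float:
    """Daily cost of metering every authentication call."""
    return users * calls_per_user_per_day * price_per_call

# 100,000 users, each authenticating a few dozen times a day,
# at a cent per call:
print(round(daily_api_bill(100_000, 30, 0.01)))  # 30000, i.e. ~$30,000 a day
```

Which is the friction being described: per-call metering that is negligible for one user becomes a material line item the moment the call volume is platform-scale.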
You're talking a few dozen times. And if I have to pay every single time I authenticate... >> Real friction in the marketplace, David. >> Yeah, and I tell you what. It's a big topic, right? And it's a topic that we haven't had to deal with at the Edge before, and we hear it probably daily really, complexity. The complexity's growing all the time. That means that we need to start to get insight, visibility. You know? I think a part of... Something that came out of the EU actually this week stated, you know, there's a cyber attack every 11 seconds. That's fast, right? 2016, that was 40 seconds. So actually that speed I talked about earlier, everything Sri says that's coming down to the Edge, we want to embrace the Edge and that is the way we're going to move. But customers are mindful of the complexity that's involved in that. And that, you know, lends thought to how we're going to deal with those complexities. >> I was just going to ask you, how are you planning to deal with those complexities? You mentioned one ransomware attack every 11 seconds. That interval is down considerably from just a few years ago. Ransomware is a household word. It's no longer, "Are we going to get attacked?" It's when, it's to what extent, it's how much. So how is Ivanti helping customers deal with some of the complexities, and the changes in the security landscape? >> Yeah. Shall I start on that one first? Yeah, look, we want to give all our customers and prospective customers full visibility of their environment. You know, devices that are attached to the environment. Where are they? What are they doing? How often are we going to look for those devices? Not only when we find those devices. What applications are they running? Are those applications secure? How are we going to manage those applications moving forward? And overall, wrapping it round, what kind of service are we going to do? What processes are we going to put in place? To Sri's point, the low-code no-code angle.
How do we build processes that protect our organization? But probably a point where I'll pass to Sri in a moment is, how do we add a level of automation to that? How do we add a level of intelligence that doesn't always require a human to be fixing or remediating a problem? >> Sri, you mentioned... You're right, the keynote, it took 45 minutes before it even mentioned security. And I suppose it's because, historically, they've had this hardened stack. Everything's controlled and it's a safe environment. And now that's changing. So what would you add? >> You know, great point, right? If you look at telcos, they're used to a perimeter-based network. >> Yep. >> I mean, that's what we are. Boxed, we knew our perimeter. Today, our perimeter is extended to our home, to work everywhere, right? >> Yeah- >> We don't have a definition of a perimeter. Your browser is the new perimeter. And a good example, segueing to that: what we have seen is horizontal-based security. What we haven't seen is verticalization, especially in mobile. We haven't seen vertical mobile security solutions, right? Yes, you hear a little bit about automotive, you hear a little bit about healthcare, but what we haven't seen is, what about the food sector? What about the frontline in food? What about supply chain? What security are we really doing? And I'll give you a simple example. You brought up ransomware. Last night, Dole was attacked with ransomware. We have seen the beef producer, Colonial Pipeline. Now, if we have seen agritech being hit, what does it mean? We are starting to hit humanity. If you can't really put food on the table, you're starting to really disrupt the supply chain, right? In a massive way. So you've got to start thinking about that. Why is Dole related to mobility? Think about that. They don't carry servers and computers. What they carry is mobile devices. That's where the supply chain works. And that's where you have to start thinking about it.
And the evolution of ransomware: rather than a one-trick pony, you see them using multiple vulnerabilities. And Pegasus was the best example. Spyware across all politicians, right? And CEOs. It was six or seven vulnerabilities put together that actually were used to construct an attack. >> Yeah. How does AI kind of change this? Where does it fit in? The attackers are going to have AI, but we could use AI to defend. But attackers are always ahead, right? (chuckles) So what's your... Do you have a point of view on that? 'Cause everybody's crazy about ChatGPT, right? The banks have all banned it. Certain universities in the United States have banned it. Another is forcing its students to learn how to use ChatGPT and prompt it. It's all over the place. You have a point of view on this? >> So definitely, Dave, it's a great point. First, we all have to have our own generative AI. I mean, I look at it as your digital assistant, right? So when you had calculators, you can't function without a calculator today. It's not harmful. It's not going to take you away from doing multiplication, right? So we'll still teach arithmetic in school. You'll still use your calculator. So to me, AI will become an integral part. That's one beautiful thing I've seen on the show floor: for every little thing, there is an AI-based solution, I've seen, right? So ChatGPT is well played from multiple perspectives. I would rather up-level it and say generative AI is the way to go. So there are three things. There is human-intensive triaging, where humans keep doing easy, minimal work; you can use ML and AI to do that. And there is the design work that humans need to do; that's when you need to use AI. >> But I would say this: in the Enterprise, the quality of the AI has to be better than what we've seen so far out of ChatGPT, even though I love ChatGPT, it's amazing. But what we've seen from being... It's got to be... Don't you think it has to be cleaner, more accurate?
It can't make up stuff. If I'm going to be automating my network with AI. >> I'll answer that question. It comes down to three fundamentals. The reason ChatGPT is giving outdated answers is that it's not trained on the latest data. So for any AI and ML method, you've got to look at three things: your data, your domain expertise (who is training it), and your data model. In ChatGPT, it's older data, and it's biased to the people that trained it, right? >> Mm-hmm. >> And then, the data model: it's going to spit out what it's trained on. That's a precursor of any GPT, right? It's a pre-trained transformer. >> So if we narrow that, right? Train it better for the specific use case, that AI has huge potential. >> You flip that to what the Enterprise customers talk to us about: insight is invaluable. >> Right. >> But then, too much insight too quickly all the time means we go remediation crazy. So we haven't got enough humans to be fixing all the problems. To Sri's point with the ChatGPT data, some of that data we are looking at there could be old. So we're trying to triage something that may still be an issue, but it might have been superseded by something else as well. So that's my overriding thought when I'm talking to customers and we talk ChatGPT; it's in the news all the time. It's very topical. >> It's fun. >> It is. I even said to my 13-year-old son yesterday, your homework's out of date. 'Cause I knew he was doing some summary stuff on ChatGPT. So a little wind-up, that it's out of date, just to make that emphasis around the model. And that's where we, with our Neurons platform at Ivanti, that's what we want to give the customers all the time, which is the real-time snapshot. So they can make a priority or a decision based on what that information is telling them. >> And we've kind of learned, I think, over the last couple of years, that access to real-time data, real-time AI, is no longer a nice-to-have.
It's a massive competitive advantage for organizations, and it's going to enable the on-demand everything that we expect in our consumer lives, in our business lives. This is going to be table stakes for organizations, I think, in every industry going forward. >> Yeah. >> But that assumes 5G, right? Is going to actually happen and somebody's going to- >> Going to absolutely. >> Somebody's going to make some money off it at some point. When are they going to make money off of 5G, do you think? (all laughing) >> No. And then you asked a very good question, Dave. I want to answer that question. Will bad guys use AI? >> Yeah. Yeah. >> Offensive AI is a very big thing. We have to pay attention to it. It's going to create an asymmetric war. If you look at the president of the United States, he said, "If somebody's going to attack us on cyber, we are going to retaliate." For the first time, the US is willing to launch a cyber war. What that really means is, we're going to use AI for offensive reasons as well. And we as citizens have to pay attention to that. And that's what I'm worried about, right? AI bias, whether it's data bias, or domain expertise bias, or algorithmic bias, is going to be a big thing. And offensive AI is something everybody has to pay attention to. >> To your point, Sri, earlier about critical infrastructure getting hacked, I had this conversation with Dr. Robert Gates several years ago, and I said, "Yeah, but don't we have the best offensive, you know, technology in cyber?" And he said, "Yeah, but we've got the most to lose too." >> Yeah, 100%. >> We're the wealthiest nation; the United States is the wealthiest. So you've got to be careful. But to your point, the president of the United States saying, "We'll retaliate," right? Not necessarily start the war, but who started it? >> But that's the thing, right? Attribution is the hardest part. And then you talked about a very interesting thing: rich nations, right? There are emerging nations. There are nations left behind.
One thing I've seen on the show floor today is digital inequality. Digital poverty is a big thing. While we have this amazing technology, 90% of the world doesn't have access to it. >> Right. >> What we have done is, we have created an inequality across the board, and especially in mobility and cyber. If this technology doesn't reach the last mile, which is emerging nations, I think we are creating a crater back again and putting societies a few miles back. >> And at much greater risk. >> 100%, right? >> Yeah. >> Because those are the guys. In cyber, all you need is a laptop and a brain to attack. >> Yeah. Yeah. >> If I don't have it, that's where the civil war is going to start again. >> Yeah. What are some of the things, in our last minute or so, guys, David, we'll start with you and then Sri, go to you, that you're looking forward to at this MWC? The theme is velocity. We're talking about so much transformation and evolution in the telecom industry. What are you excited to hear and learn in the next couple of days? >> Just getting a complete picture. One is actually being out after the last couple of years, so you learn a lot. But just walking around and seeing, from my perspective, some vendor names that I haven't seen before, and seeing what they're doing and bringing to the market. But I think it goes back to the point made earlier around APIs and integration. Everybody's talking about how we can kind of do this together in a way. So integrations, those smart things, are what I'm kind of looking for as well, and how we plug into that as well. >> Excellent, and Sri? >> So for us, there is a lot to offer, right? So while I'm enjoying what I'm seeing here, I'm seeing an opportunity. We have an amazing portfolio of what we can do. We are into mobile device management. We are the last (indistinct) company. When people find problems, somebody has to go remediate them. We are the world's largest patch management company.
And what I'm finding is, yes, all these people are embedding software, pumping it out like nobody's business. As you find a vulnerability, somebody has to go fix it, and we want to be the (indistinct) company. We have the last mile. And I find an amazing opportunity: not only can we do device management, but we can do mobile threat defense and give them a risk prioritization on what needs to be remediated, and manage all that in our ITSM. So I look at this as an amazing, amazing opportunity. >> Right. >> Which is exponentially more than what I've seen before. >> So last question then. Speaking of opportunities, Sri, for you, what are some of the things that customers can go to? Obviously, you guys talk to customers all the time. In terms of learning what Ivanti is going to enable them to do, to take advantage of these opportunities. Any webinars, any events coming up that we want people to know about? >> Absolutely, ivanti.com is the best place to go, because we keep everything there. Of course, the "theCUBE" interview. >> Of course. >> You should definitely watch that. (all laughing) No. So we have quite a few industry events we do. And especially there's a lot of learning. And we just released the ransomware report that actually talks about ransomware from a global index perspective. So one thing we have done is, rather than just looking at vulnerabilities, we showed them the weaknesses that led to the vulnerabilities, and how attackers are using them. And we even talked about DHS, how behind they are in disseminating the information and how it's actually being used by nation-states. >> Wow. >> And we did cover mobility as a part of that as well. So there's quite a bit we did in our report, and it actually came out very well. >> I have to check that out. Ransomware is such a fascinating topic.
Guys, thank you so much for joining Dave and me on the program today, sharing what's going on at Ivanti, the changes that you're seeing in mobile, and the opportunities that are there for your customers. We appreciate your time. >> Thank you. >> Thank you. >> Yes. Thanks, guys. >> Thanks, guys. >> For our guests and for Dave Vellante, I'm Lisa Martin. You're watching "theCUBE" live from MWC23 in Barcelona. As you know, "theCUBE" is the leader in live tech coverage. Dave and I will be right back with our next guest. (gentle upbeat music)
Breaking Analysis: MWC 2023 goes beyond consumer & deep into enterprise tech
>> From theCUBE Studios in Palo Alto in Boston, bringing you data-driven insights from theCUBE and ETR, this is Breaking Analysis with Dave Vellante. >> While never really meant to be a consumer tech event, the rapid ascendancy of smartphones sucked much of the air out of Mobile World Congress over the years, now MWC. And while the device manufacturers continue to have a major presence at the show, the maturity of intelligent devices, longer life cycles, and the disaggregation of the network stack, have put enterprise technologies front and center in the telco business. Semiconductor manufacturers, network equipment players, infrastructure companies, cloud vendors, software providers, and a spate of startups are eyeing the trillion dollar plus communications industry as one of the next big things to watch this decade. Hello, and welcome to this week's Wikibon CUBE Insights, powered by ETR. In this Breaking Analysis, we bring you part two of our ongoing coverage of MWC '23, with some new data on enterprise players specifically in large telco environments, a brief glimpse at some of the pre-announcement news and corresponding themes ahead of MWC, and some of the key announcement areas we'll be watching at the show on theCUBE. Now, last week we shared some ETR data that showed how traditional enterprise tech players were performing, specifically within the telecoms vertical. Here's a new look at that data from ETR, which isolates the same companies, but cuts the data for what ETR calls large telco. The N in this cut is 196, down from 288 last week when we included all company sizes in the dataset. Now remember the two dimensions here, on the y-axis is net score, or spending momentum, and on the x-axis is pervasiveness in the data set. The table insert in the upper left informs how the dots and companies are plotted, and that red dotted line, the horizontal line at 40%, that indicates a highly elevated net score. 
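For readers new to the ETR methodology described above: net score is computed from survey responses about spending intentions. Here is a rough sketch of the commonly described calculation (the five response categories and the equal weighting are my simplification of ETR's published approach, not something stated in this episode):

```python
def net_score(adopting: int, increasing: int, flat: int,
              decreasing: int, replacing: int) -> float:
    """Percentage of respondents spending more, minus percentage spending less."""
    n = adopting + increasing + flat + decreasing + replacing
    spending_more = adopting + increasing
    spending_less = decreasing + replacing
    return 100.0 * (spending_more - spending_less) / n

# 50 of 100 respondents spending more and 10 spending less puts a vendor
# right at the 40% "highly elevated" red dotted line described above.
assert net_score(adopting=20, increasing=30, flat=40, decreasing=5, replacing=5) == 40.0
```

Pervasiveness, the x-axis, is a separate measure: roughly, how frequently the vendor shows up in the survey sample at all.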
Now while the data are not dramatically different in terms of relative positioning, there are a couple of changes at the margin. So just going down the list and focusing on net score: Azure is comparable, but slightly lower in large telco than it was in telco overall. Google Cloud comes in at number two, and basically swapped places with AWS, which drops slightly in large telco relative to overall telco. Snowflake is also slightly down by one percentage point, but maintains its position. Remember, Snowflake's overall net score is much, much higher when measuring across all verticals. Snowflake comes down in telco relative to overall, and a little bit further down in large telco, but it's making some moves to attack this market that we'll talk about in a moment. Next are Red Hat OpenStack and Databricks, about the same in large telco as they were in overall telco. Then there's Dell, next, which has a big presence at MWC and is getting serious about driving 16G adoption, new servers, edge servers, and other partnerships. Cisco and Red Hat OpenShift basically swapped spots when moving from all telco to large telco, as Cisco drops and Red Hat bumps up a bit. And VMware dropped about four percentage points in large telco. Accenture moved up dramatically, about nine percentage points in large telco relative to all telco. HPE dropped a couple of percentage points. Oracle stayed about the same. And IBM, surprisingly, dropped by about five points. So look, I understand, not a ton of change in terms of spending momentum in the large sector versus telco overall, but some deltas. The bottom line for enterprise players is, one, they're just getting started in this new disruption journey that they're on as the stack disaggregates.
Two, all these players have experience in delivering horizontal solutions, but are now working with partners and identifying big problems to be solved. And three, many of these companies are generally not the fastest-moving firms relative to smaller disruptors. Now, cloud has been an exception, in fairness. But the good news for the legacy infrastructure and IT companies is that the telco transformation and the 5G buildout is going to take years. So it's moving at a pace that is very favorable to many of these companies. Okay, so looking at just some of the pre-announcement highlights that have hit the wire this week, I want to give you a glimpse of the diversity of innovation that is occurring in the telecommunications space. You've got semiconductor manufacturers, device makers, network equipment players, carriers, cloud vendors, enterprise tech companies, software companies, startups. Now, you'll see in this list we've included OpenRAN, that logo, because there's so much buzz around the topic, and we're going to come back to that. But suffice it to say, there's no way we can cover all the announcements from the 2000-plus exhibitors at the show. So we're going to cherry-pick here and make a few call outs. Hewlett Packard Enterprise announced an acquisition of an Italian private cellular network company called AthoNet. Zeus Kerravala wrote about it on SiliconANGLE if you want more details. Now interestingly, HPE has a partnership with Solana, which also does private 5G. But according to Zeus, Solana is more of an out-of-the-box solution, whereas AthoNet is designed for the core and requires more integration. And as you'll see in a moment, there's going to be a lot of talk at the show about private networks. There's going to be a lot of news there from other competitors, and we're going to be watching that closely. And while many are concerned about P5G, private 5G, encroaching on wifi, Kerravala doesn't see it that way.
Rather, he feels that these private networks are really designed for more industrial and, you know, mission-critical environments, like factories and warehouses that are run by robots, et cetera, 'cause these can justify the increased expense of private networks. Whereas wifi remains a very low-cost and flexible option for, you know, offices and homes. Now, over to Dell. Dell announced its intent to go hard after opening up the telco network with the announcement that in the second half of this year it's going to begin shipping its infrastructure blocks for Red Hat. Remember, it's kind of like converged infrastructure for telco, with a more open ecosystem and a sort of more flexible, you know, more mature engineered system. Dell has also announced a range of PowerEdge servers for a variety of use cases, a big wide line, bringing forth its 16G portfolio and aiming squarely at the telco space. Dell also announced, here we go, a private wireless offering with Airspan and Expedo, and a solution with AthoNet, the company HPE announced it was purchasing. So I guess Dell and HPE are now partnering up in the private wireless space, and yes, hell is freezing over, folks. We'll see where that relationship goes in the mid- to long-term. Dell also announced new lab and certification capabilities, which we said last week were going to be critical for the further adoption of open ecosystem technology. So props to Dell for, you know, putting real emphasis and investment in that. AWS also made a number of announcements in this space, including private wireless solutions and associated managed services. AWS named Deutsche Telekom, Orange, T-Mobile, Telefonica, and some others as partners. And AWS announced a stepped-up partnership, specifically with T-Mobile, to bring AWS services to T-Mobile's network portfolio. Snowflake, back to Snowflake, announced its telecom data cloud.
Remember, we showed the data earlier: Snowflake is not as strong in the telco sector, but it's continuing to move toward go-to-market alignment within key industries, realigning its go-to-market by vertical. It also announced that AT&T and a number of other partners are collaborating to break down data silos, specifically in telco. Look, essentially this is Snowflake taking its core value prop to the telco vertical and forming key partnerships that resonate in the space. So think simplification, breaking down silos, data sharing, eventually data monetization. Samsung previewed its future capability to allow smartphones to access satellite services, something Apple has previously done. AMD, Intel, Marvell, Qualcomm are all in on the act, all the semiconductor players. Qualcomm, for example, announced, along with Telefonica and Ericsson, a 5G millimeter wave network that will be showcased in Spain at the event this coming week, using the Qualcomm Snapdragon chipset platform, based on none other than Arm technology. Of course, Arm, we said, is going to dominate the edge, and it is clearly doing so. It's got the volume advantage over, you know, traditional Intel x86 architectures. And it's no surprise that Microsoft is touting its OpenAI relationship. You're going to hear a lot of AI talk at this conference, as AI is, you know, the now topic. All right, we could go on and on and on. There's just so much going on at Mobile World Congress, or MWC, that we just wanted to give you a glimpse of some of the highlights that we've been watching. Which brings us to the key topics and issues that we'll be exploring at MWC next week. We touched on some of this last week. A big topic of conversation will of course be, you know, 5G. Is it ever going to become real? Is anybody ever going to make money at 5G? There's so much excitement and anticipation around 5G.
It has not lived up to the hype, but that's because the rollout, as we've previously reported, is going to take years. And part of that rollout is going to rely on the disaggregation of the hardened telco stack, as we reported last week and in previous Breaking Analysis episodes. OpenRAN is a big component of that evolution. You know, there are RAN intelligent controllers, RICs, which are essentially the brain of OpenRAN, if you will. Now, as we build out 5G networks at massive scale and accommodate unprecedented volumes of data and apply compute-hungry AI to all this data, the issue of energy efficiency is going to be front and center. It has to be. Not only is it a, you know, hot political issue; the reality is that improving power efficiency is compulsory, or the whole vision of telco's future is going to come crashing down. So chip manufacturers, equipment makers, cloud providers, everybody is going to be doubling down and clicking on this topic. Let's talk about AI. AI, as we said, is the hot topic right now, but it is happening not only in consumer, with things like ChatGPT. Think about the theme of this Breaking Analysis: in the enterprise, AI cannot be ChatGPT. It cannot be error-prone the way ChatGPT is. It has to be clean, reliable, governed, accurate. It's got to be ethical. It's got to be trusted. Okay, we're going to have Zeus Kerravala on the show next week and definitely want to get his take on private networks and how they're going to impact wifi. You know, will private networks cannibalize wifi? If not, why not? He wrote about this again on SiliconANGLE if you want more details, and we're going to unpack that on theCUBE this week. And finally, as always, we'll be following the data flows to understand where and how telcos, cloud players, startups, software companies, disruptors, legacy companies, and end customers are going to make money from new data opportunities. 'Cause we often say on theCUBE, don't ever bet against data.
All right, that's a wrap for today. Remember theCUBE is going to be on location at MWC 2023 next week. We got a great set. We're in the walkway in between halls four and five, right in Congress Square, stand CS-60. Look for us, we got a full schedule. If you got a great story or you have news, stop by. We're going to try to get you on the program. I'll be there with Lisa Martin, co-hosting, David Nicholson as well, and the entire CUBE crew, so don't forget to come by and see us. I want to thank Alex Myerson, who's on production and manages the podcast, and Ken Schiffman, as well, in our Boston studio. Kristen Martin and Cheryl Knight help get the word out on social media and in our newsletters. And Rob Hof is our editor-in-chief over at SiliconANGLE.com. He does some great editing. Thank you. All right, remember all these episodes they are available as podcasts wherever you listen. All you got to do is search Breaking Analysis podcasts. I publish each week on Wikibon.com and SiliconANGLE.com. All the video content is available on demand at theCUBE.net, or you can email me directly if you want to get in touch David.Vellante@SiliconANGLE.com or DM me @DVellante, or comment on our LinkedIn posts. And please do check out ETR.ai for the best survey data in the enterprise tech business. This is Dave Vellante for theCUBE Insights, powered by ETR. Thanks for watching. We'll see you next week at Mobile World Congress '23, MWC '23, or next time on Breaking Analysis. (bright music)
Robert Nishihara, Anyscale | CUBE Conversation
(upbeat instrumental) >> Hello and welcome to this CUBE conversation. I'm John Furrier, host of theCUBE, here in Palo Alto, California. Got a great conversation with Robert Nishihara, who's the co-founder and CEO of Anyscale. Robert, great to have you on this CUBE conversation. It's great to see you. We did your first Ray Summit a couple years ago, and congratulations on your venture. Great to have you on. >> Thank you. Thanks for inviting me. >> So you're a first-time CEO out of Berkeley, in data. You've got Databricks coming out of there. You've got a bunch of activity coming from Berkeley. It really is kind of where a lot of innovation's going on in data. Anyscale has been one of those startups that has risen out of that scene, right? You look at the success of what the data lakes are now. Now you've got generative AI. This has been a really interesting innovation market. This new wave is coming. Tell us what's going on with Anyscale right now, as you guys are gearing up and getting some growth. What's happening with the company? >> Yeah, well, one of the most exciting things that's been happening in computing recently is the rise of AI and the excitement about AI, and the potential for AI to really transform every industry. Now of course, one of the biggest challenges to actually making that happen is that AI is incredibly computationally intensive, right? To actually succeed with AI, to actually get value out of AI, you're typically not just running it on your laptop. You're often running it and scaling it across thousands of machines, or hundreds of machines or GPUs. And so organizations and companies and businesses that do AI often end up building a large infrastructure team to manage the distributed systems, the computing, to actually scale these applications. And that's a huge software engineering lift, right? And so one of the goals for Anyscale is really to make that easy.
To get to the point where developers and teams and companies can succeed with AI. Can build these scalable AI applications without a huge investment in infrastructure, without a lot of expertise in infrastructure, where really all they need to know is how to program on their laptop, how to program in Python. And if you have that, then that's really all you need to succeed with AI. So that's what we've been focused on. We're building Ray, which is an open source project that's been starting to get adopted by tons of companies, to actually train these models, to deploy these models, to do inference with these models, you know, to ingest and pre-process their data. And our goals, you know, here with the company are really to make Ray successful. To grow the Ray community, and then to build a great product around it and simplify the development, deployment, and productionization of machine learning for all these businesses. >> It's a great trend. Everyone wants developer productivity, we're seeing that clearly right now. And plus, developers are voting, literally, on what standards become. As you look at how the market is open source driven, I love the model, love the Ray project, love the Anyscale value proposition. How big are you guys now, and how is that value proposition of Ray and Anyscale and foundational models coming together? Because it seems like you guys are in a perfect storm situation where you could get a real tailwind and draft off the mega trend that everyone's getting excited about. The new toy is ChatGPT. So you got to look at that and say, hey, I mean, come on, you guys did all the heavy lifting. >> Absolutely. >> You know how many people you are, and what's the proposition for you guys these days? >> You know, our company's about a hundred people, maybe a bit larger than that. Ray's been growing really quickly.
It's been, you know, companies like OpenAI use Ray to train their models, like ChatGPT. Companies like Uber run all their deep learning, you know, and classical machine learning on top of Ray. Companies like Shopify, Spotify, Netflix, Cruise, Lyft, Instacart, you know, ByteDance. A lot of these companies are investing heavily in Ray for their machine learning infrastructure. And I think it's gotten to the point where, if you're one of these types of businesses, and you're looking to revamp your machine learning infrastructure, if you're looking to enable new capabilities, you know, make your teams more productive, speed up the experimentation cycle, you know, make it more performant, run applications that are more scalable, run them faster, run them in a more cost efficient way. All of these types of companies are at least evaluating Ray, and Ray is an increasingly common choice there. Many of the companies that end up not using Ray often end up building their own infrastructure. So the growth there has been incredibly exciting. You know, we had our first in-person Ray Summit just back in August, and we're planning the next one for this coming September. And so when you asked about the value proposition, I think there's really two main things when people choose to go with Ray and Anyscale. One reason is about moving faster, right? It's about developer productivity, it's about speeding up the experimentation cycle, easily getting their models in production. You know, we hear many companies say that once they prototype a model, once they develop a model, it's another eight weeks or 12 weeks to actually get that model in production. And that's a reason they talk to us.
We hear companies say that, you know, they've been training their models and doing inference on a single machine, and they've been sort of scaling vertically, like using bigger and bigger machines. But you can only do that for so long, and at some point you need to go beyond a single machine, and that's when they start talking to us. Right? So one of the main value propositions is around moving faster. I think probably the phrase I hear the most is companies saying that they don't want their machine learning people to have to spend all their time configuring infrastructure. All this is about productivity. >> Yeah. >> The other. >> It's the big brains in the company that are being used to do remedial tasks that should be automated, right? >> Yeah, and I mean, it's hard stuff, right? It's also not these people's area of expertise, or where they're adding the most value. So all of this is around developer productivity, moving faster, getting to market faster. The other big value prop, and the other reason people choose Ray and choose Anyscale, is around just providing superior infrastructure. This is really, can we scale more? You know, can we run it faster, right? Can we run it in a more cost effective way? We hear people saying that they're not getting good GPU utilization with the existing tools they're using, or they can't scale beyond a certain point, or, you know, they don't have a way to efficiently use spot instances to save costs, right? Or their clusters, you know, can't auto scale up and down fast enough, right? These are all the kinds of problems where Ray and Anyscale add value. >> You know, you bring up great points. The auto scaling concept, early days, it was easy getting more compute. Now it's complicated. They're built into more integrated apps in the cloud. And you mentioned those companies that you're working with, that's impressive.
Those are like the big hardcore, I call them hardcore. They have good technical teams. And as the wave starts to move from these companies that were hyper scaling up all the time, the mainstream are just developers, right? So you need an interface in, so I see the dots connecting with you guys and I want to get your reaction. Is that how you see it? That you've got the alphas out there kind of kicking butt, building their own stuff, alpha developers and infrastructure. But mainstream just wants programmability. They want that heavy lifting taken care of for them. Is that kind of how you guys see it? I mean, take us through that. Because to get crossover, to be democratized, the automation's got to be there. And for developer productivity to be in, it's got to be coding and programmability. >> That's right. Ultimately for AI to really be successful, and really, you know, transform every industry in the way we think it has the potential to, it has to be easier to use, right? And being easier to use, there are many dimensions to that. But an important one is that as a developer, to do AI, you shouldn't have to be an expert in distributed systems. You shouldn't have to be an expert in infrastructure. If you do have to be, that's going to really limit the number of people who can do this, right? And all of the companies we talk to, they don't want to be in the business of building and managing infrastructure. It's not that they can't do it. But it's going to slow them down, right? They want to allocate their time and their energy toward building their product, right? Toward building a better product, getting their product to market faster. And if we can take the infrastructure work off of the critical path for them, that's going to speed them up, it's going to simplify their lives. And I think that is critical for really enabling all of these companies to succeed with AI.
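The developer experience described here, writing ordinary Python and letting the system handle distribution, is worth making concrete. The sketch below uses only the standard library to show the single-machine version of that fan-out pattern; the `featurize` function and records are invented for illustration. Ray's actual API differs (`@ray.remote` decorators and a cluster scheduler instead of a local thread pool), so treat this as an analogy, not Ray code.

```python
from concurrent.futures import ThreadPoolExecutor

def featurize(record):
    # Stand-in for a real preprocessing or inference step
    # (hypothetical example logic, not from the interview).
    return {"id": record["id"], "length": len(record["text"])}

records = [{"id": i, "text": "sample " * i} for i in range(1, 5)]

# Single-machine fan-out: map the same plain function across local workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    features = list(pool.map(featurize, records))

# A framework like Ray keeps this same map-style shape, but the workers
# can live on hundreds of machines instead of local threads.
print(features[0])
```

The point of the analogy is that the application code stays a plain Python function; only the executor changes when you scale out, which is exactly the "no infrastructure expertise required" pitch.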
>> Talk about the customers you guys are talking to right now, and how that translates over. Because I think you hit a good thread there. Data infrastructure is critical. Managed services are coming online, open source is continuing to grow. You have these people building their own, and then if they abandon it or don't scale it properly, there are kind of consequences. 'Cause it's a system, you mentioned, it's a distributed system architecture. It's not as easy as standing up a monolithic app these days. So when you guys go to the marketplace and talk to customers, put the customers in buckets. So you got the ones that are kind of leaning in, that are pretty peaked, probably working with you now, open source. And then what's the customer profile look like as you go mainstream? Are they looking for a managed service, looking for more of a system architecture approach? What's the Anyscale progression? How do you engage with your customers? What are they telling you? >> Yeah, so many of these companies, yes, they're looking for managed infrastructure 'cause they want to move faster, right? Now in terms of the profiles of these different customers, there are three main workloads that companies run on Anyscale, run with Ray. It's training related workloads, it's serving and deployment related workloads, like actually deploying your models, and it's batch processing, batch inference related workloads. Like imagine you want to do computer vision on tons and tons of images or videos, or you want to do natural language processing on millions of documents or audio, or speech, or things like that, right? So I would say there's a pretty large variety of use cases, but the most common, you know, we see tons of people working with computer vision problems, natural language processing problems. And it's across many different industries.
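Of the three workloads Robert lists, batch inference is the easiest to sketch. The toy below shows the batching shape such a job takes: split a document collection into fixed-size batches and run an inference function over each one. The `toy_sentiment` stub is invented and stands in for a real model call; in a Ray deployment each batch would be shipped to a remote worker rather than processed in a local loop.

```python
def toy_sentiment(text):
    # Stub standing in for a real model call; any per-document
    # inference function could be substituted here.
    return "positive" if "good" in text else "negative"

def batch_infer(docs, batch_size=3):
    # Walk the collection in fixed-size batches, the unit of work a
    # distributed batch inference job would hand to each worker.
    results = []
    for start in range(0, len(docs), batch_size):
        for doc in docs[start:start + batch_size]:
            results.append(toy_sentiment(doc))
    return results

docs = ["good service", "bad audio", "good talk", "slow app", "good demo"]
print(batch_infer(docs))
```

At the "millions of documents" scale mentioned above, the win comes from running those batches concurrently across machines; the per-batch logic stays this simple.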
We work with companies doing drug discovery, companies doing, you know, gaming or e-commerce, right? Companies doing robotics or agriculture. So there's a huge variety of the types of industries that can benefit from AI, and can really get a lot of value out of AI. But the problems are the same problems that they all want to solve. It's, how do you make your team move faster, you know, succeed with AI, be more productive, speed up the experimentation, and also how do you do this in a more performant way, in a faster, cheaper, more cost efficient, more scalable way. >> It's almost like the cloud game is coming back with AI and these foundational models. Because I was just on a podcast, we recorded our weekly podcast, and I was just riffing with Dave Vellante, my co-host, on this. We're like, hey, in the early days of Amazon, if you wanted to build an app, you had to build a data center. Then you go to the cloud, the cloud's easier, pay a little money, pennies on the dollar, you get your app up and running. Cloud computing is born. With foundation models in generative AI, the old model was hard: heavy lifting, expensive build out before you get to do anything, and as you mentioned, time. So I got to think that you're pretty much in a good position with this foundational model trend in generative AI, because I just looked at the foundation models map of the ecosystem. You're starting to see layers: you got the tooling, you got platform, you got cloud. It's filling out really quickly. So why is Anyscale important to this new trend? How do you talk to people when they ask you, you know, what does ChatGPT mean for Anyscale? And how does the foundational model growth fit into your plan? >> Well, foundational models are hugely important for the industry broadly. Because you're going to have these really powerful models that, you know, have been trained on tremendous amounts of data,
tremendous amounts of compute, and that are useful out of the box, right? That people can start to use, and query, and get value out of, without necessarily training these huge models themselves. Now Ray fits in and Anyscale fits in, in a number of places. First of all, they're useful for creating these foundation models. Companies like OpenAI, you know, use Ray for this purpose. Companies like Cohere use Ray for these purposes. You know, IBM. There are of course also open source versions like GPT-J, you know, created using Ray. So a lot of these large language models, large foundation models benefit from training on top of Ray. And of course, for every company training and creating these huge foundation models, you're going to have many more that are fine tuning these models with their own data. That are deploying and serving these models for their own applications, that are building other application and business logic around these models. And that's where Ray also really shines, because Ray, you know, can provide common infrastructure for all of these workloads. The training, the fine tuning, the serving, the data ingest and pre-processing, right? The hyperparameter tuning, and so on. And so the reason Ray and Anyscale are important here is that, again, foundation models are large, foundation models are compute intensive. Both creating and using these foundation models requires tremendous amounts of compute, and there's a big infrastructure lift to make that happen. So either you are using Ray and Anyscale to do this, or you are building the infrastructure and managing the infrastructure yourself. Which you can do, but it's hard. >> Good luck with that. I always say good luck with that. I mean, I think if you really need to build that hardened foundation, you got to go all the way. And I think this idea of composability is interesting. How is Ray working with OpenAI for instance?
Take us through that. Because I think you're going to see a lot of people talking about, okay, I've got trained models, but I'm going to have not one, I'm going to have many. There's a big debate about whether OpenAI is going to be the mother of all LLMs, but really people are also saying there will be many more, either purpose-built or specific. These things come together, there's like a blending of data, and that seems to be a value proposition. How does Ray help these guys get their models up? Can you take us through what Ray's doing for, say, OpenAI and others, and how do you see the models interacting with each other? >> Yeah, great question. So where OpenAI uses Ray right now is for the training workloads. Training both to create ChatGPT and models like that. There's a supervised learning component, where you're doing supervised pre-training of this model with example data. There's also a reinforcement learning component, where you are fine-tuning the model and continuing to train the model, but based on human feedback, based on input from humans saying that, you know, this response to this question is better than this other response to this question, right? And so Ray provides the infrastructure for scaling the training across many, many GPUs, many, many machines, and really running that in an efficient, you know, performant, fault tolerant way, right? And so, you know, this is not the first version of OpenAI's infrastructure, right? They've gone through iterations where they did start with building the infrastructure themselves. They were using tools like MPI. But at some point, you know, given the complexity, given the scale of what they're trying to do, you hit a wall with MPI, and that's going to happen with a lot of other companies in this space. And at that point you don't have many other options other than to use Ray or to build your own infrastructure. >> That's awesome.
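The human-feedback step Robert describes, training on judgments that one response is better than another, is commonly formalized as a pairwise preference loss on a reward model. The toy calculation below illustrates just that signal: the loss is small when the human-preferred response already gets the higher reward, and large when the ranking is inverted. The reward values are invented, and this is a generic sketch of the idea, not OpenAI's actual training code.

```python
import math

def preference_loss(reward_chosen, reward_rejected):
    # Pairwise logistic loss used in RLHF-style reward modeling:
    # -log(sigmoid(r_chosen - r_rejected)).
    margin = reward_chosen - reward_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Preferred answer already scores higher: small loss.
aligned = preference_loss(2.0, -1.0)
# Ranking inverted: large loss, pushing the reward model to correct it.
inverted = preference_loss(-1.0, 2.0)
assert aligned < inverted
```

Ray's role in this picture is orthogonal to the loss itself: it distributes the supervised pre-training and the feedback-driven fine-tuning that compute terms like this across many GPUs and machines.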
And then your vision on this data interaction, because in the old days monolithic models were very rigid. You couldn't really interface with them. But we're kind of seeing this future of data fusion, data interaction, data blending at large scale. What's your vision of where this goes? Because if this goes the way people think, you can have this data chemistry kind of thing going on where people are integrating all kinds of data with each other at large scale. So you need infrastructure, intelligence, reasoning, a lot of code. Is this something that you see? What's your vision in all this? Take us through. >> AI is going to be used everywhere, right? We see this as a technology that's going to be ubiquitous, and is going to transform every business. I mean, imagine you make a product, maybe you're making a tool like Photoshop or whatever the, you know, tool is. The way that people are going to use your tool is not by investing, you know, hundreds of hours into learning all of the different, you know, specific buttons they need to press and workflows they need to go through. They're going to talk to it, right? They're going to ask it to do the thing they want it to do, right? And it's going to do it. And if it doesn't know what's being asked of it, it's going to ask clarifying questions, right? And then you're going to clarify, and you're going to have a conversation. And this is going to make many, many kinds of tools and technology and products easier to use, and lower the barrier to entry. And, you know, many companies fit into this category of trying to build products and trying to make them easier to use. This is just one kind of way that AI will be used. But I think it's something that's pretty ubiquitous. >> Yeah.
It'll be efficiency up and down the stack, and it will change the productivity equation completely. You just highlighted one: I don't want to fill out forms, just stand up my environment for me, and then start coding away. Okay, well this is great stuff. Final word for the folks out there watching, obviously a new kind of skill set for hiring. You guys got engineers, give a plug for the company, for Anyscale. What are you looking for? What are you guys working on? Take the last minute to put a plug in for the company. >> Yeah, well if you're interested in AI, and if you think AI is really going to be transformative and really be useful for all these different industries, we are trying to provide the infrastructure to enable that to happen, right? So I think there's the potential here to really solve an important problem, to get to the point where developers don't need to think about infrastructure, don't need to think about distributed systems. All they think about is their application logic, and what they want their application to do. And I think if we can achieve that, you know, we can be the foundation or the platform that enables all of these other companies to succeed with AI. So that's where we're going. I think something like this has to happen if AI is going to achieve its potential. We're hiring across the board, you know, great engineers, on the go-to-market side, product managers, you know, people who want to really, you know, make this happen. >> Awesome, well congratulations. I know you got some good funding behind you. You're in a good spot. I think this is happening. I think generative AI and foundation models is going to be the next big inflection point, as big as the PC, inter-networking, the internet and smartphones. This is a whole nother application framework, a whole nother set of things. So this is the ground floor. Robert, you and your team are right there. Well done. >> Thank you so much.
>> All right. Thanks for coming on this CUBE conversation. I'm John Furrier with theCUBE, breaking down a conversation around AI and scaling up at this next major inflection point. This next wave is foundational models, generative AI. And thanks to ChatGPT, the whole world now knows about it. So it really is changing the game, and Anyscale is right there, one of the hot startups that is in good position to ride this next wave. Thanks for watching. (upbeat instrumental)
Meagen Eisenberg, Lacework | International Women's Day 2023
>> Hello and welcome to theCUBE's coverage of International Women's Day. I'm John Furrier, host of theCUBE. Got a variety of interviews across the gamut of topics: women in tech, mentoring, pipelining, developers, open source, executives. Stanford's having an International Women's Day celebration with the women in data science, which we're streaming live as well. Variety of programs. In this segment, Meagen Eisenberg, friend of theCUBE, she's the CMO of Lacework, is an amazing executive, got a great journey story as a CMO, but she's also actively advising startups and companies and really pays it forward. I want to say Meagen, thank you for coming on the program and thanks for sharing. >> Yeah, thank you for having me. I'm happy to be here. >> Well, we're going to get into some of the journey celebrations that you've gone through, and the best practice of what you've learned, which is to pay it forward. But I got to say, one of the things that really impresses me about you as an executive is you get stuff done. You're a great CMO, but you've also advised a lot of companies, you have a lot of irons in the fire, and you're advising companies ranging from really small startups to bigger companies, and you're paying it forward, which I love. That's kind of the spirit of this day. >> Yeah, I mean, I agree with you. When I think about my career, a lot of it was looking to mentors, women out in the field. This morning I was at a breakfast by Eileen, and we had the CEO of General Motors on, and she was talking about her journey, nine years as a CEO. And you know, she's paying it forward with us. But I think about, you know, when you're advising startups, you know, I've gathered knowledge and pattern recognition, and to be able to share that, you know, I enjoy it. >> Yeah. And the startups are also fun, but it's not always easy, and it can get kind of messy, as you know.
Some startups don't make it, some succeed, and the origination story always kind of gets rewritten, and then there's that messy middle. And the arc doesn't look like a straight line, even though everyone thinks it's great, and you know, it's not for the faint of heart. And Teresa Carlson, who I've interviewed many times, former Amazon, now she's the president of Flexport, she always says, startups in certain industries aren't for the faint of heart, so you got to have a little bit of mettle, right? You got to be tough. And in some cases you don't need that, but startups, it's not always easy. What have you learned? >> Yeah, I mean, certainly in the startup world, grit, creativity. You know, when I was at TripActions, the travel company, the pandemic hits, nobody's traveling. You cut budget, you cut heads, but you focus on the core, right? You focus on what you need to survive. And creativity, I think, wins. And, you know, as a CMO when you're marketing, how do you get through that noise? Even in the security space, Lacework, it's a fragmented market. You've got to be differentiated and position yourself, and you know, be talking to the right target audience and customers. >> Talk about your journey over the years. What have you learned? What are some observations? Can you share any stories and best practices that someone watching could learn from? I know there's a lot of people coming into the tech space, with the generative AI things going on, cloud computing, scaling to the edge. There's a lot more aperture for technical jobs, as well as just new roles, roles you really don't go to college for anymore. You've got cybersecurity, which you're in. What are some of the things that you've done over your career, if you can share, and some best practices? >> Yeah, I think number one, continual learning. When I look through my career, I was constantly reading, networking. Part of the journey is who you're meeting along the way.
As you become more senior, your ability to hire and bring in talent matters a lot. I'm always trying to meet with new people. Yeah, if I look at my Amazon feed of books I've bought, right, it's kind of a chronicle of my history of things I was learning about. Right now I'm reading a lot about cybersecurity; "This Is How They Tell Me the World Ends" is the one I'm reading most recently. But you've got to come up to speed and then know the product, get in there and talk to customers. Certainly on the marketing front, anytime I can talk with a customer and find out how they're using us, why they love us, that, you know, helps me better position and differentiate our company. >> By the way, that book is amazing. I saw Nicole speak on Tuesday night with John Markoff in Palo Alto here. What a great story she told there. I recommend that book to everyone. She did eight years of research into that book, from zero day marketplaces to all the actors involved in security. And it was very interesting. >> Yeah, I mean, it definitely wakes you up, makes you think about what's going on in the world. Very relevant. >> It's like, yeah, it was happening all the time, wasn't it. All the hacking. But this brings up an interesting point, because you're in a cybersecurity area, which by the way is changing very fast. It's becoming a bigger industry. It's still male dominated, but it's becoming much more than just tech. >> Yeah, I mean, it's a constantly evolving threat landscape, and we're learning. And I think more than ever you need to be able to use the data that companies have and, you know, learn from it. That's one of the ways we position ourselves. We're not just about writing rules that won't help you with those zero day attacks. You've got to be able to understand your particular environment at any moment, even if it changes. And that's how we help you detect a threat.
>> How are things going with you? Are there any new things you guys have going on? Initiatives or programs for women in tech, and increasing the range of diversity and inclusion in the industry? Because again, this industry's getting much wider too. It's not just specialized, it's also growing. >> Yes, actually I'm excited. We're launching Secured by Women, securedbywomen.com, and it's very much focused on women in the industry. Some studies are showing about 25% of security professionals are women. And we're going to be taking nominations and sponsoring women to go to upcoming security events. So excited to launch that this month and really celebrate women in security and help them. You know, part of that continual learning that I talked about: making sure they're there learning, having the conversations at the conferences, being able to network. >> I have to ask you, what inspired you to pursue a career in tech? What was the motivation? >> You know, if I think way back, originally I wanted to be on the art side, and my dad said, "You can do anything as long as it's in the sciences." And so in undergrad I did computer science and MIS. Graduated with MIS and a computer science minor. And when I came out I was an IT engineer at Cisco, and you know, that kind of started my journey, and I decided to go back and get my MBA. And during that process I fell in love with marketing, and I thought, okay, I understand the buyer, I can come out and market technology to the IT world and developers. And then from there went to several tech companies. >> I mean, my father was an engineer. He had the same kind of thing: you got to be an engineer, it's a steady, stable job. But at that time, computer science... I mean, we've seen the evolution of computer science. Now it's the most popular degree at Berkeley, we've heard, and around the world, and the education formats are changing. You're seeing a lot of people self-training on YouTube. The field has really changed.
What are some of the challenges you see for folks trying to get into the industry, and how would you advise today if you were talking to your young self? What would be the narrative? >> Yeah, I mean, my draw back then was HTML pages were coming out, and I thought it would be fun to design, you know, webpages. So find something you're passionate about in the space today, whether it's gaming or it's cybersecurity. Go and be excited about it and apply and don't give up, right? Do whatever you can to read and learn. And you're right, there are a ton of online self-help resources. I always try to hire women and people who are continual learners and are teaching themselves something. And I try to find that in an interview, because when you come to a business, you're there to solve problems and challenges. And the folks that can do that and be innovative and learn, those are the ones I want on my team. >> It's interesting, you know, technology is now impacting society, and we need everyone involved to participate and give requirements. And that kind of leads to my next question for you: in your opinion, or let me just step back, let me rephrase. What are some of the things that you see technology being used for in society right now that will impact people's lives? Because this is not a gender thing. We need everybody involved, 'cause society is now digital. Technology's pervasive. The AI trends we're seeing now are clearly unmasking to the mainstream that there's some cool stuff happening. >> Yeah, I mean, think about ChatGPT. All the different ways we're using it: we're writing content and marketing with it. You know, I just read an article yesterday, folks are using it to write children's stories and then selling those stories on Amazon, right? And the amount that they can produce with it. But if you think about it, there's unlimited uses with that technology, and you've got all the major players getting involved in it.
That one major launch and piece of technology is going to transform us in the next six months to a year. It's the ability to process so much data and then turn that into assets that we use, and the creativity that's building on top of it. Even TripActions has incorporated ChatGPT into your ability to figure out, when you're traveling, where you want to go and what's happening in that city. So you're going to see it incorporated everywhere. >> I mean, we've done an interview before with TripActions, the other company you were at. Interesting point: you don't have to type in a box to say, I'm traveling, I want a hotel. You can just say, I'm going to Barcelona for Mobile World Congress, I want to have a good time. I want some tapas and a nice dinner out. >> Yes. Yeah. That easy. We're making it easy. >> It's efficiency. >> And actually, I was going to say, for women specifically, I think the reason why we can do so much today is all the technology and apps that we have. I think about DoorDash, I think about Waze. You know, when I was younger you had to print out instructions. Now I get in the car real quick. I need to go to soccer practice, I enter it. I need to pick them up at someone's house, I enter it. Everything's real time. And so it takes away all the things that I don't add value to and allows me to focus on what I want in business. There's a bunch of, you know, apps out there that have allowed me to be so much more efficient and productive, apps that my mother certainly didn't have when I was growing up. >> That's amazing. I think that actually illustrates, in my opinion, the best comparison for ChatGPT, because maps and GPS integration were two technologies merged together that replaced driving while looking at a map. You know, like, how do you do that? Now it's automatic. This is what's going to happen to creative work, to writing, to ideation.
I even heard Nicole, from her book reading, say that they're using ChatGPT to write zero-day exploits. So you're seeing it... >> That's scary stuff. You're right. >> You're seeing it everywhere. Super exciting. Well, I've got to ask you, before you get into some of the Lacework things that you're involved with, 'cause I think you're doing great work over there: what were the most exciting projects you've worked on in your career? You came in at Cisco, a very technical company, so you've got the technical chops, CS and MIS, which stands for Management Information Systems for all the young people out there; that was the state of the art back then. What are some of the exciting things you've done? >> Yeah, I think about MongoDB and learning to market to developers. Taking the company public in 2017. Launching the Atlas database as a service. Now there's so much more of that, you know, the PLG motion. Going to TripActions, you know, surviving a pandemic, still being able to come out of that, and all the learnings that went with it. They recently rebranded, I guess, so they're Navan now. And then now I'm back in the security space. You know, 14 years ago I was at ArcSight and we were bought by HP. So getting back into the security world is exciting, and it's transformed a ton. As you know, it's way more complicated than it was. And so just understanding the pain of our customers and how we protect them is fun. And I like, you know, being there from a marketing standpoint. >> Well, we really appreciate you coming on and sharing that. I've got to ask you, for folks watching who might be interested, what advice might you have for them and their career in tech? I know a lot of young people love the tech. It's becoming pervasive in our lives, as we mentioned. What advice would you give for folks watching that want to start a career in tech? >> Yeah, so work hard, right?
Study, network, and at your first job, be the best at it, because every job after that, you get pulled in through your network. Every time I move, I'm hiring people from the last job, two jobs before, three jobs before. And I'm looking for people that are working hard, care, you know, are continual learners, and add value. What can you do to solve problems at your work and add value? >> What's your secret networking hack or growth hack or tip that you can share? Because you're a great networker, by the way. You're amazing and you do add a lot of value. I've seen you in action. >> Well, I try never to eat alone. I've got breakfast, I've got lunch, I've got coffee breaks and dinner. And so when I'm at work, I try and always sit and eat with a team member or a new group. If I'm out on the road, I'm, you know, meeting people for lunch, going for dinner. Just, you know, don't sit at your desk by yourself and don't sit in the hotel room. Get out and meet with people. >> What do you think about now that we're out of the pandemic, or somewhat out of the pandemic so to speak? Events are back. >> Yes. >> RSA is coming up. It's a big event. The bigger events are getting bigger, and then the other events are kind of smaller and distributed. What's your vision of how events are evolving? >> Yeah, I mean, you've got to be in person. Those are the relationships. Right now more than ever, people care about renewals, and you are building that rapport. And if you're not meeting with your customers, your competitors are. So what I would say is get out there. Lacework, we're going to be at RSA, we're going to be at re:Inforce, we're going to be at all of these events, building relationships, you know, coffee, lunch. And yeah, I think in-person events are here to stay, and those that don't embrace in person are going to give up business. They're going to lose market share to us. >> And networking is obviously very key at events as well. >> Yes.
>> It's always a good opportunity to get out to the events. What event networking trick or advice do you give folks that are going out into the networking world? >> Yeah, schedule ahead of time. Don't go to an event and expect people just to come by for great swag. You should be partnering with your sales team and scheduling ahead of time, getting on people's calendars. Don't go there without having 100 or 200 meetings already booked. >> Got it. All right. Let's talk about you and your career. You're currently at Lacework. It's a very hot company in a hot field, security, very male dominated, and you're a leader there. What's it like? What are the strategies? How does a woman get in there and be successful? What are some tricks, observations, any data you can share? What's the best practice? What's the secret sauce from Meagen Eisenberg? >> Yes. Yeah, for Meagen Eisenberg. For Lacework, you know, we're focused on our customers. There's nothing better than being close to them, solving their pain, showcasing them. So if you want to go into security, focus on their issues and their problems, and make sure they're aware of what you're delivering. I mean, we're focused on cloud security and we go from build time to run time. And that's the draw for me here: we had a lot of, you know, happy customers excited by what we were doing. And what we're doing is very different from legacy security providers. It's tapping into the trend of really understanding how much data you have and what's happening in the data, to detect the anomalies and the threats that are there. >> You know, one of the conversations that I was just having with a senior leader, she was amazing, and I asked her what she thought of the current landscape, the job market, how to get promoted through a career, all those things. And the response was interesting. I want to get your reaction. She said interdisciplinary skills are critical.
Now more than ever, having a set of skills, technical and social and emotional, is super valuable. Do you agree? What's your reaction to that, and how would you reframe it? >> Yeah, I mean, I completely agree. You can't be a leader without balance. You've got to know your craft, because you're developing and training your team, but you also need to know how to build relationships. You're not going to be successful as a C-level exec if you're not partnering across the functions. As a CMO I need to partner with product, I need to partner with the head of sales, I need to partner with finance. So those relationships matter a ton. I also need to attract the right talent. I want to have solid people on the team. And what I will say about the cybersecurity space: there's a talent shortage, and you cannot hire enough people to protect your company in that space. And that's part of what we do: we reduce the number of alerts that you're getting, so you don't need hundreds of people to detect an issue. You're using technology to highlight the issues, and then your team can focus on those alerts that matter. >> Yeah, there's a lot of emerging markets where you can level up and you don't need pedigree. You can just level up skill-wise pretty quickly. Which brings me to the next question for you: how do you keep up with all the tech day-to-day, and how should someone watching stay on top of it? Because, I mean, you've got to be on top of this stuff and you've got to ride the wave. It's pretty turbulent, but it's still growing and changing. >> Yeah, it's true. I mean, there's a lot of reading. I'm watching the news. Anytime something comes out, you know, ChatGPT, I'm playing with it. I've got a great network and we're sharing. I'm on, you know, LinkedIn reading articles all the time. I have a team, right?
Every time I hire someone, they bring new information and knowledge in. And, you know, Cal Poly had this "learn by doing" philosophy at San Luis Obispo. So do it. Try it, don't be afraid of it. I think that's the advice. >> Well, I love some of the points you mentioned: community and network. You mentioned networking. That brings up the community question: how can people get involved? What communities are out there? How should they approach communities? 'Cause communities are also networks; they welcome people in and form networks. So it's a network of networks. So what's your take on how to engage and work with communities? How do you find your tribe? If someone's getting into the business and they want support, they might want technology learnings, what's your approach? >> Yeah, so a few different places. One, I'm part of Operator Collective, which is a strong female investment group that's open and works a lot with operators, and they're in on the newest technologies 'cause they're investing in them. Chief, I think, is a great organization as well. If you're in marketing, there's a ton of CMO networking events that you can go to. I would say any field. Even for us at Lacework, we've got some strong CISO networks and we do dinners; you know, we have one coming up in the Bay Area, and in Boston and New York, and you can come and meet other CISOs and security leaders. So when I get an invite, and you know we all do, I will go to it. I'll carve out the time and meet with others. So I think, you know, part of the community is getting out there and, you know, joining some of these different groups. >> Meagen, thank you so much for spending the time. Final question for you. How do you see the future of tech evolving, and how do you see your role in it? >> Yeah, I mean, marketing's changing wildly. There are so many different channels. Think about all the social media channels that have changed over the last five years.
So when I think about the future of tech, I'm looking at apps on my phone. I have three daughters, 13, 11, and 8. I'm telling you, they come to me with new apps and new technology all the time, and I'm paying attention to what they're participating in and what they want to be a part of. And certainly it's going to be a lot more around the data and AI. I think we're only at the beginning of that. So we will continue to, you know, learn from it and wield it and deal with the massive amount of data that's out there. >> Well, you saw TikTok just got banned by the European Commission today for their staff. Interesting times. >> It is. >> Meagen, thank you so much, as always. You're a great tech athlete. I've been following your career for a while, a long time. You're an amazing leader. Thank you for sharing your story here on theCUBE in celebration of International Women's Day. Every day is IWD, and thanks for coming on. >> Thank you for having me. >> Okay. I'm John Furrier here in theCUBE Studios in Palo Alto. Thank you for watching. More to come, stay with us. (bright music)