Adam Wenchel & John Dickerson, Arthur | AWS Startup Showcase S3 E1


 

(upbeat music) >> Welcome everyone to theCUBE's presentation of the AWS Startup Showcase: AI and Machine Learning, Top Startups Building Generative AI on AWS. This is season 3, episode 1 of the ongoing series covering exciting startups from the AWS ecosystem, talking about AI and machine learning. I'm your host, John Furrier. I'm joined by two great guests here, Adam Wenchel, who's the CEO of Arthur, and Chief Scientist of Arthur, John Dickerson. We'll talk about how they help people build better LLM AI systems and get them into the market faster. Gentlemen, thank you for coming on. >> Yeah, thanks for having us, John. >> Well, I got to say I got to temper my enthusiasm, because the last few months' explosion of interest in LLMs with ChatGPT has opened everybody's eyes to the reality that this is going next gen. This is it, this is the moment, this is the point we're going to look back on and say, this is the time when AI really hit the scene for real applications. So, a lot of Large Language Models, also known as LLMs, foundational models, and generative AI is all booming. This is where all the alpha developers are going. This is where everyone's focusing their business model transformations. This is where developers are seeing action. So it's all happening, the wave is here. So I got to ask you guys, what are you guys seeing right now? You're in the middle of it, it's hitting you guys right on. You're on the front end of this massive wave. >> Yeah, John, I don't think you have to temper your enthusiasm at all. I mean, what we're seeing every single day is everything from existing enterprise customers coming in with new ways that they're rethinking business processes that they've been doing for many years and can now do in an entirely different way, as well as all manner of new companies popping up, applying LLMs to everything from generating code and SQL statements to generating health transcripts and legal briefs. Everything you can imagine. 
And when you actually sit down and look at these systems and the demos we get of them, the hype is definitely justified. It's pretty amazing what they can do. And even just internally, about a month ago in January, we built an Arthur chatbot so customers could ask technical questions; rather than read our product documentation, they could just ask the LLM a particular question and get an answer. And at the time it was state of the art, but then just last week we decided to rebuild it, because the tooling has changed so much, and we've completely rebuilt it. It's now way better, built on an entirely different stack. The tooling has undergone a full generation's worth of change in six weeks, which is crazy. So it just tells you how much energy is going into this and how fast it's evolving right now. >> John, weigh in as chief scientist. I mean, you must be blown away. Talk about a kid in the candy store. You must be super busy to begin with, but the change, the acceleration, can you scope the kind of change you're seeing and be specific about the areas where you're seeing movement and highly accelerated change? >> Yeah, definitely. And it is very, very exciting. Actually, thinking back to when ChatGPT was announced, that was a night our company was throwing an event at NeurIPS, which is maybe the biggest machine learning conference out there. And the hype when that happened was palpable, and it was just shocking to see how well it performed. And then obviously over the last few months since then, as LLMs have continued to enter the market, we've seen use cases for them, like Adam mentioned, all over the place. And so some things I'm excited about in this space are the use of LLMs, and more generally foundation models, to redesign traditional operations research style problems: logistics problems, auctions, decisioning problems. 
So moving beyond the already amazing use cases, like creating marketing content, into more core integration with a lot of the bread and butter companies and tasks that drive the American ecosystem. And I think we're just starting to see some of that. And in the next 12 months, I think we're going to see a lot more. If I had to make other predictions, I think we're going to continue seeing a lot of work being done on managing inference time costs via shrinking models or distillation. And I don't know how to make this prediction exactly, but at some point we're going to be seeing lots of these very large scale models operating on the edge as well. So the time scales are extremely compressed; like Adam mentioned, 12 months from now, hard to say. >> We were talking on theCUBE prior to this session here. We had theCUBE conversation here, and then the Wall Street Journal just picked up on the same theme, which is the printing press moment that created the Enlightenment stage of history. Here we're in a whole nother phase: automating intellect, efficiency, doing heavy lifting, the creative class coming back, a whole nother level of reality around the corner that's being hyped up. The question is, is this justified? Is there really a breakthrough here, or is this just another result of continued progress with AI? Can you guys weigh in, because there's two schools of thought. There's the, "Oh my God, we're entering a new enlightenment tech phase, the equivalent of the printing press in all areas." Then there's, "Ah, it's just AI (indistinct) inch by inch." What's your guys' opinion? >> Yeah, I think on the one hand, when you're down in the weeds of building AI systems all day, every day, like we are, it's easy to look at this as incremental progress. Like, we have customers who've been building on foundation models since we started the company four years ago, particularly in computer vision for classification tasks, starting with pre-trained models, things like that. 
So that part of it doesn't feel real new, but what does feel new is when you apply these things to language, with all the breakthroughs in computational efficiency, algorithmic improvements, things like that. When you actually sit down and interact with ChatGPT or one of the other systems out there that's building on top of LLMs, it really is breathtaking: the level of understanding that they have, and how quickly you can accelerate your development efforts and get an actual working system in place that solves a really important real world problem and makes people way faster, way more efficient. So I do think there's definitely something there. It's more than just incremental improvement. This feels like a real trajectory inflection point for the adoption of AI. >> John, what's your take on this? As people come into the field, I'm seeing a lot of people move from, hey, I've been coding in Python, I've been doing some development, I've been a software engineer, I'm a computer science student, I'm coding in C++, old school, OG systems person. Where do they come in? Where's the focus, where's the action? Where are the breakthroughs? Where are people jumping in and rolling up their sleeves and getting dirty with this stuff? >> Yeah, all over the place. And it's funny you mentioned students; in a different life I wore a university professor hat, and so I'm very, very familiar with the teaching aspects of this. And I will say, toward Adam's point, this really is a leap forward, in that tools like Copilot, for example, everybody's using them right now, and they really do accelerate the way that we develop. When I think about the areas where people are really, really focusing right now, tooling is certainly one of them. Like you and I were chatting about LangChain right before this interview started; two or three people can sit down and create an amazing set of pipes that connect different aspects of the LLM ecosystem. 
Two, I would say, is engineering. So distributed training might be one, or just understanding better ways to train large models, and better ways to then distill them or run them. So there's this heavy interaction now between engineering and what I might call traditional machine learning from 10 years ago, where you had to know a lot of math, you had to know calculus very well, things like that. Now you also need to be, again, a very strong engineer, which is exciting. >> I interviewed Swami, who heads Amazon's machine learning and AI, when they made the Hugging Face announcement. And I reminded him how Amazon was easy to get into if you were developing a startup back in 2007, 2008, and that the language models had a similar problem. It used to take a lot of setup and a lot of expense to get provisioned up; now it's easy. So this is the next wave of innovation. So how do you guys see that from where we are right now? Are we at that point, that moment where it's a cloud-like experience for LLMs and large language models? >> Yeah, go ahead, John. >> I think the answer is yes. We see a number of large companies that are training these and serving these, some of which are being co-interviewed in this episode. I think we're at that point. Like, you can hit one of these with a simple, single line of Python, hitting an API; you can boot this up in seconds if you want. It's easy. >> Got it. >> So I (audio cuts out). >> Well, let's take a step back and talk about the company. You guys are being featured here on the Showcase. Arthur, what drove you to start the company? How'd this all come together? What's the origination story? Obviously you've got big customers; how'd you get started? What are you guys doing? How do you make money? Give a quick overview. >> Yeah, I think John and I come at it from slightly different angles, but for myself, I have been a part of a number of technology companies. 
I joined Capital One when they acquired my last company, and shortly after I joined, they asked me to start their AI team. And so even though I've been doing AI for a long time, I started my career back at DARPA, it was the first time I was really working at scale in AI at an organization where there were hundreds of millions of dollars in revenue at stake with the operation of these models, and where they were impacting millions of people's financial livelihoods. And so it got me hyper-focused on these issues around making sure that your AI worked well, that it worked well for your company, and that it worked well for the people who were being affected by it. At the time when I was doing this, 2016, 2017, 2018, there just wasn't any tooling out there to support this production model management and model monitoring phase of the life cycle. And so we basically left to start the company that I wanted. And John has his own story. I'll let you share that one, John. >> Go ahead John, you're up. >> Yeah, so I'm coming at this from a different world. I'm on leave now from a tenured role in academia, where I was leading a large lab focusing on the intersection of machine learning and economics. And so questions like fairness, or the response to dynamism in the underlying environment, have been around for quite a long time in that space. And so I've been thinking very deeply about some of those more R and D style questions, as well as having deployed some automation code across a couple of different industries, some in online advertising, some in the healthcare space and so on, where concerns of, again, fairness come to bear. And so Adam and I connected to understand the space of what that might look like in the 2018-2019 timeframe, from a quantitative and from a human-centered point of view. And so we booted things up from there. >> Yeah, bring that applied engineering R and D into the Capital One DNA that he had at scale. I can see that fit. 
I got to ask you now, next step: as you guys move out and think about LLMs and the recent AI news around the generative models and the foundational models like ChatGPT, how should we be looking at that news? Everyone watching might be thinking the same thing. I know at the board level, companies are like, we should refactor our business, this is the future. It's that kind of moment, and the tech team's like, okay, boss, how do we do this again? Or are they prepared? How should people watching be thinking about LLMs? >> Yeah, I think they really are transformative. And so, I mean, we're seeing companies all over the place. Everything from large tech companies to a lot of our large enterprise customers are launching significant projects at core parts of their business. And so, yeah, if you're serious about becoming an AI native company, which most leading companies are, then this is a trend that you need to be taking seriously. And we're seeing the adoption rate. It's funny, I would say AI adoption in the broader business world really started, let's call it four or five years ago, and it was a relatively slow adoption rate, but I think all that investment in scaling the maturity curve has paid off, because the rate at which people are adopting and deploying systems based on this is tremendous. I mean, this has all just happened in the last few months, and we're already seeing people get systems into production. Now, there's a lot of things you have to guarantee in order to put these in production in a way that adds to your business and doesn't cause more headaches than it solves. And so that's where we help customers: how do you put these out there in a way that they're going to represent your company well, they're going to perform well, they're going to do their job and do it properly. >> So in the use case, as a customer, as I think about this, there's workflows. 
They might have had an ML AI ops team that's around IT. Their inference engines are out there. They probably don't have visibility on, say, how much it costs; they're kicking the tires. When you look at the deployment, there's a cost piece, there's a workflow piece, there's fairness, you mentioned, John. What should I be thinking about if I'm going to be deploying stuff into production? I got to think about those things. What's your opinion? >> Yeah, I'm happy to dive in on that one. So monitoring in general is extremely important once you have one of these LLMs in production, and there have been some changes versus traditional monitoring, which LLMs have really accelerated, that we can dive deeper into. But a lot of the bread and butter things you should be looking out for remain just as important as they are for what you might call traditional machine learning models. The underlying environment, the data streams, the way users interact with these models: these are all changing over time. And so any performance metrics that you care about need to be tracked: traditional ones like accuracy, if you can define that for an LLM; ones around, for example, fairness or bias, if that is a concern for your particular use case; and so on. Now, there are some interesting changes that LLMs are bringing along as well. So most ML models in production that we see are relatively static, in the sense that they're not getting updated more than maybe once a day or once a week, or they're just set once and then not changed ever again. With LLMs, there's this ongoing value alignment, or collection of preferences from users, that is often constantly updating the model. And so that opens up all sorts of vectors for, I won't say attack, but for problems to arise in production. Like, users might learn to use your system in a different way, and thus change the way those preferences are getting collected, and thus change your system in ways that you never intended. 
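The shifting input distribution Dickerson describes can be watched with a simple statistic. This is an illustrative sketch only, not Arthur's actual tooling: it flags drift in prompt lengths between a launch-time baseline and recent traffic using a population stability index, with bucket boundaries chosen arbitrarily for the example.

```python
# Illustrative sketch only: not Arthur's API, just one simple way to flag a
# shifting prompt distribution in production.
import math
from collections import Counter

BINS = (0, 20, 50, 100, 200, float("inf"))  # prompt-length buckets, in tokens

def bucket_fracs(sample, bins=BINS):
    # fraction of prompts falling in each length bucket
    counts = Counter()
    for x in sample:
        for i in range(len(bins) - 1):
            if bins[i] <= x < bins[i + 1]:
                counts[i] += 1
                break
    # floor empty buckets at half a count so log() below stays finite
    return [max(counts[i], 0.5) / len(sample) for i in range(len(bins) - 1)]

def psi(baseline, current):
    """Population Stability Index; values above ~0.2 are a common drift alarm."""
    b, c = bucket_fracs(baseline), bucket_fracs(current)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))

launch_prompts = [12, 18, 25, 30, 45, 22, 19, 33, 28, 41]         # short questions
todays_prompts = [120, 340, 95, 410, 150, 280, 60, 500, 90, 210]  # pasted documents
print(psi(launch_prompts, todays_prompts) > 0.2)  # prints True: drift detected
```

In practice the same check would run over embeddings or topic mixtures rather than raw lengths, but the idea — compare today's traffic to the distribution the model was validated on — is the same.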
So maybe that went through governance already internally at the company, and now it's totally, totally changed, and it's through no fault of your own, but you need to be watching over that for sure. >> Talk about reinforcement learning from human feedback. How's that factoring into the LLMs? Is that part of it? Should people be thinking about that? Is that a component that's important? >> It certainly is, yeah. So this is one of the big tweaks that happened with InstructGPT, which is the basis model behind ChatGPT and has since gone on to be used all over the place. So value alignment through RLHF, like you mentioned, I think is a very interesting space to get into, and it's one that you need to watch over. Like, you're asking humans for feedback on outputs from a model, and then you're updating the model with respect to that human feedback. And now you've thrown humans into the loop in a way that is just going to complicate things. And it certainly helps in many ways. Let's say that you're deploying an internal chatbot at an enterprise: you could ask humans to align the LLM behind the chatbot to, say, company values. And so you're eliciting feedback about these company values, and that's going to scoot that chatbot you're running internally more toward the kind of language that you'd like to use on, like, a Slack channel or something like that. Watching over that model, I think in that specific case, is a compliance and HR issue as well. So while it is part of the greater LLM stack, you can also view it as an independent bit to watch over. >> Got it, and these are important factors. When people see the Bing news, they freak out: it's doing great, then it goes off the rails. It goes big, fails big. (laughing) So people see that. Is that human interaction, or is that feedback, is that not accepting it? How do people understand how to take that input in and how to build the right apps around LLMs? 
This is a tough question. >> Yeah, for sure. So some of the examples you'll see online where these chatbots go off the rails are obviously humans trying to break the system, but some of them clearly aren't. And that's because these are large statistical models, and we don't know what's going to pop out of them all the time. And even if you're doing as much in-house testing as the big companies like the Coheres and the OpenAIs of the world, trying to prevent things like toxicity or racism or other sorts of bad content that might lead to bad PR, you're never going to catch all of the possible holes in the model itself. And so, again, it's very, very important to keep watching over it while it's in production. >> On the business model side, how are you guys doing? What's the approach? How do you guys engage with customers? Take a minute to explain the customer engagement. What do they need? What do you need? How's that work? >> Yeah, I can talk a little bit about that. So it's really easy to get started. It's literally a matter of handing out an API key, and people can get started. We also offer versions that can be installed on-prem; we find a lot of our customers have models that deal with very sensitive data. So you can run it in your cloud account or use our cloud version. And so, yeah, it's pretty easy to get started with this stuff. We find people start using it a lot of times during the validation phase, 'cause that way they can start baselining the performance of models, they can do champion challenger, maybe they're considering different foundation models. And so it's a really helpful tool for understanding differences in the way these models perform. 
And then from there they can just flow that into their production inferencing, so that as these systems are out there, you have real time monitoring for anomalies and all sorts of weird behaviors, as well as that continuous feedback loop that helps you make your product better, and observability, and you can run all sorts of aggregated reports to really understand what's going on with these models when they're out there deciding. I should also add that, just today, we have another way to adopt Arthur: we are in the AWS Marketplace, and so we are available there, just to make it that much easier to use your cloud credits, skip the procurement process, and get up and running really quickly. >> And that's great, 'cause Amazon's got SageMaker, which handles a lot of privacy stuff, all kinds of cool things, or you can get down and dirty. So I got to ask on the next one: production is a big deal, getting stuff into production. What have you guys learned that you could share with folks watching? Is there a cost issue? I got to monitor, obviously, you brought that up; we talked about even the reinforcement issues. All these things are happening. What are the big learnings you could share for people that are going to put these into production, to watch out for, to plan for, or be prepared for? Hope for the best, plan for the worst. What's your advice? >> I can give a couple opinions there, and I'm sure Adam has some too. Well, yeah, the big one from my side is, again, I mentioned this earlier, it's the input data streams, because humans are also exploring how they can use these systems to begin with. It's really, really hard to predict the type of inputs you're going to be seeing in production. 
Especially, we always talk about chatbots, but for any generative text task like this, let's say you're taking in news articles and summarizing them or something like that, it's very hard to get a good sampling even of the set of news articles, in such a way that you can really predict what's going to pop out of that model. So to me, adversarial maybe isn't the word that I would use, but it's an unnatural, shifting input distribution of prompts that you might see for these models. That's certainly one. And then the second one I would talk about is, it can be hard to understand the inference time costs behind these LLMs. So the pricing on these is always changing; as the models change size, it might go up, it might go down based on model size, based on energy cost and so on, but you're pricing per token or per thousand tokens, and that I think can be difficult for some clients to wrap their head around. Again, you don't know how these systems are going to be used, after all, so it can be tough. And so, again, that's another metric that really should be tracked. >> Yeah, and there's a lot of trade-off choices in there, with, like, how many tokens do you want at each step in the sequence, and based on, you have (indistinct) and you reject these tokens, and so based on how your system's operating, that can make the cost highly variable. And that's if you're using, like, an API version where you're paying per token. A lot of people also choose to run these internally, and as John mentioned, the inference time on these is significantly higher than a traditional classifier, even an NLP classification model or tabular data model, like orders of magnitude higher. And so you really need to understand, as you're constantly iterating on these models and putting out new versions and new features, how that's affecting the overall scale of that inference cost, because you can use a lot of computing power very quickly with these products. 
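The per-token pricing both guests describe comes down to simple arithmetic, but it is worth making explicit. A minimal sketch, assuming a hypothetical flat rate where input and output tokens are billed identically (real providers price these differently and change rates often):

```python
# Illustrative sketch: the price below is an assumed placeholder, not a quote
# for any real model; providers change per-token pricing frequently.
PRICE_PER_1K_TOKENS = 0.002  # assumed USD per 1,000 tokens

def request_cost(prompt_tokens, completion_tokens, price_per_1k=PRICE_PER_1K_TOKENS):
    # both input and output tokens are billed in this simple flat-rate model
    return (prompt_tokens + completion_tokens) / 1000 * price_per_1k

def monthly_estimate(requests_per_day, avg_prompt_tokens, avg_completion_tokens, days=30):
    # cost scales linearly with traffic and with average tokens per request
    return days * requests_per_day * request_cost(avg_prompt_tokens, avg_completion_tokens)

# 50,000 requests/day, ~400 prompt tokens and ~300 completion tokens each
print(round(monthly_estimate(50_000, 400, 300), 2))  # prints 2100.0
```

The point of the exercise is the one made above: because users control prompt length and your app controls completion length, both averages drift in production, so the estimate has to be recomputed from live token counts, not set once at launch.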
>> Yeah, scale, performance, price all come together. I got to ask, while we're here, about the secret sauce of the company. If you had to describe to people out there watching, what's the secret sauce of the company? What's the key to your success? >> Yeah, so John leads our research team, and they've had a number of really cool results. I think AI, as much as it's been hyped for a while, commercial AI at least is really in its infancy. And so the way we're able to pioneer new ways to think about performance for computer vision, NLP, and LLMs is probably the thing I'm proudest about. John and his team publish papers all the time at NeurIPS and other places. But I think it's really being able to define what performance means for basically any kind of model type, and give people really powerful tools to understand that on an ongoing basis. >> John, secret sauce, how would you describe it? You got all the action happening all around you. >> Yeah, well, I appreciate Adam talking me up like that. (all laughing) >> Furrier: Props to you. >> I would also say a couple of other things here. So we have a very strong engineering team, and I think some early hires there really set the standard at a very high bar that we've maintained as we've grown. And I think that's really paid dividends as scalability has become even more of a challenge in these spaces, right? And that's not just scalability when it comes to LLMs; that's scalability when it comes to millions of inferences per day, that kind of thing, in traditional ML models as well. And compared to potential competitors, that's really made us able to operate more efficiently and pass that along to the client. >> Yeah, and I think the infancy comment is really important, because it's the beginning. There really is a long journey ahead. A lot of change coming; like I said, it's a huge wave. 
So I'm sure you guys have got a lot of planning at the foundation, even for your own company, so I appreciate the candid response there. Final question for you guys: what should the top things be for a company in 2023? If I'm going to set the agenda, and I'm a customer moving forward, putting the pedal to the metal, so to speak, what are the top things I should be prioritizing, or that I need to do, to be successful with AI in 2023? >> Yeah, I think, so number one, as we've been talking about this entire episode, things are changing so quickly, and the opportunities for business transformation and really disrupting different applications, different use cases, I don't think we've even fully comprehended how big it is. And so really digging in to your business and understanding where you can apply these new sets of foundation models, that's a top priority. The interesting thing is, I think there's another force at play, which is the macroeconomic conditions, and a lot of places are having to work harder to justify budgets. So in the past, a couple years ago maybe, they had a blank check to spend on AI and AI development at a lot of large enterprises, limited primarily by the amount of talent they could scoop up. Nowadays these expenditures are getting scrutinized more. And so one of the things that we really help our customers with is calculating the ROI on these things. And so if you have models out there performing, and you have a new version that you can put out that lifts the performance by 3%, how many tens of millions of dollars does that mean in business benefit? Or if I want to get approval from the CFO to spend a few million dollars on this new project, how can I bake in, from the beginning, the tools to really show the ROI along the way? Because I think in these systems, when done well, for a software project the ROI can be pretty spectacular. 
Like, we see over a hundred percent ROI in the first year on some of these projects. And so, I think in 2023, you just need to be able to show what you're getting for that spend. >> It's a needle moving moment. You see it all the time with some of these aha moments, like, whoa, blown away. John, I want to get your thoughts on this, because one of the things that comes up a lot for companies that I talk to, the second wave coming in, maybe not the front wave of adopters, is talent and team building. You mentioned some of the hires you got were game changing for you guys and set the bar high. As you move the needle, new developers are going to need to come in. What's your advice, given that you've been a professor, you've seen students? I know a lot of computer science people want to shift; they might not yet be skilled in AI, but they're proficient in programming, and that's going to be another opportunity with open source, when things are happening. How do you talk to that next level of talent that wants to come into this market, to supplement teams, be on teams, lead teams? Any advice you have for people who want to build their teams, and people who are out there and want to be a coder in AI? >> Yeah, I have advice, and this actually works for what it would take to be a successful AI company in 2023 as well, which is: just don't be afraid to iterate really quickly with these tools. The space is still being explored on what they can be used for. A lot of the tasks they're used for now, right? Like creating marketing content using machine learning is not a new thing to do. It just works really well now. And so I'm excited to see what the next year brings in terms of folks from outside of core computer science, other engineers or physicists or chemists or whatever, who are learning how to use these increasingly easy to use tools to leverage LLMs for tasks that I think none of us have really thought about before. 
So that's really, really exciting. And so toward that, I would say: iterate quickly. Build things on your own, build demos, show them to friends, host them online, and you'll learn along the way, and you'll have something to show for it. And also you'll help us explore that space. >> Guys, congratulations with Arthur. Great company, great picks and shovels opportunities out there for everybody. Iterate fast, get in quickly, and don't be afraid to iterate. Great advice, and thank you for coming on and being part of the AWS Showcase. Thanks. >> Yeah, thanks for having us on, John. Always a pleasure. >> Yeah, great stuff. Adam Wenchel, John Dickerson with Arthur. Thanks for coming on theCUBE. I'm John Furrier, your host. Generative AI and AWS. Keep it right there for more action with theCUBE. Thanks for watching. (upbeat music)

Published Date : Mar 9 2023



Tanuja Randery, AWS | AWS re:Invent 2021


 

>>Hey, welcome back everyone to theCUBE's coverage of AWS re:Invent 2021. It's our third day of wall-to-wall coverage. I'm here with my co-host, Dave Vellante. We're getting all the action, two sets in person. It's also a virtual hybrid event with a lot of great content online, bringing you all the fresh voices, all the knowledge, all the news, and all the action. And we've got a great guest here today: Tanuja Randery, managing director of AWS Europe, Middle East, and Africa, also known as EMEA. Welcome to theCUBE. >>Thank you, lovely to be here. >>So Europe is really hot, Middle East and Africa too, great growth. The VC culture in Europe specifically has been booming this year. A lot of great action. We've done many CUBE gigs out there talking to folks: entrepreneurship, cloud-native growth. And for us it's global, it's awesome. So first question I've got to ask you: you're new to AWS. What brought you here? >>Yeah, John, thank you so much. I've been here about three and a half months now, actually. So what brought me here? I have been in and around the tech world since I was a baby. My father was an entrepreneur; I sold fax machines and microfilm equipment in my early days. And then my career has spanned technology in some form or other. I was at EMC when we bought VMware. I was at Colt when we did a FinTech startup. And Schneider is in my background, which is industrial tech. So I guess I'm a bit of a tech nerd, although I'm not an engineer, that's for sure. The other thing is I've spent a huge part of my career advising clients while I was at McKinsey on business transformation, and cloud keeps coming up, especially post-pandemic, as a huge, huge enabler of transformation.
So when I got the call from AWS, I thought: here's my opportunity to finally take what companies are wrestling with, bring together a pioneer in cloud with our enterprise and startup and SMB clients, connect those dots between business and technology, and make things happen. Real magic. So that's what brought me here. And I guess the only other thing to say is I'd heard a lot about the culture, the customer obsession, and the leadership principles. That's why I'm here. >>It's been a great success. I've got to ask you, now that you're new here, post-McKinsey, and seeing the front lines, all the transformation: the pandemic has really forced everybody globally to move faster. Things like Connect were popular in EMEA. How is that going out there? Is there the same kind of global pressure on digital transformation with cloud? What are you seeing out there? >>I've been traveling since I joined, around 10 of the countries already. So it's been planes, trains, and automobiles, and what you definitely see is massive acceleration. And I think it's around reinvention of the business. So people are adopting cloud, obviously for cost reasons and M&A reasons, but it's increasingly about innovating: how do I innovate my business, how do I reinvent my business? You see that constantly. And whether you're an enterprise company or a startup, they're all adopting cloud in different ways. I mean, I want to call out a stat, because it's really interesting, and Adam mentioned this in his keynote: only five to 15% of workloads have moved to the cloud. So there's a tremendous runway ahead of us. And the three big things on people's minds: help me become a tech company, and it doesn't matter who you are, retail, life sciences, or healthcare.
You've probably heard about the work that we're doing with Roche around accelerating R&D with data, or if you're adidas, how do you accelerate your personalized experiences? So it doesn't matter who you are: help me become a tech company, give me digital skills, and then help me become a more sustainable company. Those are the three big things I'm thinking of. >>So a couple of things to unpack there. Think about it: transformation, we still have a long way to go, to your point, whatever, 10, 15%, depending on which numbers you look at. We've been talking a lot in theCUBE about the next decade around business transformation, deeper business integration, and the forced march to digital. And the pandemic woke us up to that, accelerated that, as you say. So as you travel around to customers in EMEA, what are you hearing with regard to that? I mean, many customers maybe didn't have time to plan. Now they can sit back and take what they've learned. What are you hearing? >>Yeah. And it's a little bit different in different places, right? So, I mean, if you look at our businesses, for example, in France, or in Iberia or Italy, a lot of them, at least on the enterprise front, are now starting to adopt cloud. So they're stepping back and thinking about their overall strategy, right? And the way that they're doing it is actually using data as the first trigger point. And I think that makes it easier to migrate, because if you look at large enterprises, and if you think of the big processes that they've got and all the mainframes and everything that they need to do, if you look at it as one big block, it's too difficult. But when you think about data, you can actually start to aggregate all of your data into one area and then start to analyze and unpack that.
So I think what I'm seeing for sure is, in those countries, data is the first trigger. If you go out to Israel, it's really startup nation, as you know, right? There you've got more of the digital natives, and they want to absorb all of the innovation that we're throwing at them. And you've heard a lot here at re:Invent on some of these things, whether it's digital twins or robotics, or frankly even 5G private networks, which we've just announced. They are adopting innovation and really taking that in. So it really does differ, but I think the one big message I would leave you with is that bringing industry solutions to businesses is critical. So rather than just talking technology, we've got to be able to bring some of what we've done. So for example, the Goldman Sachs financial cloud: bring that to the rest of the financial services companies in EMEA. Or take the work we're doing on industrials and IoT. So it's really about connecting industry use cases with the technology. >>What's interesting about the Goldman deal, Dave and I were commenting, I think we coined the term in the story we wrote last Thursday and published Sunday: superclouds. Because you look at the rise of Snowflake and Databricks and Goldman Sachs, and you're going to start to see people building on AWS and building these superclouds, because they are taking unique platform features of AWS, customizing them for their needs, and then offering that as a service. So there's kind of a whole other tier developing in the natural evolution of clouds. So the partners are on fire right now, because the creativity and the market opportunities are there to be captured. So you're seeing this opportunity recognition, opportunity capture vibe going on. And it's interesting. I'd love to get your thoughts on how you see that, because certainly the VCs are here in force. I saw all the top Silicon Valley VCs here, and some European VCs are all here.
They're all seeing this. >>So let me pick up on two things you mentioned that I think are absolutely spot on. We're absolutely seeing with our partners that this integration on our platform is so important. So we talk about the power of three, which is: you bring a GSI partner, you bring an ISV partner, you bring AWS, you create that power of three, and you take it to our customers. And it doesn't matter which industry we're in; our partner ecosystem is so rich. As Adam mentioned, we have a hundred thousand partners around the world, and then you integrate that with Marketplace. And the AWS Marketplace just opens the world; we have about 325,000 active customers on Marketplace. So SaaS-ification, integration with our platform, bringing in the GSIs and the ISVs: I think that's the real power. Coming back to your point on the second one, the unicorns, you know, it's interesting.
So UK, France, Israel, MEA. I spent a lot of time recently in Dubai, and you can see it happening there. Africa: Nigeria, South Africa. I mean, all across those countries you're seeing a huge amount of VC funding going in towards developers, towards startups and scale-ups. More and more of our startup clients, by the way, are actually going IPO. You know, initially it used to be a lot of M&A and strategic acquisitions, but they actually have bigger aspirations and they're going IPO, and we've seen them through from when they were seed or pre-seed all the way to now, when they are unicorns. Right? So there's just a tremendous amount happening in EMEA, and we're fueling that. I mean, born in the cloud is easy, right, in terms of what AWS brings to the table. >>Well, I've been saying for years, and I always talked to Andy Jassy about this because he's a big sports nut: when you bring these stadiums to certain cities, it rejuvenates them, and Amazon regions are bringing local rejuvenation around the digital economies.
And what you see with the startup culture is the ecosystems around it. Silicon Valley thrives because you have all the service providers, and the fear of failure goes away; there are support systems. You start to see that same ecosystem support, that robustness, now with AWS's ecosystem. So, you know, it's the classic rising tide floats all boats kind of vibe. I mean, we don't really have our narrative down on this yet, but we're seeing this ecosystem kind of play going on. Yeah. >>And actually it's a real virtuous circle, or flywheel as we call it within AWS, because a startup wants to connect to an enterprise, and an enterprise wants to connect to a startup, right? A lot of our ISV partners, by the way, were startups; now they've graduated and they're very large. And by the way, this is one of the other reasons I came here: I see our role as being real facilitators of these ecosystems. You know, we've got something that we kicked off in EMEA which I'm really proud of, called our EMEA startup loft accelerator, and we launched that at Web Summit. The idea is to bring startups into our space, virtually and physically, and help them build and help them make those connections. And enterprise clients are asking us all the time, right: who do I need to involve if I'm thinking IoT, who do I need to involve if I want to do something with data? And that's what we do: super connectors. >>John, you mentioned the Goldman deal. And I think it was Adam in his keynote who was talking about how our customers are asking us to teach them how to essentially build a supercloud (our words). But with your McKinsey background, I would imagine there are real opportunities there, especially as I hear you talk about EMEA, going around to see customers. There must be a lot of, sort of, non-digital businesses that are now transforming to digital.
A lot of capital needs there. But maybe you could talk about how you see that playing out over the next several years, in your role, and AWS's role in effecting that transformation. >>Yeah, no, absolutely. I mean, you're right, actually. And maybe I'll pick up on something from my past experience. You know, I was in the world of industry with Schneider, as an example, and we did business through the channel, and a lot of our channel was not digitized. You had point-of-sale electrical distributors, wholesalers, et cetera. I think all of those businesses during the pandemic realized that they had to go digital and online, right? And so they started from having one fax machine in a store, literally, I'm not kidding, nothing else, to actually having to go online and be able to do click-and-collect and various other things. And with AWS, you can spin up that sort of service in minutes, right? I love the fact that with a credit card you can get onto our cloud.
>>Right, that's the whole thing. And it's about instances; Adam talked about instances, which I think is great. >>How do businesses transform? And again, I think it's about unpacking the problem, right? So what we do a lot is sit down with our customers and actually map a migration journey with them. We look across their core infrastructure, we look at their SAP systems, for example, we look at what's happening in the various businesses: their e-commerce systems, their customer lifecycle value management systems. I think you've got to go business by business, use case by use case, and then help our technology enable that use case to actually digitize, whether it's front office or back office. I think the advantages are pretty clear. The difficulty is not technology anymore; the difficulty is mindset, leadership, commitment, the operating model, the organizational model, and skills. And so what we have to do as AWS is bring not only our technology, but our culture of innovation and our digital innovation teams, to help our clients on that journey.
>>Well, we really appreciate you taking the time coming on theCUBE. We have a couple more minutes, and I do want to get into your agenda. Now that you're in charge, you've got the landscape and the 20-mile stare in front of you. Cloud's booming. You've got some personal passion projects. Tell us what your plans are. >>So, three or four really big takeaways for me. One, I came here to help make sure our customers can leverage the power of the cloud, so I will not feel like my job's been done if I haven't been able to do that. You know, that five to 15% we talked about: we've got to go 50, 60, 70%, that's the goal, right? And why not a hundred percent at some point? So I think over the next few years, that's the acceleration we need to help bring. The Americas have already started to get there, as you know, and we need to drive that into EMEA, and then eventually our APJ colleagues are going to do the same. So that's one thing. The other is we talked about partners: I really want to accelerate and expand our partner ecosystem.
We actually have huge growth, by the way, in the number of partners signing up and the number of certifications they're taking. I really want to double down on our partners and actually do what they ask us for, which is joint co-sell, joint marketing, globalization. So that's two. I think the third big thing is, as you mentioned, industry, industry, industry: we've got to bring real use cases and solutions to our customers, and not only talk technology; we've got to connect those two dots. And we have lots of examples to bring, by the way. And then, hire and develop the best. You know, we've got a new leadership principle, as you know, to strive to be Earth's best employer. I want to do that in EMEA; I want to make sure we attract, we retain, and we grow and develop our people.
>>And diversity has been a huge theme of this event. It's front and center in virtually every company. >>I'm hugely passionate about diversity. I'm proud, actually, that when I was back at Schneider, I launched something called the Power Women Network. We're a network of a hundred senior women, and we meet every month. I've also got a podcast out there, so if anyone's listening, it's called Power Women Speak. I've done 16 episodes over the pandemic with women CEOs. >>Power Women Speak, oh. >>It's on Spotify and everything else. And, you know, what I love about what we're doing at AWS on diversity, and you heard Adam on stage talk to this: we've got our re/Start program, where we really help the underemployed and unemployed get a 12-week intensive course and get trained up on cloud skills. And the other thing is helping young girls, 12 to 15, get into STEM. So lots of different things. On the whole, we need to do a lot more, of course, on diversity, and I look forward to helping our clients through that as well. >>Well, we had the training VP on yesterday. It's all free, the training's free. >>We've got such a digital skills issue that I love that we've said 29 million people around the world get free cloud training. >>Literally, the gap there in earnings with cloud certification: you can be making six figures with cloud training. So, I mean, it's really easy, it's free, it's such a great thing. >>Have you seen the YouTube video on Charlotte Wilkins? She worked in fast food at McDonald's and changed her mind; she wanted a career change. She now has a tech career as a result of being part of re/Start. >>Awesome. Really appreciate it. You've got a lot of energy, and I love the podcast. I'm subscribing, I'm going to listen. We love doing podcasts as well. So thanks for coming on theCUBE. >>Thank you so much for having me. >>Good luck in EMEA, and good luck with your plans. Thank you. Okay, you're watching theCUBE, the leader in global tech coverage. We go to the events and extract the signal from the noise. I'm John Furrier with Dave Vellante here at re:Invent, a physical, in-person event and a hybrid event as well. Thanks for watching.

Published Date : Dec 2 2021

