Search Results for Riding the Wave:

Closing Panel | Generative AI: Riding the Wave | AWS Startup Showcase S3 E1


 

(mellow music) >> Hello everyone, welcome to theCUBE's coverage of AWS Startup Showcase. This is the closing panel session on AI and machine learning, the top startups building generative AI on AWS. It's a great panel. This is going to be the experts talking about riding the wave in generative AI. We've got Ankur Mehrotra, who's the director and general manager of AI and machine learning at AWS, and Clem Delangue, co-founder and CEO of Hugging Face, and Ori Goshen, who's the co-founder and CEO of AI21 Labs. Ori is dialing in from Tel Aviv, and the rest are coming in here on theCUBE. Appreciate you coming on for this closing session of the Startup Showcase. >> Thanks for having us. >> Thank you for having us. >> Thank you. >> I'm super excited to have you all on. Hugging Face was recently in the news with the AWS relationship, so congratulations. Open source, open science, really driving the machine learning. And we've got AI21 Labs' access to the LLMs, generating huge-scale live applications, commercial applications, coming to the market, all powered by AWS. So everyone, congratulations on all your success, and thank you for headlining this panel. Let's get right into it. AWS is powering this wave here. We're seeing a lot of push here from applications. Ankur, set the table for us on AI and machine learning. It's not new, it's been goin' on for a while. The past three years have brought significant advancements, but there's been a lot of work done in AI and machine learning before that. Now it's released to the public, and everybody's super excited and says, "Oh, the future's here!" It's kind of been going on for a while and baking. Now it's kind of coming out. What's your view here? Let's get it started. >> Yes, thank you. So, yeah, as you may be aware, Amazon has been investing in machine learning research and development for quite some time now. And we've used machine learning to innovate and improve user experiences across different Amazon products, whether it's Alexa or Amazon.com. 
But we've also brought in our expertise to extend what we are doing in the space and add more generative AI technology to our AWS products and services, starting with CodeWhisperer, which is an AWS service that we announced a few months ago, which is, you can think of it as a coding companion as a service, which uses generative AI models underneath. And so this is a service that customers who have no machine learning expertise can just use. And we also are talking to customers, and we see a lot of excitement about generative AI, and customers who want to build these models themselves, who have the talent and the expertise and resources. For them, AWS has a number of different options and capabilities they can leverage, such as our custom silicon, such as Trainium and Inferentia, as well as distributed machine learning capabilities that we offer as part of SageMaker, which is an end-to-end machine learning development service. At the same time, many of our customers tell us that they're interested in not training and building these generative AI models from scratch, given they can be expensive and can require specialized talent and skills to build. And so for those customers, we are also making it super easy to bring in existing generative AI models into their machine learning development environment within SageMaker for them to use. So we recently announced our partnership with Hugging Face, where we are making it super easy for customers to bring in those models into their SageMaker development environment for fine tuning and deployment. And then we are also partnering with other proprietary model providers such as AI21 and others, where we are making these generative AI models available within SageMaker for our customers to use. So our approach here is to really provide customers options and choices and help them accelerate their generative AI journey. >> Ankur, thank you for setting the table there. 
Clem and Ori, I want to get your take, because riding the wave, the theme of this session, and to me, being in California, I imagine the big surf, the big waves, the big talent out there. This is like alpha geeks, alpha coders, developers are really leaning into this. You're seeing massive uptake from the smartest people. Whether they're young or have been around, they're coming in with their kind of surfboards, (chuckles) if you will. These early adopters, they've been on this for a while; now the waves are hitting. This is a big wave, everyone sees it. What are some of those early adopter devs doing? What are some of the use cases you're seeing right out of the gate? And what does this mean for the folks that are going to come in and get on this wave? Can you guys share your perspective on this? Because you're seeing the best talent now leaning into this. >> Yeah, absolutely. I mean, from Hugging Face's vantage point, it's not even a wave, it's a tidal wave, or maybe even the tide itself. Because actually what we are seeing is that AI and machine learning is not something that you add to your products. It's very much a new paradigm to do all technology. It's this idea that we had in the past 15, 20 years, one way to build software and to build technology, which was writing a million lines of code, very rule-based, and then you get your product. Now what we are seeing is that every single product, every single feature, every single company is starting to adopt AI to build the next generation of technology. And that works both to make the existing use cases better, if you think of search, if you think of social network, if you think of SaaS, but also it's creating completely new capabilities that weren't possible with the previous paradigm. Now AI can generate text, it can generate images, it can describe your image, it can do so many new things that weren't possible before. >> It's going to really make the developers really productive, right? 
I mean, you're seeing the developer uptake strong, right? >> Yes, we have over 15,000 companies using Hugging Face now, and it keeps accelerating. I really think that maybe in like three, five years, there's not going to be any company not using AI. It's going to be really kind of the default to build all technology. >> Ori, weigh in on this. APIs, the cloud. Now I'm a developer, I want to have live applications, I want the commercial applications on this. What's your take? Weigh in here. >> Yeah, first, I absolutely agree. I mean, we're in the midst of a technology shift here. I think not a lot of people realize how big this is going to be. Just the number of possibilities is endless, and I think hard to imagine. And I don't think it's just the use cases. I think we can think of it as two separate categories. We'll see companies and products enhancing their offerings with these new AI capabilities, but we'll also see new companies that are AI first, that kind of reimagine certain experiences. They build something that wasn't possible before. And that's why I think it's actually extremely exciting times. And maybe more philosophically, I think now these large language models and large transformer-based models are helping us as people to express our thoughts, kind of making the bridge from our thinking to a creative digital asset at a speed we've never imagined before. I can write something down and get a piece of text, or an image, or code. So I'll start by saying it's hard to imagine all the possibilities right now, but it's certainly big. And if I had to bet, I would say it's probably at least as big as the mobile revolution we've seen in the last 20 years. >> Yeah, this is the biggest. I mean, it's been compared to the Enlightenment Age. I saw the Wall Street Journal had a recent story on this. We've been saying that this is probably going to be bigger than all inflection points combined in the tech industry, given what transformation is coming. 
I guess I want to ask you guys, on the early adopters, we've been hearing on these interviews and throughout the industry that there's already a set of big companies, a set of companies out there that have a lot of data and they're already there, they're kind of tinkering. Kind of reminds me of the old hyperscaler days where they were building their own scale, and they're eatin' glass, spittin' nails out, you know, they're hardcore. Then you got everybody else kind of saying at board level, "Hey team, how do I leverage this?" How do you see those two things coming together? You got the fast followers coming in behind the early adopters. What's it like for the second wave coming in? What are those conversations for those developers like? >> I mean, I think for me, the important switch for companies is to change their mindset from being kind of like a traditional software company to being an AI or machine learning company. And that means investing: hiring machine learning engineers, machine learning scientists, infrastructure team members who are working on how to put these models in production, team members who are able to optimize models, specialize models, customize models for the company's specific use cases. So it's really changing this mindset of how you build technology and optimizing your company building around that. Things are moving so fast that I think now it's kind of like too late for low hanging fruit or small adjustments. I think it's important to realize that if you want to be good at that, and if you really want to surf this wave, you need massive investments. If there are some surfers listening, to stick with this analogy of the wave: when there are waves, it's not enough just to stand and make a little bit of adjustments. You need to position yourself aggressively, paddle like crazy, and that's how you get into the waves. So that's what companies, in my opinion, need to do right now. 
>> Ori, what's your take on the generative models out there? We hear a lot about foundation models. What's your experience running end-to-end applications for large foundation models? Any insights you can share with the app developers out there who are looking to get in? >> Yeah, I think first of all, it starts to create an economy where it probably doesn't make sense for every company to create their own foundation models. You can basically start by using an existing foundation model, either open source or a proprietary one, and start deploying it for your needs. And then comes the second round, when you are starting the optimization process. You bootstrap, whether it's a demo, or a small feature, or introducing a new capability within your product, and then start collecting data. That data, and particularly the human feedback data, helps you to constantly improve the model, so you create this data flywheel. And I think we're now entering an era where customers have a lot of different choices of how they want to start their generative AI endeavor. And it's a good thing that there's a variety of choices. And the really amazing thing here is that every industry, any company you speak with, it could be something very traditional like industrial or financial, medical, really any company. I think people now start to imagine what the possibilities are, and seriously think about what their strategy is for adopting this generative AI technology. And I think in that sense, foundation models actually enabled this to become scalable. So the barrier to entry became lower; now the adoption could actually accelerate. >> There's a lot of integration aspects here in this new wave that's a little bit different. Before it was like very monolithic, hardcore, very brittle. A lot more integration, you see a lot more data coming together. 
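The data flywheel Ori describes, bootstrap with an existing foundation model, collect human feedback, and fold the approved examples back into the model's training data, can be sketched in a few lines of illustrative Python. Everything here (the model stub, the rating callback, the `improve` step) is hypothetical scaffolding to show the loop's shape, not any vendor's actual API:

```python
# A toy data flywheel: serve a model, log human feedback, and fold the
# approved examples back into the model's training set. All names here
# are illustrative stand-ins, not a real provider's interface.

class ToyModel:
    def __init__(self):
        self.training_examples = []          # grows with each flywheel turn

    def generate(self, prompt):
        # A real model would run inference; the stub just echoes the prompt.
        return f"draft response to: {prompt}"

    def improve(self, examples):
        # Stand-in for fine-tuning / RLHF on the collected feedback.
        self.training_examples.extend(examples)


def flywheel_turn(model, prompts, rate_fn):
    """One turn of the flywheel: generate, collect ratings, keep approved pairs."""
    approved = []
    for prompt in prompts:
        output = model.generate(prompt)
        if rate_fn(prompt, output):          # human-in-the-loop judgment
            approved.append((prompt, output))
    model.improve(approved)
    return approved


model = ToyModel()
kept = flywheel_turn(model, ["summarize contract", "draft email"],
                     rate_fn=lambda p, o: True)   # pretend reviewers approve all
print(len(model.training_examples))  # 2
```

In a real deployment the rating callback would be an actual human review step and `improve` would kick off fine-tuning on the approved pairs, but the loop keeps this same shape: each turn leaves the model with more feedback-vetted data than the last.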
I have to ask you guys, as developers come in and grow, I mean, when I went to college to be a software engineer, I got a degree in computer science and software engineering, all you did was code, (chuckles) you coded. Now, isn't it like everyone's a machine learning engineer at this point? Because that will be ultimately the science. So, (chuckles) you got open source, you got open software, you got the communities. Swami called you guys the GitHub of machine learning, Hugging Face is the GitHub of machine learning, mainly because that's where people are going to code. So this is essentially, machine learning is computer science. What's your reaction to that? >> Yes, my co-founder Julien at Hugging Face has been saying this for quite a while now, for over three years: that actually software engineering as we know it today is a subset of machine learning, instead of the other way around. People would call us crazy a few years ago when we were saying that. But now we are realizing that you can actually code with machine learning. So machine learning is generating code. And we are starting to see that every software engineer can leverage machine learning through open models, through APIs, through different technology stacks. So yeah, it's not crazy anymore to think that maybe in a few years, there's going to be more people doing AI and machine learning. However you call it, right? Maybe you'll still call them software engineers, maybe you'll call them machine learning engineers. But there might be more of these people in a couple of years than there are software engineers today. >> I bring this up as more tongue in cheek as well, because Ankur, infrastructure as code is what made cloud great, right? That's kind of the DevOps movement. But here the shift is so massive, there will be a game-changing philosophy around coding. 
Machine learning as code, you're starting to see CodeWhisperer, you guys have had coding companions for a while on AWS. So this is a paradigm shift. How is the cloud playing into this for you guys? Because to me, I've been riffing on some interviews where it's like, okay, you got the cloud going next level. This is an example of that, where there is a DevOps-like moment happening with machine learning, whether you call it coding or whatever. It's writing code on its own. Can you guys comment on what this means on top of the cloud? What comes out of the scale? What comes out of the benefit here? >> Absolutely, so- >> Well first- >> Oh, go ahead. >> Yeah, so I think as far as scale is concerned, I think customers are really relying on cloud to make sure that the applications that they build can scale along with the needs of their business. But there's another aspect to it, which is that until a few years ago, John, what we saw was that machine learning was a data scientist-heavy activity. There were data scientists who were taking the data and training models. And then as machine learning found its way more and more into production and actual usage, we saw MLOps become a thing, and MLOps engineers become more involved in the process. And now we are seeing, as machine learning is being used to solve more business critical problems, we're seeing even legal and compliance teams get involved. We are seeing business stakeholders more engaged. So, more and more, machine learning is becoming an activity that's not just performed by data scientists, but is performed by a team and a group of people with different skills. And for them, we as AWS are focused on providing the best tools and services for these different personas to be able to do their job and really complete that end-to-end machine learning story. So that's where, whether it's tools related to MLOps or even for folks who cannot code or don't know any machine learning. 
For example, we launched SageMaker Canvas as a tool last year, which is a UI-based tool which data analysts and business analysts can use to build machine learning models. So overall, the spectrum in terms of persona and who can get involved in the machine learning process is expanding, and the cloud is playing a big role in that process. >> Ori, Clem, can you guys weigh in too? 'Cause this is just another abstraction layer of scale. What's it mean for you guys as you look forward to your customers and the use cases that you're enabling? >> Yes, I think what's important is that the AI companies and providers and the cloud kind of work together. That's how you make a seamless experience and you actually reduce the barrier to entry for this technology. So that's what we've been super happy to do with AWS for the past few years. We actually announced not too long ago that we are doubling down on our partnership with AWS. We're excited to have many, many customers on our shared product, the Hugging Face deep learning container on SageMaker. And we are working really closely with the Inferentia team and the Trainium team to release some more exciting stuff in the coming weeks and coming months. So I think when you have an ecosystem and a system where AWS and the AI providers, the AI startups, can work hand in hand, it's to the benefit of the customers and the companies, because it makes it orders of magnitude easier for them to adopt this new AI paradigm to build technology. >> Ori, this is scale on reasoning too. The data's out there, and making sense out of it, making it reason, getting comprehension, having it make decisions is next, isn't it? And you need scale for that. >> Yes. Just a comment about the infrastructure side. So I think really the purpose is to streamline and make these technologies much more accessible. And I think we'll see, I predict that we'll see in the next few years, more and more tooling that makes this technology much simpler to consume. 
And I think it plays a very important role. There are so many aspects, like monitoring the models and the kind of outputs they produce, and kind of containing and running them in a production environment. There's so much there to build; the infrastructure side will play a very significant role. >> All right, that's awesome stuff. I'd love to change gears a little bit and get a little philosophy here around AI and how it's going to transform, if you guys don't mind. There's been a lot of conversations around, on theCUBE here as well as in some industry areas, where it's like, okay, all the heavy lifting is automated away with machine learning and AI, the complexity, there's some efficiencies, it's horizontal and scalable across all industries. Ankur, good point there. Everyone's going to use it for something. And a lot of stuff gets brought to the table with large language models and other things. But the key ingredient will be proprietary data or human input, or some sort of AI whisperer kind of role, or prompt engineering, people are saying. So with that being said, some are saying it's automating intelligence. And that creativity will be unleashed from this. If the heavy lifting goes away and AI can fill the void, that shifts the value to the intellect or the input. And so that means data's got to come together, interact, fuse, and understand each other. This is kind of new. I mean, old school AI was, okay, you got a big model, you provisioned it for a long time, very expensive. Now it's all free flowing. Can you guys comment on where you see this going with this freeform, data flowing everywhere, heavy lifting, and then specialization? >> Yeah, I think- >> Go ahead. >> Yeah, I think, so what we are seeing with these large language models or generative models is that they're really good at creating stuff. But I think it's also important to recognize their limitations. They're not as good at reasoning and logic. 
And I think now we're seeing great enthusiasm, I think, which is justified. And the next phase would be how to make these systems more reliable. How to inject more reasoning capabilities into these models, or augment them with other mechanisms that actually perform more reasoning, so we can achieve more reliable results. And we can count on these models to perform for critical tasks, whether it's medical tasks, legal tasks. We really want to kind of offload a lot of the intelligence to these systems. And then we'll have to get back, we'll have to make sure these are reliable, we'll have to make sure we get some sort of explainability, so that we can understand the process behind the generated results that we received. So I think this is kind of the next phase of systems that are based on these generative models. >> Clem, what's your view on this? Obviously you're at an open community, open source has been around, it's been a great track record, proven model. I'm assuming creativity's going to come out of the woodwork, and if we can automate open source contribution, and relationships, and onboarding more developers, there's going to be unleashing of creativity. >> Yes, it's been so exciting on the open source front. We all know BERT, BLOOM, GPT-J, T5, Stable Diffusion, the previous and the current generation of open source models that are on Hugging Face. It has been accelerating in the past few months. So I'm super excited about ControlNet right now, which is really having a lot of impact, which is kind of like a way to control the generation of images. Super excited about Flan-UL2, which is like a new model that has been recently released and is open source. So yeah, it's really fun to see the ecosystem coming together. Open source has been the basis for traditional software, with like open source programming languages, of course, but also all the great open source that we've gotten over the years. 
So we're happy to see that the same thing is happening for machine learning and AI, and hopefully can help a lot of companies reduce a little bit the barrier to entry. So yeah, it's going to be exciting to see how it evolves in the next few years in that respect. >> I think the developer productivity angle that's been talked about a lot in the industry will be accelerated significantly. I think security will be enhanced by this. I think in general, applications are going to transform at a radical rate, accelerated, incredible rate. So I think it's not a big wave, it's the water, right? I mean, (chuckles) it's the new thing. My final question for you guys, if you don't mind, I'd love to get each of you to answer the question I'm going to ask you, which is, a lot of conversations around data. Data infrastructure's obviously involved in this. And the common thread that I'm hearing is that every company that looks at this is asking themselves, if we don't rebuild our company, start thinking about rebuilding our business model around AI, we might be dinosaurs, we might be extinct. And it reminds me of that scene in Moneyball when, at the end, it's like, if we're not building the model around your model, every company will be out of business. What's your advice to companies out there that are having those kinds of moments where it's like, okay, this is real, this is next gen, this is happening. I better start thinking and putting into motion plans to refactor my business, 'cause it's happening, business transformation is happening on the cloud. This kind of puts an exclamation point on it, with AI as a next step function. Big increase in value. So it's an opportunity for leaders. Ankur, we'll start with you. What's your advice for folks out there thinking about this? Do they put their toe in the water? Do they jump right into the deep end? What's your advice? 
>> Yeah, John, so we talk to a lot of customers, and customers are excited about what's happening in the space, but they often ask us like, "Hey, where do we start?" So we always advise our customers to do a lot of proofs of concept, understand where they can drive the biggest ROI. And then also leverage existing tools and services to move fast and scale, and try and not reinvent the wheel where it doesn't need to be. That's basically our advice to customers. >> Got it. Ori, what's your advice to folks who are scratching their head going, "I better jump in here. "How do I get started?" What's your advice? >> So I actually think that you need to think about it really economically, both on the opportunity side and the challenges. So there's a lot of opportunities for many companies to actually gain revenue upside by building these new generative features and capabilities. On the other hand, of course, incorporating these capabilities could probably affect the COGS. So I think we really need to think carefully about both of these sides, and also understand clearly if this is a project or an effort towards cost reduction, where the ROI is pretty clear, or a revenue amplifier, where there's, again, a lot of different opportunities. So I think once you think about this in a structured way, I think, and map the different initiatives, then it's probably a good way to start and a good way to start thinking about these endeavors. >> Awesome. Clem, what's your take on this? What's your advice, folks out there? >> Yes, all of this is very good advice already. Something that you said before, John, that I disagree with a little bit: a lot of people are talking about the data moat and proprietary data. Actually, when you look at some of the organizations that have been building the best models, they don't have specialized or unique access to data. So I'm not sure that's so important today. 
I think what's important for companies, and it's been the same for the previous generation of technology, is their ability to build better technology faster than others. And in this new paradigm, that means being able to build machine learning faster than others, and better. So that's how, in my opinion, you should approach this. And kind of like, how can you evolve your company, your teams, your products, so that you are able in the long run to build machine learning better and faster than your competitors. And if you manage to put yourself in that situation, then that's when you'll be able to differentiate yourself, to really kind of be impactful and get results. That's really hard to do. It's something really different, because machine learning and AI is a different paradigm than traditional software. So this is going to be challenging, but I think if you manage to nail that, then the future is going to be very interesting for your company. >> That's a great point. Thanks for calling that out. I think this all reminds me of the cloud days early on. If you went to the cloud early, you took advantage of it when the pandemic hit. If you weren't native in the cloud, you got hamstrung by that, you were flatfooted. So just get in there. (laughs) Get in the cloud, get into AI, you're going to be good. Thanks for calling that out. Final parting comments, what's your most exciting thing going on right now for you guys? Ori, Clem, what's the most exciting thing on your plate right now that you'd like to share with folks? >> I mean, for me it's just the diversity of use cases and really creative ways of companies leveraging this technology. Every day I speak with about two, three customers, and I'm continuously being surprised by the creative ideas. And the future is really exciting in terms of what can be achieved here. And also I'm amazed by the pace that things move at in this industry. It's just, there's not a dull moment. So, definitely exciting times. 
>> Clem, what are you most excited about right now? >> For me, it's all the new open source models that have been released in the past few weeks, and that will keep being released in the next few weeks. I'm also super excited about more and more companies getting into this capability of chaining different models and different APIs. I think that's a very, very interesting development, because it creates new capabilities, new possibilities, new functionalities that weren't possible before. You can plug an API with an open source embedding model, with, like, a transcription model. So that's also very exciting. This capability of having more interoperable machine learning will also, I think, open a lot of interesting things in the future. >> Clem, congratulations on your success at Hugging Face. Please pass that on to your team. Ori, congratulations on your success, and keep going; it's just day one. I mean, it's just the beginning. It's not even scratching the surface. Ankur, I'll give you the last word. What are you excited for at AWS? More cloud goodness coming here with AI. Give you the final word. >> Yeah, so as both Clem and Ori said, I think the research in the space is moving really, really fast, so we are excited about that. But we are also excited to see the speed at which enterprises and other AWS customers are applying machine learning to solve real business problems, and the kind of results they're seeing. So when they come back to us and tell us the kind of improvement in their business metrics and overall customer experience that they're driving, and they're seeing real business results, that's what keeps us going and inspires us to continue inventing on their behalf. >> Gentlemen, thank you so much for this awesome high impact panel. Ankur, Clem, Ori, congratulations on all your success. We'll see you around. Thanks for coming on. Generative AI, riding the wave, it's a tidal wave, it's the water, it's all happening. All great stuff. 
This is season three, episode one of AWS Startup Showcase closing panel. This is the AI ML episode, the top startups building generative AI on AWS. I'm John Furrier, your host. Thanks for watching. (mellow music)
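As a rough illustration of the model-chaining pattern Clem calls out near the end of the panel (plugging an API into an open source embedding model and a transcription model), here is a minimal, self-contained sketch. The stub functions below stand in for real models and APIs; none of the names are any library's actual interface:

```python
# Chaining stub "models" into a pipeline: each stage consumes the previous
# stage's output. Real systems would swap each stub for a hosted model or
# API call without changing the wiring.

def transcribe(audio):
    # Stand-in for a speech-to-text model.
    return audio["spoken_text"]

def embed(text):
    # Stand-in for an embedding model: a crude per-word character-code sum.
    return [sum(ord(c) for c in word) for word in text.split()]

def nearest_document(vector, corpus):
    # Stand-in for a vector-search API: compare summed "embeddings".
    signature = sum(vector)
    return min(corpus, key=lambda doc: abs(sum(embed(doc)) - signature))

corpus = ["refund policy", "shipping times", "refund policy details"]
audio = {"spoken_text": "refund policy"}

# The chain: transcription -> embedding -> retrieval.
answer = nearest_document(embed(transcribe(audio)), corpus)
print(answer)  # "refund policy"
```

The point is only the shape: because every stage takes the previous stage's output as plain data, more interoperable machine learning means any stub here could be replaced by an open model or a proprietary API and the chain still composes.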

Published Date : Mar 9 2023


Nitin Madhok, Clemson University | Splunk .conf19


 

>> Live from Las Vegas, it's theCUBE, covering Splunk .conf19. Brought to you by Splunk. >> Welcome back, everyone, to theCUBE's live coverage from Las Vegas for Splunk .conf 2019, the 10th anniversary of their end user conference. I'm John Furrier, host of theCUBE. This starts our seventh year covering Splunk, riding the wave of big data. Day three of our three days, we're winding down our show. Great to have on our next guest, Nitin Madhok, executive director of BI, Business Intelligence and Advanced Data Analytics at Clemson University. Big ACC football team, everyone knows that. Great stadium. Great to have you on. Thanks for spending the time to come by on day three coverage. >> Thanks, John, for having me over. >> So, you know, hospitals, campuses, some use cases just encapsulate the digital opportunities and challenges, but you guys have that kind of same thing going on. You got students, you got people who work there, you got IoT on campus. Campuses, you guys are living the real-life example of physical and digital coming together. Tell us about what's going on in your world at Clemson and your job there. What's your current situation? >> So, like you mentioned, we have a lot of students. Clemson's about 20,000 undergraduate students and 5,000 graduate students, plus faculty and staff, so you're talking about a lot of people. Every semester we have new devices coming in. We have to support the entire network infrastructure, our student information systems, and research computing. So we're focused on how we can make students' lives better, their experience better, and how we can facilitate education for them. In my role specifically, I'm responsible for the advanced data analytics, the data that we're collecting from our systems: how can we use that and get more insights for better decision making? >> Is the scope university wide, or is it specifically targeted for certain areas? >> So it is university wide.
So we have some key projects going on university wide. We have a project for student success. There's a project for space utilization and how we can utilize space on campus more efficiently. And then we're looking at energy usage across buildings, campus emergency management. So we've got a couple of projects, and then the standard projects that most higher ed institutions work on: student retention, enrollment, graduation rates, how the academics are doing. So we're doing the same thing.

What's interesting is that the new tagline for Splunk is "Data-to-Everything." You got a lot of things, their data, a lot of horizontal use cases. So it seems to me that you have a view, and we were kind of talking on camera before we went live, that data is a fluid situation, it's not just a subsystem. It's got to be everywhere, native everywhere in the organization; it touches everything. How do you guys look at the data? Because you want to harness the data. Data you're gathering on, say, energy, where your specialization might be, could be great data to look at for endpoint protection, for instance. I don't know, I'm making it up, but data needs to be workable across areas. How do you view that? What's the state of the art thinking around data everywhere?
Working on >>Splunk role in all this is because one of things that we've been doing spot I've been falling spunk for a long time in a very fascinated with law. How they take log files and make make value out of that. And their vision now is that Grew is grow is they're enabling a lot of value of the data which I love. I think it's a mission that's notable, relevant and certainly gonna help a lot of use cases. But their success has been about just dumping data on display and then getting value out of it. How does that translate into this kind of data space that you're looking at, because does it work across all areas? What should what specifically are you guys doing with Splunk and you talk about the case. >>So we're looking at it as a platform, like, how can we provide ah self service platform toe analysts who can who can go into system, analyze the data way not We're not focusing on a specific technology, so our platform is built up of multiple technologies. We have tableau for visual analytics. We're also using Splunk. We also have a data warehouse. We've got a lot of databases. We have a Kafka infrastructure. So how can we integrate all of these tools and give give the choice to the people to use the tools, the place where we really see strong helping us? Originally in our journey when we started, our network team used to long for getting log data from switches. It started off troubleshooting exercise of a switch went down. You know what was wrong with it? Eventually we pulled in all for server logs. That's where security guard interested apart from the traditional idea of monitoring security, saw value in the data on. And then we talked about the whole ecosystem. That that's one provides. It gives you a way to bring in data withdrawal based access control so you can have data in a read only state that you can change when it's in the system and then give access to people to a specific set of data. So so that's that's really game changing, even for us. 
Like having having people be comfortable to opening data to two analysts for so that they can make better decisions. That's that's the key with a lot of product announcements made during dot com, I think the exciting thing is it's Nargis, the data that you index and spunk anymore, especially with the integration with With Dew and s three. You don't have to bring in your data in response. So even if you have your data sitting in history, our audio do cluster, you can just use the data fabric search and Sarge across all your data sets. And from what I hear that are gonna be more integrations that are gonna be added to the tool. So >>that's awesome. Well, that's a good use. Case shows that they're thinking about it. I got to ask you about Clemson to get into some of the things that you guys do in knowing Clemson. You guys have a lot of new things. You do your university here, building stuff here, you got people doing research. So you guys are bringing on new stuff, The network, a lot of new technology. Is there security concerns in terms of that, How do you guys handle that? Because you want to encourage innovation, students and faculty at the same time. You want gonna have the data to make sure you get the security without giving away the security secrets are things that you do. How do you look at the data when you got an environment that encourages people to put more stuff on the network to generate more data? Because devices generate data project, create more data. How do you view that? How do you guys handle that? >>So our mission and our goal is not to disrupt the student experience. Eso we want to make it seem less. And as we as we get influx of students every semester, we have way have challenges that the traditional corporate sector doesn't have. If you think about our violence infrastructure. We're talking about 20 25,000 students on campus. They're moving around. When, when? 
When they move from one class to another, they're switching between different access points. So having a robust infrastructure, how can we? How can we use the data to be more proactive and build infrastructure that's more stable? It also helps us plan for maintenance is S O. We don't destruct. Children's so looking at at key usage patterns. How what time's Our college is more active when our submissions happening when our I. D. Computing service is being access more and then finding out the time, which is gonna be less disruptive, do the students. So that's that's how we what's been >>the biggest learnings and challenges that you've overcome or opportunities that you see with data that Clemson What's the What's the exciting areas and or things that you guys have tripped over on, or what I have learned from? We'll share some experiences of what's going on in there for you, >>So I think Sky's the limit here. Really like that is so much data and so less people in the industry, it's hard to analyze all of the data and make sense of it. And it's not just the people who were doing the analysis. You also need people who understand the data. So the data, the data stores, the data trustees you need you need buy in from them. They're the ones who understand what data looks like, how how it should be structured, how, how, how it can be provided for additional analysis s Oh, that's That's the key thing. What's >>the coolest thing you're working on right now? >>So I'm specifically working on analyzing data from our learning management system canvas. So we're getting data informer snapshots that we're trying to analyze, using multiple technologies for that spunk is one of them. But we're loading the data, looking at at key trends, our colleges interacting, engaging with that elements. How can we drive more adoption? How can we encourage certain colleges and departments, too sort of moved to a digital classroom Gordon delivery experience. 
>>I just l a mess part of the curriculum in gym or online portion? Or is it integrated into the physical curriculum? >>So it's at this time it's more online, But are we trying to trying to engage more classes and more faculty members to use the elements to deliver content. So >>right online, soon to be integrated in Yeah, you know, I was talking with Dawn on our team from the Cube and some of the slum people this week. Look at this event. This is a physical event. Get physical campuses digitizing. Everything is kind of a nirvana. It's kind of aspiration is not. People aren't really doing 100% but people are envisioning that the physical and digital worlds are coming together. If that happens and it's going to happen at some point, it's a day that problem indeed, Opportunity date is everything right? So what's your vision of that as a professional or someone in the industry and someone dealing with data Clemson Because you can digitize everything, Then you can instrument everything of your instrument, everything you could start creating an official efficiencies and innovations. >>Yes, so the way I think you you structure it very accurately. It's amalgam of the physical world and the digital world as the as the as the world is moving towards using more more of smartphones and digital devices, how how can we improve experience by by analyzing the data on and sort of be behind the scenes without even having the user. The North is what's going on trading expedience. If the first expedience is in good that the user has, they're not going to be inclined to continue using the service that we offer. >>What's your view on security now? Splunk House League has been talking about security for a long time. I think about five years ago we started seeing the radar data. Is driving a lot of the cyber security now is ever Everyone knows that you guys have a lot of endpoints. Security's always a concern. How do you guys view the security of picture with data? 
How do you guys talk about that internally? How do you guys implement data without giving me a secret? You know, >>way don't have ah ready Good Cyber Security Operation Center. That's run by students on. And they do a tremendous job protecting our environment. Way monitored. A lot of activity that goes on higher I deserve is a is a challenge because way have in the corporate industry, you can you can have a set of devices in the in the higher education world We have students coming in every semester that bringing in new, important devices. It causes some unique set of challenges knowing where devices are getting on the network. If if there's fishing campaigns going on, how can be, How can we protect that environment and those sort of things? >>It is great to have you on. First of all, love to have folks from Clemson ons great great university got a great environment. Great Great conversation. Congratulations on all your success on their final question for you share some stories around some mischief that students do because students or students, you know, they're gonna get on the network and most things down. Like when when I was in school, when we were learning they're all love coding. They're all throwing. Who knows? Kitty scripts out there hosting Blockchain mining algorithms. They gonna cause some creek. Curiosity's gonna cause potentially some issues. Um, can you share some funny or interesting student stories of caught him in the dorm room, but a server in there running a Web farm? Is there any kind of cool experiences you can share? That might be interesting to folks that students have done that have been kind of funny mistress, but innovative. >>So without going into Thio, I just say, Like most universities, we have, we have students and computer science programs and people who were programmers and sort of trying to pursue the security route in the industry. So they, um, way also have a lot of research going on the network on. 
And sometimes research going on may affect our infrastructure environment. So we tried toe account for those use cases and on silo specific use cases and into a dedicated network. >>So they hit the honeypot a lot. They're freshmen together. I'll go right to the kidding, of course. >>Yes. So way do we do try to protect that environment on Dhe. Makes shooting experience better. >>I know you don't want to give any secrets. Thanks for coming on. I always find a talk tech with you guys. Thanks so much appreciated. Okay. Cube coverage. I'm shot for a year. Day three of spunk dot com for more coverage after this short break

Published Date : Oct 24 2019



Armon Dadgar, HashiCorp | KubeCon 2017


 

>> Announcer: Live from Austin, Texas, it's theCUBE, covering KubeCon and CloudNativeCon 2017. Brought to you by Red Hat, the Linux Foundation, and theCUBE's ecosystem partners. >> Okay, welcome back everyone. This is theCUBE's exclusive coverage. We are live in Austin, Texas for CloudNativeCon and KubeCon, not to be confused with CUBE, 'cause we don't have a CUBE Con yet, C-U-B-E. I'm John Furrier with Stu Miniman. Next is Armon Dadgar who is the founder and CTO of HashiCorp. Welcome to theCUBE. >> Thanks so much for having me. >> Thanks for coming on. So we interviewed your partner in crime Mitchell years ago, and we were riffing in our studio in Palo Alto, and essentially we laid out microservices and all the stuff that's being worked on today. So, congratulations, you guys were right in your bet? >> It's funny to see how the reaction has changed over the last few years. Back then it used to be, we'd go in and it's like, people are like, did you catch a load of those crazy people who came in and talked about microservices, and immutable, and cloud? It's like, get out of here. And now it's funny to be here at KubeCon, and it's like-- >> Well it was fun days back then, it was the purists in DevOps, and I say purists, I mean people who were really cutting their teeth into the new methodology, the new way to develop, the new way to kind of roll out scale, a lot of the challenges involved. Certainly, now it's gone mainstream. >> Armon: Yeah. >> You're seeing no doubt about it, I just came back from re:Invent, from AWS: Lambda, serverless. You got application developers that just don't want to deal with any infrastructure. That's infrastructure as code in the DevOps ethos, and then you got a lot of people in the infrastructure plumbing and app plumbing world, who actually care about all this stuff, provisioning. So, how are you guys fitting into the new landscape? You guys riding along? Were you guys the first ones paddling out to these waves?
How do you guys at HashiCorp look at all this growth? >> So the way we think about it is, I think there's a lot of market confusion right now, just because there's so much happening, and I mean, even just being here it's like, almost overwhelming to just understand what exactly is this market landscape evolving to? And the way we're thinking about it is, there's really these four discrete layers with the four different people that are involved in tech, right? We have, on one side, we have our IT operators that are just trying to get a handle around, how do I provision things in Amazon, and now I have business groups coming and saying, okay I want to provision in Google Cloud and Azure. How do I really do that in a way that I don't lose my sanity? You have your security people who are saying, I've lost my network perimeter, now what? Like, how do I think about secret management, and app identity, and this brave new world of cloud. You have your app developers who are like, I don't care about any of that, just give me a platform where I can push deploy and out the gate it goes, and you deal with it. And then you have the folks that are kind of making it all plug together and work, the networking backbone, who are saying okay, before it was F5 and Juniper and Cisco. What does it mean for me as I'm going cloud? So, the way we're sort of seeing ourselves involved in all of this is, how do we help operators sort of get a handle around the provisioning side, with things like Terraform? How do we help the security folks with tools like Vault? How do we complement things like Kubernetes at the runtime layer, or provide our solution with Nomad, and then on the networking side, how do we provide a consistent service discovery experience with Consul? >> So you guys are really just now just kind of riding in with everybody else, kind of welcoming everybody to the party, if you will.
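The "consistent service discovery experience" Dadgar attributes to Consul comes down to a registry that service instances join and that clients query for a healthy address. A toy in-memory sketch of the idea — not Consul's actual API, which adds health checks, a DNS interface, and multi-datacenter support on top:

```python
import random

class Registry:
    """Toy service registry: instances register, clients discover.

    Everything here is illustrative; the names and addresses are made up.
    """

    def __init__(self):
        self._services = {}

    def register(self, name, address, healthy=True):
        """Add an instance of a named service to the catalog."""
        self._services.setdefault(name, []).append(
            {"address": address, "healthy": healthy})

    def discover(self, name):
        """Pick one healthy instance, as a client-side load balancer would."""
        healthy = [i for i in self._services.get(name, []) if i["healthy"]]
        if not healthy:
            raise LookupError(f"no healthy instance of {name}")
        return random.choice(healthy)["address"]

reg = Registry()
reg.register("billing", "10.0.0.5:8080")
reg.register("billing", "10.0.0.6:8080", healthy=False)
print(reg.discover("billing"))  # only the healthy instance is eligible
```

The value of centralizing this, per the discussion above, is that clients stop hard-coding addresses of F5s, Junipers, and fixed hosts and instead ask the registry where a service currently lives.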
(Armon laughs) What's the big surprise for you as you guys, you know it's not new to you guys, but as you see it evolving, what's jumping out at you? I mean, we're hearing service mesh, pluggable architectures. What are some of the things that's popping out of the woodwork that you're excited about? >> Honestly, the thing that I'm excited about is the excitement about infrastructure, right? I mean, when we started four, five years ago, it was an ice cold market. You'd go and talk to people, like, let's talking about how you're doing provisioning, or your deployment, or how your developers push things, and people were like, do we really have to? Like, let me get a coffee. And now it's like the opposite. It's like people are so excited to talk about the infrastructure, the bits and bytes of it, and I think that for us is probably the most exciting thing. So, whether you come here, and it's like the vibe is electric, right? Like, you guys can attest to it. It's crazy to see the growth of it, and so what's exciting for us is now these conversations are being lit up all across industry. >> Yeah. >> So whether you're talking about hey, how do I provision a thing on cloud, to what's a scheduler and how does that help me, there is this tremendous interest in it. >> Yeah, Armon, take us inside. You talked about, you know, it used to be kind of, we would be talking, is infrastructure boring? What is that change that's happening in customers? Has it just reached a certain maturity level, that now the business, they need to move faster, and therefore I need to adopt these kinds of architectures? What are you seeing when you're talking to customers? >> Yeah, I think that, the sort of, we heard that, the sort of, the line a few times is it's becoming boring, but I think what, and sometimes that's the goal, right? All of these tools, all of infrastructure is plumbing, at the end of the day, right? 
At the end of the day, the applications of the end users is really what should be, sort of, the exciting bit. And so, it's our responsibility, sort of, as the vendors here in the community, working on the infrastructure, to make this stuff boring. And I think, in that case, what we really mean is that it should be so reliable, so well documented, so scalable that it's brain dead to operate these things. And I think, step one is, let's get people excited about what's the state of the possible, what's the art of the possible in terms of, what do I get in terms of business agility of adopting this stuff? Once people start adopting it, let's make it boring for them. Let's make sure they don't regret it, and that they actually see those benefits. >> Well, it's reliable too. Boring equals reliability. >> Exactly, exactly. >> Yeah, it's interesting. When you walk through the provision, secure, connect, and run, it reminded me a little bit of Chen talking in the keynote this morning about kind of the stack they see Kubernetes playing in. >> Armon: Totally. >> You know, there's some people who will probably look, well, HashiCorp, you guys, you have a platform. You've got some of these projects. What's compatible, what's replaceable? What's the connection between what you are doing and what's happening in this space? >> Yeah, it's a great question. I mean, I think a lot of people are like "Is it odd for HashiCorp to be here?" And I think it goes back to our lens on this market, which is, we want to provide tools that are sort of discrete in each of these categories, and we fully know that customers are not going to go all in on HashiCorp and say, I want all four layers, right? A lot of our customers are Kubernetes users. And so, for us the mission is, okay great, how do we make sure Terraform plays nice with Kubernetes? How do we make sure Vault plays nice? So I actually have a session in about an hour and a half here, talking about Vault integration with Kubernetes.
And then, we have a developer advocate talking about using Consul with Kubernetes as well. So for us, it's really a play-nice story. How do we make all of these work together? >> It's a rising-tide-that-floats-all-boats market, I mean this is what's happening. You guys are actors in the ecosystem. It's not a land grab. No one can own the stack. That's the whole point of this ecosystem, isn't it? >> It's so big, right, this market that we are talking about is so enormous. It's every organization writing software. (laughing) >> All right, give us the update on HashiCorp. What's going on, what's the latest and greatest you guys are out starting? We interviewed you guys, I think three years ago, maybe four. Can't even remember now at this point. It seems like a blur. >> Yeah, I mean, so two months ago was our big HashiConf, our user conference. And for us, the focus has really been saying okay, we got our initial set of open-source tools out on the market in 2015, and we said okay, let's take a pause. There's already so many tools, let's just focus on how do we make the practitioners successful with each of these things and really go deep on all of them. And so, with things like Terraform, we've been partnering with all the various cloud providers, right, to say how do we have first-class support for Azure, and Google Cloud and Amazon, and make sure that, you know, as you're adopting these clouds, Terraform meets you there. And then with things like Vault, it's how do we integrate with every platform companies want to be on. So if you're using Kubernetes, how do we make sure Vault meets you there and integrates? So, for us that's been the focus, is staying sort of focused on the six core tools, and saying, "How do we make sure they're staying up to date as technology moves?" And sort of deepening them. >> Yeah, because your users are going to be leveraging a lot of the new stuff. They're going to be, Kubernetes has certainly been great.
What's your take on Kubernetes, if you can just take a minute to just, I mean, you're not new to this notion of runtime and orchestration. We talked about it with Mitchell in our session years ago, we didn't actually say Kubernetes, it wasn't around then, but we talked about the middleware of the cloud. That was our discussion, and that was essentially called PaaS at that time, but now, no one talks about PaaS any more, it's all kind of one. >> Right, right. >> What's your take on Kubernetes? How do you feel about it? What is it to you? >> Right, yeah, I think that's, so I think, twofold: I think what's exciting for me about it is, it reminds me in some sense of what Docker did for the industry, which, if we went to sort of the pre-Docker world, nobody talked about immutable, artifact-based deploys. It was like this esoteric thing and then all of a sudden overnight Docker made it popular. Whereas like, oh yeah, of course everything should be immutable and artifact based. And then when you look at what Kubernetes has done, it's built on that momentum to say, okay, that was step one. Step two is to say, you really should think about all your machines as a sort of shared pool of resources and move the abstraction up to the application, to the service, and think about, I'm deploying a service, I'm not deploying a set of VMs. And so it's been this sort of tidal shift in how IT thinks about deploying and delivering an application. It actually should be focused on the service. Focus on sort of abstracting away the machine, and that's super exciting. >> And what do you think the benefits will be, what's the impact on the marketplace? Faster development, I mean, what's some of the impact that you see coming out of this to go to the next level?
>> Yeah, I mean the impact for me is really saying, when we really look at these approaches, in some sense they are not new, if you look at what Google's been doing since the early 2000s with Borg, what Amazon's been doing, what Facebook's been doing internally. These big tech companies have showed if you are able to move up the abstraction and provide this higher level of utility to developers, you can support tens of thousands of services, innovate much more quickly, and for a while, that was sort of trapped in these big tech companies. And I think what Kubernetes is really doing is bringing that to everybody else and saying, actually adopting the same strategy lets you have that, right? >> Yeah, it's a maturation of open source of this generation. You look at what Lyft, Uber are doing. Look at OpenTracing for instance, pretty interesting stuff, because I mean they had to build their own stuff. >> Armon: Right. >> At scale, massive scale. Like, you know, hundreds of thousands of services, millions of transactions a second. >> Armon: Right. >> I mean, that's daunting. >> That's daunting. >> Okay, so your take on open source. Okay, because now we're seeing a new generation of developers coming online. I've been saying it's been, a renaissance is coming. More of an artisan, a craft coming back to craftsmanship of coding, like the UX design side; it's become a craft in code. So you got a new, younger generation coming up. They don't even know what a load balancer is. >> Right. But they're happy not to deal with that as you said. And then you've got open source growing exponentially. Jim Zemlin at the Linux Foundation is saying 10% of the IP is going to be unique to the company. The rest is going to be that sandwich of open source. That's exponential growth. >> Right. >> You get exponential growth, new wave of software developers. You're a young gun, what's your view of the future? >> I mean, it's funny, because it's like that first derivative is going exponential.
The second derivative is going exponential. You know, I think we're going to see more and more innovation at the, ultimately what it's really about is delivering at the end application layer, right? Like, we're all here to be plumbing, right, and so the better we can be at being plumbing, the better the application developers can be at delivering innovation there. And so, I totally agree that the trend is going to go 90/10. And I think that was partly one of the reasons we started HashiCorp, because we'd look around and we're like, it's insane that you have 30 to 50% of these companies doing platform engineering that's completely undifferentiated from anyone else. It's like you're deploying on the same vSphere VM as your competitor but you're rebuilding the whole platform. It's crazy, it's like you should have used an open source tool and focused on the application and not how to boot a vSphere VM. >> And the impact, cost and time. >> Armon, one of the things we talk about, the only thing constant in this industry is that the pace of change keeps increasing. How are you dealing internally? How are customers doing? I think back two years, a year and a half ago I talked to a guy who was like, "Oh, Vagrant is like my favorite thing, I've been using it forever." Now I talk to lots of customers where Vault is critical to the stacks that they're doing. HashiCorp looks very different than they did two years ago. How's that pace of change happening internally and with customers? >> Totally, and I think part of what we've done is actually, since 2015 we haven't really introduced brand new products, because our feeling is that it's becoming so confusing for the end users to really navigate this landscape. So, in 2015 we thought the landscape was confusing. Today it's multiplied by 100 or 1,000.
And honestly, when you look around here, I think that's one of the challenges we're facing as an industry: I go and meet with customers who are like, "Every time I refresh Hacker News, there's 50 new things I need to go evaluate. I don't know where to even begin." And it's like, as a vendor I have a hard time keeping up with the space, you know. I empathize with the end user; it's not their full-time job to do that. So, our goal has been to say, how do we better distill at least the HashiCorp universe in terms of, hey, here's how our pieces fit together and here's how we relate to everything else in the ecosystem, and kind of give our end users a map of, okay, what tools play nice, how do these things sort of work together. But I think as a bigger industry we have a bit of an issue around the sheer amount of innovation. How do we curate that and really make it more accessible? >> Armon, I've got to ask you a personal question. Obviously you guys are entrepreneurs doing a great job. Been following you guys, congratulations by the way. What are you most proud of as you look back, and what do you wish you could do over? If you could get a mulligan and say, "Okay, I want to do that differently." >> How much time do we have, by the way? (laughing) >> 10 seconds, I'm going to ask you the parachute question next, go ahead.
Amazon, Azure, or Google. Which one do you grab? >> Ooh. >> Go. >> You know, probably Amazon. No one ever gets fired for choosing Amazon. >> All right, well, Jeff Frick on our CUBE team said, "I'd take all three and call it Multi Cloud." >> That's the right answer. Armon, thanks for coming on, appreciate it. Congratulations on your success at HashiCorp. >> My pleasure, thanks so much for having me. >> We've got HashiCorp here on theCUBE, CTO and co-founder, Riding The Wave, CloudNative, Kubernetes, a lot of great stuff happening. Microservices and containers. It's theCUBE doing our part here at KubeCon. We'll be right back with more live coverage after this short break.

Published Date : Dec 7 2017

