DockerCon 2022 | Ajay Mungara
(upbeat music)

>> Hi everyone, welcome back to theCUBE's main stage coverage of DockerCon 2022. We've got a great guest from Intel here, Ajay Mungara, Senior Director of Edge Software and AI at Intel, talking about cloud native and AI workloads at The Edge and building a better developer ecosystem for The Edge, which we all know is where the action's going: cloud native, compute, data, data as code. These are things we've been talking about, so Ajay, welcome to theCUBE.

>> Thank you, John. I'm really happy to be here at DockerCon, and everything we do, Docker makes it better.

>> Well, you've done a lot in your career, and looking at your background, The Edge was manufacturing, the old-school IoT stuff. Now that's converged completely with cloud native IP technologies. Everything's kind of happening now at The Edge. This is where the problems are now shifting and getting solved, because of the goodness of the cloud and what that's done for cloud operations; essentially, distributed computing is making The Edge the battleground for where the innovation's happening. Could you just share with us your view of why The Edge is so important and why it's different from what we've been seeing in pure cloud and on-premise data centers?

>> Yeah, you know, 75% of the data that is getting generated of late is happening at The Edge. Okay, so there's a lot of value that's getting generated at The Edge, because we want to move most of the compute closest to the data; latency issues, bandwidth issues, security issues, all of those things are getting people to move compute, storage, and data more towards The Edge. There's also one big shift from a developer point of view, where 51% of all the developers in the world have deployed cloud native, Docker-based solutions somewhere or other, okay. What we are seeing is the combination of cloud computing, networking, and edge computing all coming together, and that is what is pushing the envelope from The Edge perspective. And one of the big drivers is AI at The Edge as well, right. Edge inference workloads, with the camera as one of the sensors, are really driving that compute. And to your question about what's so different about it: the challenges at The Edge are compounded because it brings together the operational technology, the information technology processes, and cloud computing environments, along with networking, all together. So when a developer wants to build a solution for The Edge, they have to figure out what part of that workload sits in the cloud and how they're going to move that workload towards The Edge using some form of networking. How are they going to protect the data in transit as well as at rest, because edge devices can get stolen, you know. So there are all of these challenges about how you figure out the trade-offs between price, performance, functionality, power, heat, size, weight; everything matters when you talk about The Edge. So anyway, that is why we see those differences.

>> It's interesting, you know, if you go back in history with distributed computing, the movie's still the same. Remember back in the day when I was breaking into the business, memory was the bottleneck and storage was the scarce resource. You had to swap out memory, and as a developer you had to deal with that. Then memory became abundant and storage was the problem. Now networking is the latency problem.
So again, these are challenges that developers have to weave through. I was going to ask why The Edge is important and what's in it for the developer, why should they care about The Edge? And I think what you were saying is there are design decisions going on around how to code. Can you elaborate on what's in it for the developer? Why should they care about The Edge?

>> Developers really have to care about The Edge because when you're building a solution, you cannot move all the data and make all the decisions in the cloud; because of latency, right, your bandwidth costs and your solution costs are going to increase. And because of security and privacy concerns, sometimes you have to make those decisions at The Edge itself. You have to figure out how to take the data to the cloud strategically, only where it makes sense, okay. And that is the reason why developers have no choice but to focus on the combination of cloud, networking, and edge, and that's where we are seeing a large-scale set of deployments happening today.

>> Yeah, and I can see the business value too, which is one of the big themes at DockerCon this year; there are tracks on it and people are talking about it. Are you seeing trends like headless retail, which is basically, it's not a Shopify managed service, it's more that you build your own stack and you put the head on there, which is the application and business model?

>> Right.

>> So again, that's an example. There's also manufacturing, there's automotive, all kinds of use cases where there are money-making opportunities, right. So there's business value there, so the developer's going to be pulled to The Edge 'cause they're on the front lines now. So this is about making The Edge ready, and I want to hear your thoughts on what Intel's doing to make that developer environment ready for The Edge, because we know the developer is on the front lines today and that front-line vanguard will be The Edge. What's it look like?

>> Exactly right, so what we have done is we have created this environment for developers, which we call the Intel DevCloud. This dev cloud is a Kubernetes-based environment where we support all of the Docker workloads, and it's based off of Red Hat OpenShift. And we thought about this a little differently. It's a cloud environment where you can use a browser to do all of your development, build, test, and all of that. But we also took a whole range of these edge devices and made them available in the cloud. So as a developer, you don't have to have an edge device sitting at your desk. You have an edge device, or a plethora of edge devices, sitting in the cloud. So you have one environment where you have cloud, you have network, and you have all these edge nodes. You can start building your cloud native or edge native solutions, test them, benchmark them, and figure out what type of combination you actually need for your final solution, as you said, in retail, in smart cities, in healthcare, any of these vertical markets, and get your solution closer to being deployment ready.

>> Yeah, and I love your description, by the way; it's called a container playground. I mean, it just comes across as fun. And I think with this idea of having these nodes available, you guys bring a lot of expertise to the table. That's almost like your localhost for edge devices, right? You can work with it in a safe environment, am I getting that right?
>> You're getting that right. And in fact, during the pandemic when we were all working remote, right, nobody had access to those labs where you have all these edge devices available to you and you can actually play with all these network simulators and everything. Now, with all these developers spread all over the world, you don't have access to as many of those edge devices. So now, with a browser, with this container playground, you can develop any of your Docker Compose, Docker-based container workloads and try them on all of these edge devices, which from Intel's point of view may range across CPUs, VPUs, GPUs, anything, right.

>> We know there's a lot of compute at The Edge, which certainly helps Intel, but your north star is about making it easier for the developers as you guys invest in cloud, network, and The Edge and the cloud native world; that's the goal. How do you do that? And what should developers optimize for? It sounds like they're going to learn that with this playground you have, the dev cloud. What are you seeing that they're going to learn to optimize for? Is it like, I use the old-school example of memory optimization, swapping memory out and that kind of thing, but what are the new issues that need to be optimized for by your developers?

>> If you're a developer, you've got to optimize for your edge AI workloads, right, and that means AI inference workloads. You have to look at how you can take a model that was developed in some type of cloud environment, like a TensorFlow model or a PyTorch model, and bring it down to The Edge. And then you have to run inference workloads. You need to understand, to do this inference, what type of compute you need, what type of storage you need, what type of memory you need. And we give you those options, where you can optimize those types of AI inference workloads. Then you can also decide what decisions you want to make at The Edge and what decisions you want to make in the cloud. We give you those options and the flexibility to build those solutions.

>> Great.

>> One last point I'll make is that there are a lot of legacy applications that have been developed, which are traditional embedded applications. We also want to teach developers how to take these applications and containerize them, and how to take advantage of the cloud native DevOps type of paradigms that make your life easier when it comes to scaling your solution and deploying it worldwide.

>> All right, Ajay, thanks so much for coming on theCUBE. DevCloud, a container playground. Now back to you at the main stage at DockerCon. (upbeat music)
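To make the model-to-edge flow Ajay describes concrete (train in the cloud in TensorFlow or PyTorch, then run inference right next to the camera), here is a minimal sketch of an edge inference loop. It is an editorial illustration, not from the interview: it assumes Intel's OpenVINO runtime as one possible toolkit, a model already converted to OpenVINO IR format ("model.xml" is a placeholder path), and a hypothetical 1x3x224x224 input; the random array stands in for a real camera frame.

```python
# Minimal sketch (editorial example, not from the interview) of an edge AI inference loop.
# Assumptions: Intel's OpenVINO runtime is installed (pip install openvino), and a
# cloud-trained TensorFlow/PyTorch model has already been converted to OpenVINO IR
# ("model.xml" + "model.bin"); the path and input shape below are placeholders.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")            # load the converted model
compiled = core.compile_model(model, "CPU")     # target an edge CPU; GPU/VPU are other options
output_layer = compiled.output(0)

# Stand-in for a frame coming from a camera sensor at the edge.
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)

result = compiled([frame])[output_layer]        # inference happens close to the data
print("inference output shape:", result.shape)
```

Packaged into a container image, a script like this is the kind of Docker-based workload that could be tried against the different CPU, GPU, and VPU nodes in the DevCloud playground, and that same containerization step is the natural on-ramp for the legacy embedded applications Ajay mentions.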