Roger Barga, AWS | AWS re:Invent 2020
>> From around the globe, it's theCUBE, with digital coverage of AWS re:Invent 2020, sponsored by Intel and AWS. Welcome back to theCUBE's live coverage of AWS re:Invent 2020. We're not in person this year; we're virtual. This is theCUBE Virtual. I'm John Furrier, your host of theCUBE. Roger Barga, the General Manager of AWS Robotics and Autonomous Services, and a lot of other cool stuff, was on last year. Always a pleasure. DeepRacer, you've got the machines, and now you have real-time robotics hitting the scene. Andy Jassy laid out a huge vision, and data points and announcements around industrial IoT. It's kind of coming together, Roger. Great to see you, and thanks for coming on. I want to dig in and get your perspective. Thanks for joining theCUBE.

>> Good to be here with you again today.

>> Alright, so give us your take on the announcements yesterday and how that relates to the work you're doing on the robotics side at AWS. Where does this go, from, you know, fun, to real world, to societal impact? Take us through how you see that vision.

>> Yeah, sure. So we continue to see the story of how processing is moving to the edge, and cloud services are augmenting that processing at the edge with unique and new services. Andy talked about five new industrial machine learning services yesterday, which are very relevant to exactly what we're trying to do with AWS RoboMaker. A couple of them: Monitron, which is for monitoring equipment for anomalies, and it's a whole solution, from an edge device to a gateway to a service. But we also heard about Lookout for Equipment, which, if a customer already has their own sensors, is a service that can take the data from those sensors on the device and identify anomalies or potential failures. And we saw Lookout for Vision, which allows customers to use their cameras and build a service to detect anomalies and potential failures. With AWS RoboMaker, we have ROS cloud service extensions, which allow developers to connect their robot to these services. So increasingly, you have that combination of being able to put sensors and processing at the edge, and connect it back with the cloud, where you can do intelligent processing and understand what's going on out in the environment. So those were exciting announcements, and that story is going to continue to unfold with new services, new sensors we can put on our robots to, again, intelligently process the data and control these robots in industrial settings.

>> You know, this brings up a great point. And, you know, I wasn't kidding when I said fun to real world. I mean, this is what's happening. The use cases are different. You mentioned, you know, Monitron and Lookout, but those depend on the Panorama Appliance; you had computer vision, machine learning. I mean, these are all new, cool, relevant use cases, but they're not static. The edge is very diverse, and sometimes mostly purpose-built for the edge piece. So it's not like you can build one product that fits everywhere. Talk about that dynamic and why the robotics piece has to be agile, and what you guys are doing to make that workable. Because, you know, if you want purpose-built, purpose-built implies supply chains years in advance. It implies slow. And, you know, how do you get the trust? How do you get the security? Take us through that, please.

>> So to your point, no single service is going to solve all problems, which is why AWS has released a number of primitives. Just think about Kinesis Video: I can stream my raw video from an edge device and build my own machine learning model in the cloud with SageMaker that will process it, or I could use Rekognition. So we give customers these basic building blocks. But we also think about working backward from the customer.
What is the finished solution we can give a customer that just works out of the box? And the new services we heard about yesterday were exactly in that latter category. They're purpose-built, ready to be used or trained, and developers can use them with very little customization necessary. But the point is that for the customers working in these environments, the business questions change all the time, so they need to actually reprogram a robot on the fly, for example with a new mission, to address a new business need that just arose. That's a dynamic we've been very tuned into since we first started with AWS RoboMaker. We have a feature for fleet management, which allows a developer to choose any robot out in their fleet, take a new software stack, test it in simulation, and then deploy it to that robot so it changes its mission. And this is a dialogue we've seen coming up over the last year, where roboticists are starting to educate their companies that a robot is a device that can be dynamically programmed. At any point in time, they can test their application in simulation while the robot is out in the field, verify in simulation that it's going to work correctly, and then change the mission for that robot dynamically. One of the customers we're working with, the Woods Hole Oceanographic Institution, is sending autonomous underwater robots out into the ocean to monitor wind farms, and they realized the mission may change based on what they find out at the wind farm with their autonomous robot. The robot itself may encounter an issue, and because they do have connectivity, that ability to change the mission dynamically, first testing it, of course, in simulation, is completely changing the game for how they think about robots. A robot is no longer a static thing, programmed once, that you have to bring back into the shop to reprogram.
It's now just this dynamic entity that they can test and modify at any time.

>> You know, I'm old enough to know how hard that really is to pull off, and this highlights how exciting this is. I mean, just think about the idea of hardware being dynamically updated with software in real time, or near real time, with new stacks. That's just unheard of, you know, because purpose-built has always been: you lock it in, you deploy it, you send the tech out there. That kind of break-fix mindset. This changes everything, whether it's space or underwater. It's a software-defined, software-operated model. So I have to ask you, first of all, that's super awesome. What's this like for the new generation? Because Andy talked on stage, and in the one-on-one I had with him, about a new generation of developer. So you've got to look at these young kids coming out of school. To them, they don't understand how hard this is; they just treat software-defined stuff as the lingua franca. So can you share some of the cutting-edge things coming out of this new talent, these new developers? I'm sure the creativity is off the charts. Can you share some cool use cases, your perspective?

>> Absolutely. I think there are a couple of interesting cases to look at. One is, you know, roboticists historically have thought about all the processing being on the robot, and if you said cloud and cloud services, they just couldn't fathom the reality that the processing could be moved off of the robot. Now you're seeing developers who are looking at the cloud services that we're launching, and our cloud service extensions, which give you a secure connection to the cloud from your robot.
They're starting to realize they can actually move some of that processing off the robot, which can lower the BOM, the bill of materials, the cost of the robot, and they can have this dynamic programming surface in the cloud where they can program and change the behavior of the robot. So that's a dialogue we've seen coming up over the last couple of years, that rethinking of where the software should live: what makes sense to run on the robot, and what should we push out to the cloud? Let alone the fact that if you're aggregating information from hundreds of robots, you can actually build machine learning models that identify mistakes a single robot might make across the fleet, and use that insight to retrain the models, push new applications down, push new machine learning models down. That is a completely different mindset. It's almost like introducing distributed computing to roboticists, so that you actually think about this fabric of robots. And another, more recent trend we're seeing, where we're listening very closely to customers, is the ability to use simulation and machine learning, specifically reinforcement learning, for a robot to actually try different tasks. Simulations have gotten so realistic, with physics engines and rendering quality that is nearly photorealistic for a camera, and physics that match the real world, that you can put a simulation of your robot into a 3D simulated world and allow it to bumble around and make mistakes while trying to perform a task that you frankly don't know how to write the code for, it's so complex. Through reinforcement learning, giving reward signals if it does something right, or punishment, negative reward signals,
if it does something wrong, the machine learning algorithm will learn to perform navigation and manipulation tasks which, again, the programmer simply didn't have to write a line of code for, other than creating the right simulation and the right set of trials.

>> So it's like reversing the debugging protocol. It's like, hey, run the simulations and the code writes itself. Debug it on the front end; it writes itself, rather than writing code, compiling it, debugging it, working through the use cases. I mean, it's pretty different.

>> It is. It's really a new persona. When we started out, we were taking that roboticist persona and, again, introducing them to cloud services and distributed computing. Now what you're seeing is the machine learning scientist with robotics experience rising as a new developer persona that we have to pay attention to, and we're talking to them right now about what they need from our service.

>> Well, Roger, I'm getting tight on time here, so one final question before we break. How does someone get involved with Amazon? You know, whether it's robotics or new areas like space, which is emerging, there's a lot of action, a lot of interest. How does someone engage with Amazon to get involved, whether I'm a student or a professional who wants to code? What's the path?

>> Absolutely. So certainly re:Invent: we have several sessions at re:Invent on AWS RoboMaker, where our team is presenting and talking about our roadmap and how people can get engaged. There is, of course, the re:MARS conference, which will hopefully be happening next year, to get engaged with. Our team is active in the ROS open source community and in ROS-Industrial, which meets in Europe later in December but also happens in the Americas, where we're present, giving demos and hands-on tutorials. We're also very active in the academic research and education arena.
In fact, we just released an open source curriculum that any developer can access on GitHub, for robotics and ROS as well as how to use RoboMaker, and it's freely available. So there are a number of touch points, and of course I welcome any request from people who want to learn more or just engage with our team.

>> Roger Barga, General Manager of Robotics and the Autonomous Systems Group at AWS, Amazon Web Services. Great stuff, and this is really awesome insight. Also, you know, it's candy for the developers. It's the new generation of people who are going to sink their teeth into some new science and some new problems to solve with software. Again, distributed computing meets robotics and hardware, and it's an opportunity to change the world, literally.

>> It is an exciting space. It's still day one in robotics, and we look forward to seeing what customers do with our service.

>> Great stuff, of course. TheCUBE loves this content. We love robots, we love autonomous, we love space, programming, all this stuff. Totally cutting-edge cloud computing, changing the game at many levels with the digital transformation. This is theCUBE. Thanks for watching.