Rob High, IBM | IBM Think 2020
>> From the Cube Studios in Palo Alto and Boston, it's the Cube, covering IBM Think, brought to you by IBM.

>> Welcome back, everybody. This is Dave Vellante of the Cube, and you're watching our continuous coverage of the IBM Think Digital 2020 experience. We're really pleased to have Rob High here. He's not only an IBM Fellow, he's also Vice President and CTO of the IBM Edge Computing Initiative. Rob, thanks so much for coming on the Cube. Good to see you. I wish we were face to face, but it's a time to be safe and healthy, I guess. So, edge is obviously a hot topic, and everybody has a point of view on it. I'd be interested in how IBM looks at the edge, how you define it, and what your thoughts are on its evolution.

>> Yeah, well, you know, there are really two fairly distinct ways of thinking about the edge. The telcos are creating edge capabilities in their own network facilities; we call that the network edge. The other side of the edge, the one that I think matters a lot to our enterprise businesses, is the remote, on-premises locations where they actually perform the work that they do, where the majority of people are, where the data that gets created is first formed, and where the actions they need to act on are being taken. That's where a lot of the interest is, because if we can move workloads, IoT workloads, to where that data is being created and where those actions are being taken, not only can we dramatically reduce the latency of those decisions, but we can also ensure continuous operations even in the presence of network failures, we can manage the growing demand for network bandwidth as more and more data gets created, and we can optimize the efficiency of both the business operations and the IT operations that support them. So for us, edge computing at the end of the day is about moving work to where the data is and where the actions are being taken.

>> Well, so this work from home, as a result of this pandemic, is creating new stresses on networks, and people are pouring money into beefing up that infrastructure as sort of an extension of what we used to think of as the edge. But I wonder if you could talk about some of the industries and the use cases that you guys are seeing, notwithstanding, as I say, that work-from-home pivot.

>> Yeah, absolutely. Look, we have seen the need to place workloads close to where data is being created and where actions are being taken in virtually every industry. The one that's probably easiest to think about, and most common in terms of our mindset, is manufacturing. Think about all the things that go on on a factory floor that need analytics performed on the equipment and the processes running there. Take production quality, for example. If you've got a machine that's putting out parts, maybe welding seams on metal boxes, you want to be able to look at the quality of that seam at the moment the weld is being performed, so that if there are any problems you can remediate them immediately, rather than having that box move on down the line only to find that the quality issues created earlier have been exacerbated in other ways. So production quality and inspection, production optimization, and, in our world of COVID-19, worker safety: getting workers back to work and ensuring that people are wearing their masks and exercising social distancing on the factory floor. Worker insight is another major use case we're seeing surface of late with a lot of interest, whether that's infrared cameras, Bluetooth beacons, or any variety of devices that can be employed in the work area to help ensure that factories are operating efficiently and that workers are safe. And whether that's a factory situation, an office situation, or a warehouse or distribution center, in all of these scenarios the utility that edge computing brings to those use cases is tremendous.
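To make the production-quality example a little more concrete, here is a minimal sketch of the kind of inspection loop that might run on a box sitting next to the welding line: score each frame locally, act on suspected defects immediately, and send only an aggregate upstream. The camera, model, and threshold below are placeholders invented for illustration; they are not part of any specific IBM product.

```python
import random
import time

DEFECT_THRESHOLD = 0.8          # assumed tolerance; in practice tuned per part and per line

def read_frame():
    """Placeholder for grabbing an image from the inspection camera."""
    return {"ts": time.time(), "pixels": [random.random() for _ in range(16)]}

def score_defect(frame):
    """Placeholder for an on-device vision model returning a defect probability."""
    return random.random()       # stand-in for real model inference

def alert_operator(score):
    """On a real line this might stop the conveyor or flag the part for rework."""
    print(f"defect suspected (score={score:.2f}), flag the part before it moves on")

summary = {"frames": 0, "defects": 0}
for _ in range(20):              # in practice: while the line is running
    frame = read_frame()
    score = score_defect(frame)  # inference happens here, at the edge
    summary["frames"] += 1
    if score >= DEFECT_THRESHOLD:
        summary["defects"] += 1
        alert_operator(score)    # act immediately, no round trip to the cloud

# Only the aggregate leaves the site; raw frames stay local.
print("shift summary sent upstream:", summary)
```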
>> And a lot of these devices are unattended or infrequently attended. I always use the windmill example. You don't want to have to do a truck roll to figure out what's going on at the windmill, so I can instrument that. But what about the management of those devices from an autonomous standpoint? Are you doing anything in the autonomous management space?

>> Yeah, in fact, that's really key here, because when you think about the scale, the diversity, and the dynamism of equipment in these environments, and, as you point out, Dave, the lack of IT resources and skills on the factory floor, or in the retail store or hotel or distribution center or any of these environments, the situation is very similar. You can't manually manage getting the right workloads to the right place at the right time using the traditional approaches. You have to think about an autonomous approach to management and let the system decide for you what software needs to be placed out there, which devices to put it on, and, if it's an analytic algorithm, what models should be associated with that software. Getting it to the right place at the right time is a key part of what we do with IBM Edge Application Manager, the product we're bringing to market right now in the context of edge computing that facilitates this idea of autonomous management.
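As a rough illustration of what letting the system decide can look like, here is a toy sketch of policy-based placement that matches workload constraints against node properties. The property names, policies, and matching rule are invented for illustration; they are not the actual IBM Edge Application Manager or Open Horizon policy schema.

```python
# Toy policy matcher: decide which edge nodes should receive which workloads.
from typing import Dict, List

nodes: List[Dict] = [
    {"id": "camera-7",  "props": {"arch": "arm64", "gpu": False, "site": "plant-a"}},
    {"id": "line-gw-2", "props": {"arch": "amd64", "gpu": True,  "site": "plant-a"}},
    {"id": "store-pos", "props": {"arch": "amd64", "gpu": False, "site": "store-12"}},
]

workloads: List[Dict] = [
    {"name": "weld-inspection", "constraints": {"gpu": True,  "site": "plant-a"}},
    {"name": "shelf-analytics", "constraints": {"gpu": False, "site": "store-12"}},
]

def matches(node_props: Dict, constraints: Dict) -> bool:
    """A node qualifies only when every constraint is satisfied by its properties."""
    return all(node_props.get(key) == value for key, value in constraints.items())

# In an autonomous setup, an agent would evaluate this continuously as nodes and
# policies come and go, rather than a person pushing software to individual boxes.
for workload in workloads:
    targets = [n["id"] for n in nodes if matches(n["props"], workload["constraints"])]
    print(f"{workload['name']} -> {targets}")
```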
>> I wonder if you could comment, Rob, on the approach you're taking with regard to providing products and services. We've seen a lot of situations where people are essentially packaging traditional compute and storage devices, throwing them over the fence at the edge, and saying, hey, here's our edge computing solution. I'm not saying there's not a place for that; maybe it helps flatten the network and provides a gateway for storing and maybe processing information. But it seems to us that a bottoms-up approach is going to be more appropriate. In other words, you've got engineers who really understand operations technology, and maybe a new breed of developers emerging. How do you see the evolution of products, services, and architectures at the edge?

>> Yeah, so first of all, let me say IBM is taking a pretty broad approach to edge computing. What I just described is IBM Edge Application Manager, which is, if you will, the platform or infrastructure on which we can manage the deployment of workloads out to the edge. Add to that a whole variety of edge-enabled applications that are being created: our global services practices and our AI applications business are all creating variations of their products specifically to address and exploit edge computing and bring that advantage to the business. And of course we also have Global Services consulting, a set of skilled resources who understand the transformations businesses need to go through when they want to take advantage of edge computing, how to think about that in the context of both their journey to the cloud and, in this case, the edge, and then how to go about implementing, delivering, and further managing it. Couple that with the fact that, at the end of the day, you're also going to need the equipment and the devices, whether that's an intelligent automobile or other vehicle, a robot, or a camera, or, if those things are not intelligent but you want to bring intelligence to them, how you augment them with servers and other forms of cluster computing that reside alongside the device. All of that is going to require participation from a very broad ecosystem. So we've been working with partners, whether those are vendors who create hardware, and enabling and certifying that hardware to work with our management infrastructure, or people who bring higher-order services to the table, say data caching, that facilitate the creation of applications, or device manufacturers that are embedding compute in their equipment. All of that is part of our partnership ecosystem. And then finally, I need to emphasize that the world we operate in is so vast, there are so many edge devices in the marketplace and that number is growing so rapidly, and there are so many participants and contributors to this ecosystem we call edge computing, that for all of those reasons we have grounded IBM Edge Application Manager on open source. We created an open source project called Open Horizon, and we've been developing it for about 4.5 years. Just recently the Linux Foundation accepted Open Horizon as a Stage One project within its LF Edge umbrella. So we think this is key to building out an ecosystem of partners who want to both contribute and consume value, and to creating ecosystems around this common idea of how we manage the edge.

>> Yeah, I'm glad you brought up the ecosystem; it's too big for any one company to go it alone. But I want to tap your brain on architectures. There are so many diverse use cases that we don't necessarily see one uber-architecture emerging, but there are some characteristics we think are important at the edge. You mentioned real time or near real time; in many cases it has to be real time, if you think about autonomous vehicles. A lot of the data today is analog, and maybe it doesn't have to be digitized, but much of it will be. It's not all going to be sent back to the cloud, and it may not all have to be persisted. So we've envisioned a sort of purpose-built architecture for certain use cases that can support real time, maybe with ARM-based processors or other alternative processors that can do real-time analytics at the edge and send portions of the data back. How do you see the architectures evolving, as a technologist?
>> Well, certainly one of the things we see at the edge is a tremendous premium being placed on things like energy consumption, so architectures that are able to operate efficiently with less power certainly have an advantage. Clearly, x86 is a dominant architecture in any information technology endeavor, but more specifically at the edge we're seeing the emergence of a lot of ARM-based chips; in fact, I would guess that the majority of edge devices today are now being created with ARM architectures. Some of this is about the underlying architecture of the compute, but also about the augmentation of that compute, the CPUs, with other types of processing units. Whether those are GPUs, of course, where we're seeing a number designed to consume little power and still have a tremendous amount of utility at the edge, or alternate processing-unit architectures designed specifically for AI-model-based analytics, things like TPUs and NPUs, which are very purpose-built for certain kinds of inferencing, we think those are starting to surface and become increasingly important. And then, on the flip side, there are the memory, storage, and network architectures, which are not so exotically different but, at least in terms of capacity, have quite a bit of variability. Specifically, 5G is emerging, and while it's not exactly the same thing as edge computing, there is a lot of synergy between edge and 5G, and the kinds of use cases that 5G envisions are very similar to those we've been talking about in the edge world.

>> Rob, I want to ask you about this notion of programmability at the edge. We've seen the success of infrastructure as code. How do you see programmability occurring at the edge in terms of fostering innovation, and maybe new developer models, or existing developer models, at the edge?

>> Yeah, we found a lot of utility in leveraging what we now think of as cloud computing models. The idea of containerization extends itself very easily into the edge, whether that's running a container in a Docker runtime on an edge device, which is resource-constrained, purpose-built, and needs a very small footprint, or on edge clusters and edge servers, where we might be running a cluster of containers using our Kubernetes platform, OpenShift. The core practices of continuous integration and continuous delivery, what we otherwise think of as DevOps, and the benefits containerization brings to component architectures, the idea of loose coupling, the separation of concerns, and the ability to mix and match different service implementations to compose your application, are all ideas that were matured in the cloud world but have a lot of utility in the edge world. We actually call it edge-native programming, but you can think of it as mostly cloud-native programming, with the further recognition that there are certain things you have to be aware of when you're building for the edge. You have to recognize that resources are limited; unlike the cloud, where we have this notion of infinite resources, at the edge you have finite and constrained resources. You have to worry about latencies and the fact that there is a network separating the different services, and that network can be unreliable: it can introduce its own forms of latency, and it may be bandwidth-constrained. Those are issues you now have to factor into your thinking as you build out the logic of your application components. But I think by building on the cloud-native programming paradigm, we get to exercise all of the skills that have been developed and matured in the cloud world, now for the edge.
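As a small illustration of those edge-native concerns, assuming the uplink can disappear at any time and that only derived results, not raw readings, should leave the site, here is a sketch of a store-and-forward loop. The sensor, the connectivity check, and the payload shapes are all simulated for the example.

```python
# Store-and-forward sketch for an edge service: keep working when the network
# is down, and ship only small derived results upstream when it returns.
import random
from collections import deque

outbox = deque(maxlen=1000)          # bounded buffer: edge resources are finite

def read_sensor() -> float:
    return random.gauss(20.0, 2.0)   # stand-in for a real device reading

def uplink_available() -> bool:
    return random.random() > 0.4     # simulate an unreliable network

def send(batch) -> bool:
    print(f"sent {len(batch)} summaries upstream")
    return True

window = []
for tick in range(50):
    window.append(read_sensor())
    if len(window) == 10:            # summarize locally instead of streaming
        outbox.append({              # every raw reading back to the cloud
            "t": tick,
            "mean": sum(window) / len(window),
            "max": max(window),
        })
        window = []

    if outbox and uplink_available():  # drain the buffer opportunistically
        if send(list(outbox)):
            outbox.clear()

print(f"{len(outbox)} summaries still buffered locally")
```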
>> That makes sense. My last question is around security. I've often, sort of tongue in cheek, said that building a moat around the castle doesn't work anymore; the queen, i.e. the data, has left the castle. She's everywhere. So what about the security model? I feel like the edge is moving so fast. What gives you confidence that we can secure the edge?

>> You know, the edge does introduce some very interesting and challenging concerns with respect to security because, frankly, the compute is out there in the wild. You've got computers in the store, you've got kiosks people walk up to, you have workers at the manufacturing site in the midst of all of this compute capability, and so the attack surface is substantially bigger. That's been a big focus for us: how to validate the integrity of the software that runs out there. But this also takes advantage of one of the key characteristics that edge computing brings to the table. If you think about it, when you've got personal and private information being entered into the system, the more often you move that personal, private data around, and certainly the more you move it to a central location and aggregate it with other data, the more of a target it becomes and the more vulnerable and exposed that data becomes. By using edge computing, which moves the workloads out to the edge where that data is being created, you can in some sense process it right there. You don't have to move it back to a central location, and you don't have to aggregate it. That in itself is a counterbalance to the other security issues we described, by essentially not moving the personal, private data and protecting it by keeping it exactly where it began.

>> You know, Rob, this is an exciting topic. It's a huge opportunity for IBM; Ginni has talked about the trillion-dollar opportunity in hybrid cloud, and the edge is a multibillion-dollar opportunity for IBM. So you've just got to go get it done. I really appreciate you coming on the Cube and sharing your insights on an awesome topic. All the best.

>> Yeah, thank you, Dave.

>> All right, stay safe, and thank you for watching, everybody. This is Dave Vellante for the Cube. This is our coverage of IBM Think 2020, the digital Think. We'll be right back after this short break.