Edouard Bugnion, EPFL | CUBE Conversation


 

(cheerful music) >> Hi, I'm Peter Burris and welcome once again to a Cube Conversation from our beautiful studios here in Palo Alto, California. Today we've got a great guest. Ed Bugnion is a Professor of Computer Science at EPFL, one of the leading Swiss technology institutes, or engineering institutes, in a country known for engineering. Ed, thanks very much for being here. >> Thanks for having me. >> So a lot going on this week, but you're here for a particular reason here in Silicon Valley. Long journey, what are you here for? What's going on? >> Yeah, so I'm back in my old neighborhood in Palo Alto because VMware had its 20th birthday celebration this week, and they were kind enough to invite me, invite all the founders, and so it was a great event. Happy to be here. >> So what was your role in the early VMware? >> I had many, many different roles. I had many different lives. At one point I was the CTO of the company, from the beginning until 2005. >> So this week, a lot of catching up with folks, talking about a 20-year history. Anything particularly interesting? >> I mean, I think the nice thing was that VMware's actually doing great. It's got a great future ahead of itself, but it was actually nice to be able to communicate to the current employees what VMware was 20 years ago. >> And what it meant, so they can see it in perspective. So I actually have an interesting thought, at least I think it's an interesting thought, and that is, I've been doing this for a long time, and if I look back over the last 20 years I think there were two really consequential technology changes. One was virtualization, which obviously VMware popularized in kind of an open way. First of all it created, as you said, a great company, but also without it, it would not have been possible to have the cloud, because the cloud is effectively a whole bunch of virtualized resources. But the second one is flash.
And the reason why I think flash is important is because for the first 50 years of computing we were focused on how do we reliably persist data onto these things called disks. And with flash, now we're thinking about how we quickly deliver data out to applications. I don't see how AI and some of these new types of applications we're talking about, businesses we're talking about, are possible without flash. What do you think? >> Obviously these are two of the big pillars, right? There are a few other pillars, networking being one of them, both within the data center and in the delivery of content; otherwise we would not have the network effect and all the applications that are empowering us, mobile as well. But yes, from a data center infrastructure perspective, virtualization, which, you know, started as a relatively point technology, right? How to run two operating systems on a computer, at the time it wasn't even a laptop, it was a desktop, into being what it is today, has had a profound effect. It forced us to separate the logical from the physical, manage them separately, and think about capacity differently. And then create the flexibility in the provisioning of these new resources applied to computing and networking and storage. >> And flash has also had a similar kind of effect, I mean, would you agree with that as well? >> Yeah, I mean, it's totally changed expectations, right? Before flash and before in-memory, the expectation was that anything that involved data warehousing and analytics was a batch process. You had to wait for it, and the notion that data is available on demand is something that is now taken for granted by users, but it wouldn't have been possible without those new technologies. >> And it's had an enormous impact on system design and system architecture. Another thing we believe at Wikibon is that digital transformation is real.
And by that we mean that the emergence of data as an asset is having a profound consequence on how business thinks, because at the end of the day you institutionalize your work, your value propositions, how you get things done, around the assets which you consider your assets, and as you do that for data, you're going to rebuild your business around data as an asset. But it also suggests that data is going to take a more central role in describing how future architectures are laid out. Now, at EPFL you're doing a lot of research specifically on how data center infrastructure is going to be organized. What do you think? Is the data going to move to the cloud? Is the cloud going to come to the data? What does that mean? >> Well, my research is actually squarely on what's happening within the data center. And in particular whether you can actually make efficient use of the resources in a given data center while meeting service level objectives. How do you make sure that you can respond to user-facing requests fast enough, and at the same time be able to deploy that with the right amount of capacity? >> When you say user, you mean not only a human being but also other system resources, right? (crosstalk) >> The interactive behavior makes things different, right? Because then you actually have an actual time constraint. And it's actually difficult to solve the problem of delivering latency-critical, human real-time responses reliably, and at the same time being able to do that without consuming an exorbitant amount of resources. You know, energy is a big issue. If you can deliver the same amount of actual traffic with less underlying hardware capacity, then it's a win. >> So as we think about data centers going forward, I presume that you believe that data centers are going to change and evolve, but still, in some capacity, be very much in force as a design center for how an enterprise thinks about its resources. Is that accurate?
>> Yeah, I mean, the notion that everything is going to concentrate into a few mega data centers is obviously a little bit of a stretch, right? There will always be a balance. There are economies of scale in these very large mega data centers. The sweet spot, the minimal operating point at which it makes sense to actually build your own data center and to deploy infrastructure, has actually changed, right? A few years ago it actually made sense to put three servers in a basement. That doesn't make any sense today. But for many enterprises it does still make sense to have some amount of capacity on-premise, because it's an economic balance, right? You get to own the assets, but you need to have a certain scale. >> So as you're driving your research about the future of the data center and how it's going to be organized, what role does automation play in conceptualizing what the future of the data center looks like? >> There's an old friend of mine who once said, screwdrivers don't scale. (laughter) If you want to be able to operate anything at any scale, you need to have automation. And virtualization is one of the mechanisms for automation, it's one of the foundational elements, right? You want to clearly separate the act of operating screwdrivers, which you need to do once in a while. You need to add capacity physically in a data center, but you want to make sure that that is completely decoupled from the operations. >> So how do you think, or where do you think, some of the next set of advances are going to come as we think about the data center? You know, given virtualization, given flash, given improvements in networking, where do you see that next round of technological advances coming from? >> Well, if there were no new applications, if there were no digital transformation, the answer would be easy, right? It's not a hard problem. You just keep doing it and it's going to get better over time. >> Just faster. Faster, cheaper.
>> The reality is we have a digital transformation. It is, if anything, accelerating, and so the question is how do you keep up with the growth in complexity? And the reality of virtualization is, whenever you apply it to a particular domain, right, you allow that domain to scale by reducing operational complexity, but part of that operational complexity actually gets shifted elsewhere. In the early days of virtualization at VMware, we virtualized servers, we virtualized clusters of servers. That was really nice, right? You could actually move VMs around, you know, transparently. We obviously pushed a lot of that complexity into storage area networks. And that was fine at small scale. At larger scale it creates, again, an operational issue with storage, because we moved some of that complexity into another subsystem. So it is about chasing which subsystem actually has the pain point and has the complexity at any point in time. >> So as we start chasing these new opportunities, we're also seeing the business start to react as they try to exploit data differently. So the whole concept of technology, not at the infrastructure level per se, but rather as an enabler or as a strategic capability within a business, starts again elevating it up into the organization. We start worrying about security. We start worrying about customer experience and the rate at which we transition, when we substitute technology for labor in a customer experience kind of way. As we think about those types of things, that suggests that the technology professional has to start becoming a little bit more mindful of their responsibilities. What do you envision will be the role of that interplay between a sense of responsibility and engineering as we start to conceive of some of these more complex, rich systems? >> So that's actually one of the big, big transitions, because when I started in tech, what we did effectively had a relatively moderate implication for people's lives, right?
It was basically business process that was being digitized, and we were enabling a more efficient digitization of business processes, but it was sort of left at that. Today, tech is at a stage where we can actually directly impact people's lives, for the better or for worse. And it's very important that as an industry we actually have the appropriate introspection so that we know we're doing things in a sensible way. It might involve actually slowing down the pace of innovation at times, trying to do things in a more deliberate, careful way. Other segments and industries had to do that. You can't, you know, come up with a new drug and simply put it on the market. There's a process. Obviously this is an extreme example, but tech has always been on the other extreme. And the big trend is to find the appropriate level of balance. I live in Switzerland now, and GDPR is all over Europe. It's actually a big change in the mindset, because now you not only have to make sure that you can manage data for yourself as an enterprise, but also that you actually abide by your responsibilities, as an enterprise, as a data processor, for your customers and your users. >> For other people's data. Yeah, and it's interesting, because in many respects medicine has itself been at the vanguard of ethics for a long time, and what we're suggesting is that eventually technology is going to have to start thinking about what the new ethics mean. Now at EPFL, and I'm putting you on the spot, at EPFL are you starting to introduce these concepts of ethics directly into the curriculum? Are you teaching the next generation to think in these terms? >> Yeah, well, actually the first thing we're doing is we're adding into the curriculum for all engineers, not just the computer science crowd but all engineering students, the notion of computational thinking as a freshman class, a mandatory core freshman class. >> Peter Denning.
>> And computational thinking is really about, well, we're positioning it as sort of a third pillar of the engineering foundation, along with math and physics, right? You need math to learn rigor, and you need physics to sort of understand how to model the world. And we're adding computational thinking as a way to, you know, reason about how you can use computational methods to solve engineering problems, because as engineers, all of us will actually use computers all the time. And yet we never really learn what it actually means to apply computational methods and to think about it in those terms. >> So coming back to this notion of the role flash is playing in the industry, we also believe here at Wikibon that we are seeing a significant transformation in the computational model, the basic way that you approach a problem. And so, taking the notion of computational thinking, and I mentioned Peter Denning, whom I've known for a long time, now down at the Naval Postgraduate School's Cebrowski Institute: when you start asking that fundamental question, how do you approach a problem, how are people going to approach problems going forward as a consequence of a new orientation of delivering data? >> Well, Peter Denning obviously is known for the locality principle. And the locality principle says, in effect... >> Great segue, by the way. >> I mean, you need to know what your working set of data is, and you need to have it close to you, you know, to operate, because you cannot have uniform, equal-cost access to all data at all times. It's particularly interesting when you combine flash technologies, from a latency and throughput perspective, with networking technologies and computational technologies. It's about knowing where you actually actuate, at what point do you go from an aggregated model to a disaggregated model? What are the pros and cons of each?
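The working-set idea Denning is known for, and that Bugnion alludes to here, can be sketched concretely. As a hedged illustration (the function name and the toy access trace below are invented for this example), the working set is just the set of distinct objects referenced in the last tau accesses; data in the current working set is what you want close to the compute:

```python
from collections import deque

def working_set_sizes(trace, tau):
    """Denning-style working set: at each step, count the distinct
    objects (pages, blocks, keys...) referenced in the last tau
    accesses of the trace."""
    window = deque()   # the last tau references, in order
    counts = {}        # reference -> occurrences inside the window
    sizes = []
    for ref in trace:
        window.append(ref)
        counts[ref] = counts.get(ref, 0) + 1
        if len(window) > tau:
            old = window.popleft()
            counts[old] -= 1
            if counts[old] == 0:
                del counts[old]
        sizes.append(len(counts))  # distinct objects in the window
    return sizes

# A trace with strong locality: a small hot set re-referenced often,
# so the working set stays far smaller than the total data touched.
sizes = working_set_sizes([1, 2, 1, 2, 1, 3, 1, 2, 1, 2], tau=4)
```

A trace with poor locality would instead show the working set growing with tau, which is exactly the case where "equal-cost access to all data" becomes unaffordable.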
But fundamentally, you know, recognizing that locality does exist and locality matters is fundamental to the scaling of the infrastructure. Obviously these are the problems that we infrastructure people worry about, so that from an application perspective, and from a policy and reflection perspective, we don't have to worry about those. >> And so the application people don't, but especially the business people, can focus more on customer experience and those types of things. Coming back to this notion of locality and tying it back into GDPR, for example, it seems as though the constraints of locality are not just latency and cost, but they are also increasingly in human terms, in ethical terms, including regulatory principles but also intellectual property principles. When you start to think about how, again, this notion of the data center gets organized, where we probably increasingly start organizing data centers around the natural location of data, and I don't mean geographic, I mean the natural location of data, do you foresee a new set of constraints starting to influence, not just latency, not just cost, but some of these other, more human concerns starting to impact how we conceive of where we put data centers and what we put in those data centers? >> Well, there are two different aspects to the answer. One is, data centers consume energy. And so the location of the data center from an energy perspective will matter and will keep mattering, because we need to be very conscious about the overall global footprint of these data centers. And then the other consideration, which is completely orthogonal, is that natural boundaries also matter, and the notion of sovereignty, and obviously I'm not a lawyer, I don't know if you're a lawyer. >> Nope. >> But the notion of sovereignty is rooted in the notion of national boundaries, right? It applies to land. It applies to water. It applies to airspace. >> Laws, culture.
>> And so the question of how it applies to data is a really important one, right? Does it matter where the data is actually stored? Can I reach into some other country's data? These questions are completely open at this point. They must be resolved. I think there is a global reflection among the industry right now that the time has come for both governmental entities and the industrial players to sort of take a position that this problem must be addressed. How it will be addressed, that I don't know. >> Well, so I have a related perspective; I'm not going to answer how it's going to be addressed, but security is a crucially important feature of how we think about computing going forward, specifically data security. And it seems to me as though, if we think about these data assets and how we institutionalize work around these assets, security is a significant feature of how we actually conceive of and create data assets, because effectively it is through security that we privatize data. The laws and whatnot that we put on things turn into policy, which turns into technology for privatizing things. So talk a little bit about how you foresee the future of security, data security, technology security and data coming together as people think about the role data is going to play in our lives. >> So security is, in a way, a very technical way of looking at the problem, right? Not everybody, you know, outside of tech actually appreciates what we all mean by security, and within tech sometimes we mean different things when we talk about security. One of the themes we're trying to talk about is the notion that we need trust as a society, irrespective of how it's done technologically. You need trust. We know how to establish trust in the physical world. We've been doing this for a few centuries, or millennia. We need to learn how to establish trust in the digital world.
So that's actually one of the initiatives we have right now at EPFL: establishing a center for digital trust, whose goal is to basically ask the question of how do you actually have the same level of trust between players in the digital world that we can establish through known means, that we've learned to experience over centuries, in the physical world? It's not an easy problem. >> No, it's not. So I've got one more question for you. Imagine you're writing a book in 2035 and you're writing a history of computing. You're looking back and you're saying, "Wow, look at all these things that happened." And we've already discussed some of the salient inflection points within the industry, but if we think about an inflection point between 2018 and 2035, what do you think, in a future-perfect sense, looking back, what was the inflection point? When did it occur in the next 17 years? >> Well, if you're an optimist, then the path between today and 2035 was a positive one, free of any hardship or complications or unintended consequences. If you're a realist, we have to anticipate that there are some unanticipated consequences of tech, and emergent properties of tech, and where those evolutions will take us. I mean, I'm not a futurist, right? In my own research agenda, I try to look five years out as to where things might go at a particular layer. If we look at the emergent properties, the emergent behavior, I think they're very hard to anticipate. We're just collectively trying to learn right now the side effects of social networking on how we interact as a society, as a democracy. It's very difficult to imagine where we'll go between now and 2035. There are a few things that are obvious, and I'm going to just state what is obvious: the digital transformation is accelerating. The importance of data is growing.
The existential threat associated with the misuse of data is going to be greater and greater, especially as we digitize, you know, our human lives; our biological lives get digitized, for example. That's going to have a huge impact. And then the digital transformation is also going to change jobs and change entire industries. Automation, AI, is going to have a profound effect. How fast that effect will be, I think, is the open question. History has always been an evolution of technology. I think what may be different this time is that it's operating on a global scale, faster than before. >> And it affects a lot more people. So in certain respects it's especially crucial over the next few years, as you said, the key word is emergent. There are going to be a lot of emergent properties that come out of technologies, accelerating technology that programs itself, for example, those types of things. And so to kind of summarize, it's that fine line between too much control and too much freedom, and staying right there so we get the innovation while at the same time we have some degree of say over how it actually behaves. Is that kind of where we're going to be thinking? >> Yeah, I mean, that's one way to look at this. Obviously regulation is not the answer. The other way to solve these problems is to actually have the appropriate products. I'll just give an example. Database management systems were not designed with data privacy in mind. They were designed to process data. Now GDPR comes along, and what does it mean if I have a SQL database and I also need to be GDPR compliant? If you think about it, there's somewhat of a mismatch between the two if you look at it purely from a technical perspective. Five years from now, does it make sense to have a GDPR-by-design database, whatever that means, right?
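The mismatch Bugnion describes can be made concrete with a small sketch. As a loose illustration (the schema, table names, and data below are invented for the example), a conventional SQL database has no built-in notion of a "data subject", so honoring a GDPR-style right-to-erasure request depends on the schema having been designed for it, here via cascading foreign keys:

```python
import sqlite3

# Hypothetical two-table schema: a database designed to process data,
# not to track which rows belong to which person.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # FK enforcement is off by default in SQLite
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         user_id INTEGER REFERENCES users(id)
                                 ON DELETE CASCADE,
                         item TEXT);
""")
conn.execute("INSERT INTO users  VALUES (1, 'alice@example.com')")
conn.execute("INSERT INTO orders VALUES (10, 1, 'widget')")

def erase_data_subject(conn, user_id):
    """Sketch of an erasure request: ON DELETE CASCADE lets one delete
    remove the subject's rows everywhere they are referenced. Without
    such schema-level support, personal data scattered across tables
    would have to be tracked down table by table."""
    conn.execute("DELETE FROM users WHERE id = ?", (user_id,))
    conn.commit()

erase_data_subject(conn, 1)
remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]  # 0: cascade removed the order
```

A "GDPR-by-design" database, whatever form it takes, would presumably make this kind of subject-centric deletion and retention a first-class operation rather than a schema convention.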
Maybe. I haven't thought about it too deeply, but it's one of those examples where you have a new set of constraints, and I think as an industry we need to take them as parameters. And what we've been consistently very good at in the tech industry is to actually take these constraints and turn them into products that people know how to operate and deploy. >> Excellent. Ed Bugnion, Computer Science Professor at EPFL, thank you very much for being on theCUBE. >> Thanks for having me. It was a pleasure. >> Once again, Peter Burris, Cube Conversation. Thanks for watching. See you again. (cheerful music)

Published Date : Apr 17 2018



Edouard Bugnion, EPFL - Second Segment | CUBE Conversation


 

(bright, upbeat music) >> Hi, I'm Peter Burris, and welcome to another CUBE Conversation. We've got another great guest this week, Ed Bugnion, who's a professor of computer science at EPFL, a leading technical university in Switzerland. Ed, welcome to theCUBE. >> Thanks for having me. >> So Ed, at EPFL you are leading research on the future of the data center. What I want to do is talk about the near term of the data center, 'cause a lot of people have questions about what's going to happen over the next few years. Let's posit that the data center's not going to go away any time soon, and instead talk about inside the data center. What's going to happen with the organization of technology inside data centers? >> Well, it's always been a chase about how to reduce complexity. You always start with basically having a number of moving parts, and then the business requirements keep increasing, and at some point the complexity just overwhelms the operational model. So I was involved in virtualization; I've been working in virtualization for close to 20 years. Right, virtualization was about reducing the complexity for the servers, basically moving from having to manage servers one by one, separating the physical from the logical, and sort of solving that problem. Now what we actually did, as a side effect, is we pushed the remaining aspects of that complexity elsewhere. The servers were mobile, they were flexible, they could vMotion across a cluster, but they had to be stored on a storage area network, so as a result we ended up having this entire operational complexity around the management of storage area networks for very large amounts of data, and as virtualization became more and more important, that became more of an issue. So then I actually got involved in networking, and networking was about the fact that a decade or so ago there was a proliferation of incompatible networks inside the data center.
I was involved at Cisco in unified, converged networking with the UCS product, so we could do both storage and regular TCP/IP networking on the same wire. This was about reducing complexity, but we didn't address all the complexity problems; we created other bottlenecks. So it's always this ever-shifting issue of dealing with scale. >> So as we virtualized the servers, we virtualized the storage, and now we're virtualizing the network, that suggests that we can start bringing these things together in new, novel ways. Have I got that right? >> Yeah, so we first virtualized the network access, right, the storage access and the SANs, and now, obviously, with hyperconvergence, we're disaggregating storage and rethinking storage because of these new requirements. That solves a number of the problems, right? It's actually now proving out to be sort of an industry-wide accepted model that we move away from storage arrays into hyperconverged models. But hyperconvergence alone, if the only thing you're doing is moving blocks around, is again only solving part of the problem. You still need to worry about DR, you still need to worry about backup, you still need to worry about offsite. You still need to worry about locality, right, because completely remote storage is a gross violation of the locality principle, and the locality principle actually does come back and matter at some point in time. So it's really about finding the balance between the speeds and feeds, what needs to be co-located and what can be disaggregated, and then what use cases must be addressed.
One is the physical building blocks, and the other one is the operational consoles, right? And for the physical building blocks, the number of people providing them is small and, if anything, shrinking. If you think about the operational consoles, the different panels, right, if you think about the different software companies providing technology, they actually themselves offer different panels to different constituencies. The silos have not completely disappeared in the IT operational model today. Communication is much better, tools facilitate this communication, but the silos are not completely gone. So you still have these different panels; they can come from one vendor or different vendors, and the same vendor can actually provide multiple capabilities, but the theme is, do you actually want to move away from having to deal with the complexity of completely different universes, toward much more coherent elements that talk to each other? >> So if we have this coherence, presumably that means these more coherent elements can actually support each other in providing, as you said earlier, some of the crucial features that a complex, large, scalable system needs. You mentioned backup and restore, for example. The requirements of what constitutes a system used to be scale and compute, and now all those other issues, from an automation and business requirement standpoint, are increasingly impinging upon what we regard as design, like having data protection. How do those new constraints start to impact how folks think about what to buy, what to use now? >> Yeah, it's actually fascinating that tape, right, as we know it and as we knew it, which largely has not changed, right, is actually still present.
Tape obviously is a sequential approach; it's not by any stretch the easiest technology to operate, and yet it still has sort of a presence. So, moving away from this, the interesting observation is that you can now move away from these classic approaches of backup to object-based solutions. These object-based solutions, provided that you have the appropriate kind of connectivity, can either be offsite or onsite, and it's a very fluid and transparent model. And these object-based solutions are actually now designed to scale, and can be used either to store primary data and streamed data, or to store backups of data, and so this convergence, using object storage for both what is backup and what is live data, is one of the interesting themes. >> So we're talking about convergence of the hardware elements, but now we're also talking about convergence of the services and the capabilities associated with them, all within the same console, all within the same platform, utilizing specialization where it makes sense. Have I got that right? >> Yes, I mean, you obviously have different use cases, right? One of the things it always goes back to is the question of what is the API, right? If you have an API and it is really, you know, gets and sets on an object model, designed to operate on transactional objects, right, you effectively are in a particular mindset. If you actually want to guarantee retention, you actually want a different set of APIs, right? One of the things that's really important is to make sure that the data is actually safe, and that the API won't permit a catastrophic misuse and deletion of the data, for example. >> So if there's one bit of advice you can offer someone who's sitting in a data center today and thinking about what they should be doing to increase the returns on their data assets and what they provide to the business, what would that one thing you'd leave them with be? >> It almost depends on where you start from, right?
>> Peter: Okay, good point. >> But having said that, there are two general approaches. One is the incremental approach, where you try to catch up with the technology trends. The other is to ask: what are the problems I'm actually trying to solve, purely from an infrastructure perspective, and how do I solve them in a reasonable timeframe? If you think about the pros and cons between the two approaches: the first approach is pragmatic, things will be better this quarter than last quarter, but you may never be able to catch up. The other approach requires a little more thinking, sometimes process re-engineering, sometimes thinking about things differently, changing the operational model and how your teams operate within the IT organization, but sometimes it actually delivers the right solution. >> And we do have a model for how to do this: the big hyper-scalers are doing exactly that second approach, and it's having a consequential impact on the industry, isn't it? >> Yeah, well, the storage industry has always been a fascinating industry. It was static for a few years, and it's now extremely dynamic; a lot of companies went public in the storage space over the last few years, as we all know. They went public because there was new technology, right? Flash was transforming the landscape. Now object, hyperconverged, and post-hyperconverged solutions are also completely transforming the landscape, because now we think about storage differently; the paradigm is no longer the same. >> Thinking about computing entirely differently. Storage plus everything else. >> Well, at the end of the day, this is purely infrastructure, right? >> Right. >> And infrastructure is never for infrastructure's sake. Infrastructure is there to deliver new capabilities, new applications.
The combination of, you know, phenomenal increases in primary memory, in flash memory, and in NVMe, all these technologies are transforming our expectations with respect to responsiveness and access to data. And then there are the changes on the compute side, and the huge specialization going on in hardware, in ASICs, so that we know how to process data in a much more efficient way. We haven't talked about AI yet, but fundamentally, when you think about all these AI-based improvements, it is about being able to put massive amounts of computational capability onto massive amounts of data. >> So you've been part of VMware, you've been part of Nuova, you've been part of a lot of different companies. If you look out, what types of foci, what centers of innovation in the valley do you look to for leadership? (laughing) >> The nice thing is, I was in the valley, I was in the industry, and now I'm. >> And now you're out. (laughing) >> So I actually don't have to take a position. It's actually nice to be able to look at it much more from a principled perspective rather than through the lens of which agenda each of the existing players is trying to push. They each have legitimate agendas, because they're driving their business and the evolution of their business for their customers and trying to deliver value to their customers. Obviously the customers have to choose. When I look at it from my perspective, both academically and simply from an IT perspective, as I operate a fair amount of IT at EPFL, it's really this notion of: what problems are we trying to solve? And do the boundaries that we traditionally had between the classic large vendors still make sense in this hyperconverged environment? >> All right, well, Ed Bugnion, Professor of Computer Science at EPFL, thanks again for being on theCUBE. This is Peter Burris, and once again, a great CUBE Conversation. Hope to see you soon. (bright upbeat music)

Published Date : Apr 17 2018
