

Breaking Analysis: Google's PoV on Confidential Computing


 

>> From theCUBE Studios in Palo Alto and Boston, bringing you data-driven insights from theCUBE and ETR. This is Breaking Analysis with Dave Vellante. >> Confidential computing is a technology that aims to enhance data privacy and security by providing encrypted computation on sensitive data and isolating data and apps in a fenced-off enclave during processing. The concept of confidential computing is gaining popularity, especially in the cloud computing space, where sensitive data is often stored and of course processed. However, there are some who view confidential computing as an unnecessary technology and a marketing ploy by cloud providers aimed at calming customers who are cloud phobic. Hello and welcome to this week's Wikibon CUBE Insights powered by ETR. In this Breaking Analysis, we revisit the notion of confidential computing, and to do so, we'll invite two Google experts to the show. But before we get there, let's summarize briefly. There's not a ton of ETR data on the topic of confidential computing; I mean, it's a technology that's deeply embedded into silicon and computing architectures. But at the highest level, security remains the number one priority being addressed by IT decision makers in the coming year, as shown here. And this data is pretty much across the board by industry, by region, by size of company. I mean, we dug into it, and the only slight deviation from the mean is in financial services. The second and third most cited priorities, cloud migration and analytics, are noticeably closer to cybersecurity in financial services than in other sectors, likely because financial services has always been hyper security conscious, but security is still a clear number one priority in that sector. The idea behind confidential computing is to better address threat models for data in execution. Protecting data at rest and data in transit have long been a focus of security approaches, but more recently, silicon manufacturers have introduced architectures that separate data and applications from the host system. ARM, Intel, AMD, Nvidia and other suppliers are all on board, as are the big cloud players. Now, the argument against confidential computing is that it narrowly focuses on memory encryption and doesn't solve the biggest problems in security. Multiple system images, updates, different services and the entire code flow aren't directly addressed by memory encryption. Rather, to truly attack these problems, many believe that OSs need to be re-engineered with the attacker and hacker in mind. There are so many variables, and at the end of the day, critics say the emphasis on confidential computing made by cloud providers is overstated and largely hype. This tweet from security researcher Rodrigo Branco sums up the sentiment of many skeptics. He says, "Confidential computing is mostly a marketing campaign for memory encryption. It's not driving the industry towards the hard open problems. It is selling an illusion." Okay. Nonetheless, encrypting data in use and fencing off key components of the system isn't a bad thing, especially if it comes with the package essentially for free. 
There has been a lack of standardization and interoperability between different confidential computing approaches, but the Confidential Computing Consortium was established in 2019, ostensibly to accelerate the market and influence standards. Notably, AWS is not part of the consortium, likely because the politics of the consortium were a conundrum for AWS, because the base technology defined by the consortium is seen as limiting by AWS. This is my guess, not AWS' words. But I think joining the consortium would validate a definition which AWS isn't aligned with. And two, it's got a lead with its Annapurna acquisition. It was way ahead with ARM integration, and so it probably doesn't feel the need to validate its competitors. Anyway, one of the premier members of the Confidential Computing Consortium is Google, along with many high-profile names, including Arm, Intel, Meta, Red Hat, Microsoft, and others. And we're pleased to welcome two experts on confidential computing from Google to unpack the topic. Nelly Porter is Head of Product for GCP Confidential Computing and Encryption, and Dr. Patricia Florissi is the Technical Director for the Office of the CTO at Google Cloud. Welcome Nelly and Patricia, great to have you. >> Great to be here. >> Thank you so much for having us. >> You're very welcome. Nelly, why don't you start, and then Patricia, you can weigh in. Just tell the audience a little bit about each of your roles at Google Cloud. >> So I'll start. I own a lot of interesting activities in Google, and again, it's security, or infrastructure security, that I usually own. We are talking about encryption, end-to-end encryption, and confidential computing is a part of that portfolio. An additional area that I contribute, together with my team, to Google and our customers is secure software supply chain, because you need to trust the software that is operating in your confidential environment to have an end-to-end story, to believe that your software and your environment are doing what you expect. That's my role. >> Got it. Okay, Patricia? >> Well, I am a Technical Director in the Office of the CTO, OCTO for short, in Google Cloud. And we are a global team; we include former CTOs like myself and senior technologists from large corporations, institutions, and a lot of successful startups as well. And we have two main goals. First, we work side by side with some of our largest, more strategic or most strategic customers, and we help them solve complex engineering and technical problems. And second, we advise Google and Google Cloud engineering and product management on emerging trends and technologies to guide the trajectory of our business. We are a unique group, I think, because we have created this collaborative culture with our customers. And within OCTO I spend a lot of time collaborating with customers and the industry at large on technologies that can address privacy, security, and sovereignty of data in general. >> Excellent. Thank you for that, both of you. Let's get into it. So Nelly, what is confidential computing from Google's perspective? How do you define it? >> Confidential computing is a tool, one of the tools in our toolbox. And confidential computing is how we help our customers to complete this very interesting end-to-end lifecycle of their data. When customers bring data to cloud and want to protect it, they protect it as they ingest it to the cloud, and they protect it at rest when they store data in the cloud. 
But what was missing for many, many years is the ability for us to continue protecting our customers' data and workloads when they run them. And again, because data is not brought to cloud to sit in a huge graveyard, we need to ensure that this data is actually indexed, that insights are driven and drawn from this data. You have to process this data, and confidential computing is here to help. Now we have end-to-end protection of our customers' data when they bring their workloads and data to cloud, thanks to confidential computing. >> Thank you for that. Okay, we're going to get into the architecture a bit, but before we do, Patricia, why do you think this topic of confidential computing is such an important technology? Can you explain? Do you think it's transformative for customers, and if so, why? >> Yeah, I would maybe like to use one thought, one way, one intuition behind why confidential computing matters: because at the end of the day, it reduces more and more the customer's trust boundaries and the attack surface. It's about reducing that periphery, the boundary, in which the customer needs to mind about trust and safety. And in a way it is a natural progression that you're using encryption to secure and protect data: in the same way that we are encrypting data in transit and at rest, now we are also encrypting data while in use. And among other benefits, I would say one of the most transformative ones is that organizations will be able to collaborate with each other and retain the confidentiality of the data. And that is across industry. Even though it's very beneficial for highly regulated industries, it applies to all industries. If you look at financing, for example, where bankers are trying to detect fraud, and specifically double financing, where a customer is actually trying to get financing on an asset, let's say a boat or a house, and then goes to another bank and gets another loan on that asset. Now bankers would be able to collaborate and detect fraud while preserving confidentiality and privacy of the data. >> Interesting, and I want to understand that a little bit more, but I've got to push you a little bit on this, Nelly, if I can, because there's a narrative out there that says confidential computing is a marketing ploy, I talked about this up front, by cloud providers that are just trying to placate people that are scared of the cloud. And I'm presuming you don't agree with that, but I'd like you to weigh in here. The argument is confidential computing is just memory encryption, it doesn't address many other problems, and it is overhyped by cloud providers. What do you say to that line of thinking? >> I absolutely disagree with this statement, as you can imagine, Dave. But most importantly, we are mixing multiple concepts, I guess. And exactly as Patricia said, we need to look at the end-to-end story, not just the mechanism of how confidential computing is trying to execute and protect customers' data, and why it's so critically important. Because what confidential computing was able to do, in addition to isolating our tenants in the multi-tenant environments the cloud offers, is to offer additional, stronger isolation; we call it cryptographic isolation. It's why customers will have more trust toward other customers, the tenants running on the same host, but also toward us, because they don't need to worry about threats and malicious attempts to penetrate the environment. 
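To make the double-financing example Patricia gave a moment ago concrete, here is a minimal, illustrative Python sketch. All names and data are hypothetical, and the "confidential environment" is only modeled as an ordinary function; in practice the isolation would come from the attested hardware environment, not from anything in this snippet.

```python
# Conceptual sketch only: the "enclave" here is a plain function standing in for
# code that would run inside a confidential VM; the isolation itself comes from
# the hardware, not from this code.
from hashlib import sha256
from typing import Iterable, Set

def _fingerprint(asset_id: str, salt: bytes) -> str:
    """Salted hash so raw asset identifiers never leave each bank in the clear."""
    return sha256(salt + asset_id.encode()).hexdigest()

def detect_double_finance(bank_a_assets: Iterable[str],
                          bank_b_assets: Iterable[str],
                          shared_salt: bytes) -> Set[str]:
    """Return fingerprints of assets financed by both banks.
    In the real setting this join would execute inside the attested
    confidential environment, so neither party sees the other's full book."""
    a = {_fingerprint(x, shared_salt) for x in bank_a_assets}
    b = {_fingerprint(x, shared_salt) for x in bank_b_assets}
    return a & b

# Example: the boat "HULL-7781" was pledged to both banks.
overlap = detect_double_finance(
    ["HULL-7781", "HOUSE-0042"],
    ["HULL-7781", "HOUSE-9910"],
    shared_salt=b"agreed-out-of-band",
)
print(len(overlap))  # 1 -> one asset appears in both portfolios
```

The point of the sketch is only the shape of the computation: each bank contributes its records, and only the overlap, the assets pledged twice, leaves the protected boundary.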
So what confidential computing is helping us to offer our customers is stronger isolation between tenants in this multi-tenant environment, but also, incredibly important, stronger isolation of our customers, the tenants, from us. We are also writing code, we are also software providers, and we also make mistakes or have some zero days, sometimes introduced by us, sometimes introduced by our adversaries. But what I'm trying to say is that by creating this cryptographic layer of isolation between us and our tenants, and among those tenants, we are really providing meaningful security to our customers and eliminating some of the worries that they have running in multi-tenant spaces, or even collaborating together on very sensitive data, knowing that this particular protection is available to them. >> Okay, thank you. Appreciate that. And I think malicious code is often a threat model missed in these narratives. You know, operator access. Yeah, maybe I trust my cloud provider, but if I can fence off your access, even better; I'll sleep better at night. Separating code from the data, everybody, ARM, Intel, AMD, Nvidia and others, they're all doing it. I wonder, Nelly, if we could stay with you and bring up the slide on the architecture. What's architecturally different with confidential computing versus how operating systems and VMs have worked traditionally? We're showing a slide here with some VMs, maybe you could take us through that. >> Absolutely, and Dave, the whole idea for Google's, and now the industry's, way of dealing with confidential computing is to ensure that three main properties are actually preserved. Customers don't need to change their code. They can operate in those VMs exactly as they would with normal, non-confidential VMs. But to give them this opportunity of lift and shift, not changing their apps, while performing with very, very low latency and scaling as any cloud can, is something that Google actually pioneered in confidential computing. I think we need to open up and explain how this magic was actually done, and as I said, the whole entire system had to change to be able to provide this magic. I would start with the concept of root of trust, where we ensure that this machine, the whole entire host, has an integrity guarantee. It means nobody is changing my code at the lowest level of the system. We introduced this in 2017; it's called Titan. It's our specific ASIC, on every single motherboard that we have, that ensures that your low-level firmware, your actual system code, your kernel, the most privileged software, is properly configured and not changed, not tampered with. We do it for everybody, confidential computing included. But for confidential computing, what we had to change is that we bring in AMD, or future silicon vendors, and we have to trust their firmware, their way of dealing with our confidential environments. And that's why we have an obligation to validate the integrity not only of our software and our firmware, but also the firmware and software of our vendors, the silicon vendors. So when we boot this machine, as you can see, we validate that the integrity of all of this system is in place. It means nobody is touching it, nobody is changing it, nobody is modifying it. But then we have this concept of the AMD Secure Processor. It's a special ASIC that generates a key for every single VM that our customers will run, or every single node in Kubernetes, or every single worker thread in our Hadoop or Spark capability. 
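A minimal sketch of the measured-boot idea behind the root of trust Nelly just described: each stage of the boot chain is hashed into a running measurement, TPM-PCR style, so any change to firmware, kernel, or configuration yields a different final value. The component names and values below are made up for illustration; the per-VM key handling she turns to next is sketched separately after the following passage.

```python
# Toy model of measured boot: each boot stage is hashed into a running
# "measurement register" (the way a TPM PCR is extended), and the final value
# only matches the expected one if nothing in the chain was altered.
import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    """PCR-style extend: new value = H(old value || H(component))."""
    return hashlib.sha256(register + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components: list[bytes]) -> bytes:
    register = bytes(32)  # starts at all zeros
    for blob in components:
        register = extend(register, blob)
    return register

firmware, kernel, config = b"fw-image-v42", b"kernel-5.15", b"boot-config"
golden = measure_boot([firmware, kernel, config])

# A single changed byte anywhere in the chain produces a different final value,
# which is what lets the root of trust refuse to run (or release secrets to)
# a modified host.
tampered = measure_boot([firmware, b"kernel-5.15-evil", config])
assert golden != tampered
```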
We offer all of that, and those per-VM keys are not available to us. It's the best case ever in the encryption space, because when we are talking about encryption, the first question that I receive all the time is, "Where's the key? Who will have access to the key?" Because if you have access to the key, then it doesn't matter whether the data was encrypted or not. But the case in confidential computing, and why it's such revolutionary technology, is that we, the cloud providers, don't have access to the keys. They're sitting in the hardware, and they are fed to the memory controller. And it means that when the hypervisor, which also knows about these wonderful things, says, "I need to get access to the memory of this particular VM," it cannot decrypt the data; it doesn't have access to the key, because those keys are random, ephemeral, and per VM, but most importantly, held in hardware and not exportable. And it means you will now be able to have this very interesting world in which other customers, or cloud providers, will not be able to get access to your memory. And what we do, again, as you can see, our customers don't need to change their applications. Their VMs run exactly as they should run. And for what you're running in the VM, you actually see your memory in the clear, it's not encrypted. But God forbid somebody tries to read it from outside of my confidential box, no, no, no, you will not be able to do it; you'll only see ciphertext. And that's exactly what the combination of these multiple hardware pieces and software pieces has to do. So the OS is also modified, and it is modified in such a way as to provide integrity. It means even the OS that you're running in your VM box is not modifiable, and you as the customer can verify that. But the most interesting thing, I guess, is how to ensure the super performance of this environment, because you can imagine, Dave, that this adds overhead, additional time, additional latency. We're able to mitigate all of that by providing an incredibly interesting capability in the OS itself. So our customers get no changes needed, fantastic performance, and scale as they would expect from cloud providers like Google. >> Okay, thank you. Excellent, appreciate that explanation. So you know, again, the narrative on this is, well, you've already given me guarantees as a cloud provider that you don't have access to my data, but this gives another level of assurance. Key management, as they say, is key. Now humans aren't managing the keys, the machines are managing them. So Patricia, my question to you is, in addition to, let's go pre-confidential computing days, what are the sort of new guarantees that these hardware-based technologies are going to provide to customers? >> So if I am a customer, I am saying I now have a full guarantee of confidentiality and integrity of the data and of the code. If you look at code and data confidentiality, the customer cares because they want to know whether their systems are protected from outside or unauthorized access, and we covered with Nelly that they are. Confidential computing actually ensures that the application and data internals remain secret. The code is actually looking at the data only in memory, decrypting the data with a key that is ephemeral, per VM, and generated on demand. Then you have the second point, where you have code and data integrity, and now customers want to know whether their data was corrupted, tampered with, or impacted by outside actors. And what confidential computing ensures is that application internals are not tampered with. 
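A small, conceptual Python sketch of the property both speakers just described: each VM gets its own random, ephemeral key, the key stays with the "hardware" (modeled here as a SecureProcessor object that never exports it), data is readable only through that key, and anything reading raw memory from outside sees ciphertext. The class and method names are invented for illustration; only the AES-GCM primitive from the `cryptography` package is real.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class SecureProcessor:
    """Stand-in for the silicon that generates and holds per-VM keys."""
    def __init__(self):
        self._keys = {}          # never exported by the real hardware

    def new_vm_key(self, vm_id: str) -> None:
        # Random, ephemeral, per-VM key, generated on demand.
        self._keys[vm_id] = AESGCM.generate_key(bit_length=256)

    def write_memory(self, vm_id: str, plaintext: bytes) -> tuple[bytes, bytes]:
        nonce = os.urandom(12)
        return nonce, AESGCM(self._keys[vm_id]).encrypt(nonce, plaintext, None)

    def read_memory(self, vm_id: str, nonce: bytes, ciphertext: bytes) -> bytes:
        return AESGCM(self._keys[vm_id]).decrypt(nonce, ciphertext, None)

sp = SecureProcessor()
sp.new_vm_key("vm-1")
nonce, page = sp.write_memory("vm-1", b"customer record: account 1234")

# Inside the VM (going through the memory controller and the key), data is clear:
print(sp.read_memory("vm-1", nonce, page))
# Outside the VM (a hypervisor dumping raw memory), only ciphertext is visible:
print(page[:16].hex())
```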
So the application, the workload as we call it, that is processing the data has also not been tampered with and preserves integrity. I would also say that this is all verifiable: you have attestation, and this attestation actually generates a log trail, and the log trail provides proof that integrity was preserved. And I think it also offers a guarantee of what we call sealing, this idea that the secrets have been preserved and not tampered with: confidentiality and integrity of code and data. >> Got it. Okay, thank you. Nelly, you mentioned, I think I heard you say, that for the applications it's transparent, you don't have to change the application, it just comes for free essentially. And we showed some various parts of the stack before. I'm curious as to what's affected, but really more importantly, what is specifically Google's value add? How do partners participate in this, the ecosystem, or maybe said another way, how does Google ensure the compatibility of confidential computing with existing systems and applications? >> A fantastic question, by the way. And it's a very difficult and definitely complicated world, because to be able to provide these guarantees, a lot of work was done by the community. Google very much operates in the open. So again, for our operating system, we are working in the operating system repositories with OS vendors to ensure that all the capabilities that we need are part of their kernels, are part of their releases, and available for customers to understand and even explore, if they have fun exploring a lot of code. We have also modified, together with our silicon vendors, the kernel, the host kernel, to support this capability, and it means working with this community to ensure that all of those patches are there. We also worked with every single silicon vendor, as you've seen, and that's where I feel Google contributed quite a bit in this world. We moved our industry, our community, our vendors to understand the value of easy-to-use confidential computing and of removing barriers. And now, I don't know if you noticed, Intel is following the lead and also announcing Trust Domain Extensions, a very similar architecture, and no surprise, it's again a lot of work done with our partners to convince them, work with them, and make this capability available. The same with ARM: this year, actually last year, ARM announced its future design for confidential computing; it's called the Confidential Compute Architecture. And it's also influenced very heavily by similar ideas from Google and the industry overall. There is also a lot of work in the Confidential Computing Consortium that we are doing, for example, simply to mention, to ensure interop, as you mentioned, between the different confidential environments of cloud providers. We want to ensure that they can attest to each other, because when you're communicating with different environments, you need to trust them. And if it's running on different cloud providers, you need to ensure that you can trust your receiver when you are sharing your sensitive data, workloads, or secrets with them. So we are coming together as a community, and we have this attestation SIG, the community-based system that we want to build and influence, and we work with ARM and every other cloud provider to ensure that they can interop. And it means it doesn't matter where confidential workloads are hosted; they can exchange data in a secure, verifiable way that is controlled by customers. 
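A minimal sketch of the attestation handshake this interop work implies: the confidential environment signs a report of what it is running, and the relying party verifies the signature and the expected measurement before trusting it with data. The key handling is deliberately simplified; in practice the signing key chains back to the silicon vendor and platform, and the report format is far richer. All identifiers below are made up.

```python
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In reality this key pair is rooted in the silicon vendor / platform; here it
# is generated locally purely for illustration.
platform_key = Ed25519PrivateKey.generate()
platform_pub = platform_key.public_key()

def produce_report(workload_measurement: str, nonce: str) -> tuple[bytes, bytes]:
    """The environment's side: emit a signed claim about what is running."""
    report = json.dumps({"measurement": workload_measurement, "nonce": nonce},
                        sort_keys=True).encode()
    return report, platform_key.sign(report)

def verify_report(report: bytes, signature: bytes,
                  expected_measurement: str, expected_nonce: str) -> bool:
    """The relying party's side: check origin, freshness, and identity."""
    try:
        platform_pub.verify(signature, report)      # proves origin and integrity
    except InvalidSignature:
        return False
    claims = json.loads(report)
    return (claims["measurement"] == expected_measurement
            and claims["nonce"] == expected_nonce)  # proves it is the intended workload

report, sig = produce_report("sha256:ab12...workload", nonce="session-77")
print(verify_report(report, sig, "sha256:ab12...workload", "session-77"))  # True
print(verify_report(report, sig, "sha256:ffff...other", "session-77"))     # False
```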
And to do it, we need to continue what we are doing: working in the open and contributing our ideas and the ideas of our partners to this role, to become what we see confidential computing has to become. It has to become a utility. It doesn't need to be so special, but that's what we want it to become. >> Thank you for that explanation. Let's talk about data sovereignty, because when you think about data sharing, you think about data sharing across the ecosystem and different regions, and then of course data sovereignty comes up. Typically public policy lags the technology industry, and sometimes that's problematic. I know there's a lot of discussion about exceptions, but Patricia, we have a graphic on data sovereignty. I'm interested in how confidential computing ensures that data sovereignty and privacy edicts are adhered to, even if they're maybe out of alignment with the pace of technology. One of the frequent examples is when you delete data, can you actually prove the data is deleted with a hundred percent certainty? You've got to prove that, and a lot of other issues. So looking at this slide, maybe you could take us through your thinking on data sovereignty. >> Perfect. So for us, data sovereignty is only one of the three pillars of digital sovereignty, and I don't want to give the impression that confidential computing addresses it all. That's why we want to step back and say, hey, digital sovereignty includes data sovereignty, where we are giving you full control and ownership of the location, encryption, and access to your data. Operational sovereignty, where the goal is to give our Google Cloud customers full visibility and control over the provider's operations, right? So if there are any updates to hardware, the software stack, any operations, there is full transparency, full visibility. And then the third pillar is around software sovereignty, where the customer wants to ensure that they can run their workloads without dependency on the provider's software. This is sometimes referred to as survivability: that you can actually survive if you are untethered from the cloud, and that you can use open source. Now, let's take a deep dive on data sovereignty, which by the way is one of my favorite topics. We typically focus on saying, hey, we need to care about data residency. We care where the data resides, because data at rest or in processing typically abides by the jurisdiction, the regulations of the jurisdiction, where the data resides. And others say, hey, let's focus on data protection: we want to ensure the confidentiality, integrity, and availability of the data, and confidential computing is at the heart of that data protection. But there is yet another element that people typically don't talk about when talking about data sovereignty, which is the element of user control. And here, Dave, it is about what happens to the data when I give you access to my data. And this reminds me of security two decades ago, even a decade ago, where we started the security movement by putting in firewall protections and logging accesses. But once you were in, you were able to do everything you wanted with the data. An insider had access to all the infrastructure, the data, and the code. 
And that's similar, because with data sovereignty we care about where it resides and who is operating on the data, but the moment that the data is being processed, I need to trust that the processing of the data will abide by user control, by the policies that I put in place for how my data is going to be used. And if you look at a lot of the regulation today, and a lot of the initiatives around the International Data Spaces Association, IDSA, and Gaia-X, there is a movement toward saying the two parties, the provider of the data and the receiver of the data, are going to agree on a contract that describes what my data can be used for. The challenge is to ensure that once the data crosses boundaries, the data will be used for the purposes that were intended and specified in the contract. And if you actually bring together, and this is the exciting part, confidential computing together with policy enforcement, now the policy enforcement can guarantee that the data is only processed within the confines of a confidential computing environment, that the workload is cryptographically verified to be the workload that was meant to process the data, and that the data will only be used while abiding by the confidentiality, integrity, and safety of the confidential computing environment. And that's why we believe confidential computing is one necessary and essential technology that will allow us to ensure data sovereignty, especially when it comes to user control. >> Thank you for that. I mean, it was a deep dive, brief but really detailed, so I appreciate that, especially the verification of the enforcement. Last question. I met you two because, as part of my year-end prediction post, you guys sent in some predictions and I wasn't able to get to them in the predictions post, so I'm thrilled that you were able to make the time to come on the program. How widespread do you think the adoption of confidential computing will be in '23, and what does the maturity curve look like this decade, in your opinion? Maybe each of you could give us a brief answer. >> So my prediction: in five to seven years, as I said at the start, it will become a utility. It will become like TLS. Again, 10 years ago we couldn't believe that websites would have certificates and we would support encrypted traffic. Now we do, and it has become ubiquitous. That's exactly where confidential computing is heading; I don't know if we are there yet. It'll take a few years of maturity for us, but we'll do that. >> Thank you. And Patricia, what's your prediction? >> I would double down on that and say, hey, in the very near future you will not be able to afford not having it. I believe that as digital sovereignty becomes ever more top of mind with sovereign states, and also for multinational organizations and for organizations that want to collaborate with each other, confidential computing will become the norm. It will become the default, if I may say, mode of operation. I like to compare it to today: it is inconceivable, if we talk to young technologists, to think that at some point in history, and I happen to have been alive then, we had data at rest that was not encrypted, data in transit that was not encrypted. And I think it will be inconceivable at some point in the near future to have unencrypted data while in use. >> You know, and plus, I think the beauty of this industry is that because there's so much competition, this essentially comes for free. 
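A toy sketch of the combination Patricia describes: a data-sharing contract that is only honored when the requesting workload both passes attestation and declares a purpose the contract allows. The structures and names are illustrative only; real policy engines and attestation verifiers are considerably more involved.

```python
from dataclasses import dataclass

@dataclass
class Contract:
    provider: str
    receiver: str
    allowed_purposes: set  # the uses the two parties agreed on

def release_data(contract: Contract, attested_measurement: str,
                 expected_measurement: str, purpose: str) -> bool:
    """Gate that only releases data to a verified workload for an agreed purpose."""
    attested = attested_measurement == expected_measurement   # from the attestation step
    permitted = purpose in contract.allowed_purposes          # from the agreed contract
    return attested and permitted

contract = Contract(provider="bank-a", receiver="bank-b",
                    allowed_purposes={"fraud-detection"})

print(release_data(contract, "sha256:ab12", "sha256:ab12", "fraud-detection"))  # True
print(release_data(contract, "sha256:ab12", "sha256:ab12", "marketing"))        # False: outside the contract
print(release_data(contract, "sha256:dead", "sha256:ab12", "fraud-detection"))  # False: unverified workload
```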
I want to thank you both for spending some time on Breaking Analysis; there's so much more we could cover. I hope you'll come back to share the progress that you're making in this area and we can double-click on some of these topics. Really appreciate your time. >> Anytime. >> Thank you so much, yeah. >> In summary, while confidential computing is being touted by the cloud players as a promising technology for enhancing data privacy and security, there are also those, as we said, who remain skeptical. The truth probably lies somewhere in between, and it will depend on the specific implementation and the use case as to how effective confidential computing will be. Look, as with any new tech, it's important to carefully evaluate the potential benefits and the drawbacks, and make informed decisions based on the specific requirements of the situation and the constraints of each individual customer. But the bottom line is silicon manufacturers are working with cloud providers and other system companies to build confidential computing into their architectures. Competition, in our view, will moderate price hikes, and at the end of the day, this is under-the-covers technology that essentially will come for free, so we'll take it. I want to thank our guests today, Nelly and Patricia from Google. And thanks to Alex Myerson, who's on production and manages the podcast, and Ken Schiffman as well, out of our Boston studio. Kristin Martin and Cheryl Knight help get the word out on social media and in our newsletters, and Rob Hof is our editor-in-chief over at siliconangle.com and does some great editing for us. Thank you all. Remember, all these episodes are available as podcasts; wherever you listen, just search "Breaking Analysis podcast." I publish each week on wikibon.com and siliconangle.com, where you can get all the news. If you want to get in touch, you can email me at david.vellante@siliconangle.com or DM me @dvellante, and you can also comment on my LinkedIn posts. Definitely check out etr.ai for the best survey data in the enterprise tech business. I know we didn't hit on a lot today, but there's some amazing data and it's always being updated, so check that out. This is Dave Vellante for theCUBE Insights powered by ETR. Thanks for watching, and we'll see you next time on Breaking Analysis. (subtle music)

Published Date : Feb 10 2023


Jim Long, Sarbjeet Johal, and Joseph Jacks | CUBEConversation, February 2019


 

(lively classical music) >> Hello everyone, welcome to this special Cube conversation, we are here at the Power Panel Conversation. I'm John Furrier, in Palo Alto, California, at theCUBE Studios. We have remote on the line here, to talk about cloud technology's impact on entrepreneurship, startups, and the overall ecosystem, Jim Long, who's the CEO of Didja, which is a startup around disrupting digital TV, and who has also been an investor and a serial entrepreneur, Sarbjeet Johal, who's a cloud influencer, strategist, and investor out of Berkeley, California, The Batchery, and also Joseph Jacks, CUBE alumni, actually you guys are all CUBE alumni, so great to have you on. Joseph Jacks is the founder and general partner of OSS Capital, Open Source Software Capital, a new fund that's been raised specifically to commercialize and fund startups around open source software. Guys, we got a great panel here of experts, thanks for joining us, appreciate it. >> Go Bears! >> Nice to be here. >> So we have a distinguished panel, it's the Power Panel, we're on cloud technologies. First I'd like to get you guys' reaction. You know, you're seeing a lot of negative news around what Facebook has become, essentially their own hyper-scale cloud with their application. They were called the digital, you know, renegades, or digital gangsters in the UK by the Parliament, and it was built on open source software. Amazon's continuing to win, Azure's doing their thing, bundling Office 365, making it look like they've got more revenue as they're catching up, then Google, and then you got IBM and Oracle, and then you got an ecosystem that's impacted by this large scale, so I want to get your thoughts on the first point here. Is there room for more clouds? There's a big buzzword around multiple clouds. Are we going to see specialty clouds? 'Cause Salesforce is a cloud, so is there room for more cloud? Jim, why don't you start? >> Well, I sure hope so. You know, the internet has unfortunately become sort of the internet of monopolies, and that doesn't do anyone any good. In fact, you bring up an interesting point, it'd be kind of interesting to see if Facebook created a social cloud for certain types of applications to use. I've no idea whether that makes any sense, but Amazon's clearly been the big gorilla now, and done an amazing job, we love using them, but we also love seeing, trying out different services that they have and then figuring out whether we want to develop them ourselves or use a specialty service, and I think that's going to be interesting, particularly in the AI area, stuff like that. So I sure hope more clouds are around for all of us to take advantage of. >> Joseph, I want you to weigh in here, 'cause you were close to the Kubernetes trend, in fact we were at an OpenStack event when you started Kismatic, which is the movement that became KubeCon Cloud Native, many many years ago, and now you're investing in open source. The world's built on open source, there's got to be room for more clouds. Your thoughts on the opportunities? >> Yeah, thanks for having me on, John. I think we need a new kind of open collaborative cloud, and to date, we haven't really seen any of the existing major sort of large critical mass cloud providers participate in that type of model.
Arguably, Google has probably participated and contributed the most in the open source ecosystem, contributing TensorFlow and Kubernetes and Go, lots of different open source projects, but they're ultimately focused on gravitating huge amounts of compute and storage cycles to their cloud platform. So I think one of the big missing links in the industry is, as we continue to see the rise of these large vertically integrated proprietary control planes for computing and storage and applications and services, and as the open source community and the open source ecosystem continue to grow and explode, we'll need a third sort of provider, one that isn't based on monopoly or based on a traditional proprietary software business like Microsoft kind of transitioning their enterprise customers to services. Sort of Amazon in the first camp, vertically integrated, with a buffet of all these different compute, storage, networking services, applications, middleware; Microsoft focused on sort of building managed services of their software portfolio. I think we need a third model where we have sort of an open set of interfaces and an open standards based cloud provider. That might be a pure software company, it might be a company that builds on the rails and the infrastructure that Amazon has laid down, spending tens of billions in cap ex, or it could be something based on a project like Kubernetes or built from the community ecosystem. So I think we need something like that just to sort of speed the innovation, and disaggregate the services away from a monolithic kind of closed vendor like Amazon or Azure. >> I want to come back to that whole startup opportunity, but I want to get Sarbjeet in here, because we were just in the B2B area last week at IBM Think 2019. Obviously they're trying to get back into the cloud game, but this digital transformation has been the cliche for almost a couple of years now, if not five plus. Business has got to move to the cloud, so there's a whole new ball game of complete cultural shift. They need stability. So I want to talk more about this open cloud, which I love that conversation, but give me the blocking and tackling capabilities first, 'cause I got to get out of that old cap ex model, move to an operating model, transform my business, whether it's multi clouds. So Sarbjeet, what's your take on the cloud market for, say, the enterprise? >> Yeah, I think for the enterprise, if you're just sitting in that data center, moving those workloads to cloud is a cumbersome task. For that to work, they actually don't need all the bells and whistles which Amazon has in the periphery, if you will. They need just core things like compute, network, and storage, and some other sort of services, maybe database, maybe data share and stuff like that, but they just want to move those applications as is to start with, with some replatforming and with some changes. Like, they won't make changes at first when they start moving those applications, but our minds are polluted by this thinking. When we see a Facebook being formed by a couple of people, or a company of six people sold for a billion dollars, it just messes with our minds; on the enterprise side we think, hey, we can do that too, we can move that fast and so forth, but it's sort of tragic that we think that way. Well, having said that, and I think we have talked about this in the past.
If you are doing anything in the way of systems innovation, if you're building those, even at the enterprise, I think cloud is the way to go. To your original question, if there's room for newer cloud players, I think there is, provided that we can detach the platforms from the environments they are sitting on. So the proprietariness has to kinda be lowered, the degree of proprietariness has to be lower. It can be through open source I think mainly, it can be from open technologies, they don't have to be open source, but portable. >> JJ was mentioning that, I think that's a big point. Jim Long, you're an entrepreneur, you've been a VC, you know all the VCs, you've been around for a while, you're a serial entrepreneur, starting out at Cal Berkeley back in the day. You know, small ideas can move fast, and you're building on Amazon, and you've got a media kind of thing going on, there's a cloud opportunity for you, 'cause you are cloud native, 'cause you're built in the cloud. How do you see it playing out? 'Cause you're scaling with Amazon. >> Well, so we obviously, as a new startup, don't have the issues the enterprise folks have, and I could really see the enterprise customers, what we used to call the Fortune 500, for example, getting together and insisting on at least a base set of APIs that Amazon and Microsoft et cetera adopt, and for a startup, it's really about moving fast with your own solution that solves a problem. So you don't necessarily care too much that you're tied into Amazon completely, because you know that if you need to, you can make a change someday. But they do such a good job for us, and their costs, while they could certainly be lower, and we certainly would like more volume discounts, are pretty darn amazing across the network, across the internet. We do try to price out other folks just for the heck of it, we've been doing that recently with CDNs, for example. But for us, we're actually creating a hybrid cloud, if you will, a purpose-built cloud to support local television stations, and we do think that's going to be, along with using Amazon, a unique cloud with our own APIs, and we will hopefully have lots of different TV apps use our hybrid cloud for part of their application to service local TV. So it's kind of an interesting play for us, the B2B part of it, we're hoping to be pretty successful as well, and we hope to maybe have multiple cloud vendors in our mix, you know. Not that our users will know who's behind us, maybe Amazon for something, Limelight for another, or whatever, for example. >> Well you got to be concerned about lock-in as you move into the cloud, that's something that everybody's worried about. JJ, I want to get back to you on the investment thesis, because you have a cutting edge business model around investing in open source software, and there's two schools of thought in the open source community, you know, free contribution's great, and let that be organic, and then there's now commercialization. There's real value being created in open source. You had put together a chart with your team about the billions of dollars in exits from open source companies. So what are you investing in, what do you see as opportunities for entrepreneurs like Jim and others that are out there looking at scaling their business? How do you look at success, what's your advice, what do you see as leading indicators? >> I think I'll broadly answer your question with a model that we've been thinking a lot about.
We're going to start writing publicly about it and probably eventually maybe publish a book or two on it, and it's around the sort of fundamental perspective of creating value and capturing value. There's a famous investor and entrepreneur in Silicon Valley who has commonly modeled these things using two different letter variables, X and Y, and I'll give you that sort of perspective on modeling value creation and value capture around open source, as compared to closed source or proprietary software. So if you look at value creation modeled as X, and value capture modeled as Y, where X and Y are two independent variables, with a fully proprietary software company based approach, whether you're building a cloud service or a proprietary software product or whatever, just a software company, your value creation exponent is typically bounded by two things: capital and fundraising into the entity creating the software, and the centralization of research and development, meaning engineering output for producing the software. And so those two things are tightly coupled to and bounded by the company. With commercial open source software, the exact opposite is true. So value creation is decoupled and independent from funding, and value creation is also decentralized in terms of the research and development aspect. So you have a sort of decentralized, community-based, crowd-sourced, or sort of internet, global phenomenon of contributing to a code base that isn't necessarily owned or fully controlled by a single entity, and those two properties, being decoupled from funding and decentralized in R and D, are fundamentally changing the value creation exponent. Now let's look at the value capture variable. With a proprietary software company, or proprietary technology company, you're primarily looking at two constituents capturing value: people who pay for accessing the service or the software, and people who create the software. And so those two constituents capture all the value; the vendor selling the software captures maybe 10 or 20% of the value, and the rest of the value, I would express it by saying the customer is capturing the rest of the value. Most economists don't express value capture as capturable by an end user or a customer. I think that's a mistake. >> Jim, you're-- >> So now... >> Okay, Jim, your reaction to that, because there's an article that went around this weekend from Motherboard: "The internet was built on the free labor of open source developers. Is that sustainable?" So Jim, what's your reaction to JJ's comments about the interactions and the dynamic between value creation, value capture, free versus sustainable funding? >> Well if you can sort of mix both together, that's what I would like. I haven't really ever figured out how to make open source work in our business model, but I haven't really tried that hard. It's an intriguing concept for sure, particularly if we come up with APIs that are specific to, say, local television or something like that, and maybe some special processes that do things that are of interest to the wider community. So it's something I do plan to look at, because I do agree. I mean we use open source, we use this thing called FFmpeg, and several other things, and we're really happy that there's people out there adding value to them, et cetera, and we have our own versions, et cetera, so we'd like to contribute to the community if we could figure out how. >> Sarbjeet, your reactions to JJ's thesis there?
>> I think two things. I will comment on two different aspects. One is the lack of standards, and then open source becoming the standard, right. I think open source kind of projects take birth and life of their own because we have a lack of standards, 'cause these different vendors can't agree on standards. So remember we used to have service-oriented architecture, we had Microsoft pushing some standards from one side and IBM pushing from the other, SOAP versus xCBL and XML, different sort of paradigms, right, but then REST API became the de facto standard, right, it just took over. I think what REST has done for software in the last 10 years or so, nothing else has done that for us. >> Well, Kubernetes is right now looking pretty good. So if you look at, JJ, Kubernetes, the movement you really were pioneering, it's having a similar dynamic, I mean Kubernetes is becoming a forcing function for solidarity in the cloud native community, as well as an actual interoperable orchestration layer for multiple clouds and other services. So JJ, your thoughts on how open source continues as some of these new technologies, like Kubernetes, continue to hit the scene. Is there any trajectory change in open source that you see, that you could share? I'd love to get your insights on what's next behind, you know, the rise of Kubernetes, what's next? >> I think, more abstractly from Kubernetes, we believe that if you just look at the rate of innovation as a primary factor for progress and forward change in the world, open source software has the highest rate of innovation of any technology creation phenomenon, and as a consequence, we're seeing more standards emerge from the open source ecosystem, we're seeing more disruption happen from the open source ecosystem, we're seeing more new technology companies and new paradigms and shifts happen from the open source ecosystem, and kind of all progress across the largest, most difficult sort of compound, sensitive problems, influenced and kind of sourced from the open source ecosystem and the open source world overall. Whether it's chip design, machine learning or computing innovations, or new types of architectures, or new types of developer paradigms, you know, biological breakthroughs, there's kind of things up and down the technology spectrum that have a lot to sort of thank open source for. We think that the future of technology and the future of software is really that open source is at the core, as opposed to the periphery or the edges, and so today, every software technology company, cloud providers included, has a closed proprietary core, meaning that where the core is, the data path, the runtime, the core business logic of the company, today that core is proprietary software or closed source software, and yet what is also true is that at the edges, the wrappers, the sort of crust, the periphery of every technology company, we have lots of open source, we have client libraries and bindings and languages and integrations, configuration, UIs and so on, but the cores are proprietary. We think the following will happen over the next few decades. We think the future will gradually shift from closed proprietary cores to open cores, where instead of a proprietary core, an open core is where you have a core open source software project as the fundamental building block for the company.
So for example, Hadoop caused the creation of MapR and Cloudera and Hortonworks, Spark caused the creation of Databricks, Kafka caused the creation of Confluent, Git caused the creation of GitHub and GitLab, and this type of commercial open source software model, where there's a core open source project as the kernel building block for the company, and then an extension of intellectual property or wrappers around that open source project, where you can derive value capture and charge the customer for a licensed product, we think that model is where the future is headed, and this includes cloud providers basically selling proprietary services that could be based on a mixture of open source projects, but perhaps not fundamentally on a core open source project. Now we think generally, like abstractly, with maybe somewhat of a reductionist explanation there, that that open core future is very likely, fundamentally because the rate of innovation is highest with the open source model in general. >> All right, that's great stuff. Jim, you're a historian of tech, you've lived it. Your thoughts on some of the emerging trends around cloud, because you're disrupting linear TV with Didja, in a new way, using cloud technology. How do you see cloud evolving? >> Well, I think along the lines we discussed, certainly I think that's a really interesting model, and having the open source be the center of the universe, then figuring out how to have maybe some proprietary stuff, if I can use that word, around it that other people can take advantage of, but maybe you get the value capture and build a business on that, that makes a lot of sense, and it could certainly fit in the TV industry, if you will, from where I sit. Bringing services to businesses and consumers, it's not like there's some reason it wouldn't work, you know, it's bound to figure out a way, and if you can get a whole mass of people around the world working on the core technology, and if it is sort of unique to the mission, or at least the marketplace you're going after, that could be pretty interesting, and it would be great to see a lot of different new mini-clouds, if you will, develop around that stuff, that would be pretty cool. >> Sarbjeet, I want you to talk about scale, because you also have experience working with Rackspace. Rackspace was early on, they were trying to build the cloud, and OpenStack came out of that, and guess what, the world was moving so fast, Amazon was a bullet train just flying down the tracks, and it just felt like Rackspace and their cloud, you know OpenStack, just couldn't keep up. So is scale an issue, and how do people compete against scale in your mind? >> I think scale is an issue, and software chops is an issue, so there are some patterns, right? So one pattern is that we tend to see that open source is now not very good at the application side. You will hardly see any applications being built as open source. And also on the extreme side, open source is pretty sort of lame, if you will, at the very core of things, like OpenStack failed for that reason, right? But it's pretty good in the middle, as Joseph said, right? So building pipes, building some platforms based on open source, so the hooks, the integration, is pretty good there, actually. I think that pattern will continue. Hopefully it will go deeper into the core, which we want to see. The other pattern is, I think, the software chops, like one vendor has to lead the project for a certain amount of time.
If that project goes into sort of the open, like anybody can grab it, a lot of people contribute and sort of jump in very quickly, it tends to fail. That's what happened to, I think, OpenStack, and there were many other reasons behind that, but I think that was the main reason, and because we were smaller and we didn't have that much software chops, I hate to say that, but then IBM could control like a hundred parties a week at the project. >> They did, and look where they are. >> And so does HP, right? >> And look where they are. All right, so I'd love to have a Power Panel on open source, certainly JJ's been in the thick of it, as well as other folks in the community. I want to just kind of end on a lightweight question for you guys. What have you guys learned? Go down the line, start with Jim, Sarbjeet, and then JJ we'll finish with you. Share something that you've learned over the past three months that moved you or that people should know about in tech or cloud trends that's notable. What's something new that you've learned? >> In my case, it was really just spending some time in the last few months getting to know our end users a little bit better, consumers, and some of the impact that having free internet television has on their lives, and that's really motivating... (distorted speech) Something as simple as you might take for granted, but lower income people don't necessarily have a TV that works or a hotel room that has a TV that works, or heaven forbid they're homeless and all that, so it's really gratifying to me to see people sort of tuning back into their local media through television, just by offering it on their phones and laptops. >> And what are you going to do as a result of that? Take a different action, what's the next step for you, what's the action item? >> Well we're hoping, once our product gets filled out with the major networks, et cetera, that we actually provide a community attachment to it, so that over-the-air television channels are the main part of the app, and then a side part of the app could be any IP stream, from city council meetings to high schools, to colleges, to local community groups, local, even religious situations or festivals or whatever, and really try to tie that in. We'd really like to use local television as a way of strengthening all local media and local communities, that's the vision at least. >> It's a great mission you guys have at Didja, thanks for sharing that. Sarbjeet, what have you learned over the past quarter, three months, that was notable for you and had an impact, and something that changed you a little bit? >> What I actually have gravitated towards in the last three to six months is the blockchain, actually. I was light on that, like what it can do for us, and is there really a thing behind it, and can we leverage it. I've seen more and more actual usage of it, in sort of full SCM, supply chain management, and healthcare and some other sort of use cases, if you will. I'm intrigued by it, and there's a lot of activity there. I think there's some legs behind it, so I'm excited about that. >> And are you doing a blockchain project as a result, or are you still tire-kicking? >> No actually, I will play with it, I'm a practitioner, I play with it, I write code and play with it and see (Jim laughs) what level of effort it takes to do that, and as you know, I wrote the Alexa skill a couple of weeks back, and I play with AI and stuff like that.
So I try to do that myself before I-- >> We're hoping blockchain helps even out the TV ad economy and gets rid of middlemen and makes for more trusted transactions between local businesses and stuff. At least I say that, I don't really know what I'm talking about. >> It sounds good though. You'll get yourself a new round of funding on that sound bite alone. JJ, what have you learned in the past couple months that's new to you and changed you or made you do something different? >> I've learned over the last few months, OSS Capital is a few months and change old, and so we're just kind of getting started on that, and it's really, I think, potentially more than one decade, probably a multi-decade, kind of mostly consensus-building effort. There's such a huge lack of consensus and agreement in the industry. It's a fascinatingly polarizing area, the sort of general topic of open source technology, economics, value creation, value capture. So my learnings over the last few months have just intensified in terms of the lack of consensus I've seen in the industry. So I'm trying to write a little bit more about observations there and sort of put thoughts out, and that's kind of been the biggest takeaway over the last few months for me. >> I'm sure you learned about all the lawyer conversations, setting up a fund, learnings there probably too, right, (Jim laughs) I mean all the detail. All right, JJ, thanks so much, Sarbjeet, Jim, thanks for joining me on this Power Panel, cloud conversation, impact on entrepreneurship, open source. Jim Long, Sarbjeet Johal and Joseph Jacks, JJ, thanks for joining us, theCUBE Conversation here in Palo Alto, I'm John Furrier, thanks for watching. >> Thanks John. (lively classical music)

Published Date : Feb 20 2019

SENTIMENT ANALYSIS :

ENTITIES

Entity | Category | Confidence
Jim | PERSON | 0.99+
IBM | ORGANIZATION | 0.99+
Jim Long | PERSON | 0.99+
JJ | PERSON | 0.99+
Amazon | ORGANIZATION | 0.99+
Oracle | ORGANIZATION | 0.99+
Sarbjeet | PERSON | 0.99+
Microsoft | ORGANIZATION | 0.99+
Sarbjeet Johal | PERSON | 0.99+
Joseph | PERSON | 0.99+
John | PERSON | 0.99+
Joseph Jacks | PERSON | 0.99+
OSS Capital | ORGANIZATION | 0.99+
Facebook | ORGANIZATION | 0.99+
February 2019 | DATE | 0.99+
Google | ORGANIZATION | 0.99+
six people | QUANTITY | 0.99+
John Furrier | PERSON | 0.99+
Silicon Valley | LOCATION | 0.99+
Palo Alto | LOCATION | 0.99+
10 | QUANTITY | 0.99+
two things | QUANTITY | 0.99+
20% | QUANTITY | 0.99+
CUBE | ORGANIZATION | 0.99+
Palo Alto, California | LOCATION | 0.99+
five | QUANTITY | 0.99+
HP | ORGANIZATION | 0.99+
two | QUANTITY | 0.99+
two constituents | QUANTITY | 0.99+
Open Source Software Capital | ORGANIZATION | 0.99+
UK | LOCATION | 0.99+
Office 365 | TITLE | 0.99+
last week | DATE | 0.99+
Didja | ORGANIZATION | 0.99+
two properties | QUANTITY | 0.99+
both | QUANTITY | 0.98+
two schools | QUANTITY | 0.98+
One | QUANTITY | 0.98+
first point | QUANTITY | 0.98+
Rackspace | ORGANIZATION | 0.98+
third model | QUANTITY | 0.98+
first camp | QUANTITY | 0.98+
Alexa | TITLE | 0.98+

Aaron Sullivan, Rackspace | OpenPOWER Summit 2016


 

>> Hi, this is David Flora, back at the OpenPOWER Foundation conference here in San Jose, and with me I've got Aaron Sullivan, who is a distinguished engineer at Rackspace. Welcome. >> Thank you. >> So what do you think of the conference so far? >> It's amazing, it's grown so much in the last year, 15 designs to almost 60 in a year, and lots of system launches. Yeah, very impressed. >> Well, one of the things that has been announced today, which caught my eye in a big, big way, was the agreement, the announcement that you and Google have. Can you put a little more light on that announcement? >> Yeah, sure. So Rackspace and Google started working together when Rackspace was developing Barreleye. Of course Google had already had their system available at the time, and our collaboration just on what we had with Barreleye was very positive. We were just kind of looking to trade notes and, you know, share our experience, and a few months ago we got back in touch and said hey, this was positive enough, we should think about doing the next one together from the start. And so that's basically what we're doing now. We're going to do a POWER9 system that comes in multiple mechanical form factors but just one motherboard, and like we did with Barreleye, we're going to contribute that to Open Compute when we're finished. >> Out of the Open Compute Foundation, part of the OpenStack... >> Yes. >> ...part of the OpenPOWER Foundation? >> That's right, open everything. >> Open everything, yeah. Excellent. So what about the Barreleye that you also announced some things about today? What is Barreleye, and what's different about it? >> So Barreleye is named after a fish that's got a transparent body; most of our servers are named after fish, and we thought having a server that was fully open would be great to have that name. Barreleye just entered its first data center shipments, it's headed to our Virginia data centers right now, and in a few months we expect we will begin providing services to customers on it. So that's the progress on Barreleye so far. We contributed it to Open Compute about two to three months ago now and it was accepted, so the specifications are online, and if you look around the show floor here you will see there are other companies that have put their brand on it, or something else, and are also taking it to market, which is exactly what we hoped for. >> Great. Well, I've got a question, which is: why have you put these resources into Barreleye and, in the future, into the POWER9 et cetera? What are you looking for that's different about OpenPOWER that, for example, you couldn't get with a standard x86 server? >> Yeah, so I know it gets to be tired and people get tired of hearing the word open, but really, even with Open Compute and OpenStack, the freedom that comes with developing in that particular universe is really significant. Before OpenPOWER even started, there were parts of the system we really wished we could get into in an open way, where we could develop and share instead of just doing it all on our own, and having OpenPOWER come along in the first place fit that. But then we also have this problem, this Moore's Law problem, and the types of changes that we're going to have to implement as an industry to continue to accelerate and get higher performance computing and more efficient computing over the next years, they're really huge challenges. They go from the chips all the way to the top of the stack, and if you don't have the chip part open and you don't have the firmware part open, it becomes really difficult to collaborate. You can't bring to bear the sort of force of the world's software developers onto it, you end up in these little silos and niches. So for us, Barreleye provides a lot of value as a business and it has a great influence on the industry at large, and so will the POWER9 system with Google, but it also is there as a platform for developers to begin to start wrapping their minds around these new problems and opportunities that we have, and if it's not done in the open, these types of software aren't really scalable across the whole industry. >> That's a very interesting answer indeed. And as you say, Moore's Law has come to a screeching halt from the point of view of power per CPU; it is still going on in terms of the number of transistors, et cetera, that you can have. What are the things, you as a distinguished engineer, what are the things that really are most important about the POWER architecture that allow you to develop these new ways of doing things? >> Yeah, I think it depends on the type of business you're in, but in our business, and I think for many cloud service providers and in some other environments, certainly some HPC and a lot of enterprise, the performance of a single core is still really important, and it will continue to be for as long as we can keep getting more performance out of a single core. So POWER provided a great platform with a very powerful core, and it also has a huge number of threads per core, so you get a little bit of the best of both worlds there. If you need a really powerful core, you have it; if you want to spread your load really wide over a more cloudy, webby type application, you get to use all those threads, and there's all that memory bandwidth and so forth. So that was the benefit of POWER in general. And then, as we run out of core performance and those cycles per CPU aren't going up, and maybe we can't even scale cores like we used to anymore, which is coming in a few years, the fact that the platform is open in areas that others aren't allows us to bend the rules about how components communicate, and we cut out a lot of overhead between them. >> So that's a sort of software-in-silicon type argument, you want to bring the software closer to the silicon? >> Yeah, closer, and in many cases to do the same work that we do today. That's the hard part, is people think it's all about genomics or oil and gas or something. It's the same work, but you know, the open hardware community has demonstrated that there are certain workloads that are very common today that you can boost tenfold or more simply by reintegrating your software tighter with the hardware, right, you pull out overhead that we were fine with when Moore's Law was working, but now we've got to do something. >> Yeah. Great. Well, thanks very much indeed for being here, and thanks very much for watching.

Published Date : Apr 19 2016

**Summary and Sentiment Analysis are not shown because of an improper transcript**

ENTITIES

Entity | Category | Confidence
Erin Sullivan | PERSON | 0.99+
Rackspace | ORGANIZATION | 0.99+
San Jose | LOCATION | 0.99+
Google | ORGANIZATION | 0.99+
Aaron Sullivan | PERSON | 0.99+
Virginia | LOCATION | 0.99+
last year | DATE | 0.97+
today | DATE | 0.97+
David flora | PERSON | 0.97+
both worlds | QUANTITY | 0.96+
next year | DATE | 0.96+
single core | QUANTITY | 0.95+
first | QUANTITY | 0.95+
laura | PERSON | 0.93+
single core | QUANTITY | 0.91+
one | QUANTITY | 0.89+
first data center | QUANTITY | 0.88+
2-3 months ago | DATE | 0.84+
OpenStack | ORGANIZATION | 0.84+
a few months ago | DATE | 0.83+
OpenPOWER Summit 2016 | EVENT | 0.82+
almost 60 in a year | QUANTITY | 0.79+
OpenStack | TITLE | 0.75+
one motherboard | QUANTITY | 0.72+
15 designs | QUANTITY | 0.7+
about | DATE | 0.68+
x86 | OTHER | 0.68+
few months | QUANTITY | 0.66+
power foundation | EVENT | 0.61+
tenfold | QUANTITY | 0.61+
few years | QUANTITY | 0.58+
lots of | QUANTITY | 0.54+
things | QUANTITY | 0.52+
Moore's | TITLE | 0.48+
Moore | ORGANIZATION | 0.33+