Rajesh Garg, Landmark Group | UiPath FORWARD IV
>>From the Bellagio hotel in Las Vegas, it's theCUBE, covering UiPath FORWARD IV, brought to you by UiPath. >>Live from Las Vegas, it's theCUBE. We are here with UiPath at FORWARD IV. I'm Lisa Martin, with Dave Vellante, in a lovely setting at the Bellagio. We're going to be talking about automation from the CFO's perspective. Our next guest is Rajesh Garg, Group Chief Financial Officer at Landmark Group. Rajesh, welcome to the program. >>Thank you so much. >>Before we dig into your transformation strategy and how automation is key to it, help the audience understand a little bit about Landmark. >>Absolutely. So Landmark is one of the largest primarily non-food retailers in the Middle East, Asia, and India, and now increasingly in Southeast Asia. We've got about 50 brands, more than half of them homegrown, our own brands, plus some franchise brands. About 2,200 stores across 20 countries, 55,000 employees, and 30 million square feet of retail space. >>When was the company founded? >>48 years ago. >>A legacy institution. You were mentioning before we went live that you've been working with UiPath since 2017. So talk to me about that legacy institution embracing cloud, digital transformation, and automation from a visionary, strategic perspective. >>Yeah, so look, you get so many technologies thrown at you, and I would say RPA, or robotic process automation, was just another one like that. I wouldn't say it was part of a grand strategy. It starts with, "Hey, this looks cool." In the back office, when somebody first showed me 10 desks with nobody sitting at them, it was kind of spooky. So I said, "Hey, this looks very interesting." It started off like that, but it has just grown, because we've stayed with it.
So we were among the early UiPath customers, and it's been phenomenal what we've been able to do with robotic process automation. I've been in this industry with my past employers, like Procter & Gamble and Cadbury Schweppes, and essentially we used to follow the path of: first you eliminate all the non-value-add, then you try to automate whatever your ERP systems allowed you to automate. Then what's left, you consolidate, and then you find the right shore, whether offshore or wherever. So that was the sequence. But a lot could not be automated, because there are huge gaps in the systems on offer, and every company has a mosaic of systems. So we would end up doing a lot more offshoring or other such tactics. But once RPA showed up on the scene, it suddenly disrupted everything, because wherever the systems fall short, when you have to move data from one system to another or make sense out of it, that's where this technology sits. So we've now got a pretty large robotic process automation practice. We started with finance, and now we are pretty much enterprise-wide. >>These technologies are coming together: automation, RPA, cloud, AI, they're all sort of converging. And as a retailer, I'm curious what your cloud strategy is and how that fits. There's always a lot of sensitivity from retailers that don't want to be on Amazon, though maybe some do, and they say, hey, we compete in other ways. What's your posture on that? >>So we've also been an early adopter of cloud, on both fronts.
Within UiPath, I think we were among the first to put it on the cloud, because even before the cloud offering we saw how people could tamper with attended robots on the desktop. So we went on the cloud way back, and that was good. But overall, the company also has a very well-defined cloud strategy, so a large part of our systems are on the cloud with Azure. >>Which makes sense. As a retailer, go with Azure; plus there's so much Microsoft expertise out there that you can leverage. And I've got to ask you, because everybody's freaked out on Wall Street about Power Automate competing with UiPath. I've told people they're on different parts of the spectrum, and I've talked to a lot of customers this week who say, yeah, we use both: UiPath for end-to-end automation, Power Automate for a lot of our personal-productivity stuff. How do you see those two? >>No, I think, look, it's inevitable that a lot of technologies will keep evolving. Microsoft is a fantastic company; I mean, the way they perfected Teams right in time. A year before COVID hit, Teams was not ready. I know Power Automate is good, and we use it, but it's not ready for enterprise-wide use. I'm not an expert in Power Automate yet, but it seems more linked to office automation than to major enterprise-wide automation. >>Which is really where you're headed. Talk about the results you've seen, how you're measuring the return, and the whole business case when you evaluate it as CFO. >>See, being a CFO, I wear two hats. I'm trying to help drive digital transformation.
Although I must say I'm not the only one; in our company, every function these days is talking digital, because it's almost table stakes. You can't be a leader in business, and we are a leader in all the markets we're in, without being fully digital. But being a CFO, you absolutely do look at the hard dollars. Initially, when you're pushing any technology to a functional head, a colleague, the CEO, or the board, they do want to see the dollars, because a lot of software vendors talk only about the soft benefits. I think these projects have got to pay for themselves. So yes, first I get the hard dollars, and then I can demonstrate the softer benefits, whether it's the quality of work, fewer errors, or better compliance. And then there's employee work-life balance. We are a growing company, we've been growing for the last four decades, and there's a constant struggle to help colleagues maintain better work-life balance. So once the basic return is off the table, everyone starts talking about the quality of work this enables. And we now proudly say: we've hired a lot of people, but what we've been using of them is their fingers, their eyes, and their ears, and that's about it. Can we now get them to use their brains? It's like a freebie: you've got so many people, let's start using the gray matter. That's what this technology does. It takes away the grunt work, and you can then tell people: analyze the data, look at it, drive better business outcomes. I think that's where the real value is. >>So we've heard a lot about hours saved; that's a key metric, and you look at that as hard dollars. How do you translate that to the income statement?
>>So let's put it this way. I was looking at the Applied Materials presentation, and they had 150,000 hours saved. I just did our math: so far we've saved 342,000 hours per annum, removed out of the system. But I wouldn't say I took all of it to the bottom line; probably 70% of it, because the rest has gone back into people doing more value-added work. >>So how does it hit the income statement? Does it hit as new revenue, as cost savings, or as a cost reduction? >>Or you just don't hire as many people as you needed to. >>Yes, that's the missing link. You were going to need to hire 1,100 people and you hire 10, or whatever it is. Does that get into a debate? Because I can see a lot of people arguing, "if we don't do this, we're going to have to hire," and then as a CFO you might say, let's defend that a little bit. >>See, cost avoidance is always debated. And that's why I said: as long as the hard dollars taken to the bottom line are visible and you can put your finger on them, people become more comfortable, saying, okay, I've got my payback, I can make sure my cost line is not going up. Because it's very easy to claim all these soft benefits while your cost has also gone up. So once the hard dollars that you can bank are out of the way, then you can talk about costs avoided, and then about the softer benefits. Those are there, there is no doubt, because what we do is tell people: if you don't believe a bot is adding value, we'll shut it down. And they say, hey, wait, no, we need it. >>So you have four years of data on this, so you can prove it. And by the way, soft dollars are where the real money is.
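The hard-dollar conversion Rajesh describes, hours removed from the system times the share actually banked, can be sketched as below. The loaded hourly cost used here is a hypothetical assumption for illustration; no rate was quoted in the interview.

```python
# Back-of-the-envelope model of the "hours saved" math from the interview.
# LOADED_HOURLY_COST is an assumed illustrative figure, not one the speaker quoted.

HOURS_SAVED_PER_ANNUM = 342_000   # figure cited in the interview
BANKED_SHARE = 0.70               # portion taken to the bottom line
LOADED_HOURLY_COST = 25.0         # assumed fully loaded cost per hour, USD

def hard_dollar_savings(hours: float, banked_share: float, hourly_cost: float) -> float:
    """Annual hard-dollar savings actually taken to the bottom line."""
    return hours * banked_share * hourly_cost

savings = hard_dollar_savings(HOURS_SAVED_PER_ANNUM, BANKED_SHARE, LOADED_HOURLY_COST)
print(f"Banked savings: ${savings:,.0f} per annum")  # roughly $5,985,000 at these assumptions
```

The remaining 30% of hours, redeployed to higher-value work, would show up as the "soft dollars" debated next, not on this line.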
I don't mean to denigrate that, but I get into a lot of discussions with CFOs along the lines of: show me the hard dollars first, and then the soft dollars, which are telephone numbers. >>Yeah, I look at it as an inverted pyramid, where you start with the cost saved, which is the smaller part of the pyramid. Then you get speed, and speed is actually a big thing that is very difficult to measure. I'll give you an example. In one of our largest markets, in the middle of COVID, they announced that all products being imported, which for us is about 80,000 of them, needed a whole bunch of compliance forms and import certifications on the government portal, and you got about a month to do all that work. Normally you would get an army of 20 or 30 people and train them. We did none of that: we built the bots, and we were ready ahead of the competition. And life continues. Now the supply chain officer will sign on the dotted line saying he would have had to hire 30 people, and it's not easy to hire suddenly. We were compliant, and that's cost avoided, but I would call it a big business benefit, because we were the first to have all our products compliant with the market requirements. >>That's a great example. >>I think about some of the IDC data. Did you see what was presented this morning, the positive outlook on RPA being a jobs creator over time? Talk to me a little bit about how you've navigated that through the organization, and even upskilled some of those folks so that they're not losing but gaining. >>I think you have to take all these projections with a pinch of salt. Saying the world will save $150 billion and so on: yes, if you add in all the soft dollars. But in reality, I always joke about it:
if you take all the technology initiatives in a company and add up all the NPVs that have been submitted for them, the total would be larger than the market cap of the company. >>It's true. All the projects add up to more value than exists. >>So we don't get carried away by these big projections, but some of it is true. I sometimes talk about the Luddites. When the first weaving machines came in, in Northern England near Manchester, the Luddites, as they were called, went around breaking those machines because they were supposed to take away jobs. The reality is that a lot of people did lose jobs, the ones who could not make the transition, who could not retrain themselves. It is inevitable; it will happen. But over time there has been a lot more employment, so the two go hand in hand. Still, the more one can help retrain people, the better: hey, you don't need to spend the rest of your life copy-pasting and doing data entry; you can look at the data and make sense out of it. >>How much of that was part of your strategic vision years ago? >>Years ago we knew it, but it was more basic: when you have hundreds of people in a back office, how do I get them to do more work, or meet my productivity goals? It starts with that. Deep down, because I believe in technology, I knew we would eventually go from robotic process automation to intelligent process automation, which is coming, and we are able to see it. But if you try to sell that as the lead-in, people shut down.
>>What do you mean by intelligent process automation? >>So look, I've got my robots and the RPA infrastructure processing a whole bunch of transactions. Now, if I'm able to add some machine learning or AI on top of that, I can read the patterns. For example, on top of all the various security in our payment systems, we now have a bot that does a final check: it goes and checks the payment history of that particular vendor, what the typical payments to that vendor look like, and if the new payment is way out, it flags it and stops the payment. Or it goes and runs a whole bunch of other tests. We are constantly building tools like that, which is a bit more intelligent than simple copy-paste or transaction processing. >>Is the resistance because it's their job, or because it's a black box and they don't know how the decision is made? >>I think a lot of similar technologies have been sold before as the next best thing since sliced bread, and people have lost faith. So you've got to show them the money and then take them along the journey. If you go too fast and try to sell the whole vision at once, people are smart enough, and it turns them off. >>It's one of the failures of the tech industry, the broken promises. I can rattle many off. >>It's a cultural shift. How did you help facilitate that? >>See, we took both a bottom-up and a top-down approach. The top-down part: I took my whole leadership team and, as a joke, we locked them in the boardroom and got them to build bots, long ago. We said, let each of you download your bank statement and send yourself an email for, say, any transaction above 10,000, or whatever. As simple as that. Or download the electricity bill and send it to your wife, something like that.
And half of them were able to build a bot in that couple of hours. The other half looked on; obviously many of them are not as tech savvy. But it created that aha moment three years ago: wow, I can build a bot. For some people it had been as if metallic robots were going to walk into the room. >>I love it. Who's responsible for governance of the bots? >>We've got a team across IT and finance. Somehow I have created the skunkworks team, so the center of excellence sits with me. But overall it's a combination, and they now run governance 24/7. >>Sorry, I've got to get my crypto question in; I ask every CFO: when are you going to put crypto on the balance sheet? I'm teasing, but do you see companies doing this? Has it ever come up in conversation, or is it a tongue-in-cheek joke? What do you make of crypto? >>Personally, I'm a big believer, but not for a company. For a company, we have enough other things to chase. I think it's a bit further out for a company to start taking a balance sheet position, because then it's speculation. But I'm a believer in the benefits of blockchain technology. We actually did a blockchain experiment a couple of years ago, moving goods from China to Dubai and also making the payments through a blockchain. So we see huge benefits, and we are working with our bankers on certain other initiatives. But on the balance sheet, it sounds like speculation and a questionable use of capital. If it brings efficiency, if it brings transparency, which is what blockchains do, then absolutely, it is here to stay.
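The payment-check bot Rajesh described earlier, one that compares a new payment against a vendor's payment history and holds anything way out of line, can be sketched roughly as below. The z-score rule, threshold, and function names are illustrative assumptions, not Landmark's actual implementation, which layers ML on top of existing payment controls.

```python
from statistics import mean, stdev

def flag_payment(amount: float, history: list[float], z_threshold: float = 3.0) -> bool:
    """Return True (hold the payment for review) when `amount` deviates from the
    vendor's typical payments by more than `z_threshold` standard deviations.
    A crude z-score test standing in for whatever model the real bot uses."""
    if len(history) < 2:
        return True  # not enough history: route to a human for review
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

# Hypothetical vendor with invoices typically around 10,000
history = [9800.0, 10200.0, 10050.0, 9900.0, 10150.0]
print(flag_payment(10100.0, history))   # in line with history: pass
print(flag_payment(95000.0, history))   # way out: hold and review
```

The point of the "intelligent" layer is exactly this shape: the robot that moves the transaction stays the same, and a statistical check sits in front of the final step.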
>>Last question, in the last 30 seconds or so. We saw some of the stats yesterday on the percentage of automatable processes that aren't yet automated. What's your advice or recommendation to peers in any industry about pulling automation into their digital transformation strategy? >>I think digital transformation can be hugely aided and accelerated if you put RPA in first, because that is the layer which sits between the humans and whatever technology is out there, or whatever you keep buying. New technologies will keep coming up in every area, so it's better to put RPA in first; you can then get more benefit from whatever other technologies you bolt on. So I would say it's a predecessor to your broader digital transformation rather than just a part of it. >>Got it, a predecessor. Rajesh, thank you for joining Dave and me on the program today, talking about how you're transforming Landmark. Good luck in your presentation this afternoon; I'm sure a lot of folks will get some great takeaways from your talk. >>Thank you so much. It's been great. >>Our pleasure. For Dave Vellante, I'm Lisa Martin, live in Las Vegas at UiPath FORWARD IV. We'll be right back after a break.
Arti Garg & Sorin Cheran, HPE | HPE Discover 2020
>> Male Voice: From around the globe, it's theCUBE, covering the HPE Discover Virtual Experience, brought to you by HPE. >> Hi everybody, you're watching theCUBE, and this is Dave Vellante in our continuous coverage of the HPE Discover 2020 Virtual Experience, HPE's virtual event. theCUBE is here, theCUBE virtual. We're really excited; we've got a great session here, where we're going to dig deep into machine intelligence and artificial intelligence. Dr. Arti Garg is here. She's the Head of Advanced AI Solutions and Technologies at Hewlett Packard Enterprise. And she's joined by Dr. Sorin Cheran, who is the Vice President of the AI Strategy and Solutions Group at HPE. Folks, great to see you. Welcome to theCUBE. >> Hi. >> Hi, nice to meet you, hello! >> Dr. Cheran, let's start with you. Maybe talk a little bit about your role; you've had a variety of roles, and what's your current situation at HPE? >> Hello! So currently at HPE I'm driving the Artificial Intelligence Strategy and Solutions group, which is looking at how we bring AI solutions across the HPE portfolio, across every business unit and the various geos. At the same time, the team is responsible for building the AI strategy for the entire company. We work closely with the field, with the people who are facing customers every day, and also very closely with the various product groups, to make sure that whatever we build holds water for the entire company. >> Dr. Garg, maybe you could share with us your focus these days?
>> Yeah, sure. So I'm also part of the AI Strategy and Solutions team, under Sorin as our new vice president, and what I'm focused on is really trying to understand what some of the emerging technologies are, whether those be new processor architectures or advanced software technologies, that could enhance what we can offer to our customers in terms of AI, and exploring what makes sense: how do we bring them to our customers, and what are the right ways to package them into solutions? >> So everybody's talking about how digital transformation has been accelerated. If you're not digital, you can't transact business; AI is being infused into every application. And now people are realizing, "Hey, we can't solve all the world's problems with labor." What are you seeing in terms of AI being accelerated throughout the portfolio and with your customers? >> That's a very good question, because we've been talking about digital transformation for some time now, and I believe most of our customers initially thought the one thing they had was time: "Somehow, at some point, I'm going to apply AI, and somehow at some point I'm going to figure out how to build the data strategy, or how to use AI in my different lines of business." What happened with COVID-19 is that we lost that one thing: time. So what we see in our customers is the idea of accelerating their data strategy, moving beyond, let's say, purely data-center-centric models, trying to understand how they capture data and how they accelerate the adoption of AI within the various business units. Why? Because they understand that the way they do business has changed completely. They need to adopt new business models, and they need to look for value pools where there were none before.
So most of our customers today, while initially they spent a lot of time in never-ending POCs trying to investigate where they wanted to go, now want to accelerate the application of AI models and the building of data strategies: how do they use all of this data, and how do they capture it to make sure they can pursue new business models, new value pools, new customer experiences, and so on? So what we've seen in the past three to six months is that we lost time, but the shift towards adoption of analytics, AI, and data strategy accelerated a lot, simply because customers realize they need to get ahead of the game. >> So Dr. Garg, could you talk about how HPE is utilizing machine intelligence during this pandemic, maybe helping some of your customers get ahead of it, or at least trying to track it? How are you applying AI in this context? >> So I think Sorin sort of spoke to one of the things about adopting AI: it's very transformational for a business, it changes how you do things, and you need to adopt new processes to take advantage of it. Right now we're hearing from customers who recognize that the context in which they're doing their work is completely different, and they're exploring how AI can help them meet the challenges of that context. One example might be how AI and computer vision can be coupled in a way that makes it easier to reopen stores, or ensures that people are distancing appropriately in factories. So I would say it's the beginning of these conversations, as businesses try to figure out how to operate in the new reality that we have. And I think it's a pretty exciting time.
And just to the point that Sorin made, there's a lot of openness to new technologies that there wasn't before, because there's a willingness to change business processes to really take advantage of them. >> So Dr. Cheran, I probably should have started here, but help us understand HPE's overall strategy with regard to AI. I certainly know that you're using AI to improve IT, the InfoSight product and capability via the Nimble acquisition, et cetera, and bringing that across the portfolio. But what's the strategy for HPE? >> Yeah, thank you, that's (laughs) a good question. You started with a couple of our past acquisitions, obviously Nimble, and we've talked a lot about our efforts to bring InfoSight across the portfolio. But over the past year or so we've announced a number of other acquisitions: we've been talking about Tuteybens, about Scytale, about Cray, and so on. What we're doing now at HPE is bringing all of this IP together into one place to help our customers on their AI journey. Look at what we actually got with Cray, for example: not only the systems, but also a lot of software and a lot of IP around optimization. Within our own labs, we've been investigating AI as well, for example swarm learning, accelerators, and a lot of other activity. So right now what we're trying to help our customers with is how they move from the POC stage to the production stage; what we are trying to do is accelerate their adoption of AI.
That means everything from an optimized platform and infrastructure up to the solution they're actually going to use to solve their business problems, wrapping all of that with services, whether consumed on-prem or as a service. So practically, what we want to do is help our customers optimize, orchestrate, and operationalize AI. Because the problem for our customers is not starting a POC; the problem is: how do I take everything I've been developing and put it in production, at the edge if need be, and keep it in production, in order to get insights and then take actions that help the enterprise? Basically, we want to help our customers be data-driven, edge-centric, and cloud-enabled, and to move from POC into production. >> You work with a lot of data-driven companies and hands-on data scientists in this regard. One of the challenges I hear a lot from customers is that they're trying to operationalize AI, to put AI into production, but they have data in silos and spend all their time munging data. You guys have made a number of acquisitions, not the least of which is Cray, and obviously MapR, the data specialist, and my friend Kumar's company BlueData. So what do you see as HPE's role in helping companies operationalize AI?
So whether that be different types of servers or different types of storage, we have the ability to bring all of that together. And then we also have the software that allows you to talk to all of these different components and build applications that can be deployed in the real world in a way that's easy to maintain, and to scale and grow as your AI applications almost invariably get more complex, involve more outputs and involve more inputs. So one of the important things as customers try to operationalize AI is knowing that it's not just solving the problem you're currently solving. It's not just operationalizing the solution you have today; it's ensuring that you can continue to operationalize new things or additional capabilities in the future. >> I want to talk a little bit about AI for good. We talk about AI taking away jobs, but the reality is, when you look at the productivity data, for instance, in the United States and in Europe, it's declining, and it has been for the last several decades. And so I guess my point is that we're not going to be able to solve some of the world's problems in the coming decades without machine intelligence. I mean, you think about health care, you think about feeding populations, you think about, obviously, facing things like pandemics, climate change, energy alternatives, et cetera. Productivity is coming down; machines are a potential opportunity. So there's an automation imperative. Do you feel, Dr. Cheran, that people are sort of beyond that machines-replacing-humans issue? Is that still an item, or has the pandemic sort of changed that? >> So I believe it is. It used to be a very big item, you're right. Every time we were speaking at a conference, and every time you're actually looking at the futures of AI, right? Two scenarios come into play, right?
The first one, where machines are here to actually take our work, and then the second one, as you know, an even darker version where the Terminator is coming, and so on and so forth, right? So basically these are the two: the lesser evil and the greater evil, and so on and so forth. And we still see that recurring theme coming up over and over again. And I believe that 2019 was the year of reckoning, where people started to realize that not only can we actually practice responsible AI, but we can actually create AI that is trustworthy, AI that is fair, and so on and so forth. And we also understood in 2019, it was highly debated everywhere, which parts of our jobs are going to be replaced, like the parts that are mundane, or that can actually be easily automated, and so on and so forth. With COVID-19, what happened is that people started to look at AI differently. Why? Because people started to look at data differently. And looking at data differently: how do I actually create this core of data which is trusted, secure, and so on and so forth? And they are trying to understand that if the data is trusted and secure, somehow the AI will be trusted and secure as well. Now, if I actually shift forward, as you said, and try to understand, for example on the manufacturing floor, how do I add more machines, or how do I replace humans with machines, simply because I need to make sure that I am able to stay in production, and so on and so forth. From that perspective, I don't believe that the way people are looking at AI from the job-marketplace perspective changed a lot.
The view that actually changed is how AI is helping us better certain processes, how AI is helping us, for example, in health care. But the idea of AI actually taking part of the jobs, or automating parts of the jobs, we are not actually past that yet. Even if 2018, and even more so 2019, was the year where AI through automation replaced a number of jobs, at the same time, as I was saying, it was the first year where AI created more jobs, because once you're displacing in one place, you're actually creating more work, more opportunities, in other places as well. But still, I don't believe the feeling changed. What we did realize is that AI is a lot more valuable, and it can actually help us through some of our darkest hours, but also allow us to get better and faster insights as well. >> Well, machines have always replaced humans, and now, for the first time in history, they're doing so in really cognitive functions in a big way. But I want to ask you guys, and I'll start with Dr. Arti, a series of questions that I think underscore the impact of AI and the central role that it plays in companies' digital transformations; we talk about that a lot. But the questions that I'm going to ask, I think, will hit home just in terms of some hardcore examples, and if you have others, I'd love to hear them. But I'm going to start with Arti. So, Doctor, when do you think machines will be able to make better diagnoses than doctors? Or are we actually there today already? >> So I think it depends a little bit on how you define that. And I'm just going to preface this by saying both of my parents are physicians, so I have a little bit of bias in this space. But I think that humans can bring creativity and a certain type of intelligence that it's not clear to me we even know how to model with a computer. And so diagnoses sometimes have two components. One is recognizing patterns and being able to say, "I'm going to diagnose this disease that I've seen before."
I think that we are getting to the place, and there are certain examples, it's just starting to happen, where the data that you need to make a diagnosis is well understood, and a machine may be able to recognize those subtle patterns better. But there's another component of doing diagnosis: when it's not obvious what you're looking for, you're trying to figure out the actual set of diseases you might be looking at. And I think that's where we don't really know how to model the type of inspiration and creativity that humans still bring to the things that they do, including medical diagnoses. >> So, Dr. Cheran, my next question is, when do you think that owning and driving your own vehicle will become largely obsolete? >> (laughs) Well, my son is six years old now, and I'm working with a lot of companies to make sure that he will not need to get his driving license when he comes of age, right? So it depends who you're asking, and it depends on the level of autonomy that you're looking at; you probably mean level five. There are a lot of dates out there; some people actually say 2030. I believe that for my son, in most of the cities in the US but also most of the cities in Europe, by the time he's 18, let's say in 2035, I'll try to make sure that I'm working with the right companies so that he won't have to get a driving license. >> All right, my next question, and maybe both of you can answer: do you think the traditional banks will lose control of the payment system? >> So that's an interesting question, because I think it's broader than an AI question, right? I think that it goes into some other emerging technologies, including distributed ledgers and sort of the more secure forms of blockchain. I think that's a challenging question to my mind, because it's bigger than the technology. It's got economic and policy implications that I'm not sure I can answer.
>> Well, that's a great answer, 'cause I agree with you, Arti. I think that governments and banks have a partnership, and it's an important partnership for social stability. But similarly, we've seen now, Dr. Cheran, in retail, obviously, that COVID-19 has affected retail in a major way, especially physical retail. Do you think that large retail stores are going to go away? I mean, we've seen many in Chapter 11 at this point. How much of that is machine intelligence versus just social change versus digital transformation? It's an interesting question, isn't it? >> So I think most of the... Right now the retailers are here to stay, I guess, for the next couple of years. But moving forward, I think it's their capacity to adapt: to walk-in stores where basically you just go in and there are no shop assistants, and you don't even need a credit card to pay, because you're able to pay either with your face, or with your phone, or with small chips, and so on and so forth. So I believe that in the next couple of years, obviously, they are here to stay. Moving forward, we'll get artificial intelligence or robotics applied everywhere in the store, and so on and so forth. Most likely their capacity to adapt to the new normal, which means placing AI everywhere and optimizing the walk-in through predicting when and how to guide the customers to the shop, and so on and so forth, will allow them to actually survive. I don't believe that everything is actually going to be done online, especially from the retailer perspective. We've seen a big shift with COVID-19, but, as I was reading the other day, especially in France, now that the country has opened again, we've seen a very quick pickup in people actually visiting the stores as well.
So it's going to be a very interesting five to 10 years, and most of the companies that have adapted to the digital transformation and to the new normal, I think they are here to stay. Some of them, obviously, are going to take some time. >> I mean, I think there's an interesting question, too, that you're really sort of triggering in my mind: when you think about the framework for how companies are going to come back and come out of this, it's not just digital. That's a big piece of it, like how digital a business is, but also, can they physically distance? I mean, I don't know how sports arenas are going to be able to physically distance; that's going to be interesting to see. And how essential is the business? If you think about the different industries, it really is quite different across those industries. And obviously, digital plays a big factor there. But maybe we could end on that: your final thoughts, and maybe any other things you'd like to share with our audience? >> So I think one of the things that's interesting anytime you talk about adopting a new technology: right now we happen to be seeing this sort of huge uptick in AI adoption happening right at the same time as this sort of massive shift in how we live our lives, and sort of an acceptance, I think, that we can't just go back to the way things were. As you mentioned, there'll probably be a continued sort of desire to maintain social distancing. I think that it's going to force us to rethink, a lot, why we do things the way we do now. The retail environments that we have, the transportation solutions that we have: they were adopted in many cases in a very different context, in terms of what people needed to do on a day-to-day basis within their lives, and what the state of available technologies was.
We're sort of being thrust into, and forced to reckon with, what is it I really need to do to live my life, and then what are the technologies I have available to answer that. And I think it's really difficult to predict right now what people will think is important about a retail experience. I wouldn't be surprised if you start to find in-person retail actually become much less technologically aided, and much more about having the ability to talk to a human being and get their opinion, and maybe the tactile sense of being able to touch new clothes, or whatever it is. And so it's really difficult, I think, right now to predict what things are going to look like maybe even a year or two from now from that perspective. What I feel fairly confident about is that people are really starting to understand and engage with new technologies, and they're going to be really open to thinking about what those new technologies enable them to do in this sort of new way of living that we're going to probably be entering pretty soon. >> Excellent! All right, Sorin, bring us home. We'll give you the last word on this topic. >> So, I wanted to... I agree with Arti, because what these three months of staying at home and of business shutting down allowed us to do was to actually have a very big reset. Let's say a great reset: basically, we realized that all the things we'd taken for granted, like our freedom of movement, our technology, our interactions with each other... suddenly we realized that everything needs to change. And the one thing that we actually kept doing is interacting with each other remotely, interacting with our peers in the house, and so on and so forth.
But the one thing that stayed was generating data, and data was here to stay, because we actually leave traces of data everywhere we go: we leave traces of data when we put our watch on, when we are actually playing with our phone, or when we consume digital content, and so on and so forth. So what these three months reinforced, for me personally but also for some of our customers, was that the data is here to stay. And even if the world shut down for three months, we did not generate less data. Data was there; on the contrary, in some cases, more data. So the data is the main enabler for the new normal, which is going to pick up, and the data will actually allow us to understand how to increase customer experience in the new normal, most likely using AI. As I was saying at the beginning: how do I actually operate new business models? How do I find who to partner with? How do I actually go to market together? How do I make collaborations more secure, and so on and so forth? And finally, where do I actually find new value pools? For example, how do I actually still enjoy having a beer in a pub, right? Because suddenly, during COVID-19, that wasn't possible. I have a very nice place around the corner, but it was closed. And I'm not talking about beer; in general, I mean, the pools of data, the pools from which you're actually getting value, are different as well. So data is here to stay, and AI definitely is going to be accelerated, because it needs to use data to allow us to adapt to the new normal in the digital transformation. >> A lot of unknowns, but certainly machines and data are going to play a big role in the coming decade. I want to thank Dr. Arti Garg and Dr. Sorin Cheran for coming on theCUBE. It's great to have you. Thank you for a wonderful conversation. Really appreciate it. >> Thank you very much. >> Thanks so much. >> All right. And thank you for watching everybody.
This is Dave Vellante for theCUBE and the HPE 2020 Virtual Experience. We'll be right back right after this short break. (upbeat music)
Limor Fried, Adafruit, Saloni Garg, LNM Institute, & DeLisa Alexander, Red Hat | Red Hat Summit 2019
>> Announcer: Live from Boston, Massachusetts, it's theCUBE covering Red Hat Summit 2019. Brought to you by Red Hat. >> Welcome back to our coverage here on theCUBE of Red Hat Summit 2019. We're live in Boston right now, and I'm joined by a couple of award-winning professionals. And we're looking forward to hearing what their story is, because it's fascinating on both fronts. And also by DeLisa Alexander, who has a great job title at Red Hat: Chief People Officer. I love that title. DeLisa, thanks for joining us. >> Thanks for having us. >> Also with us, Limor Fried, who is the founder and lead engineer of Adafruit, and Saloni Garg, who is an undergrad student, a third-year student, at the LNM Institute of Technology. And that's in Jaipur, India. So Saloni, glad to have you with us. And Limor, a pleasure as well. >> Thank you. >> And you're all lit up. You've got things going on there, right? >> I'm glowing, we're gonna get all into that. >> We'll get into that later. First, let's talk about the awards; there are two Women in Open Source winners this year. On the community side, Limor won; on the academic side, Saloni won. So talk about the awards if you would, DeLisa: the process, and really what you're trying to do with recognizing these kinds of achievements. >> Well, this is our fifth year for the Women in Open Source Award. So after this period of time, I can tell you what we wanna do is make an impact by really fostering more diverse communities, particularly gender-diverse in open-source. And so that's the whole goal. Five years into it, what we've discovered is that when you really focus on diversity and inclusion within a community, you actually can make an impact. And the thing that's so exciting this year is that our award winners are really evidence of that. >> So talk about the two categories then, if you would, please. You have community on one side, academics on the other.
It appears to be pretty clear cut what you're hoping to achieve there, by recognizing an active contributor and then somebody who is in the wings and waiting for their moment. But go ahead and fill in a little bit about, >> Yeah, absolutely. >> Limor and Saloni too, about why they are here. >> Limor: Why am I here? >> Yes, well, really what we're trying to do is create role models for women and girls who would like to participate in technology but perhaps are not sure that that's the way that they can go. And they don't see people that are like them, so there's less of a tendency to join into this type of community. So with the community award winner, we're looking at the professional who's been contributing to open-source for a period of time. And with our academic winner, we're looking to encourage more people who are in university to think about it. And, of course, the big idea is you'll all be looking at these women as people that will inspire you to potentially do more things with open-source and more things with technology. We've been hearing for many, many years that we definitely need to have more gender diversity in tech in general and in open-source. And Red Hat is kind of uniquely situated to focus on the open-source community, and so with our role as the open-source leader, we really feel like we need to make that commitment and to be able to foster that. >> Well, it makes perfect sense. Obviously. Great perfect sense. Saloni, if you would, let's talk first about your work. You've been involved in open-source for quite some time. I know you have a lot of really interesting projects that you're working on right now. We'll get to that in a bit, but just talk about, I guess, the attraction for you in terms of open-source, and really kind of where that came from originally, through your interest in STEM education. >> Okay, so when I first came to college, I was really influenced to contribute to open-source by my seniors.
They had already been selected in programs like Google Summer of Code and Outreachy, so they actually felt empowered by open-source. So they encouraged me to join it too. I tried open-source, and I feel really, like, I'm a part of something bigger than myself. And I was helped greatly by my seniors, so I feel it's my duty to give it back to my juniors and to help them when they need it, so that they can do wonders, yeah. >> Great. And Limor, for you, I know you founded the company. 100% female owned. You've got-- >> Yeah, 100% me. >> Yeah, right. 100% you. >> It's my fault. >> Right. Well, I wasn't going to blame you. I'll credit you instead. >> Yeah, that's our big thing. We wanna change blame to credit. >> Right. It's all about credit. >> More positive. >> So 100 employees? Is that right? >> 100, 150, yep. >> Okay, talk a little bit about kind of the origin, the genesis of the company, and where that came from, and then your connection on the open-source side. >> Well, I, yeah, so I grew up actually in Boston. So I've lived here a very long time. >> You said like a block from here. Two blocks. >> I used to live, actually, yes, near South Station. I used to live by the Griffin Book line, and so Boston has a very strong open-source community, you know. The FSF is here. And, yeah, that's kind of the origins of a lot of this free software and open-source software community. And when I went to school, I ended up going to MIT, and open-source software and open-source technology is kind of part of, like, the genetics there. There isn't really this thinking about whether you would do it; it's kind of the default. People write code, you open-source it, you release it. There's a culture of collaboration: scientists, engineers, students, researchers, all working together and sharing code. And when I was in school, I had to do a thesis. I really didn't wanna do it, and so instead, I started building, like, MP3 players and video games.
Taking all the engineering that I was studying and, like, not doing the work I was supposed to be doing. But instead, I was having fun and building cool electronic parts, and I would publish these projects online. I had, like, a Media Lab web page, and I would publish, you know, here's all the chips and the schematics and the layout. And people sort of started coming up with the idea of open-source hardware: let's take the philosophy of open-source software, where we release the source code, but here, you release CAD files, firmware, layouts, 3D models. And so I did that, and I was publishing here's how you make this, like, Lite-Brite toy for Burning Man, or an MP3 player, or a cell phone jammer. All these fun projects, and people would end up contacting me and saying, hey, these are really cool projects. I would like to build this project myself, but unlike software, where you just, like, type in, like, make and configure and compile and all that, you actually have to buy parts; you have to get these physical things. And so they said, you know, could you sell me a kit, like a box, where we'd get it and take it home and be able to build it? And I was totally like, no, I'm busy. I have to, like, not write this thesis. >> That's not what I do. >> But eventually, I did write the thesis. And then I was really stuck because I'm like, now what do I do? So I ended up selling kits. So I sold the synthesizer kits and such, and I did an art fellowship and stuff. And then, eventually, you kind of fall into business by accident, because if you knew what you were getting into, you wouldn't do it, in my opinion. So I ended up sort of developing that, and that was 13 years ago. And now we have 4,000 products in the store, you know. >> 4,000 products? >> Yeah, I know. Ridiculous, right? That's a lot. >> Yeah, who's doing that inventory, right?
>> Well, we have a pretty intense inventory system that I'd love to talk to you about, but it's kind of boring. >> I'll bet you do. Now, I was reading something about a Circuit Playground Express. >> Yes. >> Is that right? So is that what this is all about is-- >> Yes! I knew you'd ask, and that's why I wore this. And it's open-source: open-source hardware, open-source software and firmware. And we had a lot of parents and teachers and educators and camp counselors come to us and say, we wanna teach physical computing. We wanna teach coding, but with physical hardware, because, you know, we all, all the tier coders, right? No, I don't know. But, eventually, you're like, I'm typing on the screen, and you want to take that and you wanna make it physical. You wanna bring it out into the world, where there's a wearable or a cosplay or assistive technology, or you wanna make video games that are, like, physical video games. And the problem that teachers had was the classrooms; a lot of these classrooms, they don't have a lot of money. So they said it has to be very low-cost. It has to be durable, because these kids are, like, chewing on it and stuff, which is fun. And it also has to work on any computer, even extremely old computers, 'cause a lot of these schools, they only have a budget every seven years to buy laptops. And so this actually becomes a very difficult technological problem: how do you design something that's $20 but can teach physical computing to anybody? From kids who are not even good at typing, all the way to college students who wanna implement fast Fourier transforms. And so we designed this hardware. It's open-source, and it's cool 'cause people are, like, remixing it and making improvements to it. It's the open-source Circuit Playground, and I'm wearing it. And it's glowing, and I don't know. It's fun! It's got LEDs and sensors.
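To give a flavor of the "physical computing" code classrooms run on boards like this: rainbow LED animations are commonly built on a small "color wheel" helper that maps a byte position to an RGB color. The sketch below shows that helper in plain Python as an illustration; actually driving the board's NeoPixels would go through CircuitPython's hardware modules, which are left out here so the function stays hardware-independent, and this is not code taken from any official Adafruit guide.

```python
def color_wheel(pos):
    """Map a position in 0-255 to an (r, g, b) tuple.

    As `pos` sweeps 0..255 the color fades red -> green -> blue -> red,
    which is what gives LED "rainbow" demos their cycling effect.
    """
    pos %= 256
    if pos < 85:
        # red falling, green rising
        return (255 - pos * 3, pos * 3, 0)
    if pos < 170:
        # green falling, blue rising
        pos -= 85
        return (0, 255 - pos * 3, pos * 3)
    # blue falling, red rising
    pos -= 170
    return (pos * 3, 0, 255 - pos * 3)
```

Animating is then just a loop that calls `color_wheel(i + offset)` for each pixel index and advances the offset each frame.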
And you can just alligator clip to it and make projects, and we've got schools from around the world learning how to code. And I think it's a much more fun experience than just typing at a computer. >> Absolutely. Yeah, Saloni, on your side of the fence, you're obviously in your education years, if you will; not that we ever stop learning, but formally right now. And among the many projects that you've been involved with, you're involved in a smart vehicle. >> Yeah, I'm working on it. >> Project, right? So tell us a little bit about that, and how open-source has come into play with what you're looking at in terms of, I assume, traffic and congestion and flows and those kinds of things.
And we, authorities we'll be reporting to whenever there's a signal or law breakage is that the traffic police department will be notifying them in case of any signal breakage. >> So if there's an uptick in speeding or red light running in Jaipur, we know who to blame. >> Yeah. >> Right? >> Shouldn't have run a report. >> It's, Solani, why'd you do that to them, right? All right, ladies, if you would. And I'm gonna end with DeLisa, but I'd like to hear your thoughts about each other. Just about, as you look at the role of women in tech and the diversity that Red Hat is trying to encourage, Limor, what have you seen in Solani here over the last day, day and a half, that maybe you think will leave a lasting impression on you? >> I love Solani's energy and her passion, and I can just, she's has this emanated strength. I can just tell that nothing stops her from achieving what she wants. Like, she wants to, like, do this Raspberry Pi traffic camera. She's just gonna figure out what it takes to solve that problem. She's gonna use open-source software, hardware, whatever it takes. And she's just gonna achieve her goal. I totally sense that from her from the last few days we've been together. >> That's great. >> Thank you. >> Yeah! >> All right. Solani, your turn. For Limor. >> What I have done is just a fraction of what she has been doing. She's, like, inspiration. I look up to her, and I, also, I mean, I hope I start my own company someday. And she's really a role model and an inspiration for me. So yeah. >> Yeah, I think you've got a pretty good mentor there in that respect. And then, DeLisa, when you see young ladies like this who are, you know, their achievements are so impressive in their respects. What does that say to you about Red Hat, the direction of the program, and then the impact on young women that you're having? >> Well, the program has gotten so much more participation. So many people, 8,000 people actually voted to select our winners. 
And all of our finalists were so impressive. We have major contributors to open-source, and so, along with our finalists, our winners are people who are just role models. And I am just so impressed with them, and I think that every year, we're learning something different from each of the winners. And so, as they round down into a community, the things that they'll be able to mentor people on will just be exponentially increasing. And so it's really exciting. >> Fantastic. Well, thank you all. The three of you, the ladies. Congratulations on your recognition, your accomplishments. Well done. Safe travels back to New York and back to India as well, and I would look forward to hearing more about what you're up to down the road. I think this is not the last we're gonna hear from the two of you. >> Thank you for having us. >> And thank you for calling me a young lady. >> Absolutely. I mean, look at the source. Open-source, you might say. That was awful. All right, back with more Red Hat Summit 2019. We're live here on theCUBE in Boston. (gentle music)
SUMMARY :
Brought to you by Red Hat. In this segment from Red Hat Summit 2019 in Boston, the winners of Red Hat's Women in Open Source Award, now in its fifth year, join theCUBE along with Red Hat's DeLisa Alexander, who oversees the program. Saloni Garg, a third-year student at the LNM Institute of Technology in Jaipur, India, describes an open-source traffic-monitoring project built on Raspberry Pis and OpenCV, developed in cooperation with the local traffic police. Limor Fried recounts founding Adafruit, which began with requests to sell kits and has grown to some 4,000 products in its store. Alexander discusses the two award categories and notes that 8,000 people voted to select this year's winners, and the guests close with their impressions of one another as role models and mentors in the open-source community.
SENTIMENT ANALYSIS :
ENTITIES
Entity | Category | Confidence |
---|---|---|
Saloni | PERSON | 0.99+ |
Limor | PERSON | 0.99+ |
DeLisa | PERSON | 0.99+ |
Boston | LOCATION | 0.99+ |
Limor Fried | PERSON | 0.99+ |
New York | LOCATION | 0.99+ |
$20 | QUANTITY | 0.99+ |
India | LOCATION | 0.99+ |
Red Hat | ORGANIZATION | 0.99+ |
Jaipur | LOCATION | 0.99+ |
South Station | LOCATION | 0.99+ |
4,000 products | QUANTITY | 0.99+ |
First | QUANTITY | 0.99+ |
Solani | PERSON | 0.99+ |
Ephesoft | ORGANIZATION | 0.99+ |
8,000 people | QUANTITY | 0.99+ |
Saloni Garg | PERSON | 0.99+ |
100 | QUANTITY | 0.99+ |
100% | QUANTITY | 0.99+ |
two categories | QUANTITY | 0.99+ |
DeLisa Alexander | PERSON | 0.99+ |
fifth year | QUANTITY | 0.99+ |
three | QUANTITY | 0.99+ |
Jaipur, India | LOCATION | 0.99+ |
Boston, Massachusetts | LOCATION | 0.99+ |
100 employees | QUANTITY | 0.99+ |
150 | QUANTITY | 0.99+ |
two | QUANTITY | 0.99+ |
LNM Institute | ORGANIZATION | 0.99+ |
LNM Institute of Technology | ORGANIZATION | 0.99+ |
third year | QUANTITY | 0.99+ |
each | QUANTITY | 0.99+ |
Red Hat Summit 2019 | EVENT | 0.98+ |
both fronts | QUANTITY | 0.98+ |
Two blocks | QUANTITY | 0.98+ |
Five years | QUANTITY | 0.98+ |
13 years ago | DATE | 0.98+ |
first | QUANTITY | 0.98+ |
this year | DATE | 0.98+ |
48 | QUANTITY | 0.96+ |
Griffin Book | ORGANIZATION | 0.96+ |
Wilson | PERSON | 0.96+ |
two women | QUANTITY | 0.96+ |
MediaLab | ORGANIZATION | 0.96+ |
Adafruit | ORGANIZATION | 0.95+ |
Women in Open-Source Award | TITLE | 0.93+ |
ORGANIZATION | 0.92+ | |
MIT | ORGANIZATION | 0.92+ |
every seven years | QUANTITY | 0.91+ |
OpenCV | TITLE | 0.89+ |
Garg | PERSON | 0.88+ |
Adafruit | PERSON | 0.86+ |
one side | QUANTITY | 0.85+ |
Raspberry Pis | COMMERCIAL_ITEM | 0.8+ |
Raspberry Pi | ORGANIZATION | 0.73+ |
couple | QUANTITY | 0.73+ |
day | QUANTITY | 0.72+ |
Saloni | ORGANIZATION | 0.64+ |
winning | QUANTITY | 0.63+ |
theCUBE | ORGANIZATION | 0.6+ |
last | DATE | 0.58+ |
a half | QUANTITY | 0.58+ |
Burning Man | TITLE | 0.56+ |
Code | ORGANIZATION | 0.47+ |
Summer of | TITLE | 0.34+ |
Brite | COMMERCIAL_ITEM | 0.33+ |
Steve Herrod, General Catalyst & Devesh Garg, Arrcus | CUBEConversation, July 2018
>> Welcome to this special CUBE Conversation here in Palo Alto at theCUBE studios. I'm John Furrier, the founder of SiliconANGLE and theCUBE. We're here with Devesh Garg, the founder and CEO of Arrcus Inc., arrcus.com, A-R-R-C-U-S dot com, and Steve Herrod, General Partner at General Catalyst, the VC that's funded him. Congratulations on your launch. These guys launched on Monday: a hot new product, a software OS for networking, powering white boxes and a whole new generation of potentially cloud computing. Welcome to this CUBE Conversation, and congratulations on your launch. >> Thank you, John. >> So today let's talk about this startup. When were you guys founded? Let's get to the specifics: the date you were founded, some of the people on the team, and the funding. >> We were formally incorporated in February of 2016. We really got going in earnest in August of 2016, and have, you know, chosen to stay in stealth. The founding team consists of myself; a gentleman by the name of Keyur Patel, who is our CTO; and a gentleman by the name of Derek Young, who is our chief architect. Our backgrounds are a combination of the semiconductor industry and networking. I spent a lot of time in the semiconductor industry; most recently I was president of EZchip, and we sold that company to Mellanox. Keyur and Derek are networking protocol experts who spent 20-plus years at places like Cisco, and are arguably some of the best protocol guys in the world. So the three of us got together and basically saw an opportunity to bring some of the insights and architectural innovation we had in mind to the market.
>> A pedigree in there, some top talent. >> Absolutely. >> Some of the things that they've done in the past are pretty notable. >> Yeah, if you'd like some high-level numbers: we have 600-plus years of deep networking expertise within the company, our collective team has shipped over 400 products to production, and we have over 200 IETF RFC papers that have been filed by the team, as well as 150-plus patents. So we really can back that up with the pedigree, for sure. We absolutely focused on getting the best talent in the world, because we felt it would be a significant differentiation to be able to start from a clean sheet of paper. Having people who have that expertise allowed us to take a step back and, you know, reimagine what could be possible with an operating system, and gave us the benefit of being able to choose best-in-class approaches.
>> So what was the point where this all came together? What was the guiding vision? Was it that network OSes are going to be cloud-based? Was it going to be more IoT? What were some of the founding principles that really got this going? Because clearly we see a trend where, you know, Intel's been dominating, and we see what NVIDIA is doing competitively, certainly on the GPU side. You're seeing the white box become a trend, Google makes their own stuff, Apple's big on making their own silicon. There's kind of a whole big scale world out there that has got a lot of hardware experience. What was the catalyst for you guys when you founded this? What was the guiding principle? >> Yeah, I would say there were three, John, and you hit on a couple of them in your reference to Intel and NVIDIA and some of the innovation. If I start at the top level, it's the market. The networking market is a large market, and it's also very strategic and foundational in a hyper-connected world. That market is also dominated by a few people; there are essentially three vertically integrated OEMs that dominate it, and when you have that type of dominance it ultimately leads to high prices and muted innovation. So we felt, number one, the market was going through tremendous change, but at the same time it had been tightly controlled by a few people. The other part of it was that there was a tremendous amount of innovation happening at the silicon component level, coming from the semiconductor industry. I was early at Broadcom, very involved, you know, in some of the networking things that happened in the early stages of that company. We saw tremendous amounts of innovation and feature velocity happening at the silicon component level. That in turn led to a lot of system hardware people coming into the market and producing systems based on this wide variety of choices for the silicon. But the missing link was really an operating system that would unleash all that innovation.
>> So Silicon Valley is back. Steve, you know, you're a VC now, but you were the CTO at VMware, one of the companies that actually changed how data centers operate, certainly as a pretext to cloud computing, which we're seeing with microservices and the growth of cloud. Silicon's hot. IT operations is certainly being decimated as we knew it in the past; everything's being automated away, and you need more function now. There's a demand. How do you see it? I mean, you always see things a little early, as a technologist who's now a VC. What got you excited about these guys? What's the bottom line? >> Yeah, maybe two points on that. One, silicon has definitely become interesting again, if you will, in the Silicon Valley area, and I think that's partly because cloud scale and web scale allow these environments where you can afford to put in new hardware and really take advantage of it. I was a semiconductor guy first, too, so it's exciting for me to see that. But, you know, it's kind of a straightforward story: especially in a world of cloud or IoT or everything, networking is, you know, literally the core to all of this working going forward, and the opportunity to rethink it with a new design and a software-first mentality felt kind of perfect right now. And I think Devesh even sells the team a little short, even with all the numbers that are there. Keyur, for instance, this
co-founder, is someone everyone you talk to will call Mister BGP, which is one of the main routing protocols on the internet. So it's just a ridiculously deep team trying to take this on. There have been a few companies trying to do something kind of like this, and, what do they say, the second mouse gets the cheese? I think we've seen some things that didn't work the first time around, and we can really improve on them and have a chance to make a major impact on the networking market.
>> You know, just to go on a tangent here for a second, because as you're talking my brain is firing away: one of the things I've been talking about on theCUBE a lot is ageism. If you look at the movement to the cloud, it's brought the systems mindset back. You look at all the best successes out there right now, and it's almost all old guys and gals, but it's really systems people, people who understand networking and systems, because the cloud is an operating system. You have an operating system for networking, so you're seeing that trend certainly happen. That's awesome. The question I have for you, Devesh, is: what is the difference, what's the impact of this new network OS? Because I'm almost envisioning, if I think through my mind's eye, you've got servers and serverless, certainly big trends we're seeing in cloud. It's one resource pool, one operating system, and that needs to have cohesiveness and connectedness through services. Is this how you guys are thinking about it? How do you think about the network OS? What's different about what you guys are doing with ArcOS versus what's out there today? >> That's a great question, John. In terms of what we've done, it's the third piece of the puzzle, so to speak. When we were talking about our team, I talked a little bit about the market opportunity, and I talked a little bit about the innovation that was happening at the semiconductor and systems level, and said the missing link was on the OS. As I said at the onset, we had the benefit of hiring some of the best people in the world, and what that gave us was the opportunity to look at the twenty-plus years of development that had happened on the operating system side for networking and basically identify those things that really made sense. So we had the benefit of being able to adopt what worked and then augment it with the things that were needed for a modern-day networking infrastructure environment. So we set about producing a product; we call it ArcOS. The characteristics of it that are unique are, first of all, best-in-class protocols. We have minimal dependency on open-source protocols, and the reason for that is that no serious network operator is going to put an open-source networking protocol in the core of their network; they're just not going to risk their business and the efficacy and performance of their network for something like that. So we start with best-in-class protocols, and then we captured them in a very open, modular, microservices-based architecture. That allows us the flexibility and the extensibility to compose it in a manner that's consistent with what the end use case is going to be. So it's designed from the onset to be very scalable and very versatile in terms of where it can be deployed: we can deploy it in a physical environment, we can deploy it via a container, or we can deploy it in the cloud. We're agnostic to all of those use-case scenarios. In addition to that, we knew we had to make it usable. It makes no sense to have best-in-class protocols if our end customers can't use them. So we've adopted OpenConfig YANG-based models, and we have programmable APIs. In any environment, people can leverage their existing tools and their existing applications, and they can relatively easily and efficiently integrate ArcOS into their networking environment. Similarly, we did the same thing on the hardware side. We have something that we call DPAL; it's a data plane adaptation layer, an intelligent HAL, and what it allows us to do is be hardware agnostic. We're indifferent to what the underlying hardware is, and what we want is to take advantage of the advancements at the silicon component level as well as at the system level, and be able to deploy ArcOS anywhere.
>> Let's take a step back. So the protocols, that's awesome. What's the value proposition for ArcOS, and who's the target audience? You mentioned data centers in the past. Is it data center operators? Is it developers? Is it service providers? Who is your target customer? >> So the piece of the puzzle that wraps everything together is that we wanted to do it at massive scale. We have the ability to support internet scale, with deep routing capabilities, within ArcOS. As a byproduct of that, and all the other things we've done architecturally, we're the world's first operating system that's been ported to the high-end Broadcom StrataDNX family; that product is called Jericho+ in the marketplace. As a byproduct of that, we can ingest a full internet routing table, and as a byproduct of that, we can be used in the highest-end applications for network operators. So performance is a key value prop: performance as measured by internet scale, as measured by convergence times, as measured by the amount of control, visibility, and access that we provide. And by virtue of being able to solve that high-end problem, it's very easy for us to come down-market. So in terms of your specific question about the use cases: we have active discussions in data-center-centric applications for the leaf and spine, we have active discussions for edge applications, and we have active discussions going on for cloud-centric applications. Arrcus can be used anywhere. >> Who's the buyer? >> The network operators. >> So you can look at a variety of personas.
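The DPAL layer described above is a hardware abstraction layer: OS-level code programs forwarding state through one interface, and per-silicon drivers translate it. A minimal sketch of that pattern (all class and method names here are hypothetical, for illustration only, not Arrcus's actual API):

```python
from abc import ABC, abstractmethod

class ForwardingBackend(ABC):
    """Hypothetical data-plane adaptation interface: the OS programs routes
    through this API and stays indifferent to the underlying silicon."""
    @abstractmethod
    def program_route(self, prefix: str, next_hop: str) -> str: ...

class JerichoBackend(ForwardingBackend):
    def program_route(self, prefix: str, next_hop: str) -> str:
        # A real driver would write ASIC forwarding tables; we just record the action.
        return f"jericho+: {prefix} -> {next_hop}"

class SoftwareBackend(ForwardingBackend):
    def program_route(self, prefix: str, next_hop: str) -> str:
        # Fallback backend, e.g. a kernel forwarding path.
        return f"kernel: {prefix} -> {next_hop}"

def install_routes(backend: ForwardingBackend, routes):
    # OS-level code calls the same API regardless of the hardware underneath.
    return [backend.program_route(prefix, nh) for prefix, nh in routes]

routes = [("10.0.0.0/8", "192.0.2.1"), ("172.16.0.0/12", "192.0.2.2")]
print(install_routes(JerichoBackend(), routes)[0])   # jericho+: 10.0.0.0/8 -> 192.0.2.1
print(install_routes(SoftwareBackend(), routes)[1])  # kernel: 172.16.0.0/12 -> 192.0.2.2
```

Swapping the backend object is the whole point: the route-installation code above never changes when the silicon does.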
Network operators, large telcos, that's right? A person running a killer app that's, you know, high mission-critical, high scale. Am I getting that right? >> Yeah, you're absolutely getting it right. Basically anybody that has a network, and has a networking infrastructure that is consuming networking equipment, is a potential customer for us. Now, the product has the extensibility to be used anywhere: in the data center, at the edge, or in the cloud. We're very focused on some of the use cases in the CDN, peering, and IP route reflector, IP peering use cases.
>> Great. Steve, I want to get your thoughts, because I know how you invest. You guys are a great firm over there, and you're pretty finicky on investments. Certainly team, check: the pedigrees are on the team, so that's a good sign. Now the market: big markets. What's the market here for you? How do you see this market? What's the bet for you guys on the market side? >> Yeah, it's pretty straightforward. As you look at the size of the networking market, with, you know, three major players around here and a longer tail, owning even a small piece of a giant market is a great way to get started. And if you believe in the secular trends that are going on, with innovation in hardware and the ability to take advantage of it, I think we've identified a few really interesting starting use cases and web-scale companies that have a lot of cost and needs on the networking side. What I love about the software architecture is that it reminds me a lot of the early virtualization pieces: if you can take advantage of advances in hardware as they improve, and really bring them into a company more quickly than before, then those companies are going to have, you know, better economics on their networking early on. So you get a great layer in, solve a particular use case, and then there's the trend of being able to take advantage of new hardware and to provide the data and the APIs to program it and to manage it. >> It's creative, limitless opportunity, because with custom silicon that has, you know, purpose-built protocols, it's easy to put a box together in a large data center, or even many boxes. >> Yeah, and you can imagine the vendors of the advances in the chips really love that there's a good company that can take advantage of them more quickly than others can.
>> So cloud service providers certainly are a target audience here; the large clouds would love it. And is Broadcom a customer, or a partner of you guys? >> Broadcom is a partner. We've ported ArcOS onto multiple members of the Broadcom switching family; there are five or six of their networking system-on-chip components that we've ported to, including the two highest-end, which is the Jericho+. >> And you've got a lever with Broadcom buying CA; that's going to open up IT operations to you guys, and a bridge into applications. Devesh, talk about what you just said: the extensibility of taking boxes and tying application performance to what's going to be vertically integrated. >> Yeah, from a semiconductor perspective, since I spent a lot of time in the industry: one of the challenges I had, having founded a high-core-count multiprocessor company, was always the software. At EZchip we had the world's highest-end network processor, and the challenge was software. I think if you take all the innovation in the silicon industry and couple it with the right software, the combination of those two things opens up a vast number of opportunities, and we feel that with ArcOS we provide, you know, that software piece that's going to help people take advantage of all the great innovation that's happening.
>> You mentioned earlier that people don't want to bring open source into the core of the network, yet the open-source communities are growing at an exponential rate, and you're starting to see open source become the lingua franca for all developers, especially modern software developers. Why not open-source the core? The OS has got to be bulletproof, and you need security, obviously, but that seems counter to the trend on open source. What's the answer there on why not open source in the core? >> So we take advantage of open source where it makes sense. We take advantage of ONL, Open Network Linux, and we have developed our protocols to run in that environment. The reasons we feel the protocols should be developed in-house, as opposed to leveraging things from the open-source community, are things like the internet-scale multi-threading of BGP and integrating things like OpenConfig YANG-based models into that environment. Not only is it proven, but the capabilities that we're able to innovate on, and the unique differentiation we bring, really meant going back to a clean sheet of paper, and so we designed it ground-up to be optimized for the needs of today. >> Steve, your old boss Paul Maritz used to talk about the hardened top. Similar here, right? No one's really going to care what's underneath if it works great; it's under the hardened top, where you use open source as a connection point for services and opportunities to grow. Similar concept? >> Yes. I mean, at the end of the day, open source is great for certain things, for community and extensibility and visibility, and on the flip side customers look to a company that's accountable for making sure it performs and is high quality. So I think the modern way, especially for mission-critical infrastructure, is to have a mix of both: to give back to the community where it makes sense, and to be responsible for hardening things, or building them, where it doesn't. >> So how did you land these guys? Did you get in early and
say, don't talk to any other VCs? How did it all come together between you guys? >> We've actually been friends for a while, which has been great, and at one point we actually decided to ask, hey, what do you actually do? He found out I was a venture investor, and I found out he is a network engineer. But I actually really like the networking space as a whole. As much as people talk about the cloud or open source or storage being tough, networking is literally everywhere and will be everywhere, whatever our world looks like. So I've always been looking for the most interesting companies in that space. And we always joke that in the investment world, San Francisco is sort of applications, mid-Peninsula is sort of operating systems, and the lower you get, the more technical it gets. >> Well, we're a media company, and I think we're doing things differently. We were talking before we came on camera: I think media is undervalued. I just wrote a tweet on that and got some traction. But it's shifting back to silicon, and you're seeing systems. If you look at some of the hottest areas, IT operations is being automated away: AIOps, you know, automated machine learning. You're starting to see some of these high-end systems. >> That's exactly where I was going to go. I especially just love very deep intellectual property that is hard to replicate, where you can ultimately charge a premium for something that is that hard to do. So that's really something I get drawn into. >> Who else is in the deal with you? Do you have any other syndicates in? >> Sure. Our initial seed investor was Clear Ventures; a gentleman by the name of Chris Rust is on our board. Then Steve came in and led our most recent round of funding, and he is also on the board. Beyond that institutional money, we have a group of very strategic individual investors. Two people I would maybe highlight, amongst the vast number of advisers we have: a gentleman by the name of Pankaj Patel. Pankaj was the chief development officer at Cisco, basically number two at Cisco for a number of years, with deep operating experience across all facets of what we would need. And then there's another gentleman by the name of Amarjeet Gill. I've been friends with Amarjeet for 30 years; he's probably one of the single most successful entrepreneurs around. He's incubated companies that have been purchased by Broadcom, by Apple, by Google, by Facebook, by Intel, by EMC. So we were fortunate enough to get him involved and keep him busy.
>> Great pedigree, great investors. With that kind of intellectual property and those smart minds, there's a lot of pressure on you as the CEO not to screw it up, right? I mean, come on now, with all those smart men you've got to look really good. >> You know, I welcome it, actually; I enjoy it. Look, when you have a great team and you have as many capable people surrounding you, it really comes together, and so I don't think it's about me. >> I was just kidding, by the way. >> I actually think, number one, it's about the team, and I'm merely a spokesperson to represent all the great work that our team has done. So I'm really proud of the guys we have, and frankly it makes my job easier. >> You've got a lot of people to tap for advice; certainly the shared experiences, collectively, in the different areas make a lot of sense, and the investors too. >> Absolutely, absolutely, and it's not just at the board or the investor level; it's at the adviser level, and also at our individual team members. When we have a team that executes as well as we have, everything falls into place.
>> Well, we think the software world has changed, and we think the economics are changing. Certainly when you look at cloud, whether it's cloud computing or token economics with blockchain and new emerging tech around AI, we think the world is certainly going to change. So you guys have got a great team to figure it out. I mean, you've got to execute in real time, and you've got a real technology play with IP. The question is: what's the next step? What are your priorities now that you're out there? Congratulations on your launch. >> Thank you. >> You were in stealth mode, you've got some customers, you've got Broadcom relationships. Looking out at the landscape, what's your plan for the next year? What are your goals? >> Really, to take every facet of what you said and just scale the business. You know, we're actively hiring, and we have a lot of customer activity. This week happens to be the most recent IETF conference, which happened in Montreal; given our company launch on Monday, there's been a tremendous amount of interest in everything that we're doing. That, coupled with the existing customer discussions we have, is only going to expand. And then we have a very robust roadmap to continue to augment and add capabilities to the baseline capabilities that we brought to the market. So I really view the next year as scaling the business in all aspects, and increasingly my time is going to be focused on commercially centric activities. >> Well, congratulations. You've got a great team, and you've received great investment. This is a CUBE Conversation; I'm John Furrier, here with the hot startup launching this week in California, in Silicon Valley, where silicon is back and software is back. It's theCUBE, bringing you all the action. I'm John Furrier, thanks for watching.
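Keyur Patel is introduced above as "Mister BGP," and BGP comes up throughout this conversation as the protocol Arrcus re-engineered for internet scale. For readers unfamiliar with it, BGP picks one best route per prefix by walking a fixed ladder of tie-breakers. A simplified sketch of the first few steps (illustrative only; the real decision process has many more rules, and this is in no way Arrcus's implementation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Route:
    prefix: str
    local_pref: int = 100
    as_path: List[int] = field(default_factory=list)
    med: int = 0
    ebgp: bool = True          # learned over eBGP (preferred over iBGP)
    router_id: str = "0.0.0.0"

def best_path(candidates: List[Route]) -> Route:
    """Pick one best route using a few BGP tie-breakers, in order:
    highest LOCAL_PREF, shortest AS_PATH, lowest MED,
    eBGP over iBGP, then lowest neighbor router ID."""
    return min(
        candidates,
        key=lambda r: (
            -r.local_pref,                                  # higher LOCAL_PREF wins
            len(r.as_path),                                 # shorter AS_PATH wins
            r.med,                                          # lower MED wins
            0 if r.ebgp else 1,                             # eBGP beats iBGP
            tuple(int(o) for o in r.router_id.split(".")),  # lowest router ID
        ),
    )

r1 = Route("10.0.0.0/8", local_pref=100, as_path=[65001, 65002], router_id="192.0.2.1")
r2 = Route("10.0.0.0/8", local_pref=200, as_path=[65001, 65002, 65003], router_id="192.0.2.2")
r3 = Route("10.0.0.0/8", local_pref=200, as_path=[65001], router_id="192.0.2.3")

print(best_path([r1, r2, r3]).router_id)  # 192.0.2.3: highest LOCAL_PREF, then shortest AS path
```

The "internet-scale multi-threading of BGP" mentioned in the interview is about running this kind of selection, plus update processing, concurrently across a full internet table, which is where the engineering difficulty actually lives.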
**Summary and sentiment analysis are not shown because of an improper transcript**
ENTITIES
Entity | Category | Confidence |
---|---|---|
Steve | PERSON | 0.99+ |
February of 2016 | DATE | 0.99+ |
John Ferrier | PERSON | 0.99+ |
Derek Young | PERSON | 0.99+ |
August of 2016 | DATE | 0.99+ |
Derek | PERSON | 0.99+ |
Steve Herod | PERSON | 0.99+ |
twenty plus years | QUANTITY | 0.99+ |
20 plus years | QUANTITY | 0.99+ |
Steve Herrod | PERSON | 0.99+ |
California | LOCATION | 0.99+ |
EMC | ORGANIZATION | 0.99+ |
Cisco | ORGANIZATION | 0.99+ |
July 2018 | DATE | 0.99+ |
Montreal | LOCATION | 0.99+ |
30 years | QUANTITY | 0.99+ |
NVIDIA | ORGANIZATION | 0.99+ |
Monday | DATE | 0.99+ |
six | QUANTITY | 0.99+ |
arcus Inc | ORGANIZATION | 0.99+ |
John Fourier | PERSON | 0.99+ |
Amarjeet Gill | PERSON | 0.99+ |
150 plus patents | QUANTITY | 0.99+ |
John | PERSON | 0.99+ |
600 plus years | QUANTITY | 0.99+ |
Apple | ORGANIZATION | 0.99+ |
ORGANIZATION | 0.99+ | |
five | QUANTITY | 0.99+ |
today | DATE | 0.99+ |
ORGANIZATION | 0.99+ | |
VMware | ORGANIZATION | 0.99+ |
easy chip | ORGANIZATION | 0.99+ |
Silicon Valley | LOCATION | 0.99+ |
Broadcom | ORGANIZATION | 0.99+ |
two people | QUANTITY | 0.99+ |
Mike | PERSON | 0.99+ |
Intel | ORGANIZATION | 0.99+ |
Palo Alto | LOCATION | 0.99+ |
first time | QUANTITY | 0.98+ |
Chris rust | PERSON | 0.98+ |
three | QUANTITY | 0.98+ |
one | QUANTITY | 0.98+ |
next year | DATE | 0.98+ |
two parts | QUANTITY | 0.98+ |
over 400 products | QUANTITY | 0.98+ |
first | QUANTITY | 0.97+ |
third piece | QUANTITY | 0.97+ |
John furry | PERSON | 0.97+ |
Linux | TITLE | 0.97+ |
two points | QUANTITY | 0.97+ |
first operating system | QUANTITY | 0.97+ |
this week | DATE | 0.97+ |
three major players | QUANTITY | 0.96+ |
both | QUANTITY | 0.95+ |
Kop | PERSON | 0.95+ |
General Catalyst | ORGANIZATION | 0.95+ |
Mobius | ORGANIZATION | 0.94+ |
San Francisco | LOCATION | 0.93+ |
Palmer | PERSON | 0.92+ |
Arrcus | ORGANIZATION | 0.9+ |
Mellanox | ORGANIZATION | 0.89+ |
single | QUANTITY | 0.88+ |
one point | QUANTITY | 0.88+ |
two things | QUANTITY | 0.88+ |
lingua franca | TITLE | 0.87+ |
General Catalyst VCU | ORGANIZATION | 0.87+ |
Kher | PERSON | 0.86+ |
VCS | ORGANIZATION | 0.8+ |
Arun Garg, NetApp | Cisco Live 2018
>> Live from Orlando, Florida it's theCUBE covering Cisco Live 2018. Brought to you by Cisco, NetApp and theCUBE's ecosystem partners. >> Hey, welcome back everyone. This is theCUBE's coverage here in Orlando, Florida at Cisco Live 2018. Our first year here at Cisco Live. We were in Barcelona this past year. Again, Cisco transforming to a next generation set of networking capabilities while maintaining all the existing networks and all the security. I'm John Furrier your host with Stu Miniman my co-host for the next three days. Our next guest is Arun Garg. Welcome to theCUBE. You are the Director of Product Management Converged Infrastructure Group at NetApp. >> Correct, thank you very much for having me on your show and it's a pleasure to meet with you. >> One of the things that we've been covering a lot lately is the NetApp's really rise in the cloud. I mean NetApp's been doing a lot of work on the cloud. I mean I've wrote stories back when Tom Georges was the CEO when Amazon just came on the scene. NetApp has been really into the cloud and from the customer's standpoint but now with storage and elastic resources and server lists, the customers are now startin' to be mindful. >> Absolutely. >> Of how to maximize the scale and with All Flash kind of a perfect storm. What are you guys up to? What's your core thing that you guys are talking about here at Cisco Live? >> So absolutely, thank you. So George Kurian, our CEO at NetApp, is very much in taking us to the next generation and the cloud. Within that I take care of some of the expansion plans we have on FlexPod with Cisco and in that we have got two new things that we are announcing right now. One is the FlexPod for Healthcare which is in FlexPod we've been doing horizontal application so far which are like the data bases, tier one database, as well as applications from Microsoft and virtual desktops. Now we are going vertical. 
Within the vertical our application, the first one we're looking in the vertical is healthcare. And so it's FlexPod for Healthcare. That's the first piece that we are addressing. >> What's the big thing with update on FlexPod? Obviously FlexPod's been very successful. What's the modernization aspect of it because Cisco's CEO was onstage today talking about Cisco's value proposition, about the old ways now transitioning to a new network architecture in the modern era. What's the update on FlexPod? Take a minute to explain what are the cool, new things going on with FlexPod. >> Correct, so the All Flash FAS, which is the underlying technology, which is driving the FlexPod, has really picked up over the last year as customers keep wanting to improve their infrastructure with better latencies and better performance the All Flash FAS has driven even the FlexPod into the next generation. So that's the place where we are seeing double-digit growth over the last five quarters consistently in FlexPod. So that's a very important development for us. We've also done more of the standard CVDs that we do on SAP and a few other are coming out. So those are all out there. Now we are going to make sure that all these assets can be consumed by the vertical industry in healthcare. And there's another solution we'll talk about, the managed private cloud on FlexPod. >> Yeah, Arun, I'd love to talk about the private cloud. So I think back to when Cisco launched UCS it was the storage partners that really helped drive that modernization for virtualization. NetApp with FlexPod, very successful over the years doing that. As we know, virtualization isn't enough to really be a private cloud. All the things that Chuck Robbins is talking about onstage, how do I modernize, how do I get you know, automation in there? So help us connect the dots as to how we got from you know, a good virtualized platform to this is, I think you said managed private cloud, FlexPod in Cisco. >> Absolutely. 
So everybody likes to consume a cloud. It's easy to consume a cloud. You go and you click on I need a VM, small, medium, large, and I just want to see a dashboard with how my VMs are doing. But in reality it's more difficult to just build your own cloud. There's complexity associated with it. You need a service platform where you can give a ticket, then you need an orchestration platform where you can set up the infrastructure, then you need a monitoring platform which will show you all of the ways your infrastructure's working. You need a capacity planning tool. There are tens of tools that need to be integrated. So what we have done is we have partnered with some of the premium partners and some DSIs who have already built this. So the risk of a customer using their private cloud infrastructure is minimized, and therefore these partners also have a managed service. So when you combine the fact that you have a private cloud infrastructure in the software domain as well as a managed service, and you put it on the on-prem FlexPods that are already sold, then the customer benefits from having the best of both worlds, a cloud-like experience on their own premises. And that is what we are delivering with this FlexPod managed private cloud solution. >> Talk about the relationship with Cisco. So we're here at Cisco Live, you guys have a good relationship with Cisco. What should customers understand about the relationship? What are the top bullet points and value opportunities, and what does it mean in terms of impact for the customer? >> So with all these solutions we work very closely with the Cisco business unit, and we jointly develop them. So within that, what we do is there's the BU-to-BU interaction where the solution is developed and defined. There is a marketing-to-marketing interaction where the collateral gets created and reviewed by both parties. So you will not put a FlexPod brand unless the two companies agree. >> So it's tightly integrated.
>> It's tightly integrated. The sales teams are aligned, the marketing, the communications team, the channel partner team. That's the whole value that the end customer gets, because when a partner goes to a high-end enterprise customer, he knows that both Cisco and NetApp teams can be brought to the table for the customer to showcase the value as well as help them through it all. >> Yeah, one of the other areas that's been talked about at this show is modernization. You talk about things like microservices. >> Yes. >> Containers are pretty important. How does that story of containerization fit into FlexPod? >> Absolutely. So containerization helps you get the cloud-native workloads, or Mode 2 workloads as Gartner calls them. So what we do is we work with the Cisco teams, and we already had a CVD design for a hybrid cloud with the Cisco CloudCenter platform, which came from the CliQr acquisition. And we showed a design with that. What we are now bringing to the table is the ability for our customers to benefit from a managed service on top of it. So that's the piece we are doing with the cloud teams. With the Cisco team, the ACI fabric is very important to them. So that ACI fabric is visible and shown in our designs, whether you do SAP, you do Oracle, you do VDI and you do basic infrastructure, or you do the managed private cloud or FlexPod for Healthcare. All of these have the core networking technologies from Cisco, as well as the cloud technologies from Cisco, in a form factor or in a manner that is easily consumable by our customers. >> Arun, talk about the customer use cases. So say you've got a customer, obviously you guys have a lot of customers together with Cisco, they're doing some complex things with the technology, but for the customer out there that has not yet kind of gone down the NetApp Cisco route, what do they do? 'Cause a lot of storage guys are lookin' at All Flash, so check, you guys have that.
They want great performance, check. But then they gotta integrate. So what do you say to the folks watching that aren't yet customers, about what they should look at and evaluate vis-a-vis your opportunity with them, and say the competition? >> So yes, there are customers who are doing all this as separate silos, but the advantage of taking a converged infrastructure approach is that you benefit from the years of man experience, or person experience, that we have put in behind in our labs to architect this, make sure that everything is working correctly, and therefore it reduces their deployment time and reduces the risk. And if you want to be agile and faster even in the traditional infrastructure, while you're being asked to go to the cloud, you can do it with our FlexPod design guides. If you want the cloud-like experience, then you can do it with a managed private cloud solution on your premises. >> So they got options and they got flexibility on migrating to the cloud or architecting that. >> Yes. >> Okay, great, now I'm gonna ask you another question. This comes up a lot on theCUBE and certainly we see it in the industry. One of the trends is verticalization. >> Yes. >> So verticalization is not a new thing. Vertical industry, people go to market that way, they build products that are custom to verticals. But with cloud, one of the benefits of cloud and kind of a cloud operations is you have a horizontally scalable capability. So how do you guys look at that? Because these verticals, they gotta get closer to the front lines and have apps that are customized. I mean data that's quickly delivered to the app. How should verticals think about architecting storage to maintain the scale of horizontally scalable but yet provide customization into the applications that might be unique to the vertical? >> Okay, so let me give a trend first and then I'll get to the specifics. So in the vertical industry, the next trend is industry clouds.
For example, you have healthcare clouds, and you'll have clouds for specific industries. And the reason is because these industries have to keep their data on-prem. So data gravity has a lot of impact on all of these decisions. And the security of their data. So that is getting into industry-specific clouds. The second piece is analytics. So customers now are finding that data is valuable, and the insights you can get from the data are actually more valuable. So what they want is the data on their premises, they want the ability, all in their control so to say, they want the ability to not only run their production applications but also the ability to run analytics on top of that. In the specific example of healthcare, what it does is when you have All Flash FAS it provides you a faster response for the patient, because the physician is able to get the diagnostics done better if he has some kind of analytics helping him. >> Yeah. >> Plus the first piece I talked about, the rapid deployment, is very important because you want to get your infrastructure set up, so I can give an example on that too. >> Well before we get to the example, this is an important point because I think this is really the big megatrend. It's not really kinda talked much about but it's pretty happening, is that what you just pointed out was it's not just about speeds and feeds and IOPs, the performance criteria. The industry cloud has other new things like data, the role of data, what they're using for the application. >> Correct. >> So you've just gotta have table stakes of great, fast storage. >> Yes. >> But it's gotta be integrated into what is becoming a use case for the verticals. Did I get that right? >> Yes, absolutely. So I'll give two examples. One I can name the customer. So they'll come at our booth tomorrow, in a minute here. So LCMC Health, part of UMC, and they have the UMC Medical Center.
So when New Orleans had the Katrina disaster in Louisiana, they found they needed a hospital, fast. And they decided on FlexPod because within three months, with the wire-once architecture and applications, they could scale their whole IT data center for healthcare. So that has helped them tremendously to get it up and running. Second is, with the All Flash FAS they're able to provide faster response to their customers. So that's a typical example that we see in these kinds of industries. >> Arun, thanks for coming on theCUBE. We really appreciate it. You guys are doing a great job. Following NetApp's recent success lately, as always, NetApp's always goin' to the next level. Quick question for you to end the segment. What's your take on Cisco Live this year? What's some of the vibe of the show? I know it's day one, there's a lot more to come and you're just getting a sense of it. What's the vibe? What's coming out of the show this year? What's the big ah-ha? >> So I attended the keynote today and it was very interesting, because Cisco has taken networking to the next level with intent-based networking and its data and analytics, where you can put a subscription model on all the pieces of the networking infrastructure. And that's exactly the same thing which NetApp is doing, where we are going up in the cloud with this subscription basis. And when you add the two subscription models, then for us, at least in the managed private cloud solution, we can provide the subscription basis through the managed private cloud through our managed service provider. So knowing where the industry was going, knowing where Cisco was going and knowing where we want to go, we have come up with this solution which matches both these trends of Cisco as well as NetApp. >> And the number of connected devices going up every day. >> Yes. >> More network connections, more geo domains, it's complicated. >> It is complicated, but if you do it correctly we can help you find a way through it.
>> Arun, thank you for coming on theCUBE. I'm John Furrier here on theCUBE with Stu Miniman here with NetApp at Cisco Live 2018. Back with more live coverage after this short break. (upbeat music)
Making AI Real – A practitioner’s view | Exascale Day
>> Narrator: From around the globe, it's theCUBE with digital coverage of Exascale Day, made possible by Hewlett Packard Enterprise. >> Hey, welcome back. Jeff Frick here with theCUBE, coming to you from our Palo Alto studios for our ongoing coverage and celebration of Exascale Day, 10 to the 18th, on October 18th, 10 with 18 zeros. It's all about big, powerful, giant computing and computing resources and computing power. And we're excited to invite back our next guest; she's been on before. She's Dr. Arti Garg, head of advanced AI solutions and technologies for HPE. Arti, great to see you again. >> Great to see you. >> Absolutely. So before we get into Exascale Day, I was just looking at your LinkedIn profile. It's such an interesting career. You've done time at Lawrence Livermore, you've done time in the federal government, you've done time at GE and industry, and I'd just love it if you could share a little bit of your perspective going from hardcore academia to kind of some government positions, then into industry as a data scientist, and now with, originally, Cray and now HPE, looking at it really from more of the vendor side. >> Yeah. So I think in some ways I'm like a lot of people who've had the title of data scientist somewhere in their history, where there's no single path to really working in this industry. I come from a scientific background. I have a PhD in physics, so that's where I started working with large data sets. I think of myself as a data scientist before the term data scientist was a term. And I think it's an advantage to be able to have seen this explosion of interest in leveraging data to gain insights, whether that be into the structure of the galaxy, which is what I used to look at, or whether that be into maybe new types of materials that could advance our ability to build lightweight cars or safety gear.
It allows you to take a perspective to not only understand what the technical challenges are, but also what the implementation challenges are, and why it can be hard to use data to solve problems. >> Well, I'd just love to get your perspective, again, 'cause you are into data, you chose that as your profession, and you probably run with a whole lot of people that are also like-minded in terms of data. As an industry and as a society, we're trying to get people to do a better job of making data-based decisions and getting away from their gut and actually using data. I wonder if you can talk about the challenges of working with people who don't come from such an intense data background to get them to basically, I don't know, understand the value of a more data-driven decision-making process, or that it's just worth the effort, 'cause it's not easy to get the data and cleanse the data, and trust the data and get the right context, working with people that don't come from that background and aren't so entrenched in that point of view. What surprises you? How do you help them? What can you share in terms of helping everybody get to be a more data-centric decision maker? >> So I would actually rephrase the question a little bit, Jeff, and say that actually I think people have always made data-driven decisions. It's just that in the past we maybe had less data available to us, or the quality of it was not as good. And so as a result most organizations have organized themselves to make decisions and to run their processes based on a much smaller and more refined set of information than is currently available, given our ability to generate lots of data through software and sensors, our ability to store that data, and then our ability to run a lot of computing cycles and a lot of advanced math against that data, to learn things that maybe in the past took hundreds of years of experiments and scientists to understand.
And so before I jump into how do you overcome that barrier, I'll just use an example, because you mentioned I used to work in industry; I used to work at GE. And one of the things that I often joked about is the number of times I discovered Bernoulli's principle in data coming off of GE jet engines. You could do that overnight processing these large data sets, but of course historically it took hundreds of years to really understand these physical principles. And so when it comes to how do we bridge the gap between people who are adept at processing large amounts of data and running algorithms to pull insights out, and everyone else, I think it's both sides. I think it's those of us who are coming from the technical background really understanding the way decisions are currently made, the way process and operations currently work at an organization, and understanding why those things are the way they are. Maybe there are security or compliance or accountability concerns that a new algorithm can't just replace. And so I think it's on our end to really try to understand, and make sure that whatever new approaches we're bringing address those concerns. And I think for folks who aren't necessarily coming from a large data set and analytical background, and when I say analytical, I mean in the data science sense, not in the sense of thinking about things in an abstract way, it's to really recognize that these are just tools that can enhance what they're doing, and they don't necessarily need to be frightening. Because I think that people who have been, say, operating electric grids for a long time, or fixing aircraft engines, they have a lot of expertise and a lot of understanding, and that's really important to making any kind of AI-driven solution work.
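The jet-engine anecdote above can be made concrete with a toy sketch: fit pressure against velocity-squared with plain least squares, and the slope comes back as -rho/2, "rediscovering" Bernoulli's principle (p + 1/2 rho v^2 = p0) overnight. Everything here, the density, the pressure values, and the regression helper, is synthetic and purely illustrative, not anything from GE's actual data pipeline.

```python
# Illustrative sketch: "rediscovering" Bernoulli's principle from data.
# Synthetic sensor readings follow p + 0.5 * rho * v**2 = p0 (incompressible flow).
# A simple least-squares fit of pressure against v**2 recovers the -rho/2 slope.

def least_squares(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

rho, p0 = 1.2, 101325.0                      # made-up air density (kg/m^3) and stagnation pressure (Pa)
velocities = [10.0 * i for i in range(1, 11)]
pressures = [p0 - 0.5 * rho * v ** 2 for v in velocities]

slope, intercept = least_squares([v ** 2 for v in velocities], pressures)
print(round(slope, 3))      # -0.6, i.e. -rho/2
print(round(intercept, 1))  # 101325.0, i.e. p0
```

With real engine telemetry the fit would of course be noisy, but the point of the anecdote stands: a quick regression recovers a relationship that originally took centuries of experiments to establish.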
>> That's great insight, but I do think one thing that's changed: you come from a world where you had big data sets, so you kind of have a big data set point of view, where I think a lot of decision makers didn't have that data before. So we won't go through all the up-and-to-the-right explosions of data, and obviously we're talking about Exascale Day, but I think for a lot of processes now, the amount of data that they can bring to bear so dwarfs what they had in the past that, before they even consider how to use it, they still have to contextualize it, and they have to manage it and they have to organize it, and there's data silos. So there's all this kind of nasty process stuff that's in the way, which some would argue has been kind of a real problem with the promise of BI and decision support tools. So as you look at this new stuff and these new datasets, what are some of the people and process challenges, beyond the obvious things that we can think about, which are the technical challenges? >> So I think that you've really hit on something I talk about sometimes as kind of the data deluge that we experience these days, and the notion of feeling like you're drowning in information but really lacking any kind of insight. And one of the things that I like to think about is to actually step back from the data questions, the infrastructure questions, sort of all of these technical questions that can seem very challenging to navigate, and first ask ourselves, what problems am I trying to solve? It's really no different than any other type of decision you might make in an organization, to say what are my biggest pain points? What keeps me up at night? Or what would just transform the way my business works? And those are the problems worth solving.
And then the next question becomes, if I had more data, if I had a better understanding of something about my business or about my customers or about the world in which we all operate, would that really move the needle for me? And if the answer is yes, then that starts to give you a picture of what you might be able to do with AI, and it starts to tell you which of those data management challenges, whether that be cleaning the data, whether it be organizing the data, whether it be building models on the data, are worth solving. Because you're right, those are going to be time-intensive, labor-intensive, highly iterative efforts. But if you know why you're doing it, then you will have a better understanding of why it's worth the effort, and also which shortcuts you can take and which ones you can't. Because often, in order to sort of see the end state, you might want to do a really quick experiment or prototype. And so you want to know what matters and what doesn't, at least at that "is this going to work at all" stage. >> So you're not buying the age-old adage that you just throw a bunch of data in a data lake and the answers will just spring up, just come right back out of the wall. I mean, you bring up such a good point. It's all about asking the right questions, and thinking about asking questions. So again, when you talk to people about helping them think about the questions, 'cause then you've got to shape the data to the question, and then you've got to start to build the algorithm to kind of answer that question. How should people think when they're actually building and training algorithms? What are some of the typical kind of pitfalls that a lot of people fall into who haven't really thought about it before, and how should people frame this process? 'Cause it's not simple and it's not easy, and you really don't know that you have the answer until you run multiple iterations and compare it against some other type of reference.
Well, one of the things that I like to think about, just so that you're sort of thinking about all the challenges you're going to face up front (you don't necessarily need to solve all of these problems at the outset, but I think it's important to identify them), is that AI solutions, as they get deployed, are part of a kind of workflow, and the workflow has multiple stages associated with it. The first stage being generating your data, and then starting to prepare and explore your data, and then building models for your data. But sometimes I think where we don't always think about it is the next two phases, which are deploying whatever model or AI solution you've developed, and what will that really take, especially in the ecosystem where it's going to live. Is it going to live in a secure and compliant ecosystem? Is it actually going to live in an outdoor ecosystem? We're seeing more applications on the edge. And then finally, who's going to use it and how are they going to drive value from it? Because it could be that your AI solution doesn't work because you don't have the right dashboard that highlights and visualizes the data for the decision maker who will benefit from it. So I think it's important to sort of think through all of these stages upfront, and think through what some of the biggest challenges are that you might encounter, so that you're prepared when you meet them, and you can kind of refine and iterate along the way, and even upfront tweak the question you're asking. >> That's great. So I want to get your take on, we're celebrating Exascale Day, which is something very specific on 10/18. Share your thoughts on Exascale Day specifically, but more generally, I think, just in terms of being a data scientist and suddenly having all this massive compute power at your disposal. You've been around for a while.
So you've seen the development of the cloud, these huge data sets, and really the ability to put so much compute horsepower against the problems as networking and storage and compute costs just asymptotically approach zero. I mean, as a data scientist you've got to be pretty excited about kind of new mysteries, new adventures, new places to go, that you just couldn't do 10 years ago, five years ago, 15 years ago. >> Yeah, I think only time will tell exactly all of the things that we'll be able to unlock from these new sort of massive computing capabilities that we're going to have. But a couple of things that I'm very excited about are that, in addition to sort of this explosion, these very large investments in large supercomputers, Exascale supercomputers, we're also seeing investment in these other types of scientific instruments. And when I say scientific, it's not just academic research, it's driving pharmaceutical drug discovery, because we're talking about these, what they call light sources, which shoot x-rays at molecules and allow you to really understand the structure of the molecules. What Exascale allows you to do is, historically it's been that you would go take your molecule to one of these light sources and you shoot your x-rays at it, and you would generate just masses and masses of data, terabytes of data with each shot. And being able to then understand what you were looking at was a long process, getting computing time and analyzing the data. We're on the precipice of being able to do that, if not in real time, much closer to real time. And I don't really know what happens if, instead of coming up with a few molecules, taking them, studying them, and then saying maybe I need to do something different, I can do it while I'm still running my instrument. And I think that it's very exciting, from the perspective of someone who's got a scientific background who likes using large data sets.
There's just a lot of possibility in what Exascale computing allows us to do, from the standpoint of I don't have to wait to get results, and I can either simulate much bigger, say, galaxies, and really compare that to my data on galaxies or universes, if you're an astrophysicist, or I can simulate much smaller, finer details of a hypothetical molecule and use that to predict what might be possible from a materials or drug perspective, just to name two applications that I think Exascale could really drive. >> That's really great feedback, just to shorten that compute loop. We had an interview earlier and someone was talking about when the biggest workload you had to worry about was the end of the month when you're running your financials. And I was like, wouldn't that be nice, for that to be the biggest job we have to worry about? But I think we saw some of this in animation, in the movie business, you know, the rendering, whether it's a full animation movie or just something that's heavy-duty 3D effects. When you can get those dailies back to the artist, as you said, while you're still working, or closer to when you're working, versus having this huge kind of compute delay, it just changes the workflow dramatically, and the pace of change and the pace of output, because you're not context switching as much and you can really get back into it. That's a super point. I want to shift gears a little bit and talk about explainable AI. So this is a concept that a lot of people hopefully are familiar with. So AI, you build the algorithm, it's in a box, it runs and it kicks out an answer. And one of the things that people talk about is we should be able to go in and pull that algorithm apart to know why it came out with the answer that it did. To me this just sounds really, really hard, because it's smart people like you that are writing the algorithms, and the inputs and the data that feed that thing are super complex.
The math behind it is very complex. And we know that the AI trains and can change over time: as you train the algorithm it gets more data, it adjusts itself. So is explainable AI even possible? Is it possible to some degree? Because I do think it's important, and my next question is going to be about ethics, to know why something came out. And the other piece that becomes so much more important is, as we use that output not only to drive a human-based decision that needs some more information, but increasingly moving it over to automation. So now you really want to know why it did what it did. Explainable AI, share your thoughts. >> It's a great question. And it's obviously a question that's on a lot of people's minds these days. I'm actually going to revert back to what I said earlier, when I talked about Bernoulli's principle, and just the ability, sometimes when you do throw an algorithm at data, the first thing it will find is probably some known law of physics. And so I think that really thinking about what do we mean by explainable AI also requires us to think about what do we mean by AI? These days AI is often used synonymously with deep learning, which is a particular type of algorithm that is not very analytical at its core. And what I mean by that is, other types of statistical machine learning models have some underlying theory of the population of data that you're studying, whereas deep learning doesn't; it kind of just learns whatever pattern is sitting in front of it. And so there is a sense in which, if you look at other types of algorithms, they are inherently explainable, because you're choosing your algorithm based on what you think is the sort of ground truth about the population you're studying. And so are we going to get to explainable deep learning? I think it's kind of challenging, because you're always going to be in a position where deep learning is designed to just be as flexible as possible.
It sort of throws more math at the problem, because there maybe are things that your simpler model doesn't account for. However, deep learning could be part of an explainable AI solution if, for example, it helps you identify what the important so-called features are to look at, what the important aspects of your data are. So I don't know, it depends on what you mean by AI, but are you ever going to get to the point where you don't need humans sort of interpreting outputs, and making some sets of judgments about what a set of computer algorithms that are processing data think? I don't want to say I know what's going to happen 50 years from now, but I think it'll take a little while to get to the point where you don't have to maybe apply some subject matter understanding and some human judgment to what an algorithm is putting out. >> It's really interesting. We had Dr. Robert Gates on, years ago at another show, and he talked about how the only guns in the U.S. military, if I'm getting this right, that are automatic, that will go based on what the computer tells them to do and start shooting, are on the Korean border. But short of that there's always a person involved before anybody hits a button. Which begs a question, 'cause we've seen this on the big data kind of curve, I think Gartner has talked about it, as we move up from kind of descriptive analytics, diagnostic analytics, predictive, and then prescriptive, and then hopefully autonomous. So I wonder, you're saying we're still a little ways out, and that last little bump is going to be tough to overcome to get to true autonomy. >> I think so, and you know it's going to be very application dependent as well. So it's an interesting example to use the DMZ, because that is obviously also a very mission-critical, I would say, example, but in general I think that you'll see autonomy. You already do see autonomy in certain places where I would say the stakes are lower.
So if I'm going to have some kind of recommendation engine that suggests, if you looked at this sweater maybe you'd like that one, the risk of getting that wrong, and so fully automating that, is a little bit lower, because the risk is you don't buy the sweater. I lose a little bit of income, I lose a little bit of revenue as a retailer. But the risk of, do I make that turn, because I'm an autonomous vehicle, is much higher. So I think that you will see the progression up that curve being highly dependent on what's at stake, with different degrees of automation. That being said, you will also see, in certain places where it's either really expensive or humans aren't doing a great job, you may actually start to see some mission-critical automation. But those would be the places where you're seeing them. And actually I think that's one of the reasons why you see a lot more autonomy in the agriculture space than you do in the sort of passenger vehicle space. Because there's a lot at stake and it's very difficult for human beings to sort of drive large combines.
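The stakes-based reasoning above can be sketched as a toy gating function: how far up the automation curve a decision goes depends on what's at stake and how confident the model is. The tier names and the 0.95 threshold here are invented for illustration, not anything Arti or HPE proposes.

```python
# Toy sketch: gate the degree of automation by stakes and model confidence.
# The tier labels and the 0.95 cutoff are illustrative assumptions.

def automation_tier(stakes, confidence):
    """Map a decision's stakes ('low' | 'medium' | 'high') to an automation level."""
    if stakes == "low":
        # e.g., a sweater recommendation: worst case, the sale doesn't happen.
        return "fully automated"
    if stakes == "medium" and confidence >= 0.95:
        return "automated with human monitoring"
    # e.g., an autonomous vehicle's turn, or the DMZ example: keep a human in the loop.
    return "human decides, model assists"

print(automation_tier("low", 0.60))     # fully automated
print(automation_tier("medium", 0.97))  # automated with human monitoring
print(automation_tier("high", 0.999))   # human decides, model assists
```

The design point is simply that the same model output can be wired into very different decision loops depending on the cost of being wrong.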
They can also review a lot of literature, and hopefully bring a broader perspective of potential diagnoses within a set of symptoms. You said before that both your folks are physicians, and there's a certain kind of magic, a nuance, almost a more childlike exploration that you try to get out of the algorithm, if you will, to think outside the box. I wonder if you can share that synergy between using computers and AI and machine learning to do really arduous, nasty things, like going through lots and lots of x-rays, and how that helps the doctor, who's got a whole different kind of experience, a whole different kind of empathy, a whole different type of relationship with that patient than just a bunch of pictures of their heart or their lungs. >> This kind of goes back to the question of, is AI for decision support versus automation? What AI can do, and what we're pretty good at these days with computer vision, is picking up on subtle patterns, especially if you have a very large data set. So if I can train on lots of pictures of lungs, it's a lot easier for me to identify the pictures that somehow are not like the other ones. And that can be helpful. But then to really interpret what you're seeing and understand it: is it actually a bad-quality image? Is it some kind of medical issue? And what is the medical issue? That's where you bring in a lot of different types of knowledge and a lot of different pieces of information, and right now I think humans are a little bit better at doing that. Some of that's because I don't think we have great ways to train on sparse datasets. And the second part is that a human being brings maybe 40 or 50 years of training to the model, as opposed to six months or so with sparse information.
That's another thing human beings have: their lived experience. The data that they bring to bear on any type of prediction or classification is actually more than just what they saw in their medical training. It might be the people they've met, the places they've lived, what have you. And that part, that broader set of learning, and how things that might not seem related might actually be related to your understanding of what you're looking at, I think we've got a ways to go on from an artificial intelligence perspective. >> But it is Exascale Day, and we all know about the compound exponential curves on the computing side. So let's shift gears a little bit. I know you're interested in emerging technology to support this effort, and there's so much going on in terms of the atomization of compute, storage, and networking, breaking them down into smaller and smaller pieces so that you can scale the amount of horsepower you apply to a problem, whether very big or very small. Obviously the stuff you work on is more big than small, and there's a lot of activity around GPUs. So I wonder if you could share some of the emerging technologies that you're excited about, to bring more tools to the task. >> One of the areas I personally spend a lot of my time exploring is, and I guess this word gets used a lot, the Cambrian explosion of new AI accelerators: new types of chips that are really designed for different types of AI workloads. In a way, we're going back and looking at these large systems, but then exploring each little component on them, trying to really optimize it or understand how that component contributes to the overall performance of the whole.
There are probably close to a hundred active vendors in the space of developing new processors and new types of computer chips, and I think one of the things that points to is that we're moving in the direction of general infrastructure heterogeneity. It used to be that when you built a system, you probably had one type of processor, and you probably had a pretty uniform fabric across the system; with storage, I think, we started to get tiering a little bit earlier. But now, and we're already starting to see it with Exascale systems, where you've got GPUs and CPUs on the same blades, the workloads that are running at large scales are becoming more complicated. Maybe I'm doing some simulation, then I'm training some kind of AI model, and then I'm running inference on some other output of the simulation. I need the ability to do a lot of different things, and do them at a very advanced level, which means I need very specialized technology to do it. And I think it's an exciting time. We're going to test, we're going to break a lot of things. I probably shouldn't say that in this interview, but I'm hopeful that we're going to break some stuff. We're going to push all these systems to the limit and find out where we actually need to push a little harder. One of the areas where I think we're going to see that is data movement: we're going to want to move data off of scientific instruments, into computing, into memory, into a lot of different places. And I'm really excited to see how it plays out, what you can do, and where the limits are of what you can do with the new systems. >> Arti, I could talk to you all day. I love the experience and the perspective, because you've been doing this for a long time.
So I'm going to give you the final word before we sign off, and really bring it back to a more human thing, which is ethics. One of the conversations we hear all the time is that if you're going to put together a project, you justify that project, then you go and collect the data, run the algorithm, and do the project. That's great, but there's an inherent problem with data collection: the data may be used for something else down the road that maybe you don't even anticipate. So I wonder if you can share a top-level ethical take on how data scientists specifically, and ultimately business practitioners and other people who don't carry that title, need to be thinking about ethics, and not just forget about it. I had a great interview with Paul Doherty: everybody's data is not just their data, it represents a person. It's a representation of what they do and how they live. So when you think about entering into a project and getting started, what do you think about in terms of the ethical considerations, and how should people be cautious that they don't go places they probably shouldn't go? >> I think that's a great question without a short answer. I honestly don't know that we have great solutions right now, but I think the best we can do is take a very multifaceted, and also vigilant, approach to it. When you're collecting data, we should remember that a lot of the data that gets used isn't necessarily collected for the purpose it's being used for, because we might be looking at old medical records, or old transactional records of any kind, whether from a government or a business. So as you start to collect data or build solutions, try to think through who all the people are who might use it, and what the possible ways are in which it could be misused.
I also encourage people to think backwards: what were the biases in place when the data were collected? You see this a lot in the criminal justice space, where the historical records reflect historical biases in our systems. There are limits to how much you can correct for previous biases, but there are some ways to do it, and you can't do it if you're not thinking about it. So at the outset of developing solutions, that's important, but I think equally important is putting in the systems to maintain vigilance around it. One, don't move to autonomy before you know what potential new errors or new biases you might introduce into the world. And also have systems in place to constantly ask these questions: am I perpetuating things I don't want to perpetuate? How can I correct for them? And be willing to scrap your system and start from scratch if you need to. >> Well, Arti, thank you. Thank you so much for your time. Like I said, I could talk to you for days and days. I love the perspective and the insight and the thoughtfulness. So thank you for sharing your thoughts as we celebrate Exascale Day. >> Thank you for having me. >> My pleasure, thank you. All right, she's Arti Garg, I'm Jeff Frick, it's Exascale Day, and we're covering it on theCUBE. Thanks for watching. We'll see you next time. (bright upbeat music)
Incompressible Encodings
>> Hello, my name is Daniel Wichs. I'm a senior scientist at NTT Research and a professor at Northeastern University. Today I want to tell you about incompressible encodings. This is a recent work from Crypto 2020, and it's joint work with Tal Moran. So let me start with a question: how much space would it take to store all of Wikipedia? It turns out that you can download Wikipedia for offline use, and some reasonable version of it is about 50 gigabytes in size. So as you'd expect, it's a lot of data; it's quite large. But there's another way to store Wikipedia, which is just to store the link www.wikipedia.org, and that only takes 17 bytes. For all intents and purposes, as long as you have a connection to the internet, storing this link is as good as storing the Wikipedia data: you can access Wikipedia with this link whenever you want. The point I want to make is that when it comes to public data like Wikipedia, even though the data is huge, it's trivial to compress it down, precisely because it is public, just by storing a small link to it. And the question for this talk is: can we come up with an incompressible representation of public data like Wikipedia? In other words, can we take Wikipedia and represent it in some way such that this representation requires the full 50 gigabytes of storage, even for someone who has the link to the underlying Wikipedia data and can get the underlying data for free? So let me tell you what this means in more detail. This is the notion of incompressible encodings that we'll focus on in this work. An incompressible encoding consists of an encoding algorithm and a decoding algorithm; these are public algorithms, there's no secret key, and anybody can run them. The encoding algorithm takes some data m, let's say the Wikipedia data, and encodes it in some probabilistic, randomized way to derive a codeword c. The codeword c is just an alternate representation of the Wikipedia data.
Anybody can come and decode the codeword to recover the underlying data m. The correctness property we want here is that no matter what data you start with, if you encode the data m and then decode it, you get back the original data m; this should hold with probability one over the randomness of the encoding procedure. Now for security, we want to consider an adversary that knows the underlying data m, let's say has a link to Wikipedia and can access the Wikipedia data for free, without paying to store it. The goal of the adversary is to compress the codeword that we created, this new randomized representation of the Wikipedia data. So the adversary consists of two procedures, a compression procedure and a decompression procedure. The compression procedure takes as input the codeword c and outputs some smaller compressed value w, and the decompression procedure takes w, and its goal is to recover the codeword c. The security property says that no efficient adversary should be able to succeed in this game with better than negligible probability. There are two parameters of interest in this problem. One is the codeword size, which we'll denote by alpha; ideally, we want the codeword size alpha to be as close as possible to the original data size. In other words, we don't want the encoding to add too much overhead to the data. The second parameter is the incompressibility parameter beta, which tells us how much storage the adversary needs in order to store the codeword. Ideally, we want this beta to be as close as possible to the codeword size alpha, which should itself be as close as possible to the original data size. I want to mention that there is a trivial construction of incompressible encodings that achieves very poor parameters: just take the data m, concatenate some fresh randomness to it, and store the original data m plus the concatenated randomness as the codeword.
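The trivial construction just described fits in a few lines. This is an illustrative toy sketch, measured in bytes rather than bits, and the function names are mine, not the talk's:

```python
import os

def trivial_encode(m: bytes, beta: int) -> bytes:
    # Codeword = data || beta bytes of fresh randomness. Even an adversary
    # who gets m for free cannot compress the random suffix.
    return m + os.urandom(beta)

def trivial_decode(c: bytes, beta: int) -> bytes:
    # The data is just the prefix; the random suffix is discarded.
    return c[:-beta]

m = b"stand-in for the Wikipedia data"
c = trivial_encode(m, beta=16)
assert trivial_decode(c, beta=16) == m
assert len(c) == len(m) + 16  # alpha = |m| + beta: the overhead equals the incompressibility
```

The final assertion is the weakness the talk points out: the codeword grows by exactly as many bytes as it is incompressible, which is the information-theoretic limit for this naive approach.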
Now even an adversary that knows the underlying data m cannot compress the randomness. So this construction is incompressible with an incompressibility parameter beta that just corresponds to the size of the randomness we added; essentially, the adversary cannot compress the random part of the codeword. This gets us a scheme where alpha, the size of the codeword, is the original data size plus the incompressibility parameter beta, and it turns out that you cannot do better than this information-theoretically. So this is not what we want. Instead, we want to focus on what I will call good incompressible encodings. Here, the codeword size should be as close as possible to the data size, just (1 + o(1)) times the data size, and the incompressibility should be essentially as large as the entire codeword: the adversary cannot compress the codeword almost at all, so the incompressibility parameter beta is (1 - o(1)) times the data size, or the codeword size. In essence, what this means is that we want to take the randomness of the encoding procedure and spread it around in some clever way throughout the codeword, in such a way that it's impossible for the adversary to separate out the randomness from the data, store only the randomness, and rely on the fact that it can get the data for free. We want to make sure that the adversary, who sees this entire codeword, which contains both the randomness and the data in some carefully intertwined way, cannot compress it down using the fact that it knows the data part. This notion of incompressible encodings was defined in a prior work of Damgård, Ganesh, and Orlandi from Crypto 2019. They defined a variant of this notion under a different name, as a tool or building block for a more complex cryptographic primitive that they called Proofs of Replicated Storage.
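To keep the spoken asymptotics from the last two paragraphs straight, here they are written out. This is my transcription of what is said, not notation taken from the talk's slides:

```latex
% alpha = codeword size, beta = incompressibility, |m| = data size
\text{trivial construction:}\quad \alpha = |m| + \beta
    \quad\text{(information-theoretically optimal for this approach)}\\
\text{good encoding:}\quad \alpha = (1 + o(1))\,|m|,
    \qquad \beta = (1 - o(1))\,\alpha
```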
I'm not going to talk about what these are. But in the context of constructing these Proofs of Replicated Storage, they also constructed incompressible encodings, albeit with some major caveats. In particular, their construction relied on the random oracle model, a heuristic, and it was not known whether you could do this in the standard model. The encoding and decoding time of the construction was quadratic in the data size, and here we want to apply these types of incompressible encodings to fairly large data, like the Wikipedia data, 50 gigabytes in size; a quadratic runtime on such huge data is really impractical. And lastly, the proof of security for their construction was flawed, or somewhat incomplete: it didn't consider general adversaries. This flaw was also noticed by the concurrent work of Garg, Lu, and Waters, who managed to give a fixed proof for this construction, but it required quite a lot of effort: a highly non-trivial and subtle argument to prove the original construction of Damgård, Ganesh, and Orlandi secure. So in our work, we give a new construction of these types of incompressible encodings. Our construction achieves security in the common reference string model, in fact a common random string model, without the use of random oracles. We have a linear encoding time, linear in the data size, so we get rid of the quadratic overhead, and we have a fairly simple proof of security; in fact, I'm hoping to show you a slightly simplified form of it in this talk. We also give some lower bounds and negative results showing that our construction is optimal in some aspects, and lastly we give a new application of this notion of incompressible encodings to something called big-key cryptography.
I want to tell you about this application. Hopefully it'll give you some intuition about why incompressible encodings are interesting and useful, and also about what their real goal is, what it is that they're trying to achieve. The application to big-key cryptography is concerned with the problem of system compromise. A computer system can become compromised either because the user downloads malware or because a remote attacker manages to hack into it. When this happens, the remote attacker gains control over the system, and any cryptographic keys stored on the system can easily be exfiltrated, just downloaded out of the system by the attacker; any security those cryptographic keys were meant to provide is then completely lost. The idea of big-key cryptography is to mitigate such attacks by making the secret keys intentionally huge, on the order of many gigabytes to even terabytes. By having a very large secret key, it becomes harder to exfiltrate: either the adversary's bandwidth to the compromised system is just not large enough to exfiltrate such a large key; or it might not be cost-effective to download so much data off the compromised system and store it to be able to use the key in the future, especially if the attacker wants to do this at mass scale; or the system might have some other mechanism, say a firewall, that would detect such large amounts of leakage out of the compromised system and block it in some way. There's been a lot of work on this idea of building big-key crypto systems: crypto systems where the secret key can be set arbitrarily huge. These crypto systems should satisfy two goals. One is security: security should hold even if a large amount of data about the secret key leaks out, as long as it's not the entire secret key.
So even if an attacker downloads, let's say, 90% of the data of the secret key, the security of the system should be preserved. The second property is efficiency: even though the secret key of the system can be huge, many gigabytes or terabytes, we still want the crypto system to remain efficient. In particular, this means the crypto system cannot even read the entire secret key during each cryptographic operation, because that would already be too inefficient; it can only read some small number of bits of the secret key during each operation that it performs. There's been a lot of work constructing these types of crypto systems, but one common problem for all of them is that they require the user to waste a lot of the storage on their computer on this huge secret key, which is useless for any other purpose than providing security. And users might not want to do this. So that's the problem we address here. The new idea in our work is: let's make the secret key useful. Instead of a secret key filled with useless random data that the cryptographic scheme picks, let's have a secret key that stores, say, the Wikipedia data, which a user might want to store on their system anyway, or the user's movie collection or music collection, et cetera; data that the user would want to store on their system anyway, we want to use as the secret key. Now, if we think about this for a few seconds: is it a good idea to use Wikipedia as a secret key? No, that sounds like a terrible idea. Wikipedia is not secret; it's public, it's online, anyone can access it whenever they want. So that's not what we're suggesting. We're suggesting to use an incompressible encoding of Wikipedia as the secret key. Even though Wikipedia is public, the incompressible encoding is randomized, and therefore the adversary does not know the value of this incompressible encoding.
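The efficiency requirement just described, reading only a few bits of a huge key per operation, can be sketched as follows. The talk does not specify a scheme here, so everything in this snippet (the probe count, block size, and use of SHA-256) is an illustrative assumption, not the paper's construction; real big-key schemes also derive probe positions so that the relevant parties agree on them.

```python
import hashlib
import secrets

def derive_op_key(big_key: bytes, num_probes: int = 8, block: int = 32) -> bytes:
    # Hash a few randomly chosen blocks of the huge key into a short
    # per-operation key: we read num_probes * block bytes instead of
    # the whole key, which is the big-key efficiency requirement.
    h = hashlib.sha256()
    for _ in range(num_probes):
        i = secrets.randbelow(len(big_key) // block)   # random probe position
        h.update(i.to_bytes(8, "big"))
        h.update(big_key[i * block:(i + 1) * block])
    return h.digest()

big_key = secrets.token_bytes(1 << 20)  # 1 MiB stand-in for a multi-gigabyte key
op_key = derive_op_key(big_key)
assert len(op_key) == 32  # a short working key from a few local reads
```

The point of the sketch is only the access pattern: each operation touches 8 blocks of 32 bytes, so an adversary who has exfiltrated most, but not all, of the key still has a chance of missing a probed block.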
Moreover, because it's incompressible, in order for the adversary to exfiltrate the entire secret key, it would have to download a very large amount of data out of the compromised system. So there's some hope that this could provide security, and we show how to build public-key encryption schemes in this setting that make use of a secret key which is an incompressible encoding of some useful data like Wikipedia. The secret key is an incompressible encoding of useful data, and security ensures that the adversary will need to exfiltrate almost the entire key to break the security of the crypto system. In the last few minutes, let me give you a very brief overview of our construction of incompressible encodings. For this part, we're going to pretend we have a really beautiful cryptographic object called Lossy Trapdoor Permutations. It turns out we don't quite have an object this beautiful, and in the full construction we relax this notion somewhat in order to get our full result. A Lossy Trapdoor Permutation is a function f keyed by some public key pk, and it maps n bits to n bits. We can sample the public key in one of two indistinguishable modes. In injective mode, the function fpk is a permutation, and there is in fact a trapdoor that allows us to invert it efficiently. In lossy mode, if we take some random value x and give you fpk of x, then this loses a lot of information about x; in particular, the image size of the function is very small, much smaller than two to the n, so fpk of x does not contain all the information about x. Okay, so using this type of Lossy Trapdoor Permutation, here's the encoding of a message m using a long random CRS, a common random string. The encoding just consists of sampling the public key of this Lossy Trapdoor Permutation in injective mode, along with the trapdoor.
The encoding then takes the message m, XORs it with the common random string, and inverts the trapdoor permutation on this value. The codeword is just the public key and the inverse x. This is something anybody can decode, by computing fpk of x and XORing it with the CRS; that recovers the original message. Now, to argue security, in the proof we're going to switch to choosing the value x uniformly at random. So the x component of the codeword is going to be chosen uniformly at random, and we're going to set the CRS to be fpk of x XORed with the message. If you look at it for a second, this distribution is exactly equivalent; it's just a different way of sampling the exact same distribution, and in particular the relation between the CRS and x is preserved. In the second step, we're going to switch the public key to lossy mode. When we do this, the CRS, fpk of x XOR m, leaks only a small amount of information about the random value x. In other words, even if the adversary sees the CRS, the value x in the codeword has a lot of entropy, and because it has a lot of entropy, it's incompressible. So what we did here is show that the codeword and the CRS are indistinguishable from a different way of sampling them, in which we place the information about the message in the CRS, and the codeword is actually truly random, with a lot of real entropy. Therefore, even given the CRS, the codeword is incompressible. That's the main idea behind the proof. I just want to make two remarks. Our full constructions rely on a relaxed notion of Lossy Trapdoor Permutations, which we're able to construct from either the decisional composite residuosity or the learning with errors assumption.
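The encode/decode steps just described can be sketched in a few lines of Python, using textbook small-prime RSA as a stand-in trapdoor permutation. RSA is not lossy, so this toy exercises only the correctness of the construction (decode inverts encode), not its security, and every parameter here is an illustrative choice of mine rather than anything from the paper:

```python
import secrets

# Small-prime RSA as a stand-in trapdoor permutation over Z_N.
p, q = 1000003, 1000033
N = p * q                            # ~2^40, so 39-bit values XOR safely below N
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))    # the trapdoor

f = lambda x: pow(x, e, N)           # public direction f_pk
f_inv = lambda y: pow(y, d, N)       # inversion using the trapdoor

def encode(m: int, crs: int) -> int:
    # Codeword x = f^{-1}(m XOR crs); the public key pk is published alongside.
    return f_inv(m ^ crs)

def decode(x: int, crs: int) -> int:
    # Anybody can decode: m = f_pk(x) XOR crs.
    return f(x) ^ crs

crs = secrets.randbits(39)           # long random common random string
m = 123456789                        # message, kept below 2^39 so m XOR crs < N
assert decode(encode(m, crs), crs) == m
```

Keeping the message and CRS below 2^39 is just toy domain handling so the XOR stays inside Z_N; the real construction works over a proper n-bit domain.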
In particular, we don't actually know how to construct trapdoor permutations from LWE or from any post-quantum assumption, but the relaxed notion that we need for our actual construction we can achieve from post-quantum assumptions, and thereby get post-quantum security. I want to mention two caveats of the construction. One is that in order to make this work, the CRS needs to be long, essentially as long as the message. The other is that this construction achieves a weak form of selective security, where the adversary has to choose the message before seeing the CRS. We show that both of these caveats are inherent; we show this by black-box separations, and one can overcome them only in the random oracle model. Lastly, I want to end with an interesting open question, I think one of the most interesting open questions in this area. All of the constructions of incompressible encodings, from our work and prior work, require the use of some public-key crypto assumption, some sort of trapdoor permutation or trapdoor function. The interesting open question is: can you construct incompressible encodings without relying on public-key crypto, using one-way functions or just the random oracle model? We conjecture this is not possible, but we don't know. So I want to end with that open question, and thank you very much for listening.