
Farah Papaioannou and Kilton Hopkins, Edgeworx.io | CUBEConversation, 2018

(intense orchestral music)

>> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're at our Palo Alto studios for a CUBEConversation, and we're talking about startups today, which we don't often get to do, but it's really one of the more exciting things that we get to do, because that's what keeps Silicon Valley Silicon Valley. And this next new company is playing in a very hot space, which is edge: you're all about cloud, then the next big move is edge, especially with the internet of things and the industrial internet of things. So we're really happy to welcome Edgeworx here, fresh off the announcement of the new company and their funding. We've got both founders: Farah Papaioannou, the President, and Kilton Hopkins, the CEO, both of Edgeworx. Welcome.

>> Thank you,

>> Thanks.

>> Thanks for having us.

>> So for those of us that aren't familiar, give us kind of the quick 101 on Edgeworx.

>> So I was looking at this space as a venture capitalist before I joined up with Kilton, and I'd been looking at edge computing for a long time because it just made intuitive sense to me. You're looking at all these devices that are now not just devices, they're compute platforms, and they're generating all this data; well, how are we going to address all that data? If you think about sending all of it back to the cloud: latency, bandwidth, and cost. You talk about breaking the internet, this is what's going to break the internet, not Kim Kardashian's, you know, butt photo, right? (guys laugh) So how do you solve that problem? If you think about autonomous vehicles, for example, these are now computers on wheels, they're not just a transportation mechanism. If they're generating all this data, and they need to interact with each other and make decisions in near realtime, how are they going to do that if they have to send all that data back to the cloud?

>> Right, great.

>> So that's where I came across Kilton's company, or actually the technology that he'd built, and we formed a company together. I looked at everything, and the technology that he'd developed was leaps and bounds beyond anything anyone else had come up with to date.

>> So, Kilton, how did you start on that project?

>> Yeah, so this actually goes way back, to about 2010. Back in Chicago I was looking at: what architecture is going to allow us to do the types of processing that's really expensive, and do it close to where the data is? That architecture was in the back of my mind. When I came to the bay area, I jumped in with the city of San Francisco as an IoT advisor, and everywhere I looked I saw the same problems. Nobody was doing secure processing at the edge in any kind of way that was manageable, so I started to solve it. Then, years later, after doing some deployments myself and seeing how this stuff was working, I finally arrived at an architecture where I thought: okay, this thing's passing all these trials, and now I think we've got this pretty well nailed. I basically got into it before the terms fog and edge computing were being thrown around, and just said: this is what has to happen. And then of course the world catches up, and now there's terms for it, and everyone's talking about the edge.

>> So it's an interesting problem, right? It's the same old problem we've been having forever, which is: do you move the data to the compute, or do you move the compute to the data?
And then we've had these other things happening, with suddenly this huge swell of data flow, and that's even before we start on the IoT contribution to that data flow. Luckily the networks are getting faster, 5G's around the corner, chips are getting faster and cheaper, memory's getting cheaper and faster. And then we had the development of the cloud, and really the hypergrowth of the public cloud. But that still doesn't help you with these low-latency applications that you have to execute on the edge. And obviously we've talked to GE a lot, and everyone wants to talk about turbines and harsh conditions and nasty weather; it's not this pristine data center. How do you put compute, and how much compute do you put, at the edge, and how do you manage that data flow? What can you deal with there, and what do you have to send up? And of course there's this pesky thing called physics, and latency, which just prohibits, as you said, the ability to get stuff up to some compute and get it back in time to do something about it. So what is the approach that you guys are taking? What's a little bit different about what you've built with Edgeworx?

>> Sure. So, in most cases, people think about the edge as almost a lead-in to the cloud. They say: how can I pre-process the data, maybe curtail some of the bandwidth volume that I need in order to send data up to the cloud? But that doesn't actually solve the problem; you'll never get rid of cloud latency if you're just sending smaller packages. And in addition, you've done nothing to address the security issues of the edge if you're just trying to package data, maybe reduce it a bit, and send it to the cloud. So what's different about us is that with us you can use the cloud, but you don't have to; we're completely at the edge. You can run software with Edgeworx that stays within the four walls of a factory, if you so choose, and no data will ever leave the building. That is a stark difference from the approaches that have been taken to date, which have been tied to the cloud and do a little at the edge. It's like, come on, this is real edge.

>> Right, right. And so is it a software layer that sits on top of whatever kind of BIOS and firmware are on a lot of these dumb sensors, is that kind of the idea?

>> Yeah, exactly, it sits above the BIOS level, it sits above the firmware. It creates an application runtime, so it allows developers to write applications that are containerized. We run containers at the edge, which allows our developers to run applications they've already developed for the cloud, or to write new ones, but they don't have to learn an entirely new framework or an entirely new SDK. They can write using tools they already know: Java, C#, C++, Python. If you can write in that language, we can run it, at the edge. Which again allows people to use skillsets they already have; they don't have to learn specialized skillsets for the edge. Why should they have to do that, you know?
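[Editor's note: to make that concrete, here is a minimal, hypothetical sketch of a containerized edge module in the style Kilton describes: plain Python, no device SDK, processing a sensor stream where it originates. The read_sensor() helper and the one-minute aggregation window are illustrative assumptions, not Edgeworx's actual API.]

# A minimal, hypothetical sketch of a containerized edge module: plain
# Python, no special SDK. read_sensor() is a stand-in for whatever
# transport the real device exposes; the values here are simulated.
import json
import random
import time

def read_sensor():
    """Stand-in for a real sensor read (e.g., over serial or MQTT)."""
    return 20.0 + random.gauss(0, 0.5)  # simulated temperature in degrees C

def main():
    window = []
    while True:
        window.append(read_sensor())
        if len(window) >= 60:  # aggregate one minute of 1 Hz samples
            summary = {
                "mean_c": sum(window) / len(window),
                "max_c": max(window),
                "samples": len(window),
            }
            # Only this small summary would ever leave the box, or nothing
            # at all if the deployment keeps data inside the four walls.
            print(json.dumps(summary))
            window.clear()
        time.sleep(1.0)

if __name__ == "__main__":
    main()

[Because it is just a container, the same image could in principle run unchanged on a gateway box or in the cloud, which is the portability point being made here.]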
>> I think, and you know, good for you guys to get Stacey Higginbotham to write a nice article about the company long before you launched, which is good. But I thought she had a really interesting breakdown of edge computing, and she broke it down into four layers: the device; the sensors, as you said, as dumb as they can be, right, because you want a lot of these things; then this gateway layer that collects the data, some level of compute close to the edge, not necessarily in the camera or in any of these sensors, but close; and then of course a connection back to the cloud. So you guys run in the sensor, or probably more likely in that gateway layer? Or do you see, in some of the early customers you're talkin' to, are they putting in like little micro data centers? I mean, how are you actually seeing this stuff deployed in the field at scale?

>> So we actually gave Stacey that four-layer chart, because we were trying to explain the edge to people who didn't understand what it was, and again, people refer to all these different layers at the edge. We actually think that the layer right above the sensors is the most difficult to solve for. And the reason we don't want to run at the sensor level is that sensors are becoming more and more commoditized; a customer would rather have a thousand dumb sensors, where they can get more and more data, than 10 really smart sensors where they can run compute on them. So, unless there are special circumstances, like the case of a camera we're actually working with that has GPU capability, where they can actually run compute right on the device, we'd like to run at the level above, and there are a couple of reasons for that. One is, if you run on the devices themselves, you can't really aggregate across devices: a temperature sensor cannot aggregate a pressure sensor's data. You need a layer above for that. Also, we're able to serve as a broker between low-level transports like Wi-Fi and Bluetooth and higher-level TCP/IP, which you also cannot do at the sensor level. If you were to run at the sensor, you'd basically have to do what Amazon does, which is device-to-cloud, and that doesn't really afford you the capability of running real software at the edge.

>> Right. So when you're out, let's just say with the camera, we talked a little bit before we turned the cameras on about surveillance and surveillance cameras: where are those gateways, and where's the power and the connectivity to that gateway? What're you seeing in some of these early examples?

>> So, you know, for cameras you've got basically two choices: either the camera is a dumb camera that puts a video feed to some kind of a compute box that's nearby, on a wired network or a wireless network that's private to it, or you have smart cameras. For in-building cameras that are already in place, that are analog, you can put a box in the building that can take the feeds, but the better option is smart cameras, so a new greenfield deploy would probably have smart cameras that can do the AI processing right there in the module. So the answer is: somewhere you have a feed of sensor data, whether it be video, audio, or just, like, temperature time-series data, and then it hits a point where you're still on the edge but you can do compute. Sometimes they're in the same unit, sometimes they're a little spread out, sometimes they're over wireless; that first layer up is where we sit, no matter how the compute is done.
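[Editor's note: as a hedged illustration of that gateway role, here is a small Python sketch that fuses two different sensor types, something no single dumb sensor can do, and brokers the result onto plain TCP/IP. Both radio reads are faked, and the collector.local:9000 upstream address is an assumption made up for the example.]

# Hypothetical gateway-layer sketch: aggregate a Bluetooth temperature
# sensor and a Wi-Fi pressure sensor into one record, then forward it
# upstream over ordinary TCP/IP. Both radio reads are simulated here.
import json
import random
import socket
import time

def read_temperature_ble():
    """Stand-in for a Bluetooth LE temperature read (degrees C)."""
    return 21.0 + random.gauss(0, 0.3)

def read_pressure_wifi():
    """Stand-in for a Wi-Fi pressure read (kPa)."""
    return 101.3 + random.gauss(0, 0.1)

def main():
    # The gateway is the broker: low-level radios on one side, a plain
    # TCP socket to the rest of the network on the other.
    upstream = socket.create_connection(("collector.local", 9000))  # assumed host
    try:
        while True:
            record = {
                "ts": time.time(),
                "temp_c": read_temperature_ble(),
                "pressure_kpa": read_pressure_wifi(),
            }
            upstream.sendall((json.dumps(record) + "\n").encode())
            time.sleep(5.0)
    finally:
        upstream.close()

if __name__ == "__main__":
    main()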
>> Okay. And I'm just curious about some of the early use cases. How do people see the opportunity now to have kind of a software-driven IoT device that's separate from the actual firmware that's in the sensor? What is that going to enable them to do, that they're excited to do, that they couldn't do before?

>> Yeah, so if you think about the older model, it's: how can I make this device get its sensor readings and somehow communicate that data? And I'm going to write low-level code, probably C code or whatever, to operate that, and it's: how often do I poll the sensor? You're really thinking, jeez, I just need to get this data somewhere to make it usable. And when you use us, you think: okay, I have streams of data; what would I do if I could run software right where the data is? I can increase my sampling frequency, I can take everything we were going to do in the cloud and do it right there, for free; once it's deployed, there's no bandwidth cost. So it opens up a world of thinking: we're now running software at the edge instead of running firmware that just moves the data upstream. You stop moving the data, and you start moving the applications, and that's the world changer for everybody.

>> Right, right.

>> Plus you can use the same skillsets you have for the cloud. Up until now, programming IoT devices has been a matter of saying: oh, you know, if I know how to work the GPIO pins and I can write in C, maybe I can make it work. And now you say: I know Python, and I know how to do data analytics with Python; I can just move that into the sensor module, if it's smart enough, or into the gateway right there, and I can pretty much push my code into the factory instead of waiting for the factory to wire the data to me.

>> And we actually have a customer right now that's doing real-time surveillance at the edge. They have smart-city deployments, and they're looking at an example of border control. What they want to be able to do is put these cameras out there and say: well, I've detected something on the maritime border here; is it a whale, is it debris, or is it a boat full of refugees, or pirates, or migrants? Before, the most basic level of processing an edge device could run was: let me compress that video data, send some of it back, and do the analysis back there. Well, that's not really going to be that helpful, because if I have to send it back to some cloud and do the analysis there, by the time I've recognized what's out there: too late. What we can do now, because we have our platform running on these cameras, is deploy software that detects, right there at the edge, what we're seeing, and then instead of sending back video data, which I don't really want to do because that's heavy on bandwidth and latency and cost as well, I can just send back text data and say: I've actually detected something, so let's take some sort of action on it; the next camera should be able to pick it up, or send some notification that we need to address it back here. If I'm sending textual data back, and I've already done that processing right there and then, I can run thousands of cameras out there at the edge, versus just 10 or 12, because of the cost and latency. And then the customer can decide: well, you know what, I want to add another application that does target tracking of certain individual terrorists, right? Okay, well, that's easy to deploy, because our platform's already running; we can just push it out there at the edge. Or: oh, you know what, I'm able to train the model at the edge, and I can actually do better detection, going from 80% to 90%; well, I can just push that and do an upgrade right there at the edge, as opposed to going out there and flashing the board, or sending out some sort of firmware upgrade. So it allows a lot of flexibility that we couldn't do before.
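[Editor's note: here is a hedged, purely illustrative sketch of that pattern: detection runs on the camera, the multi-megabyte frames never leave the device, and only a tiny text event goes upstream. The capture_frame() and classify_frame() helpers are hypothetical stand-ins for the camera feed and whatever on-device model it runs.]

# Hypothetical sketch of detection at the edge: the raw frames stay on
# the camera, and only a few bytes of text describing a detection are
# ever sent upstream. Both helper functions are simulated stand-ins.
import json
import random
import time

LABELS = ["whale", "debris", "boat"]

def capture_frame():
    """Stand-in for grabbing one raw frame from the camera module."""
    return b"\x00" * (1920 * 1080 * 3)  # ~6 MB of pixels that never leave the device

def classify_frame(frame):
    """Stand-in for an on-device detector (e.g., running on the camera's GPU)."""
    return random.choice(LABELS), random.uniform(0.5, 0.99)

def main():
    while True:
        frame = capture_frame()
        label, confidence = classify_frame(frame)
        if label == "boat" and confidence > 0.8:
            # A detection event is tens of bytes versus megabytes of video.
            event = {"ts": time.time(), "label": label, "conf": round(confidence, 2)}
            print(json.dumps(event))  # in practice, published to the next camera or an operator
        time.sleep(0.5)

if __name__ == "__main__":
    main()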
>> Right. Well, I just got to ask ya now, you got a pile of money, which is exciting, and congratulations.

>> Thank you.

>> I was going to say, where do you kind of focus your go-to-market, you know, within any particular vertical, or any specific horizontal application? But it sounds like, I think we've used cameras now three or four times (laughs) in the last three or four questions, so I'm guessin' that's a, that's a good--

>> That's been a strong one for us.

>> You know, kind of an early adopter market for you guys.

>> That one's been a strong one for us, yeah. We've had some real success with telcos; another use case where we've seen some real good traction is being able to detect quality-of-service issues on Wi-Fi routers, so that's one that we're looking at as well that's had some adoption. Oil and gas has been pretty strong for us too. So it seems to be kind of a horizontal play for us, and we're excited about the opportunity.

>> Alright. Well, thanks for comin' on and tellin' the story, and congratulations on your funding and launching the company, and,

>> Thank you.

>> And bringin' it to reality.

>> Great, thanks.

>> Alright, Kilton, Farah, I'm Jeff, you're watchin' theCUBE, thanks for watchin', we'll see ya next time. (intense orchestral music)

Published: Aug 16, 2018
