

Tom Stuermer, Accenture – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Narrator: From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE. Covering When IoT Met AI: The Intelligence of Things. Brought to you by Western Digital. >> Hey welcome back here everybody, Jeff Frick here with theCUBE. We're in downtown San Jose at the Fairmont Hotel, at a little event, it's When IoT Met AI: The Intelligence of Things. As we hear about the Internet of Things all the time, this is really about the data elements behind AI, and machine learning, and IoT. And we're going to get into it with some of the special guests here. We're excited to get the guy that's going to kick off this whole program shortly, Tom Stuermer. He is the, I've got to get the new title right, the Global Managing Director, Ecosystem and Partnership, from Accenture. Tom, welcome-- >> Thank you, Jeff. >> And congrats on the promotion. >> Thank you. >> So IoT, AI, buzzwords, a lot of stuff going on, but we're really starting to see stuff begin to happen. I mean there's lots of little subtle ways that we're seeing AI work its way into our lives, and machine learning work its way in as well, but obviously there's a much bigger wave that's about to crest here, shortly. So as you kind of look at the landscape from your point of view, you get to work with a lot of customers, you get to see this stuff implemented in industry, what's kind of your take on where we are? >> Well, I would say that we're actually very early. There are certain spaces with very well-defined parameters where AI's been implemented successfully, industrial controls on a micro level where there's a lot of well-known parameters that the systems need to operate in. And it's been very easy to be able to set those parameters up. There's been a lot of historical heuristic systems to kind of define how those work, and they're really replacing them with AI. So in the industrial space there's been a lot of take-up, and we'll even talk a little bit later about Siemens, who've really created sort of a self-managed factory.
They've been able to take that out from a tool level, to a system level, to a factory level, to enable that to happen at those broader capabilities. I think that's one of the inflection points we're going to see in other areas where there's a lot more predictability, and in a lot of other IoT systems. To be able to take that kind of system level and larger scale factors of AI and enable prediction around that, like supply chains for example. So we're really not seeing a lot of that yet, but we're seeing some of the micro pieces being injected in where the danger of it going wrong is lower, because the training for those systems is very difficult. >> It's interesting, there's so much talk about the sensors, and the edge, and edge computing, and that's interesting. But as you said it's really much more of a system approach is what you need. And it's really kind of the economic boundaries of the logical system within which you're trying to make a decision. We talk about this all the time, are we optimizing for one wind turbine? Are you optimizing for one field that contains so many wind turbines? Are you optimizing for the entire plant? Or are you optimizing for a much bigger, larger system that may or may not impact what you did on that original single turbine? So a systems approach is of really critical importance. >> It is, and what we've seen is that IoT investments have trailed a lot of expectations as to when they were going to really jump in the enterprise. And what we're finding is that when we talk to our customers a lot of them are saying, look, I've already got data. I've got some data. Let's say I'm a mining company and I've got equipment down in mines, I've got sensors around oxygen levels, I just don't get that much value from it. And part of the challenge is that they're looking at it from a historical data perspective. And they're saying, well, I can see the trajectory over time of what's happening inside of my mine. But I haven't really been able to put in prediction.
I haven't been able to sort of assess when equipment might fail. And so we're seeing that when we're able to show them the ability to predict an eventual failure that might shut down revenue for a day or two when some significant equipment fails, we're able to get them to start making those investments, and they're starting to see the value in those micro pockets. And so I think we're going to see it start to propagate itself through on a smaller scale, and prove itself, because there's a lot of uncertainty. There's a lot of work that's got to be done to stitch them together, and IoT infrastructure itself is already a pretty big investment as it is. >> Short that mining company, because we had Caterpillar on a couple weeks ago, and you know they're driving fleets of autonomous vehicles, they're talking about some of those giant mining trucks where for any unscheduled downtime the economic impact is immense, well beyond worrying about a driver being sick, or had a fight with his wife, or whatever reason is bringing down the productivity of those vehicles. So it's actually amazing the little pockets where people are doing it. I'm curious to get your point of view too on kind of your mining comment, the guy's like, I'm not sure what the value is. Because the other kind of big topic that we see is, when will the data and the intelligence around the data actually start to impact the balance sheet? Because data used to be kind of a pain, right? You had to store it, and keep it, and it cost money, and you had to provision servers, and storage. But really now and in the future, the data that you have, the algorithms you apply to it, will probably be an increasing percentage of your asset value, if not the primary part of your asset value. Are you seeing some people start to figure that out? >> Well they are.
So if you step back away from IoT for a minute and you look at how AI is being applied more broadly, we're finding some transformational value propositions that are delivering a lot of impact to the bottom line. And it's anywhere from where people inside of a company interact with their customers, being able to anticipate their next move, being able to predict, given these parameters of this customer, what kind of customer care agent should I put on the phone with them before you even pick up the phone, to anticipate some of those expectations. And we're seeing a lot of value in things like that. And so, excuse me, and so when you zoom it back in to IoT, some of the challenges are that the infrastructure to implement IoT is very fragmented. There's 360-some IoT platform providers out in the world, and the places where we're seeing a lot of traction in using predictive analytics and AI for IoT is really coming in the verticals like industrial equipment manufacturers, where they've kind of owned the stack and they can define everything from the bottom up. And what they're actually able to do is to start to sell product, heavy equipment, by the hour, by the use, because they're able to get telemetry off of that product, see what's happening, be able to see when a failure is about to come, and actually sell it as a service back to a customer, and be able to predictably analyze when something fails and get spares there in time. And so those are some of the pockets where it's really far ahead, because they've got a lot of vertical integration of what's happening. And I think the challenge on adoption at broader scale, for companies that don't sell very expensive assets into the market, is how do I as a company start to stitch my own assets, that are from all kinds of different providers and all kinds of different companies, into a single platform?
And what the focus has really been in IoT lately, for the past couple of years, is what infrastructure should I put in place to get the data? How do I provision equipment? How do I track it? How do I manage it? How do I get the data back? And I think that's necessary but completely insufficient to really get a lot of value out of IoT, because really all you're able to do then is get data. What do you do with it? All the value is really in the data itself. And so the alternative approach a lot of companies are taking is starting to attack some of these smaller problems. And each one of them tends to have a lot of value on its own, and so they're really deploying that way. And some of them are looking for ways to let the battles of the platforms play out, let's at least get from 360 down to 200 so that I can make some bets. And it's actually proving to be of value, but I think that is one of the obstacles that we have to adoption. >> The other thing you mentioned, interesting, before we turned on the cameras, is really thinking about AI as a way to adjust the way that we interact with the machines. There's two views of the machines taking over the world, is it the beautiful view, where they free us up to do other things? Or certainly nobody has a job, right? The answer is probably somewhere in the middle. But clearly AI is going to change the way, and we're starting to see just barely the beginnings with Alexa, and Siri, and Google Home, with voice interfacing and the way that we interact with these machines, which is going to change dramatically with the power of, as you said, prescriptive analytics, presumptive activity, and just change that interaction from what's been very rote, fixed, hard to change, to putting, as you said, some of these lighter weight, faster to move, more agile layers on the top stack, which can still integrate with some of those core SAP systems, and systems of record, in a completely different way.
>> Exactly, you know I often use the metaphor of autonomous driving, and people seem to think that that's kind of way far out there. But if you look at how driving an autonomous vehicle is so much different from driving a regular car, right? You don't have to worry about the minutiae of executing the driving process. You don't have to worry about throttle, brake. You don't have to worry about taking a right turn on red. You don't have to worry about speeding. What you have to worry about is the more abstract concepts of source, destination, route that I might want to take. You can offload that as well. And so it changes what the person interacting with the AI system is actually able to do, and the level of cognitive capability that they're able to exercise. We're seeing similar things in medical treatment. We're using AI to do predictive analytics around imagery coming off of medical equipment. It's not only starting to improve diagnoses in certain scenarios, but it's also enabling the techs and the doctors involved in the scans to think on a more abstract level about what the broader medical issues are. And so it's really changing sort of the dialogue that's happening around what's going on. And I think this is a good metaphor for us to look at when we talk about societal impacts of AI as well. Because there are some people who embrace moving forward to those higher cognitive activities, and some who resist it. But I think if you look at it from a customer standpoint as well, no matter what business you're in, if you're a services business, if you're a product business, the way you interact with your employees and the way you interact with your customers can fundamentally be changed with AI, because AI can enable the technology to bend to your intentions. Someone at the call center that we talked about. I mean those are subtle activities.
It's not just AI for voice recognition, but it's also using AI to alter what options are given to you, and what scenarios are going to be most beneficial. And more often than not you get it right. >> Well the other great thing about autonomous vehicles, it's just a fun topic because it's something that people can understand, and they can see, and they can touch, in terms of a concept to talk about some of these higher level concepts. But the second order impacts, which most people don't even begin to think about, they're like, I want to drive my car, is you don't need parking lots anymore, because the cars can all park off site. Just like they do at airports today at the rental car agency. You don't need to build a crash cage anymore, because the things are not going to crash that often compared to human drivers. So how does the interior experience of a car change when you don't have to build basically a crash cage? I mean there's just so many second order impacts that people don't even really begin to think about. And we see this time and time again, we saw it with cloud innovation, where it's not just, is it cheaper to rent a server from Amazon than to buy one from somebody else? It's, does the opportunity for innovation enable more of your people to make more contributions than they could before, because they were too impatient to wait to order the server from the IT guy? So that's where I think too people so underestimate, kind of the big Moore's Law, my favorite: we overestimate in the short term and completely underestimate in the long term the impacts of these things. >> It's the doubling function, exactly. >> Jeff: Yeah, absolutely. >> I mean it's hard for people, humankind is geared towards linear thinking, and so when something like Moore's Law continues to double every 18 months, price performance continues to increase. Storage, compute, visualization, display. >> Networking, 5G. >> You know, the sensors and MEMS, all of these things have gotten so much cheaper.
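The doubling the speakers are circling here compounds far beyond linear intuition. As a purely illustrative aside (the 18-month doubling period is the commonly cited Moore's Law cadence they reference, not a figure measured in this interview), the arithmetic looks like this:

```python
def growth_factor(years, doubling_period_years=1.5):
    # Cumulative price-performance multiple after `years`,
    # doubling once every `doubling_period_years` (i.e. 18 months).
    return 2 ** (years / doubling_period_years)

print(growth_factor(3))   # two doublings -> 4.0
print(growth_factor(20))  # roughly 10,321x over 20 years
```

Three years is only a 4x gain, which still feels linear; the same cadence sustained over a 20-year horizon compounds to roughly a 10,000x improvement in price performance, and the last doubling alone contributes as much as all the previous ones combined.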
It's hard for humans of any intelligence to really comprehend what happens when that doubling occurs for the next 20 years. Which we're now getting to the tail end of, in fact. And so those manifest themselves in ways that are a little bit unpredictable, and I think that's going to be one of our most exciting challenges over the next five years: what does an enterprise look like? What does a product look like? One of the lessons that, I spent a lot of time in race car engineering in my younger days and actually did quant and analytics work, what we learned from that point is, as you learned about the data you started to fundamentally change the architecture of the product. And I think that's going to be a whole new series of activities that are going to have to happen in the marketplace. It's people rethinking the fundamental product. There's a great example of a company that's completely disrupted an industry. On the surface of it, it's been disrupted because of the fact that they essentially disassociated the consumption from the provision of the product, and didn't have to own those assets, so they could grow rapidly. But what they fundamentally did was to use AI to be able to broker, when should I get more cars, where should the cars go? And because they were also on the forefront of driving this whole notion of consumption of cars, and getting people's conceptual mindset shifted from having to own a car to, I know an Uber's going to be there. It becomes like a power outlet. I can just rely on it. And now people are actually starting to think twice about, should I even own a car? >> Whole different impact of the autonomous vehicles. And if I do own a car, why should it be sitting in the driveway when I'm not driving it? Or I send it out to go work for me, make it a performing asset. Well, great conversation. You guys, Accenture's in a great spot. You're always at the cutting edge.
I used to tease a guy I used to work with at Accenture, you've got to squeeze out all the fat in the supply chain (laughs), in your ERP days, and again a lot of these things are people changing the lens, seeing fat and inefficiency, and then attacking it in a different way, whether it's Uber, or Airbnb with empty rooms in people's houses. We had Paul Doherty on at the GE Industrial Internet launch a few years back, so you guys are in a great position, because you get to sit right at the forefront and help these people make those digital transformations. >> I appreciate that. >> I will tell you, I mean supply chains is another one of those high level systems opportunities for AI, where being able to optimize, think about it, a completely automated distribution chain from factory all the way to the drone landing at your front doorstep as a consumer. That's a whole nother level of efficiency that we can't even contemplate right now. >> Don't bet against Bezos, that's what I always say. All right, Tom Stuermer, thanks for spending a few minutes, and good luck with the keynote. >> I appreciate it, Jeff. >> All right, I'm Jeff Frick, you're watching theCUBE. We are at The Intelligence of Things, When IoT Met AI. You're watching theCUBE. Thanks for watching. (upbeat music)

Published Date : Jul 3 2017


Shaun Moore, Trueface.ai – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Male Voice: From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE, covering When IoT Met AI: The Intelligence of Things, brought to you by Western Digital. >> Hey welcome back here everybody. Jeff Frick with theCUBE. We're in downtown San Jose at the Fairmont Hotel at a small event talking about data, and really IoT and the intersection of all those things, and we're excited to have a little startup boutique here, and one of the startups is kind enough to take the time to sit down with us. This is Shaun Moore, he's the founder and CEO of the recently renamed Trueface.ai. Shaun, welcome. >> Thank you for having me. >> So you've got a really cool company, Trueface.ai. I looked at the site. You have facial recognition software, so that's cool, but what I think is really more interesting is you're really doing facial recognition as a service. >> Shaun: Yes. >> And you have a freemium model, so I can go in and connect to your API and basically integrate your facial recognition software into whatever application that I built. >> Right, so we were thinking about what we wanted to do in terms of pricing structure. We wanted to focus on the developer community, so we wanted tinkerers, people that just want to play with technology, to help us improve it, and then go after the kind of bigger clients, and so we'll be hosting hack-a-thons. We just actually had one this past week in San Francisco. We had great feedback. We're really trying to get a base of, you know, almost outsourced engineers to help us improve this technology, and so we have to offer it to them for free so we can see what they build from there. >> Right, but you don't have an open source component yet, so you haven't gone that route? >> Not quite yet, no. >> Okay. >> We're thinking about that though. >> Okay, and still really young company, angel-funded, haven't taken the institutional route yet.
>> Right, yeah, we've been around since 2013, end of 2013, early 2014, and we were building smart home hardware, so we had originally built the technology to be a smart doorbell that used facial recognition to customize the smart home. From there the trajectory went, we realized our clients were using it more for security purposes and access control, not necessarily personalization. We made a quick pivot to an access control company and continued to learn about how people are using facial recognition in practice. Could it be a commercial technology that people are comfortable with? And throughout that thought process, and going through and testing a bunch of other facial recognition technologies, we realized we could actually build our own platform and reach a larger audience with it, and essentially be the core technology of a lot cooler and more innovative products. >> Right, and not get into the hardware business of doorbells. >> Yeah, the hardware business is tough. >> That's a tough one. >> We went through manufacturing one, and I'm glad we don't have to do that again. >> So what are some of the cool ways that people are using facial recognition that maybe we would never have thought about? >> Sure, so the API is four components. It's face matching, face detection, face identification, and what we call spoof detection. Face matching is what it sounds like: one-to-one matching. Face detection is just detecting that someone is in the frame. The face identification is your one-to-many, so you're going into a database of people.
And your spoof detection is if someone holds up a picture of me or of you and tries to get in, we'll identify that as an attack attempt, and that's kind of where we differentiate our technology from most, as not a lot of technology out there can do that piece. And so we've packaged that all up into essentially the API for all these developers to use, and some of the different ideas that people have come up with for us have been for banking logins, so for ATMs: you walk up to an ATM, you put your card in and set up a PIN, so to prevent against fraud it actually scans your face and does a one-to-one match. For the ship industry, so for things like cruise ships, when people get off and then come back on, instead of having them show ID, they use quick facial recognition scans. So we're seeing a lot of different ideas. One of the funnier ones is based off a company out in LA that is doing probation monitoring for drunk drivers, and so we've built technology that's drunk or not drunk. >> Drunk or not drunk? >> Right, so we can actually measure based on historical data if your face appears to be drunk, and so, you know, the possibilities are truly endless. And that's why I said we went after the development community first, because >> Right right >> They're coming to us with these creative ideas. >> So it's interesting with this drunk or not drunk, of course, not to make fun of drunk driving, it's not a funny subject, but obviously you've got an algorithm that determines anchor points on the eyes and the nose and certain biometric features, but drunk, you're looking for much softer, more subtle clues, I would imagine, because the fundamental structure of your face hasn't changed. >> Right, so it's a lot of training data, so it's a lot of training data. >> Well, a lot of training data, yeah. We don't want to go down that path. >> So a lot of research on our team's part. >> Well then the other thing too is the picture, is the fraud attempt.
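Trueface.ai's internals aren't spelled out in the conversation beyond these four component names, so the following is only an illustrative sketch, not their API: one common way to implement the one-to-one "matching" versus one-to-many "identification" distinction Shaun draws is to compare face-embedding vectors by cosine similarity. The embeddings, names, and threshold below are all hypothetical:

```python
import math

def cosine_similarity(a, b):
    # Similarity of two face-embedding vectors (assumed nonzero).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match(probe, enrolled, threshold=0.8):
    # One-to-one "face matching": is the probe the enrolled person?
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.8):
    # One-to-many "face identification": best match in a
    # {name: embedding} gallery, or None if nobody clears the bar.
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

gallery = {"alice": [1.0, 0.0], "bob": [0.0, 1.0]}
probe = [0.9, 0.1]  # a fresh scan, closest to alice
print(match(probe, gallery["alice"]))  # True
print(identify(probe, gallery))        # alice
```

In this framing, detection and spoof detection sit upstream: detection finds a face in the frame to embed at all, and spoof detection decides whether that face is live before any match is attempted.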
You must be looking around and shadowing, and really more 3D types of things, to look over something as simple as holding up a 2D picture. >> Right, so a lot of the technology that's tried to do it, that's tried to prevent against picture attacks, has done so with extra hardware or extra sensors. We're actually all cloud-based right now, so it is in our software, and what is special to us is that picture attack detection, but we've got a very, very intelligent way to do it. Everything is powered by deep learning, so we're constantly understanding the surroundings, the context, and making an analysis on that. >> So I'm curious from the data side, obviously you're pulling in kind of your anchor data and then doing comparisons, but are you constantly updating that data? I mean, what does your data flow look like in terms of your algorithms? Are you constantly training them and adjusting those algorithms? How does that work, kind of based on real time data versus your historical data? >> So we have to continue to innovate, and that is how we do it, is we continue to train: every single time someone shows up we train their profile once more, and so if you decide to grow a beard, you're not going to grow a beard in one day, right? It's going to take you a week, two weeks. We're learning throughout those two weeks, and so it's just a way for us to continue to get more data, but also to ensure that we are identifying you properly. >> Right, do you use any external databases that you pull in, as some type of, you know, adding more detail, or, you know, kind of other public sources, or is it all your own? >> It's all our own. >> Okay, and I'm curious too, on the kind of opening up to the developer community, how has that kind of shaped your product roadmap and your product development?
>> It, we've got to be very, very conscious of not getting sidetracked, because we get to hear cool ideas about what we could do, but we've got our core focus of building this API for more people to use. So, you know, we continue to reach out to them and ask for help, and, you know, if they find a flaw or they find something cool that we want to continue to improve, we'll keep working on that. So I think it's more of a, we're finding the developer community likes to really tinker and to play, and because they're doing it out of passion, it helps us drive our product. >> Right right. Okay, so priorities for the rest of the year? What's at the top of the list? >> We'll be doing a bigger rollout with a couple of partners later on this year, and those will be kind of our flagship partners. But again, like I said, we want to continue to support those development communities, so we'll be hosting a lot of hack-a-thons and just really pushing the name out there. So we launched our product yesterday, and that helped generate some awareness, but we're going to have to continue to get the brand out there, as it's now one day old. >> Right right, well good. Well, it was Chui before and it's Trueface.ai now, so we look forward to keeping an eye on your progress, and congratulations on where you've gotten to date. >> Thank you very much. I appreciate that. >> Absolutely. Alrighty, Shaun Moore, it's Trueface.ai. Look at the cameras, smile, it will know it's you. You're watching Jeff Frick down at theCUBE in downtown San Jose at When IoT Met AI: The Intelligence of Things. Thanks for watching. We'll be right back after this short break.

Published Date : Jul 3 2017


Jack McCauley, Oculus VR – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Announcer: From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE. Covering When IoT Met AI, the Intelligence of Things. Brought to you by Western Digital. >> Hey, welcome back everybody. Jeff Frick here with theCUBE. We're in downtown San Jose at the Fairmont Hotel at a little show called When IoT Met AI, the Intelligence of Things. Talking about big data, IoT, AI, and how those things are all coming together with virtual reality, artificial intelligence, augmented reality, all the fun buzzwords, but this is where it's actually happening, and we're real excited to have a pioneer in this space. He's Jack McCauley. He was a co-founder at Oculus VR, now spending his time at UC Berkeley as an innovator in residence. Jack, welcome. >> Thank you. >> So you've been watching this thing evolve, obviously Oculus, way out front in kind of the VR space, and I think augmented reality in some ways is even more exciting than just kind of pure virtual reality. >> Right. >> So what do you think as you see this thing develop from the early days when you first sat down and started putting this all together? >> Well, I come from a gaming background. That's what I did for 30 years. I worked in video game development, particularly in hardware and things, console hardware. >> That's right, you did the Guitar Hero. >> Guitar Hero. Yeah, that's right. >> We got that one at home. >> I built their guitars, and designed and built their guitars for Activision. And we were part of Red Octane, which is a studio. I primarily worked in the studio, not the headquarters, but I did some of the IP work with them too. So, to your question, you know, when you produce a product and put it on the market, you never really know how it's going to do. >> Jeff: Right. >> So we made two developer kits, put them out there, and they exceeded our expectations, and that was very good. It means that there is a market for VR, there is.
We produced a consumer version, and sales are not what we expected for that particular product. That was targeted at PC gamers and hopefully console gamers. But what has done well is the mobile stuff; it has exceeded everyone's wildest expectations. I heard numbers: Gear VR, which is an Oculus-designed product, sold 7 million of those. That's a smash hit. Now, worldwide, phone-mounted VR goggles are at about 20 million, and that's just in two years, so that's really intriguing. So what has happened is it's shifted away from an expensive PC-based rig at $700 or whatever it costs, plus $1,500 for the computer, to something that costs $50 where you just stick your cell phone in it. It doesn't give you the best experience, but that's what has sold, and so if I were doing a start-up right now, I would not be working on PC stuff, I'd be working on mobile stuff. >> Jeff: Right. >> And the next thing I think will play out of this, and I think you mentioned it prior to the interview, is the 360 cameras. Google has announced a camera that they're going to come out with for their VR 180 initiative, which allows you to see 180-degree video in stereo with a cell phone strapped to your face. And that's very intriguing. There's a couple of companies out there working on similar products. Lucid Cam, which is a start-up company here, has a 180 camera that's very, very good, and they have one coming out that's in 4K. They just launched their product. So to answer your question, it looks like what is going to happen for VR is that it's a cell phone strapped to your face and a camera somewhere else that you can view and experience. A concert. Imagine taking it to a sporting event where 5,000 or 10,000 people can view the video from your seat. That's very intriguing.
>> Yeah, it's interesting. I had my first kind of experience, not even 360 or live view, but I did a Periscope from the U2 concert here at Levi's Stadium a couple of months ago, just to try it out; I'd never really done it. And it was fascinating to watch the engagement of people on that application who had either seen them the prior week in Seattle or were anticipating them coming to the Rose Bowl within a couple of days. To have an interaction just based on my little mobile phone, I was able to find a rail so I had a pretty steady vantage point, but it was a fascinating, different way to experience media, as well as engagement, as well as kind of a crowd interaction beyond the people that happened to be standing in a circle. >> What's intriguing about VR 180 is that anybody can film the concert and put the video on YouTube or stream it through their phone. Formerly it would require a $10,000 camera, a stereo camera set up professionally. But can you imagine a crowd-sourced sort of thing, where the media is sourced by the crowd and anyone can watch it with a mobile phone? That's what's happening, I think, and Google's announcement even reinforces my opinion that that is where the market will be. It's live events, sporting events. >> Right, it's an experience, right? It all comes back to kind of experience. People are so much more experience-driven these days than, I think, thing-driven, in everything from buying cars versus taking an Uber over and over again. People want the experience, but not necessarily, as the CEO of Zuora said, the trappings of ownership; let me have the fun, I don't necessarily want to own it. But I think the other thing that gets less talked about, and I'd like your opinion on it, is really the combination of virtual reality plus the real world: augmented reality.
We see it in the industrial internet of things all the time where, you know, you take a walk on that factory floor, you put your goggles on, and not only do you see what's actually in front of you, but now you can start to see, almost like a heads-up display, certain characteristics of the machinery and this and that, driven from the database side back into the goggles. Now the richness of your observation has completely changed. >> Yes, and in some ways, think of what Google did with Google Glass; it didn't do as well as we would have liked. >> But for a first attempt. >> Yeah. They were way ahead of their time, and there will come a time when, you know, Snap has their Spectacles, right? Have you seen those? It's not augmented reality, but there will come a time when you can probably have a monocle on your face and see the kinds of things you need to see if you're driving a car, for instance; I mean, a heads-up display or a projector projecting right into your retina. So I think that's the main thing for augmented reality. I mean, your Pokemon Go, that's kind of an AR game in a way. You look through your cell phone and the character stays fixed on the table or wherever you're looking. It uses a mobile device to do that, and I can imagine other applications that use a mobile device to do that, and I'm aware of people working on things like that right now. >> So do you think that the breakthrough on mobile versus the PC-based system was that it's just good enough? Being able to just experience it so easily, you know; I mean, Google gave out hundreds and hundreds of thousands of those Cardboard viewers, so wow. >> Yeah. Well, that didn't mean that Gear VR didn't move into the market; it did.
You know, it did anyways. But to answer your question about AR, you know, I think that without good localization, I mean, the problem with wearing the Google Glass and the Google Cardboard and Gear VR is it kind of makes you sick a little bit, and nobody's working on the localization part, like how to get rid of the nausea effect. I watched a video that was filmed with Lucid Cam at the Pride Parade in San Francisco and I put it on, and somebody was moving with the crowd and I just felt nauseous. So that problem is probably one I would attempt to attack if I were going to build a company or something like that right now. >> But I wonder too, how much of that is kind of getting used to the format? Because when people first put them on, for sure, there's like, ah, but you know, if you settle in a little bit, our eyes are pretty forgiving; you get used to things pretty quickly. Your mind can get accustomed to it to a certain degree, but even I get nauseous, and I don't get nauseous very easily. >> Okay, so your title should just be tinkerer. I looked at your Twitter handle. You're building all kinds of fun stuff in your, not a garage, but your big giant lab, and you're working at Berkeley. What are some of the things that you can share that you see coming down the road, that people aren't necessarily thinking about, that's going to take some of these technologies to the next level? >> I got one for you. So you've heard of autonomous vehicles, right? >> Jeff: Yep, yep. >> And you've heard of HoloLens, right? HoloLens is an augmented reality device you put on your head, and it's got built-in localization, and it uses what's known as SLAM, S-L-A-M, to build a mesh of the world around you. And with that mesh, the next guy that comes into that virtual world that you mapped will be way ahead. In other words, the map will already exist, and he'll modify upon that, and the mesh always gets updated.
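The shared-map idea McCauley describes, where each new agent inherits the map earlier agents built and contributes its own updates, can be sketched as a toy exercise. This is an illustrative sketch only, not HoloLens's actual SLAM pipeline; the `SharedWorldMap` class and its grid-cell scheme are invented for the example:

```python
# Toy sketch of a shared, continuously updated world map: each agent folds its
# local observations into one persistent map, so the next agent "will be way
# ahead" -- it starts with everything earlier agents already mapped.

class SharedWorldMap:
    """A persistent occupancy map keyed by grid cell, merged across agents."""

    def __init__(self):
        self.cells = {}  # (x, y) -> "free" or "obstacle"

    def merge(self, observations):
        """Fold one agent's local scan into the shared map; newer
        observations overwrite older ones, so the mesh stays updated."""
        self.cells.update(observations)

    def known_obstacles(self):
        return {cell for cell, state in self.cells.items() if state == "obstacle"}


world = SharedWorldMap()

# First vehicle maps a stretch of road, including an obstacle at (2, 0).
world.merge({(0, 0): "free", (1, 0): "free", (2, 0): "obstacle"})

# Second vehicle arrives already knowing about (2, 0) before sensing anything,
# then contributes its own observations, including that the obstacle is gone.
assert (2, 0) in world.known_obstacles()
world.merge({(3, 0): "free", (2, 0): "free"})
print(sorted(world.known_obstacles()))  # -> []
```

Real systems share feature meshes and pose estimates rather than a flat grid, but the data-sharing pattern is the same.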
Can you imagine getting that into a self-driving vehicle, just for safety's sake, mapping out the road ahead of you? The vehicle ahead of you has already mapped the road for you, and you're adding to the mesh and adjusting the mesh. So I think that, as far as HoloLens is concerned and its localization system, that's going to be really relevant to self-driving cars. Now whether or not it'll be Microsoft's SLAM or somebody else's, I think that's probably the best thing that came out of HoloLens, and that will bleed into the self-driving car market. It's a big data-crunching problem, and Jobs was actually looking at this a long time ago, like what can we do with self-driving vehicles, and I think he abandoned the idea because he realized he had a huge computing and data problem. That was 10 years ago. Things have changed. But I think that's the thing that will possibly come out of, you know, this AR stuff: localization is just going to be transported to other areas of technology, self-driving cars and so forth. >> I just love autonomous vehicles because everything gets distilled and applied into that application, which is a great application for people to see and understand; it's so tangible. >> Yeah, it may change the way we think about cars, and we may just not ever own a car. >> I think absolutely. The car industry: it's ownership, it's usage, it's frequency of usage, how they're used. It's not a steel cage anymore for safety as the crash rates go down significantly. I think there are a lot of changes coming. >> Yeah, you buy a car and it sits for 20 hours a day. >> Right. >> Unutilized. >> All right. Well, Jack, I hope maybe I get a chance to come out and check out your lab one time, because you're making all kinds of cool stuff. When's that car going to be done? >> I took it upon myself to remodel a house at the same time I was doing that, but the car is moving ahead. In September I think I can get it started.
Get the engine running and get the power train up and running. Right now I'm working on the electronics, and we have an interesting feature on that car that we're going to do an announcement on later. >> Okay, we'll look out for that. We'll keep watching the Twitter. All right, thanks for taking a few minutes. He's Jack McCauley, I'm Jeff Frick. You're watching theCUBE from When IoT Met AI, the Intelligence of Things, in San Jose. We'll be right back after this short break. Thanks for watching. (technological jingle)

Published Date : Jul 3 2017



Dave Tang, Western Digital – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Presenter: From the Fairmont Hotel, in the heart of Silicon Valley, it's theCUBE. Covering When IoT Met AI: The Intelligence of Things. Brought to you by Western Digital. >> Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in downtown San Jose at the Fairmont Hotel, at an event called When IoT Met AI: The Intelligence of Things. You've heard about the internet of things; on the intelligence of things, it's IoT, it's AI, it's AR, all this stuff is really coming into play. It's a very interesting space, still a lot of start-up activity, still a lot of big companies making plays in this space. So we're excited to be here, and really joined by our host; big thanks to Western Digital for hosting this event, with WDLabs' Dave Tang. Got newly promoted since last we spoke: the SVP of corporate marketing and communications for Western Digital. Dave, great to see you as usual. >> Well, great to be here, thanks. >> So I don't think the need for more storage is going down anytime soon, that's kind of my takeaway. >> No, no, yeah. This wall of data just keeps growing. >> Yeah, I think the term we had yesterday at the Ag event that we were at, also sponsored by you, is really the flood of data, to use an agricultural term. But it's pretty fascinating, as more, and more, and more data is not only coming off the sensors, but coming off the people, and used in so many more ways. >> That's right. Yeah, we see it as a virtuous cycle: you create more data, you find more uses for that data to harness the power and unleash the promise of that data, and then you create even more data. So there's that virtuous cycle of creating more, and finding more uses of it. And one of the things that we find interesting, that's related to this event with IoT and AI, is this notion that data is falling into two general categories. There's big data, and there's fast data.
So, big data I think everyone is quite familiar with by this time: these large aggregated lakes of data that you can extract information out of. Look for insights and connections between data, predict the future, and create more prescriptive recommendations, right? >> Right. >> And through all of that you can glean algorithms that help to make predictions, or can help machines run based on that data. So we've gone through this phase where we focused a lot on how we harness big data, but now we're taking these algorithms that we've gleaned from that, and we're able to put them in real-time applications, and that's sort of been the birth of fast data. It's been really-- >> Right, the streaming data. We cover Spark Summit, we cover Flink, a new kind of open source project that came out of Berlin, which some people would say is the next generation of Spark. And the other thing, you know, good for you guys, is that it used to be not only old data, but a sampling of old data. Now with this new data and the data stream, it's all of the data. And I would actually challenge, I wonder if that separation as you describe it will stay, because I got to tell you, the last little drive I bought, just last week, was an SSD drive, you know, one terabyte. I needed some storage, and I had a choice between spinning disk and not, and I went with the flash. I mean, 'cause what's fascinating to me is the second-order benefits that we keep hearing about time, and time, and time again: once people become a data-driven enterprise, they are way more than just that kind of top-level thing that they thought. >> Exactly, and that's sort of that virtuous cycle: you get a taste, you learn how to use it, and then you want more. >> Jeff: Right, right.
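The big-data-to-fast-data loop Tang lays out, learning from a large historical batch offline and then applying what was learned to each record as it streams in, can be sketched in a few lines. This is an illustrative toy, not a real production system; the per-user-average "model" and all names are invented for the example:

```python
# Rough sketch of the big data / fast data split: an offline pass over
# aggregated history produces a simple model, which is then applied in
# real time to each incoming record before it is approved.

from statistics import mean

# Big data side: a historical batch of (user, amount) transactions.
history = [("alice", 20), ("alice", 25), ("alice", 22), ("bob", 500), ("bob", 480)]

def train(batch):
    """Offline: learn each user's typical spend from the aggregated data."""
    per_user = {}
    for user, amount in batch:
        per_user.setdefault(user, []).append(amount)
    return {user: mean(amounts) for user, amounts in per_user.items()}

model = train(history)

def score(user, amount, model, factor=5.0):
    """Fast data side: flag a streaming transaction before it's approved."""
    typical = model.get(user)
    return typical is not None and amount > factor * typical

# Streaming authorizations: one routine, one far outside the learned pattern.
print(score("alice", 23, model))   # -> False
print(score("alice", 400, model))  # -> True
```

Closing the loop is exactly the point made above: every scored transaction becomes new history for the next offline pass.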
>> And that's the great thing about the breadth of technologies and products that Western Digital has: from the solid state products, the higher-performance flash products that we have, to the higher-capacity helium-filled drive technologies, as well as devices going on up into systems, we cover this whole spectrum of fast data and big data. >> Right, right. >> I'll give an example. Credit card fraud detection is an interesting area. Billions of dollars are potentially being lost there. Well, to learn how to predict when transactions are fraudulent, you have to study massive amounts of data, billions of transactions; that's the big data side of it. And then as soon as you do that, you can take those algorithms and run them in real time. So as transactions come in for authorization, those algorithms can determine, before they're approved, that one's fraudulent and that one's not, and save a lot of time and processing for fraud claims. So that's a great example: once you learn something from big data, you apply it to the real-time realm. And then that spurs you to collect even more data, because you want to find new applications and new uses. >> Right, and so kind of this wave of computing back and forth, from the shared services computer, then the desktop computer, now it's back to the cloud, and then now it's-- >> Dave: Out with the edge. >> IoT, it's all about the edge. >> Yeah, right. >> And at the end of the day, it's going to be application-specific: what needs to be processed locally, what needs to be processed back at the core, and then all the different platforms. We were again at a navigation-for-autonomous-vehicles show; who knew there was such a thing? And even the attributes of the storage required in the ecosystem of a car, right? And the environmental conditions-- >> That's right. >> Is the word I'm looking for. Completely different, a new opportunity, kind of a new class of hardware required to operate in that environment, and again that still combines cloud and edge, sensors and maps. So I just don't think that the demand's going down, Dave. >> Yeah, absolutely. >> I think you're in a good spot. (Jeff laughing) >> You're absolutely right, and even though we try to simplify into fast data and big data, and core and edge, what we're finding is that applications are increasingly specialized, and have specialized needs in terms of the type of data. Is it large amounts of data? Is it streaming? You know, what are the performance characteristics, and how is it being transformed? What's the compute aspect of it? And what we're finding is that the days of general-purpose compute, storage, and memory platforms are fading, and we're getting into environments with increasingly specialized architectures across all those elements: compute, memory and storage. So that's what's really exciting about being in our spot in the industry: we're looking at creating the future by developing new technologies that continue to fuel that growth even further, and fuel the uses of data even further. >> And fascinating, just the ongoing case of Moore's law, which I know is not, you know, you're not making microprocessors, but I think it's so powerful. Moore's law really is a philosophy, as opposed to an architectural spec. Just this relentless pace of innovation, and you guys just continue to push the envelope. So what are your kind of priorities? I can't believe we're halfway through 2017 already, but for the balance of the year, what are some of your top-of-mind things? I know it's exciting times, you're going through the merger, you know, the company is in a great space. What are your kind of top priorities for the next several months?
>> Well, so, I think as a company that has gone through serial acquisitions and integrations, of course we're continuing to drive the transformation of the overall business. >> But the fun stuff, right? It's not to increase your staff. (Jeff laughing) >> Right, yeah, that is the hard work. >> Stitching together the ERP systems. >> But yeah, the fun stuff includes pushing the limits even further with solid state technologies, with our 3D NAND technologies. You know, we're leading the industry in 64-layer 3D NAND, and just yesterday we announced a 96-layer 3D NAND. So pushing those limits even further, so that we can provide higher capacities in smaller footprints, lower power, in mobile devices and out on the edge, to drive all these exciting opportunities in IoT and AI. >> It's crazy, it's crazy. >> Yeah it is, yeah. >> You know, terabyte SD cards, terabyte microSD cards, I mean the amount of power that you guys pack into these smaller and smaller packages, it's magical. I mean it's absolutely magic. >> Yeah, and the same goes on the other end of the spectrum, with high-capacity devices. Our helium-filled drives are getting higher and higher capacity: 10, 12, 14 terabyte high-capacity devices for that big data core, which all the data has to end up with at some point. So we're trying to keep a balance of pushing the limits on both ends. >> Alright, well Dave, thanks for taking a few minutes out of your busy day, and congratulations on all your success. >> Great, good to be here. >> Alright, he's Dave Tang from Western Digital, he's changing your world, my world, and everyone else's. We're here in San Jose, you're watching theCUBE, thanks for watching.

Published Date : Jul 3 2017



Janet George, Western Digital – When IoT Met AI: The Intelligence of Things - #theCUBE


 

(upbeat electronic music) >> Narrator: From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE. Covering When IoT Met AI: The Intelligence of Things. Brought to you by Western Digital. >> Welcome back everybody, Jeff Frick here with theCUBE. We are in downtown San Jose at the Fairmont Hotel. When IoT met AI, it happened right here; you saw it first. The Intelligence of Things, a really interesting event put on by ReadWrite and Western Digital, and we are really excited to welcome back a many-time CUBE alum and always a fan favorite, she's Janet George. She's Fellow & Chief Data Officer of Western Digital. Janet, great to see you. >> Thank you, thank you. >> So, as I asked you when you sat down, you're always working on cool things. You're always kind of at the cutting edge. So, what have you been playing with lately? >> Lately I have been working on neural networks and TensorFlow. So really trying to study and understand the behaviors and patterns of neural networks, how they work, and then unleashing our data at them. So trying to figure out how they train on our data, how many nets there are, and then trying to figure out what results they come up with. What are the predictions? Looking at whether the predictions are more or less accurate, and then validating the predictions to make them more accurate, and so on and so forth. >> So it's interesting. It's a different tool, so you're learning the tool itself. >> Yes. >> And you're learning the underlying technology behind the tool. >> Yes. >> And then testing it actually against some of the other tools that you guys have. I mean, obviously you guys have been doing-- >> That's right. >> Mean time between failure analysis for a long, long time. >> That's right, that's right. >> So, first off, kind of your experience with the tool: how is it different? >> So with machine learning, fundamentally we have to go into feature extraction.
So you have to figure out all the features, and then you use the features for predictions. With neural networks you can throw all the raw data at it. It's in fact data-agnostic. So you don't have to spend enormous amounts of time trying to detect the features. For example, if you throw hundreds of cat images at the neural network, the neural network will figure out the image features of the cat: the nose, the eyes, the ears, and so on and so forth. And once it trains itself through a series of iterations, you can throw a lot of deranged cats at the neural network and it's still going to figure out what the features of a real cat are. >> Right. >> And it will predict the cat correctly. >> Right. So then, how does that apply to, you know, the more specific use case in terms of your failure analysis? >> Yeah. So we have failures, and we have multiple failures. Some failures, to the human eye, are very obvious, right? But humans get tired, and over a period of time we can't endure looking at hundreds and millions of failures, right? And some failures are interconnected. So there is a relationship between these failure patterns, or there is a correlation between two failures, right? It could be an edge failure. It could be a radial failure, an eye-pattern-type failure. So these failures, for us as humans, we can't escape. >> Right. >> And we used to be able to take these failures and train on them at scale and then predict. Now with neural networks, we don't have to take and do all that. We don't have to extract these labels and try to show them what these failures look like. Training is almost like throwing a lot of data at the neural networks. >> So it almost sounds like kind of the promise of the data lake, if you will. >> Yes. >> If you have heard about, from the Hadoop Summit-- >> Yes, yes, yes. >> For ever and ever and ever. Right? You dump it all in and insights will flow. But we found, often, that that's not true. You need a hypothesis.
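The contrast George draws, hand-built feature extraction versus handing a network raw data plus labels and letting it find its own features, can be sketched with a tiny network. This is a toy illustration, not the TensorFlow pipeline she describes; the XOR data, layer sizes, seed, and learning rate are all invented for the example:

```python
# A tiny two-layer network trained on raw inputs: no hand-engineered
# features, just data and labels. The hidden layer learns its own internal
# representation, which is the point being made about cat images above.

import numpy as np

rng = np.random.default_rng(0)

# Raw inputs (with a constant 1 column acting as a bias term) and labels.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(3, 8))  # hidden layer: the learned "features"
W2 = rng.normal(size=(8, 1))

losses = []
for _ in range(5000):
    h = sigmoid(X @ W1)                  # forward pass
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))
    d_out = (out - y) * out * (1 - out)  # backward pass (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(round(losses[0], 3), "->", round(losses[-1], 3))  # loss falls as training runs
```

Nothing told the network what the "features" of XOR are; the hidden weights end up encoding them, a scaled-down version of a network discovering noses and ears in cat images.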
>> Yes, yes. >> You need to structure it and get it going. But what you're describing, though, sounds much more along the lines of that vision. >> Yes, very much so. Now, the only caveat is you need some labels, right? If there are no labels on the failure data, it's very difficult for the neural networks to figure out what the failure is. >> Jeff: Right. >> So you have to give it some labels to understand what patterns it should learn. >> Right. >> Right, and that is where the domain experts come in. So we train it with labeled data. So if you are training with a cat, you know the features of a cat, right? In the industrial world, the "cat" is really what's in the heads of people. The domain knowledge is not so authoritative, like the sky or the animals or the cat. >> Jeff: Right. >> The domain knowledge is much more embedded in the brains of the people who are working. And so we have to extract that domain knowledge into labels. And then you're able to scale the domain. >> Jeff: Right. >> Through the neural network. >> So okay, so then how does it compare with the other tools that you've used in the past? In terms of, obviously, the process is very different, but in terms of just pure performance, what are you finding? >> So we are finding very good performance, and actually we are finding very good accuracy. Right? So once it's trained, and it's doing very well on the failure patterns, it's getting it right 90% of the time, right? >> Really? >> Yes, but in a machine learning program, what happens is sometimes the model is over-fitted or it's under-fitted, or there is bias in the model, and you've got to remove the bias in the model, or you've got to figure out, well, is the model optimizing for false positives or false negatives? You've got to optimize for something, right? >> Right, right. >> Because we are really dealing with mathematical approximation; we are not dealing with preciseness, we are not dealing with exactness. >> Right, right.
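The optimize-for-something point above, accuracy versus false positives versus false negatives, comes down to the confusion matrix. A minimal sketch, with made-up held-out labels rather than any real failure data:

```python
# Evaluate a binary "failure / no failure" model on held-out labels and
# compute the metrics you would trade off against each other.

def confusion(y_true, y_pred):
    """Count true/false positives and negatives for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]  # held-out validation labels
y_pred = [1, 1, 0, 0, 0, 0, 1, 0, 1, 0]  # model output on the same items

tp, tn, fp, fn = confusion(y_true, y_pred)
accuracy = (tp + tn) / len(y_true)   # the "right 90% of the time" style number
precision = tp / (tp + fp)           # low if the model raises false alarms
recall = tp / (tp + fn)              # low if the model misses real failures
print(accuracy, precision, recall)   # -> 0.8 0.75 0.75
```

Whether you tune for precision or recall is exactly that "optimize for something" decision: a missed failure and a false alarm usually carry very different costs.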
>> In neural networks, actually, it's pretty good, because it's actually always dealing with accuracy. It's not dealing with precision, right? So it's accurate most of the time. >> Interesting, because that's often what's common about the kind of difference between computer science and statistics, right? >> Yes. >> Computers is binary. Statistics always has a kind of a confidence interval. But what you're describing, it sounds like the confidence is tightening up to such a degree that it's almost reaching binary. >> Yeah, yeah, exactly. And see, brute force is good when your traditional computing programing paradigm is very brute force type paradigm, right? The traditional paradigm is very good when the problems are simpler. But when the problems are of scale, like you're talking 70 petabytes of data or you're talking 70 billion roles, right? Find all these patterns in that, right? >> Jeff: Right. >> I mean you just, the scale at which that operates and at the scale at which traditional machine learning even works is quite different from how neural networks work. >> Jeff: Okay. >> Right? Traditional machine learning you still have to do some feature extraction. You still have to say "Oh I can't." Otherwise you are going to have dimensionality issues, right? It's too broad to get the prediction anywhere close. >> Right. >> Right? And so you want to reduce the dimensionality to get a better prediction. But here you don't have to worry about dimensionality. You just have to make sure the labels are right. >> Right, right. So as you dig deeper into this tool and expose all these new capabilities, what do you look forward to? What can you do that you couldn't do before? >> It's interesting because it's grossly underestimating the human brain, right? The human brain is supremely powerful in all aspects, right? And there is a great deal of difficulty in trying to code the human brain, right? 
>> But with neural networks and because of the various propagation layers and the ability to move through these networks we are coming closer and closer, right? So one example: When you think about driving, recently, Google's driverless car got into an accident, right? And where it got into an accident was the driverless car was merging into a lane and there was a bus and it collided with the bus. So where did A.I. go wrong? Now if you train an A.I. that birds can fly, and then you say a penguin is a bird, it is going to assume the penguin can fly. >> Jeff: Right, right. >> We as humans know a penguin is a bird but it can't fly like other birds, right? >> Jeff: Right. >> It's that anomaly thing, right? Naturally when we are driving and a bus shows up, even if it's supposed to yield, the bus goes. >> Jeff: Right, right. >> We yield to the bus because it's bigger and we know that. >> A.I. doesn't know that. It was taught that yield is yield. >> Right, right. >> So it collided with the bus. But the beauty is now large fleets of cars can learn very quickly based on what it just got from that one car. >> Right, right. >> So now there are pros and cons. So think about you driving down Highway 85 and there is a collision, it's Sunday morning, you don't know about the collision. You're coming down on the hill, right? Blind corner and boom, that's how these crashes happen and so many people die, right? If you were driving a driverless car, you would have knowledge from the fleet and from everywhere else. >> Right. >> So you know ahead of time. We don't talk to each other when we are in cars. We don't have universal knowledge, right? >> Car-to-car communication. >> Car-to-car communications, and A.I. has that, so directly it can prevent accidents. It can save people from dying, right? But people still feel, it's a psychology thing, people still feel very unsafe in a driverless car, right? So we have to get over- >> Well they will get over that. They feel plenty safe in a driverless airplane, right?
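The fleet-learning idea above, one car's observation becoming every car's knowledge, can be sketched roughly as follows. The class names and shared hazard store are hypothetical, not any vendor's actual car-to-car protocol.

```python
# Minimal sketch of fleet learning: one vehicle reports a hazard and
# every other vehicle can query the shared knowledge before reaching
# the same blind corner. All names and data here are hypothetical.

class FleetKnowledge:
    """Shared store of hazards reported by any vehicle in the fleet."""
    def __init__(self):
        self._hazards = {}  # location -> description

    def report(self, location, description):
        self._hazards[location] = description

    def lookup(self, location):
        return self._hazards.get(location)

class Car:
    def __init__(self, name, fleet):
        self.name = name
        self.fleet = fleet

    def observe(self, location, description):
        # What one car learns is immediately shared fleet-wide.
        self.fleet.report(location, description)

    def approaching(self, location):
        hazard = self.fleet.lookup(location)
        return f"slow down: {hazard}" if hazard else "proceed"

fleet = FleetKnowledge()
car_a = Car("A", fleet)
car_b = Car("B", fleet)

car_a.observe("Highway 85, mile 12", "collision past blind corner")
print(car_b.approaching("Highway 85, mile 12"))
# Car B never saw the crash, but the fleet did.
```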
>> That's right. Or in a driverless light rail. >> Jeff: Right. >> Or, you know, when somebody else is driving they're fine with the driver who's driving. You just sit in the driver's car. >> But there's that one pesky autonomous car problem, when the pedestrian won't go. >> Yeah. >> And the car is stopped, it's like a friendly deadlock. >> That's right, that's right. >> Well good stuff Janet and always great to see you. I'm sure we will see you very shortly 'cause you are at all the great big data conferences. >> Thank you. >> Thanks for taking a few minutes out of your day. >> Thank you. >> Alright she is Janet George, she is the smartest lady at Western Digital, perhaps in Silicon Valley. We're not sure but we feel pretty confident. I am Jeff Frick and you're watching theCUBE from When IoT meets AI: The Intelligence of Things. We will be right back after this short break. Thanks for watching. (upbeat electronic music)

Published Date : Jul 2 2017


Scott Noteboom, Litbit – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Announcer: From the Fairmont Hotel in the heart of Silicon Valley, it's The Cube. Covering When IoT met AI: The Intelligence of Things. Brought to you by Western Digital. >> Hey, welcome back, everybody. Jeff Frick here with The Cube. We're in downtown San Jose at the Fairmont Hotel at an interesting little show called When IoT Met AI: The Intelligence of Things. A lot of cool startups here along with some big companies. We're really excited to have our next guest, taking a little different angle. He's Scott Noteboom. He is the co-founder and CEO of a company called Litbit. First off, Scott, welcome. >> Yeah, thank you very much. >> Absolutely. For folks that aren't familiar, what is Litbit, what's your core mission? >> Well, probably, the simplest way to put it is, in business we enable our users who have a lot of experience in a lot of different areas to take their expertise and experience, which may not be coding software, or understanding, or even being able to spell what an algorithm is from the data science perspective, and give them an easy interface so they can kind of create their own Siri or Alexa, an AI, but an AI that's based on their own subject matter expertise that they can put to work in a lot of different ways. >> So, there's often a lot of talk about kind of tribal knowledge, and how does tribal knowledge get passed down so people know how to do things. Whether it's with new employees, or as you were talking about a little bit off camera, just remote locations for this or that. And there hasn't really been a great system to do that. So, you're really attacking that, not only with the documentation, but then making an AI actionable piece of software that can then drive machines and use IoT to do things. Is that correct? >> That's right. So, take an AI that I've been passionate about, 'cause I ran data centers for a lot of years: DAC.
>> So, DAC's an AI that has a lot of expertise in how to run a data center, kind of fueled and mentored by a lot of the experts in the industry. So, how can you take DAC and put DAC to work in a lot of places? And the people who need the best trained DAC aren't people who are building apps. They are people who have their area of subject matter expertise, and we view these AI personas that can be put to work as kind of apps of the future, where people can subscribe to personas that are built directly by the experts, which is a pretty pure way to connect AIs with the right people, and then be able to get them and put them-- >> So, there's kind of two steps to the process. How does the information get from the experts into your system? How's that training happen? >> So, where we spend a lot of attention is, a lot of people question and go, "Well, an AI lives in this virtual logical world "that's disconnected from the physical world." And I always ask people to close their eyes and imagine their favorite person that loves them in the world. And when they picture that person and hear that person's voice in their head, that's actually a very similar virtual world to the one an AI works in. It's not the physical world. And what connects us as people to the physical world? Our senses, our sight, our hearing, our touch, our feeling. And what we've done is we've enabled, using IoT sensors, the ability of combining those sensors with AI to turn sensors into senses, which then provide the ability for the AI to connect in really meaningful ways to the physical world. And then the experts can teach the AI this is what this looks like, this is what this sounds like, this is what it's supposed to feel like. If it's greater than 80 degrees in an office location, it's hot. Really teaching the AI to be able to form thoughts based on a specific expertise and then be able to take the right actions to do the right things when those thoughts are formed.
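The "sensors into senses" idea Noteboom describes, an expert teaching the AI what a reading means in human terms, could be sketched like this. The 80-degree "hot" example is from the interview; the rule format and everything else is assumed for illustration, not Litbit's actual system.

```python
# Sketch of expert-taught interpretation: a domain expert encodes what
# a sensor reading means in plain terms, so an AI persona can form a
# thought like "it's hot in here" from a raw number. Thresholds and
# names are hypothetical.

def teach(rules, sensor, condition, meaning):
    """Record one expert-taught interpretation for a sensor."""
    rules.setdefault(sensor, []).append((condition, meaning))

def interpret(rules, sensor, value):
    """Return the first expert meaning whose condition matches."""
    for condition, meaning in rules.get(sensor, []):
        if condition(value):
            return meaning
    return "normal"

rules = {}
teach(rules, "office_temp_f", lambda t: t > 80, "hot")
teach(rules, "office_temp_f", lambda t: t < 60, "cold")

print(interpret(rules, "office_temp_f", 84))  # hot
print(interpret(rules, "office_temp_f", 72))  # normal
```

The expert never writes code in Litbit's pitch; this sketch just shows the shape of the mapping from sensor value to human-meaningful "thought".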
>> How do you deal with nuance, 'cause I'm sure there's a lot of times where people, as you said, are sensing or smelling or something, but they don't even necessarily consciously know that that's an input into their decision process, even though it really is. They just haven't really thought of it as a discrete input. How do you separate out all these discrete inputs so you get a great model that represents your best of breed technicians? >> Well, to try to answer the question, first of all, the more training the better. So, the good way to think of the AI is, unlike a lot of technologies that typically age and go out of life over time, an AI continuously gets smarter the more it's mentored by people, which would be supervised learning. And the more it can adjust and learn on its own from real day to day data activity, combined with that supervised learning and unsupervised learning approach, enabling it to continuously get better over time. We've figured out some ways that it can produce some pretty meaningful results with a small amount of training. So, yeah. >> Okay. What are some of the applications, kind of your initial go to market? >> We're a small startup, and really, what we've done is we've developed a platform that, our goal is for it to be very horizontal in nature. And then the applications or the AI personas can be very vertical, subject matter experts across different silos. So, what we're doing is, we're working with partners right now in different silos developing AIs that have expertise in the oil and gas business, in the pharmaceutical space, in the data center space, in the corporate facilities management space, and really making sure that people who aren't technologists in all of those spaces, whether you're a very specific scientist who's running a lab, or a facilities guy in a corporate building, can successfully make that experiential connection between themselves and the AI, and put it to practical use.
And then as we go, there's a lot of efforts that can be very specific to specific silos, whatever they may be. >> So, those personas are actually roles of individuals, if you will, performing certain tasks within those verticals. >> Absolutely. What we call them is coworkers, and the way things are designed is, one of the things that I think is really important in the AI world is that we approach everything from a human perspective, because it's a big disruptive shift and there's a lot of concern over it. So, if you get people to connect to it in a humanistic way, like coworker Viv works along with coworker Sophia, and Viv has this expertise, Sophia has this expertise, and has better improving ways to interface with people who have names that aren't a lot different from them and have skillsets that aren't a lot different. When you look at the AIs, they don't mind working longer hours. Let them work the weekends so I can spend hours with my family. Let them work the crazy shifts. So, things are different in that regard. But the relationship aspect of how the workplace works, try not to disrupt that too much. >> So, then on a consumption side, with the person coworker that's working with the persona, how do they interact with it, how do they get the data out, and I guess even more importantly, maybe, how do they get the new data back in to continue to train the model? >> So, the biggest thing you have to focus on with a human and machine learning interface that doesn't require a programmer or a data scientist, is that the language that the AI is taught in is human language, natural human language. So, we developed a lot of natural human language files that are pretty neat, because a human coworker in California here could be interfacing in English to their coworker, and at the same time, someone speaking Mandarin in Shanghai could be interfacing with the same coworker speaking Mandarin, so you get multilingual functionality.
Right now, to answer your question, people are doing it in a text based scenario. But the future vision, I think when the industry timing is right, is we view that every one of the coworkers we're developing will have a very distinct, unique fingerprint of a voice. So, therefore, when you're engaging with your coworker using voice, you'll begin to recognize, oh, that's DAC, or that's Viv, or that's Sophia, based on their voice. So, like many people, this is how we're communicating with voice, and we believe the same thing's going to occur. And a lot of that's in timing. That's the direction where things are headed. >> Interesting. The whole voice aspect is just a whole 'nother interesting thing in terms of what type of voice personality attributes associated with voice. That's probably going to be a huge piece in terms of the adoption, in terms of having a true coworker experience, if you will. >> One of the things we haven't figured out, and these are important questions, and there's so many unknowns, is we feel really confident that the AI persona should have a unique voice, because then I know who I'm engaging with, and I can connect by ear without them saying what their name is. But what does an AI persona look like? That's something where actually we don't know that, and we explore different things and, oh, that looks scary, or oh, that doesn't make sense. Should it look like anything? Which has largely been the approach of what does an Alexa or a Siri look like. As you continue to advance those engagements, and particularly when augmented reality comes into play, through augmented reality, if you're able to look and say, "Oh, a coworker's working over there," there's some value in that. But what is it going to look like? That's interesting, and we don't know that. >> Hopefully, better than those things at the San Jose Airport that are running around. >> Yeah, exactly. >> Classic robot. All right, Scott, very interesting story.
I look forward to watching you grow and develop over time. >> Awesome, it's good to talk. >> Absolutely, all right, he's Scott Noteboom, he's from Litbit. I'm Jeff Frick, you're watching The Cube. We're at When IoT met AI: The Intelligence of Things, here at San Jose California. We'll be right back after the short break. Thanks for watching. (upbeat music)

Published Date : Jul 2 2017


Modar Alaoui, Eyeris – When IoT Met AI: The Intelligence of Things - #theCUBE


 

>> Narrator: From the Fairmont Hotel in the heart of Silicon Valley it's theCUBE covering when IoT met AI, The Intelligence of Things. Brought to you by Western Digital. >> Hey welcome back here everybody Jeff Frick here with theCUBE. We're in San Jose, California at the Fairmont Hotel, at the when IoT met AI show, it's all about the intelligence of things. A lot of really interesting start ups here, we're still so early days in most of this technology. Facial recognition gets a lot of play, iris recognition, got to get rid of these stupid passwords. We're really excited to have our next guest, he's Modar Alaoui, he's the CEO and founder of Eyeris. And it says here Modar that you guys are into face analytics and emotion recognition. First off welcome. >> Thank you so much for having me. >> So face analytics, I'm a CLEAR customer, I love going to CLEAR at the airport, I put my two fingers down, I think they have my iris, they have different things, but what's special about the face compared to some of these other biometric options that people have? >> We go beyond just the biometrics, we do pretty much the entire suite of face analytics. Anything from eye openness, face, gender, emotion recognition, head pose, gaze estimation, et cetera, et cetera. So it is pretty much anything and everything you can derive from the face including non-verbal cues, yawning, head nod, head shake, et cetera. >> That was a huge range of things, so clearly just the face recognition to know that I am me is probably relatively straightforward. A couple anchor points, does everything measure up and match the prior? But emotion, that's a whole different thing, not only are there lots of different emotions, but the way I express my emotion might be different than the way you express the very same emotion. Right, everybody has a different smile. So how do you start to figure out the algorithms to sort through this? >> Right, so you're right.
There are some nuances between cultures, ages, genders, ethnicities and things like that. Generally they've been universalized for the past three and a half decades by the scholars, the psychologists, et cetera. So what they actually have a consensus on is that there are only six universal emotions plus neutral. >> Six, what are the six? >> Joy, surprise, anger, disgust, fear, sadness, and neutral. >> Okay and everything is some derivation of that, you can kind of put everything into little buckets. >> That is correct, so think of them as seven universal colors or seven primary colors, and then everything else is a derivative of that. The other thing is that emotions are hard-wired into our brain, they happen in a 1/15th or a 1/25th of a second, particularly micro expressions. And they can generally give away a lot of information as to whether a person has suppressed a certain emotion or not, or whether they are thinking about something negatively before they could respond positively, et cetera. >> Okay so now you've got the data, you know how I'm feeling, what are you doing with it? It must tie back to all types of different applications I would assume. >> That's right, there are a number of applications. Initially when we created this, what we call, enabling technology, we wanted to focus on two things. One is what type of application could have the biggest impact but also the quickest adoption in terms of volumes. Today we focus on driver monitoring AI as well as occupant monitoring AI, so we focus on autonomous and semi-autonomous vehicles. And a second application is social robotics, but in essence if you think of a car it's also another robot, except that social robotics are those potentially AI engines, or even AI engines in the form of an actual robot, that communicate with humans. Therefore, the word social. >> Right, so I can see in a kind of semi-autonomous vehicle or even a not autonomous vehicle you want to know if I'm dozing off.
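The mapping from a face model's raw scores onto the universal emotion categories Alaoui lists could be as simple as an argmax. The scores below are made up; a real system would produce them from a trained network's output layer.

```python
# Sketch of mapping a model's per-frame scores onto the universal
# emotion categories (six emotions plus neutral). The score vector
# here is invented for illustration.

EMOTIONS = ["joy", "surprise", "anger", "disgust", "fear", "sadness", "neutral"]

def classify(scores):
    """Pick the highest-scoring universal emotion (argmax)."""
    if len(scores) != len(EMOTIONS):
        raise ValueError("expected one score per emotion")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return EMOTIONS[best]

frame_scores = [0.70, 0.05, 0.02, 0.01, 0.02, 0.05, 0.15]
print(classify(frame_scores))  # joy
```

Any more specific expression, in this framing, is treated as a derivative or blend of these base categories.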
And some of those things have been around in a basic form for a little while. But what about in an autonomous vehicle is impacted by my emotion as a passenger, not necessarily a driver if it's a level five? >> That's right, so when we talk about an autonomous vehicle I think what you're referring to is level five autonomy, where a vehicle does not actually have a steering wheel or gas pedal or anything like that. And we don't foresee that those will be on a road for at least another 10 years or more. The focus today is on level two, three, and four, and that's semi-autonomy. Even for autonomous, fully autonomous vehicles, you would see them come out with vision sensors or vision AI inside the vehicle. So that these sensors could, together with the software that could analyze everything that's happening inside, cater to the services towards what is going to be the ridership economy. Once the car drives itself autonomously, the focus shifts from the driver to the occupants. As a matter of fact it's the occupants that would be riding in these vehicles or buying them or sharing them, not the driver. And therefore all these services will revolve around who is inside the vehicle, like age, gender, emotion, activity, et cetera. >> Interesting, so all these things, the age, gender, emotion, activity, what is the most important do you think in terms of your business and kind of where, as you say, you can have a big impact? >> We can group them into two categories, the first one is safety obviously, eye openness, head pose, blinking, yawning, and all these things are of utmost importance, especially focused on the driver at this point. But then there is a number of applications that relate to comfort and personalization. And so those could potentially take advantage of the emotions and the rest of the analytics. >> Okay, so then where are you guys, Eyeris as a company? Where do you have some installations, I assume, out there? Are you still early days kind of?
Where are you in terms of the development of the company? >> We have quite a mature product, what I can disclose is we have plans to go into mass production starting 2018. Some plans for Q4 2017 have been pushed out. So we'll probably start seeing some of those in Q1, Q2 2018. >> Okay. >> We made some announcements earlier this year at CES with Toyota and Honda. But then we'll be seeing some mass volume starting 2019 and beyond. >> Okay, and I assume you're a cloud based solution. >> We do have that as well, but we are particularly a local processing solution. >> Jeff: Oh you are? >> Yes, so think of it as an edge computing type of solution. >> Okay, and then you work with other people's sensors and existing systems, or are you more of a software component that plugs in? Or you provide the whole system in terms of the, I assume, cameras to watch the people? >> So we're a software company only; we, however, are hardware, processor, and camera agnostic. And of course for everything to succeed there will have to be some components of sensor fusion. And therefore we can work, and do work, with other sensor companies in order to provide a higher confidence level for all the analytics that we provide. >> Pretty exciting, so is it commercially available, you're GA now, or not quite yet? >> We'll be commercially available, you'll start seeing it on the roads or in the market sometime early next year. >> Sometime early next year? Alright well we will look forward to it. >> Thank you so much. >> Very exciting times, alright, he's Modar Alaoui. And he's going to be paying attention to you to make sure you're paying attention to the roads. So you don't fall asleep, or doze off and go to sleep. So I'm Jeff Frick, you're watching theCUBE at When IoT Met AI, The Intelligence of Things. San Jose, California, we'll be right back after this short break, thanks for watching. (bright techno music)

Published Date : Jul 2 2017


Mike Wilson, BriteThings – When IoT Met AI: The Intelligence of Things - #theCUBE


 

(upbeat music) >> Announcer: From the Fairmont Hotel, in the heart of Silicon Valley, it's theCUBE. Covering, When IoT met AI: The Intelligence of Things. Brought to you by Western Digital. >> Welcome back everybody. Jeff Frick here with theCUBE. We're at Downtown San Jose at the Fairmont Hotel at a small little conference, very intimate affair, talking about IoT and AI, The Intelligence of Things. When IoT met AI. Now, they've got a cool little start-up expo hall. We're excited to have our next guest here from that. It's Mike Wilson, he's the CEO of BriteThings. Mike, welcome. >> Good to be here, Jeff, how you doin'? >> Absolutely. So, BriteThings. What are BriteThings? >> BriteThings are intelligent plugs, power strips, wall sockets, anything that fits into the plug load space. It learns users' behavior and then provides them an intelligent on-off schedule. The goal here is to turn stuff off when it's on and not being used. >> Right. >> So wasted energy. Nights and weekends in the workspace, for example. >> It sounds like such a simple thing. >> Totally. >> But we were talking before we turned the cameras on, this actually has giant economic impact >> It does. >> in building maintenance, which is a huge category >> Yup. >> as you said, I'll let you kind of break down the numbers as to where >> Sure. >> that energy's being spent and the impact that you guys are having. >> Well our customers are building owners and operators, and they pay an electrical bill to run that building. It's a cost of running the building. About 27% of it goes to lighting, about 38% goes to heating and cooling, and all the rest goes to plug loads. And where we come to the market is, of course there's huge lighting companies, famous names, same with HVAC, but no one's doing anything about plug loads, and the reason is because plug loads are distributed, they're hard to control.
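The behavior-learning Wilson opens with, watching when a device is actually used and deriving an on-off schedule, can be sketched with a toy majority-vote rule. This is illustrative only, not BriteThings' actual algorithm.

```python
# Toy sketch of learning an on-off schedule from observed usage: a
# week of per-hour on/off readings is reduced, by simple majority
# vote, to the hours worth keeping powered. Data and rule are
# hypothetical.

def learn_schedule(usage_by_day, threshold=0.5):
    """usage_by_day: list of 24-element 0/1 lists (off / in use).
    Returns a 24-element list of bools: True = keep power on."""
    days = len(usage_by_day)
    return [
        sum(day[hour] for day in usage_by_day) / days >= threshold
        for hour in range(24)
    ]

# Five weekdays of a desk device in use roughly 9:00-17:00.
week = [[1 if 9 <= h < 17 else 0 for h in range(24)] for _ in range(5)]
schedule = learn_schedule(week)
print(schedule[10], schedule[22])  # True False -> cut power overnight
```

A production system would also have to handle the productivity constraint Wilson mentions later, e.g. powering on slightly before the learned arrival time.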
>> And so what we bring to the market is a product that is small, inexpensive, and can suddenly give owners and operators all the control that they enjoy with lighting and HVAC over their plug loads. >> So it's kind of like Nest, in that it takes a relatively simple function, now because of the cloud, because of the internet, you can add a lot more intelligence into a relatively, I don't want to say dumb device, but the device itself doesn't have to have that much power 'cause you can put the application somewhere else. >> Exactly, so if you just imagine, you're sitting here with me right now. Probably at your workplace and at home there's a bunch of stuff turned on, you're not using it, >> Right >> but you're spending money to keep it powered up, and that's causing CO2 to be generated at the power plant down the road. So that's bad for your pocket, it's bad for the environment. So if we can automatically turn that stuff off, then people don't have to worry about it. We can measure it, so here's where the money is. >> Right. >> Not only energy savings, but data. So I can tell you when you turned your stuff on and off, so that means human presence. When you're at work, there's a value to that. If you're going to put a floor of an office building out there and heat it or light it, we can tell you if people are there or not. So you can look at that and make, and save even more money. >> Jeff: Right. >> We've got one customer that uses our product for inventory management. If it plugs in, you can see it on our screen, and you can see if it's on or off, if it's connected and how it's running. So that kind of data ends up being valuable, not only for energy savings, because we turn stuff on and off, but human presence, inventory control, the list goes on and on. Our customers actually every year are coming up with new ways to use our device. >> Right.
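Wilson's numbers allow a quick back-of-envelope check: lighting about 27%, HVAC about 38%, plug loads the rest, and later in the interview he cites roughly a 40 to 45% plug-load reduction. The monthly bill figure below is made up for illustration.

```python
# Back-of-envelope savings check using the shares Wilson cites:
# lighting ~27%, HVAC ~38%, plug loads the remainder, and a 40%
# reduction in plug-load cost. The bill amount is hypothetical.

monthly_bill = 10_000.00                 # hypothetical building bill, $
plug_load_share = 1.0 - 0.27 - 0.38      # everything that isn't
                                         # lighting or HVAC -> 35%
plug_load_cost = monthly_bill * plug_load_share
savings = plug_load_cost * 0.40          # low end of the 40-45% claim

print(round(plug_load_share, 2))  # 0.35
print(round(savings, 2))          # 1400.0
```

So on these assumed numbers, cutting plug loads 40% trims roughly 14% off the whole electrical bill, which is why the "distributed, hard to control" third of the bill is worth attacking.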
>> And just for the baseline savings, you just basically plug it in and turn it on, and you're reporting some huge savings just by the basic operation of your strips versus a regular strip. >> Exactly. So just imagine, this device is learning your behavior, so that's part of our, you know, that's kind of our core competency here, is these devices measure the amount of energy you're using. When you're not using something, it goes into standby mode, or sleep mode. Then we turn that off to save you the money. But the way we're able to do that is using artificial intelligence to learn patterns, and take those patterns and you can basically guess the best optimized schedule for your devices to be turned on and off. >> Right. >> On and off. So if you imagine you've got 100,000 employees, 100,000 different schedules, this thing has to be smart and it can't affect worker productivity. >> Right. >> So we have to be smart enough to know when to turn it on before you come into work, when to turn it off to save you the max amount of money, and be able to measure all of that so you can roll that up and see how much money you're saving. How much CO2 are you reducing? >> Right. >> You know, so sustainability officers love our product too. >> So do you integrate with other types of intelligent systems in that space? The lighting and the HVAC? >> Yeah. Exactly. So one of the most important things is, I've got a portfolio, my office building is a portfolio of devices and systems, so just one of them is our plug load management, right? So I want to be able to see my plug load in my current control panel. So we've got APIs where our cloud technology is able to take that reporting and stick it into, for example, a Lucid control panel. We're working with Trane right now to integrate their BACnet solution for their building control management. >> Right, right. >> So that their customers are able to see lighting, HVAC, and plug load, >> Just what I was going to say.
>> right off the same old screen and operating tools that they've always used. >> Right, right. What's kind of the typical ROI that you pitch people just for the straight-up money savings that they're going to get? >> We got our foot in the door by saying we can reduce your plug load cost a minimum of 30%, and what we're seeing on average is about 40 to 45%. >> Wow. >> It's a huge, huge reduction. >> Now where do you go next? >> Well, conquer the world. (Jeff laughs) You know, so imagine this, anywhere in the commercial office space where there's a plug, so let your mind go, how many power strips are out there? >> Right, right. >> How many of those-- >> We're using about 20 of them right here. >> Yeah, so, just, you know, every person at every desk is a potential customer. Every time there's a coffeemaker or a break room, a fax machine, you know, any piece of equipment that's plugged in, we can save you money. Vending machines. We have a customer with these, you know, raise and lower desks. Crazy, they want to just see, they don't want to save energy, they want to know who's using that and how often. >> Jeff: Right, right. >> Our device can do that, too. >> Right. >> And that's that data I was telling you about. Once you start collecting data of how people use plugged-in devices, I'm collecting information about you, how you use your laptop, how you use your charger, how often. >> Because the signature on the draw is different depending on the activity of the device. >> You got it. Exactly. >> I love this. You know, it's so funny because the second-order impact of all these types of things is so much more significant than people give it credit, I think. >> It's about the data. >> Jeff: Yeah. >> And our customers just love that, because the data gives them control, and when you have control, cost savings. >> And is it just commercial, or you sell them for regular retail customers as well?
Or do you-- >> I imagine some day in the future that's a potential, but you know, our focus right now, 'cause the big problem out there is that buildings use 40% of all the energy generated in the United States, and commercial space is the big opportunity, because nights and weekends. >> Right. >> Stuff should be turned off, and we can do that right now. >> Right, right. >> We're the market doing it. >> Buildings with big, big POs. >> Yup. (Jeff laughs) >> Alright, Michael, sounds like exciting stuff, can't wait til I can get one at Best Buy or Office Depot, or something. >> Coming to a store near you, or www.britethings.com. >> Alright, thanks a lot, he's Mike Wilson. Save some energy, get one of these things when they're available, or at least tell the boss to get one at the office. (Michael laughs) >> Definitely. >> Alright, I'm Jeff Frick, you're watching theCUBE. When IoT meets AI in San Jose, California. Thanks for watching. (upbeat music)
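The schedule-learning approach Mike describes, learning each worker's usage pattern, then switching on before they arrive and off after they leave so productivity is never affected, can be sketched as below. This is an illustrative reconstruction, not BriteThings' actual algorithm; the one-hour buffer and the log format are assumptions:

```python
# Sketch: derive a per-weekday on/off schedule from observed usage hours.
# The buffer and data shapes here are illustrative assumptions.
from collections import defaultdict

BUFFER_HOURS = 1  # switch on early / off late so workers never lose power

def learn_schedule(usage_log):
    """usage_log: list of (weekday, hour) samples taken whenever the outlet
    drew active (non-standby) power.  Returns {weekday: (on_hour, off_hour)}."""
    hours_by_day = defaultdict(list)
    for weekday, hour in usage_log:
        hours_by_day[weekday].append(hour)
    schedule = {}
    for weekday, hours in hours_by_day.items():
        on = max(min(hours) - BUFFER_HOURS, 0)   # earliest use, minus buffer
        off = min(max(hours) + BUFFER_HOURS, 23)  # latest use, plus buffer
        schedule[weekday] = (on, off)
    return schedule

# A worker typically active 9-17 on Mondays (weekday 0):
log = [(0, h) for h in range(9, 18)]
print(learn_schedule(log))  # {0: (8, 18)}
```

At the scale quoted in the interview, 100,000 employees means 100,000 such per-outlet schedules, which is why the learning runs in the cloud rather than on the strip itself.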

Published Date : Jul 2 2017

