Modar Alaoui, Eyeris – When IoT Met AI: The Intelligence of Things - #theCUBE

>> Narrator: From the Fairmont Hotel in the heart of Silicon Valley, it's theCUBE, covering When IoT Met AI: The Intelligence of Things. Brought to you by Western Digital.

>> Jeff: Hey, welcome back everybody, Jeff Frick here with theCUBE. We're in San Jose, California at the Fairmont Hotel, at the When IoT Met AI show; it's all about the intelligence of things. A lot of really interesting startups here; we're still so early days in most of this technology. Facial recognition gets a lot of play, iris recognition, got to get rid of these stupid passwords. We're really excited to have our next guest: he's Modar Alaoui, the CEO and founder of Eyeris. And it says here, Modar, that you guys are into face analytics and emotion recognition. First off, welcome.

>> Modar: Thank you so much for having me.

>> Jeff: So, face analytics. I'm a Clear customer, I love going to Clear at the airport; I put my two fingers down, I think they have my iris, they have different things. But what's special about the face compared to some of these other biometric options that people have?

>> Modar: We go beyond just the biometrics; we do pretty much the entire suite of face analytics. Anything from eye openness, face, gender, emotion recognition, head pose, gaze estimation, et cetera. So it is pretty much anything and everything you can derive from the face, including non-verbal cues: yawning, head nod, head shake, et cetera.

>> Jeff: That was a huge range of things. So clearly just the face recognition, to know that I am me, is probably relatively straightforward: a couple of anchor points, does everything measure up and match the prior? But emotion, that's a whole different thing. Not only are there lots of different emotions, but the way I express my emotion might be different than the way you express the very same emotion. Right, everybody has a different smile. So how do you start to figure out the algorithms to sort through this?

>> Modar: Right, so you're right.
There are some nuances between cultures, ages, genders, ethnicities, and things like that. Generally they've been universalized for the past three and a half decades by the scholars, the psychologists, et cetera. So what they actually have a consensus on is that there are only six or seven universal emotions, plus neutral.

>> Jeff: Six? What are the six?

>> Modar: Joy, surprise, anger, disgust, fear, sadness, and neutral.

>> Jeff: Okay, and everything is some derivation of that; you can kind of put everything into little buckets.

>> Modar: That is correct. So think of them as seven universal colors, or seven primary colors, and then everything else is a derivative of that. The other thing is that emotions are hard-wired into our brain; they happen in a 1/15th or a 1/25th of a second, particularly micro-expressions. And they can generally give up a lot of information as to whether a person has suppressed a certain emotion or not, or whether they are thinking about something negatively before they respond positively, et cetera.

>> Jeff: Okay, so now you've got the data, you know how I'm feeling. What are you doing with it? It must tie back to all types of different applications, I would assume.

>> Modar: That's right, there are a number of applications. Initially, when we created this, what we call, enabling technology, we wanted to focus on two things. One is what type of application could have the biggest impact, but also the quickest adoption in terms of volumes. Today we focus on driver monitoring AI as well as occupant monitoring AI, so we focus on autonomous and semi-autonomous vehicles. And a second application is social robotics. But in essence, if you think of a car, it's also another robot, except that social robotics are those potentially AI engines, or even AI engines in the form of an actual robot, that communicate with humans. Therefore, the word social.

>> Jeff: Right, so in a kind of semi-autonomous vehicle, or even a non-autonomous vehicle, I can see you want to know if I'm dozing off.
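The "little buckets" idea Modar describes, where everything a model detects maps down to six universal emotions plus neutral, can be sketched in a few lines. This is an illustrative toy, not Eyeris's actual API: the class names and per-frame scores are assumptions for the example.

```python
# Illustrative sketch: bucketing a face-analysis model's per-class scores
# into the six universal emotions plus neutral mentioned in the interview.
# Names and score values are hypothetical, not from Eyeris.

UNIVERSAL_EMOTIONS = ["joy", "surprise", "anger", "disgust", "fear", "sadness", "neutral"]

def classify_emotion(scores):
    """Return the dominant universal emotion from a dict of per-class scores."""
    if set(scores) != set(UNIVERSAL_EMOTIONS):
        raise ValueError("expected exactly one score per universal emotion")
    # The highest-scoring bucket wins; ties resolve to the first max found.
    return max(scores, key=scores.get)

frame_scores = {"joy": 0.62, "surprise": 0.15, "anger": 0.02, "disgust": 0.01,
                "fear": 0.03, "sadness": 0.05, "neutral": 0.12}
print(classify_emotion(frame_scores))  # joy
```

A real pipeline would smooth these scores over many frames, since, as Modar notes, micro-expressions can flash by in 1/15th to 1/25th of a second.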
And some of those things have been around in a basic form for a little while. But what in an autonomous vehicle is impacted by my emotion as a passenger, not necessarily a driver, if it's a level five?

>> Modar: That's right. So when we talk about an autonomous vehicle, I think what you're referring to is level five autonomy, where a vehicle does not actually have a steering wheel or gas pedal or anything like that. And we don't foresee that those will be on the road for at least another 10 years or more. The focus today is on levels two, three, and four, and that's semi-autonomy. Even for fully autonomous vehicles, you would see them come out with vision sensors, or vision AI, inside the vehicle, so that these sensors could, together with the software that analyzes everything happening inside, cater services toward what is going to be the ridership economy. Once the car drives itself autonomously, the focus shifts from the driver to the occupants. As a matter of fact, it's the occupants that would be riding in these vehicles, or buying them, or sharing them, not the driver. And therefore all these services will revolve around who is inside the vehicle: age, gender, emotion, activity, et cetera.

>> Jeff: Interesting. So of all these things, the age, gender, emotion, activity, what is the most important, do you think, in terms of your business and kind of where, as you say, you can have a big impact?

>> Modar: We can group them into two categories. The first one is safety, obviously: eye openness, head pose, blinking, yawning. All these things are of utmost importance, especially focused on the driver at this point. But then there is a number of applications that relate to comfort and personalization, and those could potentially take advantage of the emotions and the rest of the analytics.

>> Jeff: Okay, so then where are you guys, Eyeris, as a company? Where do you have some installations, I assume, out there? Are you still early days, kind of?
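The safety signals Modar groups together, eye openness, blinking, and yawning, feed metrics like PERCLOS (the fraction of recent frames in which the eyes are mostly closed), a widely used drowsiness indicator in driver monitoring. The sketch below assumes hypothetical thresholds and a per-frame eye-openness score; it is a generic illustration of the technique, not Eyeris's implementation.

```python
# Illustrative sketch of a common driver-monitoring safety signal:
# PERCLOS, the fraction of frames in a window where the eyes are mostly
# closed. Thresholds are assumptions for the example, not from Eyeris.

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames where eye openness (0.0 = closed, 1.0 = open)
    falls below the closed threshold."""
    if not eye_openness:
        return 0.0
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

def is_drowsy(eye_openness, perclos_threshold=0.15):
    """Flag drowsiness when eyes are closed in more than 15% of frames."""
    return perclos(eye_openness) > perclos_threshold

alert  = [0.9, 0.8, 0.85, 0.1, 0.9, 0.95, 0.9, 0.88, 0.92, 0.9]  # one blink
drowsy = [0.1, 0.15, 0.9, 0.1, 0.05, 0.1]  # eyes closed most of the window
print(is_drowsy(alert), is_drowsy(drowsy))  # False True
```

In practice the window would span tens of seconds of video, and the signal would be fused with head pose and yawning, the kind of sensor fusion Modar mentions later.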
Where are you in terms of the development of the company?

>> Modar: We have quite a mature product. What I can disclose is we have plans to go into mass production starting 2018. Some plans for Q4 2017 have been pushed out, so we'll probably start seeing some of those in Q1, Q2 2018.

>> Jeff: Okay.

>> Modar: We made some announcements earlier this year at CES with Toyota and Honda. But then we'll be seeing some mass volume starting 2019 and beyond.

>> Jeff: Okay, and I assume you're a cloud-based solution.

>> Modar: We do have that as well, but we are particularly a local processing solution.

>> Jeff: Oh, you are?

>> Modar: Yes, so think of it as an edge computing type of solution.

>> Jeff: Okay, and then do you work with other people's sensors and existing systems, or are you more of a software component that plugs in? Or do you provide the whole system in terms of, I assume, the cameras to watch the people?

>> Modar: So we're a software company only; we are, however, hardware, processor, and camera agnostic. And of course, for everything to succeed, there will have to be some component of sensor fusion, and therefore we can work, and do work, with other sensor companies in order to provide a higher confidence level for all the analytics that we provide.

>> Jeff: Pretty exciting. So is it commercially available, you're GA now, or not quite yet?

>> Modar: We'll be commercially available; you'll start seeing it on the roads, or in the market, sometime early next year.

>> Jeff: Sometime early next year? Alright, well, we will look forward to it.

>> Modar: Thank you so much.

>> Jeff: Very exciting times. Alright, he's Modar Alaoui, and he's going to be paying attention to you to make sure you're paying attention to the road, so you don't fall asleep or doze off. I'm Jeff Frick, you're watching theCUBE at When IoT Met AI: The Intelligence of Things, San Jose, California. We'll be right back after this short break, thanks for watching.

(bright techno music)

Published Date : Jul 2 2017

